Blog posts

Showing posts from 2016

Fine-tuning a pre-trained ResNet model with different learning rates

[Figure: Fine-tuning the Facebook Torch implementation of ResNet with different learning rates. Curves shown for learning rates 0.1, 0.01, and 0.001.]
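A sweep like the one in the figure can be scripted. Below is a minimal sketch assuming the fb.resnet.torch layout, where main.lua accepts -retrain, -LR, -resetClassifier, and -nClasses; the model file, dataset path, and class count are placeholders for your own setup:

```shell
# Hypothetical sweep over the three learning rates compared in the figure.
# Commands are echoed so the sketch runs without Torch installed;
# drop `echo` (keep the command) to actually launch training.
for lr in 0.1 0.01 0.001; do
  cmd="th main.lua -retrain resnet-34.t7 -data ./mydata -resetClassifier true -nClasses 10 -LR $lr -save checkpoints/lr_$lr"
  echo "$cmd"
done
```

Each run saves to its own checkpoint directory so the training curves for the three rates can be compared afterwards.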

One note for building Caffe RC3 on Ubuntu 14.04

When building Caffe RC3 on Ubuntu 14.04, you may run into this error:

CXX/LD -o .build_release/tools/extract_features.bin
CXX/LD -o .build_release/tools/device_query.bin
CXX/LD -o .build_release/tools/net_speed_benchmark.bin
CXX/LD -o .build_release/tools/test_net.bin
//usr/lib/x86_64-linux-gnu/libunwind.so.8: undefined reference to 'lzma_index_buffer_decode@XZ_5.0'
//usr/lib/x86_64-linux-gnu/libunwind.so.8: undefined reference to 'lzma_index_size@XZ_5.0'
//usr/lib/x86_64-linux-gnu/libunwind.so.8: undefined reference to 'lzma_index_uncompressed_size@XZ_5.0'
//usr/lib/x86_64-linux-gnu/libunwind.so.8: undefined reference to 'lzma_stream_footer_decode@XZ_5.0'

The fix is to append this line at the end of your ~/.bashrc:

export LD_LIBRARY_PATH=/lib/x86_64-linux-gnu/:$LD_LIBRARY_PATH
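This kind of error typically means another copy of liblzma (for example, one bundled with Anaconda) appears earlier on the library search path than the system copy that actually exports the versioned XZ_5.0 symbols; prepending the distro's library directory puts the right copy first. A minimal sketch of the fix plus a quick sanity check:

```shell
# Prepend the distro library directory so the system liblzma (which provides
# the XZ_5.0 versioned symbols that libunwind needs) is found first.
export LD_LIBRARY_PATH=/lib/x86_64-linux-gnu/:$LD_LIBRARY_PATH

# Sanity check: the new entry should now lead the search path.
echo "${LD_LIBRARY_PATH%%:*}"
```

After re-sourcing ~/.bashrc, rerunning `make` should link the tools without the undefined-reference errors.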

Some notes about building Caffe RC3 with Mac OS X 10.11.3, Anaconda, CUDA 7.5, cuDNN 4, Intel MKL and MATLAB R2015b

Environment: Mac OS X 10.11.3, Xcode 7.2, Anaconda Python 2.7.11, CUDA 7.5, cuDNN 4, Intel parallel_studio_xe_2016.1.043 MKL, Homebrew Boost 1.6.0, Homebrew OpenCV 2.4.12, MATLAB R2015b.

Assuming the environment above, first follow the official Caffe OS X installation guide step by step. Then modify your .bash_profile as:

export PATH=/usr/local/bin:/Applications/MATLAB_R2015b.app/bin:/Developer/NVIDIA/CUDA-7.5/bin:$PATH
export PATH="/Users/ylzhao/anaconda/bin:$PATH"
export DYLD_LIBRARY_PATH=/Developer/NVIDIA/CUDA-7.5/lib:$DYLD_LIBRARY_PATH
export DYLD_FALLBACK_LIBRARY_PATH=/usr/local/cuda/lib:$HOME/anaconda/lib:/usr/local/lib:/usr/lib:$DYLD_FALLBACK_LIBRARY_PATH
export PYTHONPATH=/usr/local/lib/python

Note 1: If you afterwards run into the warning "ld: warning: directory not found for option '-L/opt/intel/mkl/lib/intel64'", you can solve it as follows:

cd /opt/intel/mkl/lib/
sudo ln -s . /opt/intel/mkl/lib/intel64
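The `ln -s .` fix works because a self-referential symlink makes the missing intel64 subdirectory resolve back to the directory that already holds the MKL libraries, so the `-L/opt/intel/mkl/lib/intel64` flag finds them. A minimal sketch of the same trick in a throwaway directory (the file name is a stand-in, not the real MKL layout):

```shell
# Recreate the trick in a temp dir: after `ln -s . dir/intel64`,
# dir/intel64/<file> resolves to dir/<file>.
tmp=$(mktemp -d)
touch "$tmp/libmkl_rt.dylib"   # stand-in for a real MKL library
ln -s . "$tmp/intel64"         # self-referential symlink, as in Note 1
ls "$tmp/intel64/libmkl_rt.dylib"
```

Any path of the form <dir>/intel64/<lib> now reaches the same files as <dir>/<lib>, which is exactly what the linker flag expects.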