
Is Torch faster than Caffe? #6

Closed
nitish11 opened this issue Aug 16, 2016 · 7 comments

Comments

nitish11 commented Aug 16, 2016

Hi,

I am using the gender detection model in both Torch and Caffe for detection from a live camera.

Running the code on the CPU with the same model files, I am getting different prediction times:
For Caffe, it is ~1.30 seconds per frame.
For Torch, it is ~0.45 seconds per frame.

What could be the possible reason for the time difference? Is Torch faster than Caffe?
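(A note on measurement: timing a single frame can be misleading, since the first call often pays one-off initialization costs. A minimal sketch of averaging over many frames, where `predict` is a placeholder for the actual Caffe or Torch forward pass:)

```python
import time

# Minimal sketch: average per-frame prediction time over many frames,
# discarding the first call, which often pays one-off setup costs.
# `predict` is a placeholder for the actual Caffe or Torch forward pass.
def avg_prediction_time(predict, frames):
    predict(frames[0])                      # warm-up call, not timed
    start = time.perf_counter()
    for frame in frames:
        predict(frame)
    return (time.perf_counter() - start) / len(frames)
```

If the gap between the two frameworks survives averaging like this, it is a real throughput difference rather than startup cost.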

admercs commented Aug 17, 2016

LuaJIT is faster than Python, and Torch is generally faster than Caffe in the benchmarks I've seen, though they are a bit outdated: https://github.com/soumith/convnet-benchmarks. I hear Nvidia gives the best support to Torch, which they use for much of their own work (e.g., autonomous car demonstrations), while others like Google and Nervana/Intel compete on hardware. cuDNN 5 also speeds up Torch quite a bit: https://devblogs.nvidia.com/parallelforall/optimizing-recurrent-neural-networks-cudnn-5/.

nitish11 (author) commented Aug 17, 2016

@adam-erickson : 👍 thanks.

But the huge difference on just a single frame is the concern, and that benchmark does not compare LuaJIT with Caffe.

GilLevi (owner) commented Aug 17, 2016

Hi @nitish11 ,

Thanks for your interest in our work and thanks for this cool repository:
https://github.com/nitish11/GenderRecognition

It's embarrassing to admit, but I have never worked with Torch, so I really can't say which one is faster (or, in this case, why you get faster run-times in Torch).

Best,
Gil.

nitish11 (author) commented:

I checked which BLAS library (the computation engine) Torch and Caffe are linked against.

On Ubuntu 14.04:

ldd /home/nitish/caffe/build/lib/libcaffe.so
ldd /home/nitish/torch/install/lib/libTH.so

From the output, I observed that Torch is linked against OpenBLAS while Caffe is linked against libcblas, which might be the reason for the slower Caffe timings.

Solution: build Caffe with OpenBLAS.
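For reference, a sketch of that fix in Caffe's Makefile.config (assuming OpenBLAS is installed, e.g. via `sudo apt-get install libopenblas-dev` on Ubuntu 14.04):

```makefile
# Makefile.config: switch Caffe's BLAS backend to OpenBLAS
BLAS := open
# If OpenBLAS lives in a non-standard prefix, also point Caffe at it:
# BLAS_INCLUDE := /opt/OpenBLAS/include
# BLAS_LIB := /opt/OpenBLAS/lib
```

Then rebuild (`make clean && make all`) and re-run the `ldd` check above to confirm libcaffe.so now links against libopenblas.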

admercs commented Aug 18, 2016

Are you sure it's not simply the difference in looping speed between LuaJIT and Python? It can be quite large. Similar to Julia, LuaJIT is closer to C.
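(For context: the kind of loop that shows this gap is plain interpreted arithmetic with no BLAS involved. A minimal Python sketch; the equivalent Lua loop, `for i = 0, n - 1 do total = total + i end`, is JIT-compiled to machine code by LuaJIT, while CPython runs it through its bytecode interpreter:)

```python
import time

# Pure-interpreter arithmetic loop: no BLAS, no vectorization.
# Per-frame glue code like this (pre/post-processing around the model
# call) is where LuaJIT and CPython speeds diverge the most.
def loop_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total

start = time.perf_counter()
result = loop_sum(1_000_000)
elapsed = time.perf_counter() - start
print(f"sum={result}, elapsed={elapsed:.3f}s")
```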

nitish11 (author) commented:

I am not sure about looping speed between LuaJIT and Python.

MohanaRC commented Nov 9, 2016

There is a better way to call Torch from Python code, using a wrapper.
Please check
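(The specific wrapper library this comment refers to is not named. One simple, dependency-free pattern is shelling out to the Torch CLI (`th`) from Python; a minimal sketch, where `predict.lua` is a hypothetical script that prints its prediction to stdout:)

```python
import subprocess

def run_torch_script(cmd):
    """Run an external command and return its stdout as a string.

    For Torch this would look like cmd=["th", "predict.lua", "frame.jpg"],
    where predict.lua is a hypothetical script printing its prediction.
    """
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Hypothetical usage (requires a Torch install on PATH):
# print(run_torch_script(["th", "predict.lua", "frame.jpg"]))
```

Note that a subprocess per frame adds startup overhead, so in-process wrappers are preferable for live-camera use.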
