
Looking for pthread.h - not found #10

Closed
zubairahmed-ai opened this issue Jan 31, 2019 · 22 comments

@zubairahmed-ai

CMakeError.log
CMakeOutput.log

I am getting the errors below:

CMake Error at CMakeLists.txt:131 (add_subdirectory):
The source directory E:/Documents/Projects/neo-ai-dlr/3rdparty/neo-ai-treelite does not contain a CMakeLists.txt file.

I tried a git clone of neo-ai-treelite into that directory, but I still get this error:

-- Looking for pthread.h
-- Looking for pthread.h - not found
-- Found Threads: TRUE
-- TVM_RUNTIME_LINKER_LIBS:
-- Configuring incomplete, errors occurred!
See also "E:/Documents/Projects/neo-ai-dlr/build/CMakeFiles/CMakeOutput.log".
See also "E:/Documents/Projects/neo-ai-dlr/build/CMakeFiles/CMakeError.log".

Not sure how to fix this
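For reference, "does not contain a CMakeLists.txt file" for a 3rdparty directory usually means the project's git submodules were never fetched; cloning the dependency by hand doesn't register it with the superproject. Assuming neo-ai-dlr manages its 3rdparty dependencies as submodules (which the error suggests), the situation can be reproduced and fixed with two throwaway local repos, no network needed:

```shell
set -e
cd "$(mktemp -d)"

# Stand-in for the nested dependency (think: neo-ai-treelite)
git init -q dep
echo 'project(dep)' > dep/CMakeLists.txt
git -C dep add CMakeLists.txt
git -C dep -c user.email=a@b -c user.name=a commit -qm 'init'

# Stand-in for the main project, embedding dep as a submodule
git init -q main
cd main
git -c protocol.file.allow=always submodule add ../dep 3rdparty/dep
git -c user.email=a@b -c user.name=a commit -qm 'add submodule'
cd ..

# A plain (non-recursive) clone leaves 3rdparty/dep as an empty
# directory -- the same state CMake complains about
git clone -q main copy
cd copy
test ! -f 3rdparty/dep/CMakeLists.txt

# The fix (or use 'git clone --recursive' in the first place):
git -c protocol.file.allow=always submodule update --init --recursive
test -f 3rdparty/dep/CMakeLists.txt
```

The `--recursive` flag matters here because the log shows a submodule inside a submodule (fmt inside neo-ai-treelite); a plain `submodule update --init` would leave the nested ones empty.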

@zubairahmed-ai
Author

Even after that, my cmake build isn't successful, and I got these errors:

(tfod) E:\Documents\Projects\neo-ai-dlr\build>cmake ..; make -j4; cd ..
-- Building for: Visual Studio 15 2017
-- Selecting Windows SDK version 10.0.16299.0 to target Windows 10.0.17134.
-- The C compiler identification is MSVC 19.13.26129.0
-- The CXX compiler identification is MSVC 19.13.26129.0
-- Check for working C compiler: D:/ProgramData/VisualStudio2017/BuildTools/VC/Tools/MSVC/14.13.26128/bin/Hostx86/x86/cl.exe
-- Check for working C compiler: D:/ProgramData/VisualStudio2017/BuildTools/VC/Tools/MSVC/14.13.26128/bin/Hostx86/x86/cl.exe -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: D:/ProgramData/VisualStudio2017/BuildTools/VC/Tools/MSVC/14.13.26128/bin/Hostx86/x86/cl.exe
-- Check for working CXX compiler: D:/ProgramData/VisualStudio2017/BuildTools/VC/Tools/MSVC/14.13.26128/bin/Hostx86/x86/cl.exe -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build with RPC support...
-- Build with Graph runtime support...
-- VTA build is skipped in Windows..
-- Found OpenMP_C: -openmp (found version "2.0")
-- Found OpenMP_CXX: -openmp (found version "2.0")
-- Found OpenMP: TRUE (found version "2.0")
CMake Error at 3rdparty/neo-ai-treelite/CMakeLists.txt:87 (add_subdirectory):
The source directory

E:/Documents/Projects/neo-ai-dlr/3rdparty/neo-ai-treelite/3rdparty/fmt

does not contain a CMakeLists.txt file.

-- Looking for clock_gettime in rt
-- Looking for clock_gettime in rt - not found
-- Looking for fopen64
-- Looking for fopen64 - not found
-- Looking for C++ include cxxabi.h
-- Looking for C++ include cxxabi.h - not found
-- Looking for nanosleep
-- Looking for nanosleep - not found
-- Looking for backtrace
-- Looking for backtrace - not found
-- Could NOT find Backtrace (missing: Backtrace_LIBRARY Backtrace_INCLUDE_DIR)
-- Check if the system is big endian
-- Searching 16 bit integer
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking for stdint.h - found
-- Looking for stddef.h
-- Looking for stddef.h - found
-- Check size of unsigned short
-- Check size of unsigned short - done
-- Using unsigned short
-- Check if the system is big endian - little endian
-- E:/Documents/Projects/neo-ai-dlr/3rdparty/neo-ai-treelite/dmlc-core/cmake/build_config.h.in -> E:/Documents/Projects/neo-ai-dlr/3rdparty/neo-ai-treelite/dmlc-core/include/dmlc/build_config.h
-- Found PythonInterp: D:/ProgramData/Anaconda3/envs/tfod/python.exe (found version "3.6.5")
CMake Error at 3rdparty/neo-ai-treelite/runtime/native/CMakeLists.txt:94 (install):
install Library TARGETS given no DESTINATION!

-- Could NOT find JNI (missing: JAVA_AWT_LIBRARY JAVA_JVM_LIBRARY JAVA_INCLUDE_PATH JAVA_INCLUDE_PATH2 JAVA_AWT_INCLUDE_PATH)
CMake Error at 3rdparty/neo-ai-treelite/CMakeLists.txt:142 (install):
install Library TARGETS given no DESTINATION!

-- Looking for pthread.h
-- Looking for pthread.h - not found
-- Found Threads: TRUE
-- TVM_RUNTIME_LINKER_LIBS:
-- Configuring incomplete, errors occurred!
See also "E:/Documents/Projects/neo-ai-dlr/build/CMakeFiles/CMakeOutput.log".
See also "E:/Documents/Projects/neo-ai-dlr/build/CMakeFiles/CMakeError.log".

I tried running the python setup.py install command, but I got the following error:

(tfod) E:\Documents\Projects\neo-ai-dlr\python>python setup.py install
D:\ProgramData\Anaconda3\envs\tfod\lib\distutils\dist.py:261: UserWarning: Unknown distribution option: 'root_script_source_version'
warnings.warn(msg)
D:\ProgramData\Anaconda3\envs\tfod\lib\distutils\dist.py:261: UserWarning: Unknown distribution option: 'default_python'
warnings.warn(msg)
running install
running bdist_egg
running egg_info
writing dlr.egg-info\PKG-INFO
writing dependency_links to dlr.egg-info\dependency_links.txt
writing requirements to dlr.egg-info\requires.txt
writing top-level names to dlr.egg-info\top_level.txt
reading manifest file 'dlr.egg-info\SOURCES.txt'
writing manifest file 'dlr.egg-info\SOURCES.txt'
installing library code to build\bdist.win-amd64\egg
running install_lib
running build_py
running build_ext
installing package data to build\bdist.win-amd64\egg
running install_data
error: can't copy '..\build\lib\libdlr.dll': doesn't exist or not a regular file

@zubairahmed-ai
Author

I still got this error after you fixed issue #3:

-- Could NOT find JNI (missing: JAVA_AWT_LIBRARY JAVA_JVM_LIBRARY JAVA_INCLUDE_PATH JAVA_INCLUDE_PATH2 JAVA_AWT_INCLUDE_PATH)
CMake Error at 3rdparty/neo-ai-treelite/CMakeLists.txt:142 (install):
install Library TARGETS given no DESTINATION!

-- Looking for pthread.h
-- Looking for pthread.h - not found
-- Found Threads: TRUE
-- TVM_RUNTIME_LINKER_LIBS:
-- Configuring incomplete, errors occurred!

Please see these logs:

CMakeError.log
CMakeOutput.log

@hcho3
Contributor

hcho3 commented Feb 1, 2019

@zubairahmed-ai POSIX Threads (pthread.h) is not available on Windows. We need to locate where it is used and replace it with a cross-platform alternative.

@hcho3
Contributor

hcho3 commented Feb 1, 2019

Also, this line won't work on Windows:

libpath = os.path.join(os.path.dirname(
    os.path.abspath(os.path.expanduser(__file__))), 'libdlr.so')

TODO: test DLR on Windows

@zubairahmed-ai
Author

Did you see that your colleague fixed the libdir issue? Is this related? #3

@hcho3
Contributor

hcho3 commented Feb 2, 2019

No, this is a different issue.

@zubairahmed-ai
Author

Alright, so please test it on Windows; I really need to have it working there. Thanks

@zubairahmed-ai
Author

Any update on this?

@hcho3
Contributor

hcho3 commented Feb 5, 2019

Working on it.

@hcho3
Contributor

hcho3 commented Feb 5, 2019

@zubairahmed-ai I've submitted two patches to get Windows builds working: neo-ai/treelite#1, neo-ai/tvm#1. With these patches, I was able to build dlr.dll on my Windows machine.

Once the patches are reviewed and merged, we'll let you know.

@hcho3 hcho3 mentioned this issue Feb 5, 2019
@zubairahmed-ai
Author

Sweet. Please do update the instructions for using this on Windows, if required.

@hcho3
Contributor

hcho3 commented Feb 5, 2019

@zubairahmed-ai How do you plan to compile ML models? (Keep in mind that DLR is a runtime only, so compilation will need to occur somewhere.) Currently, Amazon SageMaker Neo only supports Linux targets:

(Screenshot: Amazon SageMaker Neo supported target platforms, 2019-02-04)

So to compile ML models to run on Windows machines, we'll need to use TVM and Treelite directly. Would you be able to explain your potential use cases, so that we can provide some pointers on how to proceed with compilation?

@zubairahmed-ai
Author

Thanks for the explanation. Is it possible to use these models on mobile devices?
I see that Raspberry Pi is supported along with Jetson and DeepLens; is that all from the edge devices list?

@hcho3
Contributor

hcho3 commented Feb 5, 2019

@zubairahmed-ai

I see that Raspberry Pi is supported along with Jetson and DeepLens, is that all from the edge devices list?

Yes, we currently support 4 edge device targets (Jetson TX1/TX2, DeepLens, Raspberry Pi 3B). Edge devices generally run a variant of Linux, so we have been focusing mainly on Linux support.

is it possible to use these models on mobile devices?

Not yet. Running ML models on mobile devices requires close collaboration with phone makers. There is active effort under way toward this end.

@zubairahmed-ai
Author

Alright, makes sense. Then I will try running them on Windows itself with TVM and Treelite; please do let me know how to do that exactly.

@hcho3
Contributor

hcho3 commented Feb 5, 2019

Treelite lets you compile XGBoost (decision tree ensemble) models in a few lines of code:

import treelite
model = treelite.Model.load('my_model.model', model_format='xgboost')
model.export_lib(toolchain='msvc', libpath='./mymodel.dll', verbose=True)

(from Treelite Quick Start)

TVM lets you compile deep learning models written in MXNet, TensorFlow, etc. Here is the link for tutorials. Compiling deep learning models is considerably more complex than compiling XGBoost models, and not all models may be supported (due to the existence of many DL operators). Which DL framework are you using, and what kind of DL architecture are you planning to use? In general, computer vision models are well supported.

@zubairahmed-ai
Author

zubairahmed-ai commented Feb 5, 2019

I am mostly using SSD MobileNet v2, but the one I'm struggling with most is RetinaNet because it's huge in size; would this be supported?

I used Keras with a TensorFlow backend to train RetinaNet, and SSD MobileNet v2 is trained with TensorFlow using its configuration file.

Thanks for the links

@hcho3
Contributor

hcho3 commented Feb 5, 2019

Thanks for your reply. I will reach out to my teammates and get back to you regarding SSD and RetinaNet support.

@hcho3
Contributor

hcho3 commented Feb 5, 2019

@zubairahmed-ai To answer your question:

  • TensorFlow models with SSD (Single-Shot Detector) are currently not supported in upstream TVM, since SSD in TensorFlow uses control flow such as loops. Note that TVM does support SSD models trained with MXNet. Support for TensorFlow SSD is in the works.
  • RetinaNet uses a custom loss called focal loss, which TVM currently does not implement, so RetinaNet won't work out of the box with TVM. To compile RetinaNet with TVM, you'll need to write the focal loss function in Python and register it in TVM; see the tutorial for hybrid script. If you decide to go this route, I suggest you use the TVM user forum to reach out to the TVM community for help.

@hcho3 hcho3 mentioned this issue Feb 6, 2019
@zubairahmed-ai
Author

Tough luck.
Alright, I will check it out.

@hcho3 hcho3 closed this as completed in #17 Feb 7, 2019
hcho3 added a commit that referenced this issue Feb 7, 2019
* Fix Windows build

Fixes #10.

* Use correct paths for model artifacts

* Further refine library detection logic

* Ignore subdirectories in model artifact directory, don't throw
@hcho3
Contributor

hcho3 commented Feb 7, 2019

Resolving this now, since Windows support has been added. Will follow up with another PR to add build instructions for Windows.

@zubairahmed-ai
Author

Thanks
