hi MUGUNDAN,
OK, if your model is an image-related model, you can use TIDL to compile it directly (or maybe change the pipeline a little bit). If your model is not an image-related model, you have to build the pipeline yourself.
best
Bro, can you please share the steps to follow from scratch for compilation? I have my ONNX model and its prototxt file along with it. What is the actual process? Kindly share the required system setup and the process, and if possible, share some resources you have.
File "tflrt_delegate.py", line 157, in run_model
  experimental_delegates=[tflite.load_delegate(os.path.join(tidl_tools_path, 'tidl_model_import_tflite.so'), delegate_options)])
File "/home/mugu1234/.local/lib/python3.6/site-packages/tflite_runtime/interpreter.py", line 175, in load_delegate
  delegate = Delegate(library, options)
File "/home/mugu1234/.local/lib/python3.6/site-packages/tflite_runtime/interpreter.py", line 83, in __init__
  self._library = ctypes.pydll.LoadLibrary(library)
File "/usr/lib/python3.6/ctypes/__init__.py", line 426, in LoadLibrary
  return self._dlltype(name)
File "/usr/lib/python3.6/ctypes/__init__.py", line 348, in __init__
  self._handle = _dlopen(self._name, mode)
OSError: /home/mugu1234/tidl_tools/tidl_model_import_tflite.so: cannot open shared object file: No such file or directory
Exception ignored in: <bound method Delegate.__del__ of <tflite_runtime.interpreter.Delegate object at 0x7f800da5ce80>>
Traceback (most recent call last):
  File "/home/mugu1234/.local/lib/python3.6/site-packages/tflite_runtime/interpreter.py", line 118, in __del__
    if self._library is not None:
AttributeError: 'Delegate' object has no attribute '_library'
I got these errors when I ran the sample compilation.
From this problem, it seems you haven't finished setting up TIDL. You have to follow the steps from here, and make sure your BBAI-64 Edge SDK version is the same as the TIDL version; normally the BBAI-64 uses Edge SDK r8.2. After you successfully set up the TIDL environment, you will get the delegate files (x86). But since you use ONNX, you should not use the TFLite delegate; you have to change the pipeline for the ONNX delegate in the source code, as in the example. You can think of the delegate file as a compiler (sadly it is a closed-source binary) which automatically analyses your model layer by layer and generates the artifacts folder. With this artifacts folder, the model can successfully run inference on the DSP of the BBAI-64.
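For ONNX, that pipeline boils down to opening an onnxruntime session with the TIDL compilation provider instead of the TFLite delegate. Here is a minimal sketch, roughly following the osrt_python examples in edgeai-tidl-tools; it needs the TI-built onnxruntime wheel from the setup step (not the stock one), and the option names and paths are assumptions from my own setup, so verify them against the repo version you use:

import os
import onnxruntime as rt

# Assumed paths; point these at your own tools and model.
tidl_tools_path = os.environ["TIDL_TOOLS_PATH"]

delegate_options = {
    "tidl_tools_path": tidl_tools_path,
    "artifacts_folder": "./model-artifacts/my_model",  # compiled artifacts land here
    "tensor_bits": 8,
    "debug_level": 1,
}

so = rt.SessionOptions()

# The compilation provider walks the graph, offloads the layers it supports,
# and writes the artifacts folder; unsupported layers fall back to the CPU provider.
sess = rt.InferenceSession(
    "my_model.onnx",
    providers=["TIDLCompilationProvider", "CPUExecutionProvider"],
    provider_options=[delegate_options, {}],
    sess_options=so,
)

input_name = sess.get_inputs()[0].name
# Feed a few representative inputs so TIDL can calibrate quantization ranges,
# e.g. sess.run(None, {input_name: calibration_frame})

On the board you then open the same model with "TIDLExecutionProvider" pointing at the generated artifacts folder.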
mugu1234@ubuntu:~/edgeai-tidl-tools$ export DEVICE=am62
mugu1234@ubuntu:~/edgeai-tidl-tools$ source ./setup.sh --load_armnn
X64 Architecture
Installing python packages…
Defaulting to user installation because normal site-packages is not writeable
Collecting git+https://github.com/kumardesappan/caffe2onnx (from -r ./requirements_pc.txt (line 12))
Cloning https://github.com/kumardesappan/caffe2onnx to /tmp/pip-req-build-tha84hrh
Running command git clone --filter=blob:none --quiet https://github.com/kumardesappan/caffe2onnx /tmp/pip-req-build-tha84hrh
Resolved https://github.com/kumardesappan/caffe2onnx to commit b7e73feed3bbc5ddbdf25b87af93a2bae596055d
Preparing metadata (setup.py) … done
Collecting dlr==1.10.0 (from -r ./requirements_pc.txt (line 13))
Using cached https://software-dl.ti.com/jacinto7/esd/tidl-tools/08_05_00_00/ubuntu18_04_x86_64/pywhl/dlr-1.10.0-py3-none-any.whl (1.2 MB)
ERROR: tvm-0.9.dev0-cp36-cp36m-linux_x86_64.whl is not a supported wheel on this platform.
What is your PC's CPU platform? It only supports x86-64. And which version of TIDL do you use?
OK, the whl is for Python 3.6; you should create a Python 3.6 venv and then run the setup inside it.
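Something like this (a sketch; the venv name is just an example, and I assume python3.6 is installed on your PC):

python3.6 -m venv tidl-py36
source tidl-py36/bin/activate
cd edgeai-tidl-tools
export DEVICE=am62
source ./setup.sh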
I am doing a similar task to you; my model is a speech recognition model. But anyway, if you can successfully fix the pipeline, the model import delegate will generate the artifacts for you. Make sure your model's layers are supported by TIDL; if not, you have to add those layers to the deny_list. You can set debug_level = 4 to check.
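Both knobs go into the same delegate options dict as in the sketch above; the exact key names may differ slightly between TIDL versions, so treat this as an assumption to double-check:

delegate_options["debug_level"] = 4  # verbose per-layer import log on the console
delegate_options["deny_list"] = "Squeeze,Reshape"  # example op types forced to stay on the Arm core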
Happy to see your response. I have completed the setup and I can see that TFLite compilation and inference work smoothly. Can you please share a custom script for compiling my ONNX model?
hi, you can have a look at this script: https://github.com/TexasInstruments/edgeai-tidl-tools/blob/master/examples/osrt_python/ort/onnxrt_ep.py
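If I remember right, that script has a compile-mode flag, so after adding your model to the model configs you run it roughly like: python3 onnxrt_ep.py -c (check the repo README for the exact invocation in your version).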
Do I have to make a new setup for this, or is the existing setup enough?
no, you only need to set it up once; just confirm the version is the same as your device SDK and that is enough