BBAI-64 building TensorFlow Lite custom model artifacts for libtidl_tfl_delegate

NB: I’m running the bbai64-emmc-flasher-debian-11.8-xfce-edgeai-arm64-2023-10-07-10gb snapshot on my BeagleBone AI-64, from here

sudo apt-get install libyaml-cpp-dev
sudo apt-get install cmake
export SOC=am68pa
source ./setup.sh
  • Added my model to models_configs in common_utils.py:
'my_model': {
	'model_path': os.path.join(models_base_path, 'saved_model.tflite'),
	'source': {'model_url': '/path_to_my_model/saved_model.tflite', 'opt': True},
	'mean': [0, 0, 0],
	'scale': [1, 1, 1],
	'num_images': numImages,
	'num_classes': 4,
	'session_name': 'tflitert',
	'model_type': 'classification'
},
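For classification models, the mean and scale entries are applied per channel before inference, roughly as (value - mean) * scale. The helper below is an illustrative sketch of that convention (not TI’s actual code); with mean [0, 0, 0] and scale [1, 1, 1], as above, it’s a no-op and the model sees the raw input unchanged:

```python
MEAN = [0, 0, 0]   # per-channel mean from the model config
SCALE = [1, 1, 1]  # per-channel scale from the model config

def preprocess(pixel, mean=MEAN, scale=SCALE):
    """Per-channel normalization: (value - mean) * scale."""
    return [(v - m) * s for v, m, s in zip(pixel, mean, scale)]

# With mean=[0,0,0] and scale=[1,1,1], input passes through unchanged.
```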
  • Installed the Python requirements:
pip install -r requirements_pc.txt
  • Worked around the usual Python “we never heard of backward compatibility” dependency issues.
  • Set the models list in examples/osrt_python/tfl/tflrt_delegate.py to:
models = ['my_model']
  • And ran the compilation script:
cd examples/osrt_python/tfl/
python3 tflrt_delegate.py -c
  • Got my_model artifacts in the model-artifacts folder:
177_tidl_io_1.bin
177_tidl_net.bin
allowedNode.txt
param.yaml
saved_model.tflite
  • Added those files to my repo.
  • Updated my initModel function in main.go to point at those files.
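For reference, loading the compiled artifacts back on the BBAI-64 with the TIDL delegate looks roughly like the sketch below. The model-artifacts/my_model path is a hypothetical layout, and the 'artifacts_folder' option key is an assumption based on TI’s edgeai-tidl-tools OSRT examples; adjust both to your setup. The file-set check is just a sanity helper over the artifact list above:

```python
import os

# Hypothetical layout: adjust to wherever you copied the compiled artifacts.
ARTIFACTS_DIR = "model-artifacts/my_model"
MODEL_PATH = os.path.join(ARTIFACTS_DIR, "saved_model.tflite")

# Files the compile step (tflrt_delegate.py -c) produced for my_model.
EXPECTED = {
    "177_tidl_io_1.bin",
    "177_tidl_net.bin",
    "allowedNode.txt",
    "param.yaml",
    "saved_model.tflite",
}

def artifacts_complete(folder):
    """True if every expected artifact file is present in folder."""
    return os.path.isdir(folder) and EXPECTED.issubset(os.listdir(folder))

def make_interpreter(model_path=MODEL_PATH, artifacts_dir=ARTIFACTS_DIR):
    """Build a TFLite interpreter that offloads supported ops via TIDL.

    Only works on the target with tflite_runtime and libtidl_tfl_delegate.so
    installed; the 'artifacts_folder' option name is an assumption taken
    from TI's edgeai-tidl-tools examples.
    """
    import tflite_runtime.interpreter as tflite  # target-only import
    delegate = tflite.load_delegate(
        "libtidl_tfl_delegate.so",
        {"artifacts_folder": artifacts_dir},
    )
    return tflite.Interpreter(
        model_path=model_path,
        experimental_delegates=[delegate],
    )
```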