Run inference with tflite-runtime on a Raspberry Pi 4B.
Written by George Soloupis, ML GDE.
This is a post on how you can use the TensorFlow Lite Interpreter inside a single-board computer such as the Raspberry Pi. Because of this computer's limited resources, we can skip the full TensorFlow installation and proceed with the lighter tflite-runtime package. You can find more details about the available versions here.
The version that you install on the Raspberry Pi is critical, because you then have to train the models you are going to use with the same version of TensorFlow. Sometimes the dependencies/prerequisites on the Pi forbid you from installing the latest version. You can update the Python and NumPy packages, though, which will allow you to install the latest software.
You have two options to install tflite-runtime. The first one is to use the terminal and execute:
python3 -m pip install tflite-runtime
This will check the environment and install the appropriate version. In my case, 2.5.0 was the one that got installed.
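If you want to confirm which version actually landed on the device, a quick check is possible from Python. This is a small sketch; it assumes the package exposes a __version__ attribute, which recent tflite-runtime releases do:

```python
# Print the installed tflite-runtime version, if any.
try:
    import tflite_runtime
    print(tflite_runtime.__version__)
except ImportError:
    print("tflite-runtime is not installed")
```

Run this on the Pi itself, since the package is typically not installed on your development machine.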
Another option is to use the very handy IDE named Thonny that ships with the Raspberry Pi OS. Open Tools -> Manage Packages to get a screen where you can view the packages that are already installed and search PyPI for the ones you want. From there you can find information about the different versions of tflite-runtime and install one of them.

After installation you are ready to use the Interpreter inside a Python file. Check some general information about this library at the official site.
In a previous blog post, where we processed logs from simple air sensors with machine learning, we used a Colab notebook. At the very bottom we ran inference with the TFLite Interpreter:
import tensorflow as tf

# Load the TFLite model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path="/content/food_model_250.tflite")
interpreter.allocate_tensors()

# Get input and output tensor details.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print(input_details)
print(output_details)

# 'test_features' holds the preprocessed sensor features from earlier in the notebook.
interpreter.set_tensor(input_details[0]['index'], tf.cast(test_features, tf.float32))
interpreter.invoke()

# The function `get_tensor()` returns a copy of the tensor data.
# Use `tensor()` in order to get a pointer to the tensor.
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)
We are going to use the same procedure on the Raspberry Pi, but instead of 'tf.lite.Interpreter' we are going to use 'tflite_runtime.interpreter.Interpreter'. The data has to be manipulated in the same way, and as a result we get the exact same outcome:
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Convert the collected sensor readings into a (1, N) float32 array.
input_array = []
for item in lst:
    input_array.append(float(item))
test_features = np.array(input_array).astype(np.float32)
test_features = np.expand_dims(test_features, axis=0)

# Load the TFLite model and allocate tensors.
interpreter = Interpreter(model_path="food_model_250.tflite")
interpreter.allocate_tensors()

# Get input and output tensor details.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print(input_details)
print(output_details)

interpreter.set_tensor(input_details[0]['index'], test_features)
interpreter.invoke()

# The function `get_tensor()` returns a copy of the tensor data.
# Use `tensor()` in order to get a pointer to the tensor.
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)
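Getting identical results on both machines depends on feeding the Interpreter a tensor with exactly the same shape and dtype as in the Colab notebook. As a small illustration of the preprocessing step alone (the sensor readings below are made up for the example):

```python
import numpy as np

# Hypothetical raw readings, as strings, the way they might arrive from a sensor.
lst = ["21.5", "48.0", "1013.2"]

# Same preprocessing as above: cast to float32 and add a batch dimension.
test_features = np.array([float(item) for item in lst]).astype(np.float32)
test_features = np.expand_dims(test_features, axis=0)

print(test_features.shape)  # (1, 3)
print(test_features.dtype)  # float32
```

The resulting (1, N) shape matches what a model trained on a single feature vector per example expects as its input tensor.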
Take a look at the sensors_tf_runtime.py file, where we send a command to the Raspberry Pi to collect data and then run inference at the end.
Conclusion
If you only need to run inference on the Raspberry Pi, there is no need to install the full TensorFlow package; you can select the lighter tflite-runtime instead. You must pay attention to which version you have inside the single-board computer, because you have to train your models with the same TensorFlow version. If you follow the procedure exactly and use the same data types, you will receive the exact same result. Stay tuned for the connection of the Pi with an Android phone through Bluetooth, so we can view the results without watching the Raspberry Pi's screen.