TensorFlow Lite is TensorFlow’s lightweight solution for mobile and embedded devices. It allows you to run machine learning models on edge devices with low latency, eliminating the need for a server.
Once a TensorFlow model has been developed, we can convert it into a smaller, more efficient TFLite format. Let's explore how.
Conversion Workflow of TensorFlow Lite

Conversion Evaluation of TensorFlow Lite
Before converting a model, we should check a few things:
- The operations in your model are compatible with the TFLite format (a quick compatibility check is sketched after this list).
- The model's size and complexity are reasonable for the target device.
- The model still performs well enough to be used on mobile and edge devices.
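As a quick compatibility check, you can attempt a conversion that only allows the built-in TFLite operator set; if the converter raises an error, the model contains operations that TFLite does not support natively. A minimal sketch, assuming the model has already been exported to a SavedModel directory (the saved_model_dir path here is hypothetical):

import tensorflow as tf

saved_model_dir = '/tmp/saved_model'  # hypothetical path to an exported SavedModel

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
# Restrict conversion to built-in TFLite ops; unsupported ops make convert() fail.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]

try:
    tflite_model = converter.convert()
    print('Compatible with built-in TFLite ops; converted size:', len(tflite_model), 'bytes')
except Exception as err:
    print('Conversion failed, likely due to unsupported ops:', err)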
Conversion Types for TensorFlow Lite
There are two different ways we can convert our model –
1. Python API
This allows you to integrate the conversion into your development pipeline, apply optimizations (a sketch follows the three examples below), add metadata, and do many other tasks that simplify the conversion process.
There are three different ways we can use the TFLite converter with the Python API:
- Convert a TF SavedModel to TF Lite (tf.lite.TFLiteConverter.from_saved_model())
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_converted_model = converter.convert()

with open('trained_model.tflite', 'wb') as f:
    f.write(tflite_converted_model)
- Convert a Keras model to TF Lite (tf.lite.TFLiteConverter.from_keras_model())
import tensorflow as tf

trained_model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(units=1, input_shape=[1]),
    tf.keras.layers.Dense(units=16, activation='relu'),
    tf.keras.layers.Dense(units=1)
])
trained_model.compile(optimizer='sgd', loss='mean_squared_error')
trained_model.fit(x=[-1, 0, 1], y=[-3, -1, 1], epochs=5)

converter = tf.lite.TFLiteConverter.from_keras_model(trained_model)
tflite_converted_model = converter.convert()

with open('trained_model.tflite', 'wb') as f:
    f.write(tflite_converted_model)
- Convert a concrete function to TF Lite (tf.lite.TFLiteConverter.from_concrete_functions())
import tensorflow as tf

class Squared(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def __call__(self, x):
        return tf.square(x)

model = Squared()
concrete_func = model.__call__.get_concrete_function()

converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], model)
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
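Whichever of the three approaches you use, the converter can also apply optimizations during conversion, as mentioned above. A minimal sketch of post-training dynamic-range quantization, assuming the trained_model from the Keras example:

import tensorflow as tf

# Assumes trained_model is the Keras model built in the example above.
converter = tf.lite.TFLiteConverter.from_keras_model(trained_model)

# Default post-training optimization (dynamic-range quantization) usually
# shrinks the file at a small cost in accuracy.
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_quantized_model = converter.convert()
with open('trained_model_quantized.tflite', 'wb') as f:
    f.write(tflite_quantized_model)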
2. Command-line Tool
The tflite_convert command-line tool only supports basic model conversion; for optimizations or metadata, use the Python API.
- Convert a TF SavedModel to TF Lite
tflite_convert \
  --saved_model_dir=/tmp/mobilenet_saved_model \
  --output_file=/tmp/mobilenet.tflite
- Convert a Keras model to TF Lite
tflite_convert \
  --keras_model_file=/tmp/mobilenet_keras_model.h5 \
  --output_file=/tmp/mobilenet.tflite
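Whichever way you convert, it is worth checking that the resulting .tflite file loads and produces sensible outputs before deploying it. A minimal sketch using the TFLite interpreter, assuming the model.tflite file produced by the concrete-function example above (a single float32 input):

import numpy as np
import tensorflow as tf

# Load the converted model and allocate its tensors.
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a sample input that matches the expected shape and dtype.
sample = np.array([3.0], dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], sample)
interpreter.invoke()

# For the Squared model this should print something close to [9.].
print(interpreter.get_tensor(output_details[0]['index']))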
Conclusion
In this blog, we learned how to convert TensorFlow models to TFLite models using different approaches. I hope you found it a useful read.