TensorFlow Lite is designed to execute models efficiently on a variety of devices. Part of this efficiency comes from storing models in a special format. A TensorFlow model must be converted to this format before TensorFlow Lite can use it.
Conversion reduces the size of the model file and introduces optimizations that do not affect accuracy. Developers can also choose to shrink the model file further and speed up execution by accepting some trade-offs. The TensorFlow Lite converter lets you select which optimizations to apply.
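As a sketch of how such optimizations are selected (a minimal example, assuming TF 2.x; the tiny model here is only a placeholder):

```python
import tensorflow as tf

# Minimal sketch, assuming TF 2.x: build a tiny placeholder Keras model and
# convert it with the default optimization set (e.g. post-training
# quantization) enabled on the converter.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # opt into optimizations
tflite_model = converter.convert()  # bytes of the optimized .tflite model
```

Without the `optimizations` line the converter performs a plain conversion; setting it lets the converter trade a little accuracy for a smaller, faster model.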
Because TensorFlow Lite supports only a subset of TensorFlow operators, not all models can be converted.
The TensorFlow Lite converter is a tool that converts a trained TensorFlow model into the TensorFlow Lite format; it can also apply optimizations along the way.
The converter is provided as a Python API. Although it can also be used from the command line, the Python API is recommended for conversion.
The API provided by the TensorFlow Lite converter can be inspected with the help function:

```python
import tensorflow as tf

print("TensorFlow Lite Converter:")
help(tf.lite.TFLiteConverter)
```
The following shows four ways to convert a model to TensorFlow Lite.
1. Convert a SavedModel to a .tflite model
```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
```
Here, saved_model_dir is the path of the directory containing the SavedModel.
2. Import the converter from a Session
```python
import tensorflow as tf

input1 = tf.placeholder(name="input1", dtype=tf.float32, shape=(1, 32, 32, 3))  # Input 1
input2 = tf.get_variable("input2", dtype=tf.float32, shape=(1, 32, 32, 3))  # Input 2
output = input1 + input2  # Addition operation
out = tf.identity(output, name="out")  # Create output
with tf.Session() as sess:  # Create session
    sess.run(tf.global_variables_initializer())  # Initialization
    converter = tf.lite.TFLiteConverter.from_session(sess, [input1], [out])  # Create converter
    tflite_model = converter.convert()  # Export the model
    open("saveToTFLite.tflite", "wb").write(tflite_model)  # Save the model
```
3. Convert a pb model to a .tflite model
```python
import tensorflow as tf

graph_def_file = "/model/mobilenet_v1_1.0_224/frozen_graph.pb"  # pb model path
input_arrays = ["input"]  # Model input
output_arrays = ["MobilenetV1/Predictions/Softmax"]  # Model output
converter = tf.lite.TFLiteConverter.from_frozen_graph(  # Create converter
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()
open("converted_model.tflite", "wb").write(tflite_model)  # Save the model
```
4. Convert an h5 model to a .tflite model
```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_keras_model_file("mobilenet_slim.h5")
tflite_model = converter.convert()
open("mobilenet_slim.tflite", "wb").write(tflite_model)
```
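Note that `from_keras_model_file` belongs to the TF 1.x converter API. In TF 2.x the Keras model is loaded first and passed to `from_keras_model` instead; a hedged sketch, in which the tiny model saved below only stands in for a real file such as mobilenet_slim.h5:

```python
import tensorflow as tf

# Sketch of the TF 2.x equivalent: save a tiny placeholder model as HDF5,
# load it back, and convert it with from_keras_model.
tiny = tf.keras.Sequential([
    tf.keras.Input(shape=(5,)),
    tf.keras.layers.Dense(3),
])
tiny.save("tiny.h5")  # placeholder for a real .h5 model file

model = tf.keras.models.load_model("tiny.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
open("tiny.tflite", "wb").write(tflite_model)
```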
After conversion, the model can be used for inference deployment. For a detailed description of the code, please refer to the book below.
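As a sketch of what inference deployment looks like (assuming TF 2.x; the tiny in-memory model is a placeholder for a converted model file):

```python
import numpy as np
import tensorflow as tf

# Convert a tiny placeholder model in memory, then run it with the TFLite
# Interpreter, the runtime used for inference deployment.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed random data shaped like the model input and read back the result.
data = np.random.random_sample(input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], data)
interpreter.invoke()
result = interpreter.get_tensor(output_details[0]["index"])
```

For a model on disk, `tf.lite.Interpreter(model_path="model.tflite")` loads the file directly instead of passing the bytes.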
The content shared in this article comes from the book Deep Learning and Computer Vision in Practice, co-authored by researcher Liu Dong and Xiao Ling and published by Publishing House of Electronics Industry.
The book takes readers from algorithm fundamentals through model training to model deployment in one place. Its cases are explained in detail and have been verified by running them. The book consists of four parts:
Part I (Chapters 1-2) explains the fundamentals and algorithms of deep learning and computer vision; Part II (Chapters 3-6) presents cases of traditional image processing algorithms;
Part III (Chapters 7-11) presents cases in computer vision; Part IV (Chapters 12-13) explains TensorFlow
Lite (source code analysis, model optimization, model conversion, etc.) and presents deployment cases of TensorFlow Lite on PC and mobile.
The book can serve as a reference for readers getting started with computer vision and for readers who want to deploy models. JD link: https://u.jd.com/rwk3HPT
This section covers model deployment on Windows. Readers who want to deploy models on mobile or Linux are advised to refer to the cases in the book.