This is the last story of this series.
Export trained model (checkpoint) to TensorFlow graph
If you have successfully trained your model, you should have checkpoint files in the model_dir directory (the directory you set in the training command). Each checkpoint consists of three files (data, index, meta) per save, distinguished by their extensions. A sample checkpoint is shown below; this one was created after 43,450 training steps.
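The three files of a checkpoint share a common prefix that ends in the step number (e.g. model.ckpt-43450.data-00000-of-00001, model.ckpt-43450.index, model.ckpt-43450.meta). As a small illustrative sketch (the filenames below are made up, and latest_ckpt_prefix is my own helper, not part of TensorFlow), this is how the newest checkpoint prefix can be picked out of a directory listing:

```python
import re

# Illustrative filenames; your model_dir will contain similar triples.
files = [
    "model.ckpt-43450.data-00000-of-00001",
    "model.ckpt-43450.index",
    "model.ckpt-43450.meta",
    "model.ckpt-50000.data-00000-of-00001",
    "model.ckpt-50000.index",
    "model.ckpt-50000.meta",
    "checkpoint",  # bookkeeping file that records the latest prefix
]

def latest_ckpt_prefix(filenames):
    """Return the checkpoint prefix with the highest step number."""
    steps = set()
    for name in filenames:
        m = re.match(r"(model\.ckpt-(\d+))\.(data-\d+-of-\d+|index|meta)$", name)
        if m:
            steps.add((int(m.group(2)), m.group(1)))
    return max(steps)[1] if steps else None

print(latest_ckpt_prefix(files))  # → model.ckpt-50000
```

In practice TensorFlow does this bookkeeping for you via the `checkpoint` file, but the step number in the prefix is what you pass as TRAINED_CKPT_PREFIX below.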
Now you should export a frozen TensorFlow graph from the checkpoint files. You can use the command below for this.
INPUT_TYPE=image_tensor
PIPELINE_CONFIG_PATH="object_detection/masknet/facessd_mobilenet_v2_quantized_320x320_open_image_v4/noquant_mask_pipeline.config" # Config file path; this should be the same one you used for training
TRAINED_CKPT_PREFIX="object_detection/pvc/mask_train_model/model.ckpt-50000" # Checkpoint file prefix; this corresponds to MODEL_DIR in the training command
EXPORT_DIR="object_detection/pvc/mask_model_export_test" # Directory where the exported frozen graph is stored
python object_detection/export_inference_graph.py \
    --input_type=${INPUT_TYPE} \
    --pipeline_config_path=${PIPELINE_CONFIG_PATH} \
    --trained_checkpoint_prefix=${TRAINED_CKPT_PREFIX} \
    --output_directory=${EXPORT_DIR}
This will create a frozen_inference_graph.pb file in EXPORT_DIR, which you can now use for model inference. You may use this code to run inference on your test image.
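A minimal sketch of such an inference script, assuming the standard tensor names that export_inference_graph.py gives to its inputs and outputs (image_tensor, detection_boxes, detection_scores, detection_classes, num_detections); run_inference is my own helper name, and TensorFlow is imported lazily inside it:

```python
import numpy as np

def run_inference(graph_path, image):
    """Run one image through a frozen Object Detection API graph.

    `image` is an HxWx3 uint8 numpy array. TensorFlow (v1 compatibility
    mode) is imported lazily, so the helper can be defined without TF.
    """
    import tensorflow.compat.v1 as tf  # assumes TF with the v1 compat API

    # Load the serialized GraphDef from disk.
    with tf.io.gfile.GFile(graph_path, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())

    graph = tf.Graph()
    with graph.as_default():
        tf.import_graph_def(graph_def, name="")

    with tf.Session(graph=graph) as sess:
        # Standard output tensors exported by export_inference_graph.py.
        boxes, scores, classes, num = sess.run(
            ["detection_boxes:0", "detection_scores:0",
             "detection_classes:0", "num_detections:0"],
            feed_dict={"image_tensor:0": np.expand_dims(image, axis=0)},
        )
    return boxes[0], scores[0], classes[0], int(num[0])
```

The returned boxes are in normalized [ymin, xmin, ymax, xmax] coordinates, so scale them by the image height and width before drawing.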
Export TFLite graph
If your ultimate goal is to produce a .tflite file that can run on mobile devices, you need a different export command. It looks very similar to the command above, but uses a different Python script. Also note that your config file must have included the graph_rewriter option during training.
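For reference, quantization-aware training is enabled by a graph_rewriter block at the end of the pipeline config; the delay value below is illustrative, and the field names follow the Object Detection API config proto:

```
graph_rewriter {
  quantization {
    delay: 48000        # start quantizing after this many steps (illustrative)
    weight_bits: 8
    activation_bits: 8
  }
}
```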
PIPELINE_CONFIG_PATH="object_detection/masknet/facessd_mobilenet_v2_quantized_320x320_open_image_v4/mask_pipeline.config"
TRAINED_CKPT_PREFIX="object_detection/pvc/mask_train_model/model.ckpt-50000"
EXPORT_DIR="object_detection/pvc/mask_model_export_tflite"
python object_detection/export_tflite_ssd_graph.py \
    --pipeline_config_path=${PIPELINE_CONFIG_PATH} \
    --trained_checkpoint_prefix=${TRAINED_CKPT_PREFIX} \
    --output_directory=${EXPORT_DIR} \
    --add_postprocessing_op=true
This command will create tflite_graph.pb in EXPORT_DIR. But unlike frozen_inference_graph.pb, the TFLite interpreter cannot run this file directly; it needs one more conversion. (It has been a long process… but we are almost finished!)
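A sketch of that final conversion, using the tflite_convert CLI that ships with TensorFlow 1.x; the input/output array names are the standard ones for SSD graphs exported with --add_postprocessing_op, and the shape and mean/std values below are illustrative for a quantized 320x320 pipeline, so adjust them for your own config:

```shell
# Illustrative conversion of tflite_graph.pb to a runnable detect.tflite.
tflite_convert \
  --graph_def_file=object_detection/pvc/mask_model_export_tflite/tflite_graph.pb \
  --output_file=object_detection/pvc/mask_model_export_tflite/detect.tflite \
  --input_shapes=1,320,320,3 \
  --input_arrays=normalized_input_image_tensor \
  --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --inference_type=QUANTIZED_UINT8 \
  --mean_values=128 \
  --std_dev_values=128 \
  --allow_custom_ops
```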
You need to run TOCO (TensorFlow Lite Optimizing COnverter) to finally get a runnable .tflite model, and TOCO requires Bazel to build and run. My coworker Woochul explained the process in his post; follow this link for details.
Woochul also explained TPU conversion and running the model on Google Coral (Edge TPU), so if you are interested in these newer technologies his posts are a great resource.
This is the final inference result from my model on a test image. (Non-max suppression is not applied, so some boxes overlap.)
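Non-max suppression itself is simple to sketch. A minimal pure-Python version (boxes as (x1, y1, x2, y2) corner coordinates; iou and non_max_suppression are my own helper names) looks like this:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring box, drop overlapping ones, repeat."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(non_max_suppression(boxes, scores))  # → [0, 2]
```

The first two boxes overlap heavily (IoU ≈ 0.68), so only the higher-scoring one survives; the third box is untouched. The TFLite postprocessing op can apply NMS for you, but a standalone pass like this is handy when debugging raw detections.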
Thank you for reading!