TensorFlow .pb: Save and Display Models

Basics
pb stands for protobuf. In TensorFlow, the protobuf file contains the graph definition as well as the weights of the model. Thus, a .pb file is all you need to be able to run a given trained model.

Given a .pb file, you can load it as follows.

import tensorflow as tf  # TF 1.x API; in TF 2.x use the tf.compat.v1 equivalents

def load_pb(path_to_pb):
    # read the serialized GraphDef from the .pb file
    with tf.gfile.GFile(path_to_pb, "rb") as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    # import it into a fresh graph and return that graph
    with tf.Graph().as_default() as graph:
        tf.import_graph_def(graph_def, name='')
        return graph

Once you have loaded the graph, you can basically do anything. For instance, you can retrieve tensors of interest with

input = graph.get_tensor_by_name('input:0')
output = graph.get_tensor_by_name('output:0')

and use regular TensorFlow routines like:

sess.run(output, feed_dict={input: some_data})
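
Putting it together, a minimal end-to-end sketch might look like the following (the tensor names 'input:0' and 'output:0' and the dummy input are placeholders; the real names and shapes depend on the model that produced the .pb file):

import numpy as np
import tensorflow as tf

graph = load_pb("frozen_model.pb")                 # illustrative path
input_t = graph.get_tensor_by_name("input:0")      # placeholder tensor names
output_t = graph.get_tensor_by_name("output:0")

with tf.Session(graph=graph) as sess:
    some_data = np.zeros((1, 224, 224, 3), dtype=np.float32)  # dummy batch
    result = sess.run(output_t, feed_dict={input_t: some_data})
    print(result.shape)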

Explanation

The .pb format is the protocol buffer (protobuf) format, and in TensorFlow this format is used to hold models. Protocol buffers are a general-purpose serialization format from Google: they are nicer to transport because they pack data more compactly and enforce a structure on it. When used in TensorFlow, the result is called a SavedModel protocol buffer, which is the default format when saving Keras/TensorFlow 2.0 models. More information about this format can be found here and here.

For example, the following code (specifically, m.save) will create a folder called my_new_model and save into it saved_model.pb, an assets/ folder, and a variables/ folder.

# first download a SavedModel from TFHub.dev, a website with models
import tensorflow as tf
import tensorflow_hub as hub

m = tf.keras.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/imagenet/mobilenet_v2_130_224/classification/4")
])
m.build([None, 224, 224, 3])  # Batch input shape.

m.save("my_new_model")  # defaults to saving as a SavedModel in TensorFlow 2

In some places, you may also see .h5 models, which was the default save format for Keras in TF 1.x.
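
As a quick illustration, saving in the older HDF5 format just requires a .h5 filename (the toy model below is made up for this example; models wrapping hub.KerasLayer may not serialize to HDF5):

import tensorflow as tf

# hypothetical single-layer model; the .h5 extension tells Keras to write one HDF5 file
simple = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
simple.save("simple_model.h5")  # single .h5 file instead of a SavedModel directory
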
Extra information: In TensorFlow Lite, the library for running models on mobile and IoT devices, flatbuffers are used instead of protocol buffers. This is what the TensorFlow Lite Converter converts into (the .tflite format). This is another Google format which is also very efficient: it allows access to any part of the message without deserialization (unlike JSON or XML). For devices with less memory (RAM), it makes more sense to load what you need from the model file instead of loading the entire thing into memory to deserialize it.
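
As a rough sketch of how a SavedModel ends up as a .tflite flatbuffer (the directory name reuses the earlier example; the output filename is illustrative):

import tensorflow as tf

# convert the SavedModel directory produced above into a .tflite flatbuffer
converter = tf.lite.TFLiteConverter.from_saved_model("my_new_model")
tflite_model = converter.convert()

with open("my_new_model.tflite", "wb") as f:
    f.write(tflite_model)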

Loading SavedModels in TensorFlow 2

I noticed BiBi’s answer showing how to load models was popular, and there is a shorter way to do this in TF2:

import tensorflow as tf
model_path = "/path/to/directory/inception_v1_224_quant_20181026"
model = tf.saved_model.load(model_path)

Note,

  • the directory (i.e. inception_v1_224_quant_20181026) has to contain a saved_model.pb or saved_model.pbtxt, otherwise the code will crash. You cannot specify the .pb path; specify the directory.

  • you might get TypeError: 'AutoTrackable' object is not callable for older models; fix here.

If you load a TF1 model, I found that I don’t get any errors, but the loaded object doesn’t behave as expected (e.g. it doesn’t have any functions on it, like predict).
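
A quick way to see what a loaded SavedModel actually exposes is to inspect its signatures. A minimal sketch, assuming the model was exported with a serving_default signature (check the printed keys first; the directory name reuses the earlier example):

import tensorflow as tf

loaded = tf.saved_model.load("my_new_model")
print(list(loaded.signatures.keys()))       # e.g. ['serving_default']

infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)     # expected input names and shapes
print(infer.structured_outputs)             # output names and shapes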

More on Saved Models: Official Docs

includes: creating, saving, loading, and fine-tuning

Soln 1 (March 31st, 2017)

As far as I know, you don’t need to create any summaries to load the graph into TensorBoard. If you create a summary writer and then add the graph to it, you should see the graph appear in TensorBoard. I have some code that does that (as part of another project). I do, however, agree that being able to import something into TensorBoard directly would be handy.

I’ve quickly created a bit of code to load a graph into TensorBoard. See how that goes. Also available as a gist here.

import tensorflow as tf
from tensorflow.python.platform import gfile

with tf.Session() as sess:
    model_filename = 'PATH_TO_PB.pb'
    # read the serialized GraphDef and import it into the default graph
    with gfile.FastGFile(model_filename, 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        g_in = tf.import_graph_def(graph_def)

# write the graph so TensorBoard can display it
LOGDIR = 'YOUR_LOG_LOCATION'
train_writer = tf.summary.FileWriter(LOGDIR)
train_writer.add_graph(sess.graph)

==> Variations in .pb files and TF versions might cause the example scripts to fail; see the further discussion below for how to deal with these issues, for example:

Don’t forget to flush ;) (Nov. 29th 2019)

train_writer.flush()

Soln 2 (Dec. 8th 2020)

New Readers/Visitors

Please note that my original solution was thrown together to temporarily answer the issue and help the user. I then PR’d a helper function into TF, where much smarter people have taken it over and seemingly converted it for 2.0. I’ve been fairly dormant on TF now too. Judging by the reactions to my original solution this is clearly a high-traffic issue, but please ensure you check out the latest version of tensorflow/import_pb_to_tensorboard.py at master · tensorflow/tensorflow · GitHub.
You may be best off raising a new issue with that code explicitly (or other TB bits; TB has become a lot more sophisticated since the early days when I was still working with TF regularly!). Do still reference this issue so those who land here can link over to your issue and find a better solution.

Also note that this method has most likely been superseded by more accessible/superior ways of loading your model into TB ==> it seems that, so far, loading the models has been simplified but working with TB still requires crutches. I’ve never actually needed to use my own helper function, especially if you are using the Keras functionality, so I would recommend that you take some time to get your TB integration working in the best, native way possible so you can get the most out of it. I’ll keep an eye on this to help where I can. Cheers.
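
For reference, a minimal TF2-native sketch (not the original helper script) for getting a graph into TensorBoard, assuming a tf.function-wrapped computation; the function and the log directory name are illustrative:

import tensorflow as tf

@tf.function
def double(x):
    return 2 * x

writer = tf.summary.create_file_writer("logs/graph_demo")

tf.summary.trace_on(graph=True)
double(tf.constant(3.0))               # run once so the graph gets traced
with writer.as_default():
    tf.summary.trace_export(name="double_graph", step=0)
writer.flush()
# then inspect with: tensorboard --logdir logs/graph_demo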

==> Script usage example (Feb. 13th, 2020):

Local file management and importing/loading local models in Colab:

Original: https://blog.csdn.net/maxzcl/article/details/123729236
Author: EverNoob
Title: TensorFlow .pb: Save and Display Models
