TensorFlow: retrieving graph information from a checkpoint
Code:
import tensorflow as tf

sess = tf.Session()
check_point_path = 'variables'
# Rebuild the graph from the .meta file, then restore the variable values into the session.
saver = tf.train.import_meta_graph('variables/save_variables.ckpt.meta')
saver.restore(sess, tf.train.latest_checkpoint(check_point_path))

graph = tf.get_default_graph()
# print(graph.get_operations())
# with open('op.txt', 'a') as f:
#     f.write(str(graph.get_operations()))
# Look up a tensor by its name, which has the form 'op_name:0'.
op1 = graph.get_tensor_by_name('fully_connected/biases:0')
print(op1)
Use graph.get_operations() to list all of the operations in the graph saved in the ckpt.meta file; the tensor produced by an operation is named 'op_name:0'.
Then call graph.get_tensor_by_name('op_name:0') to retrieve that tensor.
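For reference, a minimal sketch of walking the restored graph this way; it assumes the graph object from the snippet above and reuses the example name 'fully_connected/biases':

# List every operation and the tensors it outputs; each output is named 'op_name:<index>'.
for op in graph.get_operations():
    print('op:', op.name)
    for out in op.outputs:
        print('  tensor:', out.name)  # e.g. 'fully_connected/biases:0'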
Code to read the saved variables (tensor names and values) directly from the ckpt file:
import os
import tensorflow as tf
from tensorflow.python import pywrap_tensorflow

check_point_path = 'variables'
# checkpoint_path = os.path.join(logs_train_dir, 'model.ckpt')
ckpt = tf.train.get_checkpoint_state(checkpoint_dir=check_point_path)
checkpoint_path = os.path.join('.', ckpt.model_checkpoint_path)
# print(ckpt.model_checkpoint_path)

# Read the checkpoint directly, without building a graph or starting a session.
reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
var_to_shape_map = reader.get_variable_to_shape_map()
for key in var_to_shape_map:
    print("tensor_name: ", key)
    # print(reader.get_tensor(key))
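Once the reader is created, reader.get_tensor(name) returns a variable's value as a NumPy array. A minimal sketch, assuming the checkpoint actually contains a variable named 'fully_connected/biases' (note that checkpoint variable names carry no ':0' suffix):

# Fetch one variable's value from the checkpoint reader built above.
value = reader.get_tensor('fully_connected/biases')
print(value.shape)
print(value)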
Method 2:
from tensorflow.python.tools.inspect_checkpoint import print_tensors_in_checkpoint_file

print_tensors_in_checkpoint_file("variables/save_variables.ckpt",
                                 tensor_name='',
                                 all_tensors=False,
                                 all_tensor_names=False)
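Passing all_tensors=True makes this call also dump each tensor's stored value, not just a summary of the names. A minimal variant, assuming the same checkpoint prefix as above:

# Variant: print every tensor's name together with its stored value.
print_tensors_in_checkpoint_file("variables/save_variables.ckpt",
                                 tensor_name='',
                                 all_tensors=True,
                                 all_tensor_names=True)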
Note: tf.train.latest_checkpoint(check_point_path) returns the path of the most recent checkpoint, which is equivalent to:
ckpt = tf.train.get_checkpoint_state(check_point_path)
ckpt.model_checkpoint_path
Do not confuse tf.train.latest_checkpoint with tf.train.get_checkpoint_state: the former returns the checkpoint path directly as a string, while the latter returns a CheckpointState object whose model_checkpoint_path attribute holds that path.
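A minimal sketch of that equivalence, assuming the 'variables' directory contains a valid checkpoint:

# Both approaches should yield the same path string for a valid checkpoint directory.
path_from_latest = tf.train.latest_checkpoint(check_point_path)
ckpt_state = tf.train.get_checkpoint_state(check_point_path)
path_from_state = ckpt_state.model_checkpoint_path
assert path_from_latest == path_from_state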
That concludes this article on retrieving graph information from a TensorFlow checkpoint. I hope it serves as a useful reference, and thank you for your continued support.