Thanks, I did a small experiment with the placeholder solution:

```
import numpy as np
import tensorflow as tf

input_shape = [None, 128, 128, 2]
features = tf.placeholder(tf.float32, input_shape, name='input')
net = tf.layers.conv2d(inputs=features, filters=64, kernel_size=[3, 3],
                       strides=(2, 2), padding='same')
training = tf.placeholder(tf.bool, name='training')
net = tf.contrib.layers.batch_norm(net, is_training=training)
net = tf.nn.relu(net)
net = tf.reshape(net, [-1, 64 * 64 * 64])
net = tf.layers.dense(inputs=net, units=8,
                      kernel_initializer=tf.contrib.layers.xavier_initializer(),
                      name='regression_output')
x = tf.placeholder(tf.float32, shape=[None, 8])
# apply sqrt to keep the loss value from blowing up
loss = tf.sqrt(tf.nn.l2_loss(net - x))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    imgs = np.zeros([1, 128, 128, 2])
    labels = [[1, 2, 3, 4, 5, 6, 7, 8]]
    sess.run(loss, feed_dict={features: imgs, training: False, x: labels})
    saver = tf.train.Saver()
    saver.save(sess, 'reshape_final.ckpt')
    tf.train.write_graph(sess.graph.as_graph_def(), "", 'graph_final.pb')
```
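For reference, the `64 * 64 * 64` in the `tf.reshape` call comes from the stride-2, `'same'`-padded convolution halving each spatial dimension (128 → 64) while producing 64 filters. A quick sanity check of that arithmetic (plain Python, no TensorFlow needed):

```python
import math

def conv2d_same_out(size, stride):
    # Spatial output size of a 'same'-padded convolution: ceil(size / stride).
    return math.ceil(size / stride)

h = conv2d_same_out(128, 2)  # 64
w = conv2d_same_out(128, 2)  # 64
filters = 64

flat = h * w * filters
print(flat)  # 262144, i.e. the 64 * 64 * 64 used in tf.reshape
```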

After that, I freeze -> optimize -> transform the model with the following commands:

```
python3 ~/.keras2/lib/python3.5/site-packages/tensorflow/python/tools/freeze_graph.py --input_graph=graph_final.pb --input_checkpoint=reshape_final.ckpt --output_graph=frozen_graph.pb --output_node_names=regression_output/BiasAdd
python3 ~/.keras2/lib/python3.5/site-packages/tensorflow/python/tools/optimize_for_inference.py --input frozen_graph.pb --output opt_graph.pb --frozen_graph True --input_names input --output_names regression_output/BiasAdd
~/Qt/3rdLibs/tensorflow/bazel-bin/tensorflow/tools/graph_transforms/transform_graph --in_graph=opt_graph.pb --out_graph=fused_graph.pb --inputs=input --outputs=regression_output/BiasAdd --transforms="fold_constants sort_by_execution_order"
```

The last command gives me this error:

"You must feed a value for placeholder tensor 'training' with dtype bool"

I already feed a bool value on this line:

`sess.run(loss, feed_dict = {features : imgs, training : False, x : labels})`

What should I do?