How to get the values of convolutional layers in TensorFlow?

Alla Abdella

I have the following code from a GitHub tutorial, and I want to access the value of each "x" layer and save it to a numpy array once training has finished.

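# Note: lrelu, inputs_decoder and reshaped_dim come from the surrounding
# tutorial code and are not shown in this snippet.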
def decoder(sampled_z, keep_prob):
    with tf.variable_scope("decoder", reuse=None):
        x = tf.layers.dense(sampled_z, units=inputs_decoder, activation=lrelu)
        x = tf.layers.dense(x, units=inputs_decoder * 2 + 1, activation=lrelu)
        x = tf.reshape(x, reshaped_dim)
        x = tf.layers.conv2d_transpose(x, filters=64, kernel_size=4, strides=2, 
                                       padding='same', activation=tf.nn.relu)
        x = tf.nn.dropout(x, keep_prob)
        x = tf.layers.conv2d_transpose(x, filters=64, kernel_size=4, strides=1, 
                                       padding='same', activation=tf.nn.relu)
        x = tf.nn.dropout(x, keep_prob)
        x = tf.layers.conv2d_transpose(x, filters=64, kernel_size=4, strides=1, 
                                       padding='same', activation=tf.nn.relu)
        x = tf.contrib.layers.flatten(x)
        x = tf.layers.dense(x, units=28*28, activation=tf.nn.sigmoid)
        img = tf.reshape(x, shape=[-1, 28, 28])
    return img
Vlad

Whether the layer is convolutional or dense, and whether or not training has finished, you can access the values of its variables through the session interface once they have been initialized.

Consider the following example:

import tensorflow as tf

def two_layer_perceptron(x):
    with x.graph.as_default():
        with tf.name_scope('fc'):
            fc = tf.layers.dense(
                     inputs=x, units=2,
                     kernel_initializer=tf.initializers.truncated_normal)
        with tf.name_scope('logits'):
            logits = tf.layers.dense(
                         inputs=fc, units=2,
                         kernel_initializer=tf.initializers.truncated_normal)
    return logits

x = tf.placeholder(tf.float32, shape=(None, 2))
logits = two_layer_perceptron(x)

# define loss, train operation and start training

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # train here
    # ...
    # sess.run(train_op, feed_dict=...)
    # ...
    # when training is finished, do:
    trainable_vars = tf.trainable_variables()
    vars_vals = sess.run(trainable_vars)
    vars_and_names = [(val, var.name) for val, var in zip(vars_vals, trainable_vars)]


for val, name in vars_and_names:
    print(name, type(val), '\n', val)

# dense/kernel:0 <class 'numpy.ndarray'> 
# [[ 0.23275916  0.7079906 ]
# [-1.0366516   1.9141678 ]]
# dense/bias:0 <class 'numpy.ndarray'> 
# [0. 0.]
# dense_1/kernel:0 <class 'numpy.ndarray'> 
# [[-0.55649596 -1.4910121 ]
# [ 0.54917735  0.39449152]]
# dense_1/bias:0 <class 'numpy.ndarray'> 
# [0. 0.]
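Because sess.run() returns plain numpy arrays, saving the fetched weights to disk after training only takes one more numpy call. A minimal sketch, continuing from the snippet above (the file name weights.npz and the key naming are arbitrary choices for illustration):

import numpy as np

# Turn a name like 'dense/kernel:0' into a filesystem-friendly key such as
# 'dense_kernel_0' and store every fetched value in a single .npz archive.
np.savez('weights.npz',
         **{var.name.replace('/', '_').replace(':', '_'): val
            for val, var in zip(vars_vals, trainable_vars)})

# Load them back later as a dict-like object of numpy arrays.
loaded = np.load('weights.npz')
print(list(loaded.keys()))  # e.g. ['dense_kernel_0', 'dense_bias_0', ...]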

If you want to access specific variables in the network, you can add them to a collection with tf.add_to_collection() and retrieve them later with tf.get_collection(), or you can filter the list of all trainable variables by name (e.g. [v for v in tf.trainable_variables() if 'conv' in v.name]).
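A minimal sketch of both approaches; the collection name 'decoder_layers', the placeholder shape and the layer name 'fc1' are arbitrary choices for illustration:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=(None, 4))
h = tf.layers.dense(x, units=8, name='fc1')

# Option 1: tag interesting tensors/variables while building the graph,
# then retrieve them anywhere else by collection name.
tf.add_to_collection('decoder_layers', h)
tagged = tf.get_collection('decoder_layers')   # -> [h]

# Option 2: filter the full list of trainable variables by (sub)string.
# In the decoder above this would pick up the conv2d_transpose kernels and
# biases; in this tiny graph the 'conv' list is simply empty.
conv_vars = [v for v in tf.trainable_variables() if 'conv' in v.name]
fc_vars = [v for v in tf.trainable_variables() if 'fc1' in v.name]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    fc_vals = sess.run(fc_vars)   # list of numpy arrays (kernel and bias)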
