Dimension mismatch in the decoder from the TensorFlow tutorial

MorePenguins

I am working through the TensorFlow convolutional autoencoder tutorial using TensorFlow 2.0 and Keras, which can be found here.

Building the CNN with the provided code works, but adding one more convolutional layer to both the encoder and the decoder breaks it:

import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras import Model

class Denoise(Model):
  def __init__(self):
    super(Denoise, self).__init__()
    self.encoder = tf.keras.Sequential([
      layers.Input(shape=(28, 28, 1)), 
      layers.Conv2D(16, (3,3), activation='relu', padding='same', strides=2),
      layers.Conv2D(8, (3,3), activation='relu', padding='same', strides=2),
      ## New Layer ##
      layers.Conv2D(4, (3,3), activation='relu', padding='same', strides=2)
      ## --------- ##
      ])

    self.decoder = tf.keras.Sequential([
      ## New Layer ##
      layers.Conv2DTranspose(4, kernel_size=3, strides=2, activation='relu', padding='same'),
      ## --------- ##
      layers.Conv2DTranspose(8, kernel_size=3, strides=2, activation='relu', padding='same'),
      layers.Conv2DTranspose(16, kernel_size=3, strides=2, activation='relu', padding='same'),
      layers.Conv2D(1, kernel_size=(3,3), activation='sigmoid', padding='same')
      ])

  def call(self, x):
    encoded = self.encoder(x)
    decoded = self.decoder(encoded)
    return decoded

autoencoder = Denoise()
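
As a minimal reproduction (a sketch using a dummy batch of zeros instead of the tutorial's MNIST images), calling the model shows that the output shape no longer matches the 28×28 input:

import tensorflow as tf

# Dummy batch standing in for the 28x28 grayscale MNIST images.
dummy = tf.zeros((32, 28, 28, 1))
print(autoencoder(dummy).shape)  # (32, 32, 32, 1) -- the output is 32x32, not 28x28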

Running `autoencoder.encoder.summary()` and `autoencoder.decoder.summary()`, I can see that it is a shape problem:

Encoder:
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_124 (Conv2D)          (None, 14, 14, 16)        160       
_________________________________________________________________
conv2d_125 (Conv2D)          (None, 7, 7, 8)           1160      
_________________________________________________________________
conv2d_126 (Conv2D)          (None, 4, 4, 4)           292       
=================================================================
Total params: 1,612
Trainable params: 1,612
Non-trainable params: 0
_________________________________________________________________

Decoder:
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_transpose_77 (Conv2DT (32, 8, 8, 4)             148       
_________________________________________________________________
conv2d_transpose_78 (Conv2DT (32, 16, 16, 8)           296       
_________________________________________________________________
conv2d_transpose_79 (Conv2DT (32, 32, 32, 16)          1168      
_________________________________________________________________
conv2d_127 (Conv2D)          (32, 32, 32, 1)           145       
=================================================================
Total params: 1,757
Trainable params: 1,757
Non-trainable params: 0
_________________________________________________________________

Why is the leading dimension on the decoder side 32? And why isn't the decoder's input shape (None, 4, 4, 4) if its input is passed from the encoder? How can I fix this?

Thanks in advance for your help!

Nicolas Gervais

Remove `strides=2` from the last encoder layer and add `strides=2` to the last decoder layer:

import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras import Model

class Denoise(Model):
  def __init__(self):
    super(Denoise, self).__init__()
    self.encoder = tf.keras.Sequential([
      layers.Input(shape=(28, 28, 1)), 
      layers.Conv2D(16, (3,3), activation='relu', padding='same', strides=2),
      layers.Conv2D(8, (3,3), activation='relu', padding='same', strides=2),
      ## New Layer ##
      layers.Conv2D(4, (3,3), activation='relu', padding='same')
      ## --------- ##
      ])

    self.decoder = tf.keras.Sequential([
      ## New Layer ##
      layers.Conv2DTranspose(4, kernel_size=3, strides=2, activation='relu', padding='same'),
      ## --------- ##
      layers.Conv2DTranspose(8, kernel_size=3, strides=2, activation='relu', padding='same'),
      layers.Conv2DTranspose(16, kernel_size=3, strides=2, activation='relu', padding='same'),
      layers.Conv2D(1, kernel_size=(3,3), activation='sigmoid', padding='same', strides=2)
      ])

  def call(self, x):
    encoded = self.encoder(x)
    decoded = self.decoder(encoded)
    return decoded

autoencoder = Denoise()
autoencoder.build(input_shape=(1, 28, 28, 1))
autoencoder.summary()
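
A quick sanity check (a sketch, again using a dummy input rather than the MNIST data from the tutorial): with the stride moved from the last encoder layer to the final decoder layer, the spatial dimensions work out as 28 → 14 → 7 → 7 in the encoder and 7 → 14 → 28 → 56 → 28 in the decoder, so the reconstruction matches the input again:

import tensorflow as tf

# Verify that the output shape now matches the 28x28 input.
dummy = tf.zeros((1, 28, 28, 1))
print(autoencoder(dummy).shape)  # (1, 28, 28, 1)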
