When I try to fit the following model:
    model = Sequential([
        Lambda(vgg_preprocess, input_shape=(3, 244, 244)),
        Conv2D(64, 3, 3, activation='relu'),
        BatchNormalization(axis=1),
        Conv2D(64, 3, 3, activation='relu'),
        MaxPooling2D(),
        BatchNormalization(axis=1),
        Conv2D(128, 3, 3, activation='relu'),
        BatchNormalization(axis=1),
        Conv2D(128, 3, 3, activation='relu'),
        MaxPooling2D(),
        BatchNormalization(axis=1),
        Conv2D(256, 3, 3, activation='relu'),
        BatchNormalization(axis=1),
        Conv2D(256, 3, 3, activation='relu'),
        MaxPooling2D(),
        Flatten(),
        BatchNormalization(),
        Dense(1024, activation='relu'),
        BatchNormalization(),
        Dropout(0.5),
        Dense(1024, activation='relu'),
        BatchNormalization(),
        Dense(10, activation='softmax')
    ])
    model.compile(Adam(), loss='categorical_crossentropy', metrics=['accuracy'])
I get this error:
TypeError: Cannot convert Type TensorType(float32, 4D) (of Variable AbstractConv2d_gradInputs{convdim=2, border_mode='valid', subsample=(1, 1), filter_flip=True, imshp=(None, 256, 56, 56), kshp=(256, 256, 3, 3), filter_dilation=(1, 1)}.0) into Type TensorType(float64, 4D). You can try to manually convert AbstractConv2d_gradInputs{convdim=2, border_mode='valid', subsample=(1, 1), filter_flip=True, imshp=(None, 256, 56, 56), kshp=(256, 256, 3, 3), filter_dilation=(1, 1)}.0 into a TensorType(float64, 4D).
This is how I do the fitting:
    model.fit_generator(train_batches, train_batches.n, nb_epoch=1, validation_data=test_batches, nb_val_samples=test_batches.n)
And here is the vgg_preprocess function:
    vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3, 1, 1))

    def vgg_preprocess(x):
        x = x - vgg_mean   # subtract the per-channel mean
        return x[:, ::-1]  # reverse the channel order (RGB -> BGR) to match VGG
What does this error mean, and how can I fix it?
The problem lies in the fact that vgg_mean.dtype is float64 (NumPy's default for Python float literals), whereas the default floating-point precision in most deep-learning packages, including Theano, is float32. Subtracting this float64 constant inside the Lambda layer silently upcasts the tensors flowing through the graph to float64, which then clashes with the float32 gradient tensors during backpropagation, producing the TypeError above.
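You can see this promotion in plain NumPy, which follows the same type-promotion rules that cause the mismatch here (the shapes below are dummy values for illustration):

```python
import numpy as np

vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3, 1, 1))
print(vgg_mean.dtype)  # float64 -- NumPy's default for Python floats

# a float32 batch, as Keras/Theano would feed into the Lambda layer
x = np.zeros((1, 3, 4, 4), dtype=np.float32)
print((x - vgg_mean).dtype)  # float64 -- the subtraction silently upcasts
```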
Setting:

    vgg_mean = np.array(vgg_mean, dtype='float32')

should fix your problem. (Equivalently, pass dtype='float32' when you first create the array.)
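A quick NumPy check (again with a dummy batch shape) confirms that after the cast the subtraction stays in float32, so the Theano graph no longer gets promoted to float64:

```python
import numpy as np

vgg_mean = np.array([123.68, 116.779, 103.939]).reshape((3, 1, 1))
vgg_mean = np.array(vgg_mean, dtype='float32')  # the fix: cast to float32

x = np.zeros((1, 3, 4, 4), dtype=np.float32)
print((x - vgg_mean).dtype)  # float32 -- no more upcasting
```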