ValueError: Input 0 of layer lstm_14 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 12, 12, 64]
I am using a CNN-LSTM network for image classification. My image size is (224, 224, 3) and my batch size is 90. I am getting this error when passing the input to the LSTM layer. The following is my code snippet:
input1 = Input(shape=(224, 224,3))
x = Conv2D(8, (3,3), activation ="relu")(input1)
x = MaxPooling2D(2,2)(x)
x = Conv2D(16, (3,3), activation ="relu")(x)
x = MaxPooling2D(2,2)(x)
x = Conv2D(32, (3,3), activation ="relu")(x)
x = MaxPooling2D(2,2)(x)
x = Conv2D(64, (3,3), activation ="relu")(x)
x = Dropout(0.2)(x)
x = MaxPooling2D(2,2)(x)
x = LSTM(units= 64, activation= 'tanh', input_shape= [None, 144], return_sequences = True)(x)
error:
---> 10 x = LSTM(units= 64, activation= 'tanh', input_shape= [None, 144], return_sequences = True)(x)
ValueError: Input 0 of layer lstm_14 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 12, 12, 64]
Thanks if someone can help me sort out this issue.
1 Answer
According to the documentation page, the LSTM layer expects input of shape [batch, timesteps, feature]. Your LSTM layer receives input of shape [None, 12, 12, 64], which is why you get the error about 3D vs. 4D shapes. You need to reshape the tensor: [None, 12, 12, 64] -> [None, 144, 64]. To do this, you can insert a Reshape layer between your last MaxPooling2D and LSTM layers, as shown in the sketch below.
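
Here is a minimal sketch of the fix, assuming the TensorFlow Keras functional API and the layer sizes from the question's snippet; the Reshape target (144, 64) flattens the 12x12 spatial grid into 144 timesteps of 64 features:

from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Reshape, LSTM
from tensorflow.keras.models import Model

input1 = Input(shape=(224, 224, 3))
x = Conv2D(8, (3, 3), activation="relu")(input1)
x = MaxPooling2D(2, 2)(x)
x = Conv2D(16, (3, 3), activation="relu")(x)
x = MaxPooling2D(2, 2)(x)
x = Conv2D(32, (3, 3), activation="relu")(x)
x = MaxPooling2D(2, 2)(x)
x = Conv2D(64, (3, 3), activation="relu")(x)
x = Dropout(0.2)(x)
x = MaxPooling2D(2, 2)(x)                  # output shape: (None, 12, 12, 64)
x = Reshape((144, 64))(x)                  # (None, 12, 12, 64) -> (None, 144, 64)
x = LSTM(units=64, activation="tanh", return_sequences=True)(x)  # input_shape is no longer needed
model = Model(inputs=input1, outputs=x)
model.summary()                            # confirms the LSTM now receives (None, 144, 64)

Note that with return_sequences=True the LSTM still outputs a sequence of shape (None, 144, 64); for classification you would typically set return_sequences=False or follow the LSTM with a pooling/Dense head.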