I am trying to use pretrained embeddings as a layer in a neural network but can't quite get it to work. The error I am getting is in the reshape layer:
Tried to convert 'shape' to a tensor and failed. Error: None values not supported.
What am I doing wrong here?
import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras.layers import Input, LSTM, Dropout, Dense
from tensorflow.keras.models import Model

epochs = 50
n_units = 512
embedding_size = 200
text_in = Input(shape = ())
embedding_layer = hub.KerasLayer("https://tfhub.dev/google/elmo/3")(text_in)
reshape = tf.keras.layers.Reshape(target_shape=(None, 1024, 1))(embedding_layer)
x = LSTM(n_units)(reshape)
x = Dropout(0.2)(x)
text_out = Dense(total_words, activation = 'softmax')(x)
model = Model(inputs=[input_layer], outputs=output_layer, name="LSTM with ELMo Embeddings")
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
CodePudding user response:
The 'None' should not be there. Keras is a bit weird in that it treats the first dimension as something special: the batch-size dimension is omitted from target_shape, and the Reshape layer cannot change it.
Otherwise, your code probably has more issues. input_layer and output_layer are not defined, and a shape of (?, 1024, 1) is odd: do you really have sequences of length 1024 with 1-dimensional embeddings? The whole code looks wrong.
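As a minimal sketch (using a plain float input instead of the ELMo layer), this shows that target_shape only lists the non-batch dimensions:

import tensorflow as tf

# target_shape never includes the batch dimension; Keras re-attaches it automatically.
x = tf.keras.Input(shape=(1024,))           # runtime shape: (batch, 1024)
y = tf.keras.layers.Reshape((1024, 1))(x)   # only the non-batch dimensions are specified
print(y.shape)                              # (None, 1024, 1) -- the leading None is the batch size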
CodePudding user response:
According to the docs (https://keras.io/api/layers/reshaping_layers/reshape/), None is not supported for an unknown dimension size with keras.layers.Reshape. However, you can use -1 instead and it should work as you want.
PS: don't forget to replace input_layer/output_layer (in keras.Model) with text_in/text_out (and you don't need to put input_layer in a list, by the way).
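Putting the two fixes together, here is a sketch of the corrected wiring. It assumes, as the question's code does, that the hub layer returns a (batch, 1024) embedding per input string; total_words is a placeholder for the vocabulary size defined elsewhere in your code, and whether the ELMo module loads cleanly through hub.KerasLayer in your TensorFlow version is not verified here:

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras.layers import Input, LSTM, Dropout, Dense, Reshape
from tensorflow.keras.models import Model

n_units = 512
total_words = 10000  # placeholder: use your actual vocabulary size

text_in = Input(shape=(), dtype=tf.string)  # ELMo takes raw sentence strings, so the input must be tf.string
embedding = hub.KerasLayer("https://tfhub.dev/google/elmo/3")(text_in)
reshaped = Reshape(target_shape=(-1, 1))(embedding)  # -1 lets Keras infer the size; the batch dim is omitted
x = LSTM(n_units)(reshaped)
x = Dropout(0.2)(x)
text_out = Dense(total_words, activation='softmax')(x)

model = Model(inputs=text_in, outputs=text_out, name="LSTM_with_ELMo_Embeddings")
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()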