Pre-trained BERT not the right shape for LSTM layer: ValueError, total size of new array must be unchanged


I am attempting to use a pre-trained BERT model in a Siamese neural network. However, I am having trouble passing the BERT model's output to the shared LSTM layer. I encounter the error below:

ValueError: Exception encountered when calling layer "reshape_4" (type Reshape).

total size of new array must be unchanged, input_shape = [768], output_shape = [64, 768, 1]

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 768), dtype=float32)

I read in several other posts that the input I feed to the LSTM should have shape [batch_size, 768, 1]. However, when I attempt to reshape, I run into the error above. How can I resolve it?

import tensorflow as tf
from tensorflow.keras.layers import Input, Reshape, Bidirectional, LSTM, Lambda, Dense
from tensorflow.keras.models import Model

input_1 = Input(shape=(), dtype=tf.string, name='text')
preprocessed_text_1 = bert_preprocess(input_1)
outputs_1 = bert_encoder(preprocessed_text_1)
e1 = Reshape((64, 768, 1))(outputs_1['pooled_output'])

input_2 = Input(shape=(), dtype=tf.string, name='text')
preprocessed_text_2 = bert_preprocess(input_2)
outputs_2 = bert_encoder(preprocessed_text_2)
e2 = Reshape((64, 768, 1))(outputs_2['pooled_output'])

lstm_layer = Bidirectional(LSTM(50, dropout=0.2, recurrent_dropout=0.2)) # Won't work on GPU

x1 = lstm_layer(e1)
x2 = lstm_layer(e2)

mhd = lambda x: exponent_neg_cosine_distance(x[0], x[1]) 
merged = Lambda(function=mhd, output_shape=lambda x: x[0], name='cosine_distance')([x1, x2])
preds = Dense(1, activation='sigmoid')(merged)
model = Model(inputs=[input_1, input_2], outputs=preds)

CodePudding user response:

You have to remove the batch size (= 64) from the Reshape layers. Keras's Reshape target shape excludes the batch dimension, so Reshape((64, 768, 1)) asks for 64 × 768 = 49,152 elements per sample, while the pooled output only has 768. Use Reshape((768, 1)) instead, which turns the (None, 768) pooled output into (None, 768, 1).
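The size rule behind the error can be sketched with plain NumPy (the batch size of 32 below is a made-up example; in the model itself, both `Reshape((64, 768, 1))` calls would become `Reshape((768, 1))`):

```python
import numpy as np

# The encoder's pooled_output has shape (batch, 768). Keras's Reshape
# operates per sample, so its target shape must contain exactly 768 elements.
pooled = np.zeros((32, 768), dtype=np.float32)  # hypothetical batch of 32

# Equivalent of Reshape((768, 1)): the batch axis is left untouched,
# and each sample's 768 elements are reshaped to (768, 1).
per_sample = pooled.reshape(-1, 768, 1)
print(per_sample.shape)  # (32, 768, 1)

# Equivalent of Reshape((64, 768, 1)): each sample would need
# 64 * 768 = 49152 elements, but only 768 exist -- hence the ValueError.
```

With that change, each branch feeds the Bidirectional LSTM a (batch, 768, 1) tensor, i.e. a sequence of 768 timesteps with one feature each.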
