I have the following piece of code, which is a simplification of my actual code:
#!/usr/bin/env python3
import tensorflow as tf
import re
import numpy as np
import pandas as pd
from tensorflow.keras.preprocessing.text import Tokenizer # text to vectors
from tensorflow.keras.layers import Dense, Embedding, LSTM, SpatialDropout1D
from tensorflow.keras.models import Sequential
DATASIZE = 1000
model = Sequential()
model.add(Embedding(1000, 120, input_length = DATASIZE))
model.add(SpatialDropout1D(0.4))
model.add(LSTM(176, dropout=0.2, recurrent_dropout=0.2))
model.add(Dense(2,activation='sigmoid'))
model.compile(loss = 'binary_crossentropy', optimizer='adam', metrics = ['accuracy'])
print(model.summary())
training = [[0 for x in range(10)] for x in range (DATASIZE)] # placeholder data
label = [[1 for x in range(10)] for x in range (DATASIZE)] # placeholder data
model.fit(training, label, epochs = 5, batch_size=32, verbose = 'auto')
What I need:
My end goal is to be able to check whether a given input vector (a numerical representation of data) is positive or negative, so this is a binary classification problem. Here all 1000 vectors are 10 elements long, but they could be much longer or much shorter, and the length might vary throughout the dataset.
When running this I get the following error:
ValueError: logits and labels must have the same shape ((None, 2) vs (None, 10))
How do I have to structure my vectors in order to not get this error and actually correctly fit the model?
EDIT:
I could change the model.add(Dense(2,activation='sigmoid')) call to model.add(Dense(10,activation='sigmoid')). But this doesn't make much sense to me: the first parameter of Dense is the number of outputs, and in my case there are only 2 possibilities, positive or negative. Randomly changing it to 10 makes the program run, but doesn't make sense. And I am not even sure it is then actually using my 1000 training vectors correctly...
CodePudding user response:
Your last Dense layer has 2 neurons, but each of your labels is a 10-element vector, so Keras is comparing a (None, 2) model output against (None, 10) targets; that is exactly the shape mismatch in the error. The output layer must have the same shape as the labels you pass to fit. You have two options: if each sample genuinely has 10 independent binary targets, replace 2 with 10 in the last Dense layer. But since you describe a binary positive/negative classification, the cleaner fix is to give each sample a single 0-or-1 label, so your labels have shape (1000, 1), and use Dense(1, activation='sigmoid') with binary_crossentropy.
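As a minimal sketch of the second option (assuming one 0/1 label per sample, and using placeholder random integer data in place of your real vectors):

```python
import numpy as np
from tensorflow.keras.layers import Dense, Embedding, LSTM, SpatialDropout1D
from tensorflow.keras.models import Sequential

DATASIZE = 1000
SEQ_LEN = 10  # each input vector is 10 tokens long

# Placeholder data: integer token ids in [0, 1000), shape (DATASIZE, SEQ_LEN)
X = np.random.randint(0, 1000, size=(DATASIZE, SEQ_LEN))
# One binary label per sample, shape (DATASIZE, 1) -- not (DATASIZE, 10)
y = np.random.randint(0, 2, size=(DATASIZE, 1))

model = Sequential()
model.add(Embedding(1000, 120))
model.add(SpatialDropout1D(0.4))
model.add(LSTM(176, dropout=0.2, recurrent_dropout=0.2))
model.add(Dense(1, activation='sigmoid'))  # one output: P(positive)
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:2], verbose=0).shape)  # (2, 1): one probability each
```

Now the model output shape (None, 1) matches the label shape, and a prediction above 0.5 can be read as "positive". Varying-length inputs would additionally need padding (e.g. pad_sequences) to a common length before being fed to the Embedding layer.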