Logits and labels must have same shape for Keras model


I am new to Keras and have been practicing with resources from the web. Unfortunately, I cannot build a model without it throwing the following error:

ValueError: logits and labels must have the same shape, received ((None, 10) vs (None, 1)).

I have attempted the following:

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

DF = pd.read_csv("https://raw.githubusercontent.com/EpistasisLab/tpot/master/tutorials/MAGIC Gamma Telescope/MAGIC Gamma Telescope Data.csv")

X = DF.iloc[:, 0:-1]                                  # the 10 feature columns
y = DF.iloc[:, -1]                                    # class column ('g' or 'h')
yBin = np.array([1 if x == 'g' else 0 for x in y])    # binary target: 1 = gamma, 0 = hadron
scaler = StandardScaler()
X1 = scaler.fit_transform(X)                          # standardize the features
X_train, X_test, y_train, y_test = train_test_split(X1, yBin, test_size=0.25, random_state=2018)

print(X_train.__class__,X_test.__class__,y_train.__class__,y_test.__class__ )

model=Sequential()
model.add(Dense(6,activation="relu", input_shape=(10,)))
model.add(Dense(10,activation="softmax"))
model.build(input_shape=(None,1))
model.summary()
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.fit(x=X_train,
          y=y_train,
          epochs=600,
          validation_data=(X_test, y_test), verbose=1
          )

I have read that my model is likely wrong in terms of its input parameters. What is the correct approach?

CodePudding user response:

When I look at the shape of your data

print(X_train.shape,X_test.shape,y_train.shape,y_test.shape)

I see that X is 10-dimensional and y is 1-dimensional.

Therefore, you need a 10-dimensional input

model.build(input_shape=(None, 10))

and a 1-dimensional output in the last dense layer:

model.add(Dense(1, activation="sigmoid"))
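
Putting both changes together, a minimal sketch of the corrected stack (keeping the 6-unit hidden layer from the question) looks like this:

model = Sequential()
model.add(Dense(6, activation="relu", input_shape=(10,)))   # 10 input features per sample
model.add(Dense(1, activation="sigmoid"))                   # one probability per sample
model.summary()                                             # last layer output shape: (None, 1)

Note that once input_shape=(10,) is given on the first layer, the model is already built, so the explicit model.build() call is optional.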

CodePudding user response:

The target variable yBin/y_train/y_test is a 1D array (it has shape (None, 1) for a given batch).

Your logits come from the last Dense layer, which has 10 neurons with softmax activation. It therefore produces 10 outputs for each input, i.e. (batch_size, 10) per batch, which is written formally as (None, 10).
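
You can see where each side of the error comes from by inspecting the shapes directly (a quick check, assuming the model and data from the question are already defined):

print(model.output_shape)   # (None, 10) -> the "logits" side of the error
print(y_train.shape)        # (n_train,) -> treated as (None, 1) labels by the loss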

To resolve this particular shape mismatch, change the neuron count of the last Dense layer to 1 and set its activation function to "sigmoid":

model.add(Dense(1,activation="sigmoid"))
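
For completeness, a sketch of how this corrected output layer slots into the rest of the question's code (same variable names and 600 epochs as in the original):

model = Sequential()
model.add(Dense(6, activation="relu", input_shape=(10,)))
model.add(Dense(1, activation="sigmoid"))          # matches the (None, 1) labels

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',          # expects one probability per sample
              metrics=['accuracy'])

model.fit(x=X_train, y=y_train,
          epochs=600,
          validation_data=(X_test, y_test),
          verbose=1)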