Tensorflow Neural Network like MLPClassifier (sklearn)


So, I am creating an AI which predicts how long a user will take to finish exercises. I previously created a neural network with scikit-learn, but I want to move to TensorFlow.

I have 6 features as input and 1 output, which is a number.

I tried this, but it does not seem to work:

import tensorflow as tf

# Train data
X_train = X[:1500]
y_train = y[:1500]

# Test data
X_test = X[1500:]
y_test = y[1500:]


# Create the TF model
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(6,)),
    tf.keras.layers.Dense(256, activation='softmax'),
    tf.keras.layers.Dense(128, activation='softmax'),
    tf.keras.layers.Dense(64, activation='softmax'),
    tf.keras.layers.Dense(1)
])


model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])


model.fit(X_train, y_train, epochs=10)

The same data used to work fine with a simple MLPClassifier.

I also managed to get this nice error, which changing the layers does not seem to fix:

Received a label value of 1209638408 which is outside the valid range of [0, 1).
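For context on this error: SparseCategoricalCrossentropy expects the labels to be integer class indices in [0, num_classes), where num_classes is the width of the model's last layer. With a final Dense(1), the only valid label is 0, so any real-valued target such as a completion time is reported as outside the valid range of [0, 1). A minimal sketch of the mismatch (on CPU the out-of-range label raises an error; GPU kernels may instead produce NaN):

```python
import tensorflow as tf

# SparseCategoricalCrossentropy treats y_true as integer class indices,
# and the number of classes is the size of the model's last layer.
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

logits = tf.constant([[0.3]])          # a single output unit -> one "class"
ok_label = tf.constant([0])            # 0 is the only valid class index
bad_label = tf.constant([1209638408])  # anything else is out of range

print(float(loss(ok_label, logits)))   # 0.0: with one class, softmax is always 1
try:
    loss(bad_label, logits)            # out-of-range label
except tf.errors.InvalidArgumentError:
    print("label outside the valid range [0, 1)")
```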

CodePudding user response:

So I changed it a bit and came up with this:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

features_train = features[:1500]
output_train = output[:1500]

features_test = features[1500:]
output_test = output[1500:]


classifier = Sequential()


classifier.add(Dense(units=16, activation='relu', input_dim=6))
classifier.add(Dense(units=128, activation='relu'))
classifier.add(Dense(units=64, activation='relu'))
classifier.add(Dense(units=32, activation='relu'))
classifier.add(Dense(units=8, activation='relu'))
classifier.add(Dense(units=2, activation='relu'))
classifier.add(Dense(units=1))


classifier.compile(optimizer='rmsprop', loss='binary_crossentropy')


classifier.fit(features_train, output_train, batch_size=1, epochs=10)

But now I get a loss of 100%.

CodePudding user response:

You should use a smaller network: try fewer Dense layers, two or three at most. If you use the binary_crossentropy loss, use a sigmoid activation in the last Dense layer. You can also pass metrics=['accuracy'] when compiling the model to monitor the accuracy.
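A minimal sketch of the smaller-network suggestion. Since the question says the target is a continuous number (time to finish an exercise) rather than a class label, this sketch swaps in a linear output and a mean-squared-error loss instead of binary_crossentropy — an assumption based on the question, not part of the answer above. The data here is a random stand-in for the real features:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in data: 6 features, 1 continuous target
X = np.random.rand(200, 6).astype("float32")
y = np.random.rand(200, 1).astype("float32")

# Smaller network: two hidden Dense layers, linear output for regression
model = tf.keras.Sequential([
    tf.keras.Input(shape=(6,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1)  # no activation: predict an unbounded number
])

# MSE loss fits a continuous target; mean absolute error is easier to read
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=5, verbose=0)
```

With a regression loss the reported loss is in squared target units, so it no longer reads as a percentage the way accuracy does.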
