NaN loss in fit parameters

Time:11-25

import pyreadr
from sklearn.model_selection import train_test_split
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import numpy as np
from sklearn import preprocessing


# collect the data into arrays
result = pyreadr.read_r("/home/ignat/Downloads/Telegram Desktop/TMS_coefficients.RData")
dataset = []
values = []

for i in range(694):
    dataset.append((result['tms.coef']['bs0'][i],
                    result['tms.coef']['bs'][i],
                    result['tms.coef']['bi0'][i],
                    result['tms.coef']['bi'][i],
                    result['tms.coef']['b0'][i],
                    result['tms.coef']['b1'][i],
                    result['tms.coef']['b2'][i],
                    result['tms.coef']['a0'][i],
                    result['tms.coef']['a1'][i]))

    values.append([0.0 if result['tms.coef']['Y'][i] == "НС"
                   else 1.0 if result['tms.coef']['Y'][i] == "AD"
                   else 2.0 if result['tms.coef']['Y'][i] == "DLB"
                   else 3.0])

dataset = np.array(dataset, dtype="float")
values = np.array(values, dtype="float")

print(dataset[0])
print(values[0])
(trainX, testX, trainY, testY) = train_test_split(dataset,
    values, test_size=0.25, random_state=42)

# neural network model
visible = layers.Input(shape=(9,))
drop1 = layers.Dropout(0.5, input_shape=(9,))(visible)
hidden1 = layers.Dense(32, activation="relu")(drop1)
drop2 = layers.Dropout(0.5, input_shape=(9,))(hidden1)
output = layers.Dense(1, activation="relu")(drop2)
model = tf.keras.Model(inputs=visible, outputs=output)

# compile the model
model.compile(optimizer=tf.keras.optimizers.Adam(0.01),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'],)

# train the model
model.fit(trainX, trainY, validation_data=(testX, testY), epochs=100)

model.save('my_model')

# compare predictions for the last few samples with their labels
for i in range(1, 10):
    print(model.predict(dataset[-i].reshape(1, -1)), values[-i])

When I run fit, the loss becomes NaN.

dataset[i] example: 1.35684728e-01 -4.03785135e-02 -8.27514734e-02 4.21657613e-03 1.40184876e-01 1.06088863e-02 -1.31599134e-03 -1.77011366e+00 -7.19767825e-02

values[i] example: 1.

Please help me debug just this fragment of code. I have googled the problem a lot, and I do not want an abstract instruction.

CodePudding user response:

I think the main problem here is the loss function you have chosen. sparse_categorical_crossentropy is normally used in multiclass problems and expects one output unit per class. Your output layer has only one neuron, and its relu activation can emit values that are not valid probabilities (including exactly 0), so the logarithm inside the crossentropy blows up. If you are trying to make a binary classification, you might want to switch the loss to:

# compile with a binary classification loss
model.compile(optimizer=tf.keras.optimizers.Adam(0.01),
              loss='binary_crossentropy',
              metrics=['accuracy'])
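To see why the original setup diverges rather than just fitting poorly: crossentropy losses take the log of the predicted probability of the true class, and a relu output can be exactly zero, so the loss hits log(0). A minimal NumPy sketch of this failure mode (the helper function here is an illustration, not the Keras implementation):

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, y_pred):
    # loss = -log(p[true_class]); assumes each row of y_pred holds
    # per-class probabilities
    return -np.log(y_pred[np.arange(len(y_true)), y_true])

# a relu output can be exactly 0 for the true class,
# which is not a valid probability
y_true = np.array([0])
y_pred = np.array([[0.0]])

loss = sparse_categorical_crossentropy(y_true, y_pred)
print(loss)  # [inf] -- once this mixes with gradients, training reports NaN
```

Note also that the Y column in the question actually takes four values (0.0-3.0), so another option would be a four-unit output layer, `layers.Dense(4, activation="softmax")`, keeping the original `sparse_categorical_crossentropy` loss.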