Is this LSTM underfitting?


I am trying to create a model that predicts whether or not it will rain in the next 5 days (multi-step), so I don't need the precipitation value, just a "yes" or "no". I've been testing different tools/algorithms, and I guess the big challenge here is dealing with the zero-skewed data.

The dataset consists of hourly data with columns such as precipitation, temperature, pressure, wind speed, and humidity. It has around 1 million rows. There is no requirement to use a multivariate approach.

Rain occurs mostly in months 1, 2, 3, 11, and 12.

So I tried a univariate LSTM on the data, and the hourly sampling gave the best results. I used the following architecture:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()

# Stacked LSTM: the first two layers return full sequences so the next
# LSTM layer receives a sequence; the last LSTM returns a single vector.
model.add(LSTM(150, return_sequences=True, input_shape=(1, look_back)))
model.add(LSTM(50, return_sequences=True))
model.add(LSTM(50))
model.add(Dense(1))

model.compile(loss='mean_squared_error', optimizer='adam')
history = model.fit(trainX, trainY, epochs=15, batch_size=4096,
                    validation_data=(testX, testY), shuffle=False)

I'm using a lookback value of 24*60 (1,440 hourly steps), which should correspond to roughly 2 months of history.
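For reference, the lookback windows are built roughly along these lines (a minimal sketch; the array name train_series is just a placeholder for the univariate precipitation series):

import numpy as np

def make_windows(series, look_back):
    # Slide a window of length look_back over the series;
    # each window is paired with the value that immediately follows it.
    X, y = [], []
    for i in range(len(series) - look_back):
        X.append(series[i:i + look_back])
        y.append(series[i + look_back])
    return np.array(X), np.array(y)

look_back = 24 * 60  # 1,440 hourly steps, roughly 2 months
trainX, trainY = make_windows(train_series, look_back)
trainX = trainX.reshape((trainX.shape[0], 1, look_back))  # match input_shape=(1, look_back)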

Train/Validation Loss:

https://i.stack.imgur.com/CjDbR.png

Final result:

https://i.stack.imgur.com/p6SnD.png

I read that this train/validation loss pattern means the model is underfitting. Is that the case? What could I do to prevent it?

Before using the LSTM I tried Prophet, which gave really bad results, and also tried auto-ARIMA, but it couldn't handle a yearly seasonality (365 days).

CodePudding user response:

In case of underfitting, what you can do is increase the learning rate, train for longer, and use more training data.
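As a rough sketch, assuming the Keras setup from your question, the learning rate and training duration could be adjusted like this (the specific values are only illustrative, not tuned):

from tensorflow.keras.optimizers import Adam

# Illustrative only: a larger learning rate than Adam's default and more epochs
model.compile(loss='mean_squared_error', optimizer=Adam(learning_rate=3e-3))
history = model.fit(trainX, trainY, epochs=50, batch_size=4096,
                    validation_data=(testX, testY), shuffle=False)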

It is also worth tracking an external metric such as the F1 score, because the loss alone isn't a good metric for human evaluation.
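For instance, with scikit-learn you could threshold the model's continuous outputs and score them against the binary labels (the 0.5 threshold here is an assumption, not something from your setup):

from sklearn.metrics import f1_score

# Turn continuous predictions into rain / no-rain labels and compute F1
probs = model.predict(testX)
preds = (probs > 0.5).astype(int)
print("F1:", f1_score(testY, preds.ravel()))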

Just looking at your example, I would start by experimenting a bit with the loss function: your target is binary, so it would be wiser to use a binary classification loss instead of a regression loss.
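A minimal sketch of that change, keeping the same stacked LSTM from your question but framing it as binary classification (assuming the labels are 0/1 for rain / no rain):

# Sigmoid output squashes predictions to [0, 1]; binary cross-entropy
# is the matching loss for a 0/1 target.
model = Sequential()
model.add(LSTM(150, return_sequences=True, input_shape=(1, look_back)))
model.add(LSTM(50, return_sequences=True))
model.add(LSTM(50))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])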
