How to fix Memory Error while training model?

Time:11-28

I've been working on a neural network recently, but every time I try to compile the model I get a SIGKILL which, judging by Activity Monitor, comes from running out of memory. My data is very large, but it isn't the cause of the problem, because I tried taking only a tiny slice of it and I still get the same error. This is the code I'm using:

import gzip

import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.optimizers import Adam

# Load the gzipped arrays, keeping only the first 5 samples
with gzip.GzipFile('Data_x.npy.gz', 'r') as f:
    datax = np.load(f)[:5, :, :]
with gzip.GzipFile('Data_y.npy.gz', 'r') as f:
    datay = np.load(f)[:5, :, :]

model = Sequential(
    [
        # Conv1D(32, 3, input_shape=datax.shape, activation="relu"),
        Flatten(input_shape=datax.shape),
        Dense(750, activation='relu'),
        Dense(750, activation='relu'),
        Dense(2, activation='sigmoid')
    ]
)
model.compile(optimizer=Adam(learning_rate=0.1), loss="binary_crossentropy", metrics=['accuracy'])
history = model.fit(x=datax, y=datay, batch_size=5, epochs=5, shuffle=True, verbose=2)

I've tried many different model structures and different batch sizes and epoch counts, but I still get this error. Any help in this matter would be greatly appreciated.
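One detail worth flagging in the loading code: `np.load` on a `GzipFile` has to decompress the entire array into RAM before the `[:5]` slice is applied, so slicing does not by itself reduce peak memory. A minimal sketch (file names here are illustrative, not the original data) of decompressing to disk once and then memory-mapping, so NumPy only pages in the rows actually used:

```python
import gzip
import shutil

import numpy as np

# Create a small demo array and save it gzipped, mimicking the question's
# files (names are illustrative only).
arr = np.arange(24, dtype=np.float32).reshape(4, 3, 2)
with gzip.open('demo_x.npy.gz', 'wb') as f:
    np.save(f, arr)

# Decompress once to a plain .npy on disk...
with gzip.open('demo_x.npy.gz', 'rb') as f_in, open('demo_x.npy', 'wb') as f_out:
    shutil.copyfileobj(f_in, f_out)

# ...then memory-map it: only the rows touched by the slice are read into RAM.
datax = np.load('demo_x.npy', mmap_mode='r')[:2]
```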

CodePudding user response:

You could add dropout layers to your model.

Dropout is a technique where randomly selected neurons are ignored during training. They are "dropped out" randomly. This means that their contribution to the activation of downstream neurons is temporarily removed on the forward pass, and no weight updates are applied to those neurons on the backward pass.
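As a rough illustration of the idea (a plain NumPy sketch of inverted dropout, not Keras's internal implementation), a random fraction of activations is zeroed and the survivors are rescaled so the expected activation is unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, training=True):
    # Inverted dropout: at training time, zero a fraction `rate` of units
    # and scale the survivors by 1/(1 - rate); at inference time the input
    # passes through untouched.
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

activations = np.ones(1000)
dropped = dropout(activations, rate=0.2)
# Roughly 20% of units are zeroed; surviving units are scaled to 1/0.8 = 1.25
```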

model = Sequential(
    [
        # Conv1D(32, 3, input_shape=datax.shape, activation="relu"),
        Flatten(input_shape=datax.shape),
        Dense(750, activation='relu'),
        Dropout(0.2),  # randomly drops 20% of the units during training
        Dense(750, activation='relu'),
        Dropout(0.2),
        Dense(2, activation='sigmoid')
    ]
)