I'm building a deep neural network and I keep getting "TypeError: __init__() takes from 1 to 3 positional arguments but 4 were given"


I'm trying to develop a deep neural network to predict a single parameter from multiple inputs. However, I'm getting the error stated in the title, and I'm not sure why. I haven't even called an __init__() method in my code, so I'm confused as to why it's giving me this error.

Here is the code I've written so far, which yields the error below. I would appreciate any help, thanks!

import pandas as pd
import tensorflow as tf
import numpy as np
from tensorflow import keras
from tensorflow.keras import models
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.model_selection import train_test_split
import matplotlib.pyplot as plt

d = pd.read_csv(r"AirfoilSelfNoise.csv")
x = d.iloc[:, 0:5] #frequency [Hz], angle of attack [deg], chord length [m], free-stream velocity [m/s], suction side displacement thickness [m], input
y = d.iloc[:, 5] #scaled sound pressure level [dB], output
df = pd.DataFrame(d, columns=['f', 'alpha', 'c', 'U_infinity', 'delta', 'SSPL'])

xtrain, xtest, ytrain, ytest = train_test_split(x, y, test_size=0.2, random_state=42)

mod = keras.Sequential(
    keras.layers.Dense(30, input_shape=(5,), activation='relu'),
    keras.layers.Dense(25, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid')
)

mod.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
mod.fit(xtrain, ytrain, epochs=50)

TypeError: __init__() takes from 1 to 3 positional arguments but 4 were given

CodePudding user response:

You forgot to add brackets in the Sequential call. Written this way, each layer is passed to Sequential as a separate positional argument. Sequential.__init__() only accepts a list of layers plus an optional name, and keras.Sequential(...) calls __init__() for you, which is why you get this error even though you never call it explicitly. The first argument needs to be a list of your desired layers. In your case:

mod = keras.Sequential([
    keras.layers.Dense(30, input_shape=(5,), activation='relu'),
    keras.layers.Dense(25, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid')
])
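Alternatively, you can build the same model without the list by starting with an empty Sequential model and appending layers one at a time with add(). A minimal sketch, assuming the same layer sizes and compile settings as in your code:

mod = keras.Sequential()
# Append layers one at a time instead of passing a list to the constructor.
mod.add(keras.layers.Dense(30, input_shape=(5,), activation='relu'))
mod.add(keras.layers.Dense(25, activation='relu'))
mod.add(keras.layers.Dense(1, activation='sigmoid'))

mod.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

Both versions produce the same architecture; the list form and the add() form are interchangeable ways of constructing a Sequential model.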