I want to use the ReLU function in a neural network but I don't know how to implement it since my input is an array


I want to use the ReLU function as an activation function in a neural network, but it throws an error because my input is an array. Here is the code where I define my function:

def relu(x):
    if x<0:
       x=0
    else:
       x=x
    return x
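
For example, assuming the input is a NumPy array, the comparison in the if statement raises an error:

import numpy as np

relu(np.array([-1.0, 2.0, 3.0]))
# ValueError: The truth value of an array with more than one element is ambiguous.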

I expect to get an array as an output as well.

CodePudding user response:

You can do something like the below for your input layer.

def relu(x):
    if x > 0:
        return x
    else:
        return 0

input_vals = [-1, 2, 3, 4, 5]
output = [relu(x) for x in input_vals]
print(output)
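# prints [0, 2, 3, 4, 5]; the negative input is clipped to zero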

Note that you keep the value only when it is greater than zero; otherwise you return zero. Then you can apply relu to all the inputs in your input layer with a list comprehension, as shown above.
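
If it helps, here is a minimal sketch of how that element-wise relu fits into a simple layer's forward pass; the dense_relu_layer name, weights, and biases are made up purely for illustration:

def dense_relu_layer(inputs, weights, biases):
    # weighted sum for each neuron, then relu applied to the result
    outputs = []
    for neuron_weights, b in zip(weights, biases):
        z = sum(w * x for w, x in zip(neuron_weights, inputs)) + b
        outputs.append(relu(z))
    return outputs

# two neurons over the five inputs above (weights are arbitrary examples)
weights = [[0.1, -0.2, 0.3, 0.0, 0.5],
           [-0.4, 0.1, 0.0, 0.2, -0.1]]
biases = [0.0, -1.0]
print(dense_relu_layer(input_vals, weights, biases))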

CodePudding user response:

If you are using NumPy for your arrays (which you probably should), then you can implement ReLU as follows:

import numpy as np

def relu(x):
    return np.where(np.asarray(x) > 0, x, 0)

Note that using np.asarray(x) instead of just x means that you can pass something "array-like" to the relu() function as well as a NumPy array.

This should work for array inputs of any dimension.
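
For example (values made up purely to show the behaviour):

print(relu([-1, 2, -3]))    # works on a plain Python list -> [0 2 0]

arr = np.array([[1.5, -0.5],
                [-2.0, 3.0]])
print(relu(arr))            # negatives become 0, the 2-D shape is preserved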
