Endless Confusion Deriving the Sigmoid Function


import numpy as np

def sigmoid(x):
  return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
  # as in the tutorial, x here is the *output* of sigmoid, not the raw input
  return x * (1 - x)

I'm learning about neural networks from this simple neural network example: https://www.kdnuggets.com/2018/10/simple-neural-network-python.html

# Let's say
x = 2
y = sigmoid(x)
y
Output: 0.8807970779778823

slope = sigmoid_derivative(y)
slope
Output: 0.10499358540350662

Now, if m * x = y, then with m = slope:

slope * x = y

0.10499358540350662 * 2 = 0.2099871708 ?

The numbers don't add up, and I have a feeling I am fundamentally misunderstanding the whole process. Is there any assistance you can offer? Thank you so much.

CodePudding user response:

As is known from calculus, the statement

slope * x = y

holds only for a linear function of the form y = m * x, which sigmoid obviously isn't, especially around x = 2. (For a truly linear f(x) = m * x, the slope is the constant m everywhere, so f'(x) * x = m * x = f(x); sigmoid's slope changes from point to point, so no such identity can hold.) The only thing you should expect is that

sigmoid_derivative(sigmoid(x0)) * (x1 - x0) + sigmoid(x0) = sigmoid(x1) + eps

where eps is a small error term: the smaller |x1 - x0| is, the smaller |eps| is. (This is just the first-order Taylor approximation of sigmoid around x0.)

You can easily check it yourself.
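For instance, here is a minimal sketch of that check (repeating the corrected definitions from the question; the step sizes h are just illustrative choices):

import numpy as np

def sigmoid(x):
  return 1 / (1 + np.exp(-x))

def sigmoid_derivative(s):
  # s is the output of sigmoid, not the raw input
  return s * (1 - s)

x0 = 2.0
for h in (1.0, 0.1, 0.01, 0.001):
  x1 = x0 + h
  # first-order (linear) estimate of sigmoid(x1) built around x0
  estimate = sigmoid(x0) + sigmoid_derivative(sigmoid(x0)) * (x1 - x0)
  eps = estimate - sigmoid(x1)
  print(f"h={h}: estimate={estimate:.6f} actual={sigmoid(x1):.6f} eps={eps:+.2e}")

As h shrinks, |eps| shrinks roughly quadratically, which is exactly the behavior you would expect from a first-order approximation.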
