Partial Derivative term in the Gradient Descent Algorithm

Time:05-14

I'm taking the "Machine Learning - Andrew Ng" course on Coursera. In the lesson called "Gradient Descent", I find the formula a bit complicated: it contains a "partial derivative" term.
My problem is understanding how that partial derivative term is calculated. Later in the lesson, the term is evaluated as

(1/m) · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)


My question is: how does the 1/(2m) from the cost function become 1/m when the partial derivative inside the gradient descent update is computed?

CodePudding user response:

The derivative of x² is 2x. Similarly, differentiating Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)² with respect to θ brings down a factor of 2, giving 2 · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) (the chain rule also contributes the x_j⁽ⁱ⁾ factor that appears in the full formula). That factor of 2 cancels the 2 in 1/(2m), so the derivative of (1/2m) · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)² is (1/m) · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾). The 1/2 is included in the cost function precisely so that this cancellation happens.
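You can check this cancellation numerically. The sketch below (my own illustration, not the course's Octave code; the names `X`, `y`, `theta` and the toy data are assumptions) defines the cost J(θ) = 1/(2m) · Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)² for a linear hypothesis and compares the analytic gradient, which uses 1/m (not 1/(2m)), against a finite-difference estimate of each partial derivative:

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/(2m) * sum((h_theta(x_i) - y_i)^2) for h_theta(x) = x . theta."""
    m = len(y)
    residual = X @ theta - y              # h_theta(x_i) - y_i for every i
    return residual @ residual / (2 * m)

def gradient(theta, X, y):
    """Analytic partial derivatives: 1/m * sum((h_theta(x_i) - y_i) * x_ij)."""
    m = len(y)
    return X.T @ (X @ theta - y) / m      # note 1/m, not 1/(2m): the 2 cancelled

# Hypothetical toy data: bias column plus 2 random features.
rng = np.random.default_rng(0)
X = np.c_[np.ones(5), rng.normal(size=(5, 2))]
y = rng.normal(size=5)
theta = rng.normal(size=3)

# Central finite-difference estimate of each partial derivative of J.
eps = 1e-6
numeric = np.array([
    (cost(theta + eps * np.eye(3)[j], X, y) -
     cost(theta - eps * np.eye(3)[j], X, y)) / (2 * eps)
    for j in range(3)
])

analytic = gradient(theta, X, y)
print(np.allclose(numeric, analytic, atol=1e-6))  # True
```

If `gradient` kept the 1/(2m) from the cost function instead, the check would fail by exactly a factor of 2.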
