I have the following definition for calculating the gradient at a pixel using central difference:
Where h is small, f'(x) = f(x + 0.5h) - f(x - 0.5h)
• If we make h twice the distance between pixels
• The above equation simply states that the image gradient (derivative) at a pixel is the next (right) pixel’s value minus the previous (left) pixel’s value
Why is it not necessary to divide by h to get the rate of change? Why does simply subtracting the left pixel's value from the right pixel's value give the derivative at the central pixel?
CodePudding user response:
Your definition is wrong. You do need to divide by h to get a proper estimate of the derivative.
In image processing, we often see definitions for derivatives that are off by a constant scaling, like the one you have here. In most applications the scaling is not important; what matters is comparing values in different parts of the image, for example to find the most salient edges. For these cases it is OK to use a simplified definition (which may also be cheaper to compute).
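As a concrete illustration, here is a minimal NumPy sketch (the sample values are made up for the example) showing that the "right minus left" form is the true central difference scaled by h = 2, the spacing between the two sampled pixels:

```python
import numpy as np

# One row of pixels sampling f(x) = x^2 at integer x, so the true derivative is 2x
row = np.array([0.0, 1.0, 4.0, 9.0, 16.0, 25.0])

# Proper central difference: (f(x+1) - f(x-1)) / 2, valid at interior pixels
proper = (row[2:] - row[:-2]) / 2.0

# Simplified "right minus left" version with no division by h
simplified = row[2:] - row[:-2]

print(proper)      # [ 2.  4.  6.  8.] -> matches f'(x) = 2x at x = 1, 2, 3, 4
print(simplified)  # [ 4.  8. 12. 16.] -> exactly h = 2 times larger
```

The simplified result still ranks edges correctly, it is just uniformly scaled.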
For example, the Sobel operator is usually defined so that it produces a value 8 times larger than the derivative it is estimating.
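A short sketch of where that factor comes from, assuming SciPy's `ndimage.sobel` (the horizontal-ramp image is just an illustrative input): the Sobel x-kernel is the outer product of the smoothing filter [1, 2, 1] (which sums to 4) and the unscaled central difference [-1, 0, 1] (which is 2 times the derivative), so its response is 8 times the true slope.

```python
import numpy as np
from scipy import ndimage

# A horizontal ramp: pixel value equals its column index, so the true d/dx is 1
img = np.tile(np.arange(8, dtype=float), (8, 1))

sobel_x = ndimage.sobel(img, axis=1)
print(sobel_x[4, 4])        # magnitude 8 in the interior -> 8x the true slope of 1
print(sobel_x[4, 4] / 8.0)  # divide by 8 to recover the actual derivative
```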