I'm trying to count the black and white pixels in an image with NumPy and OpenCV, but the final count doesn't match the real value. I'm using the following image (4x4):
with the following code:
# importing libraries
import cv2
import numpy as np
from math import *
path = "pictures/test.png"
# reading the image data from desired directory
img = cv2.imread(path)
cv2.imshow('Image', img)
height, width, color = img.shape
# counting the number of pixels
number_of_white_pix = np.sum(img == 255)
number_of_black_pix = np.sum(img == 0)
def phase():
    white = number_of_white_pix
    black = number_of_black_pix
    # fraction of white among all counted values
    phaseper = white / (white + black)
    print(white)
    print(black)
    return phaseper

print(phase())
I get the following output:
9
36
0.2
Process finished with exit code 0
Which means it counted 9 white pixels and 36 black pixels, which is clearly wrong: as can be seen from the image, the correct count is 3 white and 13 black pixels (for a total of 4x4 = 16 pixels). Since the code doesn't raise any errors and doesn't seem to be wrong, I don't know what is going on.
CodePudding user response:
Dumping the data of the original image using print(img)
gives
[[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
[ 0 0 0]]
[[ 0 0 0]
[ 0 0 0]
[ 0 0 0]
[ 0 0 0]]
[[255 255 255]
[255 255 255]
[255 255 255]
[ 0 0 0]]
[[ 1 1 1]
[ 0 0 0]
[ 0 0 0]
[ 0 0 0]]]
This agrees with the results of your code: there are 9 values equal to 255 and 36 values equal to 0. As you can see, one of the pixels is not quite black, but a very dark gray.
The reason each value appears three times is that the image is loaded with three color channels (which OpenCV orders as BGR, not RGB). If you only care about handling grayscale images, you can tell OpenCV to load the image as grayscale instead:
img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
print(img) now gives
[[  0   0   0   0]
 [  0   0   0   0]
 [255 255 255   0]
 [  1   0   0   0]]
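Note that an exact comparison against 0 will still miss that value-1 pixel. One way to classify every pixel as black or white is to threshold the grayscale values. A sketch, again reconstructing the grayscale array so it runs standalone (128 is an arbitrary midpoint cutoff, not something mandated by your data):

```python
import numpy as np

# Stand-in for cv2.imread(path, cv2.IMREAD_GRAYSCALE).
gray = np.zeros((4, 4), dtype=np.uint8)
gray[2, 0:3] = 255
gray[3, 0] = 1   # the "very dark gray" pixel

# Exact comparison misses the value-1 pixel:
print(np.sum(gray == 0))        # 12, not 13

# Thresholding assigns every pixel to one of the two classes.
white = np.sum(gray >= 128)     # 3
black = np.sum(gray < 128)      # 13
print(white, black, white / (white + black))
```

With this, every one of the 16 pixels is counted exactly once, and white + black always equals the pixel total.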