grayscale image different in cv2.imshow() and matplotlib.pyplot.show()


import cv2
import numpy as np
import math
import sys
import matplotlib.pyplot as plt
import utils as ut

imgGray = cv2.imread(imgfile, cv2.IMREAD_GRAYSCALE)  # imgfile holds the path to the image

# display with matplotlib
plt.imshow(imgGray, cmap='gray')
plt.show()

# display with OpenCV
cv2.imshow("", imgGray)
cv2.waitKey(0)
cv2.destroyAllWindows()

sys.exit()

plt.show() result:

[screenshot of the matplotlib window]

cv2.imshow() result:

[screenshot of the OpenCV window]

I thought both would look the same, but as you can see, the two pictures show different gray levels. plt.show() appears darker than cv2.imshow().

How do I make the grayscale in plt.show() look the same as in cv2.imshow()?

Python : 3.9.6

opencv-python : 4.5.3.56

matplotlib : 3.4.3

CodePudding user response:

This is matplotlib's default behavior. It finds the minimum and maximum values in your image, maps them to black and white, and scales everything in between.
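As a rough illustration (a simplified sketch, not matplotlib's actual implementation; the helper name autoscale is made up), the default scaling amounts to:

import numpy as np

def autoscale(img):
    # Simplified sketch of matplotlib's default min/max normalization:
    # the smallest pixel becomes 0.0 (black), the largest 1.0 (white).
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

So an image whose pixel values only span, say, 100 to 150 is still rendered with full contrast, from pure black to pure white.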

This is useful for arbitrary data, which may be integer or floating point and may have a value range of 0.0 to 1.0, 0 to 255, or anything else.

OpenCV does no such auto-scaling. It has fixed rules: if the image is floating point, 0.0 is black and 1.0 is white; if it's uint8, the range is 0 to 255.
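To make plt.show() match cv2.imshow(), disable the auto-scaling by pinning the display range with vmin and vmax. A minimal sketch, assuming imgGray is a uint8 array loaded as in the question:

import cv2
import matplotlib.pyplot as plt

imgGray = cv2.imread(imgfile, cv2.IMREAD_GRAYSCALE)  # imgfile: path to the image, as in the question

# Pin the display range to the full uint8 range so matplotlib does not
# stretch the contrast to the image's own min/max.
plt.imshow(imgGray, cmap='gray', vmin=0, vmax=255)
plt.show()

For a floating-point image, vmin=0.0 and vmax=1.0 would match cv2.imshow()'s convention instead.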
