Why doesn't Matplotlib read image as grayscale?


I use matplotlib.pyplot.imsave with the argument cmap='gray' to save a 1024x1024 NumPy array as a grayscale image, but when I then read the saved image back using matplotlib.pyplot.imread, I get a 1024x1024x4 NumPy array. Why is this?

Here is the code:

import numpy as np
import matplotlib.pyplot as plt

im = np.random.rand(1024, 1024)
print(im.shape)

plt.imsave('test.png', im, cmap='gray')
im = plt.imread('test.png')

print(im.shape)

The documentation for imread states that "The returned array has shape (M, N) for grayscale images." I suppose this raises the question of what exactly is meant by a grayscale image? How are they stored on disk, and how is Matplotlib supposed to know whether to read an image as grayscale, RGB, RGBA, etc. (and why is it being read as an RGBA image in this case)?
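The behaviour can be checked directly. Since cmap='gray' writes identical R, G and B channels, any single channel recovers the original data; a sketch (passing vmin=0 and vmax=1 so imsave does not rescale to the data's own range, with a placeholder filename):

```python
import numpy as np
import matplotlib.pyplot as plt

im = np.random.rand(1024, 1024)
plt.imsave('test.png', im, cmap='gray', vmin=0, vmax=1)

loaded = plt.imread('test.png')
print(loaded.shape)  # (1024, 1024, 4) -- RGBA, floats in [0, 1]

# The gray colormap sets R == G == B, so one channel recovers the
# original array up to 8-bit quantization error.
gray = loaded[..., 0]
print(np.abs(gray - im).max())  # small (on the order of 1/255)
```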

CodePudding user response:

I believe the cmap parameter doesn't change the file structure at all in imsave; the colormap is applied to the data and the result is written out as a full RGBA image.

The matplotlib source for this function doesn't appear to take cmap into account when deciding how many channels to save: https://github.com/matplotlib/matplotlib/blob/v3.5.3/lib/matplotlib/image.py#L1566-L1675
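If the goal is a file that imread returns with shape (M, N), one workaround (a sketch, assuming Pillow is available, which matplotlib itself depends on; the filename is a placeholder) is to write a true single-channel PNG yourself:

```python
import numpy as np
from PIL import Image
import matplotlib.pyplot as plt

im = np.random.rand(1024, 1024)

# Quantize to 8 bits and save in Pillow's single-channel "L" mode,
# which produces a genuinely grayscale PNG on disk.
Image.fromarray((im * 255).astype(np.uint8), mode='L').save('test_gray.png')

print(plt.imread('test_gray.png').shape)  # (1024, 1024) -- no extra channels
```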

CodePudding user response:

I also think that Plain Onion's answer is correct.

Additionally, rather than matplotlib, if you want to save a grayscale image you can use OpenCV. Try this code:

import cv2

# OpenCV loads images in BGR channel order, hence COLOR_BGR2GRAY
img = cv2.imread("path/to/input/image")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
cv2.imwrite("path/to/output/image", gray)  # writes a single-channel file