Python processing data sets out of memory

Time:09-21

I want to process the MirFlickr-25k image data set into .mat format and have written the code below, but it always runs out of memory. What can I do?

from skimage.io import imread, imshow
import scipy.io as scio
from skimage.transform import resize
from os.path import join
from os import listdir
import numpy as np
from tqdm import tqdm

IMG_P = r'D:\mirflickr25k\mirflickr'

# build the image file list
key_img = lambda s: int(s.split('.jpg')[0].split('im')[1])
fs_img = [f for f in listdir(IMG_P) if '.jpg' in f]
fs_img = sorted(fs_img, key=key_img)  # sort by image index
fs_img = [join(IMG_P, f) for f in fs_img]

N_IMG = 25000
batch_size = 2500
print(N_IMG)

all_img = []

for i in tqdm(range(N_IMG)):
    im = imread(fs_img[i])
    im = resize(im, (224, 224, 3), mode='constant')  # returns float64 in [0, 1]
    im = np.expand_dims(im, axis=0)
    im = im.transpose((0, 3, 2, 1))  # channel-last to channel-first
    all_img.append(im)

all_img = np.vstack(all_img)
scio.savemat('D:/result/img', {'image': all_img})
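An alternative that avoids holding all 25,000 resized images in RAM at once is to write one .mat file per batch, so peak memory is a single batch. This is only a sketch, not the poster's code: the `save_in_batches` helper, the output file naming, and the use of the otherwise-unused `batch_size` variable are my assumptions.

```python
import numpy as np
import scipy.io as scio


def save_in_batches(paths, batch_size, out_pattern, load_fn):
    """Load images batch by batch and save each batch to its own .mat file,
    so peak memory is one batch rather than the whole data set."""
    for start in range(0, len(paths), batch_size):
        # load_fn should return one image as a (1, C, H, W) array
        batch = [load_fn(p) for p in paths[start:start + batch_size]]
        arr = np.vstack(batch)
        scio.savemat(out_pattern.format(start // batch_size), {'image': arr})
```

With `batch_size=2500`, this would produce ten files of 2,500 images each, which can be concatenated later or consumed file by file.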

CodePudding user response:

I changed the ndarray data type to int8, and that solved it.
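To make the arithmetic behind that fix explicit: `skimage.transform.resize` returns float64, so the stacked array costs 8 bytes per value, and switching to an 8-bit dtype cuts memory by a factor of eight. A minimal sketch (my own illustration; note that images are conventionally stored as uint8 with values 0-255, which is usually a better fit than int8):

```python
import numpy as np

# 25000 images x 3 x 224 x 224 values, at 8 bytes each for float64,
# versus 1 byte each for an 8-bit dtype.
n, c, h, w = 25000, 3, 224, 224
float64_bytes = n * c * h * w * np.dtype(np.float64).itemsize  # about 28 GB
uint8_bytes = n * c * h * w * np.dtype(np.uint8).itemsize      # about 3.5 GB

# Converting each resized image back to 8-bit before stacking:
im = np.random.rand(1, 3, 224, 224)   # stand-in for one resized float64 image
im_u8 = (im * 255).astype(np.uint8)   # rescale [0, 1] floats to the 0-255 range
```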

CodePudding user response:

Congratulations on solving it, OP!