Limiting Python file space in memory


When you open a file in Python (e.g., open(filename, 'r')), does it load the entire file into memory? More importantly, is there a way to partially load a file into memory to save space (on larger systems), or am I overthinking this? In particular, I'm trying to optimize this in a cloud environment where I only need ~1-2 lines of a large file and would prefer not to load all of it into memory, since we pay for computation time.

General question; nothing was tested. Looking for opinions and such.

CodePudding user response:

There is no extra argument you can pass to open() to limit how much of the file is loaded; instead, you change how you read the lines from the file object. For example:

# open the sample file used
with open('test.txt') as file:
    # read the content of the file opened
    # (note: readlines() loads the whole file into memory at once)
    content = file.readlines()

# read 10th line from the file
print("tenth line")
print(content[9])

# print first 3 lines of file
print("first three lines")
print(content[0:3])

You could also use the file.readline() method to read individual lines from a file on demand.
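
For instance, a short sketch that reads only the first line and leaves the rest of the file untouched on disk (reusing the test.txt file from above):

# read just the first line; nothing beyond it is loaded
with open('test.txt') as file:
    first_line = file.readline()

print(first_line)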

Be aware that readlines() still reads the entire file into memory, and as a list of Python strings it will typically occupy more space than the raw file on disk, not less. To keep memory use low, read lazily instead: calling readline(), or iterating over the file object directly, pulls in only one line (plus a small internal buffer) at a time.
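
For the original use case of fetching just one or two lines from a large file, a minimal sketch using itertools.islice (again with the test.txt placeholder; the line indices are just examples):

import itertools

# skip lazily to the 10th line (index 9); earlier lines are read
# and discarded one at a time, so memory use stays constant no
# matter how large the file is
with open('test.txt') as file:
    tenth_line = next(itertools.islice(file, 9, 10), None)

print(tenth_line)

Here next(..., None) returns None if the file has fewer than ten lines, instead of raising StopIteration.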
