How to take all files from a folder as input (one at a time) and save them into another folder after

Time:11-16

I have a .py script which takes one .csv file at a time and pre-processes it. I run it from the command line like this:

    python3 cleaning_script.py file1.csv results_file1.csv

It works completely fine. However, there are 300 files in a folder that I have to pre-process. Is there a better way (a loop, maybe, but I don't know how to write one in this environment) in which I can point at the folder, so that each file is taken as input from the source folder and the pre-processed file is saved in the destination folder?

Because I don't have administrative rights on the system (Windows 10), I am using the RStudio terminal to execute my bash commands.

CodePudding user response:

With bash, if there isn't a massive number of files, you can use a glob:

for file in path/to/dir/*.csv
do
    python3 cleaning_script.py "$file" "$(dirname "$file")/result_$(basename "$file")"
done
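If shell looping is awkward in the RStudio terminal on Windows, the same batch job can also be driven from Python itself. A minimal sketch, where the folder paths and the `result_` prefix are placeholders, and `sys.executable` stands in for the `python3` used above:

```python
import subprocess
import sys
from pathlib import Path

def clean_folder(src_dir, dst_dir, script="cleaning_script.py"):
    """Run the cleaning script once per .csv in src_dir, saving each
    result into dst_dir (created if it does not exist yet)."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for csv_file in sorted(src.glob("*.csv")):
        out_file = dst / f"result_{csv_file.name}"
        # sys.executable is the interpreter running this code; the
        # original command invoked `python3` explicitly.
        subprocess.run([sys.executable, script, str(csv_file), str(out_file)],
                       check=True)  # raise if the script fails on a file

# e.g. clean_folder("path/to/source", "path/to/destination")
```

`check=True` stops the batch on the first file that fails rather than silently skipping it, which is usually what you want for a pre-processing pipeline.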


CodePudding user response:

I think you are looking for this function from the "os" module! os.listdir(path='.') returns a list of all entries at "path"; this includes files and other directories. If you want to loop recursively through all entries of a directory, look at os.walk(top, topdown=True, onerror=None, followlinks=False) here! It's a little more complicated but more powerful.
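A minimal sketch of that approach; the helper name and folder path are placeholders. Note that os.listdir returns bare file names, so they have to be joined back onto the directory before being passed to the cleaning script:

```python
import os

def list_csvs(path="."):
    """Return full paths of the .csv files in `path`."""
    return [os.path.join(path, name)
            for name in sorted(os.listdir(path))
            if name.endswith(".csv")]

# e.g. for csv_path in list_csvs("path/to/source"): ...
```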
