I need to merge a lot of text files into a single one. All files are named "patch_number.txt" and they are located in different directories. I've already tried some code, but I was only able to create one file per distinct "patch_number.txt" name: for example, one merged file for all of the "patch_0.txt" files. Here's the code:
import os

paths = {}
for root, directories, files in os.walk('.'):
    for f in files:
        # note: startswith('patch_', 0, 5) never matches, because the
        # prefix is six characters and the slice is only five wide
        if f.startswith('patch_'):
            if f not in paths:
                paths[f] = []
            paths[f].append(root)

for f, dirs in paths.items():
    txt = []
    for p in dirs:
        with open(os.path.join(p, f)) as f2:
            txt.append(f2.read())
    with open(f, 'w') as f3:
        f3.write(''.join(txt))
CodePudding user response:
You should use a single global txt list that accumulates the contents of every file, and then write it out once, after the loop, to one output file. Something like this:
import os

paths = {}
for root, directories, files in os.walk('.'):
    for f in files:
        if f.startswith('patch_'):
            if f not in paths:
                paths[f] = []
            paths[f].append(root)

txt = []
for f, dirs in paths.items():
    for p in dirs:
        with open(os.path.join(p, f)) as f2:
            txt.append(f2.read())

# Write the accumulated text once, after the loop, so everything
# ends up in a single output file (pick any name you like).
with open('merged.txt', 'w') as f3:
    f3.write(''.join(txt))
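If you don't need the intermediate dict at all, the same merge can be done more directly with pathlib, whose rglob walks the tree recursively and matches the file name pattern in one step. A minimal sketch; the function name merge_patches and the output file name are my own choices, not anything from the question:

```python
from pathlib import Path

def merge_patches(root: str, out_file: str) -> None:
    # Recursively collect every patch_*.txt under root (sorted so the
    # merge order is deterministic) and concatenate their contents
    # into a single output file.
    patch_files = sorted(Path(root).rglob('patch_*.txt'))
    with open(out_file, 'w') as out:
        for path in patch_files:
            out.write(path.read_text())

# e.g. merge_patches('.', 'merged.txt')
```

One caveat either way: if the output file lands inside the tree being walked, make sure its name doesn't match the patch_*.txt pattern, or a second run will merge the merged file into itself.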