I am merging 64 GB of mp4s together, but ffmpeg goes past the file size limit and corrupts the output. Is there a way to stop ffmpeg at the 100-hour mark, start another file, resume, and repeat until finished?
This is my Python code, with the ffmpeg command I used to generate the mp4 in a comment at the end. It works fine with fewer files.
from moviepy.editor import *
import os
from natsort import natsorted

L = []
total = 0
for root, dirs, files in os.walk(r"F:\door"):  # raw string so the backslash isn't treated as an escape
    #files.sort()
    files = natsorted(files)
    with open("list.txt", "a") as filer:
        for file in files:
            if os.path.splitext(file)[1] == '.mp4':
                filePath = os.path.join(root, file)
                head, tail = os.path.split(filePath)
                filePath = "file '" + str(tail) + "'\n"  # one concat-demuxer entry per mp4
                print(filePath)
                filer.write(filePath)
# run in cmd: ffmpeg -f concat -i list.txt -c copy output.mp4
CodePudding user response:
Yes, you can use the -t option in ffmpeg to limit the duration of the output file. For example, to create a 100-hour output file, you can use -t 100:00:00. You can then run ffmpeg several times, each run picking up where the previous one stopped, until all of the input has been processed.
To use this with your script, you can modify the command to something like this:
ffmpeg -f concat -i list.txt -t 100:00:00 -c copy output.mp4
For each subsequent part, keep -t 100:00:00 but add an -ss offset that skips what the earlier parts already contain (-ss 100:00:00 for the second file, -ss 200:00:00 for the third, and so on) and change the output filename. You might want to use a loop or some other control flow construct to automate this, as in the sketch below.
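A minimal Python sketch of that loop, assuming a hypothetical total length (TOTAL_HOURS) that you would replace with the real figure, and assuming that -ss seeking behaves acceptably with the concat demuxer on your ffmpeg build (with -c copy the cuts land on keyframes, so test on a small subset first):

import subprocess

CHUNK = "100:00:00"            # duration of each output file (100 hours)
CHUNK_SECONDS = 100 * 3600     # the same value in seconds, used for the -ss offset
TOTAL_HOURS = 350              # hypothetical total length of the concatenated video

part = 0
offset = 0
while offset < TOTAL_HOURS * 3600:
    subprocess.run([
        "ffmpeg",
        "-ss", str(offset),                 # skip what earlier parts already contain
        "-f", "concat", "-i", "list.txt",
        "-t", CHUNK,                        # stop after 100 hours of output
        "-c", "copy",
        f"output{part}.mp4",
    ], check=True)
    part += 1
    offset += CHUNK_SECONDS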
Alternatively, you can use ffmpeg's segment muxer to split the output into multiple files automatically. To do this, you can use a command like this:
ffmpeg -f concat -i list.txt -c copy -map 0 -segment_time 100:00:00 -f segment output%d.mp4
This will create multiple output files, each roughly 100 hours long (with -c copy the splits fall on keyframes, and the last file holds whatever remains). The output files will be named output0.mp4, output1.mp4, and so on.
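If you would rather keep everything in your Python script, a minimal sketch that runs the segment-muxer command after list.txt has been written (adding -reset_timestamps 1 so each part starts at timestamp zero; drop that option if you want the original timestamps preserved) might look like this:

import subprocess

# Split the concatenated stream into ~100-hour parts in a single pass.
subprocess.run([
    "ffmpeg",
    "-f", "concat", "-i", "list.txt",
    "-c", "copy",
    "-map", "0",
    "-f", "segment",
    "-segment_time", "100:00:00",
    "-reset_timestamps", "1",
    "output%d.mp4",
], check=True)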