I'm trying to scan all the videos inside a specific folder and create a thumbnail for each of them.
Here is the relevant code:
public async scan(): Promise<string[]> {
  const files = await this.readFile(this._directoryPath);
  const result: string[] = [];
  for await (const file of files) {
    result.push(file);
    try {
      ffmpeg(file)
        .on("error", (err, stdout, stderr) => {
          console.log(err.message);
        })
        .screenshots({
          timestamps: [5],
          filename: basename(file) + ".png",
          folder: this._thumbnailPath,
        });
    } catch (e: any) {
      logger.error("Failed taking screenshot for " + file + " with error " + e.message);
    }
  }
  return result;
}
It runs fine for a small number of videos, but when I tried it on a network path (\\Servername\some_folder) containing 2000 video files, my PC died after a moment. It managed to scan ~800 videos, then everything crashed.
Is there a way to make this a background process? I don't need to wait for it to finish. Or is it possible to run this chunk by chunk from a single API call?
I'm completely new to Node.js, so any help is appreciated.
CodePudding user response:
Your problem seems to be that the code tries to process all input files at once, in parallel: each loop iteration kicks off an ffmpeg process without waiting for the previous one to finish. As a result, you run into resource allocation issues.
From the usage of await, I assume you were trying to handle this. The problem is that the ffmpeg(...).on(...).screenshots(...) call chain returns immediately, without waiting for the screenshots to be created first. So await is not working the way you want it to.
You need to wait for each video file to finish before moving on to the next one.
A solution would be to place the thumbnail generation inside a promise:
function processFile(file) {
  return new Promise((resolve, reject) => {
    ffmpeg(file)
      .on("error", (err) => reject(err)) // reject the promise if ffmpeg fails
      .on("end", () => resolve(file))    // resolve once the screenshot has been written
      .screenshots(/* ...YOUR_OPTIONS_HERE... */);
  });
}
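With this wrapper, the returned promise only settles once fluent-ffmpeg emits its end (or error) event, so awaiting it really does block the loop until the screenshot has been written.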
And then, instead of doing:
result.push(file);
try {
  ffmpeg(file)...
to do something like:
try {
  result.push(await processFile(file));
} catch (e: any) {
  logger.error("Failed taking screenshot for " + file + " with error " + e.message);
}
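For completeness, here is a minimal sketch of the asker's scan() method with this change applied. It reuses the readFile, _directoryPath and logger members and the catch block from the question; processFile is the wrapper above, assumed to be filled in with the question's screenshot options:

public async scan(): Promise<string[]> {
  const files = await this.readFile(this._directoryPath);
  const result: string[] = [];
  for await (const file of files) {
    try {
      // Wait for the current ffmpeg run to finish before starting the next,
      // so only one screenshot is being generated at any time.
      result.push(await processFile(file));
    } catch (e: any) {
      logger.error("Failed taking screenshot for " + file + " with error " + e.message);
    }
  }
  return result;
}

If fully sequential processing turns out to be too slow, a middle ground is to process the files in small batches, which also addresses the "chunk by chunk" part of the question. The sketch below assumes files is a plain array of paths; the batch size of 5 is an arbitrary example:

const CHUNK_SIZE = 5; // arbitrary: at most 5 ffmpeg processes at a time
for (let i = 0; i < files.length; i += CHUNK_SIZE) {
  const chunk = files.slice(i, i + CHUNK_SIZE);
  // Wait for the whole batch to settle before starting the next one.
  const outcomes = await Promise.allSettled(chunk.map((f) => processFile(f)));
  for (const outcome of outcomes) {
    if (outcome.status === "fulfilled") {
      result.push(outcome.value);
    }
  }
}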