Video Morph Between Two Images, FFMPEG/Minterpolate


I am trying to make a quick and easy morph video from two frames (PNG images) with ffmpeg's minterpolate filter, in a bash script on Ubuntu Linux. The intent is to use the morphs as transitions between similar video clips in a different video editor later.

It will work on 3 frames/images, but fails using just 2 frames/images.

First, the code that works: 3 frames

This is using three 1080p png files:

test01_01.png test01_02.png test01_03.png

input01="test01_d.png"

ffmpeg -y -fflags genpts -r 30 -i "$input01" -vf "setpts=100*PTS,minterpolate=fps=24:scd=none" -pix_fmt yuv420p "test01.mp4"

This takes a bit of processing time, then creates a 414 KB, roughly three-second mp4 video of a morph that starts with the first frame, morphs to the second, then morphs to the third.
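Since the goal is a bash script, here is a minimal sketch of how the working command could be wrapped in a reusable function; the make_morph name and its arguments are illustrative, not part of my actual script:

# Hypothetical helper: run the same morph command on any numbered PNG sequence.
make_morph() {
  local pattern="$1"   # input pattern, e.g. "test01_%02d.png"
  local out="$2"       # output file, e.g. "test01.mp4"
  ffmpeg -y -fflags genpts -r 30 -i "$pattern" \
    -vf "setpts=100*PTS,minterpolate=fps=24:scd=none" \
    -pix_fmt yuv420p "$out"
}

make_morph "test01_%02d.png" "test01.mp4"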

The code that fails: 2 frames

This is using just two of the same 1080p png files:

test02_01.png test02_02.png

input01="test02_d.png"

ffmpeg -y -fflags genpts -r 30 -i "$input01" -vf "setpts=100*PTS,minterpolate=fps=24:scd=none" -pix_fmt yuv420p "test02.mp4"

This almost immediately creates a 262-byte corrupt mp4 file. The only difference from the working case is the number of frames.
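A quick sanity check along these lines (just an illustration, not part of the script above) confirms whether the output actually contains any decodable frames:

# Prints the number of decoded video frames; on the corrupt 262-byte file
# this prints nothing or an error, while the working test01.mp4 reports a count.
ffprobe -v error -count_frames -select_streams v:0 \
  -show_entries stream=nb_read_frames \
  -of default=nokey=1:noprint_wrappers=1 test02.mp4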

Things I've tried:

I have tried this with the Ubuntu default repo version of ffmpeg, and with the static 64-bit 5.0 and git-20220108-amd64 builds, all with the same result.

I have also tried with a 2-frame mp4 file as the input, with the same result.
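A 2-frame clip like that can be assembled from the same two PNGs with something along these lines (the command and output name are illustrative, not necessarily exactly what I ran):

# Pack the two PNGs into a two-frame mp4 to use as input instead of images.
ffmpeg -y -framerate 30 -i "test02_%02d.png" -frames:v 2 -pix_fmt yuv420p two_frames.mp4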

Thoughts?

Is this a bug in ffmpeg or am I doing something wrong?

I am also open to any suggestions for creating a morph like this using other Linux-compatible software.

Thank you for any insight!

CodePudding user response:

It is not documented, but it looks like the minterpolate filter requires at least 3 input frames.
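The 3-frame minimum is easy to check with synthetic frames, so no PNGs are needed; the following loop is only a sketch (the synth file names and testsrc parameters are arbitrary):

for n in 2 3; do
  # Generate n synthetic 1080p frames as a numbered PNG sequence.
  ffmpeg -y -f lavfi -i "testsrc=size=1920x1080:rate=30" -frames:v "$n" "synth_%02d.png"
  # Run the same filter chain as in the question on those n frames.
  ffmpeg -y -r 30 -i "synth_%02d.png" \
    -vf "setpts=100*PTS,minterpolate=fps=24:scd=none" \
    -pix_fmt yuv420p "synth_${n}.mp4"
done

If the 3-frame minimum holds, the n=2 run should fail in the same way as in the question, while the n=3 run should produce a morph.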

We can create a longer video from 5 input frames (by looping the two images) and keep only the relevant part.

To get the same output as applying the minterpolate filter to only two input images, we can use the following approach:

  • Define two input streams:
    Set test02_01.png as the first input and test02_02.png as the second input.
  • Loop each image at least twice, using -stream_loop
    (test02_01.png is repeated twice and test02_02.png is repeated 3 times).
  • Set the input frame rate to 0.3 fps (each frame then lasts 1/0.3 ≈ 3.33 seconds, which is equivalent to -r 30 combined with setpts=100*PTS).
    The input arguments are as follows: -r 0.3 -stream_loop 1 -i test02_01.png -r 0.3 -stream_loop 2 -i test02_02.png.
  • Concatenate the two input streams using the concat filter.
  • Apply the minterpolate filter to the concatenated output.
    The output of this stage is a video with a few redundant seconds at the beginning and a few redundant seconds at the end.
  • Apply the trim filter to keep only the relevant part.
    Add setpts=PTS-STARTPTS at the end (as recommended when using the trim filter).

Suggested command:

ffmpeg -y -r 0.3 -stream_loop 1 -i test02_01.png -r 0.3 -stream_loop 2 -i test02_02.png -filter_complex "[0][1]concat=n=2:v=1:a=0[v];[v]minterpolate=fps=24:scd=none,trim=3:7,setpts=PTS-STARTPTS" -pix_fmt yuv420p test02.mp4
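To preview the result as an animated GIF like the one below, a standard palette-based conversion can be used (this step is optional and separate from the morph itself; the frame rate, width, and file name are arbitrary):

# One-pass palette generation and use for a reasonably small, clean GIF preview.
ffmpeg -y -i test02.mp4 -vf "fps=12,scale=480:-1:flags=lanczos,split[a][b];[a]palettegen[p];[b][p]paletteuse" preview.gif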

Sample output (as animated GIF):
[animated GIF of the morph]

test02_01.png:
[first input image]

test02_02.png:
[second input image]
