How to overlay a sequence of frames on a video using ffmpeg-python?


I tried the code below, but it only shows the background video.

background_video = ffmpeg.input("input.mp4")
overlay_video = ffmpeg.input(f'{frames_folder}*.png', pattern_type='glob', framerate=25)
subprocess = ffmpeg.overlay(
    background_video,
    overlay_video,
).filter("setsar", sar=1)

I also tried assembling the sequence of frames into a .webm/.mov video first, but the transparency is lost; the video ends up with a black background.

P.S. The frame size is the same as the background video size, so no scaling is needed.

Edit

I tried @Rotem's suggestions:

Try using a single PNG image first

overlay_video =  ffmpeg.input('test-frame.png')

It's not working for frames generated by OpenCV, but it works for any other PNG image. This is weird: when I view the frames folder manually, it shows blank images (Link to my frames folder). But if I convert these frames into a video (see below), it shows correctly what I drew on each frame.

output_options = {
    'crf': 20,
    'preset': 'slower',
    'movflags': 'faststart',
    'pix_fmt': 'yuv420p'
}
ffmpeg.input(f'{frames_folder}*.png', pattern_type='glob', framerate=25, reinit_filter=0).output(
    'movie.avi',
    **output_options
).global_args('-report').run()

Try creating a video from all the PNG images without overlay

It's working as expected; the only issue is transparency. Is there a way to create a video with a transparent background? I tried .webm/.mov/.avi but no luck.

Add .global_args('-report') and check the log file

Report written to "ffmpeg-20221119-110731.log"
Log level: 48
ffmpeg version 5.1 Copyright (c) 2000-2022 the FFmpeg developers
  built with Apple clang version 13.1.6 (clang-1316.0.21.2.5)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/5.1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-neon
  libavutil      57. 28.100 / 57. 28.100
  libavcodec     59. 37.100 / 59. 37.100
  libavformat    59. 27.100 / 59. 27.100
  libavdevice    59.  7.100 / 59.  7.100
  libavfilter     8. 44.100 /  8. 44.100
  libswscale      6.  7.100 /  6.  7.100
  libswresample   4.  7.100 /  4.  7.100
  libpostproc    56.  6.100 / 56.  6.100
Input #0, image2, from './frames/*.png':
  Duration: 00:00:05.00, start: 0.000000, bitrate: N/A
  Stream #0:0: Video: png, rgba(pc), 1920x1080, 25 fps, 25 tbr, 25 tbn
Codec AVOption crf (Select the quality for constant quality mode) specified for output file #0 (movie.avi) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
Codec AVOption preset (Configuration preset) specified for output file #0 (movie.avi) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
Stream mapping:
  Stream #0:0 -> #0:0 (png (native) -> mpeg4 (native))
Press [q] to stop, [?] for help
Output #0, avi, to 'movie.avi':
  Metadata:
    ISFT            : Lavf59.27.100
  Stream #0:0: Video: mpeg4 (FMP4 / 0x34504D46), yuv420p(tv, progressive), 1920x1080, q=2-31, 200 kb/s, 25 fps, 25 tbn
    Metadata:
      encoder         : Lavc59.37.100 mpeg4
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
frame=  125 fps= 85 q=31.0 Lsize=     491kB time=00:00:05.00 bitrate= 804.3kbits/s speed=3.39x    
video:482kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.772174%

To draw the frames I used the code below.

for i in range(num_frames):
    transparent_img = np.zeros((height, width, 4), dtype=np.uint8)
    cv2.line(transparent_img, (x1, y1), (x2, y2), (255, 255, 255), thickness=1, lineType=cv2.LINE_AA)
    self.frames.append(transparent_img)


# Save each frame of the video in the given folder
for i, f in enumerate(frames):
    cv2.imwrite("{}/{:0{n}d}.png".format(path_to_frames, i, n=num_digits), f)
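
For reference, here is how one could inspect one of the saved frames with its alpha channel preserved (assuming num_digits is 4 here, so the first file is 0000.png):

import cv2

# IMREAD_UNCHANGED keeps the fourth (alpha) channel instead of converting to 3-channel BGR.
img = cv2.imread(f'{path_to_frames}/0000.png', cv2.IMREAD_UNCHANGED)
print(img.shape)          # (height, width, 4)
print(img[..., 3].max())  # maximum alpha value across the frame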



CodePudding user response:

Here are answers to your two questions:

  • For drawing a white line on a BGRA image, use the color (255, 255, 255, 255) instead of (255, 255, 255).
    The fourth value is the alpha (transparency) channel, and 255 makes the line fully opaque (a short sketch applying this follows the list).

  • For creating a video with a transparent background, try a .webm container, the libvpx-vp9 video codec, and -pix_fmt yuva420p - the a in yuva carries the alpha (transparency) channel.
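
A minimal sketch applying the first point to the line-drawing loop from the question (the frame size and line endpoints are placeholders, not values from the original code):

import cv2
import numpy as np

width, height = 1920, 1080            # same size as the background video
x1, y1, x2, y2 = 100, 100, 800, 600   # placeholder line endpoints

frame = np.zeros((height, width, 4), dtype=np.uint8)  # BGRA, fully transparent

# Four-value BGRA color: the final 255 sets the alpha channel, making the line opaque.
# With a three-value color the alpha component defaults to 0, so the line stays invisible.
cv2.line(frame, (x1, y1), (x2, y2), (255, 255, 255, 255), thickness=1, lineType=cv2.LINE_AA)

cv2.imwrite('000.png', frame)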


Here is a "self contained" code sample (please read the comments):

import cv2
import numpy as np
import ffmpeg

# Create synthetic MP4 video file for testing
ffmpeg.input('testsrc=size=192x108:rate=1:duration=10', f='lavfi').output('tmp.mp4').overwrite_output().run()

transparent_img = np.zeros((108, 192, 4), np.uint8)

width, height, fps = 192, 108, 1

def make_sample_image(i):
    p = width//60
    img = np.zeros((height, width, 4), np.uint8)  # Fully transparent
    cv2.putText(img, str(i), (width//2-p*10*len(str(i)), height//2+p*10), cv2.FONT_HERSHEY_DUPLEX, p, (255, 255, 255, 255), p*2)  # White number
    return img

# Create 10 PNG files with a transparent background and a white number (counter).
for i in range(1, 11):
    transparent_img = make_sample_image(i)
    cv2.imwrite(f'{i:03d}.png', transparent_img)


output_options = {  'vcodec' : 'libvpx-vp9',  # libvpx-vp9 supports transparency.
                    'crf': 20,
                    #'preset': 'slower',  # Not supported by libvpx-vp9
                    #'movflags': 'faststart',  # Not supported by WebM
                    'pix_fmt': 'yuva420p'  # yuva420p includes transparency.
                }

frames_folder = './'

# Create video with transparency:
# reinit_filter=0 is required only if the PNG images have different characteristics (e.g. some RGB and some RGBA).
# Use a %03d.png sequence pattern instead of a glob pattern, because my Windows version of FFmpeg doesn't support glob patterns.
ffmpeg.input(f'{frames_folder}%03d.png', framerate=fps, reinit_filter=0).output(
                    'movie.webm',  # WebM container supports transparency
                    **output_options
                ).global_args('-report').overwrite_output().run()


# Overlay the PNG sequence on top of tmp.mp4
background_video = ffmpeg.input("tmp.mp4")
overlay_video = ffmpeg.input(f'{frames_folder}%03d.png', framerate=fps)
subprocess = ffmpeg.overlay(
                          background_video,
                          overlay_video,
                        ).filter("setsar", sar=1)

subprocess.output('overlay_video.webm', **output_options).global_args('-report').overwrite_output().run()
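
To double-check that the alpha channel actually survived in movie.webm, a quick check with ffmpeg-python's ffmpeg.probe can help. Depending on which VP9 decoder your FFmpeg build picks, the alpha may show up either as pix_fmt yuva420p or as an alpha_mode tag on a yuv420p stream:

import ffmpeg

info = ffmpeg.probe('movie.webm')
video = next(s for s in info['streams'] if s['codec_type'] == 'video')

# Either 'yuva420p' here, or 'yuv420p' plus an 'alpha_mode' tag, indicates the alpha channel is present.
print(video.get('pix_fmt'), video.get('tags', {}).get('alpha_mode'))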