I want to record the video output (encoded or not) from an off-screen NSView running on macOS. I'm pretty sure there is no API to do this, however I believe it is feasible by rendering it frame-by-frame into a framebuffer.
The problem is that I can't find a way to render the view at a fast enough rate. Methods I've tried without success (tested on a MacBook M1 Pro running Monterey):

- `[view dataWithPDFInsideRect:]` and `[view dataWithEPSInsideRect:]`: take about 200 ms to execute.
- `[view.layer renderInContext:]`: takes about 350 ms to execute.
- `[view cacheDisplayInRect:toBitmapImageRep:]`: takes about 100 ms to execute.
I also tried to embed the view in a window and capture the window. Window capturing functions (such as `CGWindowListCreateImage`) are much faster, but they do not work when the window is off-screen.
Considering the view can be rendered at 60 fps in a window without issue, why do these methods take so much time? Is there any method I missed to render an NSView into a framebuffer?
CodePudding user response:
I finally found a performant way of doing it. By capturing the view this way I am able to reach 60 fps.
NSView *view = ...;

// Create a bitmap image rep backed by an interleaved 32-bit RGBA buffer
// matching the view's size.
NSBitmapImageRep *bitmap = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:nil
                  pixelsWide:view.bounds.size.width
                  pixelsHigh:view.bounds.size.height
               bitsPerSample:8
             samplesPerPixel:4
                    hasAlpha:YES
                    isPlanar:NO
              colorSpaceName:NSCalibratedRGBColorSpace
                bitmapFormat:0
                 bytesPerRow:(4 * view.bounds.size.width)
                bitsPerPixel:32];

// Draw the view into the bitmap through a graphics context.
NSGraphicsContext *graphicsContext = [NSGraphicsContext graphicsContextWithBitmapImageRep:bitmap];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:graphicsContext];
[view displayRectIgnoringOpacity:view.bounds inContext:graphicsContext];
[NSGraphicsContext restoreGraphicsState];

// Pixels are interleaved: [R, G, B, A, R, G, B, A, ...]
unsigned char *pixels = [bitmap bitmapData];
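To actually record those frames as a movie, the bitmap data can be fed to an `AVAssetWriter` through an `AVAssetWriterInputPixelBufferAdaptor`. A minimal sketch under stated assumptions, not a definitive implementation: the helper name `appendFrame` is my own, the adaptor is assumed to have been created with `sourcePixelBufferAttributes` requesting a 32-bit RGBA pixel format so its pool matches the bitmap layout, and error handling is omitted.

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <AppKit/AppKit.h>

// Hypothetical helper: copy one captured NSBitmapImageRep into the writer's
// pixel buffer pool and append it at the given presentation time.
// Assumes the adaptor was configured with kCVPixelFormatType_32RGBA
// sourcePixelBufferAttributes matching the bitmap produced above.
static void appendFrame(AVAssetWriterInputPixelBufferAdaptor *adaptor,
                        NSBitmapImageRep *bitmap, CMTime time) {
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
    CVPixelBufferLockBaseAddress(buffer, 0);

    // Copy row by row: the pixel buffer's stride may be larger than 4 * width.
    uint8_t *dst = CVPixelBufferGetBaseAddress(buffer);
    size_t dstStride = CVPixelBufferGetBytesPerRow(buffer);
    const uint8_t *src = bitmap.bitmapData;
    size_t srcStride = bitmap.bytesPerRow;
    for (NSInteger y = 0; y < bitmap.pixelsHigh; y++) {
        memcpy(dst + (size_t)y * dstStride, src + (size_t)y * srcStride, srcStride);
    }

    CVPixelBufferUnlockBaseAddress(buffer, 0);
    [adaptor appendPixelBuffer:buffer withPresentationTime:time];
    CVPixelBufferRelease(buffer);
}
```

Capturing into the bitmap and calling a helper like this once per frame (with `time` advancing by 1/60 s) would give the 60 fps recording the question asks about.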