AVMutableComposition output freezes at the last frame of the first video


I am trying to merge multiple clips (videos) into one using AVMutableComposition. I have this working, including rotating and translating each instruction, but one issue remains: when the first clip finishes, the output freezes at its last frame (the last frame of the first clip). This only happens if another clip is visible at that point. For example, if I set the opacity of the second and third clips to 0 at CMTime.Zero, and the first clip's opacity to 0 at firstClip.Duration, the result is a video that plays the first clip and then shows a black background once it finishes. The clips' audio works perfectly.

Here is my code:

        public void TESTING()
        {
            //microphone
            AVCaptureDevice microphone = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Audio);

            AVMutableComposition mixComposition = AVMutableComposition.Create();
            AVVideoCompositionLayerInstruction[] Instruction_Array = new AVVideoCompositionLayerInstruction[Clips.Count];

            foreach (string clip in Clips)
            {
                var asset = AVUrlAsset.FromUrl(new NSUrl(clip, false)) as AVUrlAsset;
                #region HoldVideoTrack
                
                //This range applies to the video, not to the mixcomposition
                CMTimeRange range = new CMTimeRange()
                {
                    Start = CMTime.Zero,
                    Duration = asset.Duration
                };

                var duration = mixComposition.Duration;
                NSError error;

                AVMutableCompositionTrack videoTrack = mixComposition.AddMutableTrack(AVMediaType.Video, 0);
                AVAssetTrack assetVideoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
                videoTrack.InsertTimeRange(range, assetVideoTrack, duration, out error);
                videoTrack.PreferredTransform = assetVideoTrack.PreferredTransform;

                if (microphone != null)
                {
                    AVMutableCompositionTrack audioTrack = mixComposition.AddMutableTrack(AVMediaType.Audio, 0);
                    AVAssetTrack assetAudioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];
                    audioTrack.InsertTimeRange(range, assetAudioTrack, duration, out error);
                }
                #endregion

                AVAssetTrack videoTrackWithMediaType = mixComposition.TracksWithMediaType(AVMediaType.Video)[0];

                var instruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack(videoTrackWithMediaType);
                
                #region Instructions
                int counter = Clips.IndexOf(clip);
                Instruction_Array[counter] = TestingInstruction(asset, mixComposition.Duration, videoTrackWithMediaType);
                #endregion
            }

            // 6
            AVMutableVideoCompositionInstruction mainInstruction = AVMutableVideoCompositionInstruction.Create() as AVMutableVideoCompositionInstruction;

            CMTimeRange rangeIns = new CMTimeRange()
            {
                Start = CMTime.Zero,
                Duration = mixComposition.Duration
            };
            mainInstruction.TimeRange = rangeIns;
            mainInstruction.LayerInstructions = Instruction_Array;

            var mainComposition = AVMutableVideoComposition.Create();
            mainComposition.Instructions = new AVVideoCompositionInstruction[1] { mainInstruction };
            mainComposition.FrameDuration = new CMTime(1, 30);
            mainComposition.RenderSize = new CGSize(mixComposition.NaturalSize.Height, mixComposition.NaturalSize.Width);

            finalVideo_path = NSUrl.FromFilename(Path.Combine(Path.GetTempPath(), "Whole2.mov"));
            if (File.Exists(Path.Combine(Path.GetTempPath(), "Whole2.mov")))
            {
                File.Delete(Path.Combine(Path.GetTempPath(), "Whole2.mov"));
            }

            //... export video ...
            AVAssetExportSession exportSession = new AVAssetExportSession(mixComposition, AVAssetExportSessionPreset.HighestQuality)
            {
                OutputUrl = NSUrl.FromFilename(Path.Combine(Path.GetTempPath(), "Whole2.mov")),
                OutputFileType = AVFileType.QuickTimeMovie,
                ShouldOptimizeForNetworkUse = true,
                VideoComposition = mainComposition
            };
            exportSession.ExportAsynchronously(_OnExportDone);

            FinalVideo = Path.Combine(Path.GetTempPath(), "Whole2.mov");
        }

        private AVMutableVideoCompositionLayerInstruction TestingInstruction(AVAsset asset, CMTime currentTime, AVAssetTrack mixComposition_video_Track)
        {
            var instruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack(mixComposition_video_Track);

            var startTime = CMTime.Subtract(currentTime, asset.Duration);

            //NaturalSize.Height is passed as the width parameter because iOS stores the video recording horizontally
            CGAffineTransform translateToCenter = CGAffineTransform.MakeTranslation(mixComposition_video_Track.NaturalSize.Height, 0);
            //Angle in radians, not in degrees
            CGAffineTransform rotate = CGAffineTransform.Rotate(translateToCenter, (nfloat)(Math.PI / 2));

            instruction.SetTransform(rotate, (CMTime.Subtract(currentTime, asset.Duration)));

            instruction.SetOpacity(1, startTime);
            instruction.SetOpacity(0, currentTime);

            return instruction;
        }
    }

Does anyone know how to solve this?

If you need more information I will provide it as soon as I see your request. Thank you all for your time, have a nice day. (:

CodePudding user response:

I believe I found the problem in your code: you are only ever creating instructions for the first track. Look at these two lines:

AVAssetTrack videoTrackWithMediaType = mixComposition.TracksWithMediaType(AVMediaType.Video)[0];

var instruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack(videoTrackWithMediaType);

AVMutableComposition.TracksWithMediaType returns an array of tracks, so the `[0]` at the end of the first line grabs only the first track in the composition, which belongs to the first video. As you loop through, you just keep creating instructions for that same first video over and over.

I'm not familiar with Xamarin, so your code is a little hard for me to follow, but I believe you can just do this and it should work:

Change these lines:

AVAssetTrack videoTrackWithMediaType = mixComposition.TracksWithMediaType(AVMediaType.Video)[0];

var instruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack(videoTrackWithMediaType);
                
#region Instructions
int counter = Clips.IndexOf(clip);
Instruction_Array[counter] = TestingInstruction(asset, mixComposition.Duration, videoTrackWithMediaType);
#endregion

To this:

var instruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack(videoTrack);
                
#region Instructions
int counter = Clips.IndexOf(clip);
Instruction_Array[counter] = TestingInstruction(asset, mixComposition.Duration, videoTrack);
#endregion

All I did here was get rid of the videoTrackWithMediaType variable and use videoTrack instead. There is no need to fetch the corresponding track back out of the composition: you just created it, and you still have access to it in the code block where you build the instructions.
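Alternatively, if you do want to fetch the track back from the composition for some reason, index it by the clip's position instead of always taking `[0]`. A sketch, under the assumption that exactly one video track is added per clip, in clip order (`trackForThisClip` is a name I made up for illustration):

```csharp
// Assumption: one video track was added per clip, in order, so the track
// for this iteration sits at the clip's index in the tracks array.
int counter = Clips.IndexOf(clip);
AVAssetTrack trackForThisClip = mixComposition.TracksWithMediaType(AVMediaType.Video)[counter];
Instruction_Array[counter] = TestingInstruction(asset, mixComposition.Duration, trackForThisClip);
```

But since you already hold a reference to the track you just created, reusing `videoTrack` directly is the simpler fix.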
