I have an audio URL (.m4a) that I create using AVAudioRecorder. I want to share that audio on Instagram, so I convert the audio to a video. The issue is after the conversion: when I save the video URL to the Files app using UIActivityViewController, I can replay the video, see the duration (e.g. 7 seconds), and hear the audio with no problem; a black screen with a sound icon appears.
But when I save the video to the Photos Library using UIActivityViewController, the video shows the 7 seconds but nothing plays: the video is all gray and the sound icon doesn't show.
Why does the video save and play correctly in the Files app, but save and not play in the Photos Library?
let asset: AVURLAsset = AVURLAsset(url: audioURL)
let mixComposition = AVMutableComposition()
guard let compositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: CMPersistentTrackID()) else { return }
let track = asset.tracks(withMediaType: .audio)
guard let assetTrack = track.first else { return }
do {
try compositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: assetTrack.timeRange.duration), of: assetTrack, at: .zero)
} catch {
print(error.localizedDescription)
}
guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough) else { return }
let dirPath = NSTemporaryDirectory().appending("\(UUID().uuidString).mov")
let outputFileURL = URL(fileURLWithPath: dirPath)
exporter.outputFileType = .mov
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
exporter.exportAsynchronously {
switch exporter.status {
case .completed:
guard let videoURL = exporter.outputURL else { return }
// present UIActivityViewController to save videoURL and then save it to the Photos Library via "Save Video"
default:
// ...
break
}
}
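For reference, the share step mentioned in the comment above could look roughly like this. It is a rough sketch rather than the exact code used, and it assumes a view controller is available to present from:

import UIKit

// Hypothetical helper: hand the exported file URL to UIActivityViewController so
// it can be saved to Files or to the Photos Library ("Save Video").
func shareVideo(at videoURL: URL, from viewController: UIViewController) {
    let activityController = UIActivityViewController(activityItems: [videoURL],
                                                      applicationActivities: nil)
    viewController.present(activityController, animated: true)
}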
CodePudding user response:
As Lance rightfully pointed out, the issue is that while a file was exported in the .mov or .mp4 format, there was no video in it; it was just audio playing.
On reading a bit more, .mp4, for example, is just a digital multimedia container format that can very well be used for audio alone, so it is possible to save an audio file as a .mp4 / .mov.
What was needed was to add an empty video track to the AVMutableComposition. Lance already posted a great solution that works perfectly well and is more self-sustained
than the alternative solution I propose here, which relies on having a blank 1-second video.
Overview of how it works
- You get a blank video file that is 1 second long in the resolution you want, for example 1920 x 1080
- You retrieve the video track from this video asset
- Retrieve the audio track from your audio file
- Create an AVMutableComposition which will be used to merge the audio and video tracks
- Configure an AVMutableCompositionTrack with the audio track and add that to the main AVMutableComposition
- Configure an AVMutableVideoComposition with the video track
- Use an AVAssetExportSession to export the final video with the AVMutableComposition and the AVMutableVideoComposition
The code
In most of the code below you will see multiple guard statements. You could combine them into a single guard, but with this kind of task it is useful to know exactly where the failure occurred, as there are several reasons why an export could fail.
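One possible shape for those failure paths, purely as an illustration: the answer leaves them as "manage your error" comments, and the manageError(_:withMessage:) helper called in the audio-track code below isn't shown, so this is my own sketch, not the author's actual implementation.

// Hypothetical error plumbing for the "manage your error" placeholders below.
enum AudioMovieExporterError: Error {
    case missingSourceAudioTrack
    case missingBlankVideoAsset
    case trackInsertionFailed(Error)
    case exportSessionCreationFailed
}

// One possible implementation of the manageError(_:withMessage:) helper used below;
// the real one isn't shown in the answer.
private func manageError(_ error: Error?, withMessage message: String) {
    print("AudioMovieExporter failed: \(message)", error.map { "(\($0))" } ?? "")
}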
Configuring the audio track
private func configureAudioTrack(_ audioURL: URL,
inComposition composition: AVMutableComposition) -> AVMutableCompositionTrack?
{
// Initialize an AVURLAsset with your audio file
let audioAsset: AVURLAsset = AVURLAsset(url: audioURL)
let trackTimeRange = CMTimeRange(start: .zero,
duration: audioAsset.duration)
// Get the audio track from the audio asset
guard let sourceAudioTrack = audioAsset.tracks(withMediaType: .audio).first
else
{
manageError(nil, withMessage: "Error retrieving audio track from source file")
return nil
}
// Insert a new audio track into the AVMutableComposition
guard let audioTrack = composition.addMutableTrack(withMediaType: .audio,
preferredTrackID: CMPersistentTrackID())
else
{
// manage your error
return nil
}
do {
// Insert the contents of the audio source into the new audio track
try audioTrack.insertTimeRange(trackTimeRange,
of: sourceAudioTrack,
at: .zero)
}
catch {
// manage your error
}
return audioTrack
}
Configuring the video track
private func configureVideoTrack(inComposition composition: AVMutableComposition) -> AVMutableCompositionTrack?
{
// Locate the empty video file bundled with the app
guard let blankMoviePathURL = Bundle.main.url(forResource: "blank",
withExtension: "mp4")
else
{
// manage errors
return nil
}
// Initialize a video asset with the empty video file
let videoAsset = AVAsset(url: blankMoviePathURL)
// Get the video track from the empty video
guard let sourceVideoTrack = videoAsset.tracks(withMediaType: .video).first
else
{
// manage errors
return nil
}
// Insert a new video track to the AVMutableComposition
guard let videoTrack = composition.addMutableTrack(withMediaType: .video,
preferredTrackID: kCMPersistentTrackID_Invalid)
else
{
// manage errors
return nil
}
let trackTimeRange = CMTimeRange(start: .zero,
duration: composition.duration)
do {
// Insert the contents of the video source into the new video track
try videoTrack.insertTimeRange(trackTimeRange,
of: sourceVideoTrack,
at: .zero)
}
catch {
// manage errors
}
return videoTrack
}
Configure the video composition
// Configure the video properties like resolution and fps
private func createVideoComposition(with videoCompositionTrack: AVMutableCompositionTrack) -> AVMutableVideoComposition
{
let videoComposition = AVMutableVideoComposition()
// Set the fps
videoComposition.frameDuration = CMTime(value: 1,
timescale: 25)
// Video dimensions
videoComposition.renderSize = CGSize(width: 1920, height: 1080)
// Specify the duration of the video composition
let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(start: .zero, duration: .indefinite)
// Create a layer instruction for the video composition track
let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoCompositionTrack)
let transform = videoCompositionTrack.preferredTransform
layerInstruction.setTransform(transform, at: .zero)
// Apply the layer configuration instructions
instruction.layerInstructions = [layerInstruction]
videoComposition.instructions = [instruction]
return videoComposition
}
Configure the AVAssetExportSession
private func configureAVAssetExportSession(with composition: AVMutableComposition,
videoComposition: AVMutableVideoComposition) -> AVAssetExportSession?
{
// Configure export session
guard let exporter = AVAssetExportSession(asset: composition,
presetName: AVAssetExportPresetHighestQuality)
else
{
// Manage your errors
return nil
}
// Configure where the exported file will be stored
let documentsURL = FileManager.default.urls(for: .documentDirectory,
in: .userDomainMask)[0]
let fileName = "\(UUID().uuidString).mov"
let outputFileURL = documentsURL.appendingPathComponent(fileName)
// Apply exporter settings
exporter.videoComposition = videoComposition
exporter.outputFileType = .mov
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
return exporter
}
Over here, one important thing to note is to set the exporter's preset to a movie preset like AVAssetExportPresetHighestQuality or AVAssetExportPresetLowQuality, for example, rather than AVAssetExportPresetPassthrough, which as per the documentation is:
A preset to export the asset in its current format, unless otherwise prohibited.
So you would still get an audio-only mp4 or mov file, since the current format of the composition is audio. I did not test this extensively, but this is what I saw from a few tests.
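If you want to verify this up front, AVFoundation can report which presets are compatible with a given composition and output type. This small check is my addition rather than part of the answer:

import AVFoundation

// Optional sanity check (not from the answer): ask AVFoundation which presets
// can produce the desired output from a given composition before exporting.
func logExportCompatibility(for composition: AVMutableComposition) {
    let compatiblePresets = AVAssetExportSession.exportPresets(compatibleWith: composition)
    print("Compatible presets: \(compatiblePresets)")
    AVAssetExportSession.determineCompatibility(ofExportPreset: AVAssetExportPresetHighestQuality,
                                                with: composition,
                                                outputFileType: .mov) { isCompatible in
        print("AVAssetExportPresetHighestQuality -> .mov compatible: \(isCompatible)")
    }
}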
Finally, you can bring all the above functions together like so:
func generateMovie(with audioURL: URL)
{
delegate?.audioMovieExporterDidStart(self)
let composition = AVMutableComposition()
// Configure the audio and video tracks in the new composition
guard let _ = configureAudioTrack(audioURL, inComposition: composition),
let videoCompositionTrack = configureVideoTrack(inComposition: composition)
else
{
// manage error
return
}
let videoComposition = createVideoComposition(with: videoCompositionTrack)
if let exporter = configureAVAssetExportSession(with: composition,
videoComposition: videoComposition)
{
exporter.exportAsynchronously
{
switch exporter.status {
case .completed:
guard let videoURL = exporter.outputURL
else
{
// manage errors
return
}
// notify someone the video is ready at videoURL
default:
// manage error
break
}
}
}
}
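The generateMovie(with:) method reports back through a delegate, but the protocol isn't shown here. A minimal shape it could take, assuming the enclosing class is called AudioMovieExporter (my guess, not confirmed above):

// Placeholder for the exporter class the methods above live in; the real class
// isn't shown, so the name and members are my assumption.
final class AudioMovieExporter {
    weak var delegate: AudioMovieExporterDelegate?
    // generateMovie(with:), configureAudioTrack(_:inComposition:), etc. from above
}

// Hypothetical delegate protocol for the exporter.
protocol AudioMovieExporterDelegate: AnyObject {
    func audioMovieExporterDidStart(_ exporter: AudioMovieExporter)
    func audioMovieExporter(_ exporter: AudioMovieExporter, didExportMovieTo url: URL)
    func audioMovieExporter(_ exporter: AudioMovieExporter, didFailWith error: Error)
}

The .completed branch above would then call something like delegate?.audioMovieExporter(self, didExportMovieTo: videoURL).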
Final Thoughts
- You could test drive a working sample here
- I converted this into a simple library if you wish to use it; you can configure the orientation, the fps, and even set a background color for the video (available at the same link)
- If you just want the blank videos, you can get them from here, or generate one yourself as in the sketch below
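If you would rather not bundle or download a blank file at all, here is a rough sketch of generating a 1-second black clip with AVAssetWriter. This is my own addition, not part of the answer above; it assumes H.264 output at 1920 x 1080 and keeps error handling to a minimum.

import AVFoundation
import CoreVideo
import Foundation

// Sketch only: writes a short black clip that can stand in for the bundled "blank" video.
func writeBlankClip(to outputURL: URL,
                    size: CGSize = CGSize(width: 1920, height: 1080),
                    completion: @escaping (Bool) -> Void)
{
    guard let writer = try? AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    else { completion(false); return }
    let settings: [String: Any] = [AVVideoCodecKey: AVVideoCodecType.h264,
                                   AVVideoWidthKey: Int(size.width),
                                   AVVideoHeightKey: Int(size.height)]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB])
    writer.add(input)
    guard writer.startWriting()
    else { completion(false); return }
    writer.startSession(atSourceTime: .zero)
    // One black pixel buffer, reused for both frames
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32ARGB, nil, &pixelBuffer)
    guard let buffer = pixelBuffer
    else { completion(false); return }
    CVPixelBufferLockBaseAddress(buffer, [])
    memset(CVPixelBufferGetBaseAddress(buffer), 0,
           CVPixelBufferGetBytesPerRow(buffer) * CVPixelBufferGetHeight(buffer))
    CVPixelBufferUnlockBaseAddress(buffer, [])
    // Two identical frames, one second apart, give the clip its 1-second duration
    for time in [CMTime.zero, CMTime(value: 1, timescale: 1)] {
        while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
        _ = adaptor.append(buffer, withPresentationTime: time)
    }
    input.markAsFinished()
    writer.finishWriting { completion(writer.status == .completed) }
}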
CodePudding user response:
So it seems that although the code from my question did convert the audio file to a video file, there still wasn't a video track. I know this for a fact because after I got the exporter's videoURL from my question, I tried to add a watermark to it, and the watermark code kept crashing on
let videoTrack = asset.tracks(withMediaType: AVMediaType.video)[0]
Basically the code from my question converts audio to a video file but doesn't add a video track.
What I assume is happening is that when the Files app reads the file, it knows that it's a .mov or .mp4 file and will play the audio track even if the video track is missing.
Conversely, when the Photos app reads the file, it also knows that it's a .mov or .mp4 file, but if there isn't a video track, it won't play anything.
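If you run into this, a quick way to check what the exporter actually produced is to inspect the tracks of the output file; this helper is my addition, not part of either answer:

import AVFoundation

// Small diagnostic: confirm whether an exported file actually contains a video
// track (and an audio track) before sharing it.
func logTracks(of videoURL: URL) {
    let asset = AVAsset(url: videoURL)
    let hasVideo = !asset.tracks(withMediaType: .video).isEmpty
    let hasAudio = !asset.tracks(withMediaType: .audio).isEmpty
    print("video track: \(hasVideo), audio track: \(hasAudio)")
}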
I had to combine these 2 answers to get the audio to play as a video in the Photos app.
1st - I added my app icon (you can add any image) as a single image in an array of images to make a video track, using the code from "How do I export UIImage array as a movie?" answered by @scootermg.
The code from @scootermg's answer is conveniently in one file at this GitHub by @dldnh. In his code, in the ImageAnimator class's render function, instead of saving to the library I returned the videoWriter's output URL in the completionHandler.
2nd - I combined the app icon video that I just made with the audio URL from my question, using the code from "Swift Merge audio and video files into one video" answered by @TungFam.
In the mixComposition from TungFam's answer I used the audio URL's asset duration for the length of the video.
do {
try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
duration: aAudioAssetTrack.timeRange.duration),
of: aVideoAssetTrack,
at: .zero)
try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
duration: aAudioAssetTrack.timeRange.duration),
of: aAudioAssetTrack,
at: .zero)
if let aAudioOfVideoAssetTrack = aAudioOfVideoAssetTrack {
try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(start: .zero,
duration: aAudioAssetTrack.timeRange.duration),
of: aAudioOfVideoAssetTrack,
at: .zero)
}
} catch {
print(error.localizedDescription)
}
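Once the merged file exists, it can also be saved to the Photos Library directly with PhotoKit instead of going through UIActivityViewController. This is a rough sketch of that step; it assumes the app has the photo library usage description set in Info.plist and that the user grants access:

import Photos

// Sketch: save the finished video file to the Photos Library programmatically.
func saveVideoToPhotos(at videoURL: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: videoURL)
        }) { success, error in
            print(success ? "Saved to Photos" : "Save failed: \(String(describing: error))")
        }
    }
}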