iPhone 14 won't record using MediaRecorder

Time:12-16

Our website records audio and plays it back for a user. It has worked for years with many different devices, but it started failing on the iPhone 14. I created a test app at https://nmp-recording-test.netlify.app/ so I can see what is going on. It works perfectly on all devices but it only works the first time on an iPhone 14. It works on other iPhones and it works on iPad and MacBooks using Safari or any other browser.

It looks like it will only record if that is the first audio the page ever does. If I create an AudioContext somewhere else first, audio playback works for that, but then the recording won't.

The only symptom I can see is that it doesn't call MediaRecorder.ondataavailable when it is not working, but I assume that is because it isn't recording.

Here is the pattern that I'm seeing with my test site:

  1. Click "new recording". (The level indicator moves and the data-available callback fires.)
  2. Click "listen". I hear what I just recorded.
  3. Click "new recording". (No levels move; no data is reported.)
  4. Click "listen". Nothing plays.

But if I do anything else first, like clicking the metronome on and off, then it won't record the FIRST time, either.

The "O.G. Recording" is the original way I was doing the recording, using deprecated method createMediaStreamSource() and createScriptProcessor()/createJavaScriptNode(). I thought maybe iPhone finally got rid of that, so I created the MediaRecorder version.

What I'm doing, basically, is (truncated to show the important part):

const chunks = [];
let mediaRecorder;

function onSuccess(stream: MediaStream) {
  mediaRecorder = new MediaRecorder(stream);
  mediaRecorder.ondataavailable = function (e) {
    chunks.push(e.data);
  };
  mediaRecorder.start(1000);
}

navigator.mediaDevices.getUserMedia({ audio: true }).then(onSuccess, onError);
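For context, the playback side (not shown above) assembles the collected chunks into a single Blob once recording stops. A minimal sketch; the helper name and the MIME type here are my assumptions, not the site's actual code:

```javascript
// Merge recorded chunks into one Blob for playback (names/type assumed).
function chunksToBlob(chunks, mimeType) {
  return new Blob(chunks, { type: mimeType || "audio/mp4" });
}

// In the browser, roughly:
// mediaRecorder.onstop = () => {
//   const blob = chunksToBlob(chunks, mediaRecorder.mimeType);
//   audioElement.src = URL.createObjectURL(blob);
// };
```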

Has anyone else seen anything different in the way the iPhone 14 handles recording?

Does anyone have a suggestion about how to debug this?

If you have an iPhone 14, would you try my test program above and let me know if you get the same results? We only have one iPhone 14 to test with, and maybe there is something weird about that device.

If it works, you should see lines like data {"len":6784} appear every second while you are recording.
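In case it helps anyone reproduce the check, a log line in that format could be produced by something like the following (the helper name is mine, not the test page's):

```javascript
// Hypothetical helper matching the test page's log format.
function formatDataLine(byteLength) {
  return `data ${JSON.stringify({ len: byteLength })}`;
}

// In the browser it would be wired up roughly like:
// mediaRecorder.ondataavailable = (e) => log(formatDataLine(e.data.size));
```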

--- EDIT ---

I reworked the code along the lines of Frank zeng's suggestion and I am getting it to record, but it is still not right: the volume is really low, there appear to be some dropouts, and there is a really long pause when resuming the AudioContext.

The new code seems to work perfectly in the other devices and browsers I have access to.

--- EDIT 2 ---

There were two problems: the first was that the deprecated createScriptProcessor stopped working; the second was an iOS bug that was fixed in version 16.2. So rewriting to use an AudioWorklet was needed, but keeping the recording going once it has started was not.
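Given that split, a feature-detection approach follows naturally: prefer the AudioWorklet when it is available and only fall back to the deprecated ScriptProcessor otherwise. A hedged sketch (the function name is mine):

```javascript
// Pick an audio-capture strategy for an AudioContext-like object.
// Prefers AudioWorklet; falls back to the deprecated ScriptProcessor.
function pickProcessorStrategy(ctx) {
  if (ctx.audioWorklet && typeof ctx.audioWorklet.addModule === "function") {
    return "worklet";
  }
  if (typeof ctx.createScriptProcessor === "function") {
    return "script-processor";
  }
  return "unsupported";
}
```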

CodePudding user response:

I had the same problem as you. I think the AudioContext.createScriptProcessor API is broken on the iPhone 14, so I replaced it with the newer AudioWorkletNode API. Also, don't close the stream, because the second recording session on the iPhone 14 is too laggy; instead, destroy the recorded data after each recording. After testing, this solved the problem for me. Here's my code:

// get stream (this runs inside a Promise executor, hence the resolve() below)
window.navigator.mediaDevices.getUserMedia(options).then(async (stream) => {
  // that.stream = stream
  that.context = new AudioContext()
  await that.context.resume()
  const rate = that.context.sampleRate || 44100
  that.mp3Encoder = new lamejs.Mp3Encoder(1, rate, 128)
  that.mediaSource = that.context.createMediaStreamSource(stream)
  // createScriptProcessor is being phased out; keep using it where it still
  // works, otherwise fall back to an AudioWorklet to capture the audio data
  if (that.context.createScriptProcessor && typeof that.context.createScriptProcessor === 'function') {
    that.mediaProcessor = that.context.createScriptProcessor(0, 1, 1)
    that.mediaProcessor.onaudioprocess = event => {
      window.postMessage({ cmd: 'encode', buf: event.inputBuffer.getChannelData(0) }, '*')
      that._decode(event.inputBuffer.getChannelData(0))
    }
  } else { // use the new AudioWorklet approach
    that.mediaProcessor = await that.initWorklet()
  }
  // connect the graph so the processor node actually receives audio
  that.mediaSource.connect(that.mediaProcessor)
  that.mediaProcessor.connect(that.context.destination)
  resolve()
})
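One detail worth noting: lamejs's Mp3Encoder expects 16-bit integer PCM, while Web Audio delivers Float32 samples in [-1, 1], so the `_decode` step presumably performs a conversion along these lines (a sketch; the actual helper isn't shown above):

```javascript
// Convert Web Audio Float32 samples ([-1, 1]) to 16-bit PCM for lamejs.
function floatTo16BitPCM(float32) {
  const out = new Int16Array(float32.length);
  for (let i = 0; i < float32.length; i++) {
    const s = Math.max(-1, Math.min(1, float32[i])); // clamp to [-1, 1]
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return out;
}

// Usage (sketch): mp3Encoder.encodeBuffer(floatTo16BitPCM(channelData));
```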
// content of the initWorklet function
async initWorklet() {
  try {
    /* audio-stream analysis node */
    let audioWorkletNode;
    /* load the AudioWorkletProcessor module and add it to the current Worklet */
    await this.context.audioWorklet.addModule('/get-voice-node.js');
    /* bind the AudioWorkletNode to the loaded AudioWorkletProcessor */
    audioWorkletNode = new AudioWorkletNode(this.context, "get-voice-node");
    /* the AudioWorkletNode and AudioWorkletProcessor communicate over a MessagePort */
    console.log('audioWorkletNode', audioWorkletNode)
    const messagePort = audioWorkletNode.port;
    messagePort.onmessage = (e) => {
      let channelData = e.data[0];
      window.postMessage({ cmd: 'encode', buf: channelData }, '*')
      this._decode(channelData)
    }
    return audioWorkletNode;
  } catch (e) {
    console.log(e)
  }
}

// content of get-voice-node.js; remember to put it in the static resource directory
class GetVoiceNode extends AudioWorkletProcessor {
  /*
  * options are passed in when calling new AudioWorkletNode()
  * */
  constructor() {
    super()
  }

  /*
  * `inputList` and `outputList` are arrays of inputs/outputs
  * the catch is that each block is only 128 samples, and that size is not configurable
  * */
  process (inputList, outputList, parameters) {
    // console.log(inputList)
    if (inputList.length > 0 && inputList[0].length > 0) {
      this.port.postMessage(inputList[0]);
    }
    return true // return true so the system knows we are still active and ready to process audio
  }
}
registerProcessor('get-voice-node', GetVoiceNode)
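On the 128-sample point in the comment above: process() always receives fixed 128-frame blocks (the render quantum), so if the encoder works better with larger chunks, a common approach is to accumulate blocks on the receiving side before encoding. A sketch (the names are mine):

```javascript
// Accumulate fixed-size worklet blocks into one larger buffer before encoding.
function concatFloat32(blocks) {
  const total = blocks.reduce((n, b) => n + b.length, 0);
  const out = new Float32Array(total);
  let offset = 0;
  for (const b of blocks) {
    out.set(b, offset);
    offset += b.length;
  }
  return out;
}

// Usage (sketch): push each e.data[0] from messagePort.onmessage into an
// array, then encode concatFloat32(pending) once enough samples have arrived.
```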

Destroy the recording instance to free the memory when you are done; if you want to record again later, it is better to create a new instance:

   this.recorder.stop()
   this.audioDurationTimer && window.clearInterval(this.audioDurationTimer)
   const audioBlob = this.recorder.getMp3Blob()
   // Destroy the recording instance and free the memory
   this.recorder = null