Should I use requestAnimationFrame for audio fading?


I am writing an audio fading handler that manipulates a single HTML audio or video element. Currently I am just using the HTMLMediaElement.volume property, because it's readily available and computationally inexpensive.

I use a simple setInterval approach that periodically sets a new volume to fade in or out until the target volume level is reached:

const clearIntervalId = setInterval(() => {
    const now = new Date().getTime();
    // [simplified] startTime is recorded when the fade starts
    const passedTime = now - startTime;
    const newTarget =
        from + (stepSize / duration) * passedTime;
    this.setAudioVolume(newTarget);
}, this.stepDuration);

However, on some lower-spec devices, I have observed that the delay between volume changes becomes noticeable and causes a clicking/distorted sound because of the larger volume steps.

I wonder whether using requestAnimationFrame would provide smoother audio fades.
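
For reference, a requestAnimationFrame-based variant would look roughly like this (just a sketch; fadeWithRaf is a made-up name, and setAudioVolume / from / to / duration are the same as in my handler above):

const fadeWithRaf = (from, to, duration) => {
  const start = performance.now();
  const step = (now) => {
    // now is the DOMHighResTimeStamp passed to the rAF callback
    const progress = Math.min((now - start) / duration, 1);
    this.setAudioVolume(from + (to - from) * progress);
    if (progress < 1) {
      requestAnimationFrame(step);
    }
  };
  requestAnimationFrame(step);
};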

However, since requestAnimationFrame is always advertised as a tool for visual animation, I am reluctant to use it. In particular, the audio fades should also work when the browser or tab is not in the foreground, and even when e.g. a mobile device has its screen turned off.

Is there any official recommendation about whether to use requestAnimationFrame for non-visual animations?

Note: I know about the Web Audio API, but since an AudioContext is advertised as expensive, I would like to avoid it.

CodePudding user response:

No, as you correctly hinted, requestAnimationFrame (rAF) is meant for visual animations, not audio. For audio you would want the smallest interval you can get, and rAF doesn't provide that: it is a hook into the monitor's refresh event, whose rate varies between devices, and it won't fire at all while the page isn't being rendered (background tab, screen off), so it won't be what you need in any event.

For a fade effect, the best option is to use an AudioContext, which comes with methods to do exactly that out of the box.

since an AudioContext is advertised as expensive, I would like to avoid it.

What this link says is that one should avoid creating many AudioContext instances, because (at least it used to be that) each context held a hook on the physical audio device, and managing that resource was a complex matter. I believe this is no longer really true and that browsers can manage multiple AudioContext instances just fine, but even then, it's probably better to create just one.
In any case, even an <audio> element needs to hook into the audio device; it's just that the browser can (generally) manage the various media elements better.
Taken together, these points mean there is no solid reason to prefer an <audio> element over an AudioContext when you want to do something <audio> elements aren't meant to do, and fading is one such thing.
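
If you would rather keep your existing <audio>/<video> element instead of decoding the data yourself, a minimal sketch of the same idea (assuming a single media element on the page; the selector is yours to adapt) routes it through one shared AudioContext and fades with a GainNode:

// one shared AudioContext for the whole page (creating many is what's expensive)
const context = new AudioContext();
const element = document.querySelector("audio"); // your existing media element
// an element can be routed into a context only once
const source = context.createMediaElementSource(element);
const gain = context.createGain();
source.connect(gain).connect(context.destination);

// fade in over 3 seconds; the ramp runs on the audio clock, not on a JS timer
gain.gain.setValueAtTime(Number.EPSILON, context.currentTime);
gain.gain.linearRampToValueAtTime(1, context.currentTime + 3);

And here is a self-contained demo that fetches and decodes a sample itself: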

(async () => {
  const btn = document.querySelector("button");
  const context = new AudioContext();
  const url = "https://upload.wikimedia.org/wikipedia/en/transcoded/d/dc/Strawberry_Fields_Forever_(Beatles_song_-_sample).ogg/Strawberry_Fields_Forever_(Beatles_song_-_sample).ogg.mp3";
  const resp = await fetch(url);
  if (!resp.ok) {
    throw new Error(`Failed to fetch audio: ${resp.status}`);
  }
  const buf = await resp.arrayBuffer();
  const audioBuf = await context.decodeAudioData(buf);
  btn.onclick = (evt) => {
    // the context may start suspended until a user gesture
    context.resume();
    const gain = context.createGain();
    const volume = gain.gain;
    const source = context.createBufferSource();
    source.buffer = audioBuf;
    source.connect(gain);
    gain.connect(context.destination);
    // set current volume (not to zero, but close enough)
    volume.setValueAtTime(Number.EPSILON, context.currentTime);
    // fade in over 3 seconds
    volume.linearRampToValueAtTime(1, context.currentTime + 3);
    // then, 8 seconds later, fade out over 3 seconds
    setTimeout(() => {
      volume.linearRampToValueAtTime(Number.EPSILON, context.currentTime + 3);
    }, 8000);
    source.start(context.currentTime);
  };
  btn.disabled = false;
})();
<button disabled>play</button>
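
Note that the ramps are computed on the audio rendering thread, sample by sample, through the AudioParam automation timeline. The fade therefore does not depend on JavaScript timer granularity at all, which is exactly what removes the stepping/clicking heard with setInterval, and it keeps running while the tab is in the background.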
