The way that audio playback is handled when a device is set to silent is inconsistent between audio/video elements and the Web Audio API.

Steps to reproduce:

1.
* On an iPhone, make sure the volume isn't set to zero.
* But make sure the ring/silent switch is set to silent.
* Press play on a video or audio element.
* The "play video" button here is an example: https://thesession.org/tunes/21871#comment943577

Observed behaviour: you can hear the sound.

2.
* On an iPhone, make sure the volume isn't set to zero.
* But make sure the ring/silent switch is set to silent.
* Press play on an element that triggers the Web Audio API.
* The "play audio" button here is an example: https://thesession.org/tunes/21871#setting45439

Observed behaviour: you can't hear the sound.

Expected behaviour: the two situations should give the same result. Either the sound should be audible in both cases, or it should be inaudible in both cases.

Some background: https://adactio.com/journal/19929

The issue of Web Audio being silenced on mute was raised (and closed) here: https://bugs.webkit.org/show_bug.cgi?id=251532

This is a different but related issue: if Web Audio playback *should* be silenced when the ring/silent switch is enabled, then shouldn't that apply to audio/video elements as well?
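For reference, the two playback paths can be sketched roughly like this. The function names and the file URL are hypothetical, not taken from the linked pages:

```javascript
// Hypothetical sketch of the two playback paths; the URL and function
// names are illustrative only.

// Path 1: an HTMLMediaElement. On iOS this is audible even when the
// ring/silent switch is set to silent.
function playViaMediaElement(url) {
  const el = new Audio(url);
  return el.play();
}

// Path 2: the Web Audio API. On iOS this is silenced by the
// ring/silent switch.
async function playViaWebAudio(url) {
  const ctx = new AudioContext();
  const response = await fetch(url);
  const encoded = await response.arrayBuffer();
  const decoded = await ctx.decodeAudioData(encoded);
  const source = ctx.createBufferSource();
  source.buffer = decoded;
  source.connect(ctx.destination);
  source.start();
}
```

Both paths end up playing the same decoded audio; the inconsistency is only in how the system treats them.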
I've just been pointed to the nascent Audio Session proposal, which sets out to give authors more control over playback: https://github.com/w3c/audio-session/blob/main/explainer.md

Does the issue I filed still stand, seeing as it's about unifying playback behaviour?
The original intent of the Web Audio API was to play short-duration, low-latency sound effects. As such, we made the decision to map AudioContext-generated audio to the system's "Ambient" audio behavior. Web Audio would be allowed to mix with (rather than interrupt) other currently playing audio, such as music or podcasts. However, it would "obey" the mute switch on the device, and would not be allowed to continue playing when the browser was not in the foreground.

Since the primary use case was video-game sound effects, this meant that in-browser game audio would behave the same as native game audio: audio plays through the headphones, obeys the mute switch, mixes with media playback in other applications, and stops when the app is backgrounded.

However, the use cases for Web Audio have grown to include long-form audio playback, and as you and others have noted, this means that pages using Web Audio for long-form playback see inconsistent behavior compared to <audio> element playback. But without extra context from the page about _why_ Web Audio is being used, it's difficult to make "correct" decisions about how that audio should behave.

The intent of the Audio Session API is to allow pages to inform the User Agent of _how_ it should treat the audio generated by the page. If a page is using Web Audio for long-form media playback, it can and should say so, and the audio generated by Web Audio should then be treated similarly to <audio> and <video> playback.
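Under the proposal, a page doing long-form playback via Web Audio could declare that intent up front. A sketch based on the explainer (the API is not yet widely shipped, so defensive feature detection is assumed; the surface may still change):

```javascript
// Sketch based on the Audio Session explainer; the API surface may change.
// Read the session object defensively, since the API is not widely shipped.
const audioSession = globalThis.navigator?.audioSession;

if (audioSession) {
  // Declare that this page's audio is long-form media playback, so the
  // system should treat Web Audio output like <audio>/<video> playback
  // (ignore the mute switch, interrupt other audio) rather than as
  // "ambient" sound effects.
  audioSession.type = "playback";
}
```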
(In reply to Jeremy Keith from comment #0)
> This is a different but related issue: if the Web Audio playback *should* be
> silenced by the ring/silence switch being enabled, then shouldn't that apply
> to audio/video elements as well?

No, because the "mute switch" behavior of each is tied together with other system behaviors: Web Audio is treated as "ambient" sound by iOS, which allows it to mix with other applications' audio instead of interrupting it, at the cost of having to obey the mute switch. Video/audio playback will interrupt other (non-mixable) audio playback, and ignores the mute switch.

In other words, there's no way* to untangle "mixable" from "ignores the mute switch".

* No straightforward way, at least.
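The Audio Session proposal would make that entanglement addressable from the page side: its type values roughly mirror the system categories described above, so a page could ask for the "ambient" combination (mixable, obeys the mute switch) explicitly. A hedged sketch, assuming the type values listed in the explainer:

```javascript
// Sketch assuming the type values from the Audio Session explainer
// ("ambient", "playback", "transient", ...); the API may change.
const session = globalThis.navigator?.audioSession;

if (session) {
  // Opt in to the "ambient" combination for this page's audio:
  // mix with other applications' audio and obey the ring/silent switch.
  session.type = "ambient";
}
```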
Thanks, Jer—that all makes sense. Sounds like the Audio Session API is the way forward here.
<rdar://problem/106086882>