Bug 180680 - Allow AudioContext to start when getUserMedia is on
Summary: Allow AudioContext to start when getUserMedia is on
Status: RESOLVED FIXED
Alias: None
Product: WebKit
Classification: Unclassified
Component: WebRTC
Version: WebKit Nightly Build
Hardware: Unspecified
OS: Unspecified
Importance: P2 Normal
Assignee: youenn fablet
URL:
Keywords: InRadar
Depends on:
Blocks:
 
Reported: 2017-12-11 16:36 PST by youenn fablet
Modified: 2018-09-05 00:02 PDT
CC: 6 users

See Also:


Attachments
Patch (5.73 KB, patch)
2017-12-11 16:38 PST, youenn fablet

Description youenn fablet 2017-12-11 16:36:40 PST
Allow AudioContext to start when getUserMedia is on
Comment 1 youenn fablet 2017-12-11 16:38:21 PST
Created attachment 329057 [details]
Patch
Comment 2 WebKit Commit Bot 2017-12-12 09:26:38 PST
Comment on attachment 329057 [details]
Patch

Clearing flags on attachment: 329057

Committed r225785: <https://trac.webkit.org/changeset/225785>
Comment 3 WebKit Commit Bot 2017-12-12 09:26:39 PST
All reviewed patches have been landed.  Closing bug.
Comment 4 Radar WebKit Bug Importer 2017-12-12 09:27:36 PST
<rdar://problem/35995701>
Comment 5 Ben 2018-09-04 06:14:00 PDT
With this fix we could autoplay a video element that has audio as long as the user previously approved a getUserMedia request.

In the current stable Safari on desktop it's not enough to approve a getUserMedia request at the beginning of the session. The user has to actively capture the mic or webcam to make video autoplay work.
This is a regression. In our app we have one user broadcasting and several viewers. Capturing the viewers' mics just to make autoplay work doesn't make sense.
Comment 6 youenn fablet 2018-09-04 08:33:34 PDT
> In the current stable Safari on desktop it's not enough to approve a
> getUserMedia request at the beginning of the session. The user has to
> actively capture the mic or webcam to make video autoplay work.
> This is a regression. In our app we have one user broadcasting and several
> viewers. Capturing the viewers' mics just to make autoplay work doesn't make sense.

The principle is that the user should make a gesture to activate sound.
That gesture can be the getUserMedia prompt, a click on a video element, a play button, or an "activate sound" button.
Once a page is producing audio content, other video elements should autoplay.
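
For illustration only, here is a minimal sketch of that flow; the handler name, element reference, and constraints are assumptions for this sketch, not part of the patch:

  // Runs inside a click handler, i.e. within a user gesture (assumed setup).
  async function enableSound(videoElement) {
    // The approved getUserMedia prompt itself can act as the activating gesture.
    await navigator.mediaDevices.getUserMedia({ audio: true });

    // Alternatively, start (or resume) an AudioContext from the same gesture
    // so the page can begin producing audio.
    const AudioContextCtor = window.AudioContext || window.webkitAudioContext;
    const audioContext = new AudioContextCtor();
    await audioContext.resume();

    // Once the page is audibly playing, video elements added later should be
    // allowed to autoplay with sound.
    await videoElement.play();
  }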

I am not sure what your exact request is, or which regression you are pointing at.
Comment 7 Ben 2018-09-04 10:33:30 PDT
I'm creating an AudioContext and starting playback in response to a user gesture. I can hear the noise from the AudioContext, but when I later try to autoplay a video element with audio, it is muted unless I'm actively capturing the local mic/cam.

This is the click-event callback used to enable audio:

enableAudio() {
  const audioContext = 'AudioContext' in window ? new AudioContext() : new window.webkitAudioContext();

  // Create a 2-second stereo buffer at the context's sample rate.
  const buffer = audioContext.createBuffer(2, audioContext.sampleRate * 2, audioContext.sampleRate);

  // Fill both channels with white noise.
  for (let channel = 0; channel < 2; channel++) {
    // getChannelData() returns the Float32Array that holds the samples.
    const nowBuffering = buffer.getChannelData(channel);
    for (let i = 0; i < audioContext.sampleRate * 2; i++) {
      // Math.random() is in [0, 1.0]; audio samples need to be in [-1.0, 1.0].
      nowBuffering[i] = Math.random() * 2 - 1;
    }
  }

  const source = audioContext.createBufferSource();
  source.buffer = buffer;
  source.connect(audioContext.destination);
  source.start(0);

  // ...then create a PeerConnection and try to autoplay the remote video+audio.
}
Comment 8 youenn fablet 2018-09-04 11:10:09 PDT
To summarize, your issue is:
- AudioContext is started on user click and produces audio
- video element is being added later on and will not autoplay even though web audio is producing audio

There are two workarounds I can think of right now:
- play the audio of the video element through the AudioContext instead of through the video element itself
- when the user clicks to start the AudioContext, also call play() on the video element from that same gesture
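
A rough sketch of both workarounds; the element selector, button id, and the reuse of the enableAudio() handler from comment 7 are assumptions for illustration:

  const video = document.querySelector('video');  // assumed remote video element

  // Workaround 1: route the video element's audio through the AudioContext
  // graph instead of letting the element output sound on its own.
  function routeThroughAudioContext(audioContext) {
    const source = audioContext.createMediaElementSource(video);
    source.connect(audioContext.destination);
  }

  // Workaround 2: from the same click that starts the AudioContext,
  // also call play() on the video element.
  document.querySelector('#enable-audio').addEventListener('click', () => {
    enableAudio(); // the existing handler from comment 7
    video.play().catch(error => console.log('autoplay still blocked:', error));
  });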
Comment 9 Ben 2018-09-05 00:02:27 PDT
Thank you for the workarounds. Mixing all the audio and playing it through a single AudioContext works with WebRTC streams, but I think it will break lip sync.
It also doesn't help with autoplaying HLS and YouTube videos in the web conference.