Created attachment 371341 [details]
sysdiagnose of the issue
Once in a while, a user on iOS is not able to hear audio from a remote peer. In a multi-party call, it shows up as one person on iOS Safari not being able to hear one of the other participants.
This one is hard to reproduce. I've managed to reproduce it only a couple of times after trying for several hours. I've just been establishing calls over and over again with our app. No specific action seems to cause it. I started looking into it after several of our users reported the issue. I used an iPhone 5s running iOS 12.3.1 to reproduce it. Due to the amount of time it takes to reproduce with our app, I haven't tried to reproduce with a bare-bones p2p setup yet. Let me know if you'd like me to try that.
AFAIK, there's no way to detect the issue. Everything looks normal. The remote audio track is enabled, not ended, not muted, "live", etc. The audio element is playing, not muted, etc. You just can't hear it. I also used the Web Audio API to log the volume level from the remote track and it looks normal as well.
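For reference, the Web Audio check mentioned above can be sketched roughly like this: tap the remote stream with an AnalyserNode and log a crude signal level. This is a sketch of the approach, not our exact code; `remoteStream` is assumed to be the MediaStream carrying the remote audio track.

```javascript
// Byte time-domain data centers on 128; peak deviation from that midpoint
// approximates the signal level (0 = silence).
function peakDeviation(samples) {
  let peak = 0;
  for (const s of samples) peak = Math.max(peak, Math.abs(s - 128));
  return peak;
}

// Periodically log the level of the remote stream's audio.
function logRemoteVolume(remoteStream) {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  ctx.createMediaStreamSource(remoteStream).connect(analyser);
  const samples = new Uint8Array(analyser.fftSize);
  return setInterval(() => {
    analyser.getByteTimeDomainData(samples);
    console.log('remote audio level:', peakDeviation(samples));
  }, 1000);
}
```

Even when the bug is active, the logged level looks normal, which is what makes detection hard.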
Once it happens, it's fairly easy to fix. Calling .pause() then .play() on the audio element fixes it, as does refreshing the page. Note that just calling .play() has no impact. The problem for us is that we can't detect it.
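The recovery described above can be sketched as follows; `audioEl` is assumed to be the audio element rendering the remote track:

```javascript
// pause() followed by play() restores the remote audio;
// play() alone does nothing.
async function recoverRemoteAudio(audioEl) {
  audioEl.pause();
  try {
    await audioEl.play();
  } catch (err) {
    // play() returns a promise that can reject (e.g. autoplay policy),
    // so surface any failure rather than swallowing it.
    console.error('play() after pause() failed:', err);
  }
}
```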
I've attached a sysdiagnose for reference. The issue occurred at 2:18pm PST, about 20 seconds before I ran the sysdiagnose. Hope that helps!
Has anyone been able to reproduce this or have any ideas for what might cause it?
The network data is flowing correctly.
Given pause/play works, that probably makes sense.
We recently fixed some threading issues that might help there.
These are not yet testable on devices, though.
One possibility is that the device is a little bit slow and we are trying to read outside the available ring buffer data. We could add some logging there.
Filed https://bugs.webkit.org/show_bug.cgi?id=199814 to improve release logging for the ring buffer.
We have also heard increased reports of this issue following iOS 13.
Previously we have probably miscategorized these issues in support as a "local audio muted" problem, when in reality it was remote audio failing. Now that we have accurate metrics for local audio failure, we have confirmed this is a separate issue.
Will try to get a repro case.
Interesting that you're getting more reports on iOS 13. This certainly continues to be a big problem for us as well. Can't confirm whether it's more common on iOS 13 or not though. Haven't looked into that yet. If you can get a repro case that would be awesome :)
@youenn Has the Safari team made any more progress on this one? Besides a repro, anything we can do to help move this along?
Thinking about it, increased failures on iOS 13 is probably a false presumption, because previously we filed reports of all audio failures under the local audio capture problem.
Now that we have accurate metrics and workaround for that (which happens to coincide with iOS 13 release), we have started seeing other issues pop up related to audio.
Still no idea what could cause this. For local audio capture issues we could trigger Siri twice to force the error; maybe something similar works for remote audio?
To me it smells like an autoplay issue, but I know that autoplay should be disabled when capturing. Are you adding the video elements before starting the capture, by chance? Or accepting media from the remote side before starting your own?
In our case, we always begin capturing before receiving remote tracks (or even beginning to establish the remote connections). But we don't necessarily wait for the capture to resolve. I just tried to reproduce it by creating the remote connections and rendering audio elements while the capture was occurring but wasn't able to. Note that we use separate audio and video elements to get around the Safari limitation of only allowing 1 video element with audio. Not sure if that matters here, but certainly could.
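Roughly, our rendering setup looks like the sketch below: separate audio and video elements per remote stream, to get around the one-video-element-with-audio limit. Helper names here are ours for illustration, not from any library.

```javascript
// Split a track list by kind so each element only gets the tracks it renders.
function partitionTracks(tracks) {
  return {
    audio: tracks.filter((t) => t.kind === 'audio'),
    video: tracks.filter((t) => t.kind === 'video'),
  };
}

// Attach a remote MediaStream using a muted <video> for the video track
// and a separate <audio> element for the audio track.
function renderRemoteStream(stream, container) {
  const { audio, video } = partitionTracks(stream.getTracks());

  const videoEl = document.createElement('video');
  videoEl.muted = true;        // audio is rendered by the <audio> element
  videoEl.playsInline = true;
  videoEl.srcObject = new MediaStream(video);

  const audioEl = document.createElement('audio');
  audioEl.srcObject = new MediaStream(audio);

  container.append(videoEl, audioEl);
  return Promise.all([videoEl.play(), audioEl.play()]);
}
```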
This bug could be related to https://bugs.webkit.org/show_bug.cgi?id=204682
For me the workaround is to pause and play the video, and then disable and re-enable the audio track.
It seems to me that this is a race condition; it happens more frequently if I display multiple streams (remote and local) at the "same" time.
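That workaround can be sketched like this; `videoEl` and `audioTrack` are assumed to be the element and remote track in question:

```javascript
// Pause/play the video element, then disable and re-enable
// the remote audio track.
async function recoverWithTrackToggle(videoEl, audioTrack) {
  videoEl.pause();
  await videoEl.play();
  audioTrack.enabled = false;
  audioTrack.enabled = true;
}
```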
@Gabor are you able to detect when the issue is happening? Pausing then playing the video element fixes it for me; I don't need to also enable/disable the track. But I'm not able to detect the issue, so I don't know when to pause/play.
We ended up trying to work around this by listening for changes to the muted property, since for us the problem seems to occur whenever the user has gone to the home screen and back. So if the OS unmutes a track, we stop/start all video elements using that MediaStream.
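The heuristic above can be sketched like this. MediaStreamTrack fires 'mute'/'unmute' events, and for us the OS unmuting a remote track coincides with returning from the home screen, so that is when we restart the elements rendering that stream:

```javascript
// Restart the given media elements whenever the OS unmutes the track.
function restartOnUnmute(track, mediaElements) {
  track.addEventListener('unmute', () => {
    for (const el of mediaElements) {
      el.pause();
      el.play().catch((err) => console.error('restart failed:', err));
    }
  });
}
```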
But honestly, at this point there are so many audio issue bugs in the wild it's hard to tell them apart, let alone accurately measure the effect of our fixes.
If I could make only one request for iOS WebRTC in the future, it would be increased resources dedicated to sorting out issues with capturing media and the lifecycle around that. I can't count the number of weird workarounds we have had to put in place because iOS suddenly decides that we can't have audio anymore.
@Dan, our workaround is not stable at all. I could not find a way to detect the issue, and I've even seen several cases where stopping/starting made things worse.
However, I found that when using headphones I can still hear the remote stream; it's just very quiet. Maybe there is some issue with automatic gain control?