Bug 198545 - iOS not hearing other participant
Summary: iOS not hearing other participant
Status: RESOLVED CONFIGURATION CHANGED
Alias: None
Product: WebKit
Classification: Unclassified
Component: WebRTC
Version: Safari 12
Hardware: iPhone / iPad iOS 12
Importance: P2 Normal
Assignee: Nobody
URL:
Keywords: InRadar
Duplicates: 197049 212367
Depends on:
Blocks:
 
Reported: 2019-06-04 15:12 PDT by Dan
Modified: 2022-06-30 16:59 PDT
CC: 18 users

See Also:


Attachments
simctl_diagnose from simulator iOS 14.4 iPhone 11 (40.78 MB, application/x-gzip)
2021-02-02 06:20 PST, Miguel

Description Dan 2019-06-04 15:12:08 PDT
Created attachment 371341 [details]
sysdiagnose of the issue

Once in a while, a user on iOS is not able to hear audio from a remote peer. In a multi-party call, it shows up as one person on iOS Safari not being able to hear one of the other participants.

This one is hard to reproduce. I've managed to reproduce it only a couple of times after trying for several hours. I've just been establishing calls over and over again with our app. No specific action seems to cause it. I started looking into it after several of our users reported the issue. I used an iPhone 5s running iOS 12.3.1 to reproduce it. Due to the amount of time it takes to reproduce with our app, I haven't tried to reproduce with a bare-bones p2p setup yet. Let me know if you'd like me to try that.

AFAIK, there's no way to detect the issue. Everything looks normal. The remote audio track is enabled, not ended, not muted, "live", etc. The audio element is playing, not muted, etc. You just can't hear it. I also used the Web Audio API to log the volume level from the remote track and it looks normal as well.
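
For reference, the volume check can be done roughly like this (a minimal sketch, not the exact code used here; remoteStream is assumed to be the MediaStream holding the remote audio track):

// Create an analyser fed by the remote stream and log an RMS level once a second.
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const source = audioCtx.createMediaStreamSource(remoteStream);
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048;
source.connect(analyser);

const samples = new Float32Array(analyser.fftSize);
setInterval(() => {
  analyser.getFloatTimeDomainData(samples);
  let sum = 0;
  for (const s of samples) sum += s * s;  // sum of squares of the waveform
  const rms = Math.sqrt(sum / samples.length);
  console.log('remote track RMS level:', rms.toFixed(4));
}, 1000);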

Once it happens, it's fairly easy to fix. Calling .pause() then .play() on the audio element fixes it, as does refreshing the page. Note that just calling .play() has no impact. The problem for us is that we can't detect it.
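
For clarity, the workaround boils down to something like this (a minimal sketch; audioEl is assumed to be the <audio> element rendering the remote track):

function recoverRemoteAudio(audioEl) {
  audioEl.pause();
  // play() returns a promise; catch it so an autoplay rejection
  // (e.g. outside a user gesture) doesn't surface as an unhandled error.
  audioEl.play().catch(err => console.warn('play() after pause() failed:', err));
}

Calling play() alone, without the preceding pause(), does not help, as noted above.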

I've attached a sysdiagnose for reference. The issue occurred at 2:18pm PST, about 20 seconds before I ran the sysdiagnose. Hope that helps!
Comment 1 Radar WebKit Bug Importer 2019-06-06 10:22:15 PDT
<rdar://problem/51488659>
Comment 2 Dan 2019-06-11 13:05:29 PDT
Has anyone been able to reproduce this or have any ideas for what might cause it?
Comment 3 youenn fablet 2019-07-15 11:18:14 PDT
The network data is flowing correctly.
Given pause/play works, that probably makes sense.

We recently fixed some threading issues that might help there.
These are not yet testable on devices though.

One possibility is that the device is a little bit slow and we are trying to read outside the available ring buffer data. We could add some logging there.
Comment 4 youenn fablet 2019-07-15 16:34:21 PDT
Filed https://bugs.webkit.org/show_bug.cgi?id=199814 to improve release logging for the ring buffer.
Comment 5 daginge 2019-10-25 00:23:05 PDT
We have also heard increased reports of this issue following iOS 13. 

Previously we had probably wrongly categorized these issues in support as a "local audio muted" problem, when in reality it was the remote audio that failed. Now we have accurate metrics of local audio failure, and we have confirmed this as a separate issue.

Will try to get a repro case.
Comment 6 Dan 2019-10-25 14:48:26 PDT
Interesting that you're getting more reports on iOS 13. This certainly continues to be a big problem for us as well. Can't confirm whether it's more common on iOS 13 or not though. Haven't looked into that yet. If you can get a repro case that would be awesome :)

@youenn Has the Safari team made any more progress on this one? Besides a repro, anything we can do to help move this along?
Comment 7 daginge 2019-10-28 00:48:38 PDT
Thinking about it, the increase in failures on iOS 13 is probably a false presumption, because previously we would file all audio failure reports under the local audio capture problem.

Now that we have accurate metrics and a workaround for that (which happens to coincide with the iOS 13 release), we have started seeing other audio-related issues pop up.

Still no idea what could cause this. For local audio capture issues we could trigger Siri twice to force the error, maybe something similar works for remote audio?

To me it smells like an autoplay issue, but I know that autoplay should be disabled when capturing. Are you adding the video elements before starting the capture by chance? Or accepting media from the remote side before starting your own?
Comment 8 Dan 2019-10-29 15:59:14 PDT
In our case, we always begin capturing before receiving remote tracks (or even beginning to establish the remote connections). But we don't necessarily wait for the capture to resolve. I just tried to reproduce it by creating the remote connections and rendering audio elements while the capture was occurring but wasn't able to. Note that we use separate audio and video elements to get around the Safari limitation of only allowing 1 video element with audio. Not sure if that matters here, but certainly could.
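
For context, the "separate elements" setup mentioned above looks roughly like this (a simplified sketch with assumed variable names, not our production code):

// Render the remote video track in a muted <video> and the remote audio track
// in its own <audio> element.
const videoEl = document.createElement('video');
videoEl.muted = true;
videoEl.autoplay = true;
videoEl.playsInline = true;
videoEl.srcObject = new MediaStream([remoteVideoTrack]);

const audioEl = document.createElement('audio');
audioEl.autoplay = true;
audioEl.srcObject = new MediaStream([remoteAudioTrack]);

document.body.append(videoEl, audioEl);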
Comment 9 daginge 2019-12-10 00:04:01 PST
This bug could be related to https://bugs.webkit.org/show_bug.cgi?id=204682
Comment 10 Gábor Tóth 2020-01-30 15:35:14 PST
For me the workaround is to pause and play the video, and afterwards disable and re-enable the audio track.

It seems to me that this is a race condition; it happens more frequently if I display multiple streams (remote and local) at the "same" time.
Comment 11 Dan 2020-02-02 22:27:33 PST
@Gabor are you able to detect when the issue is happening? Pausing then playing the video element fixes it for me; I don't need to also enable/disable the track. But I'm not able to detect the issue, so I don't know when to pause/play.
Comment 12 daginge 2020-02-03 02:14:46 PST
We ended up trying to work around this by listening for changes to the muted property, since for us the problem seems to occur whenever the user has gone to the home screen and back. So if the OS unmutes a track, we stop/start all video elements using that MediaStream.
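
A rough sketch of that heuristic (assumed shape, not our exact code): when a remote audio track fires unmute, restart every media element that is rendering the stream containing it.

function watchRemoteTrack(track, stream) {
  track.addEventListener('unmute', () => {
    document.querySelectorAll('audio, video').forEach(el => {
      if (el.srcObject === stream) {
        el.pause();
        el.play().catch(() => { /* ignore autoplay rejections */ });
      }
    });
  });
}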

But honestly, at this point there are so many audio issue bugs in the wild it's hard to tell them apart, let alone accurately measure the effect of our fixes.

If I could ask for only one thing for iOS WebRTC in the future, it's increased resources dedicated to sorting out issues with capturing media and the lifecycle around that. I can't count the number of weird workarounds we have had to put in place because iOS suddenly decides that we can't have audio anymore.
Comment 13 Gábor Tóth 2020-02-03 05:27:08 PST
@Dan, our workaround is not stable at all; I could not find a way to detect the issue, and there are even several cases where starting/stopping makes things worse.

However, I found that when using headphones I can still hear the remote stream, it's just very quiet. Maybe there is some issue with auto gain control?
Comment 14 Anna Vasilko 2020-04-02 16:28:25 PDT
We hit this issue a lot with Safari 13 exactly as described by the initial reporter.
iOS Safari can't hear one of the remote participants in a multi-party call.

And to emphasize, this is NOT related to backgrounding, and the tracks are not silent/muted; all remote tracks look normal: enabled, not ended, not muted, "live", etc.

Calling .pause() then .play() on the audio element fixes it, but we can't find a way to detect it programmatically. 

The issue seems to be related to audio track rendering; it happens typically when someone joins a room with several participants or when the dominant speaker changes (basically on any activity that results in re-rendering of audio elements).
Comment 15 youenn fablet 2020-04-03 05:20:43 PDT
*** Bug 197049 has been marked as a duplicate of this bug. ***
Comment 16 youenn fablet 2020-04-09 04:44:59 PDT
We recently fixed https://bugs.webkit.org/show_bug.cgi?id=209411 and https://bugs.webkit.org/show_bug.cgi?id=209412.

In both cases, one possibility would be to reset the srcObject attribute value of the audio or video element that is playing the remote track whenever a remote track goes from muted to unmuted or disabled to enabled.
That will make sure to create a fresh new audio renderer.
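
A minimal sketch of that suggestion (assuming el is the <audio>/<video> element playing the stream that contains the remote track):

track.addEventListener('unmute', () => {
  // Re-assigning srcObject forces a fresh audio renderer to be created.
  const current = el.srcObject;
  el.srcObject = null;
  el.srcObject = current;
  el.play().catch(() => {});
});
// The disabled -> enabled transition has no event, so the application would
// have to run the same reset wherever it flips track.enabled back to true.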
Comment 17 Anna Vasilko 2020-04-15 17:44:36 PDT
(In reply to youenn fablet from comment #16)
> We recently fixed https://bugs.webkit.org/show_bug.cgi?id=209411 and
> https://bugs.webkit.org/show_bug.cgi?id=209412.
> 
> In both cases, one possibility would be to reset the srcObject attribute
> value of the audio or video element that is playing the remote track
> whenever a remote track goes from muted to unmuted or disabled to enabled.
> That will make sure to create a fresh new audio renderer.

This workaround does not seem to fix the issue based on our testing.
The mentioned fixed tickets seem to be related to muting and backgrounding but please note that this audio playback issue happens without muting tracks or backgrounding.
Comment 18 Dan 2020-04-15 20:03:54 PDT
@Anna In your testing, have you found a way to reliably reproduce the issue? Do you find you get it more consistently by doing things in a specific order or having a certain number of connections?
Comment 19 tmendoza 2020-04-16 15:22:27 PDT
Hey @Dan - I'm on Anna's team. We haven't found a way to reliably reproduce the issue. It seems to happen at random. That said, the issue does seem to happen somewhat frequently - maybe 10% of the time.

We haven't found any special conditions that need to exist in order to reproduce the issue. It appears to happen when the mediaStream is attached to an audio element in the DOM. If I detach then re-attach an existing audio track to the DOM, it can trigger the issue about 10% of the time. 

If you're curious, here's a link to the code that we use to attach and detach audio elements to/from the DOM (see the 'attach' and 'detach' methods): https://github.com/twilio/twilio-video.js/blob/master/lib/media/track/mediatrack.js#L118
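
For readers who don't want to follow the link, a generic attach/detach looks roughly like this (a simplified sketch, not the actual twilio-video.js implementation):

function attach(track) {
  const el = document.createElement('audio');
  el.autoplay = true;
  el.srcObject = new MediaStream([track]);
  document.body.appendChild(el);
  return el;
}

function detach(el) {
  el.srcObject = null;
  el.remove();
}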
Comment 20 youenn fablet 2020-04-20 01:57:37 PDT
As shown by https://jsfiddle.net/5of6Lmys/, adding an audio track to a video stream that is already playing will not play audio. Could that explain some of the issues discussed here?
Calling pause/play after addTrack or setting element.srcObject to its same value are potential workarounds.
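
A minimal sketch of the scenario and the two workarounds (assuming videoEl is already playing stream, which initially contains only a video track):

stream.addTrack(remoteAudioTrack);   // audio does not start playing on its own

// Workaround A: pause/play cycle.
videoEl.pause();
videoEl.play().catch(() => {});

// Workaround B: re-assign srcObject to the same value.
videoEl.srcObject = videoEl.srcObject;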
Comment 21 youenn fablet 2020-04-20 03:34:40 PDT
(In reply to youenn fablet from comment #20)
> As shown by https://jsfiddle.net/5of6Lmys/, adding an audio track to a video
> stream that is already playing will not play audio. Could that explain some
> of the issues discussed here?
> Calling pause/play after addTrack or setting element.srcObject to its same
> value are potential workarounds.

Track this issue at https://bugs.webkit.org/show_bug.cgi?id=210740, which might also require https://bugs.webkit.org/show_bug.cgi?id=210494.
Comment 22 Martin Ždila 2020-04-23 04:47:46 PDT
*** Bug 208134 has been marked as a duplicate of this bug. ***
Comment 23 Gareth Lloyd 2020-04-23 05:01:01 PDT
Anyone from Apple care to comment when these audio issues are going to be fixed?
Comment 24 Martin Ždila 2020-04-23 05:35:53 PDT
@youenn no, setting element.srcObject to the same value won't help. Please see https://bugs.webkit.org/show_bug.cgi?id=208134

Also calling pause() and play() is not the solution because on play the volume is again random.
Comment 25 youenn fablet 2020-04-23 06:06:09 PDT
(In reply to Martin Ždila from comment #24)
> @youenn no, setting element.srcObject to the same value won't help. Please
> see https://bugs.webkit.org/show_bug.cgi?id=208134
> 
> Also calling pause() and play() is not the solution because on play the
> volume is again random.

Audio volume is probably a separate issue; let's keep the other bug open.
Comment 26 Sarmistha 2020-04-24 05:42:05 PDT
We are seeing 2 issues with iOS 13.3.1. Participant A on iPhone 11 with iOS 13.3.1 and participant B on IE(Electron).

1. Participant A on iPhone can't hear participant B after WebRTC connection is established. Mobile browser refresh on iPhone is resolving it.

2. Participant B on IE can't hear participant A on iPhone after WebRTC connection is established. Mobile browser refresh on iPhone is NOT resolving it.

This issue is causing lots of user dissatisfaction. Will the fix of 208134 resolve both conditions?

Is there any update on fix timeline?
Comment 27 youenn fablet 2020-05-06 02:03:10 PDT
https://bugs.webkit.org/show_bug.cgi?id=211287 helps a lot according to my testing.
The only potential workaround I can think of so far is to stop audio capture while setting up the audio.
Comment 28 youenn fablet 2020-06-08 11:02:58 PDT
This should be fixed in latest iOS 13.5.5 beta.
Comment 29 Francesco Durighetto (kekkokk) 2020-07-01 03:46:51 PDT
(In reply to youenn fablet from comment #28)
> This should be fixed in latest iOS 13.5.5 beta.

Thanks Youenn,
the "read out of the ring buffer" seems to be fixed in 13.6 beta 2 (the audio of the remote streams is always playing)

However, some of the bugs reported here and/or connected to this one are still present in some form in the latest 13.6 beta 3.

I found other major bugs that hurt the user experience and cannot be caught programmatically, which is quite disruptive for our services, especially in this lockdown period since video collaboration has had an impressive boost:

1) when entering a call (with audio constraints {video: true, audio: {echoCancellation: true}} ) the remote stream is playing ok in iOS Safari.
- I put AirPods in: the switch works flawlessly; audio and microphone are now picked up from the earphones.
- I remove the AirPods: the switch to the internal device microphone works OK, but the remote audio is NOT playing (html audio tag -> paused: false) (audio MediaStreamTrack -> enabled: true, readyState: live, muted: false); no pause event is emitted from the html audio tag, nor are muted or ended events emitted from the MediaStreamTrack.
- Try to call .play() on the html audio tag -> nothing happens
- Try to call .pause() on the html audio tag -> a pause event is emitted from the html audio tag
- Try to call .play() again and finally the audio resumes, BUT with a very low volume

The only ways to "unlock" the volume are:
- calling new webkitAudioContext()
- rotating the screen of the device
- doing a pinch to zoom (this leads to another nested bug, see * on the bottom)

- Putting the AirPods back in, the audio is OK but the volume is again really low.
- Removing them again, no audio is emitted by the phone at all.


2) Now let's do the same thing but with the AirPods already in the ears.
- Put the AirPods in.
- Join the call: audio plays fine in the AirPods and the AirPods mic is selected automatically.
- Remove the AirPods: audio from the speakerphone is very low, and the only 3 ways I found to unlock it are the same as described before.
- Also, the audio is not as crystal clear as when joining the call without AirPods. I think the AirPods' output bitrate is kept when switching from the AirPods to the iPhone (not a big deal, but it could help debugging).

3) The onended event is not emitted by the MediaStreamTrack whenever a GSM call is received.
The GSM call "steals" the microphone from the web page and never gives it back after the call.
This is a really big issue since we cannot do a getUserMedia again after the call to get control back.
Instead, a muted event is emitted from the MediaStreamTrack. This is not spec-compliant, since it implies the microphone is only temporarily muted and will eventually come back (which is the behavior I expect, but not what happens). With the camera, however, this works well: when the page is put in the background, the capturer stops and the stream emits muted, but when Safari is opened again, the camera is captured again.
So, the best fix would be to emit a microphone muted event whenever another app "steals" the microphone (such as a GSM call) and eventually give it back once the other app is done, or emit an ended event if you do not plan to give control back to Safari.

4) It would be nice if the play/pause control in the dropdown menu widget did not interfere with realtime WebRTC calls.


* When doing a pinch to zoom, a pause event is emitted once from the video html tag. All subsequent pinch-to-zooms emit a pause event TWICE, so the video continues playing normally.

Let me know if you prefer I open these as separate bugs, but please consider working on them before the next 13.6 release. iOS 13 was a nightmare from the audio perspective in WebRTC, and users keep complaining about it.

Question: is the Safari WebKit engine on iOS updated only with iOS releases, or is there a "hotfix" branch that auto-updates itself like an app?

thanks!
Comment 30 youenn fablet 2020-07-01 07:06:29 PDT
@francesco, would you mind filing specific new bugs, thanks!
Comment 31 Art Matsak 2020-10-16 08:05:24 PDT
iOS 13.6 had this issue fixed but it looks like it's made a comeback (albeit with lower frequency of occurrence) in iOS 14. Is there a regression of some sort?
Comment 32 youenn fablet 2020-10-16 08:06:31 PDT
(In reply to Art Matsak from comment #31)
> iOS 13.6 had this issue fixed but it looks like it's made a comeback (albeit
> with lower frequency of occurrence) in iOS 14. Is there a regression of some
> sort?

If you can reproduce, could you send me a sysdiagnose privately?
Comment 33 mawojtcz 2020-11-24 08:47:27 PST
We're also seeing this issue - I assume it's the same one, because we sometimes have a video element that's playing remote video fine but with no audio, and calling pause() and then play() on that element fixes it. The same issue also happens on a pure audio element.
The problem is that we don't have a way to detect when to trigger this workaround of calling pause() and play(). So far we've been calling it whenever we changed the srcObject property on the media element, and that seemed to work fine on iOS 13 - we couldn't reproduce the issue anymore. Now, with the latest iOS 14.3 Beta 2, I've found the issue is happening again.
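
The heuristic we use is roughly the following (a sketch with an assumed helper name, not our exact code): wrap every srcObject assignment so the pause()/play() cycle always runs right after it.

function setSrcObject(el, stream) {
  el.srcObject = stream;
  el.pause();
  el.play().catch(() => { /* ignore autoplay rejections */ });
}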
Comment 34 Jovan Chohan 2020-11-30 17:55:55 PST
We also started seeing this issue in iOS 14.2 and have been getting a lot of complaints about it recently. It occurs randomly, maybe 1 in 10 times. Video was working, but the user on the iOS Safari 14.2 device could not hear the other participant. Our workaround was to play the audio stream from the MediaStream audio track through an AudioContext.
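
That workaround amounts to roughly the following (a minimal sketch, not our exact code; remoteStream is assumed to be the MediaStream with the remote audio track):

const ctx = new (window.AudioContext || window.webkitAudioContext)();
ctx.createMediaStreamSource(remoteStream).connect(ctx.destination);
// If the context was created outside a user gesture it may start suspended,
// so resume it on the next user interaction.
if (ctx.state === 'suspended') {
  document.addEventListener('click', () => ctx.resume(), { once: true });
}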
Comment 35 mawojtcz 2020-12-09 05:57:04 PST
Youenn, I've sent you an email with a sysdiagnose for this issue from an iPad running the latest 14.3 beta.
Comment 36 Miguel 2021-02-02 06:20:06 PST
Created attachment 418987 [details]
simctl_diagnose from simulator iOS 14.4 iPhone 11

We can reproduce this audio issue in our web application quite often on real iPhones and in the simulator too. Please find a diagnose attached.

The issue happens randomly on the initial load of the browser and when refreshing the browser. It can be reproduced consistently after switching back and forth a few times to another Safari tab, another app, or the home screen.

In this test, the issue is reproduced but not fixed. The video element is not muted; it is created and played on a click event. We do not capture a media stream before or after the video plays. The MediaStream is enabled and always has readyState "live". The video track is remote and plays without issues. The audio track is remote, is not muted, is enabled, and always has readyState "live".

The only workaround that fixed the issue consistently was playing the audio track through an AudioContext, but a secondary issue appeared that might point us to the root cause: when the audio is lost and the MediaStream is played through an AudioContext, the audio track plays on the left channel/speaker only.

I tried the following workarounds but they do not solve the issue consistently:

- pause and play the video
- reset the srcObject of the video
- disable and enable the audio track or the MediaStream  

Hope this helps, thanks to everybody!
Comment 37 Brent Fulgham 2022-06-30 16:59:42 PDT
*** Bug 212367 has been marked as a duplicate of this bug. ***