Bug 202405 - Regression: iOS 13.1 MediaStreamTrack.enabled = false kills audio track
Summary: Regression: iOS 13.1 MediaStreamTrack.enabled = false kills audio track
Status: RESOLVED FIXED
Alias: None
Product: WebKit
Classification: Unclassified
Component: WebRTC
Version: Other
Hardware: Unspecified
OS: Unspecified
Importance: P2 Normal
Assignee: youenn fablet
URL:
Keywords: InRadar
Duplicates: 203382 (view as bug list)
Depends on:
Blocks:
 
Reported: 2019-10-01 07:58 PDT by alan.ford
Modified: 2020-04-07 12:17 PDT (History)
23 users

See Also:


Attachments
Screenshot (203.92 KB, image/png)
2019-10-01 07:58 PDT, alan.ford
no flags Details
Patch (3.23 KB, patch)
2019-10-03 07:02 PDT, youenn fablet
no flags Details | Formatted Diff | Diff
Patch for landing (3.23 KB, patch)
2019-10-03 08:24 PDT, youenn fablet
no flags Details | Formatted Diff | Diff
Patch (7.44 KB, patch)
2019-10-03 09:20 PDT, youenn fablet
no flags Details | Formatted Diff | Diff

Description alan.ford 2019-10-01 07:58:08 PDT
Created attachment 379902 [details]
Screenshot

To mute an audio track in WebRTC, you can set "enabled" on the MediaStreamTrack to false. This worked fine until iOS 13.1 (it may also have been broken in iOS 13), but now if you set "enabled" to false, "readyState" goes from "live" to "ended" and no audio flows in either direction from then on.

This does not fail on video tracks.

You can test this with the Safari console on https://webrtc.github.io/samples/src/content/peerconnection/pc1/ - see screenshot.
Comment 1 daginge 2019-10-02 00:00:11 PDT
I can confirm this on my device as well; we've had several reports from users about it. Currently on 13.1; I will update to 13.1.2 and report back.
Comment 2 daginge 2019-10-02 01:08:11 PDT
This bug is not fixed on iOS 13.1.2. I can reproduce the error 100% of the time with the following:

1. Go to https://webrtc.github.io/samples/src/content/peerconnection/pc1/ and initiate a call.
2. Observe that you get sound both ways by listening for feedback.
3. Open the developer console
4. Enter pc1.getSenders().forEach(s => {s.track.kind === "audio" ? s.track.enabled = false : null})
5. Observe that sound now disappears. Unmuting the track by reversing step 4 does not bring sound back.
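For reference, steps 4 and 5 boil down to a small helper like this (a sketch only; `setLocalAudioEnabled` is an illustrative name, not an API):

```javascript
// Illustrative helper mirroring steps 4-5 above: toggle the track on every
// local audio sender of a peer connection.
function setLocalAudioEnabled(pc, enabled) {
  for (const sender of pc.getSenders()) {
    if (sender.track && sender.track.kind === "audio") {
      sender.track.enabled = enabled;
    }
  }
}

// Mute:   setLocalAudioEnabled(pc1, false);
// Unmute: setLocalAudioEnabled(pc1, true);
// On the affected iOS 13 builds, the unmute call no longer restores audio.
```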

Furthermore, we are able to reproduce this with remote audio using whereby (previously appear.in). Steps to reproduce:

1. Go to https://whereby.com/daginge1231231234 (p2p) on iOS 13
2. Join using another browser (it doesn't matter which one; I've tested Chrome and iOS Safari).
3. Observe that you are getting audio running both ways (usually feedback)
4. Mute audio on iOS 13. Observe that audio now disappears both locally and from remote party
5. Observe that unmuting audio does not fix the problem

The only way to work around this issue is to hard-refresh the page.

This is quite serious, and we've had to disable muting local audio on iOS 13 as a result.
Comment 3 youenn fablet 2019-10-02 01:46:48 PDT
I confirm this issue reproduces on a recent iOS 13 build, but not on older ones.
It does not seem to reproduce on macOS.
Comment 4 daginge 2019-10-02 02:09:20 PDT
Workaround as suggested by fippo/philipp: use replaceTrack(null) to remove the audio track from the PC instead of muting it. Remember not to add the audio track on new peer connections, though I guess this is less of an issue for you SFU users.

We'll implement this change now and test it out a bit.
Comment 5 daginge 2019-10-02 04:54:59 PDT
A word of warning to others who might try the replaceTrack workaround: I'm seeing some serious delays after unmuting in certain cases on libwebrtc-based WebRTC engines (Safari included). Firefox seems unaffected. So don't go changing all of your muting code to replaceTrack; it might cause other painful issues, so it's best to keep this to iOS 13 only.

FWIW here is my workaround code:

// Grab the local audio track and find the sender that carries audio.
// The sender's track may be null if we previously replaced it to mute.
const audioTrack = this.mediaStream.stream.getAudioTracks()[0];
const audioSender = this.pc
  .getSenders()
  .find(s => (s.track && s.track.kind === "audio") || s.track === null);
// Detach the track to mute, reattach it to unmute.
audioSender.replaceTrack(action.isMuted ? null : audioTrack);

And remember to stop setting track.enabled wherever you might do that in your code.
Comment 6 Philipp Hancke 2019-10-02 09:48:55 PDT
(yay, my old email still works :-)

The classic e2e test for this is to disable a track, ensure the remote end gets silence, re-enable, and ensure the remote end hears something again. WPT doesn't have anything for this, which seems like a gap. The closest is https://github.com/web-platform-tests/wpt/blob/master/mediacapture-streams/MediaStreamTrack-MediaElement-disabled-audio-is-silence.https.html, which shows the analyser but lacks the "re-enable" step and has no peer connection in between.
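A sketch of the silence check such a test needs, assuming the receiving side exposes its audio via an AnalyserNode (`isSilent` and its threshold are my own helper, not part of any spec or test suite):

```javascript
// Decide whether a buffer of time-domain samples is effectively silence.
// AnalyserNode.getFloatTimeDomainData() fills the buffer with samples
// centered on 0, so "silence" means every sample stays within the threshold.
function isSilent(samples, threshold = 1e-3) {
  return samples.every(s => Math.abs(s) < threshold);
}

// Browser-side usage (sketch):
//   const ctx = new AudioContext();
//   const analyser = ctx.createAnalyser();
//   ctx.createMediaStreamSource(remoteStream).connect(analyser);
//   const buf = new Float32Array(analyser.fftSize);
//   analyser.getFloatTimeDomainData(buf);
//   // disable the sender's track, wait, expect isSilent(buf);
//   // re-enable, wait, expect !isSilent(buf): the step this bug breaks.
```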
Comment 7 youenn fablet 2019-10-02 10:21:33 PDT
There is https://github.com/WebKit/webkit/blob/master/LayoutTests/webrtc/peer-connection-audio-mute2.html which is expected to cover this.

I am not sure why it did not catch this.
The bug is probably in the real capture code, while the tests use mock capture.
Comment 8 Radar WebKit Bug Importer 2019-10-02 13:42:45 PDT
<rdar://problem/55922616>
Comment 9 Xin 2019-10-02 19:58:06 PDT
Hi daginge and youenn,

I have tried this on an iPhone 11 (iOS 13.1.2), and repeatedly muting and unmuting works properly.

So I am not sure if this one is specific to certain models.


Xin
Comment 10 daginge 2019-10-02 23:46:01 PDT
Hi Xin!

It works if you do it quickly, but if the track stays muted for anything longer than about 2 seconds, it kills all sound. We have reports of the same issue from iPhone 11 (Pro and regular), X, XS, and iPhone 8.
Comment 11 daginge 2019-10-03 00:05:34 PDT
FWIW, we saw some reports from the Facebook group of the Association for the Vision Impaired in Norway (botched that translation) that they experienced a similar issue, with all sound disappearing after hanging up a call.

This was at the OS level, and it was quite serious for them, as they may no longer hear the phone when it rings, or the sound from the other party may no longer work. So this issue might be at a lower level than Safari.
Comment 12 Szymon Witamborski 2019-10-03 02:27:54 PDT
I've built a very minimal codepen to replicate this issue. It looks like it is not related to RTCPeerConnection at all; all it takes is an audio-only MediaStream connected to an <audio> element.

https://codepen.io/brainshave/pen/oNNvgZP

Steps:

- Click "Start" and accept microphone access
- Uncheck "Audio Enabled"
- Wait a couple of seconds
- You should see "local audio track ended"

Works every time provided:

- screen recording is off
- MediaStream is set as srcObject on the audio element

Tested with a Bluetooth headset; there was no difference. I also had a version with an RTCPeerConnection, but that proved unnecessary.

It does seem like something at the system level, because turning on screen recording prevents it from happening. Perhaps some kind of power-saving feature?

Tested on iPhone XR and iPhone 8, both iOS 13.1.2.

Our app doesn't have any workarounds for this issue yet, but we do re-request audio access if the track emits an "ended" event. In our case the user will then see the microphone permission prompt again, which is not ideal, but at least it allows them to continue with their call.
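The codepen repro boils down to something like this sketch (the function name and wiring are illustrative, not taken from the codepen):

```javascript
// Minimal repro wiring: attach an audio-only stream to an <audio> element
// and log when the local audio track ends. On affected iOS 13 builds,
// setting track.enabled = false and waiting a couple of seconds fires "ended".
function wireEndedLogging(stream, audioEl, log) {
  const [track] = stream.getAudioTracks();
  audioEl.srcObject = stream; // attaching srcObject is required to trigger the bug
  track.addEventListener("ended", () => log("local audio track ended"));
  return track;
}

// In the page:
//   const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
//   const track = wireEndedLogging(stream, document.querySelector("audio"), console.log);
//   track.enabled = false; // then wait a couple of seconds
```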
Comment 13 youenn fablet 2019-10-03 07:02:02 PDT
Created attachment 380109 [details]
Patch
Comment 14 daginge 2019-10-03 07:22:02 PDT
Re: patch. I guess this explains why remote audio would disappear. Autoplay policies are lifted while the page is capturing, but this bug causes capturing to stop, reinstating autoplay policies. Since the video element's playback was never allowed via a user gesture, the audio output then stops. I love it.
Comment 15 Eric Carlson 2019-10-03 07:39:50 PDT
Comment on attachment 380109 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=380109&action=review

r=me once the bots are happy

> Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:229
>  bool MediaStreamPrivate::hasCaptureAudioSource() const

Nit: we should probably rename this to something like "hasActiveAudioSource" now that it considers ended and muted states.

> Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:232
> -        if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack())
> +        if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack() && !track->ended() && !track->muted())

Should hasCaptureVideoSource also consider the ended and muted states?
Comment 16 youenn fablet 2019-10-03 07:43:33 PDT
> > Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:232
> > -        if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack())
> > +        if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack() && !track->ended() && !track->muted())
> 
> Should hasCaptureVideoSource also consider the ended and muted states?

Agreed. We should also probably make MediaStreamTrack a PlatformMediaSessionClient; that would catch some cases of tracks with no MediaStream.
I tried to keep the patch as small as possible and plan to do a follow-up for these changes.
Comment 17 youenn fablet 2019-10-03 08:24:54 PDT
Created attachment 380117 [details]
Patch for landing
Comment 18 youenn fablet 2019-10-03 08:54:01 PDT
Actually, I can probably write a test for it.
Comment 19 youenn fablet 2019-10-03 09:20:42 PDT
Created attachment 380121 [details]
Patch
Comment 20 WebKit Commit Bot 2019-10-03 11:09:39 PDT
Comment on attachment 380121 [details]
Patch

Clearing flags on attachment: 380121

Committed r250663: <https://trac.webkit.org/changeset/250663>
Comment 21 WebKit Commit Bot 2019-10-03 11:09:42 PDT
All reviewed patches have been landed.  Closing bug.
Comment 22 Eric Carlson 2019-11-08 09:43:32 PST
*** Bug 203382 has been marked as a duplicate of this bug. ***
Comment 23 Manik 2019-11-20 09:36:41 PST
Hi Youenn, when can this fix be expected to be released? Apologies if I'm not reading the status correctly, but I see it as RESOLVED FIXED, meaning it's still waiting for a QA cycle to be verified. Is that correct?
Comment 24 Manjesh Malavalli 2020-01-06 17:49:41 PST
Hi Youenn,

I think this bug is still present in iOS 13.3. I ran the JSFiddle in https://bugs.webkit.org/show_bug.cgi?id=203382 and was able to reproduce the observed behavior. A comment in that bug said that the fix was released in 13.2.2. Can you please clarify?

- Manjesh
Comment 25 cibernaio 2020-01-28 07:10:17 PST
Hi,
Any news about this fix?

Thanks.

Bye
Comment 26 Keyur Patel 2020-03-30 12:47:49 PDT
Any update on the status of this bug? Looks like this is still reproducible on 13.2 and 13.3?
Comment 27 Jay Charles 2020-04-07 10:22:40 PDT
I can confirm this is still an issue in Safari on iOS 13.4.

As previous comments suggest, the problem only surfaces if the remote MediaStreamTrack is attached to a video or audio element. If I attach only the video track to the video element and send the audio track to an AudioContext, all works as expected.
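The workaround I'm describing looks roughly like this (a sketch only; the function name is illustrative):

```javascript
// Attach only the video track of a remote stream to the <video> element,
// and route the remote audio track through Web Audio instead of a media
// element, which avoids triggering the bug.
function attachRemoteTracks(remoteStream, videoEl, audioCtx) {
  // Video-only stream for the element.
  videoEl.srcObject = new MediaStream(remoteStream.getVideoTracks());
  // Audio goes through an AudioContext instead of srcObject.
  const source = audioCtx.createMediaStreamSource(
    new MediaStream(remoteStream.getAudioTracks()));
  source.connect(audioCtx.destination);
  return source;
}
```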
Comment 28 youenn fablet 2020-04-07 12:17:59 PDT
(In reply to Jay Charles from comment #27)
> Can confirm this is still an issue on safari / ios 13.4
> 
> As previous comments suggest, the problem only surfaces if the remote
> MediaStreamTrack is attached to a video or audio element. If I attach only
> the video track to the video element and send the audio track to an
> AudioContext, all works as expected.

Hi Jay,

This might not be the same bug, as this one should be fixed in iOS 13.4.
Since you can get data via AudioContext, the issue might be in the audio renderer or in the autoplay policies.