Created attachment 379902 [details]
Screenshot

To mute an audio track in WebRTC, you can set "enabled" on the MediaStreamTrack to false. This worked fine up until iOS 13.1 (and may also be broken in iOS 13), but now if you set "enabled" to false, "readyState" goes from "live" to "ended" and no audio flows in either direction from then on.

This does not affect video tracks.

You can test this with the Safari console on https://webrtc.github.io/samples/src/content/peerconnection/pc1/ - see screenshot.
Can confirm this on my device as well, we've had several reports from users about this. Currently on 13.1, will update to 13.1.2 and report back.
This bug is not fixed on iOS 13.1.2. I can reproduce the error 100% of the time with the following:

1. Go to https://webrtc.github.io/samples/src/content/peerconnection/pc1/ and initiate a call.
2. Observe that you get sound both ways by listening for feedback.
3. Open the developer console.
4. Enter pc1.getSenders().forEach(s => { if (s.track.kind === "audio") s.track.enabled = false; })
5. Observe that sound now disappears. Unmuting the track by reversing step 4 does not bring sound back.

Furthermore, we are able to reproduce this with remote audio using Whereby (previously appear.in). Steps to reproduce:

1. Go to https://whereby.com/daginge1231231234 (p2p) on iOS 13.
2. Join using another browser; it doesn't matter which one, I've tested Chrome and iOS Safari.
3. Observe that audio is running both ways (usually feedback).
4. Mute audio on iOS 13. Observe that audio now disappears both locally and for the remote party.
5. Observe that unmuting audio does not fix the problem.

The only way to work around this issue is to hard refresh the page. This is quite serious, and we've had to disable muting local audio on iOS 13 as a result.
I confirm this issue reproduces on a recent iOS 13, but not older ones. This does not seem to reproduce on macOS.
Workaround as suggested by fippo/philipp: use replaceTrack(null) to remove the audio track from the PC instead of muting it. Remember not to add the audio track on new peer connections, though I guess this is less of an issue for you SFUers. We'll implement this change now and test it out a bit.
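For illustration, a minimal sketch of that workaround (the helper names and the shape of setAudioMuted are mine, not from the thread; the sender lookup also matches a null track, since that is what the sender holds after a previous replaceTrack(null)):

```javascript
// Find the RTCRtpSender that carries (or carried) the audio track.
// After replaceTrack(null), the sender's track is null, so match that too.
function findAudioSender(senders) {
  return senders.find(
    s => (s.track && s.track.kind === "audio") || s.track === null
  );
}

// Mute by detaching the track from the sender instead of toggling
// track.enabled, which kills capture on iOS 13. Keep a reference to the
// original track around so it can be re-attached on unmute.
async function setAudioMuted(pc, audioTrack, muted) {
  const sender = findAudioSender(pc.getSenders());
  if (sender) await sender.replaceTrack(muted ? null : audioTrack);
}
```

Since replaceTrack does not require renegotiation for same-kind tracks, this should be transparent to the remote side.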
Word of warning to others who might try the replaceTrack workaround: I'm seeing some serious delays after unmuting in certain cases on Chromium-based WebRTC engines (Safari included). Firefox seems unaffected. So don't go changing all of your muting code to replaceTrack; it might cause other painful issues, so best to keep this to iOS 13 only. FWIW, here is my workaround code:

const audioTrack = this.mediaStream.stream.getAudioTracks()[0];
const audioSender = this.pc
  .getSenders()
  .find(s => (s.track && s.track.kind === "audio") || s.track === null);
audioSender.replaceTrack(action.isMuted ? null : audioTrack);

And remember to stop setting track.enabled wherever you might do that in your code.
(yay, my old email still works :-) The classic e2e test for this is to disable a track, ensure the remote gets silence, reenable and ensure the remote end hears something again. WPT doesn't have anything which seems pretty lacking. The closest is https://github.com/web-platform-tests/wpt/blob/master/mediacapture-streams/MediaStreamTrack-MediaElement-disabled-audio-is-silence.https.html which shows the analyzer but lacks the "reenable" step and has no peerconnection in between.
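For reference, the silence check in such an e2e test could be done with an AnalyserNode; a rough sketch (the RMS threshold is an arbitrary guess, and makeSilenceProbe is my own name):

```javascript
// Decide whether a buffer of time-domain samples is effectively silent.
// Float32 time-domain data from an AnalyserNode is centered on 0.
function isSilent(samples, threshold = 0.01) {
  let sumSquares = 0;
  for (const s of samples) sumSquares += s * s;
  const rms = Math.sqrt(sumSquares / samples.length);
  return rms < threshold;
}

// Browser-side glue (sketch): probe the remote stream for silence.
function makeSilenceProbe(audioContext, remoteStream) {
  const source = audioContext.createMediaStreamSource(remoteStream);
  const analyser = audioContext.createAnalyser();
  source.connect(analyser);
  const buf = new Float32Array(analyser.fftSize);
  return () => {
    analyser.getFloatTimeDomainData(buf);
    return isSilent(buf);
  };
}
```

A test would then poll the probe: disabled track → true, re-enabled track → eventually false again.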
There is https://github.com/WebKit/webkit/blob/master/LayoutTests/webrtc/peer-connection-audio-mute2.html which is expected to cover this. I am not sure why this did not catch it. Probably the bug is in real capture code while the tests use mock capture.
<rdar://problem/55922616>
Hi daginge and youenn, I have tried this on an iPhone 11 (iOS 13.1.2), and repeatedly muting and unmuting works properly. So I am not sure if this one is related to specific models. Xin
Hi Xin! It works if you do it fast, but if the track stays muted for anything longer than 2 seconds it will kill all sound. We have reports from iPhone 11 (Pro and regular), X, XS and iPhone 8 with the same issue.
FWIW, we saw some reports in the Facebook group for the Association for the Vision Impaired in Norway (botched that translation) that they experienced a similar issue with all sound disappearing after hanging up a call. This was at the OS level, and it was quite serious for them, as they may no longer hear the phone when it rings, or the sound from the other party may no longer work. So this issue might be at a lower level than Safari.
I've built a very minimal CodePen to replicate this issue. It looks like this is not related to RTCPeerConnection at all; all it takes is an audio-only MediaStream connected to an <audio> element. https://codepen.io/brainshave/pen/oNNvgZP

Steps:
- Click "Start" and accept microphone access
- Uncheck "Audio Enabled"
- Wait a couple of seconds
- You should see "local audio track ended"

It works every time provided:
- screen recording is off
- the MediaStream is set as srcObject on the audio element

Tested with a Bluetooth headset and there was no difference. I also had a version of it with an RTCPeerConnection, but that proved unnecessary. It does seem like something at the system level, because turning on screen recording prevents it from happening. Perhaps some kind of power-saving feature? Tested on iPhone XR and iPhone 8, both on iOS 13.1.2.

Our app doesn't have any workarounds in place yet for this issue, but we do re-request audio access if the track emits the "ended" event, so in our case the user will see the prompt for accessing the microphone again, which is not ideal but at least allows them to continue with their call.
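The re-request-on-"ended" fallback described above might be wired up roughly like this (watchForEndedTrack and its callbacks are hypothetical names; re-attaching the fresh track to the peer connection is left to the caller):

```javascript
// Reacquire the microphone if iOS 13 kills the capture track.
// `getNewStream` would typically be
//   () => navigator.mediaDevices.getUserMedia({ audio: true })
// and `reacquire` receives the fresh track, e.g. to feed it back into
// an RTCRtpSender via replaceTrack.
function watchForEndedTrack(track, getNewStream, reacquire) {
  track.addEventListener("ended", async () => {
    const stream = await getNewStream();
    reacquire(stream.getAudioTracks()[0]);
  });
}
```

Note this triggers the permission prompt again on iOS, as described above, so it is a recovery path rather than a fix.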
Created attachment 380109 [details] Patch
Re: patch. I guess this explains why remote audio would disappear. Autoplay policies are removed when the page is capturing, but this bug will cause capturing to stop, reinstating autoplay policies. Now since the video element wasn't allowed to play due to a user event, the audio output will stop. I love it.
Comment on attachment 380109 [details] Patch View in context: https://bugs.webkit.org/attachment.cgi?id=380109&action=review r=me once the bots are happy > Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:229 > bool MediaStreamPrivate::hasCaptureAudioSource() const Nit: we should probably rename this to something like "hasActiveAudioSource" now that it considers ended and muted states. > Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:232 > - if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack()) > + if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack() && !track->ended() && !track->muted()) Should hasCaptureVideoSource also consider the ended and muted states?
> > Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp:232 > > - if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack()) > > + if (track->type() == RealtimeMediaSource::Type::Audio && track->isCaptureTrack() && !track->ended() && !track->muted()) > > Should hasCaptureVideoSource also consider the ended and muted states? Agreed, we should also probably make MediaStreamTrack a PlatformMediaSessionClient, that would catch some cases of tracks with no MediaStream. I tried to keep the patch as small as possible and plan to do a follow-up for these changes.
Created attachment 380117 [details] Patch for landing
Actually, I can probably write a test for it.
Created attachment 380121 [details] Patch
Comment on attachment 380121 [details] Patch Clearing flags on attachment: 380121 Committed r250663: <https://trac.webkit.org/changeset/250663>
All reviewed patches have been landed. Closing bug.
*** Bug 203382 has been marked as a duplicate of this bug. ***
Hi Youenn, when can this be expected to be released? Apologies if I'm not reading the status correctly, but I see it as Resolved Fixed meaning it's still waiting for a QA cycle to be verified. Is that correct?
Hi Youenn, I think this bug is still present in iOS 13.3. I ran the JSFiddle in https://bugs.webkit.org/show_bug.cgi?id=203382 and was able to reproduce the observed behavior. A comment in that bug said that the fix was released in 13.2.2. Can you please clarify? - Manjesh
Hi, any news about this fix? Thanks. Bye.

(In reply to alan.ford from comment #0)
> Created attachment 379902 [details]
> Screenshot
>
> To mute an audio track in WebRTC, you can set "enabled" on the
> MediaStreamTrack to false. This worked fine up until iOS 13.1 (and maybe
> also not in iOS 13), but now if you set "enabled" to false, then
> "readyState" goes from "live" to "ended" and no audio flows in either
> direction from then on.
>
> This does not fail on video tracks.
>
> You can test this with the Safari console on
> https://webrtc.github.io/samples/src/content/peerconnection/pc1/ - see
> screenshot.
Any update on the status of this bug? Looks like this is still reproducible on 13.2 and 13.3?
Can confirm this is still an issue on Safari / iOS 13.4. As previous comments suggest, the problem only surfaces if the remote MediaStreamTrack is attached to a video or audio element. If I attach only the video track to the video element and send the audio track to an AudioContext, everything works as expected.
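A sketch of that split, assuming an AudioContext that has already been resumed from a user gesture (splitByKind and attachRemoteStream are my names, not from a library):

```javascript
// Split a list of MediaStreamTracks by kind.
function splitByKind(tracks) {
  return {
    video: tracks.filter(t => t.kind === "video"),
    audio: tracks.filter(t => t.kind === "audio"),
  };
}

// Browser-side glue (sketch): only the video track goes to the <video>
// element; the audio track is rendered through Web Audio instead.
function attachRemoteStream(videoElement, audioContext, remoteStream) {
  const { video, audio } = splitByKind(remoteStream.getTracks());
  videoElement.srcObject = new MediaStream(video);
  audioContext
    .createMediaStreamSource(new MediaStream(audio))
    .connect(audioContext.destination);
}
```

Since the media element never sees the audio track, this sidesteps the audio-element rendering path where the problem surfaces.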
(In reply to Jay Charles from comment #27) > Can confirm this is still an issue on safari / ios 13.4 > > As previous comments suggest, the problem only surfaces if the remote > MediaStreamTrack is attached to a video or audio element. If I attach only > the video track to the video element and send the audio track to an > AudioContext, all works as expected. Hi Jay, This might not be the same bug as this one should be fixed in iOS 13.4. Since you can get data from AudioContext, the issue might be either in the audio renderer or in autoplay policies.