Once upon a time, Chrome didn't play audio on an audio element when the element's srcObject contained a video track that hadn't sent any data:
The reason was (and is) that *video* elements must fire loadedmetadata and resize before playback, and the video dimensions those events need are not known until a (key)frame arrives. Audio elements don't fire these events, so they can just play.
This is still an issue in Safari as demonstrated by this fiddle:
This reproduces a situation where a MediaStream with audio and video tracks is created on the receiving end, but the sender does not send any RTP packets.
This can very easily happen with Firefox as the sender (see https://github.com/w3c/mediacapture-main/issues/642), and also in other browsers when an implementation makes an effort to ensure the camera light is turned off.
https://jsfiddle.net/kcutdvhe/ - the stream-splitting approach that used to work around this in Chrome also seems to work here, but it would be great to avoid it.
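For context, a minimal sketch of that stream-splitting workaround: attach only the audio tracks to the audio element, so playback is not gated on video metadata that may never arrive. The helper name and the commented ontrack usage are illustrative, not from the fiddle:

```javascript
// Build a new MediaStream containing only the audio tracks of the
// incoming stream, so an <audio> element can play immediately even if
// the video track never delivers a frame.
function audioOnlyStream(stream) {
  // The MediaStream constructor accepts an array of tracks.
  return new MediaStream(stream.getAudioTracks());
}

// Hypothetical usage in an RTCPeerConnection ontrack handler:
// pc.ontrack = (e) => {
//   const el = document.getElementById('remoteAudio');
//   el.srcObject = audioOnlyStream(e.streams[0]);
//   el.play();
// };
```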
Note that https://jsfiddle.net/23s0odra/, which attempts to reproduce this without an RTCPeerConnection in between, still works. Perhaps that is because the video track's readyState is "ended"?
Created attachment 393651 [details]
Committed r258503: <https://trac.webkit.org/changeset/258503>
All reviewed patches have been landed. Closing bug.