Bug 231656 - createMediaElementSource() not working with HLS/m3u8 stream
Summary: createMediaElementSource() not working with HLS/m3u8 stream
Status: NEW
Alias: None
Product: WebKit
Classification: Unclassified
Component: Web Audio
Version: Safari 15
Hardware: All
OS: All
Priority: P1
Severity: Blocker
Assignee: Nobody
URL:
Keywords: InRadar
Depends on:
Blocks:
 
Reported: 2021-10-12 20:51 PDT by Guy
Modified: 2022-08-25 15:12 PDT (History)
6 users

See Also:


Description Guy 2021-10-12 20:51:41 PDT
In our use case we take an HLS/m3u8 stream, route it through createMediaElementSource(), and process the resulting audio.


In Safari, createMediaElementSource() works correctly when the HTMLMediaElement's source is an MP3 audio file or an MP4 video file. When the source is an HLS/m3u8 stream, however, the element's audio is not routed to the AudioContext at all: the graph receives only silence.

This is a major blocker for us and for our customers; we would appreciate it if this issue could be addressed as soon as possible. Thank you.

Note: in other browsers, such as Chrome and Firefox, this works fine.

For reference, here is another related issue: https://bugs.webkit.org/show_bug.cgi?id=180696
Comment 1 Guy 2021-10-15 18:12:15 PDT
The issue described here is a major blocker for us and for our customers; we need your help and would appreciate it if it could be addressed as soon as possible.


We're using new webkitAudioContext() in Safari 15.0 on MacBook, and in iOS Safari on iPhone and iPad, to create the AudioContext instance. We then create a ScriptProcessorNode and attach it to the HLS/m3u8 source created with audioContext.createMediaElementSource(). The onaudioprocess callback gets called with audio buffers, but no real data is delivered: the buffers contain only zeros.

If we also connect an AnalyserNode to the same source created with audioContext.createMediaElementSource(), analyser.getByteTimeDomainData(dataArray) likewise reports no signal in the array, matching the all-zero buffers seen by onaudioprocess on the ScriptProcessorNode attached to the same source.
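To make the silence check concrete, here is a minimal sketch of how it can be detected with an AnalyserNode. The isSilentByteBuffer helper and the checkForSilence wrapper are our own illustration (not a Safari or Web Audio API), and the wrapper assumes an existing AudioContext and a source node already created via createMediaElementSource():

// getByteTimeDomainData() encodes the waveform as unsigned bytes
// centered at 128, so a buffer that never leaves 128 is pure silence.
function isSilentByteBuffer(dataArray) {
  return dataArray.every((v) => v === 128);
}

// Browser-only wiring; runs only in a page with Web Audio support.
function checkForSilence(audioContext, audioSource) {
  const analyser = audioContext.createAnalyser();
  audioSource.connect(analyser);
  const dataArray = new Uint8Array(analyser.fftSize);
  analyser.getByteTimeDomainData(dataArray);
  // Returns true in the failing Safari + HLS scenario described above.
  return isSilentByteBuffer(dataArray);
}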

What has been tried:
* We confirmed that the stream being used is the only stream in the tab and that createMediaElementSource() was called only once to obtain the source node.
* We confirmed that with an MP4/MP3 source everything works and data is received in onaudioprocess, but after changing the source to HLS/m3u8 it stops working.
* We also tried using MediaRecorder with the HLS/m3u8 stream as the source, but received no events or data.
* We also tried creating two AudioContexts, with the first context hosting the createMediaElementSource() node and its output routed into the second context's ScriptProcessorNode, but Safari does not allow more than one output to be wired up this way.
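For reference, the MediaRecorder attempt above looked roughly like the following sketch. The captureStream()/mozCaptureStream() fallback, the chunk interval, and the totalRecordedBytes helper are our assumptions for illustration, not part of the original report:

// Pure helper: total payload captured across recorded chunks. In the
// failing Safari + HLS case we received no dataavailable events at
// all, so this total stayed at 0.
function totalRecordedBytes(chunks) {
  return chunks.reduce((sum, blob) => sum + blob.size, 0);
}

// Browser-only sketch of the MediaRecorder attempt (assumes the video
// element is already playing the HLS stream; captureStream() support
// varies by browser):
function recordElementAudio(videoElement, onDone, durationMs = 5000) {
  const stream = videoElement.captureStream
    ? videoElement.captureStream()
    : videoElement.mozCaptureStream();
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => onDone(totalRecordedBytes(chunks), chunks);
  recorder.start(250); // request a chunk every 250 ms
  setTimeout(() => recorder.stop(), durationMs);
}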


Currently none of the scenarios we tried works, and this is a major blocker for us and for our customers.


Code sample used to create the ScriptProcessorNode:

const AudioContext = window.AudioContext || window.webkitAudioContext;

const audioContext = new AudioContext();

// Create a MediaElementAudioSourceNode from the HTML video element
// 'VideoElement' (whose src is the HLS/m3u8 stream).
const audioSource = audioContext.createMediaElementSource(VideoElement);

const processor = audioContext.createScriptProcessor(2048, 1, 1);
audioSource.connect(processor);
processor.connect(audioContext.destination);
processor.onaudioprocess = (e) => {
    // Fires with audio data for MP3/MP4 sources; with an HLS/m3u8
    // source the input buffer contains only zeros.
    console.log('print audio buffer', e);
  };


The exact same behavior is also observed in iOS Safari on iPhone and iPad.

Safari is known as the expert for HLS/m3u8, and we are asking for your help on this matter as soon as possible.

Thanks,
Comment 2 Radar WebKit Bug Importer 2021-10-19 20:52:17 PDT
<rdar://problem/84446324>
Comment 3 Guy 2021-11-04 16:22:19 PDT
Hi,

I wanted to reach out again as I haven't received any response yet. This issue blocks our ability to serve our product on iOS devices (since on iOS the Web Audio API is effectively only available through Safari/WebKit) and in Safari on desktop in general, and one of our customers is currently heavily impacted by this limitation.

Safari, built on WebKit, currently cannot provide access to raw audio data via AudioContext for HLS playback, even though this works for MP4 files. Every other major browser supports this, which is concerning: we would have to steer users away from our application in desktop Safari, and we simply cannot serve iPhone and iPad users at all, which is a blocker given that more than half of our users are on iOS devices. The W3C specification already covers this usage, and all other major browsers have implemented support for HLS streams used with an AudioContext, so Safari is currently lagging behind.

We'd like to reiterate the importance and urgency of this issue for us. It has been raised multiple times by other developers as well, so addressing it would help thousands of other web developers bring HLS-based applications to life on Safari and the iOS ecosystem.

Can we please get visibility into the plan and timeline for HLS support with AudioContext in Safari? Critical parts of our business and of our customers' products depend on this support.

Thank you!
Comment 4 justin 2022-08-25 15:12:12 PDT
Our product is also severely impacted by this bug. 

While other major browsers such as Chrome and Firefox have no issue using HLS-backed media elements within their Web Audio graphs, Safari still does not appear to support this.

We have confirmed that this bug persists in Safari 15.5 and Safari Technology Preview 16.0, even when all HLS assets are hosted from the same origin, so (based on my understanding) Safari is not adhering to the W3C spec.

We would greatly appreciate it if this bug could receive higher priority and be resolved soon. Thanks!