Purpose: Use the Web Audio API to get data for visualization during playback of an HTML5 <audio> element.
Observed: the 'audioprocess' event does not fire (demo 1), and getByteFrequencyData() returns an array of zeroes (demo 2).
Demo 1: AudioContext() + createMediaElementSource() + createScriptProcessor() → audioprocess event
Demo 2: AudioContext() + createMediaElementSource() + createAnalyser() / getByteFrequencyData() (minimal sketches of both demos follow)
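For reference, a minimal sketch of the two demos as I understand them; the element lookup, buffer size, and fftSize are my own placeholder choices, not taken from the original demos:

// Demo 1: ScriptProcessorNode — 'audioprocess' never fires with a media element source.
var audio = document.querySelector('audio'); // assumes an <audio> element with a src
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var source = ctx.createMediaElementSource(audio);

var processor = ctx.createScriptProcessor(2048, 1, 1);
processor.onaudioprocess = function (e) {
  // Expected: called with input data during playback. Observed: never called.
  console.log('audioprocess', e.inputBuffer.getChannelData(0)[0]);
};
source.connect(processor);
processor.connect(ctx.destination);

// Demo 2: AnalyserNode — getByteFrequencyData() fills the array with zeroes.
var analyser = ctx.createAnalyser();
analyser.fftSize = 256;
source.connect(analyser);
analyser.connect(ctx.destination);

var bins = new Uint8Array(analyser.frequencyBinCount);
function poll() {
  analyser.getByteFrequencyData(bins);
  var max = 0;
  for (var i = 0; i < bins.length; i++) max = Math.max(max, bins[i]);
  console.log(max === 0 ? 'all zeroes' : 'data, max=' + max); // Observed: always "all zeroes"
  requestAnimationFrame(poll);
}
audio.play(); // note: on iOS this must be triggered from a user gesture
poll();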
It seems that createMediaElementSource() is effectively not implemented (correct me if I'm wrong). The method exists and can be called without error, but the resulting node carries no audio: nothing routed through it is filtered, analysed, or audible. (The sound still comes from the media element itself even though the node is not working, which makes the failure easy to miss.)
I would love to hear the plan.
* By the way, much of the documentation (including Apple's developer guide) describes it as a usable function.
Hello, is anyone paying attention to this issue?
As a user/developer, the use case for createMediaElementSource() was clear: visualising audio data (in iOS Safari). Since it does not work, the application has to download the entire audio file first and run decodeAudioData(), or cache precomputed spectrum data per audio file. Neither is a particularly smart or straightforward approach.
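For the record, a sketch of that workaround as I understand it; the URL and fftSize are placeholders, and the callback form of decodeAudioData() is used because older Safari lacks the promise variant:

var ctx = new (window.AudioContext || window.webkitAudioContext)();

var xhr = new XMLHttpRequest();
xhr.open('GET', 'track.mp3', true); // placeholder URL; the whole file downloads up front
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
  ctx.decodeAudioData(xhr.response, function (buffer) {
    // Play the decoded buffer through an AudioBufferSourceNode instead of <audio>.
    var source = ctx.createBufferSource();
    source.buffer = buffer;

    var analyser = ctx.createAnalyser();
    analyser.fftSize = 256; // placeholder size

    source.connect(analyser);
    analyser.connect(ctx.destination);
    source.start(0);

    var bins = new Uint8Array(analyser.frequencyBinCount);
    (function draw() {
      analyser.getByteFrequencyData(bins); // non-zero with AudioBufferSourceNode
      requestAnimationFrame(draw);
    })();
  });
};
xhr.send();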
It would be awesome if this worked.
I hope someone up there gets the time to look into it.
I believe this is essentially the same as http://crbug.com/419446.
Dup of bug #135042.
*** This bug has been marked as a duplicate of bug 135042 ***
Please try this against a WebKit nightly build on Mac OS X Yosemite.
It seems Safari 12 is still failing with the simple demo:
"the analyser is not working with MediaElementAudioSourceNode, but it is working fine with AudioBufferSourceNode".