Created attachment 425392 [details]
iOS 14.4.2 errors after 26 calls to startRendering

Safari Desktop with latest WebKit, Version 14.0.3 (16610.4.3.1.7), on Big Sur
Safari latest version for iOS 14.4.2
Safari on iOS 14.1

Hello, I work on an application that allows users to slow down audio so they can play piano alongside it. The performance of the pitch shifting done by SoundTouchJS (https://github.com/cutterbl/SoundTouchJS) degrades significantly on iOS 14.4 (confirmed on 14.4.1 and 14.4.2). I believe iOS 14.4 introduced a performance degradation that affects the script processor node's onaudioprocess. As a result, we are forced to abandon realtime processing and instead render the slowed-down audio with pitch correction via OfflineAudioContext (similar to HTML Audio's playbackRate functionality).

We have a Single Page Application, and users may change the speed of the audio dozens of times in a given session. We encountered a problem where OfflineAudioContext creation starts to fail, or causes the browser to crash, after being called 24-60 times.

On desktop with Big Sur and the latest Safari version 14.0.3, the page will either be reloaded or crash after ~60 calls to startRendering. On iOS 14.4.2 and 14.1, attempts to instantiate a new OfflineAudioContext fail after ~26 calls with the error "The string did not match the expected pattern." If you get that error ~5 times and then call context.createBuffer, you will receive the error "Channel was not able to be created". If you refresh the page, the error occurs immediately, and there does not seem to be a way to recover other than to close the tab and open a new one.
I put together a codepen which demonstrates the issue: https://codepen.io/jasonmcaffee/pen/GRrvLxX

```
const AudioContext = window.AudioContext || window.webkitAudioContext;
const OfflineAudioContext = window.OfflineAudioContext || window.webkitOfflineAudioContext;
const audioUrl = "https://playground-assets-sean.s3.amazonaws.com/r1/00/00/00/3c/slice_59_resource_178_20210308_13h53m55s_mst.mp3";

//demonstrate that iOS starts crashing or erroring when performing offline renders more than N times.
async function main(){
  const context = new AudioContext();
  const audioBuffer = await fetchAndDecodeAudioData({url: audioUrl, context});
  for(let i = 0; i < 60; ++i){
    print(`rendering ${i}`);
    try{
      const result = await renderOffline(audioBuffer);
    }catch(e){
      print(`error rendering offline: `, e.message);
      createBuffers();
    }
  }
  print(`done rendering`);
}

//demonstrate that after receiving the error from the OfflineAudioContext creation, creation of audio buffers fails.
async function createBuffers(){
  const context = new AudioContext();
  print(`creating buffers`);
  for(let i = 0; i < 1; ++i){
    try{
      const buffer = context.createBuffer(2, 44100 * 10, 44100);
    }catch(e){
      print(`error creating buffer: `, e.message);
    }
  }
}

async function renderOffline(audioBuffer: AudioBuffer){
  let offlineAudioContext = new OfflineAudioContext(audioBuffer.numberOfChannels, audioBuffer.duration * audioBuffer.sampleRate, audioBuffer.sampleRate);
  let source = offlineAudioContext.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(offlineAudioContext.destination);
  return new Promise((resolve, reject)=>{
    offlineAudioContext.oncomplete = (e)=>{
      let renderedBuffer = e.renderedBuffer;
      resolve(renderedBuffer);
      //attempt to clean up in hopes it will resolve the issue (Narrator: it doesn't)
      source.buffer = null;
      cleanupSource(source);
      offlineAudioContext.oncomplete = undefined;
      offlineAudioContext = undefined;
      source = undefined;
      renderedBuffer = undefined;
    };
    source.start();
    offlineAudioContext.startRendering();//iOS doesn't support the promise version
  });
}

async function fetchAndDecodeAudioData({url, context}: {url: string; context: AudioContext}): Promise<AudioBuffer> {
  const response = await fetch(url, {cache: "force-cache"});
  const arrayBuffer = await response.arrayBuffer();
  return decodeAudioData(context, arrayBuffer);
}

async function decodeAudioData(context: AudioContext, arrayBuffer: ArrayBuffer): Promise<AudioBuffer>{
  return new Promise((resolve, reject) => context.decodeAudioData(arrayBuffer, resolve, reject));
}

//attempt to use a scratch buffer to force Safari to release references to buffers (doesn't work for this issue)
const context = new AudioContext();
let scratchBuffer = context.createBuffer(2, 1, 44100);
function getScratchBuffer(){
  return scratchBuffer;
}

function cleanupSource(source: AudioBufferSourceNode | undefined){
  if(!source){ return; }
  try{ source.stop(); }catch(e){}
  source.onended = null;
  source.disconnect(0);
  try{ source.buffer = getScratchBuffer(); }catch(e){}
}

const outputEl = document.querySelector('#output');
function print<T extends unknown>(...params: T[]){
  const message = params.map(p => JSON.stringify(p).replace(/['"]+/g, '')).join(` `);
  outputEl.innerHTML += `${message}<br/>`;
}

main();
```
Created attachment 425393 [details] iOS 14.4.2 immediately errors after page refresh
> I believe iOS 14.4 has a performance degradation that affects the script
> processor node's onaudioprocess.

That is correct, we introduced a performance regression in a very recent bugfix update that impacted ScriptProcessorNode. We have identified and fixed it but the fix has not shipped to customers yet.
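For anyone who wants to check whether their device is affected, a rough way to observe onaudioprocess cadence is to timestamp each callback and average the gaps between them. This is my own illustrative sketch, not a WebKit test; the browser-side wiring is shown only as comments, and just the pure averaging helper is meant to run anywhere:

```javascript
// averageGapMs: given a list of callback timestamps (in ms), returns the mean
// interval between consecutive callbacks. With a 4096-frame buffer at
// 44100 Hz, a healthy ScriptProcessorNode should average ~92.9 ms per
// callback; sustained drift above that suggests the callbacks can't keep up.
function averageGapMs(timestamps) {
  if (timestamps.length < 2) return 0;
  let total = 0;
  for (let i = 1; i < timestamps.length; ++i) {
    total += timestamps[i] - timestamps[i - 1];
  }
  return total / (timestamps.length - 1);
}

// In a browser one would feed it roughly like this (sketch, browser-only):
// const ctx = new AudioContext();
// const node = ctx.createScriptProcessor(4096, 2, 2);
// const stamps = [];
// node.onaudioprocess = () => stamps.push(performance.now());
// node.connect(ctx.destination);
// setTimeout(() => console.log(averageGapMs(stamps)), 5000);
```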
The "Channel was not able to be created" error seems to indicate we failed to allocate the buffer memory, which likely indicates we ran out of memory.
(In reply to Chris Dumez from comment #3)
> The "Channel was not able to be created" error seems to indicate we failed
> to allocate the buffer memory, which likely indicates we ran out of memory.

When I ran the provided codepen on macOS, I saw a banner indicating we were using too much memory and then a crash (likely due to reaching the memory limit).
I have further narrowed down the issue to just the instantiation of the OfflineAudioContext. The number of instantiations before you receive the error depends on the length param: the larger the length, the sooner you encounter the issue. With length 4000 * 44100, iOS 14.4.2 starts throwing errors on the 2nd instantiation. Desktop Safari (Version 14.0.3 (16610.4.3.1.7)) shows the same error as iOS ("The string did not match the expected pattern") after ~20 iterations, then crashes or reloads the page.

https://codepen.io/jasonmcaffee/pen/wvgrqLb

```
const AudioContext = window.AudioContext || window.webkitAudioContext;
const OfflineAudioContext = window.OfflineAudioContext || window.webkitOfflineAudioContext;

//demonstrate that iOS starts crashing or erroring when performing offline renders more than N times.
async function main(){
  const sampleRate = 44100;
  const length = 4000 * sampleRate;
  for(let i = 0; i < 60; ++i){
    print(`rendering ${i}`);
    try{
      new OfflineAudioContext(2, length, sampleRate);
    }catch(e){
      print(`error rendering offline: `, e.message);
    }
  }
  print(`done`);
}

const outputEl = document.querySelector('#output');
function print<T extends unknown>(...params: T[]){
  const message = params.map(p => JSON.stringify(p).replace(/['"]+/g, '')).join(` `);
  outputEl.innerHTML += `${message}<br/>`;
}

main();
```
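To put that length parameter in perspective, here is a back-of-the-envelope estimate (my own arithmetic, not taken from the WebKit source) of the raw sample memory a context of this size has to hold, assuming the engine stores 32-bit float PCM per channel:

```javascript
// Estimate the Float32 sample memory behind an OfflineAudioContext's render
// buffer: channels * frames * 4 bytes per sample.
function estimateRenderBufferBytes(channels, length) {
  const bytesPerSample = 4; // Web Audio processes samples as 32-bit floats
  return channels * length * bytesPerSample;
}

const sampleRate = 44100;
const length = 4000 * sampleRate; // same length as the repro above
const gib = estimateRenderBufferBytes(2, length) / (1024 ** 3);
console.log(`~${gib.toFixed(2)} GiB per context`); // ~1.31 GiB
```

At roughly 1.3 GiB of sample data per 4000-second stereo context, a handful of leaked contexts is more than enough to exhaust a tab's memory budget, which lines up with the out-of-memory banner seen on macOS.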
(In reply to Jason from comment #5)
> I have further narrowed down the issue to just the instantiation of the
> OfflineAudioContext. Number of instantiations until you receive the error
> is based on the length param. [...]
>
> https://codepen.io/jasonmcaffee/pen/wvgrqLb

Thanks, this is very helpful.
Indeed, the OfflineAudioContext objects are clearly leaking. I am investigating.
Awesome! No problem. Glad I was able to help narrow it down. If you can think of any workarounds, please let me know.

For the script processor node fix, do you know if that is part of the iOS 14.5 preview? If so, I can update to the preview, and realtime processing could be the fix. We can just tell our customers on 14.4.x to update once it is released.

Thanks!
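In case it helps anyone else hitting this in the meantime, one mitigation I can think of (an untested assumption on my part, not a confirmed workaround): serialize offline renders and yield a macrotask between them, so that finished contexts at least become collectible, and prefer several shorter renders over one very long one. `jobs` below is a list of hypothetical async callbacks that would each wrap the actual OfflineAudioContext render:

```javascript
// Run async render jobs strictly one at a time, yielding back to the event
// loop between jobs so garbage collection has an opportunity to run.
async function runSerially(jobs, yieldMs = 1) {
  const results = [];
  for (const job of jobs) {
    results.push(await job());
    // setTimeout forces a macrotask turn between renders.
    await new Promise(resolve => setTimeout(resolve, yieldMs));
  }
  return results;
}
```

In the app this would replace a tight loop of back-to-back renderOffline calls.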
(In reply to Jason from comment #8)
> For the script processor node fix, do you know if that is part of iOS 14.5
> preview? If so, I can update to the preview, and realtime processing could
> be the fix.

I am not 100% sure but I think the script processor node fix is indeed in iOS 14.5 beta 7.
(In reply to Chris Dumez from comment #6)
> (In reply to Jason from comment #5)
> > I have further narrowed down the issue to just the instantiation of the
> > OfflineAudioContext. [...]
>
> Thanks, this is very helpful.

Note that even if I fix the leak, this test case will still fail. The reason is that you are never yielding, so JS garbage collection does not get a chance to run. A better test case would use setTimeout() to make sure GC has a chance to destroy the audio contexts.
Ok, that makes sense. I've updated the codepen to use a setTimeout of 1ms between calls: https://codepen.io/jasonmcaffee/pen/wvgrqLb

```
const AudioContext = window.AudioContext || window.webkitAudioContext;
const OfflineAudioContext = window.OfflineAudioContext || window.webkitOfflineAudioContext;

async function main(){
  const sampleRate = 44100;
  const length = 4000 * sampleRate;
  for(let i = 0; i < 60; ++i){
    print(`rendering ${i}`);
    try{
      new OfflineAudioContext(2, length, sampleRate);
      await sleep(1);
    }catch(e){
      print(`error rendering offline: `, e.message);
    }
  }
  print(`done`);
}

async function sleep(ms: number){
  return new Promise((resolve, reject)=>{
    setTimeout(resolve, ms);
  });
}
```
iOS 14.5 beta 7 does indeed fix the performance issue with script processor node. Thanks!
Never mind Nick.
Created attachment 425461 [details] WIP Patch
Created attachment 425467 [details] Patch
Comment on attachment 425467 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=425467&action=review

Surprised there is no less-intrusive solution, but this seems to be implemented well.

> Source/WTF/wtf/WeakPtr.h:126
> +    explicit WeakPtr(Ref<WeakPtrImpl<Counter>>&& ref, EnableWeakPtrThreadingAssertions enableWeakPtrThreadingAssertions)

I would use a shorter argument name, like shouldEnableAssertions.

> Source/WTF/wtf/WeakPtr.h:171
> +    template<typename U> WeakPtr<U, Counter> createWeakPtr(U& object, EnableWeakPtrThreadingAssertions enableWeakPtrThreadingAssertions = EnableWeakPtrThreadingAssertions::Yes) const

Ditto.

> Source/WTF/wtf/WeakPtr.h:249
> +#if ASSERT_ENABLED
> +        m_areThreadingAssertionsEnabled = o.m_areThreadingAssertionsEnabled;
> +#endif

Seems like this is not needed. The WeakPtr we are assigning can keep its m_areThreadingAssertionsEnabled; it doesn't need to inherit it from the thing it's getting the pointer value from.

> Source/WTF/wtf/WeakPtr.h:258
> +#if ASSERT_ENABLED
> +        m_areThreadingAssertionsEnabled = o.m_areThreadingAssertionsEnabled;
> +#endif

Ditto.

> Source/WebCore/Modules/webaudio/BaseAudioContext.h:110
> +    WEBCORE_EXPORT static unsigned numberOfInstances();

Seems like we should have a comment somewhere, not necessarily here, to make it clear that this is specifically for lifetime testing and not otherwise part of the interface of this class.

> Source/WebCore/Modules/webaudio/DefaultAudioDestinationNode.h:36
> +    DefaultAudioDestinationNode(BaseAudioContext&, Optional<float> = WTF::nullopt);

I think we should use explicit here.

> Source/WebCore/rendering/FloatingObjects.cpp:37
> +    WeakPtr<RenderBox> renderer;

Do we need a check somewhere that WeakPtr is the same size as a pointer when assertions are disabled?
Comment on attachment 425467 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=425467&action=review

> Source/WebCore/ChangeLog:73
> +        BaseAudioContext now ows the AudioDestinationNode when we take care of destroying its

Nit. s/ows/owns/g.

> Source/WTF/wtf/WeakPtr.h:132
> +        UNUSED_PARAM(enableWeakPtrThreadingAssertions);

Need '#if !ASSERT_ENABLED'?
(In reply to Peng Liu from comment #17)
> > Source/WebCore/ChangeLog:73
> > +        BaseAudioContext now ows the AudioDestinationNode when we take care of destroying its
> 
> Nit. s/ows/owns/g.

OK.

> > Source/WTF/wtf/WeakPtr.h:132
> > +        UNUSED_PARAM(enableWeakPtrThreadingAssertions);
> 
> Need '#if !ASSERT_ENABLED'?

I don't think it is worth the extra #ifdefs.
Created attachment 425510 [details] Patch
Comment on attachment 425510 [details]
Patch

Clearing flags on attachment: 425510

Committed r275668 (236304@main): <https://commits.webkit.org/236304@main>
All reviewed patches have been landed. Closing bug.
<rdar://problem/76411680>