When using decodeAudioData repeatedly with large files (or files that decode to a large uncompressed buffer), it takes a very long time for the memory used by decodeAudioData to be freed. I use decodeAudioData to decode an audio file and then just compute an overview (peaks) from it, which is only a few kilobytes. I do not hold any references to the buffer, channel data, or file after doing this. Yet several hundred MB of memory can be held by the "Page" process for around one minute after decodeAudioData before finally being freed. Can't this large amount of data be collected earlier?
Note: I'm asking because I'm using a WKWebView in an iOS app, and it will randomly get killed (it turns blank) by the OS because it apparently uses too much memory.
<rdar://problem/44271160>
I wonder if WebAudio-related classes are not reporting their memory costs to GC correctly?
(In reply to Simon Fraser (smfr) from comment #3)
> I wonder if WebAudio-related classes are not reporting their memory costs to
> GC correctly?

AudioBuffer (which contains all the decoded audio data) implements memoryCost(), which should be what the GC uses to determine object size. I'm guessing that there's an AudioBufferSourceNode which is retaining the decoded AudioBuffer and that the ABSN is being kept alive long after its playback has completed.
(In reply to Jer Noble from comment #4)
> I'm guessing that there's an AudioBufferSourceNode which is retaining the
> decoded AudioBuffer and that the ABSN is being kept alive long after its
> playback has completed.

As I said, I do not play back the buffer. I just use it for generating an overview. Here's the complete code (it's Tea, our in-house programming language that transcompiles to JS, but I hope it's readable enough!)

HTTP.get("sevenapp:///#{encodeURIComponent(@playingPath)}", (audioData) =>
    if not audioData?
        @clearPeaks()
        <-
    Audio.context.decodeAudioData(audioData, (buffer) =>
        if not buffer?
            @clearPeaks()
            <-
        samples = buffer.getChannelData(0)
        # 30 samples per second
        step = buffer.sampleRate / 30
        @peaks[@playingPath] = peaks = []
        for frame in [0...samples.length/step]
            sum = 0
            for index in [frame*step...(frame+1)*step] by 256 # accuracy, higher = worse, 1 = best
                if index >= samples.length
                    break
                sum += Math.abs(samples[Math.floor(index)])
            sum /= step/256
            peaks.push(sum)
        @renderPeaks()
    )
)
(In reply to ae from comment #5)
> (In reply to Jer Noble from comment #4)
> > I'm guessing that there's an AudioBufferSourceNode which is retaining the
> > decoded AudioBuffer and that the ABSN is being kept alive long after its
> > playback has completed.
>
> As I said, I do not play back the buffer. I just use it for generating an
> overview. Here's the complete code (it's Tea, our in-house programming
> language that transcompiles to JS, but I hope it's readable enough!)
>
> HTTP.get("sevenapp:///#{encodeURIComponent(@playingPath)}", (audioData) =>
>     if not audioData?
>         @clearPeaks()
>         <-
>     Audio.context.decodeAudioData(audioData, (buffer) =>
>         if not buffer?
>             @clearPeaks()
>             <-
>         samples = buffer.getChannelData(0)

Unless your in-house transpiler adds scoping statements to these variables, you're going to be setting window.samples to a large Float32Array. If your in-house transpiler /is/ adding scoping statements, it would be much more useful to post the transpiled source.
(In reply to Jer Noble from comment #6)
> (In reply to ae from comment #5)
> > As I said, I do not play back the buffer. I just use it for generating an
> > overview. Here's the complete code (it's Tea, our in-house programming
> > language that transcompiles to JS, but I hope it's readable enough!)
> >
> > HTTP.get("sevenapp:///#{encodeURIComponent(@playingPath)}", (audioData) =>
> >     if not audioData?
> >         @clearPeaks()
> >         <-
> >     Audio.context.decodeAudioData(audioData, (buffer) =>
> >         if not buffer?
> >             @clearPeaks()
> >             <-
> >         samples = buffer.getChannelData(0)
>
> Unless your in-house transpiler adds scoping statements to these variables,
> you're going to be setting window.samples to a large Float32Array. If your
> in-house transpiler /is/ adding scoping statements, it would be much more
> useful to post the transpiled source.

Oh sorry, of course it makes 'samples' local (var samples), I'm just so used to this that I forgot.
Here's the transpiled code (not too readable):

getPeaks: function() {
  element('player-peaks').classList.remove('show');
  if (this.peaks[this.playingPath] != null) {
    timer(0.5, (function(_this) {
      return function() {
        _this.renderPeaks();
      };
    })(this));
    return;
  }
  if (Audio.context == null) {
    this.needPeaksOnResume = true;
    return;
  }
  this.needPeaksOnResume = false;
  HTTP.get("sevenapp:///" + (encodeURIComponent(this.playingPath)), (function(_this) {
    return function(audioData) {
      if (audioData == null) {
        _this.clearPeaks();
        return;
      }
      Audio.context.decodeAudioData(audioData, function(buffer) {
        var frame, i, index, j, peaks, ref, ref1, ref2, samples, step, sum;
        if (buffer == null) {
          _this.clearPeaks();
          return;
        }
        samples = buffer.getChannelData(0);
        step = buffer.sampleRate / 30;
        _this.peaks[_this.playingPath] = peaks = [];
        for (frame = i = 0, ref = samples.length / step; 0 <= ref ? i < ref : i > ref; frame = 0 <= ref ? ++i : --i) {
          sum = 0;
          for (index = j = ref1 = frame * step, ref2 = (frame + 1) * step; j < ref2; index = j += 256) {
            if (index >= samples.length) {
              break;
            }
            sum += Math.abs(samples[Math.floor(index)]);
          }
          sum /= step / 256;
          peaks.push(sum);
        }
        _this.renderPeaks();
      });
    };
  })(this));
},
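[Editor's note: for reference, the peak-extraction loop in the transpiled code is equivalent to this standalone sketch. The function name and the test signal are illustrative, and the decodeAudioData plumbing is omitted; the function takes only the channel data, so it holds no reference to the AudioBuffer itself:]

```javascript
// Downsamples one channel of decoded audio into ~30 peak values per second,
// mirroring the loop above: stride through each window, averaging the
// magnitudes of every 256th sample.
function computePeaks(samples, sampleRate, peaksPerSecond = 30, stride = 256) {
  const step = sampleRate / peaksPerSecond; // samples per peak
  const peaks = [];
  for (let frame = 0; frame < samples.length / step; frame++) {
    let sum = 0;
    // stride > 1 trades accuracy for speed, as in the original comment
    for (let index = frame * step; index < (frame + 1) * step; index += stride) {
      if (index >= samples.length) break;
      sum += Math.abs(samples[Math.floor(index)]);
    }
    sum /= step / stride; // average of the sampled magnitudes
    peaks.push(sum);
  }
  return peaks;
}

// One second of a constant full-scale signal at 44100 Hz yields 30 peaks.
const peaks = computePeaks(new Float32Array(44100).fill(1), 44100);
console.log(peaks.length); // 30
```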