Created attachment 171716 [details]
Regression HTML page with debug FFT and time output to show the distortion.

There appears to be a problem with the implementation of linearRampToValueAtTime in AudioParamTimeline.cpp: when used on a GainNode, a buzzing distortion begins once audioContext.currentTime reaches certain powers of 2. This suggests the issue lies in representing time as a single-precision floating point number inside the for loop that computes the sample-accurate values.

I have confirmed the bug with the nightly build on the following platforms:

Windows XP - Intel Core 2 Quad Q6600
Mac OS X Snow Leopard - Intel Core 2 Duo

The problem is significantly worse on Windows. On both platforms, distortion can be seen in the FFT view after 128.0s, but on Windows an extremely audible buzzing begins after 256.0s. I presume this is because of differences in the floating point code generated by the Visual Studio and gcc compilers.

This makes it next to impossible to use ADSR-style envelopes on parameters, since they cannot be relied upon without introducing distortion. The attached regression HTML page demonstrates the problem.
Matt, thanks for taking the time to make such a nice reduced test. I've been able to repro, and am taking a look...
Created attachment 172160 [details] Patch
Comment on attachment 172160 [details]
Patch

Interesting. Looks fine. r=me
Committed r133365: <http://trac.webkit.org/changeset/133365>
Thanks for taking the time to deal with this so quickly, Chris. I have tested with nightly build r133576 on OS X, and the problem appears to be fixed. I will test on Windows once a new nightly build is available (the Windows buildbot seems to have been broken since 16th October).