Bug 303790
| Summary: | AudioBufferSourceNode's start()'s duration broken with loop=true | | |
|---|---|---|---|
| Product: | WebKit | Reporter: | Kaiido <tristan.fraipont> |
| Component: | Web Audio | Assignee: | Nobody <webkit-unassigned> |
| Status: | NEW | | |
| Severity: | Normal | CC: | cdumez, jer.noble, webkit-bug-importer |
| Priority: | P2 | Keywords: | InRadar |
| Version: | WebKit Nightly Build | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
Kaiido
1. Open https://jsfiddle.net/ftwyoxh2/
2. Click the "start test" button.
3. Count how many beeps you hear (3 are expected).
The test case consists of a looping `AudioBufferSourceNode` pointing to a 6 s `AudioBuffer` of an audio count-down in which a new beep is heard every second. We set the source's `playbackRate` to `2` and start it by calling `start(0, 0, 3)`.
Since https://github.com/WebAudio/web-audio-api/pull/1681, the spec requires the `duration` parameter to be independent of `playbackRate`: it represents time in the source data, not `AudioContext` time, whether or not the source node loops. This works as specified for non-looping sources, but when `loop` is set to `true`, the duration is instead interpreted relative to the `AudioContext` clock. With this test case, that means the source keeps playing for 3 seconds of context time, which at `playbackRate` `2` covers 6 seconds of buffer data, so 6 beeps are heard instead of the expected 3.
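For reference, here is a minimal sketch of the described setup, assuming a buffer synthesized with one short beep at the start of every second (the actual jsfiddle may build its buffer differently, e.g. from a decoded audio file). Per the spec, `start(0, 0, 3)` here should render 3 seconds of buffer content (3 beeps) in 1.5 seconds of context time.

```js
// Sketch of the test case: a 6 s AudioBuffer with a 0.1 s, 880 Hz beep
// at the start of every second (hypothetical synthesis; the fiddle's
// buffer may be produced differently).
const ctx = new AudioContext(); // create/resume from a user gesture, e.g. the "start test" button

const sampleRate = ctx.sampleRate;
const buffer = ctx.createBuffer(1, 6 * sampleRate, sampleRate);
const data = buffer.getChannelData(0);
for (let second = 0; second < 6; second++) {
  for (let i = 0; i < 0.1 * sampleRate; i++) {
    data[second * sampleRate + i] =
      0.5 * Math.sin((2 * Math.PI * 880 * i) / sampleRate);
  }
}

const source = new AudioBufferSourceNode(ctx, {
  buffer,
  loop: true,
  playbackRate: 2,
});
source.connect(ctx.destination);

// Per spec, duration = 3 means 3 s of *buffer* data (1.5 s of context time,
// 3 beeps), independent of playbackRate and loop. With the bug, it behaves
// as 3 s of *context* time when loop is true (6 s of buffer data, 6 beeps).
source.start(0, 0, 3);
```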
This works as expected in Firefox.
Chrome has the same issue: https://issues.chromium.org/issues/467103139
Radar WebKit Bug Importer
<rdar://problem/166168368>