RESOLVED FIXED 189788
[GStreamer][MSE] Add a default sample duration
https://bugs.webkit.org/show_bug.cgi?id=189788
Summary [GStreamer][MSE] Add a default sample duration
Alicia Boya García
Reported 2018-09-20 08:29:53 PDT
Some WebM files don't provide sample durations, so we need to provide a safe default in order for them to be playable.
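A rough sketch of the kind of fallback being asked for, assuming WTF::MediaTime and the usual GStreamer buffer macros; the helper name and the exact default value here are illustrative, not the landed patch:

#include <gst/gst.h>
#include <wtf/MediaTime.h>

// Sketch only: use the buffer's duration when the container provides one,
// otherwise fall back to roughly 1/60 of a second, expressed in nanoseconds
// so it shares the timescale of the other GStreamer timestamps.
static WTF::MediaTime sampleDurationOrDefault(GstBuffer* buffer)
{
    if (GST_BUFFER_DURATION_IS_VALID(buffer))
        return WTF::MediaTime(GST_BUFFER_DURATION(buffer), 1000000000);
    return WTF::MediaTime(16666667, 1000000000); // ~1/60 s default
}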
Attachments
Patch (2.10 KB, patch)
2018-09-20 08:33 PDT, Alicia Boya García
no flags
Alicia Boya García
Comment 1 2018-09-20 08:33:10 PDT
Alicia Boya García
Comment 2 2018-09-20 08:59:17 PDT
This is necessary to pass MSE YouTube tests 21. StartPlayWithoutData, 22. EventTimestamp... and any test case that uses WebM files without frame durations.
WebKit Commit Bot
Comment 3 2018-09-20 09:47:06 PDT
Comment on attachment 350203 [details]
Patch

Clearing flags on attachment: 350203

Committed r236264: <https://trac.webkit.org/changeset/236264>
WebKit Commit Bot
Comment 4 2018-09-20 09:47:08 PDT
All reviewed patches have been landed. Closing bug.
Xabier Rodríguez Calvar
Comment 5 2018-09-21 00:51:42 PDT
Comment on attachment 350203 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=350203&action=review

> Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp:59
> + m_duration = createMediaTime(16666667); // 1/60 seconds

You should have used MediaTime(1, 60) then.
Alicia Boya García
Comment 6 2018-09-21 03:45:11 PDT
Comment on attachment 350203 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=350203&action=review

>> Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp:59
>> + m_duration = createMediaTime(16666667); // 1/60 seconds
>
> You should have used MediaTime(1, 60) then.

I thought about that, but purposely decided not to. Using a different timescale here would require the timescales to be recomputed as soon as an addition is made (and plenty of additions will be made with this). Furthermore, the least common multiple of 1e9 and 60 is 3e9, which is greater than MediaTime::MaximumTimeScale (~2.147e9). In consequence, MaximumTimeScale would be chosen instead, lossily, and it is not a power of ten but 0x7fffffff. This is a lot of complexity, wasted computation and, more worryingly, risk: because 1e9 and MaximumTimeScale don't play nicely together, `pts + dur - dur` would not be equal to `pts`, and converting them to GStreamer time would return different values. In this case I'm just choosing an arbitrary duration (16000000 ns or 16031415 ns would have worked as well), so I don't mind if 1/60 is not represented exactly.
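For reference, a tiny standalone check of the arithmetic above (hypothetical, not part of the patch): the exact common timescale of 1e9 and 60 is their least common multiple, which overflows MediaTime::MaximumTimeScale, so MediaTime would have to round.

#include <cstdint>
#include <cstdio>
#include <numeric>

int main()
{
    constexpr int64_t nanosecondScale = 1000000000;  // timescale of the GStreamer-derived timestamps
    constexpr int64_t sixtyHzScale = 60;             // timescale of MediaTime(1, 60)
    constexpr int64_t maximumTimeScale = 0x7fffffff; // MediaTime::MaximumTimeScale (~2.147e9)

    // Exact common timescale of 1e9 and 60: lcm(1e9, 60) = 3e9.
    int64_t commonScale = std::lcm(nanosecondScale, sixtyHzScale);
    std::printf("lcm = %lld, fits in MaximumTimeScale: %s\n",
                (long long)commonScale, commonScale <= maximumTimeScale ? "yes" : "no");
    // Prints: lcm = 3000000000, fits in MaximumTimeScale: no
    // Mixed additions would therefore be rounded to the 0x7fffffff timescale,
    // which is why the patch keeps everything in nanoseconds instead.
    return 0;
}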