WebKit Bugzilla
Bug 181891: REGRESSION (macOS 10.13.2): imported/w3c/web-platform-tests/media-source/mediasource-* LayoutTests failing
Status: RESOLVED FIXED
https://bugs.webkit.org/show_bug.cgi?id=181891
Jer Noble
Reported 2018-01-19 16:12:57 PST

REGRESSION (macOS 10.13.2): imported/w3c/web-platform-tests/media-source/mediasource-* LayoutTests failing
Attachments
Patch (8.15 KB, patch), 2018-01-19 16:42 PST, Jer Noble, no flags
Jer Noble
Comment 1
2018-01-19 16:42:21 PST
<rdar://problem/35395437>
Radar WebKit Bug Importer
Comment 2
2018-01-19 16:42:41 PST
<rdar://problem/36676030>
Jer Noble
Comment 3
2018-01-19 16:42:53 PST
Created attachment 331815 [details]
Patch
WebKit Commit Bot
Comment 4
2018-01-21 18:38:56 PST
Comment on attachment 331815 [details]
Patch

Clearing flags on attachment: 331815

Committed r227278: <https://trac.webkit.org/changeset/227278>
WebKit Commit Bot
Comment 5
2018-01-21 18:38:57 PST
All reviewed patches have been landed. Closing bug.
Alicia Boya García
Comment 6
2018-05-14 13:12:49 PDT
Comment on attachment 331815 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=331815&action=review

> Source/WebCore/ChangeLog:9
> + In macOS 10.13.2, CoreMedia changed the definition of CMSampleBufferGetDuration() to return
> + the presentation duration rather than the decode duration. For media streams where those two
What was the motivation for this change in CoreMedia? Presentation duration is more complex to compute (AFAIK you can't compute it until you reach a sync frame, and only for the frames before it -- which is a bit problematic for fragmented media) and often ambiguous (in the absence of a cslg box, the presentation duration of the last frame is unknown). What are the implications for other platforms where decode durations are still used?
Jer Noble
Comment 7
2018-05-14 15:52:40 PDT
(In reply to Alicia Boya García from comment #6)

> Comment on attachment 331815 [details]
> Patch
>
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=331815&action=review
>
> > Source/WebCore/ChangeLog:9
> > + In macOS 10.13.2, CoreMedia changed the definition of CMSampleBufferGetDuration() to return
> > + the presentation duration rather than the decode duration. For media streams where those two
>
> What was the motivation for this change in CoreMedia? Presentation duration
> is more complex to compute (AFAIK you can't compute it until you reach a
> sync frame, and only for the frames before it -- which is a bit problematic
> for fragmented media) and often ambiguous (in the absence of a cslg box, the
> presentation duration of the last frame is unknown).
IIRC, CoreMedia's clients were using the return value of CMSampleBufferGetDuration() to mean the presentation duration, and not the decode duration. I definitely raised all the above issues with them at the time, to no avail.
> What are the implications for other platforms where decode durations are
> still used?
It's a problem for the MSE spec in general; practically speaking, decoded-and-displayed frames are left in place until a new frame replaces them, so "decode duration" isn't really a thing. The MSE spec has kind of papered over this issue by adding "fudge factors" to a variety of places (1 millisecond here, 250 milliseconds there). Meanwhile, I've asked the CoreMedia team for a new API, such as CMSampleBufferGetDecodeDuration(). They've not been very receptive, however. From their POV, there's no use case for knowing the decode duration. I'd encourage anyone who can come up with one to file a bug at <http://bugreport.apple.com/> with their use cases.
Jer Noble
Comment 8
2018-05-14 16:03:13 PDT
(In reply to Alicia Boya García from comment #6)

> What are the implications for other platforms where decode durations are
> still used?
Continuing: I carefully crafted the fix so that other platforms should not be affected. At runtime, we calculate the "greatest decode duration" from successive decode timestamps, so the algorithm should work whether or not the platform supports decode durations. In this respect, the new behavior should match Firefox's.
Alicia Boya García
Comment 9
2018-05-15 07:07:30 PDT
I can't help but wonder what CoreMedia does to get presentation durations in fragmented media: how do you figure out the presentation duration of the last frame in a fragment? Wouldn't you need the first (sync) frame of the next fragment for that?
Jer Noble
Comment 10
2018-05-15 10:06:51 PDT
(In reply to Alicia Boya García from comment #9)

> I can't help but wonder what CoreMedia does to get presentation durations in
> fragmented media: how do you figure out the presentation duration of the
> last frame in a fragment? Wouldn't you need the first (sync) frame of the
> next fragment for that?
Yes, you would. But since they don't (necessarily) have that information, they use the decode duration for the presentation duration, which is almost always going to be incorrect. I pointed this out at the time as well. :-/