RealtimeIncomingVideoSourceCocoa::OnFrame should use video frame timestamp
Created attachment 400317 [details] Patch
Created attachment 400320 [details] Patch
Comment on attachment 400320 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=400320&action=review

Nice simplification!

> Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm:331
>     while (!m_pendingVideoSampleQueue.isEmpty()) {
> -        if (m_pendingVideoSampleQueue.first()->decodeTime() > now)
> +        auto presentationTime = m_pendingVideoSampleQueue.first()->presentationTime();
> +        if (presentationTime.isValid() && presentationTime > now)
>             break;
>         m_pendingVideoSampleQueue.removeFirst();

Will this do the right thing with 'display immediately' samples?
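For reference, a minimal standalone sketch of the drain loop under review, using hypothetical stand-in types (a `std::deque` of optional timestamps rather than WebCore's `MediaSample` queue): an entry with no presentation time models a 'display immediately' sample, and because it never satisfies the `break` condition, it is dequeued on the next drain rather than held back.

```cpp
#include <cassert>
#include <deque>
#include <optional>

// Hypothetical stand-in for the pending sample queue: each entry's
// presentation time is either a valid double or nullopt, where nullopt
// models a 'display immediately' sample with an invalid timestamp.
using SampleQueue = std::deque<std::optional<double>>;

// Mirrors the patched loop: stop at the first sample whose presentation
// time is valid and still in the future. Samples with an invalid time
// never hit the `break`, so they are dequeued right away.
inline int drainSamplesUpTo(SampleQueue& queue, double now)
{
    int drained = 0;
    while (!queue.empty()) {
        auto presentationTime = queue.front();
        if (presentationTime.has_value() && *presentationTime > now)
            break;
        queue.pop_front();
        ++drained;
    }
    return drained;
}
```

With `now = 2.0` and a queue of `{1.0, nullopt, 5.0}`, the first two entries drain (the past sample and the display-immediately sample) and the future sample at 5.0 stays queued.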
(In reply to Eric Carlson from comment #3)
> Comment on attachment 400320 [details]
> Patch
>
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=400320&action=review
>
> Nice simplification!
>
> > Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm:331
> >     while (!m_pendingVideoSampleQueue.isEmpty()) {
> > -        if (m_pendingVideoSampleQueue.first()->decodeTime() > now)
> > +        auto presentationTime = m_pendingVideoSampleQueue.first()->presentationTime();
> > +        if (presentationTime.isValid() && presentationTime > now)
> >             break;
> >         m_pendingVideoSampleQueue.removeFirst();
>
> Will this do the right thing with 'display immediately' samples?

There might indeed be an issue in the general case. In practice, for WebRTC usage (the sole use of this class), we could probably simplify further and buffer just one sample instead of a deque, since WebRTC always wants the newest sample.
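A minimal sketch of that suggested simplification, with illustrative names (not WebKit API): a single slot that each enqueue overwrites, so the consumer always sees only the newest sample.

```cpp
#include <cassert>
#include <optional>
#include <string>

// Sketch of replacing the pending-sample deque with a single slot, for
// consumers (like WebRTC rendering) that only ever want the newest sample.
// The sample payload is a std::string here purely for illustration.
class LatestSampleSlot {
public:
    // Each enqueue overwrites any sample that was not yet consumed.
    void enqueue(std::string sample) { m_pending = std::move(sample); }

    // Returns the most recent sample, if any, and clears the slot.
    std::optional<std::string> take()
    {
        auto sample = std::move(m_pending);
        m_pending.reset();
        return sample;
    }

private:
    std::optional<std::string> m_pending;
};
```

Enqueuing "frame1" then "frame2" and taking once yields "frame2"; a second take yields nothing, so stale frames are never rendered.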
Committed r262238: <https://trac.webkit.org/changeset/262238> All reviewed patches have been landed. Closing bug and clearing flags on attachment 400320 [details].
<rdar://problem/63710172>