Allow passing WebRTC video frame buffers through RealtimeMediaSource-based pipelines
Created attachment 450881 [details] Patch
(In reply to youenn fablet from comment #1)
> Created attachment 450881 [details]
> Patch

Not yet ready for review. Need to sync it with https://bugs.webkit.org/show_bug.cgi?id=236099
<rdar://problem/88805580>
Created attachment 452864 [details] Patch
Created attachment 452982 [details] Patch
Created attachment 452998 [details] Patch
Created attachment 453374 [details] Patch
Created attachment 453720 [details] Patch
Comment on attachment 453720 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=453720&action=review

> Source/WebCore/ChangeLog:14
> + In case we send VideoFrameLibWebRTC through RealtimeOutgoingVideoSourceCocoa, directly send the video frame buffer.
> + When receiving libwebrtc video frame buffers in RealtimeIncomingVideoSourceCocoa, make use of VideoFrameLibWebRTC to delay conversion of the video frame buffer into a CVPixelBufferRef.
> + In most cases, the conversion is unneeded as the video frame buffer will be used for rendering and will be copied to shared memory through SharedVideoFrameWriter.
> + Fix bugs in handling of libwebrtc YUV conversion routines.
> + Minor refactoring to have a default asVideoFrameCV method implementation in VideoFrame.

Nit: these lines are really long
Committed r291051 (248225@main): <https://commits.webkit.org/248225@main>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 453720 [details].
r291076
r291078