Bug 236140 (Status: NEW)
MediaPlayerPrivateMediaStreamAVFObjC needs the GPUP context to create GPUP ImageBuffers
https://bugs.webkit.org/show_bug.cgi?id=236140
Kimmo Kinnunen
Reported 2022-02-04 07:15:02 PST
MediaPlayerPrivateMediaStreamAVFObjC needs the GPUP context to create GPUP ImageBuffers.

Captured video frames are stored in GPUP and accessed via RemoteVideoFrameProxy (which is a MediaSample for now). To paint this to a document or canvas (not compositing), we need to get an ImageBuffer and draw it. To get an ImageBuffer, we need to ask GPUP for a RemoteImageBufferProxy (WP) out of the video frame.

RemoteVideoFrameProxy is constructed when GPUP sends a GPUP video frame. This construction needs access to other GPUP context to reference various implementations. In particular, it needs RemoteRenderingContextProxy (WP).

The way this is done in other places is that WebChromeClient is the factory object that decides what kind of class to create: either the local one or the GPUP one. WebChromeClient also has access to all the page-related GPUP context. For example: RemoteGraphicsContextGL (GPUP) needs RemoteRenderingBackend (GPUP), so RemoteGraphicsContextGLProxy (WP) can refer to RemoteRenderingBackendProxy (WP).

The same applies to MediaPlayerPrivateMediaStreamAVFObjC: whatever takes GPUP video frame objects creates the video frames for MediaPlayerPrivateMediaStreamAVFObjC. It needs access to the WP-side GPUP context, essentially the objects held in WebChromeClient.

Thus all these media capture communication interfaces should probably be instantiated via WebChromeClient.
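The factory pattern described above can be sketched in a few lines. This is a minimal, self-contained illustration, not WebKit code: the class and member names (ChromeClientSketch, RenderingBackendProxy, VideoFrameSource, and so on) are all hypothetical stand-ins for the real WebChromeClient, RemoteRenderingBackendProxy, and the video frame classes. The point is only that the client object holds both the "use GPUP" decision and the page-scoped context, so it is the natural place to construct either implementation.

```cpp
#include <cassert>
#include <memory>
#include <string>

// Stand-in for the per-page WP-side GPUP context
// (e.g. RemoteRenderingBackendProxy). Hypothetical.
struct RenderingBackendProxy {
    int backendIdentifier { 7 };
};

// Common interface for the object the media player consumes.
struct VideoFrameSource {
    virtual ~VideoFrameSource() = default;
    virtual std::string description() const = 0;
};

// In-process implementation; needs no GPUP context.
struct LocalVideoFrameSource final : VideoFrameSource {
    std::string description() const override { return "local"; }
};

// GPUP-backed implementation; must reference the page's
// rendering-backend proxy to create ImageBuffers later.
struct RemoteVideoFrameSource final : VideoFrameSource {
    explicit RemoteVideoFrameSource(RenderingBackendProxy& backend)
        : m_backend(backend) { }
    std::string description() const override
    {
        return "remote(backend=" + std::to_string(m_backend.backendIdentifier) + ")";
    }
    RenderingBackendProxy& m_backend;
};

// Sketch of the WebChromeClient role: it knows the flag and owns
// the page-scoped GPUP context, so media code asks it to create
// the appropriate object instead of constructing one directly.
class ChromeClientSketch {
public:
    explicit ChromeClientSketch(bool useGPUProcess)
        : m_useGPUProcess(useGPUProcess) { }

    std::unique_ptr<VideoFrameSource> createVideoFrameSource()
    {
        if (m_useGPUProcess)
            return std::make_unique<RemoteVideoFrameSource>(m_renderingBackend);
        return std::make_unique<LocalVideoFrameSource>();
    }

private:
    bool m_useGPUProcess;
    RenderingBackendProxy m_renderingBackend;
};
```

The design choice this illustrates: the caller never names a concrete class, so switching a page between local and GPUP capture is a matter of what the client returns, not of conditionals scattered through the media code.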
Kimmo Kinnunen
Comment 1 2022-02-04 07:19:52 PST
If I understand correctly, RemoteCaptureSampleManager is instantiated to capture the samples for a particular page, via UserMediaCaptureManager.

So WebChromeClient should return a UserMediaCaptureManager that captures in GPUP when the flags are set; otherwise it should construct a local one. For the remote case, WebChromeClient should provide the dependency objects to RemoteCaptureSampleManager, pass the constructed object to UserMediaCaptureManager, and then return it. Media code should ask WebChromeClient to create the various objects.

Probably this will be true for all other media-related code too, for example here:

    void RemoteMediaPlayerManager::setUseGPUProcess(bool useGPUProcess)
    {
        auto registerEngine = [this](MediaEngineRegistrar registrar, MediaPlayerEnums::MediaEngineIdentifier remoteEngineIdentifier) {
            registrar(makeUnique<MediaPlayerRemoteFactory>(remoteEngineIdentifier, *this));
        };
        RemoteMediaPlayerSupport::setRegisterRemotePlayerCallback(useGPUProcess ? WTFMove(registerEngine) : RemoteMediaPlayerSupport::RegisterRemotePlayerCallback());

    #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
        if (useGPUProcess) {
            WebCore::SampleBufferDisplayLayer::setCreator([](auto& client) {
                return WebProcess::singleton().ensureGPUProcessConnection().sampleBufferDisplayLayerManager().createLayer(client);
            });
        }
    #endif
    }

E.g. WebChromeClient should instantiate SampleBufferDisplayLayer, so that the page-dependent GPUP object can be passed to the remote SampleBufferDisplayLayer.
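The problem with the setCreator pattern above is that the registered lambda reaches for a process-wide singleton (WebProcess::singleton()), so it cannot carry per-page GPUP context. A minimal sketch of the alternative being proposed, with entirely hypothetical names (PageGPUContext, DisplayLayerFactory, RemoteDisplayLayer are not WebKit classes), would parameterize the creator on a page-scoped context object instead:

```cpp
#include <cassert>
#include <functional>
#include <memory>

// Hypothetical per-page WP-side GPUP context, of the kind a
// WebChromeClient would own.
struct PageGPUContext {
    int pageIdentifier;
};

// Interface for the display layer the media code consumes.
struct DisplayLayer {
    virtual ~DisplayLayer() = default;
    virtual int owningPage() const = 0;
};

// GPUP-backed layer: constructed against a specific page's context,
// rather than a process-wide singleton connection.
struct RemoteDisplayLayer final : DisplayLayer {
    explicit RemoteDisplayLayer(PageGPUContext& context)
        : m_context(context) { }
    int owningPage() const override { return m_context.pageIdentifier; }
    PageGPUContext& m_context;
};

// A setCreator-style hook, but the creator receives the page
// context as an argument, so the client can thread page-dependent
// GPUP objects through to the remote layer.
class DisplayLayerFactory {
public:
    using Creator = std::function<std::unique_ptr<DisplayLayer>(PageGPUContext&)>;

    void setCreator(Creator creator) { m_creator = std::move(creator); }

    std::unique_ptr<DisplayLayer> createLayer(PageGPUContext& context)
    {
        return m_creator ? m_creator(context) : nullptr;
    }

private:
    Creator m_creator;
};
```

Usage would look like registering a creator bound to a page's client and then creating layers against that page's context; the singleton lookup disappears from the creation path.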
Radar WebKit Bug Importer
Comment 2 2022-02-11 07:15:19 PST