| Summary: | MediaPlayerPrivateMediaStreamAVFObjC needs the GPUP context to create GPUP ImageBuffers | | |
|---|---|---|---|
| Product: | WebKit | Reporter: | Kimmo Kinnunen <kkinnunen> |
| Component: | Media | Assignee: | Nobody <webkit-unassigned> |
| Status: | NEW | | |
| Severity: | Normal | CC: | webkit-bug-importer |
| Priority: | P2 | Keywords: | InRadar |
| Version: | WebKit Local Build | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
Description
Kimmo Kinnunen
2022-02-04 07:15:02 PST
If I understand correctly, RemoteCaptureSampleManager is instantiated to capture the samples for a particular page, via UserMediaCaptureManager.
So WebChromeClient should return a UserMediaCaptureManager that captures in the GPU process when the flags are set; otherwise it should construct a local one.
For the remote case, WebChromeClient should provide the dependency objects to RemoteCaptureSampleManager, pass the constructed object to UserMediaCaptureManager, and then return it.
Media code should ask WebChromeClient to create the various objects.
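The factory-style wiring described above could look roughly like the following sketch. All names here (`CaptureManager`, `InProcessCaptureManager`, `GPUPCaptureManager`, `ChromeClient`, `GPUProcessConnection`) are illustrative stand-ins for UserMediaCaptureManager and friends, not real WebKit API; the point is only that the chrome client, which knows the page, picks the implementation and supplies its page-dependent GPUP dependencies:

```cpp
#include <memory>

// Illustrative stand-in for the capture-manager interface.
struct CaptureManager {
    virtual ~CaptureManager() = default;
    virtual bool capturesInGPUProcess() const = 0;
};

struct InProcessCaptureManager : CaptureManager {
    bool capturesInGPUProcess() const override { return false; }
};

// Stand-in for the page-dependent GPUP object.
struct GPUProcessConnection { };

// The GPUP-backed manager receives its page-dependent dependencies
// at construction time instead of reaching for process-global state.
struct GPUPCaptureManager : CaptureManager {
    explicit GPUPCaptureManager(GPUProcessConnection& connection)
        : m_connection(connection) { }
    bool capturesInGPUProcess() const override { return true; }
    GPUProcessConnection& m_connection;
};

// The chrome client decides, per page, which manager the media code gets.
struct ChromeClient {
    bool useGPUProcess { false };
    GPUProcessConnection connection;

    std::unique_ptr<CaptureManager> createCaptureManager()
    {
        if (useGPUProcess)
            return std::make_unique<GPUPCaptureManager>(connection);
        return std::make_unique<InProcessCaptureManager>();
    }
};
```

Media code would then call `chromeClient.createCaptureManager()` instead of consulting global flags, so the GPUP-vs-local decision and the GPUP context both stay attached to the page.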
This will probably be true for all other media-related code too, for example here:
```cpp
void RemoteMediaPlayerManager::setUseGPUProcess(bool useGPUProcess)
{
    auto registerEngine = [this](MediaEngineRegistrar registrar, MediaPlayerEnums::MediaEngineIdentifier remoteEngineIdentifier) {
        registrar(makeUnique<MediaPlayerRemoteFactory>(remoteEngineIdentifier, *this));
    };

    RemoteMediaPlayerSupport::setRegisterRemotePlayerCallback(useGPUProcess ? WTFMove(registerEngine) : RemoteMediaPlayerSupport::RegisterRemotePlayerCallback());

#if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
    if (useGPUProcess) {
        WebCore::SampleBufferDisplayLayer::setCreator([](auto& client) {
            return WebProcess::singleton().ensureGPUProcessConnection().sampleBufferDisplayLayerManager().createLayer(client);
        });
    }
#endif
}
```
E.g. WebChromeClient should instantiate SampleBufferDisplayLayer, so that the page-dependent GPUP object can be passed to the remote SampleBufferDisplayLayer.
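A minimal sketch of what replacing the process-global `setCreator` callback with a chrome-client factory might look like. The names (`DisplayLayer`, `RemoteDisplayLayer`, `ChromeClient`, `GPUProcessConnection`) are hypothetical simplifications, not the real SampleBufferDisplayLayer types; the contrast with the quoted code is that the creator is no longer a static singleton-backed lambda but a per-page client that already owns the GPUP connection:

```cpp
#include <memory>

// Illustrative stand-in for the display-layer interface.
struct DisplayLayer {
    virtual ~DisplayLayer() = default;
    virtual bool isRemote() const = 0;
};

struct LocalDisplayLayer : DisplayLayer {
    bool isRemote() const override { return false; }
};

// Stand-in for the page-dependent GPUP object the remote layer needs.
struct GPUProcessConnection { };

struct RemoteDisplayLayer : DisplayLayer {
    explicit RemoteDisplayLayer(GPUProcessConnection&) { }
    bool isRemote() const override { return true; }
};

// Instead of SampleBufferDisplayLayer::setCreator(<global lambda>), the
// chrome client holds the page's GPUP connection and builds the layer
// from it; a null connection means in-process rendering.
struct ChromeClient {
    GPUProcessConnection* gpuConnection { nullptr };

    std::unique_ptr<DisplayLayer> createDisplayLayer()
    {
        if (gpuConnection)
            return std::make_unique<RemoteDisplayLayer>(*gpuConnection);
        return std::make_unique<LocalDisplayLayer>();
    }
};
```

This keeps the `WebProcess::singleton()` lookup out of the creation path: the media code asks its page's chrome client, and the client already knows which connection (if any) applies.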