WebGL video texture upload is very slow due to excessive transfer of the video pixel buffer. The MediaPlayer runs in the GPU process while WebGL runs in the Web process, so the image needs to be transferred between them, and that transfer is relatively slow. WebGL content typically uploads the video frame on every draw, even when the frame has not changed. We have an optimisation to skip the upload of an unchanged frame; however, the IPC was not skipped.
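A minimal sketch of the kind of skip-unchanged check described above, in plain C++. All names here (`VideoFrameIdentifier`, `VideoTextureUploader`, `maybeUpload`) are hypothetical illustrations, not WebKit's actual classes; the point is only that the same frame-identity comparison that gates the GL upload should also gate the IPC round trip.

```cpp
#include <cassert>
#include <cstdint>
#include <optional>

// Hypothetical sketch: remember the identity of the last frame we uploaded,
// and skip both the cross-process IPC and the texture upload when the
// current frame is the same one. Names are illustrative, not WebKit API.
struct VideoFrameIdentifier {
    uint64_t mediaPlayerID { 0 }; // which player produced the frame
    uint64_t frameSequence { 0 }; // increases whenever a new frame arrives
    bool operator==(const VideoFrameIdentifier& other) const
    {
        return mediaPlayerID == other.mediaPlayerID && frameSequence == other.frameSequence;
    }
};

class VideoTextureUploader {
public:
    // Returns true if an upload (and the IPC to fetch the pixel buffer)
    // actually happened, false if it was skipped as redundant.
    bool maybeUpload(const VideoFrameIdentifier& frame)
    {
        if (m_lastUploaded && *m_lastUploaded == frame)
            return false; // frame unchanged: skip the IPC and the GL upload
        // ... the real code would request the IOSurface over IPC here
        // and feed it into the GL texture ...
        m_lastUploaded = frame;
        ++m_uploadCount;
        return true;
    }

    unsigned uploadCount() const { return m_uploadCount; }

private:
    std::optional<VideoFrameIdentifier> m_lastUploaded;
    unsigned m_uploadCount { 0 };
};
```

With this shape, a page that draws the same paused video frame every rAF tick would trigger the expensive path only once per new frame, instead of once per draw.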
Created attachment 440598 [details] Patch
Comment on attachment 440598 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=440598&action=review

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h:372
> +#if PLATFORM(COCOA) AVFOUNDATION
Is all of the video frame's data being passed back to the web process, or only a handle to data being held in the GPU process? If the former, is there any way to define some sort of remote handle to the data?
(In reply to Kenneth Russell from comment #3)
> Is all of the video frame's data being passed back to the web process, or
> only a handle to data being held in the GPU process?
>
> If the former, is there any way to define some sort of remote handle to the
> data?

Both, in a way. The handle to the data is the IOSurface.

However, the IOSurface needs to "transfer" the data to the WebP, since WebGL works in the WebContentP. So the IOSurface is not a "real" remote handle, as it also maps the data:

CVPixelBuffer gets turned into IOSurface in GPUP media player remote
IOSurface gets turned into CVPixelBuffer in WebP media player proxy
CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
IOSurface gets turned into pbuffers by ANGLE EGL
pbuffers get turned into yuv textures by ANGLE EGL
yuv textures get turned into rgb texture by ANGLE GL
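The chain above can be modeled in a few lines of plain C++ to show why the IOSurface is not a pure remote handle: the handle travels across processes and the pixels are shared rather than copied, but the receiving process still maps the same storage. All types below (`IOSurfaceModel`, `TextureModel`) and both functions are toy illustrations, not WebKit, CoreVideo, or ANGLE API.

```cpp
#include <cassert>
#include <cstdint>
#include <memory>
#include <vector>

// Toy model of the GPUP -> WebP handoff described above. The IOSurface is
// modeled as shared pixel storage plus an ID; each stage wraps the same
// storage rather than copying it.
struct IOSurfaceModel {
    uint32_t surfaceID;
    std::shared_ptr<std::vector<uint8_t>> pixels; // shared mapping, not a copy
};

// GPUP side: CVPixelBuffer -> IOSurface (here: just produce the surface).
IOSurfaceModel gpuProcessProduceFrame(uint32_t id)
{
    return { id, std::make_shared<std::vector<uint8_t>>(4, 0xff) };
}

// WebP side: IOSurface -> CVPixelBuffer -> IOSurface -> pbuffer -> texture.
// Each step hands along the same storage; no pixel copy occurs, but the
// WebP does map the surface, which is why it is not a pure remote handle.
struct TextureModel {
    uint32_t sourceSurfaceID;
    std::shared_ptr<std::vector<uint8_t>> pixels;
};

TextureModel webProcessBindAsTexture(const IOSurfaceModel& surface)
{
    return { surface.surfaceID, surface.pixels };
}
```

The design point under discussion is whether the WebP stage could hold only `surfaceID` and never touch `pixels` at all, deferring the mapping entirely to the GPUP.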
<rdar://problem/84160422>
(In reply to Kimmo Kinnunen from comment #4)
> (In reply to Kenneth Russell from comment #3)
> > Is all of the video frame's data being passed back to the web process, or
> > only a handle to data being held in the GPU process?
> >
> > If the former, is there any way to define some sort of remote handle to the
> > data?
>
> Both, in a way. The handle to the data is the IOSurface.
>
> However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> works in WebContentP. So the IOSurface is not "real" remote handle as it
> also maps the data.
>
> CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> IOSurface gets turned into pbuffers by ANGLE EGL
> pbuffers get turned into yuv textures by ANGLE EGL
> yuv textures get turned into rgb texture by ANGLE GL

By the way, MediaPlayerPrivateRemote is in the WebContent process, while RemoteMediaPlayerProxy is in the GPU process. I guess the "media player proxy" in your comment means MediaPlayerPrivateRemote?
Comment on attachment 440598 [details]
Patch

Please do not commit this latest patch; there is a file in there that shouldn't be there.
(In reply to Kimmo Kinnunen from comment #4)
> Both, in a way. The handle to the data is the IOSurface.
>
> However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> works in WebContentP. So the IOSurface is not "real" remote handle as it
> also maps the data.
>
> CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> IOSurface gets turned into pbuffers by ANGLE EGL
> pbuffers get turned into yuv textures by ANGLE EGL
> yuv textures get turned into rgb texture by ANGLE GL

Thanks for the detailed flow!

Conceptually, in this specific usage, it would be fine if the WebP never received even the IOSurface handle. All of the "real" work of feeding the IOSurface into ANGLE is done in the GPUP. Some ANGLE and EGL related APIs might need to change in the WebP so that they refer to some sort of remote IOSurface handle.

Does that seem like a feasible direction for the future? Should we file an enhancement about making a change like that?

It's probably impossible to know whether the WebP might need the real IOSurface for other reasons. Does rendering of the HTMLVideoElement need the IOSurface in the WebP, or could this be fully delegated to the GPUP via the same kind of remote handle?
(In reply to Kenneth Russell from comment #8)
> (In reply to Kimmo Kinnunen from comment #4)
> > Both, in a way. The handle to the data is the IOSurface.
> >
> > However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> > works in WebContentP. So the IOSurface is not "real" remote handle as it
> > also maps the data.
> >
> > CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> > IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> > CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> > IOSurface gets turned into pbuffers by ANGLE EGL
> > pbuffers get turned into yuv textures by ANGLE EGL
> > yuv textures get turned into rgb texture by ANGLE GL
>
> Thanks for the detailed flow!
>
> Conceptually, in this specific usage, it would be fine if the WebP never
> received even the IOSurface handle. All of the "real" work of feeding the
> IOSurface into ANGLE is done in the GPUP.

No, WebGL runs in WebP for now :(

> Some ANGLE and EGL related APIs
> might need to change in the WebP so that they refer to some sort of remote
> IOSurface handle.
>
> Does that seem like a feasible direction for the future? Should we file an
> enhancement about making a change like that?

When WebGL runs in GPUP, everything stays GPUP side.
Created attachment 441039 [details] Patch
(In reply to Kimmo Kinnunen from comment #9)
> When WebGL runs in GPUP, everything stays GPUP side.

Ah, great!
Committed r284102 (242930@main): <https://commits.webkit.org/242930@main>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 441039 [details].