Bug 231425 - WebGL video texture upload is very slow due to excessive transfer of the video pixel buffer
Summary: WebGL video texture upload is very slow due to excessive transfer of the video pixel buffer
Status: RESOLVED FIXED
Alias: None
Product: WebKit
Classification: Unclassified
Component: WebGL
Version: WebKit Local Build
Hardware: Unspecified
OS: Unspecified
Importance: P2 Normal
Assignee: Kimmo Kinnunen
URL:
Keywords: InRadar
Depends on:
Blocks: 232296 231031
Reported: 2021-10-08 05:31 PDT by Kimmo Kinnunen
Modified: 2022-01-11 09:46 PST

See Also:


Attachments
Patch (15.32 MB, text/plain)
2021-10-08 05:48 PDT, Kimmo Kinnunen
youennf: review+
Patch (8.16 KB, patch)
2021-10-12 23:10 PDT, Kimmo Kinnunen
ews-feeder: commit-queue-

Description Kimmo Kinnunen 2021-10-08 05:31:05 PDT
WebGL video texture upload is very slow due to excessive transfer of the video pixel buffer

MediaPlayer is running in GPU process
WebGL is running in Web process
The image needs to be transferred from the GPU process to the Web process, and the transfer is relatively slow.
WebGL content typically uploads the video frame on every render, even when the frame has not changed.
We have an optimisation to skip the upload of an unchanged frame.
However, the IPC to fetch the frame was not skipped.
Comment 1 Kimmo Kinnunen 2021-10-08 05:48:57 PDT
Created attachment 440598 [details]
Patch
Comment 2 youenn fablet 2021-10-08 07:44:02 PDT
Comment on attachment 440598 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=440598&action=review

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h:372
> +#if PLATFORM(COCOA)

AVFOUNDATION
Comment 3 Kenneth Russell 2021-10-08 16:54:48 PDT
Is all of the video frame's data being passed back to the web process, or only a handle to data being held in the GPU process?

If the former, is there any way to define some sort of remote handle to the data?
Comment 4 Kimmo Kinnunen 2021-10-12 12:07:19 PDT
(In reply to Kenneth Russell from comment #3)
> Is all of the video frame's data being passed back to the web process, or
> only a handle to data being held in the GPU process?
> 
> If the former, is there any way to define some sort of remote handle to the
> data?

Both, in a way. The handle to the data is the IOSurface.

However, the IOSurface needs to "transfer" the data to WebP, since WebGL runs in WebContentP. So the IOSurface is not a "real" remote handle, as it also maps the data.


CVPixelBuffer gets turned into IOSurface in GPUP media player remote
IOSurface gets turned into CVPixelBuffer in WebP media player proxy
CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
IOSurface gets turned into pbuffers by ANGLE EGL
pbuffers get turned into yuv textures by ANGLE EGL
yuv textures get turned into rgb texture by ANGLE GL
Comment 5 Radar WebKit Bug Importer 2021-10-12 12:12:38 PDT
<rdar://problem/84160422>
Comment 6 Peng Liu 2021-10-12 12:13:36 PDT
(In reply to Kimmo Kinnunen from comment #4)
> (In reply to Kenneth Russell from comment #3)
> > Is all of the video frame's data being passed back to the web process, or
> > only a handle to data being held in the GPU process?
> > 
> > If the former, is there any way to define some sort of remote handle to the
> > data?
> 
> Both, in a way. The handle to the data is the IOSurface.
> 
> However, the IOSurface needs to "transfer" the data to WebP, since WebGL
> runs in WebContentP. So the IOSurface is not a "real" remote handle, as it
> also maps the data.
> 
> 
> CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> IOSurface gets turned into pbuffers by ANGLE EGL
> pbuffers get turned into yuv textures by ANGLE EGL
> yuv textures get turned into rgb texture by ANGLE GL

By the way, MediaPlayerPrivateRemote is in the WebContent Process, while RemoteMediaPlayerProxy is in the GPU process. I guess the "media player proxy" in your comment means MediaPlayerPrivateRemote?
Comment 7 Jon Lee 2021-10-12 13:36:50 PDT
Comment on attachment 440598 [details]
Patch

please do not commit this latest patch; there is a file in there that shouldn't be there.
Comment 8 Kenneth Russell 2021-10-12 14:49:40 PDT
(In reply to Kimmo Kinnunen from comment #4)
> Both, in a way. The handle to the data is the IOSurface.
> 
> However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> works in WebContentP. So the IOSurface is not "real" remote handle as it
> also maps the data.
> 
> 
> CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> IOSurface gets turned into pbuffers by ANGLE EGL
> pbuffers get turned into yuv textures by ANGLE EGL
> yuv textures get turned into rgb texture by ANGLE GL

Thanks for the detailed flow!

Conceptually, in this specific usage, it would be fine if the WebP never received even the IOSurface handle. All of the "real" work of feeding the IOSurface into ANGLE is done in the GPUP. Some ANGLE and EGL related APIs might need to change in the WebP so that they refer to some sort of remote IOSurface handle.

Does that seem like a feasible direction for the future? Should we file an enhancement about making a change like that?

It's probably impossible to know whether the WebP might need the real IOSurface for other reasons. Does rendering of the HTMLVideoElement need the IOSurface in the WebP, or could this be fully delegated to the GPUP via the same kind of remote handle?
Comment 9 Kimmo Kinnunen 2021-10-12 22:24:17 PDT
(In reply to Kenneth Russell from comment #8)
> (In reply to Kimmo Kinnunen from comment #4)
> > Both, in a way. The handle to the data is the IOSurface.
> > 
> > However, the IOSurface needs to "transfer" the data to WebP, since WebGL
> > runs in WebContentP. So the IOSurface is not a "real" remote handle, as it
> > also maps the data.
> > 
> > 
> > CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> > IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> > CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> > IOSurface gets turned into pbuffers by ANGLE EGL
> > pbuffers get turned into yuv textures by ANGLE EGL
> > yuv textures get turned into rgb texture by ANGLE GL
> 
> Thanks for the detailed flow!
> 
> Conceptually, in this specific usage, it would be fine if the WebP never
> received even the IOSurface handle. All of the "real" work of feeding the
> IOSurface into ANGLE is done in the GPUP. 

No, WebGL runs in WebP for now :(

> Some ANGLE and EGL related APIs
> might need to change in the WebP so that they refer to some sort of remote
> IOSurface handle.
> 
> Does that seem like a feasible direction for the future? Should we file an
> enhancement about making a change like that?

When WebGL runs in GPUP, everything stays GPUP side.
Comment 10 Kimmo Kinnunen 2021-10-12 23:10:51 PDT
Created attachment 441039 [details]
Patch
Comment 11 Kenneth Russell 2021-10-13 10:03:26 PDT
(In reply to Kimmo Kinnunen from comment #9)
> When WebGL runs in GPUP, everything stays GPUP side.

Ah, great!
Comment 12 EWS 2021-10-13 10:28:36 PDT
Committed r284102 (242930@main): <https://commits.webkit.org/242930@main>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 441039 [details].