WebKit Bugzilla
RESOLVED FIXED
Bug 231425
WebGL video texture upload is very slow due to excessive transfer of the video pixel buffer
https://bugs.webkit.org/show_bug.cgi?id=231425
Summary
Kimmo Kinnunen
Reported
2021-10-08 05:31:05 PDT
WebGL video texture upload is very slow due to excessive transfer of the video pixel buffer.

MediaPlayer runs in the GPU process, while WebGL runs in the Web process, so the image needs to be transferred between processes, and the transfer is relatively slow. WebGL content typically uploads the video frame on every render, even when the frame has not changed. We have an optimisation to skip the unchanged upload; however, the IPC was not skipped.
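[Editorial sketch] The skip-unchanged-upload optimisation described above can be modelled minimally as follows. This is an illustrative stand-in, not the actual WebKit code: the class name and the integer frame identifier are hypothetical (in WebKit the check would key off e.g. the current CVPixelBuffer), and the point is only that the cached-identifier comparison must gate the IPC round trip as well, not just the GL upload.

```cpp
#include <cstdint>
#include <optional>

// Hypothetical sketch: skip the texture re-upload (and the cross-process
// IPC) when the current video frame is the same one uploaded last time.
class VideoTextureUploader {
public:
    // Returns true if an upload (and the IPC it implies) actually happened.
    bool uploadIfChanged(uint64_t frameIdentifier)
    {
        if (m_lastUploadedFrame == frameIdentifier)
            return false; // Unchanged frame: skip both the upload and the IPC.
        m_lastUploadedFrame = frameIdentifier;
        ++m_uploadCount; // Stand-in for the expensive pixel-buffer transfer.
        return true;
    }

    unsigned uploadCount() const { return m_uploadCount; }

private:
    std::optional<uint64_t> m_lastUploadedFrame;
    unsigned m_uploadCount { 0 };
};
```

The bug is that a check like this existed for the GL upload, but the pixel buffer was still sent over IPC each frame before the check could take effect.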
Attachments

Patch (15.32 MB, text/plain), 2021-10-08 05:48 PDT, Kimmo Kinnunen; youennf: review+
Patch (8.16 KB, patch), 2021-10-12 23:10 PDT, Kimmo Kinnunen; ews-feeder: commit-queue-
Kimmo Kinnunen
Comment 1
2021-10-08 05:48:57 PDT
Created attachment 440598 [details]: Patch
youenn fablet
Comment 2
2021-10-08 07:44:02 PDT
Comment on attachment 440598 [details]: Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=440598&action=review

> Source/WebKit/GPUProcess/media/RemoteMediaPlayerProxy.h:372
> +#if PLATFORM(COCOA)

AVFOUNDATION
Kenneth Russell
Comment 3
2021-10-08 16:54:48 PDT
Is all of the video frame's data being passed back to the web process, or only a handle to data being held in the GPU process? If the former, is there any way to define some sort of remote handle to the data?
Kimmo Kinnunen
Comment 4
2021-10-12 12:07:19 PDT
(In reply to Kenneth Russell from comment #3)
> Is all of the video frame's data being passed back to the web process, or
> only a handle to data being held in the GPU process?
>
> If the former, is there any way to define some sort of remote handle to the
> data?
Both, in a way. The handle to the data is the IOSurface.

However, the IOSurface needs to "transfer" the data to WebP, since the WebGL works in WebContentP. So the IOSurface is not a "real" remote handle, as it also maps the data.

CVPixelBuffer gets turned into IOSurface in GPUP media player remote
IOSurface gets turned into CVPixelBuffer in WebP media player proxy
CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
IOSurface gets turned into pbuffers by ANGLE EGL
pbuffers get turned into YUV textures by ANGLE EGL
YUV textures get turned into RGB texture by ANGLE GL
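[Editorial sketch] The handoff chain above can be written out with the process each step runs in, which is useful since a later comment asks exactly which side each conversion happens on. This is purely illustrative data in code form; the struct and labels are editorial stand-ins, not WebKit or ANGLE API.

```cpp
#include <string>
#include <vector>

// Illustrative trace of the conversion chain from the bug discussion.
// "GPUP" = GPU process, "WebP" = WebContent process.
struct Stage {
    std::string process;        // where the conversion runs
    std::string representation; // what gets converted to what
};

std::vector<Stage> videoFrameToWebGLTexture()
{
    return {
        { "GPUP media player remote",      "CVPixelBuffer -> IOSurface" },
        { "WebP media player proxy",       "IOSurface -> CVPixelBuffer" },
        { "WebP GraphicsContextGLCVANGLE", "CVPixelBuffer -> IOSurface" },
        { "WebP ANGLE EGL",                "IOSurface -> pbuffers" },
        { "WebP ANGLE EGL",                "pbuffers -> YUV textures" },
        { "WebP ANGLE GL",                 "YUV textures -> RGB texture" },
    };
}
```

Note that only the first step stays in the GPU process: everything from the media player proxy onward runs in the WebContent process, which is why the IOSurface is not a "real" remote handle here.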
Radar WebKit Bug Importer
Comment 5
2021-10-12 12:12:38 PDT
<rdar://problem/84160422>
Peng Liu
Comment 6
2021-10-12 12:13:36 PDT
(In reply to Kimmo Kinnunen from comment #4)
> (In reply to Kenneth Russell from comment #3)
> > Is all of the video frame's data being passed back to the web process, or
> > only a handle to data being held in the GPU process?
> >
> > If the former, is there any way to define some sort of remote handle to the
> > data?
>
> Both, in a way. The handle to the data is the IOSurface.
>
> However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> works in WebContentP. So the IOSurface is not "real" remote handle as it
> also maps the data.
>
> CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> IOSurface gets turned into pbuffers by ANGLE EGL
> pbuffers get turned into yuv textures by ANGLE EGL
> yuv textures get turned into rgb texture by ANGLE GL
By the way, MediaPlayerPrivateRemote is in the WebContent Process, while RemoteMediaPlayerProxy is in the GPU process. I guess the "media player proxy" in your comment means MediaPlayerPrivateRemote?
Jon Lee
Comment 7
2021-10-12 13:36:50 PDT
Comment on attachment 440598 [details]: Patch

Please do not commit this latest patch; there is a file in there that shouldn't be there.
Kenneth Russell
Comment 8
2021-10-12 14:49:40 PDT
(In reply to Kimmo Kinnunen from comment #4)
> Both, in a way. The handle to the data is the IOSurface.
>
> However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> works in WebContentP. So the IOSurface is not "real" remote handle as it
> also maps the data.
>
> CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> IOSurface gets turned into pbuffers by ANGLE EGL
> pbuffers get turned into yuv textures by ANGLE EGL
> yuv textures get turned into rgb texture by ANGLE GL
Thanks for the detailed flow!

Conceptually, in this specific usage, it would be fine if the WebP never received even the IOSurface handle. All of the "real" work of feeding the IOSurface into ANGLE is done in the GPUP. Some ANGLE and EGL related APIs might need to change in the WebP so that they refer to some sort of remote IOSurface handle.

Does that seem like a feasible direction for the future? Should we file an enhancement about making a change like that?

It's probably impossible to know whether the WebP might need the real IOSurface for other reasons. Does rendering of the HTMLVideoElement need the IOSurface in the WebP, or could this be fully delegated to the GPUP via the same kind of remote handle?
Kimmo Kinnunen
Comment 9
2021-10-12 22:24:17 PDT
(In reply to Kenneth Russell from comment #8)
> (In reply to Kimmo Kinnunen from comment #4)
> > Both, in a way. The handle to the data is the IOSurface.
> >
> > However, the IOSurface needs to "transfer" the data to WebP, since the WebGL
> > works in WebContentP. So the IOSurface is not "real" remote handle as it
> > also maps the data.
> >
> > CVPixelBuffer gets turned into IOSurface in GPUP media player remote
> > IOSurface gets turned into CVPixelBuffer in WebP media player proxy
> > CVPixelBuffer gets turned into IOSurface in GraphicsContextGLCVANGLE
> > IOSurface gets turned into pbuffers by ANGLE EGL
> > pbuffers get turned into yuv textures by ANGLE EGL
> > yuv textures get turned into rgb texture by ANGLE GL
>
> Thanks for the detailed flow!
>
> Conceptually, in this specific usage, it would be fine if the WebP never
> received even the IOSurface handle. All of the "real" work of feeding the
> IOSurface into ANGLE is done in the GPUP.
No, WebGL runs in WebP for now :(
> Some ANGLE and EGL related APIs
> might need to change in the WebP so that they refer to some sort of remote
> IOSurface handle.
>
> Does that seem like a feasible direction for the future? Should we file an
> enhancement about making a change like that?
When WebGL runs in GPUP, everything stays GPUP side.
Kimmo Kinnunen
Comment 10
2021-10-12 23:10:51 PDT
Created attachment 441039 [details]: Patch
Kenneth Russell
Comment 11
2021-10-13 10:03:26 PDT
(In reply to Kimmo Kinnunen from comment #9)
> When WebGL runs in GPUP, everything stays GPUP side.
Ah, great!
EWS
Comment 12
2021-10-13 10:28:36 PDT
Committed r284102 (242930@main): <https://commits.webkit.org/242930@main>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 441039 [details].