Bug 226408

Summary: [WebXR] Recommended framebuffer width incorrectly scaled
Product: WebKit
Component: WebXR
Version: WebKit Nightly Build
Hardware: Unspecified
OS: Unspecified
Status: RESOLVED FIXED
Severity: Normal
Priority: P2
Keywords: InRadar
Reporter: Dean Jackson <dino>
Assignee: Dean Jackson <dino>
CC: adachan, ifernandez, sam, thorton, webkit-bug-importer
Attachments:
Patch (Flags: thorton: review+)

Description Dean Jackson 2021-05-28 17:34:34 PDT
When WebXRWebGLLayer creates the WebXROpaqueFramebuffer, it asks the session for the recommended framebuffer size. It then multiplies the width by 2 - I assume because there are two eyes. However, the specification [1] says that it is a "best estimate of the WebGL framebuffer resolution large enough to contain all of the session's XRViews". So it should be the session that accounts for the multiple views, not the framebuffer.

Note also: "best estimate". Currently the dimensions of the framebuffer are set once, when the WebGLLayer is created. I think they should be checked each frame. For example, the headset might be under load and start providing smaller textures for rendering.
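A minimal sketch of the sizing described above, with hypothetical `Session`/`IntSize` types standing in for the WebKit classes: the session reports a recommended size already large enough for all of its views (here, two eyes side by side), so the layer has no reason to scale the width again.

```cpp
#include <cassert>

// Hypothetical types for illustration; not WebKit's actual classes.
struct IntSize {
    int width;
    int height;
};

struct Session {
    int viewCount = 2;                 // stereo headset: one view per eye
    IntSize perViewSize { 1440, 1600 };

    // "Best estimate of the WebGL framebuffer resolution large enough to
    // contain all of the session's XRViews" - the session accounts for
    // the view count, so callers must not multiply the width again.
    IntSize recommendedFramebufferSize() const
    {
        return { perViewSize.width * viewCount, perViewSize.height };
    }
};
```

With these numbers, `recommendedFramebufferSize()` already returns a 2880-wide buffer; a layer that doubled it again would allocate twice the needed width.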
Comment 1 Radar WebKit Bug Importer 2021-05-28 17:34:45 PDT
<rdar://problem/78638309>
Comment 2 Dean Jackson 2021-05-28 17:46:31 PDT
Created attachment 430080 [details]
Patch
Comment 3 Dean Jackson 2021-05-30 12:42:11 PDT
Committed r278255 (238292@main): <https://commits.webkit.org/238292@main>
Comment 4 Imanol Fernandez 2021-05-31 02:22:37 PDT
> 
> Note also: "best estimate". Currently the dimensions of the framebuffer are
> set once as the WebGLLayer is created. I think it should be checked each
> frame. For example, the headset might be under load and start providing
> smaller textures for rendering.

The spec supports dynamic viewport scaling. The benefit is that you can change the resolution on a per-frame basis without reallocating the WebGLLayer framebuffer.

Right now we return nullopt in WebXRView::recommendedViewportScale, so the full viewport is always used. If you have a heuristic to determine the recommended scale per frame, you can implement it already on the Cocoa platform. For this to work, the underlying SDK must support setting the UV rect when submitting the frame to the headset.
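One possible shape for such a heuristic (purely illustrative, not WebKit code): return `std::nullopt` to keep the full viewport while the GPU is keeping up, and otherwise return a scale that shrinks the viewport in proportion to how far the last frame overshot the target time, clamped to a sensible minimum.

```cpp
#include <algorithm>
#include <optional>

// Hypothetical per-frame heuristic for WebXRView::recommendedViewportScale.
// Returns nullopt (full viewport) when the GPU meets the frame budget,
// otherwise a scale in [0.5, 1.0] proportional to the overshoot.
std::optional<double> recommendedViewportScale(double gpuFrameTimeMs, double targetFrameTimeMs)
{
    if (gpuFrameTimeMs <= targetFrameTimeMs)
        return std::nullopt; // keeping up: render at full resolution

    double scale = targetFrameTimeMs / gpuFrameTimeMs;
    return std::clamp(scale, 0.5, 1.0); // never drop below half resolution
}
```

Since dynamic viewport scaling only changes the UV sub-rect rendered into the existing framebuffer, the heuristic can vary every frame without any reallocation.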