Bug 211013

Summary: getShaderPrecisionFormat returns the wrong values
Product: WebKit
Reporter: Gregg Tavares <gman>
Component: WebGL
Assignee: Dean Jackson <dino>
Status: RESOLVED FIXED
Severity: Normal
CC: darin, dino, ews-watchlist, graouts, kbr, kondapallykalyan, webkit-bug-importer
Priority: P2
Keywords: InRadar
Version: WebKit Nightly Build
Hardware: All
OS: All
Attachments: Patch (darin: review+)

Description Gregg Tavares 2020-04-25 00:24:52 PDT
Checking the code, it appears to be hard-coded:

https://trac.webkit.org/browser/webkit/trunk/Source/WebCore/platform/graphics/opengl/GraphicsContextGLOpenGLBase.cpp#L362
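
For illustration, a quick (hypothetical, untested) check from a page just gets back those constants rather than anything queried from the driver:

const gl = document.createElement('canvas').getContext('webgl');
if (gl) {
  const format = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
  // With the hard-coded implementation, rangeMin/rangeMax/precision are the
  // same fixed values regardless of the actual GPU or driver.
  if (format)
    console.log(format.rangeMin, format.rangeMax, format.precision);
}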
Comment 1 Radar WebKit Bug Importer 2020-04-25 17:27:45 PDT
<rdar://problem/62378056>
Comment 2 Dean Jackson 2020-04-26 17:42:13 PDT
Created attachment 397638 [details]
Patch
Comment 3 Darin Adler 2020-04-26 17:58:47 PDT
Comment on attachment 397638 [details]
Patch

This wasn’t an anti-fingerprinting measure?
Comment 4 Dean Jackson 2020-04-27 11:30:54 PDT
(In reply to Darin Adler from comment #3)
> Comment on attachment 397638 [details]
> Patch
> 
> This wasn’t an anti-fingerprinting measure?

That's a good point. Let me look at the history.
Comment 5 Dean Jackson 2020-04-27 12:11:03 PDT
(In reply to Darin Adler from comment #3)
> Comment on attachment 397638 [details]
> Patch
> 
> This wasn’t an anti-fingerprinting measure?

They were always exposed as constant values!

https://trac.webkit.org/changeset/113092/webkit

+    // These constants came from the Chromium port; we believe they originally
+    // came from making the actual API call on a representative desktop system.

Unfortunately my macOS systems return the same values as the hard-coded ones, so I can't tell whether Chrome/Edge and Firefox do this properly. I'll have to look at their code.

On iOS, this does expose different values from macOS. I'll also check an Android device.
Comment 6 Kenneth Russell 2020-04-27 13:30:29 PDT
The original code came from early WebGL implementations, where this API call only existed on mobile devices and not desktop.

It's important for correctness and portability to expose this accurately. Gregg has encountered portability issues, and a new WebGL conformance test that fails without this change is incoming in https://github.com/KhronosGroup/WebGL/pull/3061.
Comment 7 Dean Jackson 2020-04-27 13:34:18 PDT
The main issue is that the hard-coded values are higher than what we support on iOS, which could lead to people thinking the device is more powerful (or accurate) than it really is.

So if we're going to hard-code values, we should pick the lowest.
Comment 8 Dean Jackson 2020-04-27 13:36:50 PDT
(In reply to Kenneth Russell from comment #6)
> The original code came from early WebGL implementations, where this API call
> only existed on mobile devices and not desktop.

It's strange, then, that we hard-coded it to a desktop value and never asked for the real value on a mobile device.

> 
> It's important for correctness and portability to expose this accurately.
> Gregg's encountered portability issues and a new WebGL conformance test
> which fails without this change is incoming in
> https://github.com/KhronosGroup/WebGL/pull/3061 .

It does expose a fingerprint though. I think I need to check how much the values can vary between devices (desktop to desktop, and mobile to mobile).
Comment 9 Kenneth Russell 2020-04-27 14:06:00 PDT
This will likely mainly allow distinguishing between desktop and mobile GPUs, especially since there are relatively few GPU types on both macOS and iOS. The user agent string already provides essentially this information, since it allows distinguishing between operating systems.
Comment 10 Dean Jackson 2020-04-27 15:04:57 PDT
Apple's OpenGL team confirmed that all iOS devices return the same values, as do all macOS devices. So this isn't extra fingerprinting.
Comment 11 Dean Jackson 2020-04-27 15:08:44 PDT
Committed r260789: <https://trac.webkit.org/changeset/260789>
Comment 12 Gregg Tavares 2020-05-02 21:55:34 PDT
Just FYI: even if they were different on every GPU, exposing them would not add new fingerprinting. The whole point of the new conformance test is to actually render to find out what the precision is, because the value reported here was wrong.

I'm sure a better test could derive the actual precision even if it were not reported here.
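
As a rough illustration of that idea (a hypothetical, untested sketch, not the actual conformance test): render through a fragment shader, count how many times an epsilon can be halved before adding it to 1.0 stops changing the result, and compare that with what getShaderPrecisionFormat reports. The mapping from the measured count to the reported precision field is only approximate.

// Hypothetical sketch, untested: estimate fragment shader float precision by
// rendering, then compare with what getShaderPrecisionFormat reports.
const gl = document.createElement('canvas').getContext('webgl');
if (!gl) throw new Error('WebGL not available');

const vsSrc = 'attribute vec4 position; void main() { gl_Position = position; }';

// Halve an epsilon (fed in through a uniform so the compiler can't fold it away)
// until adding it to 1.0 no longer changes the result; the number of halvings
// roughly tracks the mantissa bits actually available in the fragment shader.
const fsSrc = `
precision highp float;
uniform float one;
void main() {
  float eps = one;
  float bits = 0.0;
  for (int i = 0; i < 64; i++) {
    if (one + eps != one) { eps *= 0.5; bits += 1.0; }
  }
  gl_FragColor = vec4(bits / 255.0, 0.0, 0.0, 1.0);
}`;

function compile(type, src) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, src);
  gl.compileShader(shader);
  return shader;
}

const program = gl.createProgram();
gl.attachShader(program, compile(gl.VERTEX_SHADER, vsSrc));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fsSrc));
gl.linkProgram(program);
gl.useProgram(program);

// One triangle covering the viewport so the fragment shader runs.
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
const loc = gl.getAttribLocation(program, 'position');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
gl.uniform1f(gl.getUniformLocation(program, 'one'), 1.0);
gl.drawArrays(gl.TRIANGLES, 0, 3);

// Read back the measured bit count and compare with the reported precision.
const pixel = new Uint8Array(4);
gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
const measured = pixel[0]; // ~24 for a true IEEE single highp, lower if precision is reduced
const reported = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT).precision;
console.log('measured ~' + measured + ' bits, reported precision ' + reported);

A real probe would of course also cover lowp/mediump, the vertex shader, and the range fields, not just highp float in the fragment shader.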