Bug 211013 - getShaderPrecisionFormat returns the wrong values
Status: RESOLVED FIXED
https://bugs.webkit.org/show_bug.cgi?id=211013
Gregg Tavares
Reported 2020-04-25 00:24:52 PDT
Attachments
Patch (2.76 KB, patch)
2020-04-26 17:42 PDT, Dean Jackson
darin: review+
Radar WebKit Bug Importer
Comment 1 2020-04-25 17:27:45 PDT
Dean Jackson
Comment 2 2020-04-26 17:42:13 PDT
Created attachment 397638 [details]
Patch
Darin Adler
Comment 3 2020-04-26 17:58:47 PDT
Comment on attachment 397638 [details]
Patch

This wasn’t an anti-fingerprinting measure?
Dean Jackson
Comment 4 2020-04-27 11:30:54 PDT
(In reply to Darin Adler from comment #3)
> Comment on attachment 397638 [details]
> Patch
>
> This wasn’t an anti-fingerprinting measure?

That's a good point. Let me look at the history.
Dean Jackson
Comment 5 2020-04-27 12:11:03 PDT
(In reply to Darin Adler from comment #3)
> Comment on attachment 397638 [details]
> Patch
>
> This wasn’t an anti-fingerprinting measure?

They were always exposed as constant values!

https://trac.webkit.org/changeset/113092/webkit

+ // These constants came from the Chromium port; we believe they originally
+ // came from making the actual API call on a representative desktop system.

Unfortunately my macOS systems return the same value, so I can't see if Chrome/Edge + Firefox do this properly. I'll have to look at their code.

For iOS, this does expose different values (from macOS). I'll also check an Android device.
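For context, here is a minimal sketch (not part of the patch; the helper name is an assumption) of how a page reads these values through the standard WebGL API:

// Sketch: dump the precision formats a WebGL context reports.
// Runs in a browser page; the canvas is created only for the query.
function dumpShaderPrecision(gl: WebGLRenderingContext): void {
  const shaderTypes: [string, number][] = [
    ["VERTEX_SHADER", gl.VERTEX_SHADER],
    ["FRAGMENT_SHADER", gl.FRAGMENT_SHADER],
  ];
  const precisionTypes: [string, number][] = [
    ["LOW_FLOAT", gl.LOW_FLOAT],
    ["MEDIUM_FLOAT", gl.MEDIUM_FLOAT],
    ["HIGH_FLOAT", gl.HIGH_FLOAT],
    ["LOW_INT", gl.LOW_INT],
    ["MEDIUM_INT", gl.MEDIUM_INT],
    ["HIGH_INT", gl.HIGH_INT],
  ];
  for (const [sName, sType] of shaderTypes) {
    for (const [pName, pType] of precisionTypes) {
      const fmt = gl.getShaderPrecisionFormat(sType, pType);
      if (fmt) {
        // rangeMin/rangeMax are log2 of the representable range;
        // precision is the number of bits of mantissa precision.
        console.log(`${sName} ${pName}: range [-2^${fmt.rangeMin}, 2^${fmt.rangeMax}], precision ${fmt.precision} bits`);
      }
    }
  }
}

const gl = document.createElement("canvas").getContext("webgl");
if (gl) dumpShaderPrecision(gl);

Before this change, WebKit returned hard-coded numbers for these queries regardless of the underlying GPU.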
Kenneth Russell
Comment 6 2020-04-27 13:30:29 PDT
The original code came from early WebGL implementations, where this API call existed only on mobile devices and not on desktop. It's important for correctness and portability to expose this accurately. Gregg has encountered portability issues, and a new WebGL conformance test that fails without this change is incoming in https://github.com/KhronosGroup/WebGL/pull/3061.
Dean Jackson
Comment 7 2020-04-27 13:34:18 PDT
The main issue is that the hard-coded values are higher than what we support on iOS, which could lead to people thinking the device is more powerful (or accurate) than it really is. So if we're going to hard-code values, we should pick the lowest.
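That distinction matters to pages that branch on the reported values. A common idiom (a sketch, not taken from the conformance test; the function name is an assumption) looks like this:

// Sketch: decide whether a fragment shader can declare highp float.
// An unsupported precision is reported with 0 range and 0 precision,
// so the page should fall back to mediump. If a browser hard-codes
// optimistic values, a check like this cannot see the real capability.
function fragmentHighpAvailable(gl: WebGLRenderingContext): boolean {
  const fmt = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
  return fmt !== null && fmt.precision > 0;
}

const gl = document.createElement("canvas").getContext("webgl");
if (gl) {
  const floatPrecision = fragmentHighpAvailable(gl) ? "highp" : "mediump";
  const fragmentSource = `precision ${floatPrecision} float;
void main() { gl_FragColor = vec4(1.0); }`;
  console.log(fragmentSource);
}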
Dean Jackson
Comment 8 2020-04-27 13:36:50 PDT
(In reply to Kenneth Russell from comment #6)
> The original code came from early WebGL implementations, where this API call
> existed only on mobile devices and not on desktop.

It's strange, then, that we hard-coded it to a desktop value and never asked for the real value on a mobile device.

> It's important for correctness and portability to expose this accurately.
> Gregg has encountered portability issues, and a new WebGL conformance test
> that fails without this change is incoming in
> https://github.com/KhronosGroup/WebGL/pull/3061.

It does expose a fingerprint though. I think I need to check how much the values can vary between devices (desktop to desktop, and mobile to mobile).
Kenneth Russell
Comment 9 2020-04-27 14:06:00 PDT
This will likely mainly allow distinguishing between desktop and mobile GPUs, especially since there are relatively few GPU types on both macOS and iOS. The user agent string already provides essentially this information, since it allows distinguishing between operating systems.
Dean Jackson
Comment 10 2020-04-27 15:04:57 PDT
Apple's OpenGL team confirmed that all iOS devices return the same values, as do all macOS devices. So this isn't extra fingerprinting.
Dean Jackson
Comment 11 2020-04-27 15:08:44 PDT
Gregg Tavares
Comment 12 2020-05-02 21:55:34 PDT
Just FYI: even if they were different on every GPU, exposing them would not add new fingerprinting. The whole point of the new conformance test is to actually render to find out what the precision is, because the value was wrong here. I'm sure a better test could derive any precision even if it was not reported here.
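A rough sketch of that render-and-read-back approach (illustrative only; the actual conformance test in the pull request above is more careful about compiler optimizations, and all names here are assumptions):

// Sketch: estimate fragment-shader mediump float precision by rendering,
// without trusting getShaderPrecisionFormat. For each bit count b, the
// shader checks whether (1.0 + 2^-b) is still distinguishable from 1.0
// and writes white or black; the result is read back with readPixels.
function estimateMediumpFloatBits(gl: WebGLRenderingContext): number {
  const vs = `attribute vec2 pos;
void main() { gl_Position = vec4(pos, 0.0, 1.0); }`;
  // eps is a uniform so the comparison cannot be constant-folded.
  const fs = `precision mediump float;
uniform float eps;
void main() {
  float one = 1.0;
  gl_FragColor = ((one + eps) - one) > 0.0 ? vec4(1.0) : vec4(0.0);
}`;

  const compile = (type: number, src: string): WebGLShader => {
    const s = gl.createShader(type)!;
    gl.shaderSource(s, src);
    gl.compileShader(s);
    return s;
  };
  const prog = gl.createProgram()!;
  gl.attachShader(prog, compile(gl.VERTEX_SHADER, vs));
  gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fs));
  gl.linkProgram(prog);
  gl.useProgram(prog);

  // One large triangle that covers the whole viewport.
  gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
  const posLoc = gl.getAttribLocation(prog, "pos");
  gl.enableVertexAttribArray(posLoc);
  gl.vertexAttribPointer(posLoc, 2, gl.FLOAT, false, 0, 0);

  const epsLoc = gl.getUniformLocation(prog, "eps");
  const pixel = new Uint8Array(4);
  let bits = 0;
  for (let b = 1; b <= 24; b++) {
    gl.uniform1f(epsLoc, Math.pow(2, -b));
    gl.drawArrays(gl.TRIANGLES, 0, 3);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
    if (pixel[0] === 0) break; // 1 + 2^-b collapsed to 1: ran out of bits
    bits = b;
  }
  return bits;
}

const gl = document.createElement("canvas").getContext("webgl");
if (gl) console.log("estimated mediump float bits:", estimateMediumpFloatBits(gl));

On hardware that runs fragment shaders at full 32-bit float this typically reports around 23 bits, while hardware with 16-bit mediump reports around 10, which is the kind of difference the hard-coded values were hiding.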