|Summary:||Apple Emoji render slightly differently with GPU Process DOM Rendering enabled|
|Product:||WebKit||Reporter:||Cameron McCormack (:heycam) <heycam>|
|Component:||Layout and Rendering||Assignee:||Nobody <webkit-unassigned>|
|Severity:||Normal||CC:||bfulgham, mmaxfield, simon.fraser, webkit-bug-importer, zalan|
|Version:||WebKit Nightly Build|
Description Cameron McCormack (:heycam) 2022-05-23 22:33:30 PDT
Created attachment 459697 [details] screen shot with GPUP DOM rendering enabled

Attaching screen shots taken on iPad with GPUP DOM Rendering enabled and disabled. Notice that the emoji rendering is slightly different. This makes me wonder whether CoreText is failing to select an emoji of an appropriate size. Maybe it's unaware of the scale factor when it's drawing to the RemoteDisplayListRecorderProxy?
Comment 1 Cameron McCormack (:heycam) 2022-05-23 22:33:48 PDT
Created attachment 459698 [details] screen shot with GPUP DOM rendering disabled
Comment 3 Cameron McCormack (:heycam) 2022-05-23 22:46:38 PDT
Created attachment 459699 [details] screen shot with GPUP DOM rendering enabled
Comment 4 Cameron McCormack (:heycam) 2022-05-23 22:46:53 PDT
Created attachment 459700 [details] screen shot with GPUP DOM rendering disabled
Comment 5 Myles C. Maxfield 2022-05-24 12:35:02 PDT
The blacks look grey, I guess. Perhaps there's an issue with transparency?
Comment 6 Cameron McCormack (:heycam) 2022-05-24 22:02:23 PDT
Looks like it's because the CGContext we create in the glyph recorder is created with type kCGContextTypeUnknown. CoreText then assumes it won't get a sensible scale value out of the CTM, and instead assumes some large value, resulting in the largest bitmap strike being chosen.