| Summary: | Apple Emoji render slightly differently with GPU Process DOM Rendering enabled | | |
|---|---|---|---|
| Product: | WebKit | Reporter: | Cameron McCormack (:heycam) <heycam> |
| Component: | Layout and Rendering | Assignee: | Nobody <webkit-unassigned> |
| Status: | NEW | | |
| Severity: | Normal | CC: | bfulgham, mmaxfield, simon.fraser, webkit-bug-importer, zalan |
| Priority: | P2 | Keywords: | InRadar |
| Version: | WebKit Nightly Build | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| See Also: | https://bugs.webkit.org/show_bug.cgi?id=240497 | | |

Attachments:
Created attachment 459698 [details]: screen shot with GPUP DOM rendering disabled
Created attachment 459699 [details]: screen shot with GPUP DOM rendering enabled
Created attachment 459700 [details]: screen shot with GPUP DOM rendering disabled
The blacks look grey, I guess. Perhaps there's an issue with transparency? It looks like this happens because the CGContext we create in the glyph recorder is created with type kCGContextTypeUnknown, so CoreText assumes it won't get a sensible scale value out of the CTM and instead assumes some large value, resulting in the largest bitmap strike being chosen.
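For reference, a minimal standalone sketch (not WebKit code; the 2x scale and 24pt size are assumed, illustrative values) of the normal path this comment contrasts against: when the destination is an ordinary bitmap CGContext, the device scale is baked into the CTM, so CoreText can see it and pick a bitmap strike of a matching size. A recording context whose type and CTM scale CoreText cannot interpret would not get that information.

```cpp
// Standalone illustration only: draw an emoji glyph into a bitmap CGContext whose
// CTM carries an assumed 2x device scale. Build with:
//   clang++ sketch.cpp -framework CoreGraphics -framework CoreText -framework CoreFoundation
#include <CoreGraphics/CoreGraphics.h>
#include <CoreText/CoreText.h>

int main()
{
    const CGFloat pointSize = 24;
    const CGFloat scale = 2;        // assumed device scale factor
    const size_t pixels = 96;       // generous canvas for one glyph

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(nullptr, pixels, pixels, 8, 0,
        colorSpace, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    // Bake the device scale into the CTM, as a real backing store would.
    // CoreText can read this when choosing which bitmap strike to rasterize.
    CGContextScaleCTM(context, scale, scale);

    CTFontRef font = CTFontCreateWithName(CFSTR("Apple Color Emoji"), pointSize, nullptr);

    // U+1F600 (grinning face) is outside the BMP, so it is encoded as a surrogate pair;
    // the glyph for the pair lands at index 0.
    UniChar characters[2] = { 0xD83D, 0xDE00 };
    CGGlyph glyphs[2] = { 0, 0 };
    CTFontGetGlyphsForCharacters(font, characters, glyphs, 2);

    CGPoint position = CGPointMake(4, 8);
    CTFontDrawGlyphs(font, glyphs, &position, 1, context);

    CFRelease(font);
    CGContextRelease(context);
    return 0;
}
```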
Created attachment 459697 [details]: screen shot with GPUP DOM rendering enabled

Attaching screen shots on iPad with GPUP DOM Rendering enabled and disabled. Notice that the emoji rendering is slightly different. This makes me wonder if CoreText is not selecting an emoji of an appropriate size. Maybe it's unaware of the scale factor when it's drawing to the RemoteDisplayListRecorderProxy?
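If the speculation above is right and the recording context does not expose the real scale, one rough way to see what scale CoreText could infer is to read the context's CTM directly. A hedged sketch (the helper name is made up here, not a WebKit or CoreText API):

```cpp
// Illustrative helper only: the scale a consumer could derive from a destination
// context's CTM. A recording context that leaves the CTM at identity (or whose
// type CoreText doesn't recognize) would not reflect the actual device scale.
#include <CoreGraphics/CoreGraphics.h>
#include <cmath>

static CGFloat effectiveScaleFromCTM(CGContextRef context)
{
    CGAffineTransform ctm = CGContextGetCTM(context);
    // Length of the transformed unit x-vector; handles rotation as well as pure scale.
    return std::hypot(ctm.a, ctm.b);
}
```

A plain bitmap context scaled with CGContextScaleCTM(context, 2, 2) would report 2 here, while a recorder proxy whose CTM stays at identity would report 1 regardless of the device's real scale factor.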