I found this while debugging the failure of webgl/1.0.2/resources/webgl_test_files/conformance/rendering/gl-scissor-test.html. The WebGL code assumes we are using multisampling when the WebGL "antialias" attribute is set, but the Mac port does not enable the corresponding multisample attributes in the underlying CGL pixel format for the OpenGL context. This causes gl-scissor-test to render garbage in the "antialias" case.
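For context, here is a minimal sketch of what requesting a multisampled pixel format through CGL looks like. This is illustrative only, not the patch itself; the attribute values and the sample count of 4 are assumptions.

#include <OpenGL/OpenGL.h>

// Illustrative only: request an accelerated, multisampled CGL pixel format,
// roughly what the "antialias: true" WebGL case needs the Mac port to ask for.
static CGLPixelFormatObj chooseMultisampledPixelFormat()
{
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAColorSize, (CGLPixelFormatAttribute)32,
        kCGLPFADepthSize, (CGLPixelFormatAttribute)32,
        kCGLPFAAccelerated,
        kCGLPFAMultisample,                               // multisample, not supersample
        kCGLPFASampleBuffers, (CGLPixelFormatAttribute)1,
        kCGLPFASamples, (CGLPixelFormatAttribute)4,       // sample count is an assumption
        (CGLPixelFormatAttribute)0
    };

    CGLPixelFormatObj pixelFormat = 0;
    GLint numPixelFormats = 0;
    CGLChoosePixelFormat(attribs, &pixelFormat, &numPixelFormats);
    return pixelFormat; // null if no matching format is available
}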
Created attachment 220348 [details] Patch
Comment on attachment 220348 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=220348&action=review

> Source/WebCore/platform/graphics/mac/GraphicsContext3DMac.mm:145
> +    bool useMultisampling = m_attrs.antialias;
> +
> +    setPixelFormat(attribs, 32, 32, !attrs.forceSoftwareRenderer, true, false, useMultisampling);

This preserves our existing behaviour of requesting supersampling when anti-aliasing is false. I guess that's OK.
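To make the reviewed change concrete, here is a hypothetical sketch of how a setPixelFormat-style helper might act on that final boolean; the real helper in GraphicsContext3DMac.mm may differ in signature and detail, and the attribute values are assumptions.

#include <OpenGL/OpenGL.h>
#include <vector>

// Hypothetical helper, not the actual WebKit setPixelFormat: the flag picks
// between multisample attributes and the pre-existing supersample request.
static void appendAntialiasAttributes(std::vector<CGLPixelFormatAttribute>& attribs, bool useMultisampling)
{
    if (useMultisampling) {
        // WebGL "antialias: true": ask CGL for a multisampled format.
        attribs.push_back(kCGLPFAMultisample);
        attribs.push_back(kCGLPFASampleBuffers);
        attribs.push_back((CGLPixelFormatAttribute)1);
        attribs.push_back(kCGLPFASamples);
        attribs.push_back((CGLPixelFormatAttribute)4); // sample count is an assumption
    } else {
        // The existing behaviour the review note mentions: still request
        // supersampling when anti-aliasing is off.
        attribs.push_back(kCGLPFASupersample);
    }
}

Either way, CGLChoosePixelFormat() ultimately decides whether the hardware can honour the request.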
Comment on attachment 220348 [details]
Patch

Clearing flags on attachment: 220348

Committed r161294: <http://trac.webkit.org/changeset/161294>
All reviewed patches have been landed. Closing bug.
<rdar://problem/15747163>