WebGL doesn't push frames when attached to a remote layer host.
<rdar://problem/15260182>
Created attachment 223489 [details]
Patch
Comment on attachment 223489 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=223489&action=review

> Source/WebKit2/WebProcess/WebPage/mac/PlatformCALayerRemoteCustom.h:51
> +    bool m_providesContents;

You forgot to initialize this in the ctor.

> Source/WebKit2/WebProcess/WebPage/mac/PlatformCALayerRemoteCustom.mm:62
> +    m_providesContents = [customLayer isKindOfClass:[WebGLLayer class]];

I think you could do this with the classname without pulling in WebGLLayer.h.
Comment on attachment 223489 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=223489&action=review

>> Source/WebKit2/WebProcess/WebPage/mac/PlatformCALayerRemoteCustom.h:51
>> +    bool m_providesContents;
>
> You forgot to initialize this in the ctor.

It's initialised in the ctor body.

>> Source/WebKit2/WebProcess/WebPage/mac/PlatformCALayerRemoteCustom.mm:62
>> +    m_providesContents = [customLayer isKindOfClass:[WebGLLayer class]];
>
> I think you could do this with the classname without pulling in WebGLLayer.h.

Yeah, I'm now trying to get there via GraphicsLayer.
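(For illustration only, not the code that landed: one way to act on the reviewer's classname suggestion is to resolve the class at runtime with NSClassFromString, which removes the compile-time dependency on WebGLLayer.h. The author instead pursued a route via GraphicsLayer for the final patch.)

```objc
// Sketch of the reviewer's suggestion: look the class up by name at runtime,
// so PlatformCALayerRemoteCustom.mm need not import WebGLLayer.h.
// NSClassFromString returns nil if the class isn't loaded, in which case
// isKindOfClass: simply answers NO.
m_providesContents = [customLayer isKindOfClass:NSClassFromString(@"WebGLLayer")];
```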
Committed r163646: <http://trac.webkit.org/changeset/163646>