WebChromeClient::createImageBuffer should not create a connection to the GPU Process if the page does not want remote rendering
<rdar://problem/60020229>
Created attachment 392394 [details] Patch
Comment on attachment 392394 [details] Patch

Clearing flags on attachment: 392394

Committed r257845: <https://trac.webkit.org/changeset/257845>
All reviewed patches have been landed. Closing bug.
Comment on attachment 392394 [details] Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=392394&action=review

> Source/WebKit/WebProcess/WebCoreSupport/WebChromeClient.cpp:-910
> -    RenderingMode mode;
> -    if (m_page.shouldUseRemoteRenderingFor(purpose))
> -        mode = shouldAccelerate == ShouldAccelerate::Yes ? RenderingMode::RemoteAccelerated : RenderingMode::RemoteUnaccelerated;
> -    else if (shouldUseDisplayList == ShouldUseDisplayList::Yes)
> -        mode = shouldAccelerate == ShouldAccelerate::Yes ? RenderingMode::DisplayListAccelerated : RenderingMode::DisplayListUnaccelerated;
> -    else
> -        mode = shouldAccelerate == ShouldAccelerate::Yes ? RenderingMode::Accelerated : RenderingMode::Unaccelerated;
> -    return ensureRemoteRenderingBackend().createImageBuffer(size, mode, resolutionScale, colorSpace);

With this change, the argument 'shouldUseDisplayList' became unused. Why does the compiler not catch this?
> With this change, the argument 'shouldUseDisplayList' became unused. Why
> does the compiler not catch this?

WebCore's build has that constraint, not WebKit's. It would be nice to be consistent.
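For illustration, here is a minimal standalone sketch (hypothetical names, not WebKit code) of how an unused parameter like 'shouldUseDisplayList' gets caught when a build enables a warning such as clang/gcc's -Wunused-parameter, as the comment above suggests WebCore's build does, and slips through silently when it does not:

// sketch.cpp -- hypothetical example, not part of WebKit.
// Build with: clang++ -std=c++17 -Wunused-parameter -c sketch.cpp
enum class ShouldAccelerate : bool { No, Yes };
enum class ShouldUseDisplayList : bool { No, Yes };

// 'shouldUseDisplayList' is never read, mirroring the quoted diff.
// With -Wunused-parameter the compiler emits:
//   warning: unused parameter 'shouldUseDisplayList'
// Without that flag, this compiles without complaint.
int selectMode(ShouldAccelerate shouldAccelerate, ShouldUseDisplayList shouldUseDisplayList)
{
    return shouldAccelerate == ShouldAccelerate::Yes ? 1 : 0;
}

// Two common ways to keep a deliberately unused parameter warning-free:
// leave it unnamed, or mark it [[maybe_unused]] (C++17).
int selectModeUnnamed(ShouldAccelerate shouldAccelerate, ShouldUseDisplayList /* shouldUseDisplayList */)
{
    return shouldAccelerate == ShouldAccelerate::Yes ? 1 : 0;
}

int selectModeAttr(ShouldAccelerate shouldAccelerate, [[maybe_unused]] ShouldUseDisplayList shouldUseDisplayList)
{
    return shouldAccelerate == ShouldAccelerate::Yes ? 1 : 0;
}

If the parameter is genuinely dead after the change, dropping it from the signature is of course the cleaner fix than suppressing the warning.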