Bug 131867

Summary: [WebGL] Limit maximum texture sizes on older Intel hardware
Product: WebKit
Reporter: Dean Jackson <dino>
Component: WebGL
Assignee: Dean Jackson <dino>
Status: RESOLVED FIXED
Severity: Normal
CC: commit-queue, kondapallykalyan, noam, roger_fong
Priority: P2
Version: 528+ (Nightly build)
Hardware: Unspecified
OS: Unspecified
Attachments:
Patch — ggaren: review+

Dean Jackson
Reported 2014-04-18 17:05:25 PDT
Chrome and Firefox limit the maximum size of textures on Intel hardware due to corruption that may reveal the contents of other windows. We should do the same. <rdar://problem/16582628>
Attachments
Patch (5.62 KB, patch)
2014-04-18 17:09 PDT, Dean Jackson
ggaren: review+
Dean Jackson
Comment 1 2014-04-18 17:09:37 PDT
Created attachment 229686 [details] Patch
Geoffrey Garen
Comment 2 2014-04-18 17:17:31 PDT
Comment on attachment 229686 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=229686&action=review

r=me

> Source/WebCore/platform/graphics/opengl/Extensions3DOpenGLCommon.cpp:103
> +    // Intel HD 3000 devices have problems with
> +    // large textures. <rdar://problem/16649140>

This can be one line.

> Source/WebCore/platform/graphics/opengl/GraphicsContext3DOpenGL.cpp:252
> +    case MAX_TEXTURE_SIZE:
> +        ::glGetIntegerv(MAX_TEXTURE_SIZE, value);
> +        if (getExtensions()->requiresRestrictedMaximumTextureSize())
> +            *value = std::min(4096, *value);
> +        break;
> +    case MAX_CUBE_MAP_TEXTURE_SIZE:
> +        ::glGetIntegerv(MAX_CUBE_MAP_TEXTURE_SIZE, value);
> +        if (getExtensions()->requiresRestrictedMaximumTextureSize())
> +            *value = std::min(1024, *value);
> +        break;

Should we really silently truncate instead of just failing and returning an error? What's the expected behavior here?
Dean Jackson
Comment 3 2014-04-18 17:20:15 PDT
(In reply to comment #2)
> Should we really silently truncate instead of just failing and returning an error? What's the expected behavior here?

Yeah, we need to truncate. If we just fail then Intel 3000 cards can never render with textures, which breaks WebGL pretty completely. A limit of 4096*4096 is still a huge texture, larger than most screens. It's not an overly restrictive limit.
Dean Jackson
Comment 4 2014-04-18 17:30:40 PDT
Alexey Proskuryakov
Comment 5 2014-04-18 22:23:35 PDT
This broke two tests on bots:

fast/canvas/webgl/webgl-compressed-texture-size-limit.html
webgl/1.0.2/conformance/textures/texture-size-limit.html

Given that the tests used to pass on the hardware we have, it seems like the patch may be over-zealous? Failing a conformance test doesn't seem great either (we actually fail it, it's not just canted output).
Alexey Proskuryakov
Comment 6 2014-04-18 22:32:36 PDT
s/canted/changed/