If you look at JSWebGLFloatArrayCustom.cpp, you will see this in indexSetter:

    impl()->set(index, static_cast<float>(value.toInt32(exec)));

So the value is getting truncated to an int. It needs to use toNumber instead. This error appears again in the ::set() method. There are a couple of other problems. The UnsignedInt variant also uses toInt32, which will truncate large positive numbers; it should use toUInt32 instead. In fact, I think all of the unsigned variants should use toUInt32 to avoid wrapping of negative numbers.
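Roughly, the fix for the float case would look like this (a sketch only; I'm approximating the binding's signature from the snippet quoted above rather than copying it from the tree):

    // Sketch: the quoted snippet confirms impl()->set() and value.toInt32(exec),
    // so the only change is toInt32 -> toNumber.
    void JSWebGLFloatArray::indexSetter(JSC::ExecState* exec, unsigned index, JSC::JSValue value)
    {
        // toNumber keeps the fractional part; toInt32 truncated it to an integer.
        impl()->set(index, static_cast<float>(value.toNumber(exec)));
    }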
Are you looking at the latest code? I don't see these errors in either JSWebGLFloatArrayCustom.cpp or JSWebGLUnsignedIntArrayCustom.cpp. I am pretty sure I fixed all of these bugs in https://bugs.webkit.org/show_bug.cgi?id=32456 .
Looks like you fixed it in JSWebGLFloatArray::set, but not in JSWebGLFloatArray::indexSetter. Also, JSWebGLUnsignedIntArray::indexSetter still uses toInt32 rather than toUInt32, and the JSWebGLUnsignedByteArray and JSWebGLUnsignedShortArray cases are still using toInt32 as well. Technically, I think the right solution for all the integer cases is to remember the sign, do abs() of the number, do toUInt32() on that, restore the sign if needed, and then cast the result. I think that's the only way to always get the correct result, isn't it?
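To make that concrete, something like this standalone sketch is what I have in mind (written against a plain double for clarity; in the bindings the double would come from value.toNumber(exec), and the helper name is purely illustrative):

    #include <cmath>
    #include <cstdint>

    // Illustrative only, not WebKit code: remember the sign, take the absolute
    // value, reduce the magnitude modulo 2^32 (the equivalent of ToUint32),
    // restore the sign on a wide type, then cast to the array's storage type.
    template<typename ElementType>
    ElementType coerceWithSignPreserved(double d)
    {
        if (!std::isfinite(d))
            return 0;
        bool negative = std::signbit(d);
        double magnitude = std::fmod(std::trunc(std::fabs(d)), 4294967296.0);
        int64_t restored = negative ? -static_cast<int64_t>(magnitude)
                                    : static_cast<int64_t>(magnitude);
        return static_cast<ElementType>(restored);
    }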
We should also have test cases for all of these.
I see. I'll fix these.
(In reply to comment #2)
> Looks like you fixed it in JSWebGLFloatArray::set, but not in
> JSWebGLFloatArray::indexSetter. Also, JSWebGLUnsignedIntArray::indexSetter
> still uses toInt32 rather than toUInt32, and the JSWebGLUnsignedByteArray
> and JSWebGLUnsignedShortArray cases are still using toInt32 as well.
>
> Technically, I think the right solution for all the integer cases is to
> remember the sign, do abs() of the number, do toUInt32() on that, restore
> the sign if needed, and then cast the result. I think that's the only way
> to always get the correct result, isn't it?

I don't think such complex logic is needed for the integer cases. We do not and should not have tests for how out-of-range values are coerced, and we already have tests verifying that the in-range values can be represented by the various WebGLArray types. The only thing I see that needs to be fixed for these types is that JSWebGLUnsignedIntArray::indexSetter should use toUInt32 -- although it already seems to work for all valid values because of how toInt32SlowCase in JSValue.cpp works.
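Concretely, the only change I have in mind is along these lines (a sketch, with the binding signature approximated rather than copied):

    // Sketch only; the actual patch may differ in details.
    void JSWebGLUnsignedIntArray::indexSetter(JSC::ExecState* exec, unsigned index, JSC::JSValue value)
    {
        // toUInt32 interprets the converted 32-bit result as unsigned, so values
        // above INT_MAX reach the array without going through a signed intermediate.
        impl()->set(index, static_cast<unsigned int>(value.toUInt32(exec)));
    }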
Created attachment 46107 [details]
Patch
style-queue ran check-webkit-style on attachment 46107 [details] without any errors.
Comment on attachment 46107 [details]
Patch

Which test case is fixed by the change from toInt32 to toUInt32? I don't see a test case that covers that. Can we make that change separately?
I wasn't able to produce incorrect results from the toInt32 call for the in-range values for this data type (0..4294967295); I think this is because of how toInt32SlowCase works. Still, it's clear from code inspection that this change is correct, and I'd rather do both at once.
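For the record, here is a standalone model of why the in-range values come out right anyway, assuming toInt32 follows the spec's ToInt32: both ToInt32 and ToUint32 reduce the number modulo 2^32 and differ only in how the resulting 32-bit pattern is interpreted, so casting the signed result to unsigned recovers the same value. (This models the ECMA-262 conversions directly, not JavaScriptCore's toInt32SlowCase.)

    #include <cassert>
    #include <cmath>
    #include <cstdint>

    // Models ECMA-262 ToUint32 for finite doubles: truncate, reduce modulo 2^32.
    static uint32_t ecmaToUInt32(double d)
    {
        if (!std::isfinite(d))
            return 0;
        double m = std::fmod(std::trunc(d), 4294967296.0);
        if (m < 0)
            m += 4294967296.0;
        return static_cast<uint32_t>(m);
    }

    // Models ECMA-262 ToInt32: same 32-bit pattern as ToUint32, read as signed.
    static int32_t ecmaToInt32(double d)
    {
        uint32_t u = ecmaToUInt32(d);
        return u <= INT32_MAX ? static_cast<int32_t>(u)
                              : static_cast<int32_t>(static_cast<int64_t>(u) - 4294967296LL);
    }

    int main()
    {
        // For the WebGLUnsignedIntArray range (0..4294967295), casting the
        // ToInt32 result to unsigned matches ToUint32, so no in-range test
        // case can distinguish the two calls.
        const double samples[] = { 0.0, 1.0, 2147483647.0, 2147483648.0, 4294967295.0 };
        for (double d : samples)
            assert(static_cast<uint32_t>(ecmaToInt32(d)) == ecmaToUInt32(d));
        return 0;
    }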
Comment on attachment 46107 [details]
Patch

ok.
Comment on attachment 46107 [details]
Patch

Clearing flags on attachment: 46107

Committed r53009: <http://trac.webkit.org/changeset/53009>
All reviewed patches have been landed. Closing bug.