GCC 4.6 seems to be pickier about attempts to stuff 64-bit values into 32-bit variables (as it should be!). All the cases seem safe to fix by forcibly casting the value, since overflowing a 32-bit value seems either impossible (BytecodeGenerator, UTF8, ...) or extremely unlikely (the string length in FontGtk). At worst we are not making things any worse, but of course in some cases it might make more sense to change the code than to force a cast.
Created attachment 91136 [details] precission.diff
Comment on attachment 91136 [details]
precission.diff

View in context: https://bugs.webkit.org/attachment.cgi?id=91136&action=review

I think that this should be split into parts that domain experts could look at.

> Source/JavaScriptCore/bytecompiler/BytecodeGenerator.cpp:2102
> +    SwitchInfo info = { static_cast<uint32_t>(instructions().size()), type };

I think that this patch is making things worse, in that once this warning is enabled in some other compiler, we won't see the issues. It's not obvious to me that malicious code can't create 4 billion instructions.

> Source/JavaScriptCore/wtf/unicode/UTF8.cpp:236
> +static const UChar32 offsetsFromUTF8[6] = { 0x00000000UL, 0x00003080UL, 0x000E2080UL, 0x03C82080UL, static_cast<const UChar32>(0xFA082080UL), static_cast<const UChar32>(0x82082080UL) };

Can this be fixed by not using UChar32? These are not Unicode characters, these are magic numbers.

> Source/WebCore/plugins/PluginPackage.cpp:348
> +    static_cast<unsigned int>(m_lastModified)

We don't use "unsigned int"; we use "unsigned".