Currently, with the way defineProperty works, we leave a stray low bit set in PropertyDescriptor::m_attributes in the following code:

    var o = {};
    Object.defineProperty(o, 100, {writable: true, enumerable: true, configurable: true, value: "foo"});

This is because the lowest non-zero attribute (ReadOnly) is represented as 1 << 1 instead of 1 << 0. We then calculate the default attributes as ((1 << 3) << 1) - 1, which is 0xF, but only the top three bits mean anything. In the case above, the top three bits are cleared to 0 but the bottom bit remains set, which causes us to think m_attributes is non-zero.
Created attachment 209660: Patch
Comment on attachment 209660: Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=209660&action=review

> Source/JavaScriptCore/ChangeLog:17
> + This is due to the fact that the lowest non-zero attribute (ReadOnly) is represented as 1 << 1
> + instead of 1 << 0. We then calculate the default attributes as ((1 << 3) << 1) - 1, which is 0xF,
> + but only the top three bits mean anything. Even in the case above, the top three bits are set
> + to 0 but the bottom bit remains set, which causes us to think m_attributes is non-zero.

I think the real problem is in the code that calculates the default attributes. Where is that code? Is there some reason it has to do "((1 << 3) << 1) - 1" instead of just or'ing together the specific attributes?
(In reply to comment #2)
> I think the real problem is in the code that calculates the default attributes. Where is that code? Is there some reason it has to do "((1 << 3) << 1) - 1" instead of just or'ing together the specific attributes?

It's in PropertyDescriptor.cpp:

    unsigned PropertyDescriptor::defaultAttributes = (DontDelete << 1) - 1;

I agree that it would be better style to or the default attributes together. I'll make that change.
(In reply to comment #2)
> Is there some reason it has to do "((1 << 3) << 1) - 1" instead of just or'ing together the specific attributes?

Hmm, a couple of these constants are exposed as part of the JavaScriptCore framework's C API in JSObjectRef.h (e.g. kJSPropertyAttributeReadOnly). Perhaps it would be better to just change how the default attributes are calculated internally and leave the other values alone.
Committed r154630: <http://trac.webkit.org/changeset/154630>
(In reply to comment #4)
> Hmm, a couple of these constants are exposed as part of the JavaScriptCore framework's C API in JSObjectRef.h (e.g. kJSPropertyAttributeReadOnly).

Please add a COMPILE_ASSERT to keep these in sync.