Go to http://digg.com/about and, when it has finished loading, click the link to http://digg.com/faq . The latter page (and subsequent pages on digg.com) will be blank or missing some content. This bug is a regression from r15075 (fix for bug 9317), and it happens when a script loaded from cache writes a <script> tag whose source is also in cache. The requestingScript state bit is messed up as a result. The fix is to replace the state bit with a count.
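To illustrate the proposed fix, here is a minimal sketch of the count-based approach. The class and method names are hypothetical stand-ins for the real tokenizer; only the idea of replacing a boolean with a nesting count comes from the patch description.

```cpp
// Hypothetical sketch: a count survives re-entrant script loads,
// whereas a single bool would be cleared by the innermost call.
struct Tokenizer {
    unsigned m_requestingScriptCount = 0;

    bool requestingScript() const { return m_requestingScriptCount > 0; }

    // Simulates a cached script that document.write()s another cached <script>.
    void loadScript(int nestedLoads) {
        ++m_requestingScriptCount;          // enter "requesting script" state
        if (nestedLoads > 0)
            loadScript(nestedLoads - 1);    // re-entrant load from cache
        --m_requestingScriptCount;          // leave state; outer levels intact
    }
};
```

With a plain bool, the inner call's "clear" step would wipe out the outer call's state; the count makes the state properly re-entrant.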
Created attachment 9118 [details] Replace the requestingScript state bit with a count
Comment on attachment 9118 [details] Replace the requestingScript state bit with a count r=me
Comment on attachment 9118 [details] Replace the requestingScript state bit with a count Another fix is to save, set, and then restore the bit rather than setting and clearing it. I think that would be better. Is there any reason that wouldn't work?
(In reply to comment #3) > (From update of attachment 9118 [details] [edit]) > Another fix is to save, set, and then restore the bit rather than setting and > clearing it. I think that would be better. > > Is there any reason that wouldn't work? I think it would work, but it occurred to me that the bit didn't really belong in State anyway (it is a global property of the tokenizer), and I thought that saving and restoring a bool would just consume bytes on the stack with no apparent benefit. I'm going to attach a save-and-restore version.
Created attachment 9120 [details] Save and restore the m_requestingScript flag See my previous comment as to why I think the bit doesn't belong in State. Of course, if you prefer it that way, I can go all the way and put it back into State.
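For comparison, here is a sketch of the save-and-restore alternative in this attachment. Again, the surrounding class and method are invented for illustration; only the member name m_requestingScript and the save/set/restore idea come from the patch.

```cpp
// Hypothetical sketch: save-and-restore keeps a single bool correct
// across re-entrant script loads, at the cost of one stack slot per level.
struct Tokenizer {
    bool m_requestingScript = false;

    void scriptExecution(int nestedLoads) {
        bool wasRequestingScript = m_requestingScript;  // save
        m_requestingScript = true;                      // set
        if (nestedLoads > 0)
            scriptExecution(nestedLoads - 1);           // cached script writes another <script>
        m_requestingScript = wasRequestingScript;       // restore, not clear
    }
};
```

The key difference from the buggy code is restoring the saved value instead of unconditionally clearing the flag, so the outer recursion level sees its own state intact.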
Created attachment 9121 [details] Save and restore the m_requestingScript flag Added a missing test subresource. Please see previous comments.
Comment on attachment 9121 [details] Save and restore the m_requestingScript flag I think of m_state as a more-or-less arbitrary set of bits, so the argument that this bit is global state and hence "not part of state" doesn't seem very compelling to me. Your argument about a byte on the stack vs. a count in the tokenizer object is actually a pretty compelling one. It didn't occur to me that the stack cost can be paid over and over again as you recurse, while there is only one tokenizer. I think that either the original fix or this one is fine. r=me
Committed revision 15134.