The limit of 5 iterations in http://trac.webkit.org/changeset/53790 appears to be too low for the neopets page in the bug URL. Bumping the limit to 7 is sufficient to render this page correctly.
Here's the time spent parsing fast/parser/residual-style-hang.html with different limits on my Mac Pro (Intel Xeon @ 2.26 GHz):
limit / time
5 / 379ms
6 / 494ms
7 / 630ms
8 / 770ms
9 / 930ms
10 / 1073ms
This looks like pretty much a worst-case scenario, so I'd suggest raising the limit to 10 iterations in case there are other broken sites out there.
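For anyone curious about the shape of the code being tuned, here is a minimal standalone sketch of a fixup loop capped by an iteration limit. The names and the one-pass helper are hypothetical, not WebKit's actual implementation; the point is only that total work is bounded by the limit times the per-pass cost:

    // Sketch (hypothetical names, not WebKit code): cap the number of
    // residual-style fixup passes so pathological nesting degrades to
    // incorrect rendering instead of a hang.
    #include <cstdio>

    // Stand-in for one fixup pass; returns true while work remains.
    static bool runOneFixupPass(int& remainingMisnestedLevels)
    {
        if (remainingMisnestedLevels <= 0)
            return false;
        --remainingMisnestedLevels; // pretend one level was repaired
        return remainingMisnestedLevels > 0;
    }

    int main()
    {
        const int iterationLimit = 10; // proposed value (was 5)
        int misnestedLevels = 100;     // e.g. deeply nested <b><div> pairs

        int iterations = 0;
        while (iterations < iterationLimit && runOneFixupPass(misnestedLevels))
            ++iterations;

        std::printf("stopped after %d passes, %d levels left unrepaired\n",
                    iterations, misnestedLevels);
        return 0;
    }

The roughly linear growth in the timings above (about 120-160 ms per extra iteration) is consistent with each pass costing about the same amount of work.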
Gecko manages to render this page correctly both with and without its HTML5 parser enabled.
Created attachment 59710
I don’t remember how the constant 5 was chosen. Will 10 re-introduce the hang seen in bug 34059?
See comment #1 for timing data from the hanging test case. A limit of 10 is slower than a limit of 5, but not by much; the parse time is still short enough that a beachball is unlikely.
The bug wasn’t about the test, though; it was about a hang with some real-world content.
We (Maciej, Hixie and I) chatted about this a bit on IRC. We don't have any real-world websites that were seeing the hang (although we're pretty sure some did). I believe the test case Maciej created is at or close to the worst case for the fixup algorithm, so I don't expect sites in the wild to be slower than it. I hope so, anyway :)
I don't think we have much choice; fundamentally, we have to make the limit large enough to handle any websites that people care about.
The most recent Minefield nightly with the HTML5 parser enabled handles up to 100 nested "<b><div>" instances before it starts dropping content.
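To make the pattern concrete, here is a small sketch (an assumption about the test's shape, not the actual test file) that emits the kind of deeply misnested markup being measured:

    // Sketch: generate N repetitions of the misnested "<b><div>" pattern.
    // Each <div> interrupts the open <b>, which is what forces the parser's
    // residual-style fixup to reopen formatting elements.
    #include <cstdio>

    int main()
    {
        const int depth = 100; // the depth Minefield reportedly handles
        for (int i = 0; i < depth; ++i)
            std::fputs("<b><div>", stdout);
        std::fputs("text that should still render bold\n", stdout);
        return 0;
    }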
Comment on attachment 59710
Clearing flags on attachment: 59710
Committed r62029: <http://trac.webkit.org/changeset/62029>
All reviewed patches have been landed. Closing bug.