Summary: | [cURL] Acid2 test segmentation fault | |
---|---|---|---
Product: | WebKit | Reporter: | Luca Bruno <lethalman88>
Component: | Page Loading | Assignee: | Nobody <webkit-unassigned>
Status: | RESOLVED FIXED | |
Severity: | Normal | CC: | alp, lethalman88, sandshrew
Priority: | P2 | Keywords: | Curl
Version: | 528+ (Nightly build) | |
Hardware: | PC | |
OS: | All | |
URL: | http://www.webstandards.org/files/acid2/test.html | |
Attachments: | avoid curl crash on bad url (attachment 17808), queue job cancellations (attachment 17884), Make Acid2 pass by deferring cancellation (attachment 17885) | |
Description
Luca Bruno
2007-12-09 02:40:51 PST
I realized that cURL crashes when redirecting to a Location: header containing a \r character. Is this a WebCore-related issue, or should we check for that in the curl ResourceHandleManager?

Created attachment 17808 [details]
avoid curl crash on bad url

It's a cURL-related issue: it crashes if the new URL (a redirect in this case) contains a carriage return (\r).
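(For illustration only; the attached patch itself isn't quoted in this report. Below is a minimal sketch of the kind of defensive cleanup such a fix could do before handing a redirect URL back to curl. The helper name and the example URL are hypothetical.)

```cpp
// Sketch, not the actual patch: strip CR/LF from a redirect URL before
// giving it back to curl, since a raw "\r" in the Location value is what
// triggered the crash described above.
#include <algorithm>
#include <iostream>
#include <string>

// Hypothetical helper; the real fix lives in WebCore's curl ResourceHandleManager.
static std::string sanitizeRedirectLocation(const std::string& location)
{
    std::string cleaned = location;
    cleaned.erase(std::remove_if(cleaned.begin(), cleaned.end(),
                                 [](char c) { return c == '\r' || c == '\n'; }),
                  cleaned.end());
    return cleaned;
}

int main()
{
    // Example of a Location value with a stray carriage return at the end.
    std::string location = "http://example.com/redirect-target.html\r";
    std::cout << sanitizeRedirectLocation(location) << "\n";
    return 0;
}
```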
Comment on attachment 17808 [details]
avoid curl crash on bad url
r=me (I'm assuming this is covered by the existing Acid2 layout test)
Comment on attachment 17808 [details]
avoid curl crash on bad url
Hey,
As we discussed, there's no evidence that this patch works.
Did it work a couple of weeks ago? If so, it'd be interesting to figure out when we regressed so much that this fix doesn't work any more.
Sounds like someone needs to try this patch with a build from a week or so ago and a clean build.
Created attachment 17884 [details]
queue job cancellations

Test it, it works for me: http://www.webstandards.org/files/acid2/test.html
Please check for regression on job removal.

Created attachment 17885 [details]
Make Acid2 pass by deferring cancellation
This is a simplified fix for the issue found and fixed by Luca's previous patch.
It also stops data transfer as early as possible to save resources when a job is cancelled.
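(The patch itself isn't quoted here, but as a rough sketch of the "stop data transfer as early as possible" idea: in libcurl, a write callback that returns a value different from size * nmemb makes curl abort the transfer, so checking a cancelled flag there is one way to cut a cancelled job short. The ResourceHandleJob type and m_cancelled member below are illustrative, not the real WebKit code.)

```cpp
// Sketch only: a curl write callback that aborts the transfer as soon as the
// owning job has been cancelled, so no further data is downloaded or delivered.
#include <curl/curl.h>
#include <cstddef>

struct ResourceHandleJob {        // illustrative stand-in for the real job object
    bool m_cancelled = false;
};

static size_t writeCallback(char* ptr, size_t size, size_t nmemb, void* userData)
{
    (void)ptr;                    // payload would normally be handed to the loader
    ResourceHandleJob* job = static_cast<ResourceHandleJob*>(userData);
    if (job->m_cancelled)
        return 0;                 // returning less than size * nmemb makes curl abort the transfer
    return size * nmemb;
}

int main()
{
    ResourceHandleJob job;
    CURL* handle = curl_easy_init();
    if (!handle)
        return 1;
    curl_easy_setopt(handle, CURLOPT_URL, "http://www.webstandards.org/files/acid2/test.html");
    curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, writeCallback);
    curl_easy_setopt(handle, CURLOPT_WRITEDATA, &job);
    curl_easy_perform(handle);    // in WebKit this is driven through curl_multi_perform instead
    curl_easy_cleanup(handle);
    return 0;
}
```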
Comment on attachment 17885 [details]
Make Acid2 pass by deferring cancellation
r=me
Oh, very cool patch, Alp! I saw from the patch that it could fix another bug, and in fact it did: if you clicked a link while the page was still loading, foreign data (e.g. a single image) was displayed instead of the new page, remember? That now seems fixed as well. Good job.

*** Bug 16341 has been marked as a duplicate of this bug. ***

Hi, in my tests with the patch I see that when cancel is not called in the context of curl_multi_perform (e.g. when opening a new URL on the same page), the previous URL request does not end immediately. I was thinking it might be more efficient to queue the cancel job only when cancel is called in the context of curl_multi_perform, roughly:

    downloadTimerCallback:
        m_inMultiPerform = true;
        curl_multi_perform(...);
        m_inMultiPerform = false;

    cancel:
        if (m_inMultiPerform)
            d->cancelled = true;
        else
            removeFromCurl(job);

Looks good, can you create a patch? Thanks.
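(For readers following along, here is a fuller sketch of the deferral pattern suggested in that comment, using a list of cancelled jobs rather than a per-job flag. The class shape and names are modeled on the pseudocode above, not on the patch that was eventually committed.)

```cpp
// Sketch of the deferred-cancellation idea from the comment above.
// Names are modeled on the pseudocode, not on the committed WebKit code.
#include <curl/curl.h>
#include <list>

class ResourceHandleManager {
public:
    void downloadTimerCallback();
    void cancel(CURL* job);

private:
    void removeFromCurl(CURL* job);

    CURLM* m_curlMultiHandle = curl_multi_init();
    bool m_inMultiPerform = false;
    std::list<CURL*> m_cancelledJobs;   // jobs whose removal must be deferred
};

void ResourceHandleManager::downloadTimerCallback()
{
    int runningHandles = 0;

    // While curl_multi_perform() is on the stack it is not safe to remove
    // handles from the multi handle, so remember that we are inside it.
    m_inMultiPerform = true;
    while (curl_multi_perform(m_curlMultiHandle, &runningHandles) == CURLM_CALL_MULTI_PERFORM) { }
    m_inMultiPerform = false;

    // Now it is safe to actually remove the jobs that were cancelled meanwhile.
    for (CURL* job : m_cancelledJobs)
        removeFromCurl(job);
    m_cancelledJobs.clear();
}

void ResourceHandleManager::cancel(CURL* job)
{
    if (m_inMultiPerform)
        m_cancelledJobs.push_back(job);   // defer: queue it and remove later
    else
        removeFromCurl(job);              // safe to remove immediately
}

void ResourceHandleManager::removeFromCurl(CURL* job)
{
    curl_multi_remove_handle(m_curlMultiHandle, job);
    curl_easy_cleanup(job);
}
```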