Loading this url
Created an attachment (id=17706) [details]
limit to 5 connections globally
The problem is that there's no way to limit curl to a specific number of connections.
This patch solves the problem.
However, we need to do more work here, because the limit is global rather than per domain.
I suggest landing this patch, since it is useful for other cases as well (for example, pages with WYSIWYG editors, which are really heavy while this issue remains).
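For context, the core idea of a global connection cap can be sketched as follows. This is an illustrative sketch only, not the actual patch; the names (JobManager, startJob, jobFinished, kMaxConnections) are assumptions, and WebKit's real curl resource handle manager differs.

```cpp
#include <cassert>
#include <deque>
#include <string>
#include <vector>

// At most kMaxConnections jobs run at once; the rest wait their turn.
constexpr int kMaxConnections = 5;

class JobManager {
public:
    // Returns true if the job starts immediately, false if it is queued.
    bool startJob(const std::string& url) {
        if (m_runningJobs < kMaxConnections) {
            ++m_runningJobs;
            m_active.push_back(url);
            return true;
        }
        m_pending.push_back(url);
        return false;
    }

    // Called when a transfer completes; promotes the oldest pending job.
    void jobFinished() {
        --m_runningJobs;
        if (!m_pending.empty()) {
            startJob(m_pending.front());
            m_pending.pop_front();
        }
    }

    int runningJobs() const { return m_runningJobs; }
    size_t pendingJobs() const { return m_pending.size(); }

private:
    int m_runningJobs = 0;
    std::deque<std::string> m_pending; // FIFO: oldest request promoted first
    std::vector<std::string> m_active;
};
```

Submitting more than five jobs leaves the overflow queued; each completion then pulls one pending job in, so the cap holds without dropping requests.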
Created an attachment (id=17713) [details]
I tested the patch by browsing several websites that previously had this problem due to poor cURL connection management.
The new patch includes the ChangeLog. I also added a FIXME to create a separate stack of jobs per domain.
Created an attachment (id=17714) [details]
forgot header file
Created an attachment (id=17789) [details]
put jobs in the right queue
With a plain list, requests were serviced in the wrong order.
A queue class has now been added to handle this.
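The ordering fix can be illustrated with a minimal FIFO queue sketch (hypothetical names, not the class from the patch): popping from the front services the oldest request first, which a take-from-the-back list approach gets wrong.

```cpp
#include <cassert>
#include <deque>
#include <string>

// Minimal FIFO job queue: jobs come out in the order they were enqueued.
template <typename Job>
class JobQueue {
public:
    void enqueue(Job job) { m_jobs.push_back(std::move(job)); }

    // Removes and returns the oldest job (caller must check isEmpty first).
    Job dequeue() {
        Job job = std::move(m_jobs.front());
        m_jobs.pop_front();
        return job;
    }

    bool isEmpty() const { return m_jobs.empty(); }
    size_t size() const { return m_jobs.size(); }

private:
    std::deque<Job> m_jobs;
};
```

With this structure, the first resource a page requests is also the first one handed a connection, so subresources load in a sensible order.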
(From update of attachment 17789 [details])
So, this solves a lot of problems: pages load faster and fewer resources are used.
r- though due to the crasher when jobs are cancelled.
Created an attachment (id=17799) [details]
do not crash on job cancel
The previous patch crashed when a job was canceled. This one instead has the same behavior as the original code.
Notice how, by using the queue, most of the code in the manager has been collapsed and cleaned up.
A separate stack of connections per domain is still missing, but that is a separate bug.
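A sketch of the cancel path (the names here are hypothetical, not from the patch): a job canceled while still queued must be removed from the pending container, otherwise the manager later tries to start a job that no longer exists, which is the kind of crash the earlier patch hit.

```cpp
#include <algorithm>
#include <cassert>
#include <deque>
#include <string>

// Pending jobs identified by an opaque id; cancel removes a job
// before the manager ever tries to start it.
class PendingQueue {
public:
    void add(const std::string& jobId) { m_pending.push_back(jobId); }

    // Safe to call for jobs that are not pending (no-op in that case).
    void cancel(const std::string& jobId) {
        auto it = std::find(m_pending.begin(), m_pending.end(), jobId);
        if (it != m_pending.end())
            m_pending.erase(it);
    }

    size_t size() const { return m_pending.size(); }

private:
    std::deque<std::string> m_pending;
};
```

The key point is that cancellation and job promotion both go through the same container, so a canceled job can never be promoted to a running connection.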
Created an attachment (id=17804) [details]
alternative using vectors
(From update of attachment 17799 [details])
Clearing review flag in favour of the Vector-based patch.
(From update of attachment 17804 [details])
r=me with the changes to the positioning of m_runningJobs increment/decrement we discussed.
Landed in r28573.