Allow robots to index bugs on Bugzilla
Created attachment 161923 [details] Patch
Hi Bill, Lucas. With the performance of the WebKit sites having improved significantly lately, I was hoping we could enable indexing of show_bug.cgi by search engines. Bugzilla's search function is not always ideal, and this would make bugs much more discoverable. Almost all bugs are referenced by commits, review requests and the webkit-unassigned list, so indexing should go quickly. One thing we could do preemptively to address performance worries is to set a crawl delay in the robots.txt file itself, limiting the rate of requests that crawlers will make.
I'm fine with allowing more crawling but do please add the crawl delay. For trac, we set a delay of 20.
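For reference, a minimal sketch of what such a robots.txt might look like, assuming the Bugzilla CGI scripts live at the site root and using the same delay as trac (the specific paths here are illustrative, not the actual patch):

```
# Allow crawlers to index individual bug pages, but throttle them.
User-agent: *
Allow: /show_bug.cgi
# Keep expensive endpoints such as searches out of the index (assumed).
Disallow: /buglist.cgi
# Crawl-delay is a non-standard directive honored by some crawlers
# (e.g. Bing, Yandex); Google ignores it.
Crawl-delay: 20
```

Note that `Crawl-delay` is not part of the original robots exclusion standard, so its effect depends on which crawler is reading the file.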
Created attachment 162490 [details] Patch
Comment on attachment 162490 [details] Patch Done.
Bill, could you please take another look? The crawl delay is now equal to trac's.
Patch looks okay to me.
Comment on attachment 162490 [details] Patch If Bill is OK with the patch, then it's LGTM.
Comment on attachment 162490 [details] Patch Clearing flags on attachment: 162490 Committed r129369: <http://trac.webkit.org/changeset/129369>
All reviewed patches have been landed. Closing bug.
Thanks Bill and Eric!
Maybe we should also fix up robots.txt to explicitly allow webkit-patch, so webkit-patch can stop cheating the system: http://trac.webkit.org/browser/trunk/Tools/Scripts/webkitpy/common/net/bugzilla/bugzilla.py#L286 :) (I also doubt it matters much.)