When a Shockwave movie loads an external text resource over HTTP, the resource is delivered with the gzip
content-encoding, and the server sends the Content-Length header, the download always fails
for some reason. If the server omits the Content-Length header and specifies
"Transfer-Encoding: chunked" instead, the download succeeds.
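For illustration, the two delivery variants described above look roughly like this (the Content-Length value is a hypothetical example):

```
Fails (gzip + explicit Content-Length):
  HTTP/1.1 200 OK
  Content-Type: text/plain
  Content-Encoding: gzip
  Content-Length: 1304

Succeeds (gzip + chunked transfer, no Content-Length):
  HTTP/1.1 200 OK
  Content-Type: text/plain
  Content-Encoding: gzip
  Transfer-Encoding: chunked
```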
Safari seems to misreport the file size to the plugin in the failing case: for some reason the file
size is reported as being larger than the gzipped version (correct) but smaller than the decompressed
resource. It seems that if the plugin asks for details of the file while the file is still being decompressed,
the decompression is somehow interrupted and the length of the partially decompressed content is
reported instead.
Clearly there's some functional difference in how the content is delivered to plugins depending on the
headers the server sends.
Steps to Reproduce:
See http://www.kotska.com/swtest/ for a test case.

Actual Results:
The file is downloaded but never delivered to the plugin.

Expected Results:
The file should have been passed on to the plugin.
Tried this using both Safari 2.0.1 and the latest CVS version (AppleWebKit/420+ Safari/412.5), and the test
fails with both. Effectively this means any site using Shockwave has to disable gzip compression for
compatibility with Safari, causing textual content to require roughly 8-10x the bandwidth
needed with compression.
Good test case
The testcase fails the same way in Firefox (1.0 and 1.5), both on OS X and Windows.
The Firefox bug is reported here:
FYI: Internet Explorer (Mac and Windows) passes the test, and since the plugin running within those browsers is (AFAIK) exactly the same code on a given platform, there must be some difference in the browser implementations.
We have 100k+ Mac users on our site every month, and as a result of this bug most of them have to download roughly 300 kB of extra, unnecessarily uncompressed data on every visit.
Created attachment 6922 [details]
Can we make automated tests for such bugs?
Comment on attachment 6922 [details]
This patch is not quite right. The content encoding might be "identity", so just the presence of a content encoding header field is not the correct check. Also, buggy servers sometimes send strings like "identity, identity".
It's hard to code this correctly at the WebKit level and stay in sync with the features of the NSURL level; ideally we would have a Foundation method like expectedContentLength, but one that knew to give us the length only for things it won't decode. In general we don't want code that has to second-guess what NSURL is doing for us.
Current versions of NSURLConnection support "deflate" and "gzip"; anything else is treated as "identity". The comparison is done in a case insensitive manner. NSURLConnection knows how to parse off an optional "x-" prefix. It knows how to ignore parameters that come after an optional ";" separator. It knows how to ignore whitespace. It parses a list separated by commas; if all the items in the list are the same (ignoring case, whitespace, "x-" prefixes, and parameters) then it uses it, otherwise it treats it as "identity".
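The normalization rule described above can be sketched as follows. This is an illustrative Python rendering of the described behavior, not NSURLConnection's actual code, and the helper name is made up:

```python
def effective_content_encoding(header_value: str) -> str:
    """Sketch of the rule described above: each comma-separated coding is
    stripped of whitespace, any ";parameter" suffix, case differences, and
    an optional "x-" prefix. If all codings in the list agree and name a
    supported scheme ("gzip" or "deflate"), that coding applies; anything
    else is treated as "identity"."""
    codings = []
    for item in header_value.split(","):
        # Drop parameters after ";", surrounding whitespace, and case.
        coding = item.split(";", 1)[0].strip().lower()
        # Parse off an optional "x-" prefix.
        if coding.startswith("x-"):
            coding = coding[2:]
        codings.append(coding)
    # All items must agree after normalization, and must be a known scheme.
    if codings and all(c == codings[0] for c in codings):
        if codings[0] in ("gzip", "deflate"):
            return codings[0]
    return "identity"
```

Under this rule, a buggy "identity, identity" header and a mixed "gzip, deflate" list both come out as "identity", while "x-GZIP" and "gzip;q=1.0" both normalize to "gzip".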
Hard-coding this precise rule into WebBaseNetscapePluginStream would be kinda lame, but I don't have a better idea.
<rdar://4470599> for NSURLConnection.
Actually, I think that expectedContentLength itself should only give us the length for things it won't decode, so I'm going to just wait for the resolution of the above Radar.
That said, I still don't really understand why a simplified rule like the one used in the patch is much worse for deciding what to pass to the plugin. Also, servers SHOULD NOT send Content-Encoding: identity, according to RFC 2616.
Created attachment 15938 [details]
As mentioned above, I think we do not need to match NSURLConnection logic particularly well.
Comment on attachment 15938 [details]
OK, at this point it seems smart to work around the NSURL bug. r=me
Committed revision 25024.