Firefox's XMLHttpRequest can load binary data by overriding the MIME type to use "x-user-defined": req.overrideMimeType("text/plain; charset=x-user-defined"); (http://mgran.blogspot.com/2006/08/downloading-binary-streams-with.html) WebKit's XMLHttpRequest does not support this charset, so it falls back to a default one, which causes some byte values greater than 127 to be mapped to incorrect characters (see the red boxes in the test case).
I'm wondering if this encoding is of any use outside XHR - most likely, Firefox supports it for all page loads.
Created attachment 16860 [details] proposed fix This matches the Firefox implementation, since I couldn't find the MSIE one documented anywhere (this charset is not supposed to be used on the Web, according to MSDN).
But should it actually match Firefox's implementation? I'm wondering why Firefox returns the correct value OR'd with 0xF700 for values >= 0x80? Is there a good reason for this?
So, what does IE do? Is its implementation of x-user-defined different in some way?
In IE, responseText is trimmed at null bytes, so it apparently isn't suitable for binary data regardless of its encoding.
Comment on attachment 16860 [details]
proposed fix

+        unsigned char c = bytes[i];
+        characters[i] = (c < 0x80) ? c : 0xf700 + c;

If you used "signed char" you wouldn't need to do as much math:

        signed char signedByte = bytes[i];
        characters[i] = signedByte & 0xF7FF;

+        UChar32 highBits = c & 0xffffff80;
+        if (!highBits || highBits == 0xf780)
+            bytes[resultLength++] = static_cast<char>(c);

And I'd just write this as:

        signed char signedByte = c;
        if ((signedByte & 0xF7FF) == c)
            bytes[resultLength++] = signedByte;

r=me with or without my suggested optimization
Committed revision 27145 with suggested optimizations. I guess it remains to be seen how (and whether) this change affects HTML documents...