Bug 50794 - [chromium] const char* used for strings in a few places in the WebKit API
Status: RESOLVED WONTFIX
Product: WebKit
Classification: Unclassified
Component: New Bugs
Version: 528+ (Nightly build)
Hardware: All, OS: All
Importance: P2 Normal
Assignee: James Robinson
Reported: 2010-12-09 17:58 PST by Kenneth Russell
Modified: 2013-04-11 13:00 PDT
CC: 5 users




Description Kenneth Russell 2010-12-09 17:58:47 PST
There are several entry points on WebGraphicsContext3D which take const char* to match the OpenGL API, such as bindAttribLocation, getUniformLocation, shaderSource, etc. These should be changed to take const WebString&.
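The signature change being proposed can be sketched as follows. This is a minimal, hypothetical stand-in (the real WebString wraps a UTF-16 WTF::String, and the function names below are illustrative rather than the actual WebKit API):

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for WebKit's WebString; the real class wraps WTF::String.
struct WebString {
    std::u16string data;
    size_t length() const { return data.size(); }
};

// Today the entry points mirror OpenGL, e.g.:
//   WebGLId getUniformLocation(WebGLId program, const char* name);
// The proposal is to carry length and encoding explicitly instead:
int getUniformLocation(const WebString& name) {
    // A real implementation would look the name up in the linked program;
    // this sketch only shows that the length travels with the string.
    return static_cast<int>(name.length());
}
```

The point of the change is that the callee no longer has to re-derive the length (or guess the encoding) from a bare pointer.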
Comment 1 James Robinson 2010-12-09 18:26:17 PST
Broadening the scope.  I think it'll be easier to do a change like this across all Chromium WebKit APIs at once (which I don't mind doing).

A few bits of the chromium WebKit API use const char* for string data of a known encoding rather than something like WebString/WebCString.  I think we should eliminate these and only use const char*/size_t pairs for arbitrary binary data.  Current cases:

WebKitClient uses const char* for strings in the stats counter, tracing, and histogram APIs.  It looks like the original use of const char* was inherited from the initial import into the WebKit repo, and then more callers followed suit.  The callers to this API typically pass in string literals, so WebCString seems like a good fit, especially as strlen() will be a compile-time constant for these callers.
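A rough sketch of what the WebCString-based shape might look like, assuming a WebCString-like type that is implicitly constructible from a literal (names and signatures here are illustrative, not the real WebKitClient API):

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for WebKit's WebCString (a ref-counted 8-bit string).
struct WebCString {
    std::string data;
    // For a string-literal argument, the strlen() implied by this conversion
    // is a compile-time constant, so call sites pay essentially nothing.
    WebCString(const char* s) : data(s) {}
    size_t length() const { return data.size(); }
};

// Sketch of a histogram-style entry point after the change; callers that pass
// string literals convert implicitly, so call sites need not change.
size_t recordHistogramName(const WebCString& name) {
    return name.length();  // length is carried, not recomputed via strlen()
}
```

Because the conversion from a literal is implicit, existing `recordHistogramName("Some.Counter")`-style callers would keep compiling unchanged.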

WebGraphicsContext3D uses const char* to mirror OpenGL APIs.  In this case the input is sometimes a string literal and sometimes a programmatically computed string (possibly generated from JavaScript), but the strings always travel through a WTF::String before reaching WebGraphicsContext3D.  It looks like callers compute this string by calling str.utf8().data(), but the implementation then uses strlen() on the string, assuming only ASCII data.  I'm pretty sure that's harmless in this case but it is a bit odd.  The typical use case here is ASCII, so my inclination is to define the API as WebCString and make sure that callers convert from WTF::String to ascii().
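The oddness described above can be demonstrated in isolation. This sketch (function name is illustrative) shows the length the implementation recovers with strlen() after the caller has already had the true length in hand; the two agree only while the bytes contain no embedded NULs:

```cpp
#include <cassert>
#include <cstring>
#include <string>

// Models the pattern: the caller holds a length-carrying string (here a
// std::string standing in for the result of str.utf8()), passes only .data(),
// and the implementation recomputes the length with strlen().
size_t lengthSeenByImplementation(const std::string& utf8Bytes) {
    const char* p = utf8Bytes.data();  // what callers pass today
    return std::strlen(p);             // what the implementation recomputes
}
```

For ordinary ASCII shader and attribute names the recomputed length matches, which is why the current code is harmless in practice; a length-carrying type would make that invariant structural rather than incidental.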

My plan of attack:

1.) Add implementations for affected functions in the chromium repo that accept WebCString alongside the implementations that accept const char*.  The new implementations will be (temporarily) unreachable but should compile fine alongside the old ones since they differ in signature.
2.) Wait for rolls
3.) Switch the WebKit APIs over to use the new types
4.) Wait for rolls pt 2
5.) Remove the const char* implementations from Chromium.
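Step 1 works because the two implementations overload rather than collide. A minimal sketch of that coexistence, with hypothetical stand-in types and return values used only to tell the overloads apart:

```cpp
#include <cassert>
#include <string>

// Hypothetical stand-in for WebKit's WebCString.
struct WebCString {
    std::string data;
};

// During the transition, both entry points compile side by side because they
// differ in signature; the WebCString one is simply unreachable until the
// WebKit side switches over (steps 2-3), after which the const char* one
// can be deleted (steps 4-5).
struct GraphicsContextImpl {
    int shaderSource(const char*)       { return 1; }  // legacy entry point
    int shaderSource(const WebCString&) { return 2; }  // new entry point
};
```

Overload resolution picks the legacy entry point for raw pointers and the new one for WebCString arguments, so each repository can be updated independently between rolls.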

Darin, as arbiter of the chromium WebKit API, what do you think of all the above?  Should we also add a lint rule for const char* without size_t?