WebKit Bugzilla
Bug 79353: Use Unicode 6.1.0 when determining whether an identifier is acceptable or not
https://bugs.webkit.org/show_bug.cgi?id=79353
Status: UNCONFIRMED
Reported by Mathias Bynens, 2012-02-23 02:48:31 PST
JavaScript currently uses an older version of the Unicode database. Here are some examples of identifiers that currently fail because of this, even though they're valid according to ES 5.1/Unicode 6.1:

* `var \u0cf1;` — http://mothereff.in/js-variables#%5Cu0cf1
* `var \ua7aa;` — http://mothereff.in/js-variables#%5Cua7aa
* `var \u1bba;` — http://mothereff.in/js-variables#%5Cu1bba
* `var a\ua674;` — http://mothereff.in/js-variables#a%5Cua674

Of course, there are many more. Updating to Unicode 6.1 would improve interoperability. Is the list of allowed characters in `IdentifierStart` and `IdentifierPart` auto-generated from a UnicodeData.txt file, or how is this done in JavaScriptCore?
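For reference, ES 5.1 §7.6 defines `IdentifierStart` as UnicodeLetter (general categories Lu, Ll, Lt, Lm, Lo, Nl) plus `$`, `_`, and escape sequences, while `IdentifierPart` additionally allows Mn, Mc, Nd, Pc, ZWNJ, and ZWJ. The sketch below is only an illustration of those spec rules using ICU, not JavaScriptCore's actual lexer; it assumes an ICU build is available, and the Unicode version it reflects is whatever the linked ICU data provides.

```cpp
// Illustrative classifier for the ES 5.1 identifier grammar (§7.6).
// Not JavaScriptCore's implementation; uses ICU's character database,
// so the effective Unicode version depends on the ICU it is built against.
#include <unicode/uchar.h>

static bool isUnicodeLetter(UChar32 c)
{
    switch (u_charType(c)) {
    case U_UPPERCASE_LETTER: // Lu
    case U_LOWERCASE_LETTER: // Ll
    case U_TITLECASE_LETTER: // Lt
    case U_MODIFIER_LETTER:  // Lm
    case U_OTHER_LETTER:     // Lo
    case U_LETTER_NUMBER:    // Nl
        return true;
    default:
        return false;
    }
}

static bool isIdentifierStart(UChar32 c)
{
    return isUnicodeLetter(c) || c == '$' || c == '_';
}

static bool isIdentifierPart(UChar32 c)
{
    if (isIdentifierStart(c))
        return true;
    switch (u_charType(c)) {
    case U_NON_SPACING_MARK:       // Mn
    case U_COMBINING_SPACING_MARK: // Mc
    case U_DECIMAL_DIGIT_NUMBER:   // Nd
    case U_CONNECTOR_PUNCTUATION:  // Pc
        return true;
    default:
        return c == 0x200C || c == 0x200D; // ZWNJ, ZWJ
    }
}
```

With an ICU containing Unicode 6.1 data, `isIdentifierStart(0x0CF1)` and `isIdentifierPart(0xA674)` return true, matching the examples above; against an older database they do not.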