Bug 166413
| Summary: | Math.min changes behavior with negative zero when tiering up | | |
|---|---|---|---|
| Product: | WebKit | Reporter: | Keith Miller <keith_miller> |
| Component: | JavaScriptCore | Assignee: | Nobody <webkit-unassigned> |
| Status: | NEW | | |
| Severity: | Normal | CC: | webkit-bug-importer |
| Priority: | P2 | Keywords: | InRadar |
| Version: | WebKit Nightly Build | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
Keith Miller
It looks like the code converts Math.min(a, b) into a < b ? a : b. This does not work for -0.0 and 0.0: the comparison treats the two zeros as equal, so the ternary returns the second operand, but Math.min(-0.0, 0.0) must return -0.0.
Test case:
    // test() throws unless value is -0.0 (Infinity / -0.0 === -Infinity).
    function test(value, iter) {
        if (Infinity / value !== -Infinity)
            throw new Error(iter);
    }
    noInline(test);

    function foo(a, b, iter) {
        test(Math.min(a, b), iter);
    }
    noInline(foo);

    // Iterate enough times for foo() to tier up to the optimizing JIT.
    for (let i = 0; i < 10000; i++) {
        foo(-0.0, 0.0, i);
    }
Radar WebKit Bug Importer
<rdar://problem/29783164>