<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!DOCTYPE bugzilla SYSTEM "https://bugs.webkit.org/page.cgi?id=bugzilla.dtd">

<bugzilla version="5.0.4.1"
          urlbase="https://bugs.webkit.org/"
          maintainer="admin@webkit.org"
>

    <bug>
          <bug_id>141290</bug_id>
          <creation_ts>2015-02-05 09:25:54 -0800</creation_ts>
          <short_desc>[ARM][Linux] GC sometimes stuck in an infinite loop if parallel GC is enabled</short_desc>
          <delta_ts>2015-03-12 07:51:07 -0700</delta_ts>
          <reporter_accessible>1</reporter_accessible>
          <cclist_accessible>1</cclist_accessible>
          <classification_id>1</classification_id>
          <classification>Unclassified</classification>
          <product>WebKit</product>
          <component>New Bugs</component>
          <version>528+ (Nightly build)</version>
          <rep_platform>Unspecified</rep_platform>
          <op_sys>Unspecified</op_sys>
          <bug_status>RESOLVED</bug_status>
          <resolution>FIXED</resolution>
          <bug_file_loc></bug_file_loc>
          <status_whiteboard></status_whiteboard>
          <keywords></keywords>
          <priority>P1</priority>
          <bug_severity>Critical</bug_severity>
          <target_milestone>---</target_milestone>
          <dependson>142513</dependson>
          <blocked>108645</blocked>
          <blocked>90568</blocked>
          <blocked>130177</blocked>
          <everconfirmed>1</everconfirmed>
          <reporter name="Csaba Osztrogonác">ossy</reporter>
          <assigned_to name="Csaba Osztrogonác">ossy</assigned_to>
          <cc>benjamin</cc>
          <cc>cgarcia</cc>
          <cc>clopez</cc>
          <cc>cmarcelo</cc>
          <cc>commit-queue</cc>
          <cc>ggaren</cc>
          <cc>gustavo</cc>
          <cc>gyuyoung.kim</cc>
          <cc>mark.lam</cc>
          <cc>mrobinson</cc>
          <cc>ossy</cc>
          <cc>ryuan.choi</cc>
          <cc>tobias.netzel</cc>
          <cc>zan</cc>
          <comment_sort_order>oldest_to_newest</comment_sort_order>
          <long_desc isprivate="0" >
    <commentid>1067105</commentid>
    <comment_count>0</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-05 09:25:54 -0800</bug_when>
    <thetext>GTK bot: https://build.webkit.org/builders/GTK%20Linux%20ARM%20Release?numbuilds=200
EFL bot: http://build.webkit.sed.hu/builders/EFL%20ARMv7%20Linux%20Release%20%28Build%29?numbuilds=200

Two to four tests have regularly failed with timeouts on the ARM Thumb2 Linux bots
for a long time. Unfortunately there was a long period when ~200 tests failed on the
ARM Thumb2 EFL bot, which is why it wasn&apos;t easy to find out what happened and when.

But fortunately it is quite easy to reproduce the bug on
- jsc-layout-tests.yaml/js/script-tests/basic-set.js 
- jsc-layout-tests.yaml/js/script-tests/basic-map.js

They fail - stuck in an infinite loop - roughly once in every 10-15 runs.
GDB showed that the hang always happens somewhere in the GC code.

I checked the bot history thoroughly too and found that this kind of timeout
problem started when the EFL port enabled parallel GC - http://trac.webkit.org/changeset/165527

And I can confirm that the problem goes away with parallel GC disabled.
(There is no problem on traditional ARM, because it doesn&apos;t support
weakCompareAndSwap yet (bug127679), so parallel GC is still disabled there.)</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1067106</commentid>
    <comment_count>1</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-05 09:28:03 -0800</bug_when>
    <thetext>I think we should disable parallel GC on ARM Linux Thumb2 
platforms until we find and fix the real bug.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1067108</commentid>
    <comment_count>2</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-05 09:31:27 -0800</bug_when>
    <thetext>Just a note: I disabled parallel GC on EFL ARM Thumb2 bots locally
to see more (at least 8-10) test runs before disabling it in trunk.

- http://build.webkit.sed.hu/builders/EFL%20ARMv7%20Linux%20Release%20%28Build%29?numbuilds=200
(disabled since #8180)
- http://build.webkit.sed.hu/builders/EFL%20ARMv7-Thumb2%20Linux%20Release%20%28Build%29%201404?numbuilds=200
(disabled since #269)</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1067327</commentid>
    <comment_count>3</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-06 02:00:59 -0800</bug_when>
    <thetext>notes:
- EFL enabled parallel GC in http://trac.webkit.org/changeset/165527
- GTK enabled parallel GC in http://trac.webkit.org/changeset/121869</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1067330</commentid>
    <comment_count>4</comment_count>
      <attachid>246156</attachid>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-06 02:07:27 -0800</bug_when>
    <thetext>Created attachment 246156
Patch

I propose disabling parallel GC temporarily on Linux/Thumb2 until we can fix the bug in GC.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1067649</commentid>
    <comment_count>5</comment_count>
      <attachid>246156</attachid>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-07 23:54:27 -0800</bug_when>
    <thetext>Comment on attachment 246156
Patch

Clearing flags on attachment: 246156

Committed r179795: &lt;http://trac.webkit.org/changeset/179795&gt;</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1068677</commentid>
    <comment_count>6</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-02-12 11:21:43 -0800</bug_when>
    <thetext>Now that parallel GC is disabled, who is going to investigate the failure, so we can re-enable it?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1068681</commentid>
    <comment_count>7</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-12 11:34:35 -0800</bug_when>
    <thetext>(In reply to comment #6)
&gt; Now that parallel GC is disabled, who is going to investigate the failure,
&gt; so we can re-enable it?

The bug is assigned to me, so I&apos;m going to investigate the failure
in the near future and will re-enable parallel GC once this bug is fixed.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1069013</commentid>
    <comment_count>8</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-13 13:04:04 -0800</bug_when>
    <thetext>It might be interesting to know that in my PowerPC fork of WebKit I&apos;m experiencing similar (if not identical) problems on multicore systems.
Important similarities between my fork and the ARM Linux ports seem to be (I don&apos;t know the Linux ports and I might be wrong in my assumptions):
- similar level of out-of-order execution (and generally both are RISC architectures)
- threading API is pthreads exclusively
- built using GCC
Maybe that gives a hint.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070207</commentid>
    <comment_count>9</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-18 13:35:49 -0800</bug_when>
    <thetext>Maybe in CodeBlock::visitAggregate(), after successfully passing
&quot;while (!WTF::weakCompareAndSwap(&amp;m_visitAggregateHasBeenCalled, 0, 1));&quot;,
a call to WTF::memoryBarrierAfterLock() is missing.
And in all places where m_visitAggregateHasBeenCalled is set to false again, a call to WTF::memoryBarrierBeforeUnlock() might be necessary.</thetext>
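[Editor's note] The pattern under discussion - a one-shot flag acquired with a CAS loop and cleared later - can be sketched as follows. This is a hedged Python model with invented names (VisitFlag, compare_and_swap stand in for the JSC C++ and WTF::weakCompareAndSwap); the lock inside compare_and_swap plays the role of the atomicity plus the acquire/release barriers (memoryBarrierAfterLock / memoryBarrierBeforeUnlock) being debated:

```python
import threading

# Illustrative model, not JSC code: a one-shot "visited" flag of the kind
# guarded by weakCompareAndSwap in CodeBlock::visitAggregate(). Acquiring
# the flag must act like taking a lock (barrier after lock), and clearing
# it must act like releasing one (barrier before unlock).
class VisitFlag:
    def __init__(self):
        self._lock = threading.Lock()
        self._called = 0

    def compare_and_swap(self, expected, new_value):
        # Atomically: if the flag equals 'expected', store 'new_value'.
        with self._lock:
            if self._called == expected:
                self._called = new_value
                return True
            return False

    def try_acquire(self):
        # Only the first caller per GC cycle wins the flag.
        return self.compare_and_swap(0, 1)

    def reset(self):
        # Clearing the flag; in the C++ discussion this store is where a
        # release barrier (memoryBarrierBeforeUnlock) would be required.
        self.compare_and_swap(1, 0)

visits = []
flag = VisitFlag()

def visit_once():
    if flag.try_acquire():
        visits.append(1)  # only one thread gets here per cycle

threads = [threading.Thread(target=visit_once) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert len(visits) == 1    # exactly one acquisition succeeded
flag.reset()
assert flag.try_acquire()  # after reset the flag can be taken again
```

The model only illustrates the protocol; in real C++ the atomicity and ordering must come from the CAS primitive and explicit fences, which is precisely the gap this comment suspects.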
  </long_desc><long_desc isprivate="0" >
    <commentid>1070252</commentid>
    <comment_count>10</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-02-18 15:37:44 -0800</bug_when>
    <thetext>Does the hanging backtrace indicate a hang while marking a CodeBlock? Can someone post an example hang backtrace of all threads?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070360</commentid>
    <comment_count>11</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-18 23:10:39 -0800</bug_when>
    <thetext>Here is what I&apos;ve got (it&apos;s from a third party, so it&apos;s just an excerpt):

  Thread id:      bf325f0
  User stack:
    130 ??? [0xb2fc]
      130 _NSApplicationMain + 444 (in AppKit) [0x92b8229c]
        130 -[NSApplication run] + 748 (in AppKit) [0x92bb18a0]
          130 ??? [0x18c6c]
            130 -[NSApplication nextEventMatchingMask:untilDate:inMode:dequeue:] + 116 (in AppKit) [0x92bb7c00]
              130 __DPSNextEvent + 600 (in AppKit) [0x92bb8248]
                130 _BlockUntilNextEventMatchingListInMode + 88 (in HIToolbox) [0x9020377c]
                  130 _ReceiveNextEventCommon + 264 (in HIToolbox) [0x902038a4]
                    130 _RunCurrentEventLoopInMode + 268 (in HIToolbox) [0x90203b18]
                      130 _CFRunLoopRunSpecific + 2972 (in CoreFoundation) [0x9229081c]
                        130 __ZN3JSC9HeapTimer12timerDidFireEP16__CFRunLoopTimerPv + 224 (in JavaScriptCore) [0x7fadf0]
                          130 __ZN3JSC18GCActivityCallback6doWorkEv + 184 (in JavaScriptCore) [0x7bead8]
                            130 __ZN3JSC4Heap7collectENS_13HeapOperationE + 676 (in JavaScriptCore) [0x7f80c4]
                              130 __ZN3JSC4Heap9markRootsEd + 828 (in JavaScriptCore) [0x7f6b9c]
                                130 __ZN3JSC11SlotVisitor15drainFromSharedENS0_15SharedDrainModeE + 536 (in JavaScriptCore) [0xaaadc8]
                                  130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      a2dcaf8
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTF25TCMalloc_Central_FreeList14FetchFromSpansEv + 0 (in JavaScriptCore) [0xb360b0]
        130 __ZN3WTF17TCMalloc_PageHeap15scavengerThreadEv + 136 (in JavaScriptCore) [0xb36048]
          130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      a646d40
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN7WebCore12IconDatabase22iconDatabaseSyncThreadEv + 564 (in WebCore) [0x2dbd3e4]
          130 __ZN7WebCore12IconDatabase18syncThreadMainLoopEv + 524 (in WebCore) [0x2dbd0bc]
            130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      a9bdd40
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __Z22CFURLCacheWorkerThreadPv + 292 (in CFNetwork) [0x96062394]
        130 _CFRunLoopRunSpecific + 1816 (in CoreFoundation) [0x92290398]
          130 _mach_msg_trap + 8 (in libSystem.B.dylib) [0x93f00078]

  Thread id:      bf31000
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3WTFL16compatEntryPointEPv + 44 (in JavaScriptCore) [0xb61edc]
          130 ??? [0x324a4]
            130 ??? [0x32518]
              130 _CFRunLoopRunSpecific + 1816 (in CoreFoundation) [0x92290398]
                130 _mach_msg_trap + 8 (in libSystem.B.dylib) [0x93f00078]

  Thread id:      bf33838
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3JSC14BlockAllocator22blockFreeingThreadMainEv + 260 (in JavaScriptCore) [0x753074]
          130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      9bf20e8
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3JSC8GCThread12gcThreadMainEv + 140 (in JavaScriptCore) [0x7ea6ec]
          130 __ZN3JSC11SlotVisitor15drainFromSharedENS0_15SharedDrainModeE + 432 (in JavaScriptCore) [0xaaad60]
            130 __ZN3JSC11SlotVisitor5drainEv + 944 (in JavaScriptCore) [0xaaa520]
              77 __ZN3JSC13JSFinalObject13visitChildrenEPNS_6JSCellERNS_11SlotVisitorE + 5232 (in JavaScriptCore) [0x8d7890]
              29 __ZN3JSC13JSFinalObject13visitChildrenEPNS_6JSCellERNS_11SlotVisitorE + 5256 (in JavaScriptCore) [0x8d78a8]
              20 __ZN3JSC13JSFinalObject13visitChildrenEPNS_6JSCellERNS_11SlotVisitorE + 5240 (in JavaScriptCore) [0x8d7898]
              3 __ZN3JSC13JSFinalObject13visitChildrenEPNS_6JSCellERNS_11SlotVisitorE + 5248 (in JavaScriptCore) [0x8d78a0]
              1 __ZN3JSC13JSFinalObject13visitChildrenEPNS_6JSCellERNS_11SlotVisitorE + 5244 (in JavaScriptCore) [0x8d789c]

  Thread id:      b0fa248
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3JSC8GCThread12gcThreadMainEv + 140 (in JavaScriptCore) [0x7ea6ec]
          130 __ZN3JSC11SlotVisitor15drainFromSharedENS0_15SharedDrainModeE + 336 (in JavaScriptCore) [0xaaad00]
            130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      bf31750
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3JSC8GCThread12gcThreadMainEv + 140 (in JavaScriptCore) [0x7ea6ec]
          130 __ZN3JSC11SlotVisitor15drainFromSharedENS0_15SharedDrainModeE + 336 (in JavaScriptCore) [0xaaad00]
            130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      987c000
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 ___NSThread__main__ + 1008 (in Foundation) [0x951e8d88]
        130 +[NSURLConnection(NSURLConnectionReallyInternal) _resourceLoadLoop:] + 284 (in Foundation) [0x9523fd54]
          130 _CFRunLoopRunSpecific + 1816 (in CoreFoundation) [0x92290398]
            130 _mach_msg_trap + 8 (in libSystem.B.dylib) [0x93f00078]

  Thread id:      9bf0af8
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3WTFL16compatEntryPointEPv + 44 (in JavaScriptCore) [0xb61edc]
          130 ??? [0x324a4]
            130 ??? [0x32518]
              130 _CFRunLoopRunSpecific + 1816 (in CoreFoundation) [0x92290398]
                130 _mach_msg_trap + 8 (in libSystem.B.dylib) [0x93f00078]

  Thread id:      99260e8
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN9CAPThread5EntryEPS_ + 108 (in CoreAudio) [0x927a9de8]
        130 __ZN10HALRunLoop9OwnThreadEPv + 216 (in CoreAudio) [0x927a9fac]
          130 _CFRunLoopRunSpecific + 1816 (in CoreFoundation) [0x92290398]
            130 _mach_msg_trap + 8 (in libSystem.B.dylib) [0x93f00078]

  Thread id:      c0ac5f0
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN7WebCore13StorageThread16threadEntryPointEv + 300 (in WebCore) [0x317b0cc]
          130 __ZN3WTF15ThreadCondition9timedWaitERNS_5MutexEd + 92 (in JavaScriptCore) [0xb6245c]
            130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      c0ad490
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN7WebCore13StorageThread16threadEntryPointEv + 300 (in WebCore) [0x317b0cc]
          130 __ZN3WTF15ThreadCondition9timedWaitERNS_5MutexEd + 92 (in JavaScriptCore) [0xb6245c]
            130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      bf32248
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __ZN3WTFL19wtfThreadEntryPointEPv + 44 (in JavaScriptCore) [0xb6239c]
        130 __ZN3WTFL16compatEntryPointEPv + 44 (in JavaScriptCore) [0xb61edc]
          130 ??? [0x49a1c]
            130 ??? [0x49b0c]
              130 ??? [0x1d5d34]
                130 __ZN3WTF15ThreadCondition9timedWaitERNS_5MutexEd + 92 (in JavaScriptCore) [0xb6245c]
                  130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      91b13a8
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 _select$DARWIN_EXTSN + 12 (in libSystem.B.dylib) [0x93f63c14]

  Thread id:      c26d5f0
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __Z11CMMConvTaskPv + 52 (in ColorSync) [0x9347fbb0]
        130 __Z20pthreadSemaphoreWaitP18t_pthreadSemaphore + 40 (in ColorSync) [0x9346c00c]
          130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      c26e490
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __Z11CMMConvTaskPv + 52 (in ColorSync) [0x9347fbb0]
        130 __Z20pthreadSemaphoreWaitP18t_pthreadSemaphore + 40 (in ColorSync) [0x9346c00c]
          130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]

  Thread id:      c0ab3a8
  User stack:
    130 __pthread_start + 320 (in libSystem.B.dylib) [0x93f41f74]
      130 __Z11CMMConvTaskPv + 52 (in ColorSync) [0x9347fbb0]
        130 __Z20pthreadSemaphoreWaitP18t_pthreadSemaphore + 40 (in ColorSync) [0x9346c00c]
          130 ___semwait_signal + 12 (in libSystem.B.dylib) [0x93f06a8c]</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070382</commentid>
    <comment_count>12</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-19 03:42:33 -0800</bug_when>
    <thetext>Keep in mind that since the hang backtrace is from my fork it may or may not be related to this bug.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070383</commentid>
    <comment_count>13</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-19 03:47:45 -0800</bug_when>
    <thetext>I also forgot to mention that my fork is currently based on the safari-600-5 branch.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070384</commentid>
    <comment_count>14</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-19 03:49:38 -0800</bug_when>
    <thetext>(In reply to comment #10)
&gt; Does the hanging backtrace indicate a hang while marking a CodeBlock? Can
&gt; someone post an example hang backtrace of all threads?

Sure, I&apos;ll post some ARM Linux backtraces later today.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070409</commentid>
    <comment_count>15</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-19 08:13:53 -0800</bug_when>
    <thetext>(In reply to comment #14)
&gt; (In reply to comment #10)
&gt; &gt; Does the hanging backtrace indicate a hang while marking a CodeBlock? Can
&gt; &gt; someone post an example hang backtrace of all threads?
&gt; 
&gt; Sure, I&apos;ll post some ARM Linux backtraces later today.

$ jsc standalone-pre.js basic-map.js standalone-post.js

This test normally runs in 1-2 seconds. After half a minute I attached
to the process with gdb and dumped the backtraces of all threads. I then
continued and stopped again several times, and I always got the same backtrace.

Thread 3 (Thread 0xb4bbb460 (LWP 14564)):
#0  0xb6f41384 in __libc_do_syscall () from /lib/arm-linux-gnueabihf/libpthread.so.0
#1  0xb6f3cff6 in pthread_cond_wait@@GLIBC_2.4 () from /lib/arm-linux-gnueabihf/libpthread.so.0
#2  0xb6374a30 in std::condition_variable::wait(std::unique_lock&lt;std::mutex&gt;&amp;) ()
    at /home/rgabor/new_toolchain/compiler/.build/arm-szeged-linux-gnueabi/build/build-cc-final/arm-szeged-linux-gnueabi/libstdc++-v3/include/arm-szeged-linux-gnueabi/bits/gthr-default.h:864
#3  0xb6c94150 in JSC::BlockAllocator::blockFreeingThreadMain() () at ../../Source/JavaScriptCore/heap/BlockAllocator.cpp:144
#4  0xb6eaafd6 in std::_Function_handler&lt;void (), WTF::createThread(void (*)(void*), void*, char const*)::{lambda()#1}&gt;::_M_invoke(std::_Any_data const&amp;) () at ../../Source/WTF/wtf/Threading.cpp:81
#5  0xb6eab086 in WTF::threadEntryPoint(void*) () at /mnt/store/ARM/toolchain/hardfp/arm-szeged-linux-gnueabi-4.8.3-thumb2/arm-szeged-linux-gnueabi/include/c++/4.8.3/functional:2471
#6  0xb6ec4aa2 in WTF::wtfThreadEntryPoint(void*) () at ../../Source/WTF/wtf/ThreadingPthreads.cpp:170
#7  0xb6f39ed2 in start_thread () from /lib/arm-linux-gnueabihf/libpthread.so.0
#8  0xb6225338 in ?? () from /lib/arm-linux-gnueabihf/libc.so.6
#9  0xb6225338 in ?? () from /lib/arm-linux-gnueabihf/libc.so.6
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 2 (Thread 0xb1fff460 (LWP 14565)):
#0  JSC::CodeBlock::stronglyVisitStrongReferences(JSC::SlotVisitor&amp;) () at ../../Source/WTF/wtf/Atomics.h:317
#1  0xb6b51f02 in JSC::CodeBlock::visitAggregate(JSC::SlotVisitor&amp;) () at ../../Source/JavaScriptCore/bytecode/CodeBlock.cpp:2261
#2  0xb6dc96e2 in JSC::EvalExecutable::visitChildren(JSC::JSCell*, JSC::SlotVisitor&amp;) () at ../../Source/JavaScriptCore/runtime/Executable.cpp:437
#3  0xb6ca7ed2 in JSC::SlotVisitor::drain() () at ../../Source/JavaScriptCore/heap/SlotVisitor.cpp:104
#4  0xb6ca80fa in JSC::SlotVisitor::drainFromShared(JSC::SlotVisitor::SharedDrainMode) () at ../../Source/JavaScriptCore/heap/SlotVisitor.cpp:233
#5  0xb6c9a9cc in JSC::GCThread::gcThreadMain() () at ../../Source/JavaScriptCore/heap/GCThread.cpp:102
#6  0xb6eaafd6 in std::_Function_handler&lt;void (), WTF::createThread(void (*)(void*), void*, char const*)::{lambda()#1}&gt;::_M_invoke(std::_Any_data const&amp;) () at ../../Source/WTF/wtf/Threading.cpp:81
#7  0xb6eab086 in WTF::threadEntryPoint(void*) () at /mnt/store/ARM/toolchain/hardfp/arm-szeged-linux-gnueabi-4.8.3-thumb2/arm-szeged-linux-gnueabi/include/c++/4.8.3/functional:2471
#8  0xb6ec4aa2 in WTF::wtfThreadEntryPoint(void*) () at ../../Source/WTF/wtf/ThreadingPthreads.cpp:170
#9  0xb6f39ed2 in start_thread () from /lib/arm-linux-gnueabihf/libpthread.so.0
#10 0xb6225338 in ?? () from /lib/arm-linux-gnueabihf/libc.so.6
#11 0xb6225338 in ?? () from /lib/arm-linux-gnueabihf/libc.so.6
Backtrace stopped: previous frame identical to this frame (corrupt stack?)

Thread 1 (Thread 0xb4c57000 (LWP 14563)):
#0  0xb6f41384 in __libc_do_syscall () from /lib/arm-linux-gnueabihf/libpthread.so.0
#1  0xb6f3cff6 in pthread_cond_wait@@GLIBC_2.4 () from /lib/arm-linux-gnueabihf/libpthread.so.0
#2  0xb6374a30 in std::condition_variable::wait(std::unique_lock&lt;std::mutex&gt;&amp;) ()
    at /home/rgabor/new_toolchain/compiler/.build/arm-szeged-linux-gnueabi/build/build-cc-final/arm-szeged-linux-gnueabi/libstdc++-v3/include/arm-szeged-linux-gnueabi/bits/gthr-default.h:864
#3  0xb6ca8124 in JSC::SlotVisitor::drainFromShared(JSC::SlotVisitor::SharedDrainMode) () at ../../Source/JavaScriptCore/heap/SlotVisitor.cpp:212
#4  0xb6c9ce32 in JSC::Heap::markRoots(double) () at ../../Source/JavaScriptCore/heap/Heap.cpp:553
#5  0xb6ca06ca in JSC::Heap::collect(JSC::HeapOperation) () at ../../Source/JavaScriptCore/heap/Heap.cpp:1039
#6  0xb6ca0860 in JSC::Heap::collectAllGarbage() () at ../../Source/JavaScriptCore/heap/Heap.cpp:985
#7  0x0000e9a8 in functionGCAndSweep(JSC::ExecState*) ()
#8  0xb40f9aa8 in ?? ()
#9  0xb40f9aa8 in ?? ()
Backtrace stopped: previous frame identical to this frame (corrupt stack?)</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070410</commentid>
    <comment_count>16</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-02-19 08:15:36 -0800</bug_when>
    <thetext>(Of course, I tried this on ToT (r180345) after re-enabling parallel GC.)</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070430</commentid>
    <comment_count>17</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-19 10:14:17 -0800</bug_when>
    <thetext>The backtraces seem to show the same situation to me.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070492</commentid>
    <comment_count>18</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-02-19 13:00:36 -0800</bug_when>
    <thetext>The backtrace in comment 11 does not show a CodeBlock, but the backtrace in comment 15 does.

The backtrace in comment 15 states that CodeBlock::visitAggregate made progress past the weakCompareAndSwap loop, which seems to contradict the missing fence theory.

If we can reproduce the backtrace in comment 15, can we add some logging, or just manual stepping, to show exactly how we end up looping? For example, are we repeatedly calling visit on the same CodeBlock? CodeBlock.cpp:2261 is not inside any loop, so it&apos;s not clear why we&apos;re looping.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1070498</commentid>
    <comment_count>19</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-19 13:28:07 -0800</bug_when>
    <thetext>I think it&apos;s the loop in SlotVisitor::drainFromShared(), where SlotVisitor::drain() is called inside a loop.

Regarding the fence: the call to WTF::memoryBarrierBeforeUnlock() before setting m_visitAggregateHasBeenCalled to false might be needed to ensure that the store doesn&apos;t actually happen before all of the critical instructions have been executed. For example, tryHashConsLock() and releaseHashConsLock() are implemented the same way.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1071741</commentid>
    <comment_count>20</comment_count>
      <attachid>247250</attachid>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-24 11:51:08 -0800</bug_when>
    <thetext>Created attachment 247250
proposed patch for discussion only

I discovered an optimization issue where GCC optimizes away a pointer dereference.
This patch is not meant to be committed or reviewed.
Since I generally cannot build and test ToT, someone else will have to pick it up in order to fix the bug in trunk.
I&apos;m still waiting for some testers of my fork to confirm that it works for them, so I can&apos;t yet tell whether it will indeed fix this bug.
The memory barriers I added might or might not be needed, but as far as I can tell, spinlocks cannot be implemented correctly without them on (more aggressively) out-of-order architectures.
Please note that SamplingRegion::Locker also lacks the memory barriers. The comment in SamplingRegion::Locker::~Locker() even shows that WTF::weakCompareAndSwap at some point did include memory barriers. So I think that comment has to be corrected and the missing barriers have to be added.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1071976</commentid>
    <comment_count>21</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-02-24 22:54:02 -0800</bug_when>
    <thetext>My testers confirmed the patch indeed fixes the hangs.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1074034</commentid>
    <comment_count>22</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-03-04 03:20:26 -0800</bug_when>
    <thetext>Unfortunately this bug is valid on Aarch64 Linux too.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1074239</commentid>
    <comment_count>23</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-04 14:43:16 -0800</bug_when>
    <thetext>(In reply to comment #22)
&gt; Unfortunately this bug is valid on Aarch64 Linux too.

Why not try my proposed (and confirmed working) patch?
What could be easier than trying, refactoring, and committing it?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1074242</commentid>
    <comment_count>24</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-03-04 14:50:45 -0800</bug_when>
    <thetext>I think we&apos;d like to know more clearly why this patch works.

There are a lot of threading problems that go away if you just perform more barriers -- even if the barrier is not specifically related to the problem.

Do we have a specific explanation of the memory ordering problem we think we&apos;re fixing with this patch?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1074260</commentid>
    <comment_count>25</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-04 15:20:18 -0800</bug_when>
    <thetext>Well, I already tried to explain the patch in comment #20.

I definitely saw the GCC optimization issue in the disassembled code, and it was definitely fixed by the cast to volatile. The cast to volatile was inspired by Atomics.h line 317.

The memory barriers were inspired by bug114934 and by the fact that the comment in SamplingRegion::Locker::~Locker() expects weakCompareAndSwap to perform a memory barrier. To me it seems it rather needs to be proved that memory barriers aren&apos;t needed in the places where I added them (see the changelog entry from bug114934).</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1074296</commentid>
    <comment_count>26</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-03-04 17:10:40 -0800</bug_when>
    <thetext>&gt; I discovered an optimization issue where GCC optimizes away a pointer deref.

Which pointer?

What did GCC replace the deref with?</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1074407</commentid>
    <comment_count>27</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-04 23:10:16 -0800</bug_when>
    <thetext>I had to force GCC into dereferencing wordPtr as shown below, because GCC would otherwise rely on the pointer&apos;s target never changing inside the loop and load it into a register only once, before the loop:
- oldValue = *wordPtr;
+ oldValue = *const_cast&lt;volatile WordType*&gt;(wordPtr);</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1075143</commentid>
    <comment_count>28</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-03-06 16:09:20 -0800</bug_when>
    <thetext>There appears to be a bug in the inline asm added for GCC:

#elif CPU(ARM64) &amp;&amp; COMPILER(GCC)
    unsigned tmp;
    unsigned result;
    asm volatile(
        &quot;mov %w1, #1\n\t&quot;
        &quot;ldxr %w2, [%0]\n\t&quot;
        &quot;cmp %w3, %w2\n\t&quot;
        &quot;b.ne 0f\n\t&quot;
        &quot;stxr %w1, %w4, [%0]\n\t&quot;
        &quot;0:&quot;
        : &quot;+r&quot;(location), &quot;=&amp;r&quot;(result), &quot;=&amp;r&quot;(tmp)
        : &quot;r&quot;(expected), &quot;r&quot;(newValue)
        : &quot;memory&quot;);
    result = !result;

The contents of the location pointed to by location do not look correctly annotated.</thetext>
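[Editor's note] The ldxr/stxr pair that the inline assembly quoted above hand-codes is a load-exclusive/store-exclusive (LL/SC) sequence, which is why the resulting CAS is "weak": the store fails whenever another write intervened, so callers must loop. A hedged Python model of that behavior (ExclusiveCell and all names are invented for illustration; this is not full ARM semantics):

```python
import threading

# Teaching model of an LL/SC (ldxr/stxr) based weak compare-and-swap.
# store_exclusive succeeds only if nobody wrote to the cell since the
# paired load_exclusive - modeled here with a version counter that
# stands in for losing the exclusive monitor.
class ExclusiveCell:
    def __init__(self, value):
        self._lock = threading.Lock()
        self._value = value
        self._version = 0  # bumped on every store

    def load_exclusive(self):
        with self._lock:
            return self._value, self._version

    def store_exclusive(self, new_value, version):
        with self._lock:
            if self._version != version:
                return False  # someone wrote in between: stxr fails
            self._value = new_value
            self._version += 1
            return True

def weak_compare_and_swap(cell, expected, new_value):
    value, version = cell.load_exclusive()
    if value != expected:
        return False
    return cell.store_exclusive(new_value, version)

cell = ExclusiveCell(0)
while not weak_compare_and_swap(cell, 0, 1):
    pass  # a weak CAS may fail spuriously, so retry
assert cell.load_exclusive()[0] == 1
assert not weak_compare_and_swap(cell, 0, 2)  # expected mismatch; unchanged
assert cell.load_exclusive()[0] == 1
```

The model makes the failure mode concrete: any bug in how the asm annotates the memory at the target location can let the compiler cache or reorder accesses around the exclusive pair, turning the retry loop into a spin on stale state.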
  </long_desc><long_desc isprivate="0" >
    <commentid>1075356</commentid>
    <comment_count>29</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-08 00:10:08 -0800</bug_when>
    <thetext>Here is the full loop from WTF::Bitmap::concurrentTestAndSet(), where GCC assumes that the target of wordPtr won&apos;t change inside the loop.

     WordType* wordPtr = bits.data() + index;
     WordType oldValue;
     do {
-        oldValue = *wordPtr;
+        oldValue = *const_cast&lt;volatile WordType*&gt;(wordPtr);
         if (oldValue &amp; mask)
             return true;
     } while (!weakCompareAndSwap(wordPtr, oldValue, oldValue | mask));

The loop in WTF::Bitmap::concurrentTestAndClear() is nearly identical.
Not forcing the compiler to deref the pointer on each loop iteration seems like an obvious bug to me.</thetext>
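[Editor's note] The re-read requirement argued over here can be modeled in a few lines. In this hedged Python sketch (Word, compare_and_swap, and concurrent_test_and_set are invented stand-ins for the WTF::Bitmap code), the load at the top of the loop must happen on every iteration; a hoisted, stale read is exactly what would make the CAS retry loop spin forever:

```python
import threading

# Illustrative model of WTF::Bitmap::concurrentTestAndSet(): re-read the
# word each iteration (the effect the volatile cast forces in the C++),
# then try to CAS the bit in.
class Word:
    def __init__(self):
        self._lock = threading.Lock()
        self.value = 0

    def compare_and_swap(self, expected, new_value):
        with self._lock:
            if self.value == expected:
                self.value = new_value
                return True
            return False

def concurrent_test_and_set(word, bit):
    mask = 2 ** bit
    while True:
        # Fresh read each iteration: if this load were hoisted out of the
        # loop (as GCC did with the plain WordType*), a concurrent writer
        # would make every CAS attempt fail against a stale old_value.
        old_value = word.value
        if (old_value // mask) % 2 == 1:
            return True  # bit was already set
        if word.compare_and_swap(old_value, old_value | mask):
            return False  # we set the bit

word = Word()
assert concurrent_test_and_set(word, 3) == False  # first caller sets it
assert concurrent_test_and_set(word, 3) == True   # second caller sees it
assert word.value == 8
```

In standard C++ today the idiomatic fix is an atomic load rather than a volatile cast, but the sketch shows why either one is required: correctness of the loop depends on observing the current value, not a cached one.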
  </long_desc><long_desc isprivate="0" >
    <commentid>1075449</commentid>
    <comment_count>30</comment_count>
    <who name="Geoffrey Garen">ggaren</who>
    <bug_when>2015-03-08 15:49:11 -0700</bug_when>
    <thetext>&gt; Not forcing the compiler to deref the pointer each loop iteration seems an
&gt; obvious bug to me.

weakCompareAndSwap is (supposed to be) annotated as clobbering memory. That should be enough to force a pointer deref each time through the loop.

If you need volatile inside the loop in order to force a pointer deref, you either have a compiler bug or an incorrect implementation of weakCompareAndSwap.

As stated above, it looks like you have the latter -- an incorrect implementation of weakCompareAndSwap.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1075542</commentid>
    <comment_count>31</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-09 00:10:27 -0700</bug_when>
    <thetext>&gt; As stated above, it looks like you have the latter -- an incorrect
&gt; implementation of weakCompareAndSwap.

Now that&apos;s potentially interesting for you, because the implementation of weakCompareAndSwap that is in use in my fork is the same as for the various ARM ports.
It&apos;s the workaround for non-Intel architectures not having a direct 8-bit CAS operation: weakCompareAndSwap(uint8_t* location, uint8_t expected, uint8_t newValue) in Atomics.h.
Maybe location should be declared volatile in that function declaration?</thetext>
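For context, the workaround referred to here emulates a byte-wide CAS with a word-wide CAS on the aligned word containing the byte. A rough sketch of the idea (illustrative names, little-endian shift math assumed, not WebKit's exact Atomics.h code):

```cpp
// Emulate an 8-bit CAS on architectures that only provide word-wide
// CAS: operate on the aligned 32-bit word containing the byte.
// The shift computation below assumes little-endian byte order.
inline bool byteCASViaWordCAS(unsigned char* location, unsigned char expected, unsigned char newValue)
{
    unsigned long address = reinterpret_cast<unsigned long>(location);
    unsigned* alignedLocation = reinterpret_cast<unsigned*>(address & ~3ul);
    unsigned shift = (address & 3ul) * 8;
    unsigned mask = 0xffu << shift;

    // Fresh (volatile) load of the containing word.
    unsigned oldWord = *const_cast<volatile unsigned*>(alignedLocation);
    if (((oldWord & mask) >> shift) != expected)
        return false;
    unsigned newWord = (oldWord & ~mask) | (unsigned(newValue) << shift);
    // Weak CAS on the whole word; neighbouring bytes are preserved
    // because a concurrent change to them makes the CAS fail.
    return __atomic_compare_exchange_n(alignedLocation, &oldWord, newWord,
                                       true, __ATOMIC_SEQ_CST, __ATOMIC_SEQ_CST);
}
```

Like weakCompareAndSwap, this has weak semantics: it may fail spuriously (or because a neighbouring byte changed), so callers must loop.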
  </long_desc><long_desc isprivate="0" >
    <commentid>1075760</commentid>
    <comment_count>32</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-09 14:40:20 -0700</bug_when>
    <thetext>Replacing

  inline bool weakCompareAndSwap(uint8_t* location, uint8_t expected, uint8_t newValue)

in Atomics.h with

  inline bool weakCompareAndSwap(volatile uint8_t* location, uint8_t expected, uint8_t newValue)

(added volatile to the declaration of location) is sufficient to make GCC dereference the pointer on each loop iteration; the disassembled code looks correct.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1075834</commentid>
    <comment_count>33</comment_count>
    <who name="Mark Lam">mark.lam</who>
    <bug_when>2015-03-09 17:45:18 -0700</bug_when>
    <thetext>At least part of the issue (if not all) is caused by https://bugs.webkit.org/show_bug.cgi?id=142513.  Once that fix is landed, please check if there&apos;s any additional issue that still remains.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1075916</commentid>
    <comment_count>34</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-03-10 00:55:55 -0700</bug_when>
    <thetext>(In reply to comment #33)
&gt; At least part of the issue (if not all) is caused by
&gt; https://bugs.webkit.org/show_bug.cgi?id=142513.  Once that fix is landed,
&gt; please check if there&apos;s any additional issue that still remains.

Thanks, I&apos;ll check it today.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076018</commentid>
    <comment_count>35</comment_count>
    <who name="Tobias Netzel">tobias.netzel</who>
    <bug_when>2015-03-10 12:01:49 -0700</bug_when>
    <thetext>The change from r181319 (&lt;http://trac.webkit.org/r181319&gt;) makes GCC dereference wordPtr on each loop iteration, as confirmed by analyzing the disassembly (PowerPC in my case).</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076609</commentid>
    <comment_count>36</comment_count>
      <attachid>248509</attachid>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-03-12 05:01:51 -0700</bug_when>
    <thetext>Created attachment 248509
Patch

r181319 (bug 142513) fixed the issue, thanks. Let&apos;s enable parallel GC again.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076615</commentid>
    <comment_count>37</comment_count>
      <attachid>248509</attachid>
    <who name="Carlos Garcia Campos">cgarcia</who>
    <bug_when>2015-03-12 05:39:07 -0700</bug_when>
    <thetext>Comment on attachment 248509
Patch

Ok, let&apos;s try again.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076619</commentid>
    <comment_count>38</comment_count>
    <who name="WebKit Commit Bot">commit-queue</who>
    <bug_when>2015-03-12 06:33:44 -0700</bug_when>
    <thetext>The commit-queue encountered the following flaky tests while processing attachment 248509:

The commit-queue is continuing to process your patch.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076623</commentid>
    <comment_count>39</comment_count>
    <who name="WebKit Commit Bot">commit-queue</who>
    <bug_when>2015-03-12 06:34:15 -0700</bug_when>
    <thetext>The commit-queue encountered the following flaky tests while processing attachment 248509:

transitions/default-timing-function.html bug 138901 (author: simon.fraser@apple.com)
The commit-queue is continuing to process your patch.</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076629</commentid>
    <comment_count>40</comment_count>
      <attachid>248509</attachid>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-03-12 07:50:54 -0700</bug_when>
    <thetext>Comment on attachment 248509
Patch

Clearing flags on attachment: 248509

Committed r181436: &lt;http://trac.webkit.org/changeset/181436&gt;</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>1076630</commentid>
    <comment_count>41</comment_count>
    <who name="Csaba Osztrogonác">ossy</who>
    <bug_when>2015-03-12 07:51:07 -0700</bug_when>
    <thetext>All reviewed patches have been landed.  Closing bug.</thetext>
  </long_desc>
      
          <attachment
              isobsolete="1"
              ispatch="1"
              isprivate="0"
          >
            <attachid>246156</attachid>
            <date>2015-02-06 02:07:27 -0800</date>
            <delta_ts>2015-03-12 05:01:33 -0700</delta_ts>
            <desc>Patch</desc>
            <filename>bug-141290-20150206020724.patch</filename>
            <type>text/plain</type>
            <size>1358</size>
            <attacher name="Csaba Osztrogonác">ossy</attacher>
            
              <data encoding="base64">U3VidmVyc2lvbiBSZXZpc2lvbjogMTc5NzAxCmRpZmYgLS1naXQgYS9Tb3VyY2UvV1RGL0NoYW5n
ZUxvZyBiL1NvdXJjZS9XVEYvQ2hhbmdlTG9nCmluZGV4IDQzZGFmYmViNThjYmQ2MDAyOWYyNjE3
ZTg0NDUyYjdkOGI3MWViNGUuLjM5ODgyNTIzMTY0ODY5MWM1NTFiNjUzMjZiNmQwYzgxOTY0MWY5
NjcgMTAwNjQ0Ci0tLSBhL1NvdXJjZS9XVEYvQ2hhbmdlTG9nCisrKyBiL1NvdXJjZS9XVEYvQ2hh
bmdlTG9nCkBAIC0xLDMgKzEsMTIgQEAKKzIwMTUtMDItMDYgIENzYWJhIE9zenRyb2dvbsOhYyAg
PG9zc3lAd2Via2l0Lm9yZz4KKworICAgICAgICBbQVJNXSBHQyBzb21ldGltZXMgc3R1Y2sgaW4g
YW4gaW5maW5pdGUgbG9vcCBpZiBwYXJhbGxlbCBHQyBpcyBlbmFibGVkCisgICAgICAgIGh0dHBz
Oi8vYnVncy53ZWJraXQub3JnL3Nob3dfYnVnLmNnaT9pZD0xNDEyOTAKKworICAgICAgICBSZXZp
ZXdlZCBieSBOT0JPRFkgKE9PUFMhKS4KKworICAgICAgICAqIHd0Zi9QbGF0Zm9ybS5oOgorCiAy
MDE1LTAyLTA1ICBZb3Vlbm4gRmFibGV0ICA8eW91ZW5uLmZhYmxldEBjcmYuY2Fub24uZnI+IGFu
ZCBYYWJpZXIgUm9kcmlndWV6IENhbHZhciA8Y2FsdmFyaXNAaWdhbGlhLmNvbT4KIAogICAgICAg
ICBbU3RyZWFtcyBBUEldIEltcGxlbWVudCBhIGJhcmVib25lIFJlYWRhYmxlU3RyZWFtIGludGVy
ZmFjZQpkaWZmIC0tZ2l0IGEvU291cmNlL1dURi93dGYvUGxhdGZvcm0uaCBiL1NvdXJjZS9XVEYv
d3RmL1BsYXRmb3JtLmgKaW5kZXggZDU2ZDFlYjQ0YTIzNzg3NWZhZDlmYmYxMzRjN2QwNGNjNzMz
MGI0OS4uMDJiYTM2YzY0ODEyZWE3Y2Q3ZjE2OTFkNjgxMTQ5YjkzMDRiYmQ0OSAxMDA2NDQKLS0t
IGEvU291cmNlL1dURi93dGYvUGxhdGZvcm0uaAorKysgYi9Tb3VyY2UvV1RGL3d0Zi9QbGF0Zm9y
bS5oCkBAIC05NjEsNiArOTYxLDEyIEBACiAjZGVmaW5lIEVOQUJMRV9DT01QQVJFX0FORF9TV0FQ
IDEKICNlbmRpZgogCisvKiBbQVJNXSBHQyBzb21ldGltZXMgc3R1Y2sgaW4gYW4gaW5maW5pdGUg
bG9vcCBpZiBwYXJhbGxlbCBHQyBpcyBlbmFibGVkCisgRklYTUU6IEVuYWJsZSBpdCBvbmNlICMx
NDEyOTAgaXMgZml4ZWQuICovCisjaWYgIWRlZmluZWQoRU5BQkxFX1BBUkFMTEVMX0dDKSAmJiBP
UyhMSU5VWCkgJiYgQ1BVKEFSTV9USFVNQjIpCisjZGVmaW5lIEVOQUJMRV9QQVJBTExFTF9HQyAw
CisjZW5kaWYKKwogI2lmICFkZWZpbmVkKEVOQUJMRV9QQVJBTExFTF9HQykgJiYgKE9TKERBUldJ
TikgfHwgUExBVEZPUk0oRUZMKSB8fCBQTEFURk9STShHVEspKSAmJiBFTkFCTEUoQ09NUEFSRV9B
TkRfU1dBUCkKICNkZWZpbmUgRU5BQkxFX1BBUkFMTEVMX0dDIDEKICNlbmRpZgo=
</data>

          </attachment>
          <attachment
              isobsolete="1"
              ispatch="1"
              isprivate="0"
          >
            <attachid>247250</attachid>
            <date>2015-02-24 11:51:08 -0800</date>
            <delta_ts>2015-03-12 05:01:42 -0700</delta_ts>
            <desc>proposed patch for discussion only</desc>
            <filename>proposed_patch</filename>
            <type>text/plain</type>
            <size>2835</size>
            <attacher name="Tobias Netzel">tobias.netzel</attacher>
            
              <data encoding="base64">SW5kZXg6IFNvdXJjZS9XVEYvd3RmL0JpdG1hcC5oCj09PT09PT09PT09PT09PT09PT09PT09PT09
PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT0KLS0tIFNvdXJjZS9XVEYv
d3RmL0JpdG1hcC5oCShSZXZpc2lvbiAxODA0NzcpCisrKyBTb3VyY2UvV1RGL3d0Zi9CaXRtYXAu
aAkoQXJiZWl0c2tvcGllKQpAQCAtMTIxLDEwICsxMjEsMTEgQEAKICAgICBXb3JkVHlwZSogd29y
ZFB0ciA9IGJpdHMuZGF0YSgpICsgaW5kZXg7CiAgICAgV29yZFR5cGUgb2xkVmFsdWU7CiAgICAg
ZG8gewotICAgICAgICBvbGRWYWx1ZSA9ICp3b3JkUHRyOworICAgICAgICBvbGRWYWx1ZSA9ICpj
b25zdF9jYXN0PHZvbGF0aWxlIFdvcmRUeXBlKj4od29yZFB0cik7CiAgICAgICAgIGlmIChvbGRW
YWx1ZSAmIG1hc2spCiAgICAgICAgICAgICByZXR1cm4gdHJ1ZTsKICAgICB9IHdoaWxlICghd2Vh
a0NvbXBhcmVBbmRTd2FwKHdvcmRQdHIsIG9sZFZhbHVlLCBvbGRWYWx1ZSB8IG1hc2spKTsKKyAg
ICBtZW1vcnlCYXJyaWVyQWZ0ZXJMb2NrKCk7CiAgICAgcmV0dXJuIGZhbHNlOwogfQogCkBAIC0x
NDAsOCArMTQxLDkgQEAKICAgICBzaXplX3QgaW5kZXggPSBuIC8gd29yZFNpemU7CiAgICAgV29y
ZFR5cGUqIHdvcmRQdHIgPSBiaXRzLmRhdGEoKSArIGluZGV4OwogICAgIFdvcmRUeXBlIG9sZFZh
bHVlOworICAgIG1lbW9yeUJhcnJpZXJCZWZvcmVVbmxvY2soKTsKICAgICBkbyB7Ci0gICAgICAg
IG9sZFZhbHVlID0gKndvcmRQdHI7CisgICAgICAgIG9sZFZhbHVlID0gKmNvbnN0X2Nhc3Q8dm9s
YXRpbGUgV29yZFR5cGUqPih3b3JkUHRyKTsKICAgICAgICAgaWYgKCEob2xkVmFsdWUgJiBtYXNr
KSkKICAgICAgICAgICAgIHJldHVybiBmYWxzZTsKICAgICB9IHdoaWxlICghd2Vha0NvbXBhcmVB
bmRTd2FwKHdvcmRQdHIsIG9sZFZhbHVlLCBvbGRWYWx1ZSAmIH5tYXNrKSk7CkluZGV4OiBTb3Vy
Y2UvSmF2YVNjcmlwdENvcmUvYnl0ZWNvZGUvQ29kZUJsb2NrLmNwcAo9PT09PT09PT09PT09PT09
PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09Ci0tLSBT
b3VyY2UvSmF2YVNjcmlwdENvcmUvYnl0ZWNvZGUvQ29kZUJsb2NrLmNwcAkoUmV2aXNpb24gMTgw
NDc3KQorKysgU291cmNlL0phdmFTY3JpcHRDb3JlL2J5dGVjb2RlL0NvZGVCbG9jay5jcHAJKEFy
YmVpdHNrb3BpZSkKQEAgLTE4OTYsNiArMTg5Niw3IEBACiAgICAgICAgICAgICByZXR1cm47CiAg
ICAgICAgIH0KICAgICB9IHdoaWxlICghV1RGOjp3ZWFrQ29tcGFyZUFuZFN3YXAoJm1fdmlzaXRB
Z2dyZWdhdGVIYXNCZWVuQ2FsbGVkLCAwLCAxKSk7CisgICAgV1RGOjptZW1vcnlCYXJyaWVyQWZ0
ZXJMb2NrKCk7CiAjZW5kaWYgLy8gRU5BQkxFKFBBUkFMTEVMX0dDKQogICAgIAogICAgIGlmICgh
IW1fYWx0ZXJuYXRpdmUpCkluZGV4OiBTb3VyY2UvSmF2YVNjcmlwdENvcmUvYnl0ZWNvZGUvQ29k
ZUJsb2NrLmgKPT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09
PT09PT09PT09PT09PT09PT09PQotLS0gU291cmNlL0phdmFTY3JpcHRDb3JlL2J5dGVjb2RlL0Nv
ZGVCbG9jay5oCShSZXZpc2lvbiAxODA0NzcpCisrKyBTb3VyY2UvSmF2YVNjcmlwdENvcmUvYnl0
ZWNvZGUvQ29kZUJsb2NrLmgJKEFyYmVpdHNrb3BpZSkKQEAgLTEyNzEsNiArMTI3MSw5IEBACiAg
ICAgICAgIHJldHVybjsKICAgICAKICAgICBjb2RlQmxvY2stPm1fbWF5QmVFeGVjdXRpbmcgPSB0
cnVlOworI2lmIEVOQUJMRShQQVJBTExFTF9HQykKKyAgICBXVEY6Om1lbW9yeUJhcnJpZXJCZWZv
cmVVbmxvY2soKTsKKyNlbmRpZgogICAgIC8vIFdlIG1pZ2h0IG5vdCBoYXZlIGNsZWFyZWQgdGhl
IG1hcmtzIGZvciB0aGlzIENvZGVCbG9jaywgYnV0IHdlIG5lZWQgdG8gdmlzaXQgaXQuCiAgICAg
Y29kZUJsb2NrLT5tX3Zpc2l0QWdncmVnYXRlSGFzQmVlbkNhbGxlZCA9IGZhbHNlOwogI2lmIEVO
QUJMRShHR0MpCkluZGV4OiBTb3VyY2UvSmF2YVNjcmlwdENvcmUvaGVhcC9Db2RlQmxvY2tTZXQu
Y3BwCj09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09PT09
PT09PT09PT09PT09PT0KLS0tIFNvdXJjZS9KYXZhU2NyaXB0Q29yZS9oZWFwL0NvZGVCbG9ja1Nl
dC5jcHAJKFJldmlzaW9uIDE4MDQ3NykKKysrIFNvdXJjZS9KYXZhU2NyaXB0Q29yZS9oZWFwL0Nv
ZGVCbG9ja1NldC5jcHAJKEFyYmVpdHNrb3BpZSkKQEAgLTY1LDYgKzY1LDkgQEAKIHsKICAgICBm
b3IgKENvZGVCbG9jayogY29kZUJsb2NrIDogbV9vbGRDb2RlQmxvY2tzKSB7CiAgICAgICAgIGNv
ZGVCbG9jay0+bV9tYXlCZUV4ZWN1dGluZyA9IGZhbHNlOworI2lmIEVOQUJMRShQQVJBTExFTF9H
QykKKyAgICAgICAgV1RGOjptZW1vcnlCYXJyaWVyQmVmb3JlVW5sb2NrKCk7CisjZW5kaWYKICAg
ICAgICAgY29kZUJsb2NrLT5tX3Zpc2l0QWdncmVnYXRlSGFzQmVlbkNhbGxlZCA9IGZhbHNlOwog
ICAgIH0KIApAQCAtODIsNiArODIsOSBAQAogICAgICAgICAgICAgY29udGludWU7CiAgICAgICAg
IGV4ZWN1dGFibGUtPmZvckVhY2hDb2RlQmxvY2soW10oQ29kZUJsb2NrKiBjb2RlQmxvY2spIHsK
ICAgICAgICAgICAgIGNvZGVCbG9jay0+bV9tYXlCZUV4ZWN1dGluZyA9IGZhbHNlOworI2lmIEVO
QUJMRShQQVJBTExFTF9HQykKKyAgICAgICAgICAgIFdURjo6bWVtb3J5QmFycmllckJlZm9yZVVu
bG9jaygpOworI2VuZGlmCiAgICAgICAgICAgICBjb2RlQmxvY2stPm1fdmlzaXRBZ2dyZWdhdGVI
YXNCZWVuQ2FsbGVkID0gZmFsc2U7CiAgICAgICAgIH0pOwogICAgIH0K
</data>

          </attachment>
          <attachment
              isobsolete="0"
              ispatch="1"
              isprivate="0"
          >
            <attachid>248509</attachid>
            <date>2015-03-12 05:01:51 -0700</date>
            <delta_ts>2015-03-12 07:50:54 -0700</delta_ts>
            <desc>Patch</desc>
            <filename>bug-141290-20150312050125.patch</filename>
            <type>text/plain</type>
            <size>1384</size>
            <attacher name="Csaba Osztrogonác">ossy</attacher>
            
              <data encoding="base64">U3VidmVyc2lvbiBSZXZpc2lvbjogMTgxNDM1CmRpZmYgLS1naXQgYS9Tb3VyY2UvV1RGL0NoYW5n
ZUxvZyBiL1NvdXJjZS9XVEYvQ2hhbmdlTG9nCmluZGV4IGUwMjg3MTRlNDFmMGU5YWFmMjJjMGFl
OGVkYjVlOGVjZjFlMGVhYzMuLjZjZDA0YmI0MDg1MWNjMmFjNzcxZjYzZGI5ZjIyZDFhMWM5YjVj
N2IgMTAwNjQ0Ci0tLSBhL1NvdXJjZS9XVEYvQ2hhbmdlTG9nCisrKyBiL1NvdXJjZS9XVEYvQ2hh
bmdlTG9nCkBAIC0xLDMgKzEsMTIgQEAKKzIwMTUtMDMtMTIgIENzYWJhIE9zenRyb2dvbsOhYyAg
PG9zc3lAd2Via2l0Lm9yZz4KKworICAgICAgICBbQVJNXVtMaW51eF0gR0Mgc29tZXRpbWVzIHN0
dWNrIGluIGFuIGluZmluaXRlIGxvb3AgaWYgcGFyYWxsZWwgR0MgaXMgZW5hYmxlZAorICAgICAg
ICBodHRwczovL2J1Z3Mud2Via2l0Lm9yZy9zaG93X2J1Zy5jZ2k/aWQ9MTQxMjkwCisKKyAgICAg
ICAgUmV2aWV3ZWQgYnkgTk9CT0RZIChPT1BTISkuCisKKyAgICAgICAgKiB3dGYvUGxhdGZvcm0u
aDogRW5hYmxlIHBhcmFsbGVsIEdDIGFmdGVyIHIxODEzMTkuCisKIDIwMTUtMDMtMTEgIE15bGVz
IEMuIE1heGZpZWxkICA8bW1heGZpZWxkQGFwcGxlLmNvbT4KIAogICAgICAgICBVc2Ugb3V0LW9m
LWJhbmQgbWVzc2FnaW5nIGZvciBSZW5kZXJCb3g6OmZpcnN0TGluZUJhc2VsaW5lKCkgYW5kIFJl
bmRlckJveDo6aW5saW5lQmxvY2tCYXNlbGluZSgpCmRpZmYgLS1naXQgYS9Tb3VyY2UvV1RGL3d0
Zi9QbGF0Zm9ybS5oIGIvU291cmNlL1dURi93dGYvUGxhdGZvcm0uaAppbmRleCAxM2U5YWI1ZTll
ZjNkOTMwODkyMTJjODQ2MmY5M2RlYzg5NjA5MmFmLi40ZDI5YTlmMjZmYmE3ZWYyMWFlYmFmZjJk
MTY1MTY1NDEwYzg1MjgzIDEwMDY0NAotLS0gYS9Tb3VyY2UvV1RGL3d0Zi9QbGF0Zm9ybS5oCisr
KyBiL1NvdXJjZS9XVEYvd3RmL1BsYXRmb3JtLmgKQEAgLTk2MSwxMiArOTYxLDYgQEAKICNkZWZp
bmUgRU5BQkxFX0NPTVBBUkVfQU5EX1NXQVAgMQogI2VuZGlmCiAKLS8qIFtBUk1dIEdDIHNvbWV0
aW1lcyBzdHVjayBpbiBhbiBpbmZpbml0ZSBsb29wIGlmIHBhcmFsbGVsIEdDIGlzIGVuYWJsZWQK
LSBGSVhNRTogRW5hYmxlIGl0IG9uY2UgIzE0MTI5MCBpcyBmaXhlZC4gKi8KLSNpZiAhZGVmaW5l
ZChFTkFCTEVfUEFSQUxMRUxfR0MpICYmIE9TKExJTlVYKSAmJiBDUFUoQVJNX1RIVU1CMikKLSNk
ZWZpbmUgRU5BQkxFX1BBUkFMTEVMX0dDIDAKLSNlbmRpZgotCiAjaWYgIWRlZmluZWQoRU5BQkxF
X1BBUkFMTEVMX0dDKSAmJiAoT1MoREFSV0lOKSB8fCBQTEFURk9STShFRkwpIHx8IFBMQVRGT1JN
KEdUSykpICYmIEVOQUJMRShDT01QQVJFX0FORF9TV0FQKQogI2RlZmluZSBFTkFCTEVfUEFSQUxM
RUxfR0MgMQogI2VuZGlmCg==
</data>

          </attachment>
      

    </bug>

</bugzilla>