LibWebRTCCodecs::setEncodeRates should send LibWebRTCCodecsProxy::SetEncodeRates only when the encoder is live
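A minimal sketch of the behavior described in the summary, with hypothetical stand-in types and member names (Connection, Encoder::connection) rather than the actual WebKit code: cache the rates on the encoder and send LibWebRTCCodecsProxy::SetEncodeRates only once the encoder is live, i.e. once it has a connection to the GPU process.

    #include <cstdint>
    #include <cstdio>

    // Hypothetical stand-in for the IPC connection to the GPU process.
    struct Connection {
        void sendSetEncodeRates(uint64_t encoderIdentifier, uint32_t bitRate, uint32_t frameRate)
        {
            std::printf("SetEncodeRates(%llu, %u, %u)\n",
                static_cast<unsigned long long>(encoderIdentifier), bitRate, frameRate);
        }
    };

    struct Encoder {
        uint64_t identifier { 0 };
        Connection* connection { nullptr }; // null until the encoder is live in the GPU process
        uint32_t bitRate { 0 };
        uint32_t frameRate { 0 };
    };

    void setEncodeRates(Encoder& encoder, uint32_t bitRate, uint32_t frameRate)
    {
        // Always remember the rates so they can be applied later.
        encoder.bitRate = bitRate;
        encoder.frameRate = frameRate;
        if (!encoder.connection)
            return; // encoder not live yet: do not send SetEncodeRates
        encoder.connection->sendSetEncodeRates(encoder.identifier, bitRate, frameRate);
    }

The cached rates would then presumably be applied once creation completes; the sketch only shows the guard.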
Created attachment 453722
Patch
Comment on attachment 453722
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=453722&action=review

> Source/WebKit/WebProcess/GPU/webrtc/LibWebRTCCodecs.cpp:525
> +    auto* encoder = m_encoders.get(encoderIdentifier);

If this is guaranteed non-null, then I suggest putting in a reference, not a pointer. Maybe also add an assertion? Not clear to me why this is strongly guaranteed.
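For illustration, the suggested shape might look like this, using plain C++ stand-ins (std::unordered_map and assert) for the WebKit HashMap and ASSERT; encoderForIdentifier is a hypothetical helper, not the patch's code:

    #include <cassert>
    #include <cstdint>
    #include <unordered_map>

    struct Encoder { };

    std::unordered_map<uint64_t, Encoder> m_encoders;

    Encoder& encoderForIdentifier(uint64_t encoderIdentifier)
    {
        auto it = m_encoders.find(encoderIdentifier);
        assert(it != m_encoders.end()); // the non-null guarantee made explicit
        return it->second;              // callers now hold a reference, not a pointer
    }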
(In reply to Darin Adler from comment #2)
> Comment on attachment 453722
> Patch
>
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=453722&action=review
>
> > Source/WebKit/WebProcess/GPU/webrtc/LibWebRTCCodecs.cpp:525
> > +    auto* encoder = m_encoders.get(encoderIdentifier);
>
> If this is guaranteed non-null, then I suggest putting in a reference, not a
> pointer. Maybe also add an assertion? Not clear to me why this is strongly
> guaranteed.

Will do. The pointer is guaranteed to be non-null because encoder creation and deletion make the same journey (a hop to the main thread, then a hop to the work queue), so the lookup always runs while the encoder entry still exists. I'll update the code to use the LibWebRTCCodecs connection directly instead, which is more efficient anyway.
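A self-contained sketch of that ordering argument, with a toy serial queue standing in for the real WorkQueue (all names here are illustrative, not WebKit API): because creation, setEncodeRates, and deletion are all funneled through the same queue in submission order, the map entry is guaranteed to exist when the setEncodeRates task runs.

    #include <cassert>
    #include <cstdint>
    #include <deque>
    #include <functional>
    #include <map>

    // A toy serial queue: tasks run strictly in the order they were enqueued,
    // like dispatching to a single work queue.
    struct SerialQueue {
        std::deque<std::function<void()>> tasks;
        void dispatch(std::function<void()> task) { tasks.push_back(std::move(task)); }
        void drain()
        {
            while (!tasks.empty()) {
                auto task = std::move(tasks.front());
                tasks.pop_front();
                task();
            }
        }
    };

    struct Encoder { uint32_t bitRate { 0 }; uint32_t frameRate { 0 }; };

    int main()
    {
        SerialQueue queue;
        std::map<uint64_t, Encoder> encoders;

        queue.dispatch([&] { encoders.emplace(1, Encoder { }); });  // createEncoder
        queue.dispatch([&] {                                        // setEncodeRates
            auto it = encoders.find(1);
            assert(it != encoders.end()); // guaranteed: creation was enqueued first
            it->second.bitRate = 300000;
            it->second.frameRate = 30;
        });
        queue.dispatch([&] { encoders.erase(1); });                 // releaseEncoder

        queue.drain();
        return 0;
    }

The same FIFO property is what makes the m_encoders.get() in the quoted patch line safe without a null check.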
Created attachment 453818
Patch for landing
Committed r290829 (248065@main): <https://commits.webkit.org/248065@main>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 453818.
<rdar://problem/89809391>