Uploading a camera stream to a WebGL texture using getUserMedia(...), a <video> element, and gl.texImage2D(...) fails in Safari on the iOS 15.1 beta 1 release. It works correctly on the release version of iOS 15.0 and earlier.

I've prepared a minimal test case here: https://clv.zappar.io/3571850569055456732/1.0.0/

The page should show a live camera feed in the canvas after camera permissions are granted. On iOS 15.1 beta 1 no camera feed appears. The source for this test case is entirely within the page itself, should you wish to take a look at how it's implemented.

This is a common flow for web-based augmented reality and image-processing applications. We have many sites and customers who rely on this behaviour working correctly, so this is a major regression and concern for us.

The same issue is also reported in the following bug, but I've filed this one with a minimal test case and a more specific title in the hopes that it helps surface this significant regression to the right team :-) https://bugs.webkit.org/show_bug.cgi?id=230589

Let me know if I can help with any questions!
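For reference, the flow the test case exercises looks roughly like the sketch below. This is a minimal illustration rather than the test page's actual source; the function name, camera constraints, canvas lookup, and per-frame loop structure are assumptions.

```javascript
// Minimal sketch of the camera-to-WebGL-texture flow (illustrative only).
async function startCameraToTexture() {
  // Request the camera stream and route it into a <video> element.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
  });
  const video = document.createElement("video");
  video.srcObject = stream;
  video.setAttribute("playsinline", ""); // needed for inline playback on iOS
  await video.play();

  const gl = document.querySelector("canvas").getContext("webgl");
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // Camera frames are rarely power-of-two sized, so clamp and skip mipmaps.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

  function uploadFrame() {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    // This upload of the current video frame is the step that produces
    // no image (black texture) on the affected iOS builds.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
    requestAnimationFrame(uploadFrame);
  }
  requestAnimationFrame(uploadFrame);
}
```

On the affected builds every WebGL call above succeeds without error; the texture simply samples as black, which is what makes this regression easy to miss in automated tests that only check for exceptions.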
<rdar://problem/83407577>
*** Bug 230589 has been marked as a duplicate of this bug. ***
It looks like this regressed in Bug 228821.
Thanks Brett. I see the commit in that bug notes it affects the CPU code path only. @kkinnunen, is this perhaps another case, like Bugs 216259 and 215908, where a change has forced these uploads down the CPU path rather than the GPU one?
Created attachment 439150 [details]
Patch
Comment on attachment 439150 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=439150&action=review

> Source/WebCore/html/HTMLVideoElement.cpp:-318
> -RefPtr<NativeImage> HTMLVideoElement::nativeImageForCurrentTime()
> -{
> -    if (!player())
> -        return nullptr;
> -
> -    return player()->nativeImageForCurrentTime();
> -}
> -

I would rather not remove this as I think MediaPlayer should be an implementation detail, and I'd like to make it private.

> Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm:2559
> +    if (m_lastImage)
> +        return std::nullopt;

You return null if there *is* an image?
Comment on attachment 439150 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=439150&action=review

>> Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm:2559
>> +    return std::nullopt;
>
> You return null if there *is* an image?

Looks like a typo compared to MediaPlayerPrivateMediaStreamAVFObjC - maybe the cause of the widespread test failures.
The new accelerated code paths added in this patch look very cool though!
Just a note to say that we still see this issue on iOS 15.1 Beta 2.
(In reply to connell from comment #10)
> Just a note to say that we still see this issue on iOS 15.1 Beta 2.

I believe this will be part of Beta 3.
*** Bug 230879 has been marked as a duplicate of this bug. ***
The regression was fixed by reverting r280963 (Bug 228821) for iOS 15.2 (rdar://83587220). The bug still persists in trunk, so repurposing this bug to track the fix in trunk.
Created attachment 449160 [details]
Patch
Comment on attachment 449160 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=449160&action=review

> Source/WebCore/ChangeLog:9
> +    Fix MSE camera to WebGL texture uploads.

s/MSE/MediaStreamTrack/

> Source/WebCore/ChangeLog:12
> +    Turns out MSE does not have these implemented, so currently fall back

It should be easy to add such methods. We have m_currentVideoSample we can grab for that using a lock.
Committed r288025 (246051@main): <https://commits.webkit.org/246051@main>

All reviewed patches have been landed. Closing bug and clearing flags on attachment 449160 [details].
I'm currently seeing this behavior (black screen, no camera feed) on iOS 15.4b1 (iPhone 11) on all 8th Wall and Zappar websites, including the test case listed at the top: https://clv.zappar.io/3571850569055456732/1.0.0/