Created attachment 437113 [details]
Bug screenshot in codepen debug view
<video> element with vertical video appears to be rendered with a black landscape-oriented rectangle behind it.
First noticed when using getUserMedia on iPhone in iOS 15 (19A5340a). Same issue is reproducible on iPadOS 15 (same build) when used in portrait mode.
Reproduction codepen: https://codepen.io/alx-khovansky/pen/PojNgqR
Attached a screenshot taken in the codepen debug view; notice the black rectangle sticking out from the sides of the video.
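For reference, the reproduction boils down to attaching a portrait camera stream to a `<video>` element. A minimal sketch (element id and constraints are illustrative, not copied from the codepen):

```html
<!-- Minimal repro sketch; id and constraints are illustrative.
     On iOS 15 (19A5340a), the portrait stream renders with a black
     landscape-oriented rectangle behind the video. -->
<video id="preview" autoplay playsinline muted></video>
<script>
  navigator.mediaDevices
    .getUserMedia({ video: { facingMode: "user" } })
    .then(stream => { document.getElementById("preview").srcObject = stream; })
    .catch(err => console.error("getUserMedia failed:", err));
</script>
```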
> First noticed when using getUserMedia on iPhone in iOS 15
Could you please confirm whether this behaves as expected on iOS 14.x?
Created attachment 437373 [details]
Same demo running on iOS 14.7.1
Attached a new screenshot: same demo running on iOS 14.7.1.
Video is rendered normally.
I'm facing the same problem.
I found that setting the video element's object-fit to none temporarily works around the problem, but it obviously changes how the video is fitted into its box.
The black rectangle disappears when there is inline content to the left of the video, so adding transparent colored text with a negative margin seems to be a temporary workaround.
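For clarity, a sketch of the two workarounds mentioned above (class names are illustrative; neither is a real fix, and both have layout side effects):

```html
<style>
  /* Workaround 1: suppresses the black rectangle, but changes how the
     camera frame is fitted inside the element box. */
  video.fit-none { object-fit: none; }

  /* Workaround 2: transparent inline text to the left of the video,
     pulled back over it with a negative margin. */
  .invisible-text { color: transparent; margin-right: -1ch; }
</style>
<span class="invisible-text">x</span>
<video autoplay playsinline muted></video>
```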
I can reproduce this in the basic getUserMedia test over on webrtc samples as well.
Tested on iPhone XR MRY42QN/A running 15.0 Public Beta 8 (19A5340a)
Created attachment 438241 [details]
Basic peer connection demo
Shows that this is also broken in the basicPeerConnection demo, for both local and remote video in portrait mode.
Got an update to 19A344, still not working
This hit production now on iOS 15 (19A346) and it's still as broken as before.
Created attachment 438940 [details]
Created attachment 438972 [details]
Comment on attachment 438972 [details]
View in context: https://bugs.webkit.org/attachment.cgi?id=438972&action=review
> + We also need to compute m_sampleBufferDisplayLayer position based on m_rootLayer coordinates.
> + await new Promise(resolve => setTimeout(resolve, 3000));
This hard-coded wait is unfortunate because it will almost always be way too long, but it can presumably be too short on a very heavily loaded machine. Can we avoid it by doing something like rendering to a canvas until we see that something has been rendered?
(In reply to Eric Carlson from comment #11)
> Comment on attachment 438972 [details]
> View in context:
> > Source/WebCore/ChangeLog:10
> > + We also need to compute m_sampleBufferDisplayLayer position based on m_rootLayer coordinates.
> > LayoutTests/fast/mediastream/video-rotation.html:22
> > + await new Promise(resolve => setTimeout(resolve, 3000));
> This hard-coded wait is unfortunate because it will almost always be way too
> long, but it can presumably be too short on a very heavily loaded machine.
> Can we avoid it by doing something like rendering to a canvas until we see
> that something has been rendered?
Hmm, we can try to improve things.
A canvas might not be good enough since it is not tightly related to the rendering code path.
But I can add a test runner API that gets a screenshot and we can then do some measurements on it (and redo screenshots as needed).
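For reference, the canvas-polling idea from the review would look roughly like this. This is a sketch only, not the approach that landed (which added a testRunner screenshot API instead), and all names here are illustrative:

```javascript
// Poll the video through a canvas until a non-black frame appears,
// instead of waiting a hard-coded 3 seconds. Sketch only.
async function waitForVideoFrame(video, timeoutMs = 5000) {
  const canvas = document.createElement("canvas");
  canvas.width = Math.max(1, video.videoWidth);
  canvas.height = Math.max(1, video.videoHeight);
  const ctx = canvas.getContext("2d");
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
    for (let i = 0; i < data.length; i += 4) {
      // Any non-black pixel means a frame has actually been rendered.
      if (data[i] || data[i + 1] || data[i + 2]) return true;
    }
    await new Promise(resolve => setTimeout(resolve, 50));
  }
  return false; // timed out without seeing a rendered frame
}
```

As noted above, the canvas path is not tightly tied to the rendering code path under test, which is why a screenshot-based testRunner API was preferred in the end.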
Created attachment 439138 [details]
Use new testRunner API
Committed r283035 (242095@main): <https://commits.webkit.org/242095@main>
All reviewed patches have been landed. Closing bug and clearing flags on attachment 439138 [details].
*** Bug 231461 has been marked as a duplicate of this bug. ***
This should be fixed in latest iOS 15 beta.
(In reply to youenn fablet from comment #16)
> This should be fixed in latest iOS 15 beta.
I take it back actually.
Looked fixed to me on 15.1 beta 3.
*** Bug 231903 has been marked as a duplicate of this bug. ***