WebKit Bugzilla
Bug 86410 - [texmap][GStreamer][GTK] Composited Video support
Status: RESOLVED FIXED
https://bugs.webkit.org/show_bug.cgi?id=86410
Noam Rosenthal
Reported
2012-05-14 16:30:45 PDT
Support accelerated/composited video for Qt WebKit2, possibly the same way we're going to use for WebGL via GraphicsSurfaces.
Attachments
- wip patch (9.03 KB, patch), 2012-05-22 21:27 PDT, Philippe Normand
- Patch (10.17 KB, patch), 2012-06-26 05:33 PDT, Simon Hausmann
- Patch (9.92 KB, patch), 2012-06-26 06:31 PDT, Simon Hausmann
- patch based on simon's one (11.60 KB, patch), 2012-10-11 10:46 PDT, arno.
- fixes a memory leak (11.54 KB, patch), 2012-10-12 13:24 PDT, arno.
- support gst-vaapi to upload vaSurface to webkit texture (8.24 KB, patch), 2012-11-02 02:59 PDT, Zhao, Halley
- temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES, fail to compile (10.34 KB, application/octet-stream), 2012-11-07 02:35 PST, Zhao, Halley
- add eglCreateImage/glEGLImageTargetTexture2DOES to webkit (12.45 KB, patch), 2012-11-14 19:29 PST, Zhao, Halley
- Bind texture from hw video surface which exports EGLClientBuffer information (18.98 KB, patch), 2012-12-03 23:28 PST, Zhao, Halley
- Rebase of the proposed patches (without gst surfaces) (12.24 KB, patch), 2013-02-07 03:39 PST, Víctor M. Jáquez L.
- simple patch for video composition (7.39 KB, patch), 2013-02-22 03:50 PST, Víctor M. Jáquez L.
- simple patch for video composition (8.25 KB, patch), 2013-03-04 07:56 PST, Víctor M. Jáquez L.
- New version of the composited video (8.88 KB, patch), 2013-03-20 10:45 PDT, Víctor M. Jáquez L.
- New version of the composited video (19.93 KB, patch), 2013-03-25 10:05 PDT, Víctor M. Jáquez L.
- composited video with GstSurface{Meta/Convert} support (8.88 KB, patch), 2013-04-17 03:27 PDT, Víctor M. Jáquez L.
- Composited video support (5.80 KB, patch), 2013-05-03 09:40 PDT, Víctor M. Jáquez L.
- GstSurface{Meta/Convert} support for gstreamer-vaapi elements (Gst 1.0) (6.30 KB, patch), 2013-05-03 09:46 PDT, Víctor M. Jáquez L.
- Patch (7.23 KB, patch), 2013-05-13 06:19 PDT, Víctor M. Jáquez L.
- Patch (7.36 KB, patch), 2013-05-13 07:23 PDT, Víctor M. Jáquez L.
- Patch (7.60 KB, patch), 2013-05-29 02:54 PDT, Víctor M. Jáquez L.
- Patch (7.76 KB, patch), 2013-05-29 04:16 PDT, Víctor M. Jáquez L.
Philippe Normand
Comment 1
2012-05-22 21:17:03 PDT
I have a working patch for the GStreamer player using the TextureMapper, it works but performance is not there yet. Any help would be much appreciated, I'm not very familiar with the TextureMapper yet. Would you like to see the patch Noam?
Philippe Normand
Comment 2
2012-05-22 21:27:28 PDT
Created attachment 143452 [details]
wip patch
Philippe Normand
Comment 3
2012-05-22 21:31:24 PDT
I tested this on WebKitGTK WK1 and WK2 (with Martin's patch), I can clearly see that playback is not smooth.
Noam Rosenthal
Comment 4
2012-05-22 21:45:58 PDT
(In reply to comment #2)
> Created an attachment (id=143452) [details]
> wip patch
Cool! You might want to convert to TextureMapperGL, and draw the texture with its ID directly. See what we're doing in GraphicsContext3DQt.cpp.
Philippe Normand
Comment 5
2012-05-22 21:48:15 PDT
(In reply to comment #4)
> (In reply to comment #2)
> > Created an attachment (id=143452) [details]
> > wip patch
>
> Cool!
>
> You might want to convert to TextureMapperGL, and draw the texture with its ID directly. See what we're doing in GraphicsContext3DQt.cpp.
I thought I was already using the GL implementation. Hum I'll continue work on the patch, thanks for the help :)
Noam Rosenthal
Comment 6
2012-05-22 21:52:06 PDT
> I thought I was already using the GL implementation. Hum I'll continue work on the patch, thanks for the help :)
You might be, but if you cast to it you can call drawTexture with a texture id instead of keeping your own BitmapTexture around, which feels a bit awkward.
Noam Rosenthal
Comment 7
2012-05-22 21:56:30 PDT
Comment on attachment 143452 [details]
wip patch

View in context: https://bugs.webkit.org/attachment.cgi?id=143452&action=review

> Source/WebCore/platform/graphics/gstreamer/VideoSinkGStreamer.cpp:164
> + priv->texture->updateContents(GST_BUFFER_DATA(buffer), WebCore::IntRect(WebCore::IntPoint(0, 0), size), WebCore::IntPoint(0,0), bytesPerLine);
I think the problem lies here... looks like there's going to be some expensive software copy here. I'm not familiar enough with gst texture sink to know how to go about it though.
Philippe Normand
Comment 8
2012-05-23 08:04:08 PDT
Retitling the bug, both Qt and GTK use the GStreamer player, so I think this new title makes more sense than the previous one. Thanks Noam for your insights on the wip patch, I'll try to send a new version soon.
Philippe Normand
Comment 9
2012-05-23 17:56:56 PDT
(In reply to comment #7)
> (From update of attachment 143452 [details])
> View in context: https://bugs.webkit.org/attachment.cgi?id=143452&action=review
>
> > Source/WebCore/platform/graphics/gstreamer/VideoSinkGStreamer.cpp:164
> > + priv->texture->updateContents(GST_BUFFER_DATA(buffer), WebCore::IntRect(WebCore::IntPoint(0, 0), size), WebCore::IntPoint(0,0), bytesPerLine);
>
> I think the problem lies here... looks like there's going to be some expensive software copy here. I'm not familiar enough with gst texture sink to know how to go about it though.
Right, also we can't use this texture sink, I think.
Philippe Normand
Comment 10
2012-05-30 10:52:18 PDT
Maybe someone clicked on the fancy "Submit for EWS analysis" button then :) Anyway, no big deal...
Philippe Normand
Comment 11
2012-05-30 10:52:43 PDT
(In reply to comment #10)
> Maybe someone clicked on the fancy "Submit for EWS analysis" button then :) Anyway, no big deal...
wrong bug, sorry
Philippe Normand
Comment 12
2012-06-07 13:31:52 PDT
Maybe one way to proceed would be to use the GstSurfaceBuffer, like the Clutter video-sink. This API is in -bad though and its future seems uncertain, I wonder if we should target only gst 0.11/1.0 support and forget about 0.10 :)
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-libs/html/gst-plugins-bad-libs-GstSurfaceBuffer.html
Simon Hausmann
Comment 13
2012-06-11 02:03:07 PDT
(In reply to comment #12)
> Maybe one way to proceed would be to use the GstSurfaceBuffer, like the Clutter video-sink. This API is in -bad though and its future seems uncertain, I wonder if we should target only gst 0.11/1.0 support and forget about 0.10 :)
>
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-libs/html/gst-plugins-bad-libs-GstSurfaceBuffer.html
That looks great. Screw old versions :)
Simon Hausmann
Comment 14
2012-06-26 05:33:15 PDT
Created attachment 149515 [details]
Patch
Simon Hausmann
Comment 15
2012-06-26 05:34:42 PDT
(In reply to comment #14)
> Created an attachment (id=149515) [details]
> Patch
This is a first shot at an implementation that uses GstSurfaceBuffer and SurfaceBufferConverter. I didn't mark it for review yet because there are still bugs to iron out:
* Sometimes the decoder gets stuck and stops delivering new frames.
* Shutdown is "crashy" :)
Nevertheless any feedback is welcome!
Noam Rosenthal
Comment 16
2012-06-26 06:17:36 PDT
Comment on attachment 149515 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149515&action=review
Awesome!
> Source/WebCore/WebCore.pri:203
> + contains(DEFINES, WTF_USE_GSTREAMER_SURFACEBUFFER=1): PKGCONFIG += gstreamer-basevideo-0.10
I'd probably use HAVE(GSTREAMER_SURFACEBUFFER) instead of USE;
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1642
> +#if USE(ACCELERATED_COMPOSITING) && USE(GSTREAMER_SURFACEBUFFER) && USE(TEXTURE_MAPPER_GL)
Let's just go with USE(TEXTURE_MAPPER_GL)
Simon Hausmann
Comment 17
2012-06-26 06:28:46 PDT
Comment on attachment 149515 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149515&action=review
>> Source/WebCore/WebCore.pri:203
>> + contains(DEFINES, WTF_USE_GSTREAMER_SURFACEBUFFER=1): PKGCONFIG += gstreamer-basevideo-0.10
>
> I'd probably use HAVE(GSTREAMER_SURFACEBUFFER) instead of USE;
Ahh right. I always confuse ENABLE, USE and HAVE.
>> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1642
>> +#if USE(ACCELERATED_COMPOSITING) && USE(GSTREAMER_SURFACEBUFFER) && USE(TEXTURE_MAPPER_GL)
>
> Let's just go with USE(TEXTURE_MAPPER_GL)
Alright.
Simon Hausmann
Comment 18
2012-06-26 06:31:13 PDT
Created attachment 149522 [details]
Patch
Noam Rosenthal
Comment 19
2012-06-26 06:34:34 PDT
Comment on attachment 149522 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149522&action=review
I only have nitpicks, maybe someone with more GStreamer experience has comments of value :) The TextureMapper stuff looks good.
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1688
> + if (m_buffer && GST_IS_SURFACE_BUFFER(m_buffer)) {
> +     GstSurfaceBuffer* surface = GST_SURFACE_BUFFER(m_buffer);
> +
> +     GstCaps* caps = GST_BUFFER_CAPS(m_buffer);
> +     GstVideoFormat format;
> +     int width, height;
> +     gst_video_format_parse_caps(caps, &format, &width, &height);
> +     IntSize size(width, height);
> +
> +     if (!m_texture)
> +         m_texture = textureMapper->createTexture();
> +
> +     if (m_texture->size() != size) {
> +         m_texture->reset(size);
> +         m_surfaceConverter.clear();
> +     }
> +
> +     if (!m_surfaceConverter) {
> +         GValue textureValue = G_VALUE_INIT;
> +         g_value_init(&textureValue, G_TYPE_UINT);
> +         g_value_set_uint(&textureValue, static_cast<BitmapTextureGL*>(m_texture.get())->id());
> +
> +         m_surfaceConverter.set(gst_surface_buffer_create_converter(surface, "opengl", &textureValue));
> +         g_return_if_fail(m_surfaceConverter);
> +     }
> +
> +     gst_surface_converter_upload(m_surfaceConverter.get(), surface);
> +
> +     gst_buffer_unref(m_buffer);
> +     m_buffer = 0;
> + }
Maybe this can live in its own function, e.g. convertBufferToTextureIfNeeded
Philippe Normand
Comment 20
2012-06-26 07:26:05 PDT
FYI I recently started work on this again, as well. Not using SurfaceBuffer yet though as I'm uncertain about its future in GStreamer 1.0
Simon Hausmann
Comment 21
2012-06-26 07:30:25 PDT
(In reply to comment #20)
> FYI I recently started work on this again, as well. Not using SurfaceBuffer yet though as I'm uncertain about its future in GStreamer 1.0
What's the new interface called?
Philippe Normand
Comment 22
2012-06-26 08:19:17 PDT
(In reply to comment #21)
> (In reply to comment #20)
> > FYI I recently started work on this again, as well. Not using SurfaceBuffer yet though as I'm uncertain about its future in GStreamer 1.0
>
> What's the new interface called?
There's no exact plan about keeping GstSurfaceBuffer in 1.0 or not.
Simon Hausmann
Comment 23
2012-06-26 08:38:20 PDT
(In reply to comment #22)
> > What's the new interface called?
>
> There's no exact plan about keeping GstSurfaceBuffer in 1.0 or not.
But there has to be an alternative in the works, right? Otherwise the clutter-gst video sink will also regress.
Philippe Normand
Comment 24
2012-06-26 09:12:34 PDT
(In reply to comment #23)
> But there has to be an alternative in the works, right? Otherwise the clutter-gst video sink will also regress.

The clutter-gst video-sink might need an update, if GstVideoSurfaceBuffer changes, I guess :) There's a very interesting draft about the hw-accel plans in 1.0 here:
http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-hw-acceleration.txt
Simon Hausmann
Comment 25
2012-06-26 09:59:05 PDT
(In reply to comment #24)
> The clutter-gst video-sink might need an update, if GstVideoSurfaceBuffer changes, I guess :)
Right.
> There's a very interesting draft about the hw-accel plans in 1.0 here:
> http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-hw-acceleration.txt
Ohh, that is interesting indeed. I'll try to follow what's going on there and continue to polish this patch. I have an idea what's causing the issue I'm currently seeing, so I can't give up ;)
Philippe Normand
Comment 26
2012-06-26 10:24:53 PDT
Comment on attachment 149522 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149522&action=review
If that patch lands it'd be great if it doesn't break the GStreamer 0.11 WebKit build :) There's no EWS for this setup yet though.
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1665
> + GstCaps* caps = GST_BUFFER_CAPS(m_buffer);
> + GstVideoFormat format;
> + int width, height;
> + gst_video_format_parse_caps(caps, &format, &width, &height);
> + IntSize size(width, height);
Please use getVideoSizeAndFormatFromCaps()
Philippe Normand
Comment 27
2012-06-26 20:47:06 PDT
Simon, which distro do you use? There's no gst-plugins-bad pkg shipping gstreamer-basevideo-0.10 in Debian unstable, I can't test your patch :) For 0.11, the GstSurfaceBuffer/Converter is currently implemented as buffer metadata btw.
Simon Hausmann
Comment 28
2012-06-26 23:09:22 PDT
(In reply to comment #27)
> Simon, which distro do you use? There's no gst-plugins-bad pkg shipping gstreamer-basevideo-0.10 in Debian unstable, I can't test your patch :)

I'm using (K)Ubuntu 12.04, which is shipping libgstreamer-plugins-bad0.10-dev 0.10.22.3-2ubuntu2
> For 0.11, the GstSurfaceBuffer/Converter is currently implemented as buffer metadata btw.
Ah yeah, I noticed. I think if we end up landing this patch in the trunk, we should either use the one or the other API. I'd be happy to use 0.11 myself, if I can find convenient packages somewhere.
Philippe Normand
Comment 29
2012-06-27 19:30:42 PDT
(In reply to comment #28)
> I'm using (K)Ubuntu 12.04, which is shipping libgstreamer-plugins-bad0.10-dev 0.10.22.3-2ubuntu2
Oh right, I got it now. Managed to test your patch in WebKitGTK but I see no video at all, the video rectangle is correctly sized though and the controls are visible. Haven't debugged this yet. Anyway, thanks a lot for the patch Simon! I'm not sure about using this interface unless its future gets more clear :)
Simon Hausmann
Comment 30
2012-06-27 21:52:03 PDT
(In reply to comment #29)
> Oh right, I got it now. Managed to test your patch in WebKitGTK but I see no video at all, the video rectangle is correctly sized though and the controls are visible. Haven't debugged this yet.
Hm, that's odd. I'm using an nvidia card and the vdpau-va driver to be able to use gst-vaapi, which seems to be the only decoder that currently supports this interface. The problem I'm seeing is that at some point the video sink in WebKit doesn't receive any frames anymore, until I seek. It feels like something is stuck. Oh, the joy of multi-threading :)
> Anyway, thanks a lot for the patch Simon! I'm not sure about using this interface unless its future gets more clear :)
I totally understand. I'll be OOO for a while, but I'd like to pick it up again when I'm back. I don't even mind if the patch continues to just live in Bugzilla in the end, but I want to get it working properly first. I do see one problem with the interface, and that's that I feel it would make proxying tedious. Imagine an architecture like OMAP, where there's a good chance that you already have a gst decoder that can use the DSP to decode efficiently into memory. At the same time you have an existing kernel interface for allocating contiguous framebuffer memory the GPU can use (OMAPFB_SETUP_MEM), and your EGL implementation supports binding such a piece of memory to a texture somehow. In such an architecture it feels cumbersome to implement a proxying element that allocates the GstBuffers backed by FB memory, lets the DSP decoding codec element write into those, and then does essentially free conversion to a texture. In order to completely transparently implement the GstSurfaceBuffer interface it would _have_ to make an unnecessary copy of the buffer to hide the fact that the GPU can render the YUV surface natively but requires the entity drawing (the application) to use the GL_OES_EGL_image_external extension in the fragment shader. I'm looking forward to what the gst folks will come up with :)
Simon Hausmann
Comment 31
2012-07-11 02:00:59 PDT
I've been looking a bit at VDPAU, which Mesa also might get support for. NVIDIA has some really nice GL extensions that allow conveniently binding a VDPAU video surface to a texture. If the gstreamer vdpau sink had a way of exposing the vdpau output surface as a property and working without an x-window, that would be a nice alternative. We could use it as the video sink, retrieve the vdp output surface pointer, bind it to a texture with the GL extensions, and be pretty much done.
Philippe Normand
Comment 32
2012-07-11 03:31:31 PDT
As a first step, would it be possible to "simply" make our video sink upload to a GL texture, without any specifics about HW decoding and implications for specific GPU chipsets/drivers?
Simon Hausmann
Comment 33
2012-07-11 05:24:33 PDT
(In reply to comment #32)
> As a first step would it be possible to "simply" make our video sink upload to GL texture, without any specifics about HW-decoding and implication on specific GPU chipset/driver?
I don't think that would buy us anything, our pipeline would be as slow as it is today. I'm interested in a solution that gives the best possible performance. If it's restricted to one type of video acceleration API initially, then that's fine. Having one _properly_ accelerated code path is great for demonstrating the required interfaces and benchmarking new approaches. It also makes it easier to then iterate maybe with the gstreamer folks towards an interface that is more generic.
Martin Robinson
Comment 34
2012-07-11 11:05:54 PDT
(In reply to comment #33)
> I don't think that would buy us anything, our pipeline would be as slow as it is today.
One benefit is that color space conversion could be moved to a shader.
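A conversion like the one Martin suggests would typically use the BT.601 "limited range" matrix. Below is a minimal CPU-side sketch of the math a fragment shader would perform per texel; the function names are illustrative, not WebKit or GStreamer API:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Clamp an intermediate value into the 0..255 byte range, rounding to nearest.
static inline uint8_t clamp8(double v)
{
    return static_cast<uint8_t>(std::lround(std::min(255.0, std::max(0.0, v))));
}

struct RGB { uint8_t r, g, b; };

// BT.601 limited-range YUV -> RGB: the same affine transform a fragment
// shader would apply per texel if the sink handed over raw I420/AYUV data
// instead of pre-converted BGRx.
RGB yuvToRgb601(uint8_t y, uint8_t u, uint8_t v)
{
    const double yn = 1.164 * (y - 16);
    const double un = u - 128.0;
    const double vn = v - 128.0;
    return RGB {
        clamp8(yn + 1.596 * vn),              // R
        clamp8(yn - 0.392 * un - 0.813 * vn), // G
        clamp8(yn + 2.017 * un),              // B
    };
}
```

In GLSL this collapses to a single mat3 multiply plus an offset per fragment, which is why moving the conversion to the GPU is attractive even without hardware decoding.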
Simon Hausmann
Comment 35
2012-10-08 01:46:02 PDT
Not actively working on this patch anymore. I would love to pick it up again, as it seems the latest GStreamer 1.0 sources have received fixes that may improve the reliability of this interface and make it usable, but I'm prioritizing other issues over this currently.
arno.
Comment 36
2012-10-11 10:40:13 PDT
(In reply to comment #35)
> Not actively working on this patch anymore. Would love to pick it up again, as it seems the latest gstreamer 1.0 source have received fixes that may improve the reliability of this interface and maybe make it usable, but I prioritize other issues over this currently.

Hi, from what I understand from http://blogs.gnome.org/uraeus/2012/09/26/gstreamer-1-0-released/:

> The changes will also make it a lot easier to write plugins and use plugins on PC platforms which use the GPU for decoding or encoding and that use OpenGL for rendering the playback

it is currently not possible to get an OpenGL texture from gstreamer-1.0 yet (although it could be possible in the future). Is there some way we can take advantage of gstreamer now to accelerate video?

(In reply to comment #34)
> One benefit is that color space conversion could be moved to a shader.

Do you suggest we ask gstreamer to send the buffer data as YUV, and perform the conversion ourselves? But then, how can we evaluate the performance win?
arno.
Comment 37
2012-10-11 10:46:49 PDT
Created attachment 168253 [details]
patch based on simon's one

Here is a wip patch I've been working on. It just uploads a texture to the textureMapper and applies a shader to it (which does nothing, but may be used if we decide to perform color space conversions). It is mainly simon's patch with the following changes:
* set needsDisplay on the layer in RenderLayerBacking; otherwise, the rendering was not updated when new frames were available.
* update the player in the renderer when the source changed, to avoid a crash when changing video.src.
Simon Hausmann
Comment 38
2012-10-12 01:47:16 PDT
(In reply to comment #36)
> it is currently not possible to get an opengl texture from gstreamer-1.0 yet (although it could be possible in the future). Is there some way we can take advantage of gstreamer now to accelerate video ?
Well, I think what the blog is referring to is the fact that there's no stable supported API yet. However there exists an unstable API and I think the best way to make that unstable API stable is to use it, provide feedback if it needs changes and otherwise report back to the gstreamer folks that it's working great (in case it is), so that they can get confidence in it and declare it stable. That API is now called GstSurfaceMeta (see gstreamer-1.0/gst/video/gstsurfacemeta.h).
Simon Hausmann
Comment 39
2012-10-12 01:51:08 PDT
(In reply to comment #34)
> One benefit is that color space conversion could be moved to a shader.
Sure, but that's a halfway-there solution only. Other frameworks (like Stagefright on Android, AVFoundation on OS X and Media Foundation on Windows) demonstrate nicely that all this belongs behind a curtain that WebKit shouldn't have to be concerned about.
Martin Robinson
Comment 40
2012-10-12 08:23:37 PDT
(In reply to comment #39)
> > One benefit is that color space conversion could be moved to a shader.
>
> Sure, but that's a halfway-there solution only. Other frameworks (like Stagefright on Android, AVFoundation on OS X and Media Foundation on Windows) demonstrate nicely that all this belongs behind a curtain that WebKit shouldn't have to be concerned about.
Sure, in an ideal world and for the future, but we can also make things faster for people not using the bleedingest edge versions of Gstreamer.
arno.
Comment 41
2012-10-12 11:04:56 PDT
(In reply to comment #38)
> That API is now called GstSurfaceMeta (see gstreamer-1.0/gst/video/gstsurfacemeta.h).
I'm not successful in using it:
- the documentation is obsolete. It references #GST_VIDEO_CAPS_SURFACE, which is no longer defined.
- I've tried to set the GST_STATIC_PADS to "video/x-surface, opengl=true" or "video/x-surface, type=vaapi", and the video does not render (the render sink callback is not called).
- in the gstreamer sources, x-surface is only checked for subtitle overlays.
arno.
Comment 42
2012-10-12 11:10:51 PDT
(In reply to comment #36)
> do you suggest we ask gstreamer to send the buffer data as yuv, and perform the conversion ourselves. But so, how can we evaluate the performance win ?
I made some tests: I applied my patch, then asked gstreamer for a planar YUV format and performed the conversion in the shader. With a few videos on a page, GtkLauncher seemed less responsive with accelerated compositing than without. So unless there is a better way to achieve that, it seems like it decreases performance. I do not have numbers to back that claim though.
Martin Robinson
Comment 43
2012-10-12 11:33:39 PDT
(In reply to comment #42)
> I made some tests: I applied my patch, then asked gstreamer for a planar YUV format and performed the conversion in the shader. With a few videos on a page, GtkLauncher seemed less responsive with accelerated compositing than without. So unless there is a better way to achieve that, it seems like it decreases performance. I do not have numbers to back that claim though.
Could it be that you are uploading and downloading to the GPU more than you think? What happens if you turn the shader off and just render the YUV data directly? Is accelerated compositing slow for you in general?
arno.
Comment 44
2012-10-12 13:24:13 PDT
Created attachment 168470 [details]
fixes a memory leak

(In reply to comment #43)
> Could it be that you are uploading and downloading to the GPU more than you think? What happens if you turn the shader off and just render the YUV data directly? Is accelerated compositing slow for you, in general?
There was indeed a problem with the patch. It did not unref the gstreamer buffer properly, and a lot of memory leaked. Here is an updated patch. With it, there is no noticeable difference between requesting xrgb or yuv format.
arno.
Comment 45
2012-10-24 11:53:37 PDT
(In reply to comment #40)
> Sure, in an ideal world and for the future, but we can also make things faster for people not using the bleedingest edge versions of Gstreamer.
Hi, I could indeed measure a performance win when performing the color conversion in a shader. I measured gst_video_convert_transform_frame execution time in gstreamer-1.0 for a 480x360 video:
- color conversion from I420 to BGRx takes between 2000 and 6000 µs (2 to 6 ms)
- color conversion from I420 to AYUV takes between 300 and 500 µs
- the current paint method takes between 400 and 2000 µs
Without a shader, paintTextureMapper takes between 400 and 700 µs. With an AYUV -> BGRx shader, paintTextureMapper takes between 600 and 800 µs, except the first time, where it takes about 12000 µs (12 ms). So it looks like with the current gstreamer implementation we have some win by performing the conversion ourselves. But I'm concerned about doing that: what will happen when gstreamer changes? For example, by implementing the plugins which will allow us to get the meta surface, or by optimizing the color conversion in some way. Does that mean we would have to perform a version check and have two different code paths then? Alternatively, it looks like we still have a (small) gain just by using the texture mapper without a shader.
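Taking the millisecond figures above at face value, the two candidate paths can be compared against a 30 fps frame budget. This is plain arithmetic over the reported worst-case numbers, not WebKit code:

```cpp
// Worst-case per-frame cost of each path, in milliseconds, using the
// numbers measured above for a 480x360 video.
struct PathCost { double convertMs; double paintMs; };

// I420 -> BGRx in gstreamer, then a plain texture paint.
constexpr PathCost softwarePath { 6.0, 2.0 };
// I420 -> AYUV in gstreamer, then an AYUV -> BGRx shader paint.
constexpr PathCost shaderPath { 0.5, 0.8 };

constexpr double totalMs(PathCost p) { return p.convertMs + p.paintMs; }

// ~33.3 ms available per frame at 30 fps.
constexpr double frameBudgetMs = 1000.0 / 30.0;
```

Both paths fit the budget for a single small video, but the shader path's worst case is several times cheaper, which matches the observed win with multiple videos on one page.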
Zhao, Halley
Comment 46
2012-10-26 02:28:13 PDT
I just started to look at WebKit, so please correct me if there is any misunderstanding. Here are some concerns from me about the patch:

1. Painting video to a graphics layer by drawTexture() is heavy.
I expect (and think it is so) that video is the only RenderObject in its graphics layer, and does NOT even share the same graphics layer with the player controls; otherwise we composite each video frame twice: once into the backing surface of the graphics layer, and again in the final compositing of each graphics layer. Compositing includes glClear/drawTexture/swapBuffer, and it is really heavy. How about the following suggestion: set the video texture on the graphics layer by bindSurface() and skip (or do nothing in) beginPainting/endPainting/drawTexture. I had thought about using a cairo GL surface for video before, since the sw path of gtk/efl uses cairo. Do you think it is possible to create a simple graphics layer with one texture only (without the complexity of TextureMapper)?

2. There are both ::paint and ::paintToTextureMapper in MediaPlayerPrivateGStreamer; they may conflict.
I don't know where ::paint inherits from, but when TextureMapperPlatformLayer is used, should we avoid inheriting the interface that introduces the ::paint function? Should we only use ::paint in the sw path and ::paintToTextureMapper in the ACCELERATED_COMPOSITING path?

3. Creating an empty video texture (m_texture) and then uploading video data to it introduces a data copy.
I'd like to create the texture from an existing native image, e.g. with glEGLImageTargetTexture2DOES() from an existing EGLImage. How can I configure WebKit to use EGL/GLES instead of GLX/GL? For the Qt port it may be decided inside Qt; I care more about the gtk/efl ports. Thanks.
Simon Hausmann
Comment 47
2012-10-26 02:58:22 PDT
(In reply to
comment #46
)
> I just start to look at webkit, please correct me if there is any misunderstanding. > here are some concern from me for the patch: > > 1. paint video to graphics layer by drawTexture() is heavy > I expect (and think it is) that video is the only RenderObject in a graphics layer, even NOT share the same graphics layer with player controls; otherwise we composite each video frame twice. one is in the backing surface of graphics layer, another is for final compositing of each graphics layer. > compositing includes glClear/drawTexture/swapBuffer, it is really heavy. > how about the following suggestion: > set video texture to graphics layer by bindSurface() and skip (or do nothing) in beginPainting/endPainting/drawTexture
That is kind of what is happening. Each graphics layer has a corresponding TextureMapperPlatformLayer. When the texture mapper renders the scene, it asks each platform layer to paint itself (this is a somewhat simplified model). So if you have an html document with a video element and a bunch of controls on top, you should end up with three platform layers:
(1) The background / regular content is cached in tile textures as part of a TextureMapperTiledBackingStore (which is a sub-class of TextureMapperPlatformLayer). Its paintToTextureMapper draws the tile textures.
(2) The video gets its own TextureMapperPlatformLayer; that's what MediaPlayerPrivateGStreamer is and implements. So after painting the tiles of the static content, it's up to MediaPlayerPrivateGStreamer to draw the video texture.
(3) Lastly, the controls on top are cached in another layer (I think usually also a TextureMapperTiledBackingStore) and are drawn in one shot.
So for an entire scene there is no double composition; the video frames are not rendered into an intermediate buffer. (This is the WK1 TextureMapperGL model. With WK2, video is not implemented yet and might either get an intermediate GraphicsSurface or, if the platform supports it, perhaps we could also avoid the extra composition.)
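The three-layer walk described above can be modeled with a toy sketch. Class and method names here are simplified stand-ins (the real interface is TextureMapperPlatformLayer::paintToTextureMapper()); the point is only that each layer draws itself once, back to front, with no intermediate surface for the video:

```cpp
#include <string>
#include <vector>

// Toy model of the compositing pass: the texture mapper walks the layer
// list once and asks each platform layer to paint itself straight into
// the back-buffer.
struct PlatformLayer {
    virtual ~PlatformLayer() = default;
    virtual std::string paint() const = 0;
};

struct TiledBackingStoreLayer : PlatformLayer {
    std::string paint() const override { return "tiles"; }
};

struct VideoLayer : PlatformLayer {
    // In the real code, MediaPlayerPrivateGStreamer draws its single
    // video texture here.
    std::string paint() const override { return "video-texture"; }
};

std::vector<std::string> renderScene(const std::vector<const PlatformLayer*>& layers)
{
    std::vector<std::string> drawn;
    for (const auto* layer : layers)
        drawn.push_back(layer->paint()); // one draw per layer, back to front
    return drawn;
}
```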
> I had thought to use cairo gl surface for video before, since sw path of gtk/efl uses cairo. do you think it is possible to create a simple graphics layer with one texture only(without the complexity of TextureMapper)?
I think that's what we do have today (well, not really, but could :)
> 2. There is both ::paint and ::paintToTextureMapper in MediaPlayerPrivateGStreamer, they may conflict > I don’t know where ::paint inherits from, but when TextureMapperPlatformLayer is used, should we avoid inherit the interface introduce ::paint function. > should we only use ::paint in sw path and ::paintToTextureMapper in ACCELERATED_COMPOSITING path?
::paint() will still be called when we're in the accelerated compositing code path, but we do an early return from there if we are, so no extra pixels are painted. I don't remember exactly where it is called from, it might be the population of the root layer? I've seen it get called at least, but the early return should catch it.
> 3. Create an empty video texture (m_texture), then upload video data to this texture introduce data copy > I’d like create texture from existing native image like: glEGLImageTargetTexture2DOES() from an existing EGLImage > how could I configure webkit to use EGL/GLES instead of GLX/GL? for Qt port, it may decided inside qt. I cares more for gtk/efl port.
This may be a missing feature in BitmapTextureGL in the texture mapper I would say. But conceptually it sounds like what we should indeed aim for.
Zhao, Halley
Comment 48
2012-10-26 03:11:59 PDT
I run into SoftwareMode in TextureMapper::create(), and it shows nothing, since the hw video frame doesn't match the sw path. I use Ubuntu 12.04 with the KDE desktop and build WebKit with "./build-webkit --qt"; my Qt version is 4.8, installed from the default repo. What is wrong with my system or my build configuration? Thanks.
Zhao, Halley
Comment 49
2012-10-26 03:27:05 PDT
(In reply to
comment #47
)
> (2) The video gets its own TextureMapperPlatformLayer, that's what MediaPlayerPrivateGStreamer is and implements. So after painting the tiles of the static content, it's up to the MediaPlayerPrivateGStreamer to draw the video texture.
Do you mean beginPainting and endPainting will NOT be called for each video frame? If yes, that's great. Since I run into SoftwareMode on my platform, I don't know exactly how the data flows. I just read in the source code that textureMapper->drawTexture (in MediaPlayerPrivateGStreamer::paintToTextureMapper) will call into TextureMapperGL::drawTextureQuadWithProgram(); I think glClear() is required before the draw call in it.
Simon Hausmann
Comment 50
2012-10-26 03:30:19 PDT
(In reply to
comment #49
)
> (In reply to
comment #47
) > > (2) The video gets its own TextureMapperPlatformLayer, that's what MediaPlayerPrivateGStreamer is and implements. So after painting the tiles of the static content, it's up to the MediaPlayerPrivateGStreamer to draw the video texture. > > do you mean beginPainting and endPainting will NOT called for each video frame? > if yes, that's great.
I'm sure it's called for each _screen visible_ frame, i.e. every time you want to re-render the entire WebKit scene onto the screen. I don't see a way around it though; it's not specific to video frames, is it? I mean, as the video plays, other content needs updating, too (although maybe only every second). Can you elaborate on how your ideal rendering pass would look?
Zhao, Halley
Comment 51
2012-10-26 05:15:43 PDT
> I'm sure it's called for each _screen visible_ frame. Every time you want to re-render the entire WebKit scene onto the screen. > > I don't see a way around it though, it's not specific to video frames, isn't it? I mean, as the video players other content needs updating, too (although only every second). > > Can you elaborate how your ideal rendering pass would look like?
Eventually, we just need one texture for each graphics layer, so the 3D pipeline (glClear/drawTexture/swapBuffer) isn't necessary when we already have such a texture. For example: we could use bindSurface only, with the video texture as input; is there any gap in that? If we reach the above solution, it means TextureMapper is heavy for us, and we could create a simpler class instead. Compared to the controls, the video layer is larger and updates more frequently, so it is really expensive to do glClear and copy the texture.
Simon Hausmann
Comment 52
2012-10-26 05:25:47 PDT
(In reply to
comment #51
)
> > I'm sure it's called for each _screen visible_ frame. Every time you want to re-render the entire WebKit scene onto the screen. > > > > I don't see a way around it though, it's not specific to video frames, isn't it? I mean, as the video players other content needs updating, too (although only every second). > > > > Can you elaborate how your ideal rendering pass would look like? > > eventually, we just need one texture for each Graphics Layer.
That is what we have. Actually, regular content layers usually have multiple textures because of tiling. But video for example has one texture only. That texture is an implementation detail of MediaPlayerPrivateGStreamer as implementation of TextureMapperPlatformLayer.
> so 3D pipeline (glClear/drawTexture/swapBuffer) isn't necessary when we have such texture alerady. for example: we can use bindSurface only with video texture as input. any gap for it? > > if we reach the above solution, it means TextuerMapper is heavy for us, we can create a simple class for it. > > comparing to controls, video layer is larger and higher update frequency, it is really expensive to do glClear and copy the texture.
But we don't copy the texture; we only draw it once, onto the screen (into the back-buffer). Can you elaborate on what you mean by bindSurface()? I think I must be misunderstanding something here. Note that the video texture is _not_ copied into an intermediate surface; it is copied straight onto the screen.
Zhao, Halley
Comment 53
2012-10-26 06:46:08 PDT
Sorry, I misread the patch. "textureMapperGL->bindSurface(m_texture.get());" uses the video texture for the video's graphics layer. "textureMapper->drawTexture(*m_texture.get(), targetRect, matrix, opacity, mask);" from your explanation, I understand that this happens in the 2nd-phase compositing -- I had thought it happened in the 1st-phase compositing before. Thanks for your detailed explanation.
Zhao, Halley
Comment 54
2012-10-31 00:08:12 PDT
Is it possible to use a YUV texture in the 2nd-phase compositing? The advantage is that we needn't convert YUV to RGB before the 2nd-phase compositing (something like saving one memcpy). The disadvantage is that we have to introduce some complexity (some details of YUV textures) into TextureMapperGL.
Simon Hausmann
Comment 55
2012-10-31 02:42:16 PDT
(In reply to
comment #54
)
> Is it possible to use YUV texture in 2nd phase compositing? > > the advantage is that we needn't convert YUV to RGB before 2nd phase compositing, (something like reduce memcpy once). > > the disadvantage is that we have to introduce some complexity ( some detail of YUV texture) in TextureMapperGL.
I think that would be a reasonable extension to TextureMapperGL and I think we need it anyway (despite being SoC/GPU specific). Of course the GStreamer interface doesn't specify anything there right now, so I think it will require some extensions. Perhaps as simple as the GstSurfaceConverter implementation providing a function for compiling the fragment shader needed for rendering the texture.
Zhao, Halley
Comment 56
2012-11-02 02:59:03 PDT
Created
attachment 172019
[details]
support gst-vaapi to upload vaSurface to webkit texture
This revises Simon's patch with fixes (sharing the X Display between WebKit and the gst pipeline). It runs on my Sandy Bridge PC; however, playback stops after several seconds. It is not my target, since gst-vaapi uses TFP and an FBO to support gst_surface_converter_upload(), which amounts to something like 3 memory copies.
Next, I will work on the following proposal:
1. gst-vaapi exports a 'handler' for the video surface
2. an EGLImage can be created from this handler by eglCreateImageKHR()
3. a texture is created from the above EGLImage by glEGLImageTargetTexture2DOES()
Some things that are not certain:
a) Should step 2 be done in gst-vaapi? pros: EGLImage is generic, WebKit will be happy. cons: the EGL context would have to be shared between WebKit and gst-vaapi, which increases the complexity.
b) Should we export YUV (indicated by the 'handler') from gst-vaapi? pros: it is better for performance, as the driver needn't convert YUV to RGB before exporting the buffer. cons: complexity would be introduced on the WebKit side: detailed info about the YUV buffer layout and a corresponding shader to render such a YUV texture. Anyway, I will try RGB first; my Sandy Bridge can convert a 720p video frame to RGB format at ~400fps.
I'm not familiar with WebKit; could someone give clues on adding glEGLImageTargetTexture2DOES? And how can I make sure EGL/GLES is used instead of GLX/OpenGL?
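Steps 2 and 3 of this proposal would roughly look like the following C++ sketch. The extension entry points (eglCreateImageKHR, glEGLImageTargetTexture2DOES) are real, but the glue around them, in particular the EGL_NATIVE_PIXMAP_KHR target and the shape of the exported handle, is an untested assumption and would depend on the driver:

```cpp
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// Sketch only: wraps an exported surface handle in an EGLImage (step 2)
// and binds it to a GL texture with no pixel copy (step 3).
GLuint textureFromSurfaceHandle(EGLDisplay display, EGLClientBuffer handle)
{
    PFNEGLCREATEIMAGEKHRPROC createImage =
        reinterpret_cast<PFNEGLCREATEIMAGEKHRPROC>(eglGetProcAddress("eglCreateImageKHR"));
    PFNGLEGLIMAGETARGETTEXTURE2DOESPROC imageTargetTexture =
        reinterpret_cast<PFNGLEGLIMAGETARGETTEXTURE2DOESPROC>(
            eglGetProcAddress("glEGLImageTargetTexture2DOES"));
    if (!createImage || !imageTargetTexture)
        return 0;

    // The target enum depends on what the driver exports; EGL_NATIVE_PIXMAP_KHR
    // here is an assumption.
    EGLImageKHR image = createImage(display, EGL_NO_CONTEXT, EGL_NATIVE_PIXMAP_KHR, handle, nullptr);
    if (image == EGL_NO_IMAGE_KHR)
        return 0;

    GLuint texture = 0;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    imageTargetTexture(GL_TEXTURE_2D, image); // no upload, the image backs the texture
    return texture;
}
```

This only compiles against EGL/GLES headers and needs a live EGL display and context to run, so it is presented as an untested illustration of the proposed flow.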
Simon Hausmann
Comment 57
2012-11-02 03:13:05 PDT
(In reply to
comment #56
)
> Created an attachment (id=172019) [details] > support gst-vaapi to upload vaSurface to webkit texture > > revise Simon's patch with fixes(share the X Display between webkit and gst pipeline), > it runs on my sandy bridge PC; however, playback stops after several seconds.
I've observed the same :)
> it it not my target since there is TFP and FBO in gst-vaapi to support gst_surface_converter_upload(), > it is something like 3 times memory copy. > > next, I will work on the following proposal: > 1. gst-vaapi export a 'handler' for video surface > 2. a EGLImage can be create from this handler by eglCreateImageKHR() > 3. texture is created from the above EGLImage by glEGLImageTargetTexture2DOES() > > something are not certain are: > a) should the step 2 be done in gst-vaapi? > pros: EGLImage is generic, webkit will be happy > cons: EGL context should be shared between webkit and gst-vaapi, it increases the complexity
What if WebKit provides the EGL context but gst-vaapi creates its own context that is initialized to _share_ resources with the one provided by WebKit (or the app in general)?
> b) should we export YUV (indicated by the 'handler') from gst-vaapi? > pros: it is better for performance, driver needn't convert YUV to RGB before export the buffer. > cons: complexity will be introduced on webkit side: detail info of the YUV buffer layout and corresponding shader to render such YUV texture. > anyway, I will try RGB first. my SandyBridge can convert 720p video frame to RGB format at ~400fps
I wonder if it would be possible to have API in the gst_surface_converter interfaces that would allow for the app/webkit to call into the implementation (gst-vaapi) to create (compile) the fragment shader required for rendering the texture? Then WebKit could call that function when it needs to. Another approach would be for gst-vaapi to provide the shader as a string, but I don't think that's as clean.
> I'm not familiar with webkit, > could someone give clues on adding glEGLImageTargetTexture2DOES?
Would this be actually needed in WebKit if gst-vaapi continues to support as texture as interface instead of EGL image? Then again, I suppose one argument in favour of using EGL images instead of texture is that OpenMax IL has IIRC an extension that allows for binding a video surface to an EGL image. So maybe that would make the gstreamer interface more useable in the long run.
> and how could I make sure EGL/GLES is used instead of GLX/OGL?
If you decide to use the Qt port for testing/developing it, just make sure Qt was configured with -opengl es2. You might also want to use wayland as windowing system, i.e. build the qtwayland module.
Zhao, Halley
Comment 58
2012-11-07 02:23:49 PST
(In reply to
comment #57
)
> (In reply to
comment #56
) > > Created an attachment (id=172019) [details] [details] > > What if WebKit provides the EGL context but gst-vaapi creates its own context that is initialized to _share_ > resources with the one provided by WebKit (or the app in general)?
Indeed, EGLClientBuffer is a void*, so gst-vaapi can export it without linking to EGL; I will try that first. Your suggestion looks good; eglInitialize/eglTerminate/eglCreateContext are still required, right? How about we just pass the EGLDisplay/EGLContext from WebKit to gst-vaapi?
> > I wonder if it would be possible to have API in the gst_surface_converter interfaces that would allow for > the app/webkit to call into the implementation (gst-vaapi) to create (compile) the fragment shader required for rendering > the texture? Then WebKit could call that function when it needs to. > > Another approach would be for gst-vaapi to provide the shader as a string, but I don't think that's as clean. >
If there is an implementation for sw-decoded YUV frames, gst-vaapi can try to follow it. In my opinion, it is not so important to support YUV textures via a GL shader; many drivers support it inside the 3D driver. (Though the Intel driver doesn't, that should be a push to us; on the other hand, we can convert YUV to RGB before exporting the buffer, which doesn't hit performance much.)
> Would this be actually needed in WebKit if gst-vaapi continues to support as > texture as interface instead of EGL image?
A texture depends on an EGL context and a GL context, and the choice between OpenGL and GLES is decided at runtime in WebKit. So I hesitate to support a texture as the interface.
> > Then again, I suppose one argument in favour of using EGL images instead of texture is that OpenMax IL has IIRC an extension > that allows for binding a video surface to an EGL image. So maybe that would make the gstreamer interface more useable in the long run. >
I also favour EGLImage; I'll just postpone it for several days.
> > If you decide to use the Qt port for testing/developing it, just make sure > Qt was configured with -opengl es2. You might also want to use wayland as > windowing system, i.e. build the qtwayland module.
Now I'm using the GTK port on Ubuntu 12.04.
Zhao, Halley
Comment 59
2012-11-07 02:35:12 PST
Created
attachment 172745
[details]
temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES; it fails to compile. I met a strange issue: compilation fails if I try to include GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp: "MediaPlayerPrivateGStreamer.cpp:321:35: error: expected unqualified-id before numeric constant". It seems to be a namespace issue, but I don't know where it breaks. You can reproduce the issue by simply including GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp.
Philippe Normand
Comment 60
2012-11-08 02:22:30 PST
(In reply to
comment #59
)
> Created an attachment (id=172745) [details] > temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES, fail to compile > > temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES, but fail to compile > > I met a strange issue that: compile fail if I try to include GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp. > "MediaPlayerPrivateGStreamer.cpp:321:35: error: expected unqualified-id before numeric constant" > it seems to be namespace issue, but i don't know where breaks. > > you can reproduce the issue by simple include GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp
Preprocess the file with g++ -E ....cpp -o foo.cpp and check the resulting file.
Zhao, Halley
Comment 61
2012-11-08 23:49:02 PST
(In reply to
comment #60
)
> (In reply to
comment #59
) > Preprocess the file with g++ -E ....cpp -o foo.cpp and check the resulting file.
Thanks Philippe. The reason is that MediaPlayer::None becomes MediaPlayer::0L after the C preprocessor runs: the X11 headers define None as 0L. I changed the definition of MediaPlayer::None to MediaPlayer::PreloadNone, which fixed the issue.
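For context, X11's X.h contains `#define None 0L`, so any later use of an enumerator spelled None gets rewritten by the preprocessor into a numeric constant, producing exactly the "expected unqualified-id before numeric constant" error above. A minimal reproduction (the MediaPlayer scope here is purely illustrative, and the define stands in for including the X11 header):

```cpp
#include <string>

// Stand-in for including <X11/X.h>, which defines None this way.
#define None 0L

// Classic two-level stringification: the argument is macro-expanded
// before being stringified, so we can observe what the compiler sees.
#define STRINGIFY2(x) #x
#define STRINGIFY(x) STRINGIFY2(x)

// "MediaPlayer::None" in the source text is rewritten to "MediaPlayer::0L",
// which is not a valid identifier, hence the compile error in the bug.
std::string expansionOfMediaPlayerNone()
{
    return STRINGIFY(MediaPlayer::None);
}
```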
Philippe Normand
Comment 62
2012-11-12 03:30:44 PST
CCing GStreamer hacker extraordinaire, Sebastian Dröge.
Sebastian Dröge (slomo)
Comment 63
2012-11-12 06:24:00 PST
Note that the removal/replacement of the surfaceconverter/surface(buffer|meta) API will happen really soon, and it will be replaced with something more generic. I wouldn't worry about doing anything for 0.10 here; just concentrate on 1.0/1.2.
Zhao, Halley
Comment 64
2012-11-12 17:12:36 PST
(In reply to
comment #63
)
> Note that the removal/replacement of the surfaceconverter/surface(buffer|meta) API will be really soon and it will be replaced with something more generic. > I wouldn't worry about doing anything for 0.10 here, just concentrate on 1.0/1.2.
Thanks for the reminder. I'm working on gst 0.10 first, since it's my platform's priority and gst-vaapi doesn't work with gst 1.0 yet. After enabling it with gst 0.10, I will work on gst 1.0 and try to push it upstream.
Simon Hausmann
Comment 65
2012-11-12 23:57:22 PST
(In reply to
comment #63
)
> Note that the removal/replacement of the surfaceconverter/surface(buffer|meta) API will be really soon and it will be replaced with something more generic.
Interesting, I was secretly hoping for that :). Any hints about the successor?
> I wouldn't worry about doing anything for 0.10 here, just concentrate on 1.0/1.2.
Agreed!
Zhao, Halley
Comment 66
2012-11-14 19:29:17 PST
Created
attachment 174322
[details]
add eglCreateImage/glEGLImageTargetTexture2DOES to webkit
After adding eglCreateImage/glEGLImageTargetTexture2DOES to WebKit, I can chain up each component and run it on my PC; however, the picture shows incorrectly (far from OK). 1. The EGLImage seems to be created correctly after debugging; I will continue debugging it in the GL lib. 2. On the gst side, I finally arrived at the same concept as the metadata in gst 1.0.
Zhao, Halley
Comment 67
2012-12-03 23:28:33 PST
Created
attachment 177424
[details]
Bind texture from hw video surface which exports EGLClientBuffer information
This is the patch that works for me on the Intel Sandy Bridge platform; it uses an extended GstSurfaceBuffer to get the EGL client buffer information. Extending GstSurfaceBuffer this way is similar in idea to Gst 1.0 metadata.
Philippe Normand
Comment 68
2012-12-03 23:34:50 PST
We're not likely to upstream a patch based on an unstable, deprecated API, I'm afraid. The focus should be on GStreamer 1.0, in my opinion.
Zhao, Halley
Comment 69
2012-12-04 01:27:13 PST
I can port it to gst 1.0 (the metadata interface) when the hw codec is ready for gst 1.0. gst-vaapi is being ported to gst 1.0; in the best case the porting will be finished by the end of this year.
Philippe Normand
Comment 70
2013-01-08 07:35:16 PST
(In reply to
comment #69
)
> I can port it to gst 1.0(meta data interface) when hw codec is ready for gst 1.0. > > gst-vaapi is porting to gst 1.0, the best case is by end of this year to finish the porting.
https://gitorious.org/~sreerenj/vaapi/sree-gstreamer-vaapi/commits/1.0
I got this working here with gst 1.0, but it still uses the -bad unstable API. :(
Víctor M. Jáquez L.
Comment 71
2013-02-07 03:39:01 PST
Created
attachment 187046
[details]
Rebase of the proposed patches (without gst surfaces) I've updated the patch to use the current TextureMapperShaderProgram API. Please note that it is still a work in progress.
Philippe Normand
Comment 72
2013-02-07 03:48:02 PST
(In reply to
comment #71
)
> Created an attachment (id=187046) [details] > Rebase of Phil's patch >
FTR this is not my patch.
Víctor M. Jáquez L.
Comment 73
2013-02-22 03:50:17 PST
Created
attachment 189741
[details]
simple patch for video composition
Since the GStreamer backend has been refactored into two classes, I applied the previous patches manually, and this is the result. I tried to keep the simplest approach, taking many of the elements proposed by arno., such as updating the player in the renderer when the source changes and setting needs-display on the layer in RenderLayerBacking, but removing the shader. What I want is to keep most of the video processing in the GStreamer pipeline and to avoid the context change of the buffer (GPU<->CPU); applying the proposed shader would imply moving the buffer back and forth, impacting the performance. Still, the performance is bad. I will try to remove the color space conversion (YUV->RGB), and also try to avoid bringing the buffer into the CPU by pushing WebKit's texture into the GStreamer context instead. Does this make sense?
Víctor M. Jáquez L.
Comment 74
2013-02-22 06:11:18 PST
(In reply to
comment #73
)
> Created an attachment (id=189741) [details] > simple patch for video composition >
About all these patches, I've observed two bugs:
* The video doesn't play unless the mouse is above the video widget
* When two or more videos are rendered in the same page, I get this error:
[xcb] Unknown request in queue while dequeuing
[xcb] Most likely this is a multi-threaded client and XInitThreads has not been called
[xcb] Aborting, sorry about that.
GtkLauncher: xcb_io.c:178: dequeue_pending_request: Assertion `!xcb_xlib_unknown_req_in_deq' failed.
and GtkLauncher crashes.
Víctor M. Jáquez L.
Comment 75
2013-03-04 07:56:23 PST
Created
attachment 191247
[details]
simple patch for video composition
This new patch fixes the X problem, because the ::paintToTextureMapper() method is now as fast as possible: the buffer handling is moved out to the gstreamer video sink callback. Still, ::paintToTextureMapper() doesn't happen if the mouse is not above the video area. Furthermore, this method is not called as frequently as the video buffers arrive, so the video looks choppy.
Víctor M. Jáquez L.
Comment 76
2013-03-20 10:45:42 PDT
Created
attachment 194084
[details]
New version of the composited video This new version of the patch fixes the problem of the repainting when the pointer is out of the video area.
Víctor M. Jáquez L.
Comment 77
2013-03-25 10:05:54 PDT
Created
attachment 194877
[details]
New version of the composited video This new version of the patch honors the --enable-accelerated-compositing flag
Víctor M. Jáquez L.
Comment 78
2013-04-17 03:27:57 PDT
Created
attachment 198493
[details]
composited video with GstSurface{Meta/Convert} support This is an alternative version of my previous patch (
attachment 194877
[details]
) This version of the patch depends on: 1. The GstSurfaceMeta/GstSurfaceConvert interface, available in gst-plugins-bad v1.0 (though it has already been removed for v1.2). 2. The current HEAD of the official gstreamer-vaapi (git://gitorious.org/vaapi/gstreamer-vaapi.git - a0e5876). With this patch I'm able to render 1080p videos in GtkLauncher using an NVIDIA GPU (GT218 ION).
Víctor M. Jáquez L.
Comment 79
2013-04-17 03:34:22 PDT
My plan for this bug is: 1. Split this patch, taking out the render-related bits, and file them in a different bug. 2. The other part of the patch will remain as a work in progress within this bug. 3. Work on the gstreamer-vaapi and vdpau decoders to support the interfaces in GStreamer. See
http://blogs.igalia.com/vjaquez/2013/04/08/gstreamer-hackfest-2013-milan/
4. Update the patch to use those new interfaces. 5. Add full-screen support. Does this plan make sense?
Philippe Normand
Comment 80
2013-04-17 04:20:56 PDT
Comment on
attachment 198493
[details]
compsited video with GstSurface{Meta/Convert} support View in context:
https://bugs.webkit.org/attachment.cgi?id=198493&action=review
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:311 > + const void *srcData;
* misplaced
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:323 > + // @TODO: support cropping
What's the plan for this? Get the GstCropMeta (or whatever the name is :)) and apply in ::updateContents below?
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:381 > + m_textureMapper = textureMapper; // @TODO Is this a sane enough?
Seems sane to me!
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:385 > + if (m_texture) { > + textureMapper->drawTexture(*m_texture.get(), targetRect, matrix, opacity); > + }
No need for curly braces.
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:107 > + virtual void paintToTextureMapper(TextureMapper*, const FloatRect& targetRect, const TransformationMatrix&, float opacity);
targetRect doesn't need to be named.
> Source/WebCore/rendering/RenderLayerBacking.cpp:1785 > + if (m_graphicsLayer->hasContentsLayer()) { > + m_graphicsLayer->setContentsNeedsDisplay(); > + } else if (m_graphicsLayer->drawsContent()) { > + m_graphicsLayer->setNeedsDisplay(); > + }
Curly braces can be avoided here too, I suppose
Víctor M. Jáquez L.
Comment 81
2013-05-03 09:40:36 PDT
Created
attachment 200421
[details]
Composited video support This patch enables the video rendering using the TextureMapper. Depends on the fix at
bug #114742
Víctor M. Jáquez L.
Comment 82
2013-05-03 09:46:36 PDT
Created
attachment 200423
[details]
GstSurface{Meta/Convert} support for gstreamer-vaapi elements (Gst 1.0) This patch applies on top of
attachment #200421
[details]
. It enables FullHD rendering using GstSurfaceMeta and GstSurfaceConvert from GStreamer 1.0 (note: those APIs are already deprecated and will be replaced by new ones in GStreamer 1.2) with the gstreamer-vaapi decoding element. This patch was split out because it is not mandatory for composited video and it uses deprecated GStreamer API.
Víctor M. Jáquez L.
Comment 83
2013-05-13 06:19:49 PDT
Created
attachment 201553
[details]
Patch
Gustavo Noronha (kov)
Comment 84
2013-05-13 06:34:32 PDT
Comment on
attachment 201553
[details]
Patch View in context:
https://bugs.webkit.org/attachment.cgi?id=201553&action=review
> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:394 > + return;
This got me curious: you get the texture here, so why do you return instead of drawing?
Philippe Normand
Comment 85
2013-05-13 06:43:19 PDT
(In reply to
comment #84
)
> (From update of
attachment 201553
[details]
) > View in context:
https://bugs.webkit.org/attachment.cgi?id=201553&action=review
> > > Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:394 > > + return; > > This got me curious, you get the texture here, why then you return instead of drawing?
Because it's empty when created I suppose :)
Gustavo Noronha (kov)
Comment 86
2013-05-13 07:12:39 PDT
Ah! I thought it was drawing to, not from, the texture, thanks =)
Víctor M. Jáquez L.
Comment 87
2013-05-13 07:23:11 PDT
Created
attachment 201564
[details]
Patch
Noam Rosenthal
Comment 88
2013-05-13 07:57:06 PDT
The texmap changes seem OK; I'll leave kov or Philippe the honors of r+ing/r-ing.
Philippe Normand
Comment 89
2013-05-13 08:01:09 PDT
Comment on
attachment 201564
[details]
Patch Looks good, thanks!
WebKit Commit Bot
Comment 90
2013-05-13 08:14:11 PDT
Comment on
attachment 201564
[details]
Patch Clearing flags on attachment: 201564 Committed
r150014
: <
http://trac.webkit.org/changeset/150014
>
WebKit Commit Bot
Comment 91
2013-05-13 08:14:22 PDT
All reviewed patches have been landed. Closing bug.
Chris Dumez
Comment 92
2013-05-13 12:49:55 PDT
This appears to have caused several regressions on EFL:
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgb565-pretty-diff.html
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgba4444-pretty-diff.html
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgba5551-pretty-diff.html
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-pretty-diff.html
Philippe Normand
Comment 93
2013-05-13 12:58:42 PDT
(In reply to
comment #92
)
> This appears to have caused several regressions on EFL: >
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgb565-pretty-diff.html
>
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgba4444-pretty-diff.html
>
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgba5551-pretty-diff.html
>
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-pretty-diff.html
GTK doesn't run these tests yet so I suppose that's why this regression wasn't spotted by Victor.
Chris Dumez
Comment 94
2013-05-13 13:14:25 PDT
(In reply to
comment #93
)
> (In reply to
comment #92
) > > This appears to have caused several regressions on EFL: > >
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgb565-pretty-diff.html
> >
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgba4444-pretty-diff.html
> >
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-rgba5551-pretty-diff.html
> >
http://build.webkit.org/results/EFL%20Linux%2064-bit%20Debug%20WK2/r150014%20(12195)/fast/canvas/webgl/tex-image-and-sub-image-2d-with-video-pretty-diff.html
> > GTK doesn't run these tests yet so I suppose that's why this regression wasn't spotted by Victor.
It is actually worse: HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/
Philippe Normand
Comment 95
2013-05-13 13:29:28 PDT
Reverted
r150014
for reason: Broke video playback on WK2 and some webgl tests Committed
r150035
: <
http://trac.webkit.org/changeset/150035
>
Víctor M. Jáquez L.
Comment 96
2013-05-16 08:16:34 PDT
(In reply to comment #94)
> It is actually worse. HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/
That's odd. I have tested with NVIDIA and Intel GPUs, and the HTML5 videos work OK. And it is a plain OpenGL texture; no other library is involved.
Kalyan
Comment 97
2013-05-16 12:35:58 PDT
(In reply to comment #96)
> (In reply to comment #94)
> > It is actually worse. HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/
> That's odd. I have tested with NVIDIA and Intel GPUs, and the HTML5 videos work OK.
> And it is a plain OpenGL texture; no other library is involved.
The WebGL test failures seem a bit odd here. The current solution would break HTML5 video for ports like EFL and Qt(??), which handle the final composition on the UI-process side. We would need a solution similar to WebGL's (using GraphicsSurface).
Kalyan
Comment 98
2013-05-16 12:39:39 PDT
(In reply to comment #97)
> (In reply to comment #96)
> > (In reply to comment #94)
> > > It is actually worse. HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/
> > That's odd. I have tested with NVIDIA and Intel GPUs, and the HTML5 videos work OK.
> > And it is a plain OpenGL texture; no other library is involved.
> The WebGL test failures seem a bit odd here.
The current solution would break HTML5 video for ports like EFL and Qt(??), which handle the final composition on the UI-process side. We would need a solution similar to WebGL's (using GraphicsSurface), at least for the EFL port. Perhaps you could enable this only for the GTK port for now (at least with WebKit2) while we add the needed support on our side?
Víctor M. Jáquez L.
Comment 99
2013-05-29 02:54:19 PDT
Created
attachment 203145
[details]
Patch
Noam Rosenthal
Comment 100
2013-05-29 03:02:45 PDT
Comment on
attachment 203145
[details]
Patch
How about !USE(COORDINATED_GRAPHICS) instead of PLATFORM(GTK)?
Kalyan
Comment 101
2013-05-29 03:16:04 PDT
(In reply to comment #100)
> (From update of attachment 203145 [details])
> How about !USE(COORDINATED_GRAPHICS) instead of PLATFORM(GTK)?
!USE(COORDINATED_GRAPHICS) seems a better choice to me too.
Víctor M. Jáquez L.
Comment 102
2013-05-29 04:16:57 PDT
Created
attachment 203151
[details]
Patch
WebKit Commit Bot
Comment 103
2013-05-29 06:24:38 PDT
Comment on
attachment 203151
[details]
Patch
Clearing flags on attachment: 203151
Committed r150890: <http://trac.webkit.org/changeset/150890>
WebKit Commit Bot
Comment 104
2013-05-29 06:24:46 PDT
All reviewed patches have been landed. Closing bug.