Bug 86410 - [texmap][GStreamer][GTK] Composited Video support
Summary: [texmap][GStreamer][GTK] Composited Video support
Status: RESOLVED FIXED
Alias: None
Product: WebKit
Classification: Unclassified
Component: Media
Version: 528+ (Nightly build)
Hardware: Unspecified
OS: Unspecified
Importance: P2 Normal
Assignee: Nobody
Keywords: Qt
Depends on: 116056 114742
Blocks: 116040 116042
Reported: 2012-05-14 16:30 PDT by Noam Rosenthal
Modified: 2019-05-02 16:20 PDT
CC: 32 users

Attachments
wip patch (9.03 KB, patch)
2012-05-22 21:27 PDT, Philippe Normand
no flags Details | Formatted Diff | Diff
Patch (10.17 KB, patch)
2012-06-26 05:33 PDT, Simon Hausmann
no flags Details | Formatted Diff | Diff
Patch (9.92 KB, patch)
2012-06-26 06:31 PDT, Simon Hausmann
no flags Details | Formatted Diff | Diff
patch based on simon's one (11.60 KB, patch)
2012-10-11 10:46 PDT, arno.
no flags Details | Formatted Diff | Diff
fixes a memory leak (11.54 KB, patch)
2012-10-12 13:24 PDT, arno.
no flags Details | Formatted Diff | Diff
support gst-vaapi to upload vaSurface to webkit texture (8.24 KB, patch)
2012-11-02 02:59 PDT, Zhao, Halley
no flags Details | Formatted Diff | Diff
temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES, fail to compile (10.34 KB, application/octet-stream)
2012-11-07 02:35 PST, Zhao, Halley
no flags Details
add eglCreateImage/glEGLImageTargetTexture2DOES to webkit (12.45 KB, patch)
2012-11-14 19:29 PST, Zhao, Halley
no flags Details | Formatted Diff | Diff
Bind texture from hw video surface which exports EGLClientBuffer information (18.98 KB, patch)
2012-12-03 23:28 PST, Zhao, Halley
no flags Details | Formatted Diff | Diff
Rebase of the proposed patches (without gst surfaces) (12.24 KB, patch)
2013-02-07 03:39 PST, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
simple patch for video composition (7.39 KB, patch)
2013-02-22 03:50 PST, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
simple patch for video composition (8.25 KB, patch)
2013-03-04 07:56 PST, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
New version of the composited video (8.88 KB, patch)
2013-03-20 10:45 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
New version of the composited video (19.93 KB, patch)
2013-03-25 10:05 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
composited video with GstSurface{Meta/Convert} support (8.88 KB, patch)
2013-04-17 03:27 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
Composited video support (5.80 KB, patch)
2013-05-03 09:40 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
GstSurface{Meta/Convert} support for gstreamer-vaapi elements (Gst 1.0) (6.30 KB, patch)
2013-05-03 09:46 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
Patch (7.23 KB, patch)
2013-05-13 06:19 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
Patch (7.36 KB, patch)
2013-05-13 07:23 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
Patch (7.60 KB, patch)
2013-05-29 02:54 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff
Patch (7.76 KB, patch)
2013-05-29 04:16 PDT, Víctor M. Jáquez L.
no flags Details | Formatted Diff | Diff

Description Noam Rosenthal 2012-05-14 16:30:45 PDT
Support accelerated/composited video for Qt WebKit2, possibly the same way we're going to use for WebGL via GraphicsSurfaces.
Comment 1 Philippe Normand 2012-05-22 21:17:03 PDT
I have a working patch for the GStreamer player using the TextureMapper, it works but performance is not there yet. Any help would be much appreciated, I'm not very familiar with the TextureMapper yet. Would you like to see the patch Noam?
Comment 2 Philippe Normand 2012-05-22 21:27:28 PDT
Created attachment 143452 [details]
wip patch
Comment 3 Philippe Normand 2012-05-22 21:31:24 PDT
I tested this on WebKitGTK WK1 and WK2 (with Martin's patch), I can clearly see that playback is not smooth.
Comment 4 Noam Rosenthal 2012-05-22 21:45:58 PDT
(In reply to comment #2)
> Created an attachment (id=143452) [details]
> wip patch
Cool!

You might want to convert to TextureMapperGL, and draw the texture with its ID directly. See what we're doing in GraphicsContext3DQt.cpp.
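For illustration, the direct path could look roughly like this (a minimal sketch only; videoTextureId and frameSize stand for state the sink would keep, and the exact drawTexture overload varies between revisions):

    // Minimal sketch, assuming the TEXTURE_MAPPER_GL path and that the sink
    // already owns a GL texture holding the current frame.
    if (textureMapper->accelerationMode() == TextureMapper::OpenGLMode) {
        TextureMapperGL* textureMapperGL = static_cast<TextureMapperGL*>(textureMapper);
        // Draw the frame straight from its GL texture id; no BitmapTexture needed.
        textureMapperGL->drawTexture(videoTextureId, TextureMapperGL::ShouldFlipTexture,
            frameSize, targetRect, matrix, opacity);
    }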
Comment 5 Philippe Normand 2012-05-22 21:48:15 PDT
(In reply to comment #4)
> (In reply to comment #2)
> > Created an attachment (id=143452) [details] [details]
> > wip patch
> Cool!
> 
> You might want to convert to TextureMapperGL, and draw the texture with its ID directly. See what we're doing in GraphicsContext3DQt.cpp.

I thought I was already using the GL implementation. Hum I'll continue work on the patch, thanks for the help :)
Comment 6 Noam Rosenthal 2012-05-22 21:52:06 PDT
> I thought I was already using the GL implementation. Hum I'll continue work on the patch, thanks for the help :)

You might, but if you cast to it, you can call drawTexture with a texture id instead of keeping your own BitmapTexture around, which feels a bit awkward.
Comment 7 Noam Rosenthal 2012-05-22 21:56:30 PDT
Comment on attachment 143452 [details]
wip patch

View in context: https://bugs.webkit.org/attachment.cgi?id=143452&action=review

> Source/WebCore/platform/graphics/gstreamer/VideoSinkGStreamer.cpp:164
> +        priv->texture->updateContents(GST_BUFFER_DATA(buffer), WebCore::IntRect(WebCore::IntPoint(0, 0), size), WebCore::IntPoint(0,0), bytesPerLine);

I think the problem lies here... looks like there's going to be some expensive software copy here. I'm not familiar enough with gst texture sink to know how to go about it though.
Comment 8 Philippe Normand 2012-05-23 08:04:08 PDT
Retitling the bug, both Qt and GTK use the GStreamer player, so I think this new title makes more sense than the previous one.

Thanks Noam for your insights on the wip patch, I'll try to send a new version soon.
Comment 9 Philippe Normand 2012-05-23 17:56:56 PDT
(In reply to comment #7)
> (From update of attachment 143452 [details])
> View in context: https://bugs.webkit.org/attachment.cgi?id=143452&action=review
> 
> > Source/WebCore/platform/graphics/gstreamer/VideoSinkGStreamer.cpp:164
> > +        priv->texture->updateContents(GST_BUFFER_DATA(buffer), WebCore::IntRect(WebCore::IntPoint(0, 0), size), WebCore::IntPoint(0,0), bytesPerLine);
> 
> I think the problem lies here... looks like there's going to be some expensive software copy here. I'm not familiar enough with gst texture sink to know how to go about it though.

Right, also we can't use this texture sink, I think.
Comment 10 Philippe Normand 2012-05-30 10:52:18 PDT
Maybe someone clicked on the fancy "Submit for EWS analysis" button then :) Anyway, no big deal...
Comment 11 Philippe Normand 2012-05-30 10:52:43 PDT
(In reply to comment #10)
> Maybe someone clicked on the fancy "Submit for EWS analysis" button then :) Anyway, no big deal...

wrong bug, sorry
Comment 12 Philippe Normand 2012-06-07 13:31:52 PDT
Maybe one way to proceed would be to use the GstSurfaceBuffer, like the Clutter video-sink. This API is in -bad though and its future seems uncertain, I wonder if we should target only gst 0.11/1.0 support and forget about 0.10 :)


http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-libs/html/gst-plugins-bad-libs-GstSurfaceBuffer.html
Comment 13 Simon Hausmann 2012-06-11 02:03:07 PDT
(In reply to comment #12)
> Maybe one way to proceed would be to use the GstSurfaceBuffer, like the Clutter video-sink. This API is in -bad though and its future seems uncertain, I wonder if we should target only gst 0.11/1.0 support and forget about 0.10 :)
> 
> 
> http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-bad-libs/html/gst-plugins-bad-libs-GstSurfaceBuffer.html

That looks great. Screw old versions :)
Comment 14 Simon Hausmann 2012-06-26 05:33:15 PDT
Created attachment 149515 [details]
Patch
Comment 15 Simon Hausmann 2012-06-26 05:34:42 PDT
(In reply to comment #14)
> Created an attachment (id=149515) [details]
> Patch

This is a first shot at an implementation that uses GstSurfaceBuffer and SurfaceBufferConverter. I didn't mark it for review yet because there are still bugs to iron out:

  * Sometimes the decoder gets stuck and stops delivering new frames.
  * Shutdown is "crashy" :)

Nevertheless any feedback is welcome!
Comment 16 Noam Rosenthal 2012-06-26 06:17:36 PDT
Comment on attachment 149515 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149515&action=review

Awesome!

> Source/WebCore/WebCore.pri:203
> +        contains(DEFINES, WTF_USE_GSTREAMER_SURFACEBUFFER=1): PKGCONFIG += gstreamer-basevideo-0.10

I'd probably use HAVE(GSTREAMER_SURFACEBUFFER) instead of USE;

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1642
> +#if USE(ACCELERATED_COMPOSITING) && USE(GSTREAMER_SURFACEBUFFER) && USE(TEXTURE_MAPPER_GL)

Let's just go with USE(TEXTURE_MAPPER_GL)
Comment 17 Simon Hausmann 2012-06-26 06:28:46 PDT
Comment on attachment 149515 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149515&action=review

>> Source/WebCore/WebCore.pri:203
>> +        contains(DEFINES, WTF_USE_GSTREAMER_SURFACEBUFFER=1): PKGCONFIG += gstreamer-basevideo-0.10
> 
> I'd probably use HAVE(GSTREAMER_SURFACEBUFFER) instead of USE;

Ahh right. I always confuse ENABLE, USE and HAVE.

>> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1642
>> +#if USE(ACCELERATED_COMPOSITING) && USE(GSTREAMER_SURFACEBUFFER) && USE(TEXTURE_MAPPER_GL)
> 
> Let's just go with USE(TEXTURE_MAPPER_GL)

Alright.
Comment 18 Simon Hausmann 2012-06-26 06:31:13 PDT
Created attachment 149522 [details]
Patch
Comment 19 Noam Rosenthal 2012-06-26 06:34:34 PDT
Comment on attachment 149522 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149522&action=review

I only have nitpicks, maybe someone with more GStreamer experience has comments of value :)
The TextureMapper stuff looks good.

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1688
> +    if (m_buffer && GST_IS_SURFACE_BUFFER(m_buffer)) {
> +        GstSurfaceBuffer* surface = GST_SURFACE_BUFFER(m_buffer);
> +
> +         GstCaps* caps = GST_BUFFER_CAPS(m_buffer);
> +         GstVideoFormat format;
> +         int width, height;
> +         gst_video_format_parse_caps(caps, &format, &width, &height);
> +         IntSize size(width, height);
> +
> +        if (!m_texture)
> +            m_texture = textureMapper->createTexture();
> +
> +        if (m_texture->size() != size) {
> +            m_texture->reset(size);
> +            m_surfaceConverter.clear();
> +        }
> +
> +        if (!m_surfaceConverter) {
> +            GValue textureValue = G_VALUE_INIT;
> +            g_value_init(&textureValue, G_TYPE_UINT);
> +            g_value_set_uint(&textureValue, static_cast<BitmapTextureGL*>(m_texture.get())->id());
> +
> +            m_surfaceConverter.set(gst_surface_buffer_create_converter(surface, "opengl", &textureValue));
> +            g_return_if_fail(m_surfaceConverter);
> +        }
> +
> +        gst_surface_converter_upload(m_surfaceConverter.get(), surface);
> +
> +        gst_buffer_unref(m_buffer);
> +        m_buffer = 0;
> +    }

Maybe this can live in its own function, e.g. convertBufferToTextureIfNeeded
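i.e., roughly this (a sketch only; the body is just the hunk above moved behind the suggested helper, with an early return added):

    void MediaPlayerPrivateGStreamer::convertBufferToTextureIfNeeded(TextureMapper* textureMapper)
    {
        if (!m_buffer || !GST_IS_SURFACE_BUFFER(m_buffer))
            return;

        GstSurfaceBuffer* surface = GST_SURFACE_BUFFER(m_buffer);

        GstCaps* caps = GST_BUFFER_CAPS(m_buffer);
        GstVideoFormat format;
        int width, height;
        gst_video_format_parse_caps(caps, &format, &width, &height);
        IntSize size(width, height);

        // (Re)allocate the target texture when the frame size changes.
        if (!m_texture)
            m_texture = textureMapper->createTexture();
        if (m_texture->size() != size) {
            m_texture->reset(size);
            m_surfaceConverter.clear();
        }

        // Lazily create the "opengl" converter bound to our texture id.
        if (!m_surfaceConverter) {
            GValue textureValue = G_VALUE_INIT;
            g_value_init(&textureValue, G_TYPE_UINT);
            g_value_set_uint(&textureValue, static_cast<BitmapTextureGL*>(m_texture.get())->id());
            m_surfaceConverter.set(gst_surface_buffer_create_converter(surface, "opengl", &textureValue));
            g_return_if_fail(m_surfaceConverter);
        }

        gst_surface_converter_upload(m_surfaceConverter.get(), surface);

        gst_buffer_unref(m_buffer);
        m_buffer = 0;
    }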
Comment 20 Philippe Normand 2012-06-26 07:26:05 PDT
FYI I recently started work on this again, as well. Not using SurfaceBuffer  yet though as I'm uncertain about its future in GStreamer 1.0
Comment 21 Simon Hausmann 2012-06-26 07:30:25 PDT
(In reply to comment #20)
> FYI I recently started work on this again, as well. Not using SurfaceBuffer  yet though as I'm uncertain about its future in GStreamer 1.0

What's the new interface called?
Comment 22 Philippe Normand 2012-06-26 08:19:17 PDT
(In reply to comment #21)
> (In reply to comment #20)
> > FYI I recently started work on this again, as well. Not using SurfaceBuffer  yet though as I'm uncertain about its future in GStreamer 1.0
> 
> What's the new interface called?

There's no exact plan about keeping GstSurfaceBuffer in 1.0 or not.
Comment 23 Simon Hausmann 2012-06-26 08:38:20 PDT
(In reply to comment #22)
> (In reply to comment #21)
> > (In reply to comment #20)
> > > FYI I recently started work on this again, as well. Not using SurfaceBuffer  yet though as I'm uncertain about its future in GStreamer 1.0
> > 
> > What's the new interface called?
> 
> There's no exact plan about keeping GstSurfaceBuffer in 1.0 or not.

But there has to be an alternative in the works, right? Otherwise the clutter-gst video sink will also regress.
Comment 24 Philippe Normand 2012-06-26 09:12:34 PDT
(In reply to comment #23)
> (In reply to comment #22)
> > (In reply to comment #21)
> > > (In reply to comment #20)
> > > > FYI I recently started work on this again, as well. Not using SurfaceBuffer  yet though as I'm uncertain about its future in GStreamer 1.0
> > > 
> > > What's the new interface called?
> > 
> > There's no exact plan about keeping GstSurfaceBuffer in 1.0 or not.
> 
> But there has to be an alternative in the works, right? Otherwise the clutter-gst video sink will also regress.

The clutter-gst video-sink might need an update, if GstVideoSurfaceBuffer changes, I guess :)
There's a very interesting draft about the hw-accel plans in 1.0 there: http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-hw-acceleration.txt
Comment 25 Simon Hausmann 2012-06-26 09:59:05 PDT
(In reply to comment #24)
> (In reply to comment #23)
> > (In reply to comment #22)
> > > (In reply to comment #21)
> > > > (In reply to comment #20)
> > > > > FYI I recently started work on this again, as well. Not using SurfaceBuffer  yet though as I'm uncertain about its future in GStreamer 1.0
> > > > 
> > > > What's the new interface called?
> > > 
> > > There's no exact plan about keeping GstSurfaceBuffer in 1.0 or not.
> > 
> > But there has to be an alternative in the works, right? Otherwise the clutter-gst video sink will also regress.
> 
> The clutter-gst video-sink might need an update, if GstVideoSurfaceBuffer changes, I guess :)

Right.

> There's a very interesting draft about the hw-accel plans in 1.0 there: http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-hw-acceleration.txt

Ohh, that is interesting indeed. I'll try to follow what's going on there and continue to polish this patch. I have an idea what's causing the issue I'm currently seeing, so I can't give up ;)
Comment 26 Philippe Normand 2012-06-26 10:24:53 PDT
Comment on attachment 149522 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=149522&action=review

If that patch lands it'd be great if it doesn't break the GStreamer 0.11 WebKit build :) There's no EWS for this setup yet though.

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:1665
> +         GstCaps* caps = GST_BUFFER_CAPS(m_buffer);
> +         GstVideoFormat format;
> +         int width, height;
> +         gst_video_format_parse_caps(caps, &format, &width, &height);
> +         IntSize size(width, height);

Please use getVideoSizeAndFormatFromCaps()
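i.e. something like this (a sketch; the helper's exact out-parameters vary by revision, so treat this as approximate):

    // Use the WebKit helper instead of parsing the caps by hand.
    IntSize size;
    GstVideoFormat format;
    int pixelAspectRatioNumerator, pixelAspectRatioDenominator, stride;
    if (!getVideoSizeAndFormatFromCaps(GST_BUFFER_CAPS(m_buffer), size, format,
            pixelAspectRatioNumerator, pixelAspectRatioDenominator, stride))
        return;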
Comment 27 Philippe Normand 2012-06-26 20:47:06 PDT
Simon, which distro do you use? There's no gst-plugins-bad pkg shipping gstreamer-basevideo-0.10 in Debian unstable, I can't test your patch :)

For 0.11, the GstSurfaceBuffer/Converter is currently implemented as buffer metadata btw.
Comment 28 Simon Hausmann 2012-06-26 23:09:22 PDT
(In reply to comment #27)
> Simon, which distro do you use? There's no gst-plugins-bad pkg shipping gstreamer-basevideo-0.10 in Debian unstable, I can't test your patch :)

I'm using (K)Ubuntu 12.04, which is shipping libgstreamer-plugins-bad0.10-dev 0.10.22.3-2ubuntu2

> For 0.11, the GstSurfaceBuffer/Converter is currently implemented as buffer metadata btw.

Ah yeah, I noticed. I think if we end up landing this patch in trunk, we should use either one API or the other. I'd be happy to use 0.11 myself, if I can find convenient packages somewhere.
Comment 29 Philippe Normand 2012-06-27 19:30:42 PDT
(In reply to comment #28)
> (In reply to comment #27)
> > Simon, which distro do you use? There's no gst-plugins-bad pkg shipping gstreamer-basevideo-0.10 in Debian unstable, I can't test your patch :)
> 
> I'm using (K)Ubuntu 12.04, which is shipping libgstreamer-plugins-bad0.10-dev 0.10.22.3-2ubuntu2
> 

Oh right, I got it now. Managed to test your patch in WebKitGTK but I see no video at all, the video rectangle is correctly sized though and the controls are visible. Haven't debugged this yet.

Anyway, thanks a lot for the patch Simon! I'm not sure about using this interface unless its future gets clearer :)
Comment 30 Simon Hausmann 2012-06-27 21:52:03 PDT
(In reply to comment #29)
> (In reply to comment #28)
> > (In reply to comment #27)
> > > Simon, which distro do you use? There's no gst-plugins-bad pkg shipping gstreamer-basevideo-0.10 in Debian unstable, I can't test your patch :)
> > 
> > I'm using (K)Ubuntu 12.04, which is shipping libgstreamer-plugins-bad0.10-dev 0.10.22.3-2ubuntu2
> > 
> 
> Oh right, I got it now. Managed to test your patch in WebKitGTK but I see no video at all, the video rectangle is correctly sized though and the controls are visible. Haven't debugged this yet.

Hm, that's odd. I'm using an nvidia card and the vdpau-va driver to be able to use gst-vaapi, which seems to be the only decoder that currently supports this interface.

The problem I'm seeing is that at some point the video sink in WebKit doesn't receive any frame anymore, until I seek. It feels like something is stuck. Oh the joy of multi-threading :)

> Anyway, thanks a lot for the patch Simon! I'm not sure about using this interface unless its future gets clearer :)

I totally understand.

I'll be OOO for a while, but I'd like to pick it up again when I'm back. I don't even mind if the patch continues to just live in Bugzilla in the end. But I want to get it working properly first.

I do see one problem with the interface, though: I feel it would make proxying tedious. Imagine an architecture like OMAP, where there's a good chance that you already have a gst decoder that can use the DSP to decode efficiently into memory. At the same time you have an existing kernel interface for allocating contiguous framebuffer memory the GPU can use (OMAPFB_SETUP_MEM), and your EGL implementation supports binding such a piece of memory to a texture somehow. In such an architecture it feels cumbersome to implement a proxying element that allocates the GstBuffers backed by FB memory, lets the DSP decoding codec element write into those, and then does an essentially free conversion to a texture. In order to implement the GstSurfaceBuffer interface completely transparently, it would _have_ to make an unnecessary copy of the buffer to hide the fact that the GPU can render the YUV surface natively but requires the drawing entity (the application) to use the GL_OES_EGL_image_external extension in the fragment shader.

I'm looking forward to what the gst folks will come up with :)
Comment 31 Simon Hausmann 2012-07-11 02:00:59 PDT
I've been looking a bit at vdpau, which mesa also might get support for. NVidia has some really nice GL extensions that allow conveniently binding a vdp video surface to a texture.

If the gstreamer vdpau sink had a way of exposing the vdpau output surface as a property and working without an x-window, then that would be a nice alternative. We could use that as video sink, retrieving the vdp output surface pointer, bind it to a texture with the GL extensions and be pretty much done.
Comment 32 Philippe Normand 2012-07-11 03:31:31 PDT
As a first step would it be possible to "simply" make our video sink upload to GL texture, without any specifics about HW-decoding and implication on specific GPU chipset/driver?
Comment 33 Simon Hausmann 2012-07-11 05:24:33 PDT
(In reply to comment #32)
> As a first step would it be possible to "simply" make our video sink upload to GL texture, without any specifics about HW-decoding and implication on specific GPU chipset/driver?

I don't think that would buy us anything, our pipeline would be as slow as it is today.

I'm interested in a solution that gives the best possible performance. If it's restricted to one type of video acceleration API initially, then that's fine.

Having one _properly_ accelerated code path is great for demonstrating the required interfaces and benchmarking new approaches. It also makes it easier to then iterate maybe with the gstreamer folks towards an interface that is more generic.
Comment 34 Martin Robinson 2012-07-11 11:05:54 PDT
(In reply to comment #33)
> (In reply to comment #32)
> > As a first step would it be possible to "simply" make our video sink upload to GL texture, without any specifics about HW-decoding and implication on specific GPU chipset/driver?
> 
> I don't think that would buy us anything, our pipeline would be as slow as it is today.

One benefit is that color space conversion could be moved to a shader.
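For the record, such a conversion shader is small. A minimal sketch of a BT.601 video-range I420-to-RGB fragment shader (names are placeholders; the three planes are assumed to be bound as separate single-channel textures):

    // Minimal sketch of a BT.601 video-range I420 -> RGB fragment shader.
    static const char* yuvToRgbFragmentShader =
        "precision mediump float;\n"
        "uniform sampler2D yPlane;\n"
        "uniform sampler2D uPlane;\n"
        "uniform sampler2D vPlane;\n"
        "varying vec2 v_texCoord;\n"
        "void main() {\n"
        "    float y = 1.164 * (texture2D(yPlane, v_texCoord).r - 0.0625);\n"
        "    float u = texture2D(uPlane, v_texCoord).r - 0.5;\n"
        "    float v = texture2D(vPlane, v_texCoord).r - 0.5;\n"
        "    gl_FragColor = vec4(y + 1.596 * v,\n"
        "                        y - 0.391 * u - 0.813 * v,\n"
        "                        y + 2.018 * u,\n"
        "                        1.0);\n"
        "}\n";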
Comment 35 Simon Hausmann 2012-10-08 01:46:02 PDT
Not actively working on this patch anymore. Would love to pick it up again, as it seems the latest gstreamer 1.0 sources have received fixes that may improve the reliability of this interface and maybe make it usable, but I prioritize other issues over this currently.
Comment 36 arno. 2012-10-11 10:40:13 PDT
(In reply to comment #35)
> Not actively working on this patch anymore. Would love to pick it up again, as it seems the latest gstreamer 1.0 sources have received fixes that may improve the reliability of this interface and maybe make it usable, but I prioritize other issues over this currently.

Hi,

From what I understand from
http://blogs.gnome.org/uraeus/2012/09/26/gstreamer-1-0-released/

> The changes will also make it a lot easier to write plugins and use plugins on PC platforms which use the GPU for decoding or encoding and that use OpenGL for rendering the playback

it is currently not possible to get an opengl texture from gstreamer-1.0 yet (although it could be possible in the future). Is there some way we can take advantage of gstreamer now to accelerate video?

(In reply to comment #34)
> (In reply to comment #33)
> > (In reply to comment #32)
> > > As a first step would it be possible to "simply" make our video sink upload to GL texture, without any specifics about HW-decoding and implication on specific GPU chipset/driver?
> > 
> > I don't think that would buy us anything, our pipeline would be as slow as it is today.
> 
> One benefit is that color space conversion could be moved to a shader.

Do you suggest we ask gstreamer to send the buffer data as YUV and perform the conversion ourselves? If so, how can we evaluate the performance win?
Comment 37 arno. 2012-10-11 10:46:49 PDT
Created attachment 168253 [details]
patch based on simon's one

Here is a wip patch I've been working on. It just uploads a texture to the textureMapper and applies a shader to it (which does nothing yet, but may be used if we decide to perform color space conversions).
It is mainly simon's patch with the following changes:
  * set needs-display on the layer in renderLayerBacking; otherwise the rendering was not updated when new frames were available.
  * update the player in the renderer when the source changes, to avoid a crash when changing video.src
Comment 38 Simon Hausmann 2012-10-12 01:47:16 PDT
(In reply to comment #36)
> (In reply to comment #35)
> > Not actively working on this patch anymore. Would love to pick it up again, as it seems the latest gstreamer 1.0 sources have received fixes that may improve the reliability of this interface and maybe make it usable, but I prioritize other issues over this currently.
> 
> Hi,
> 
> From what I understand from
> http://blogs.gnome.org/uraeus/2012/09/26/gstreamer-1-0-released/
> 
> > The changes will also make it a lot easier to write plugins and use plugins on PC platforms which use the GPU for decoding or encoding and that use OpenGL for rendering the playback
> 
> it is currently not possible to get an opengl texture from gstreamer-1.0 yet (although it could be possible in the future). Is there some way we can take advantage of gstreamer now to accelerate video?

Well, I think what the blog is referring to is the fact that there's no stable supported API yet. However
there exists an unstable API and I think the best way to make that unstable API stable is to use it, provide
feedback if it needs changes and otherwise report back to the gstreamer folks that it's working great (in case it is),
so that they can get confidence in it and declare it stable.

That API is now called GstSurfaceMeta (see gstreamer-1.0/gst/video/gstsurfacemeta.h).
Comment 39 Simon Hausmann 2012-10-12 01:51:08 PDT
(In reply to comment #34)
> (In reply to comment #33)
> > (In reply to comment #32)
> > > As a first step would it be possible to "simply" make our video sink upload to GL texture, without any specifics about HW-decoding and implication on specific GPU chipset/driver?
> > 
> > I don't think that would buy us anything, our pipeline would be as slow as it is today.
> 
> One benefit is that color space conversion could be moved to a shader.

Sure, but that's a halfway-there solution only. Other frameworks (like Stagefright on Android, AVFoundation on OS X and Media Foundation on Windows) demonstrate nicely that all this belongs behind a curtain that WebKit shouldn't have to be concerned about.
Comment 40 Martin Robinson 2012-10-12 08:23:37 PDT
(In reply to comment #39)
 
> > One benefit is that color space conversion could be moved to a shader.
> 
> Sure, but that's a halfway-there solution only. Other frameworks (like Stagefright on Android, AVFoundation on OS X and Media Foundation on Windows) demonstrate nicely that all this belongs behind a curtain that WebKit shouldn't have to be concerned about.

Sure, in an ideal world and for the future, but we can also make things faster for people not using the bleedingest edge versions of Gstreamer.
Comment 41 arno. 2012-10-12 11:04:56 PDT
(In reply to comment #38)

> That API is now called GstSurfaceMeta (see gstreamer-1.0/gst/video/gstsurfacemeta.h).

I'm not successful in using it.
- the documentation is obsolete. It references #GST_VIDEO_CAPS_SURFACE, which is no longer defined.
- I've tried to set the GST_STATIC_PADS to "video/x-surface, opengl=true" or "video/x-surface, type=vaapi", and the video does not render (the render sink callback is not called).
- in the gstreamer sources, x-surface is only checked for subtitle overlays.
Comment 42 arno. 2012-10-12 11:10:51 PDT
(In reply to comment #36)

> Do you suggest we ask gstreamer to send the buffer data as YUV and perform the conversion ourselves? If so, how can we evaluate the performance win?

I made some tests: I applied my patch. Then I asked gstreamer for some planar YUV format, and performed the conversion in the shader. With a few videos on a page, GtkLauncher seemed less responsive with accelerated compositing than without. So unless there is a better way to achieve that, it seems like it decreases performance. I do not have numbers to back that claim though.
Comment 43 Martin Robinson 2012-10-12 11:33:39 PDT
(In reply to comment #42)
> (In reply to comment #36)
> 
> > Do you suggest we ask gstreamer to send the buffer data as YUV and perform the conversion ourselves? If so, how can we evaluate the performance win?
> 
> I made some tests: I applied my patch. Then I asked gstreamer for some planar YUV format, and performed the conversion in the shader. With a few videos on a page, GtkLauncher seemed less responsive with accelerated compositing than without. So unless there is a better way to achieve that, it seems like it decreases performance. I do not have numbers to back that claim though.

Could it be that you are uploading and downloading to the GPU more than you think? What happens if you turn the shader off and just render the YUV data directly? Is accelerated compositing slow for you, in general?
Comment 44 arno. 2012-10-12 13:24:13 PDT
Created attachment 168470 [details]
fixes a memory leak

(In reply to comment #43)

> > I made some tests: I applied my patch. Then I asked gstreamer for some planar YUV format, and performed the conversion in the shader. With a few videos on a page, GtkLauncher seemed less responsive with accelerated compositing than without. So unless there is a better way to achieve that, it seems like it decreases performance. I do not have numbers to back that claim though.
> 
> Could it be that you are uploading and downloading to the GPU more than you think? What happens if you turn the shader off and just render the YUV data directly? Is accelerated compositing slow for you, in general?

There was indeed a problem with the patch. It did not unref the gstreamer buffer properly, and a lot of memory leaked. Here is an updated patch. With it, there is no noticeable difference between requesting xrgb or yuv format.
Comment 45 arno. 2012-10-24 11:53:37 PDT
(In reply to comment #40)
> (In reply to comment #39)
> 
> > > One benefit is that color space conversion could be moved to a shader.
> > 
> > Sure, but that's a halfway-there solution only. Other frameworks (like Stagefright on Android, AVFoundation on OS X and Media Foundation on Windows) demonstrate nicely that all this belongs behind a curtain that WebKit shouldn't have to be concerned about.
> 
> Sure, in an ideal world and for the future, but we can also make things faster for people not using the bleedingest edge versions of Gstreamer.

Hi,
I could indeed measure a performance win when performing the color conversion in a shader.

I measured gst_video_convert_transform_frame execution time in gstreamer-1.0 for a 480x360 video. Color conversion from I420 to BGRx takes between 2000µs and 6000µs (2 to 6ms); color conversion from I420 to AYUV takes between 300µs and 500µs.

The current paint method takes between 400µs and 2000µs. Without a shader, paintToTextureMapper takes between 400µs and 700µs. With an AYUV -> BGRx shader, paintToTextureMapper takes between 600µs and 800µs, except the first time, where it takes about 12000µs (12ms).

So it looks like we gain something with the current gstreamer implementation by performing the conversion ourselves. But I'm concerned about doing that: what will happen when gstreamer changes? For example, by implementing the plugins which will allow us to get the surface meta, or by optimizing the color conversion in some way. Does that mean we would have to perform a version check and maintain two different code paths?

Alternatively, it looks like we still have a (small) gain just by using the texture mapper without a shader.
Comment 46 Zhao, Halley 2012-10-26 02:28:13 PDT
I have just started looking at webkit, so please correct me if there is any misunderstanding.
Here are some concerns from me about the patch:

1. Painting video to a graphics layer via drawTexture() is heavy.
I expect (and think it is the case) that video is the only RenderObject in its graphics layer, and does NOT even share a graphics layer with the player controls; otherwise we composite each video frame twice: once into the backing surface of the graphics layer, and once more in the final compositing of each graphics layer.
Compositing includes glClear/drawTexture/swapBuffer, which is really heavy.
How about the following suggestion:
set the video texture on the graphics layer via bindSurface() and skip (or do nothing) in beginPainting/endPainting/drawTexture.
I had thought about using a cairo GL surface for video before, since the software path of gtk/efl uses cairo. Do you think it is possible to create a simple graphics layer with only one texture (without the complexity of TextureMapper)?

2. Having both ::paint and ::paintToTextureMapper in MediaPlayerPrivateGStreamer may conflict.
I don't know where ::paint is inherited from, but when TextureMapperPlatformLayer is used, should we avoid inheriting the interface that introduces the ::paint function?
Should we only use ::paint in the software path and ::paintToTextureMapper in the ACCELERATED_COMPOSITING path?

3. Creating an empty video texture (m_texture) and then uploading video data to it introduces a data copy.
I'd like to create the texture from an existing native image, e.g. with glEGLImageTargetTexture2DOES() from an existing EGLImage.
How can I configure webkit to use EGL/GLES instead of GLX/GL? For the Qt port it may be decided inside Qt; I care more about the gtk/efl ports.

thanks
Comment 47 Simon Hausmann 2012-10-26 02:58:22 PDT
(In reply to comment #46)
> I have just started looking at webkit, so please correct me if there is any misunderstanding.
> Here are some concerns from me about the patch:
> 
> 1. Painting video to a graphics layer via drawTexture() is heavy.
> I expect (and think it is the case) that video is the only RenderObject in its graphics layer, and does NOT even share a graphics layer with the player controls; otherwise we composite each video frame twice: once into the backing surface of the graphics layer, and once more in the final compositing of each graphics layer.
> Compositing includes glClear/drawTexture/swapBuffer, which is really heavy.
> How about the following suggestion:
> set the video texture on the graphics layer via bindSurface() and skip (or do nothing) in beginPainting/endPainting/drawTexture.

That is kind of what is happening. Each graphics layer has a corresponding TextureMapperPlatformLayer. When the texture mapper renders the scene, it asks each platform layer to paint itself (this is a somewhat simplified model). So if you have a html document with a video element and a bunch of controls on top, you should end up with three platform layers:

    (1) The background / regular content is cached in tile textures as part of a TextureMapperTiledBackingStore (which is a sub-class of TextureMapperPlatformLayer). Its paintToTextureMapper draws the tile textures.
    (2) The video gets its own TextureMapperPlatformLayer, that's what MediaPlayerPrivateGStreamer is and implements. So after painting the tiles of the static content, it's up to the MediaPlayerPrivateGStreamer to draw the video texture.
    (3) Lastly the controls on top are cached in another layer (I think usually also again a TextureMapperTiledBackingStore) and are drawn in one shot.

So for an entire scene there is no double-composition, the video frames are not rendered into an intermediate buffer.

(This is a WK1 TextureMapperGL model, with WK2 video is not implemented yet and might either get an intermediate GraphicsSurface or if the platform supports it perhaps we could also avoid the extra composition).

> I had thought to use cairo gl surface for video before, since sw path of gtk/efl uses cairo. do you think it is possible to create a simple graphics layer with one texture only(without the complexity of TextureMapper)?

I think that's what we do have today (well, not really, but could :)

> 2. Having both ::paint and ::paintToTextureMapper in MediaPlayerPrivateGStreamer may conflict.
> I don't know where ::paint is inherited from, but when TextureMapperPlatformLayer is used, should we avoid inheriting the interface that introduces the ::paint function?
> Should we only use ::paint in the software path and ::paintToTextureMapper in the ACCELERATED_COMPOSITING path?

::paint() will still be called when we're in the accelerated compositing code path, but we do an early return from it in that case, so no extra pixels are painted. I don't remember exactly where it is called from; it might be the population of the root layer? I've seen it get called at least, but the early return should catch it.
 
> 3. Creating an empty video texture (m_texture) and then uploading video data to it introduces a data copy.
> I'd like to create the texture from an existing native image, e.g. with glEGLImageTargetTexture2DOES() from an existing EGLImage.
> How can I configure webkit to use EGL/GLES instead of GLX/GL? For the Qt port it may be decided inside Qt; I care more about the gtk/efl ports.

This may be a missing feature in BitmapTextureGL in the texture mapper I would say. But conceptually it sounds like what we should indeed aim for.
Comment 48 Zhao, Halley 2012-10-26 03:11:59 PDT
I run into SoftwareMode in TextureMapper::create(), and nothing shows since the hardware video frame doesn't match the software path.
I use Ubuntu 12.04 with the KDE desktop and build webkit with "./build-webkit --qt"; my Qt version is 4.8, installed from the default repo.
What is wrong with my system or my build configuration?
thanks.
Comment 49 Zhao, Halley 2012-10-26 03:27:05 PDT
(In reply to comment #47)
>     (2) The video gets its own TextureMapperPlatformLayer, that's what MediaPlayerPrivateGStreamer is and implements. So after painting the tiles of the static content, it's up to the MediaPlayerPrivateGStreamer to draw the video texture.

Do you mean beginPainting and endPainting will NOT be called for each video frame?
If yes, that's great.

Since I run into SoftwareMode on my platform, I don't know exactly how the data flows.
I just read in the source code that "textureMapper->drawTexture" (in MediaPlayerPrivateGStreamer::paintToTextureMapper) calls into TextureMapperGL::drawTextureQuadWithProgram(); I think glClear() is required before the draw call in it.
Comment 50 Simon Hausmann 2012-10-26 03:30:19 PDT
(In reply to comment #49)
> (In reply to comment #47)
> >     (2) The video gets its own TextureMapperPlatformLayer, that's what MediaPlayerPrivateGStreamer is and implements. So after painting the tiles of the static content, it's up to the MediaPlayerPrivateGStreamer to draw the video texture.
> 
> Do you mean beginPainting and endPainting will NOT be called for each video frame?
> If yes, that's great.

I'm sure it's called for each _screen visible_ frame, i.e. every time you want to re-render the entire WebKit scene onto the screen.

I don't see a way around it though; it's not specific to video frames, is it? I mean, as the video plays, other content needs updating too (although maybe only every second).

Can you elaborate on how your ideal rendering pass would look?
Comment 51 Zhao, Halley 2012-10-26 05:15:43 PDT
> I'm sure it's called for each _screen visible_ frame, i.e. every time you want to re-render the entire WebKit scene onto the screen.
> 
> I don't see a way around it though; it's not specific to video frames, is it? I mean, as the video plays, other content needs updating too (although maybe only every second).
> 
> Can you elaborate on how your ideal rendering pass would look?

Eventually, we just need one texture for each graphics layer.
So the 3D pipeline (glClear/drawTexture/swapBuffer) isn't necessary when we already have such a texture; for example, we can use bindSurface alone with the video texture as input. Is there any gap in that?

If we reach the above solution, it means TextureMapper is too heavy for us, and we could create a simple class for it.

Compared to the controls, the video layer is larger and updates more frequently, so it is really expensive to do glClear and copy the texture.
Comment 52 Simon Hausmann 2012-10-26 05:25:47 PDT
(In reply to comment #51)
> > I'm sure it's called for each _screen visible_ frame, i.e. every time you want to re-render the entire WebKit scene onto the screen.
> > 
> > I don't see a way around it though; it's not specific to video frames, is it? I mean, as the video plays, other content needs updating too (although maybe only every second).
> > 
> > Can you elaborate on how your ideal rendering pass would look?
> 
> Eventually, we just need one texture for each graphics layer.

That is what we have. Actually, regular content layers usually have multiple textures because of tiling. But video for example has one texture only. That texture is an implementation detail of MediaPlayerPrivateGStreamer as implementation of TextureMapperPlatformLayer.

> So the 3D pipeline (glClear/drawTexture/swapBuffer) isn't necessary when we already have such a texture; for example, we can use bindSurface alone with the video texture as input. Is there any gap in that?
> 
> If we reach the above solution, it means TextureMapper is too heavy for us, and we could create a simple class for it.
> 
> Compared to the controls, the video layer is larger and updates more frequently, so it is really expensive to do glClear and copy the texture.

But we don't copy the texture, we only draw it once, onto the screen (into the back-buffer).

Can you elaborate on what you mean by bindSurface()? I think I must be misunderstanding something here.

Note that the video texture is _not_ copied into an intermediate surface, it is copied straight onto the screen.
Comment 53 Zhao, Halley 2012-10-26 06:46:08 PDT
Sorry, I misread the patch.

"textureMapperGL->bindSurface(m_texture.get());" 
it uses the video texture for the video's graphics layer.

"textureMapper->drawTexture(*m_texture.get(), targetRect, matrix, opacity, mask);" 
from your explanation, I understand that it happens in the 2nd compositing phase (I had thought it happened in the 1st phase before).

thanks for your detailed explanation.
Comment 54 Zhao, Halley 2012-10-31 00:08:12 PDT
Is it possible to use a YUV texture in the 2nd compositing phase?

The advantage is that we needn't convert YUV to RGB before the 2nd compositing phase (something like saving one memcpy).

The disadvantage is that we have to introduce some complexity (some details of YUV textures) into TextureMapperGL.
Comment 55 Simon Hausmann 2012-10-31 02:42:16 PDT
(In reply to comment #54)
> Is it possible to use a YUV texture in the 2nd compositing phase?
> 
> The advantage is that we needn't convert YUV to RGB before the 2nd compositing phase (something like saving one memcpy).
> 
> The disadvantage is that we have to introduce some complexity (some details of YUV textures) into TextureMapperGL.

I think that would be a reasonable extension to TextureMapperGL and I think we need it anyway (despite being SoC/GPU specific). Of course the GStreamer interface doesn't specify anything there right now, so I think it will require some extensions. Perhaps as simple as the GstSurfaceConverter implementation providing a function for compiling the fragment shader needed for rendering the texture.
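To illustrate the shape of such an extension (entirely made-up names, purely hypothetical; nothing like this exists in GStreamer today):

    /* Hypothetical API for illustration only. The converter knows its surface
     * format, so it would be the natural place to compile the fragment shader
     * needed to sample the texture it uploads to. */
    guint gst_surface_converter_create_fragment_shader(GstSurfaceConverter* converter);

    /* WebKit side (sketch): fetch the shader once per converter and link it
     * into the TextureMapperGL program used to draw m_texture. */
    guint fragmentShader = gst_surface_converter_create_fragment_shader(m_surfaceConverter.get());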
Comment 56 Zhao, Halley 2012-11-02 02:59:03 PDT
Created attachment 172019 [details]
support gst-vaapi to upload vaSurface to webkit texture

This revises Simon's patch with fixes (sharing the X Display between webkit and the gst pipeline);
it runs on my Sandy Bridge PC; however, playback stops after several seconds.

It is not my target, since gst-vaapi goes through TFP and an FBO to support gst_surface_converter_upload();
that is something like 3 memory copies.

Next, I will work on the following proposal (see the sketch at the end of this comment):
  1. gst-vaapi exports a 'handle' for the video surface
  2. an EGLImage is created from this handle by eglCreateImageKHR()
  3. a texture is created from the above EGLImage by glEGLImageTargetTexture2DOES()

Some things are not certain yet:
  a) should step 2 be done in gst-vaapi?
    pros: EGLImage is generic, webkit will be happy
    cons: the EGL context would have to be shared between webkit and gst-vaapi, which increases the complexity
  b) should we export YUV (indicated by the 'handle') from gst-vaapi?
    pros: it is better for performance, the driver needn't convert YUV to RGB before exporting the buffer.
    cons: complexity would be introduced on the webkit side: detailed info about the YUV buffer layout and a corresponding shader to render such a YUV texture.
    anyway, I will try RGB first. My SandyBridge can convert a 720p video frame to RGB at ~400fps.

I'm not familiar with webkit:
  could someone give clues on adding glEGLImageTargetTexture2DOES?
  and how can I make sure EGL/GLES is used instead of GLX/OpenGL?
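Here is a rough sketch of steps 2 and 3 (step 1, exporting the handle, is the hypothetical gst-vaapi part; both calls below are extensions and normally have to be resolved through eglGetProcAddress first, and EGL_NATIVE_PIXMAP_KHR is only a placeholder for whatever target the exported handle really needs):

    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    // Minimal sketch, assuming the decoder exported an EGLClientBuffer
    // for the current frame.
    static GLuint textureFromExportedHandle(EGLDisplay display, EGLClientBuffer handle)
    {
        static const EGLint attributes[] = { EGL_IMAGE_PRESERVED_KHR, EGL_TRUE, EGL_NONE };

        // Step 2: wrap the exported handle in an EGLImage.
        EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
            EGL_NATIVE_PIXMAP_KHR, handle, attributes);
        if (image == EGL_NO_IMAGE_KHR)
            return 0;

        // Step 3: bind the EGLImage to a GL texture; no pixel copy involved.
        GLuint texture = 0;
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);
        glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)image);
        return texture;
    }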
Comment 57 Simon Hausmann 2012-11-02 03:13:05 PDT
(In reply to comment #56)
> Created an attachment (id=172019) [details]
> support gst-vaapi to upload vaSurface to webkit texture
> 
> This revises Simon's patch with fixes (sharing the X Display between webkit and the gst pipeline);
> it runs on my Sandy Bridge PC; however, playback stops after several seconds.

I've observed the same :)
 
> It is not my target, since gst-vaapi goes through TFP and an FBO to support gst_surface_converter_upload();
> that is something like 3 memory copies.
> 
> Next, I will work on the following proposal:
>   1. gst-vaapi exports a 'handle' for the video surface
>   2. an EGLImage is created from this handle by eglCreateImageKHR()
>   3. a texture is created from the above EGLImage by glEGLImageTargetTexture2DOES()
> 
> Some things are not certain yet:
>   a) should step 2 be done in gst-vaapi?
>     pros: EGLImage is generic, webkit will be happy
>     cons: the EGL context would have to be shared between webkit and gst-vaapi, which increases the complexity

What if WebKit provides the EGL context but gst-vaapi creates its own context that is initialized to _share_
resources with the one provided by WebKit (or the app in general)?

>   b) should we export YUV (indicated by the 'handle') from gst-vaapi?
>     pros: it is better for performance, the driver needn't convert YUV to RGB before exporting the buffer.
>     cons: complexity would be introduced on the webkit side: detailed info about the YUV buffer layout and a corresponding shader to render such a YUV texture.
>     anyway, I will try RGB first. My SandyBridge can convert a 720p video frame to RGB at ~400fps.

I wonder if it would be possible to have API in the gst_surface_converter interfaces that would allow for
the app/webkit to call into the implementation (gst-vaapi) to create (compile) the fragment shader required for rendering
the texture? Then WebKit could call that function when it needs to.

Another approach would be for gst-vaapi to provide the shader as a string, but I don't think that's as clean.
 
> I'm not familiar with webkit:
>   could someone give clues on adding glEGLImageTargetTexture2DOES?

Would this actually be needed in WebKit if gst-vaapi continues to support a texture as the interface instead of an EGL image?

Then again, I suppose one argument in favour of using EGL images instead of texture is that OpenMax IL has IIRC an extension
that allows for binding a video surface to an EGL image. So maybe that would make the gstreamer interface more useable in the long run.

>   and how can I make sure EGL/GLES is used instead of GLX/OpenGL?

If you decide to use the Qt port for testing/developing it, just make sure Qt was configured with -opengl es2. You might also want to use wayland as windowing system, i.e. build the qtwayland module.
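The context sharing suggested above is small on the EGL side; a sketch (display, config and webKitContext stand for objects the embedder already owns):

    // Minimal sketch: gst-vaapi would create its own context and pass
    // WebKit's context as the share_context argument, so texture names
    // created on either side are valid in both.
    static const EGLint contextAttributes[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext gstVaapiContext = eglCreateContext(display, config, webKitContext, contextAttributes);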
Comment 58 Zhao, Halley 2012-11-07 02:23:49 PST
(In reply to comment #57)
> (In reply to comment #56)
> > Created an attachment (id=172019) [details] [details]
> 
> What if WebKit provides the EGL context but gst-vaapi creates its own context that is initialized to _share_
> resources with the one provided by WebKit (or the app in general)?

EGLClientBuffer is indeed a void*, so gst-vaapi can export it without linking to EGL.
I will try that first.

Your suggestion looks good; eglInitialize/eglTerminate/eglCreateContext are still required, right?
How about we just pass the EGLDisplay/EGLContext from webkit to gst-vaapi?

> 
> I wonder if it would be possible to have API in the gst_surface_converter interfaces that would allow for
> the app/webkit to call into the implementation (gst-vaapi) to create (compile) the fragment shader required for rendering
> the texture? Then WebKit could call that function when it needs to.
> 
> Another approach would be for gst-vaapi to provide the shader as a string, but I don't think that's as clean.
> 
If there is an implementation for sw-decoded YUV frames, gst-vaapi can try to follow it.
In my opinion, it is not so important to support YUV textures through a GL shader;
many drivers support it inside the 3D driver.
(Though the Intel driver doesn't; that should be a push for us. On the other hand, we can convert YUV to RGB before exporting it, which doesn't hit performance much.)
> Would this actually be needed in WebKit if gst-vaapi continues to support a texture as the interface instead of an EGL image?
A texture depends on the EGL context and GL context, and the choice between OpenGL and GLES is decided at runtime in webkit, so I hesitate to support a texture as the interface.
> 
> Then again, I suppose one argument in favour of using EGL images instead of texture is that OpenMax IL has IIRC an extension
> that allows for binding a video surface to an EGL image. So maybe that would make the gstreamer interface more useable in the long run.
> 
I also favour EGLImage; I'll just postpone it for several days.
> 
> If you decide to use the Qt port for testing/developing it, just make sure 
> Qt was configured with -opengl es2. You might also want to use wayland as 
> windowing system, i.e. build the qtwayland module.

Right now I'm using the gtk port on Ubuntu 12.04.
Comment 59 Zhao, Halley 2012-11-07 02:35:12 PST
Created attachment 172745 [details]
temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES, fail to compile

Temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES; it fails to compile.

I ran into a strange issue: compilation fails if I try to include GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp:
"MediaPlayerPrivateGStreamer.cpp:321:35: error: expected unqualified-id before numeric constant"
It seems to be a namespace issue, but I don't know where it breaks.

You can reproduce the issue simply by including GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp.
Comment 60 Philippe Normand 2012-11-08 02:22:30 PST
(In reply to comment #59)
> Created an attachment (id=172745) [details]
> temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES, fail to compile
> 
> Temp patch to add eglCreateImageKHR and glEGLImageTargetTexture2DOES; it fails to compile.
> 
> I ran into a strange issue: compilation fails if I try to include GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp:
> "MediaPlayerPrivateGStreamer.cpp:321:35: error: expected unqualified-id before numeric constant"
> It seems to be a namespace issue, but I don't know where it breaks.
> 
> You can reproduce the issue simply by including GLContextEGL.h in MediaPlayerPrivateGStreamer.cpp.

Preprocess the file with g++ -E ....cpp -o foo.cpp and check the resulting file.
Comment 61 Zhao, Halley 2012-11-08 23:49:02 PST
(In reply to comment #60)
> (In reply to comment #59)
> Preprocess the file with g++ -E ....cpp -o foo.cpp and check the resulting file.

Thanks Philippe.
The reason is that MediaPlayer::None is changed to MediaPlayer::0L by the C preprocessor.

I changed the definition of MediaPlayer::None to MediaPlayer::PreloadNone, which fixed the issue.
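For context, the clash comes from the X11 headers that GLContextEGL.h pulls in; X.h contains "#define None 0L", so any enumerator literally named None stops compiling:

    // Illustration of the clash (not code from the patch):
    #include <X11/X.h>                      // does: #define None 0L
    enum Preload { None, MetaData, Auto };  // expands to 'enum Preload { 0L, ... }': syntax error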
Comment 62 Philippe Normand 2012-11-12 03:30:44 PST
CCing GStreamer hacker extraordinaire, Sebastian Dröge.
Comment 63 Sebastian Dröge (slomo) 2012-11-12 06:24:00 PST
Note that the removal/replacement of the surfaceconverter/surface(buffer|meta) API will happen really soon, and it will be replaced with something more generic.

I wouldn't worry about doing anything for 0.10 here, just concentrate on 1.0/1.2.
Comment 64 Zhao, Halley 2012-11-12 17:12:36 PST
(In reply to comment #63)
> Note that the removal/replacement of the surfaceconverter/surface(buffer|meta) API will happen really soon, and it will be replaced with something more generic.
> I wouldn't worry about doing anything for 0.10 here, just concentrate on 1.0/1.2.

Thanks for the reminder.
I'm working on gst 0.10 first, since it's my platform's priority and gst-vaapi doesn't work with gst 1.0 yet.

After enabling this with gst 0.10, I will work on gst 1.0 and try to push it upstream.
Comment 65 Simon Hausmann 2012-11-12 23:57:22 PST
(In reply to comment #63)
> Note that the removal/replacement of the surfaceconverter/surface(buffer|meta) API will happen really soon, and it will be replaced with something more generic.

Interesting, I was secretly hoping for that :). Any hints about the successor?

> I wouldn't worry about doing anything for 0.10 here, just concentrate on 1.0/1.2.

Agreed!
Comment 66 Zhao, Halley 2012-11-14 19:29:17 PST
Created attachment 174322 [details]
add eglCreateImage/glEGLImageTargetTexture2DOES to webkit

After adding eglCreateImage/glEGLImageTargetTexture2DOES to webkit,
I can chain up each component and run it on my PC; however, the picture shows incorrectly (far from ok).

1. The EGLImage seems to be created correctly, judging from debugging; I will continue debugging it in the GL library.
2. On the gst side, I finally came to the same concept as the meta data in gst 1.0.
Comment 67 Zhao, Halley 2012-12-03 23:28:33 PST
Created attachment 177424 [details]
Bind texture from hw video surface which exports EGLClientBuffer information

This is the patch that works for me on the Intel SandyBridge platform; it uses an extended GstSurfaceBuffer to get the EGL client buffer information.
The way I extend GstSurfaceBuffer follows a similar idea to the Gst 1.0 meta data.
Comment 68 Philippe Normand 2012-12-03 23:34:50 PST
We're not likely to upstream a patch based on an unstable, deprecated API I'm afraid. The focus should be on GStreamer 1.0, in my opinion.
Comment 69 Zhao, Halley 2012-12-04 01:27:13 PST
I can port it to gst 1.0 (the meta data interface) when a hw codec is ready for gst 1.0.

gst-vaapi is being ported to gst 1.0; in the best case the porting will be finished by the end of this year.
Comment 70 Philippe Normand 2013-01-08 07:35:16 PST
(In reply to comment #69)
> I can port it to gst 1.0 (the meta data interface) when a hw codec is ready for gst 1.0.
> 
> gst-vaapi is being ported to gst 1.0; in the best case the porting will be finished by the end of this year.

https://gitorious.org/~sreerenj/vaapi/sree-gstreamer-vaapi/commits/1.0

I got this working here with gst 1.0 but it still uses the -bad unstable API. :(
Comment 71 Víctor M. Jáquez L. 2013-02-07 03:39:01 PST
Created attachment 187046 [details]
Rebase of the proposed patches (without gst surfaces)

I've updated the patch to use the current TextureMapperShaderProgram API.

Please note that it is still a work in progress.
Comment 72 Philippe Normand 2013-02-07 03:48:02 PST
(In reply to comment #71)
> Created an attachment (id=187046) [details]
> Rebase of Phil's patch
> 

FTR this is not my patch.
Comment 73 Víctor M. Jáquez L. 2013-02-22 03:50:17 PST
Created attachment 189741 [details]
simple patch for video composition

Since the gstreamer backend has been refactored into two classes, I applied the previous patches manually, and this is the result.

I tried to keep the simplest approach, taking many of the elements proposed by arno., such as updating the player in the renderer when the source changes and setting needs-display on the layer in renderLayerBacking, but removing the shader.

What I want is to keep most of the video processing in the GStreamer pipeline, and to avoid moving the buffer between contexts (GPU<->CPU). That is why applying the proposed shader would imply moving the buffer back and forth, impacting performance.

Still, the performance is bad. I will try to remove the color space conversion (YUV->RGB), and also try to avoid bringing the buffer into the CPU, instead pushing WebKit's texture to the GStreamer context. Does this make sense?
Comment 74 Víctor M. Jáquez L. 2013-02-22 06:11:18 PST
(In reply to comment #73)
> Created an attachment (id=189741) [details]
> simple patch for video composition
> 

Regarding all these patches, I've observed two bugs:

* The video doesn't play unless the mouse is above the video widget

* When two or more videos are rendered in the same page, I got this error:

[xcb] Unknown request in queue while dequeuing
[xcb] Most likely this is a multi-threaded client and XInitThreads has not been called
[xcb] Aborting, sorry about that.
GtkLauncher: xcb_io.c:178: dequeue_pending_request: Assertion `!xcb_xlib_unknown_req_in_deq' failed.

and the GtkLauncher crashes.
Comment 75 Víctor M. Jáquez L. 2013-03-04 07:56:23 PST
Created attachment 191247 [details]
simple patch for video composition

This new patch fixes the X problem, because the ::paintToTextureMapper() method is now as fast as possible. The buffer handling has been moved out to the gstreamer video sink callback.

Still, ::paintToTextureMapper() doesn't happen if the mouse is not above the video area. Furthermore, this method is not called as frequently as the video buffers arrive, so the video looks choppy.
Comment 76 Víctor M. Jáquez L. 2013-03-20 10:45:42 PDT
Created attachment 194084 [details]
New version of the composited video

This new version of the patch fixes the repainting problem when the pointer is outside the video area.
Comment 77 Víctor M. Jáquez L. 2013-03-25 10:05:54 PDT
Created attachment 194877 [details]
New version of the composited video

This new version of the patch honors the --enable-accelerated-compositing flag
Comment 78 Víctor M. Jáquez L. 2013-04-17 03:27:57 PDT
Created attachment 198493 [details]
composited video with GstSurface{Meta/Convert} support

This is an alternative version of my previous patch (attachment 194877 [details])

This version of the patch depends on:

1. The GstSurfaceMeta/GstSurfaceConvert interface, available in gst-plugins-bad v1.0 (though it has already been removed for v1.2)

2. The current HEAD of official gstreamer-vaapi (git://gitorious.org/vaapi/gstreamer-vaapi.git - a0e5876)

With this patch I'm able to render 1080p videos in GtkLauncher using an NVIDIA GPU (GT218 ION).
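The core of the GstSurfaceMeta path boils down to asking the decoder to upload its surface straight into a GL texture, so the frame never leaves the GPU. A sketch against the gst-plugins-bad 1.0 API (error handling omitted; m_textureId stands for a GL texture WebKit owns):

    GstSurfaceMeta* meta = reinterpret_cast<GstSurfaceMeta*>(
        gst_buffer_get_meta(buffer, GST_SURFACE_META_API_TYPE));
    if (meta) {
        GValue textureId = G_VALUE_INIT;
        g_value_init(&textureId, G_TYPE_UINT);
        g_value_set_uint(&textureId, m_textureId);

        // An "opengl" converter receives the destination texture id; the decoder
        // (gstreamer-vaapi here) blits the surface GPU-side, with no readback.
        GstSurfaceConverter* converter = gst_surface_meta_create_converter(meta, "opengl", &textureId);
        gst_surface_converter_upload(converter, buffer);
        g_object_unref(converter);
    }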
Comment 79 Víctor M. Jáquez L. 2013-04-17 03:34:22 PDT
My plan for this bug is

1. Split this patch, taking out the render-related bits, and file them in a different bug.

2. The other part of the patch will remain as a work in progress within this bug.

3. Work on the gstreamer-vaapi and vdpau decoders to support the interfaces in GStreamer. See http://blogs.igalia.com/vjaquez/2013/04/08/gstreamer-hackfest-2013-milan/

4. Update the patch to use those new interfaces

5. Add full-screen support

Does this plan make sense?
Comment 80 Philippe Normand 2013-04-17 04:20:56 PDT
Comment on attachment 198493 [details]
composited video with GstSurface{Meta/Convert} support

View in context: https://bugs.webkit.org/attachment.cgi?id=198493&action=review

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:311
> +    const void *srcData;

* misplaced

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:323
> +    // @TODO: support cropping

What's the plan for this? Get the GstCropMeta (or whatever the name is :)) and apply it in ::updateContents below?

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:381
> +        m_textureMapper = textureMapper; // @TODO Is this a sane enough?

Seems sane to me!

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:385
> +    if (m_texture) {
> +        textureMapper->drawTexture(*m_texture.get(), targetRect, matrix, opacity);
> +    }

No need for curly braces.

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:107
> +    virtual void paintToTextureMapper(TextureMapper*, const FloatRect& targetRect, const TransformationMatrix&, float opacity);

targetRect doesn't need to be named.

> Source/WebCore/rendering/RenderLayerBacking.cpp:1785
> +        if (m_graphicsLayer->hasContentsLayer()) {
> +            m_graphicsLayer->setContentsNeedsDisplay();
> +        } else if (m_graphicsLayer->drawsContent()) {
> +            m_graphicsLayer->setNeedsDisplay();
> +        }

Curly braces can be avoided here too, I suppose
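Summing up the style points above in WebKit form (the star hugs the type, and single-statement branches drop their braces):

    const void* srcData; // not "const void *srcData"

    if (m_graphicsLayer->hasContentsLayer())
        m_graphicsLayer->setContentsNeedsDisplay();
    else if (m_graphicsLayer->drawsContent())
        m_graphicsLayer->setNeedsDisplay();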
Comment 81 Víctor M. Jáquez L. 2013-05-03 09:40:36 PDT
Created attachment 200421 [details]
Composited video support

This patch enables the video rendering using the TextureMapper. 

Depends on the fix at bug #114742
Comment 82 Víctor M. Jáquez L. 2013-05-03 09:46:36 PDT
Created attachment 200423 [details]
GstSurface{Meta/Convert} support for gstreamer-vaapi elements (Gst 1.0)

This patch applies on top of attachment #200421 [details].

It enables Full HD rendering using GstSurfaceMeta and GstSurfaceConvert from GStreamer 1.0 (note: those APIs are already deprecated and will be replaced by new ones in GStreamer 1.2) with the gstreamer-vaapi decoding element.

This patch was split out because it is not mandatory for composited video and it uses deprecated GStreamer API.
Comment 83 Víctor M. Jáquez L. 2013-05-13 06:19:49 PDT
Created attachment 201553 [details]
Patch
Comment 84 Gustavo Noronha (kov) 2013-05-13 06:34:32 PDT
Comment on attachment 201553 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=201553&action=review

> Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:394
> +        return;

This got me curious: you get the texture here, so why do you return instead of drawing?
Comment 85 Philippe Normand 2013-05-13 06:43:19 PDT
(In reply to comment #84)
> (From update of attachment 201553 [details])
> View in context: https://bugs.webkit.org/attachment.cgi?id=201553&action=review
> 
> > Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:394
> > +        return;
> 
> This got me curious: you get the texture here, so why do you return instead of drawing?

Because it's empty when created, I suppose :)
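In code, the pattern being discussed looks roughly like this (a sketch, not the patch itself; videoSize is an illustrative placeholder):

    if (!m_texture) {
        m_texture = textureMapper->createTexture();
        m_texture->reset(videoSize, BitmapTexture::SupportsAlpha);
        return; // the texture we just created is still empty, nothing to draw yet
    }
    textureMapper->drawTexture(*m_texture.get(), targetRect, matrix, opacity);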
Comment 86 Gustavo Noronha (kov) 2013-05-13 07:12:39 PDT
Ah! I thought it was drawing to, not from, the texture. Thanks =)
Comment 87 Víctor M. Jáquez L. 2013-05-13 07:23:11 PDT
Created attachment 201564 [details]
Patch
Comment 88 Noam Rosenthal 2013-05-13 07:57:06 PDT
The texmap changes seem OK; I'll leave kov or Philippe the honors of r+ing/r-ing.
Comment 89 Philippe Normand 2013-05-13 08:01:09 PDT
Comment on attachment 201564 [details]
Patch

Looks good, thanks!
Comment 90 WebKit Commit Bot 2013-05-13 08:14:11 PDT
Comment on attachment 201564 [details]
Patch

Clearing flags on attachment: 201564

Committed r150014: <http://trac.webkit.org/changeset/150014>
Comment 91 WebKit Commit Bot 2013-05-13 08:14:22 PDT
All reviewed patches have been landed.  Closing bug.
Comment 95 Philippe Normand 2013-05-13 13:29:28 PDT
Reverted r150014 for reason:

Broke video playback on WK2 and some webgl tests

Committed r150035: <http://trac.webkit.org/changeset/150035>
Comment 96 Víctor M. Jáquez L. 2013-05-16 08:16:34 PDT
(In reply to comment #94)

> It is actually worse. HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/

That's odd. I have tested with an NVIDIA and an Intel GPU, and HTML5 videos work OK.

And it is a plain OpenGL texture; no other library is involved.
Comment 97 Kalyan 2013-05-16 12:35:58 PDT
(In reply to comment #96)
> (In reply to comment #94)
> 
> > It is actually worse. HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/
> 
> That's odd. I have tested with an NVIDIA and an Intel GPU, and HTML5 videos work OK.
> 
> And it is a plain OpenGL texture; no other library is involved.

The WebGL test failures seem a bit odd here.

The current solution would break HTML5 video for ports like EFL and Qt(??) that handle the final composition on the UI process side. We would need a solution similar to WebGL's (using GraphicsSurface).
Comment 98 Kalyan 2013-05-16 12:39:39 PDT
(In reply to comment #97)
> (In reply to comment #96)
> > (In reply to comment #94)
> > 
> > > It is actually worse. HTML5 video playback seems completely broken for us. I have tried to play a few HTML5 video demos on the Web and I see a black screen with only audio :/
> > 
> > That's odd. I have tested with an NVIDIA and an Intel GPU, and HTML5 videos work OK.
> > 
> > And it is a plain OpenGL texture; no other library is involved.
> 
> The WebGL test failures seem a bit odd here.
> 
The current solution would break HTML5 video for ports like EFL and Qt(??) that handle the final composition on the UI process side. We would need a solution similar to WebGL's (using GraphicsSurface) for us (the EFL port at least). Perhaps you could enable this only for the GTK port for now (at least with WebKit2) while we add the needed support on our side?
Comment 99 Víctor M. Jáquez L. 2013-05-29 02:54:19 PDT
Created attachment 203145 [details]
Patch
Comment 100 Noam Rosenthal 2013-05-29 03:02:45 PDT
Comment on attachment 203145 [details]
Patch

How about !USE(COORDINATED_GRAPHICS) instead of PLATFORM(GTK) ?
Comment 101 Kalyan 2013-05-29 03:16:04 PDT
(In reply to comment #100)
> (From update of attachment 203145 [details])
> How about !USE(COORDINATED_GRAPHICS) instead of PLATFORM(GTK) ?

!USE(COORDINATED_GRAPHICS) seems a better choice to me too.
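Sketched out, the suggested guard change would look like this (the exact guard in the patch may differ):

    // Before: restrict the texmap video path to the GTK port.
    #if USE(TEXTURE_MAPPER_GL) && PLATFORM(GTK)
    virtual bool supportsAcceleratedRendering() const { return true; }
    #endif

    // After: exclude only the ports that do the final composition in the UI
    // process (coordinated graphics) and thus need a GraphicsSurface instead.
    #if USE(TEXTURE_MAPPER_GL) && !USE(COORDINATED_GRAPHICS)
    virtual bool supportsAcceleratedRendering() const { return true; }
    #endif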
Comment 102 Víctor M. Jáquez L. 2013-05-29 04:16:57 PDT
Created attachment 203151 [details]
Patch
Comment 103 WebKit Commit Bot 2013-05-29 06:24:38 PDT
Comment on attachment 203151 [details]
Patch

Clearing flags on attachment: 203151

Committed r150890: <http://trac.webkit.org/changeset/150890>
Comment 104 WebKit Commit Bot 2013-05-29 06:24:46 PDT
All reviewed patches have been landed.  Closing bug.