Bug 194817 - [EGL] Runtime support for RGB565 pixel layout
Summary: [EGL] Runtime support for RGB565 pixel layout
Status: RESOLVED FIXED
Alias: None
Product: WebKit
Classification: Unclassified
Component: WPE WebKit
Version: WebKit Nightly Build
Hardware: Unspecified
OS: Unspecified
Importance: P2 Normal
Assignee: Philippe Normand
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2019-02-19 07:02 PST by Philippe Normand
Modified: 2019-02-27 06:45 PST
CC: 6 users

See Also:


Attachments
Patch (34.56 KB, patch)
2019-02-19 07:30 PST, Philippe Normand
no flags Details | Formatted Diff | Diff
Patch (2.62 KB, patch)
2019-02-25 04:57 PST, Philippe Normand
no flags Details | Formatted Diff | Diff
Patch (3.82 KB, patch)
2019-02-25 05:02 PST, Philippe Normand
no flags Details | Formatted Diff | Diff
Patch (4.27 KB, patch)
2019-02-26 04:00 PST, Philippe Normand
cgarcia: commit-queue-
Details | Formatted Diff | Diff
Patch (4.21 KB, patch)
2019-02-26 04:06 PST, Philippe Normand
no flags Details | Formatted Diff | Diff
Patch (4.18 KB, patch)
2019-02-26 06:26 PST, Philippe Normand
cgarcia: review+
Details | Formatted Diff | Diff

Description Philippe Normand 2019-02-19 07:02:53 PST
Currently our graphics pipeline always relies on an ARGB8888 (32 bpp) pixel configuration. On some low-end (old) embedded platforms the graphics driver is sometimes optimized for 16 bpp configurations, such as RGB565. So it would be nice if there were a runtime setting that could be used on these platforms.

I started a patch...
Comment 1 Philippe Normand 2019-02-19 07:04:46 PST
http://www.barth-dev.de/online/rgb565-color-picker/

Especially (cheap) screens used with embedded devices do not provide 24-bit color depth. Moreover, storing and/or transmitting 3 bytes per pixel consumes quite a lot of memory and adds latency. RGB565 requires only 16 (5+6+5) bits/2 bytes and is commonly used with embedded screens.
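
As a quick illustration of the 5+6+5 layout, converting an 8-bit-per-channel color to RGB565 just keeps the top bits of each channel and packs them into 16 bits (a sketch for illustration, not code from the patch):

#include <cstdint>

static inline uint16_t packRGB565(uint8_t r, uint8_t g, uint8_t b)
{
    // Keep the top 5 bits of red and blue and the top 6 bits of green.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3);
}

// For example, packRGB565(255, 0, 0) == 0xF800 (pure red).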
Comment 2 Philippe Normand 2019-02-19 07:30:14 PST
Created attachment 362385 [details]
Patch
Comment 3 Philippe Normand 2019-02-19 07:51:12 PST
I'm not really happy with the added optional arguments; I can change them to use WTF::Optional instead, if that's deemed a better approach :)
Comment 4 Carlos Garcia Campos 2019-02-20 02:18:45 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

Why do we need a setting for this? Can't it be detected somehow?

> Source/WebCore/platform/graphics/GLContext.cpp:79
> +std::unique_ptr<GLContext> GLContext::createContextForWindow(GLNativeWindowType windowHandle, PlatformDisplay* platformDisplay, uint32_t eglConfigBitsPerPixel)

If this value is set in the platform display, we wouldn't need to pass it everywhere; we could get it from the display.

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:161
> +#if !PLATFORM(WPE)
> +    bitsPerPixel = 32;
> +#endif

Why is this specific to WPE?

> Source/WebCore/platform/graphics/egl/GLContextEGL.h:92
> +    static std::unique_ptr<GLContextEGL> createSurfacelessContext(PlatformDisplay&, EGLContext sharingContext = nullptr, uint32_t bitsPerPixel = 42);

why 42 here?

> Source/WebKit/UIProcess/API/C/WKPreferences.cpp:2123
> +#if PLATFORM(WPE)
> +void WKPreferencesSetEGLConfigBitsPerPixel(WKPreferencesRef preferencesRef, int bitsPerPixel)
> +{
> +    toImpl(preferencesRef)->setEglConfigBitsPerPixel(bitsPerPixel);
> +}
> +
> +bool WKPreferencesGetEGLConfigBitsPerPixel(WKPreferencesRef preferencesRef)
> +{
> +    return toImpl(preferencesRef)->eglConfigBitsPerPixel();
> +}
> +#endif

Please do not add a new C API unless it's used by WKTR
Comment 5 Philippe Normand 2019-02-20 02:49:18 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:161
>> +#endif
> 
> Why is this specific to WPE?

The whole patch is specific to WPE because I assumed the new setting would be useful for this port only, as it's the preferred port nowadays for embedded devices.

>> Source/WebCore/platform/graphics/egl/GLContextEGL.h:92
>> +    static std::unique_ptr<GLContextEGL> createSurfacelessContext(PlatformDisplay&, EGLContext sharingContext = nullptr, uint32_t bitsPerPixel = 42);
> 
> why 42 here?

Typo :)

>> Source/WebKit/UIProcess/API/C/WKPreferences.cpp:2123
>> +#endif
> 
> Please do not add a new C API unless it's used by WKTR

Hum, OK, but can I still have a web-setting without a C API?
Comment 6 Philippe Normand 2019-02-20 02:52:05 PST
(In reply to Carlos Garcia Campos from comment #4)
> Comment on attachment 362385 [details]
> Patch
> 
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=362385&action=review
> 
> Why do we need a setting for this? Can't it be detected somehow?
> 

I'll let others chime in on this.
For my case (i.MX51 board) it can't be detected, because most formats are already advertised as supported (from 16bpp to 32bpp) but only I know that 16bpp should be preferred. AFAIK the EGL config attributes provide no support for pixel-format "hinting".
Comment 7 Philippe Normand 2019-02-20 03:18:06 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

>>> Source/WebKit/UIProcess/API/C/WKPreferences.cpp:2123
>>> +#endif
>> 
>> Please do not add a new C API unless it's used by WKTR
> 
> Hum ok, but can I still have a web-setting without C API ?

The answer is 42^Wyes
Comment 8 Carlos Garcia Campos 2019-02-20 03:25:48 PST
(In reply to Philippe Normand from comment #7)
> Comment on attachment 362385 [details]
> Patch
> 
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=362385&action=review
> 
> >>> Source/WebKit/UIProcess/API/C/WKPreferences.cpp:2123
> >>> +#endif
> >> 
> >> Please do not add a new C API unless it's used by WKTR
> > 
> > Hum ok, but can I still have a web-setting without C API ?
> 
> The answer is 42^Wyes

Yes, I only complained about the C API. But I'm not sure about the glib one either. This is a very specific use case for a particular hw/driver, and it's not something we'd want to configure per web view. Maybe an env var would be more appropriate.
Comment 9 Philippe Normand 2019-02-20 06:54:04 PST
(In reply to Carlos Garcia Campos from comment #8)
> (In reply to Philippe Normand from comment #7)
> > Comment on attachment 362385 [details]
> > Patch
> > 
> > View in context:
> > https://bugs.webkit.org/attachment.cgi?id=362385&action=review
> > 
> > >>> Source/WebKit/UIProcess/API/C/WKPreferences.cpp:2123
> > >>> +#endif
> > >> 
> > >> Please do not add a new C API unless it's used by WKTR
> > > 
> > > Hum ok, but can I still have a web-setting without C API ?
> > 
> > The answer is 42^Wyes
> 
> Yes, I only complained about the C API. But I'm not sure about the glib one
> either. This is very specific use-case for a particular hw/driver, and it's
> not something we might want to configure per web view. Maybe an env var
> would be more appropriate.

Before I update the patch I would like to know which approach should be used...
An env var would be easier to support than a web-setting; in the end it's mostly a question of semantics. So... do we use an env-var or a web-setting for this?
Comment 10 Adrian Perez 2019-02-20 09:38:13 PST
(In reply to Philippe Normand from comment #6)
> (In reply to Carlos Garcia Campos from comment #4)
> > Comment on attachment 362385 [details]
> > Patch
> > 
> > View in context:
> > https://bugs.webkit.org/attachment.cgi?id=362385&action=review
> > 
> > Why do we need a setting for this? Can't it be detected somehow?
> > 
> 
> I'll let others chime in on this.
> For my case (i.MX51 board) it can't be detected because most formats are
> already advertised as supported (from 16bpp to 32bpp) but only I know that
> 16bpp should be preferred. AFAIK the EGL config attributes provide no
> support pixel format "hinting".

I have also seen some systems in which the GPU is perfectly capable
of rendering in 32bpp modes, but the attached display (typically some
kind of TFT LCD) only supports 16bpp. I have been working with one
where the framebuffer device (correctly) allows configuring only 16bpp
modes, but WebKit still needs to be patched to allow choosing an EGLConfig
where color components can have fewer than 8 bits. This patch (or something
similar) would help avoid the need to carry patches in my case as well.
Comment 11 Carlos Garcia Campos 2019-02-21 01:13:01 PST
(In reply to Adrian Perez from comment #10)
> (In reply to Philippe Normand from comment #6)
> > (In reply to Carlos Garcia Campos from comment #4)
> > > Comment on attachment 362385 [details]
> > > Patch
> > > 
> > > View in context:
> > > https://bugs.webkit.org/attachment.cgi?id=362385&action=review
> > > 
> > > Why do we need a setting for this? Can't it be detected somehow?
> > > 
> > 
> > I'll let others chime in on this.
> > For my case (i.MX51 board) it can't be detected because most formats are
> > already advertised as supported (from 16bpp to 32bpp) but only I know that
> > 16bpp should be preferred. AFAIK the EGL config attributes provide no
> > support pixel format "hinting".
> 
> I have also seen some systems in which the GPU is perfectly capable
> of rendering in 32bpp modes, but the display attached (typically some
> kind of TFT LCD) only supports 16bpp. I have been working with one
> where the framebuffer device (correctly) allows only configuring 16bpp
> modes but WebKit still needs to be patched to allow choosing an EGLConfig
> where color components can have less than 8bit. This patch (or similar)
> would help and avoid needing to apply patches in my case as well.

I'm only concerned about (not even opposed to) the setting, not the patch itself.
Comment 12 Philippe Normand 2019-02-21 01:27:46 PST
Yeah, I think we all agree the patch makes sense. The dilemma is about env-var vs setting...

An env-var would simplify the patch a lot, but it is less discoverable from the user's POV, unless we have a place in the docs where env-vars are documented?
Comment 13 Carlos Garcia Campos 2019-02-21 01:57:59 PST
(In reply to Philippe Normand from comment #12)
> Yeah I think we all agree the patch makes sense. The dilemma is about
> env-var vs setting...
> 
> An env-var would simplify the patch, a lot. But is less discoverable from
> the user POV, unless we have a place in the docs documenting env-vars?

We don't, but we should.
Comment 14 Adrian Perez 2019-02-21 02:07:51 PST
(In reply to Carlos Garcia Campos from comment #13)
> (In reply to Philippe Normand from comment #12)
> > Yeah I think we all agree the patch makes sense. The dilemma is about
> > env-var vs setting...
> > 
> > An env-var would simplify the patch, a lot. But is less discoverable from
> > the user POV, unless we have a place in the docs documenting env-vars?
> 
> We don't, but we should.

This is a good point. I have found myself now and then grepping
the WebKit sources to find out which env-var we had for this or
that (e.g. the one for pausing the WebProcess to give time
to attach a debugger).

Not all environment variables need to be in the documentation;
for example, the ones used for development would probably fit better
somewhere in the wiki.
Comment 15 Carlos Garcia Campos 2019-02-21 02:48:38 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:110
> +        EGL_SAMPLES, 0,

Why do you add EGL_SAMPLES too?

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:118
> +    EGLint bufferSize = bitsPerPixel;
> +    if (bufferSize == 16) {
> +        attributeList[9] = 0;
> +        attributeList[11] = 0;
> +    }

I don't understand this. 9 is EGL_STENCIL_SIZE and 11 is EGL_ALPHA_SIZE. Why do we only change them if bitsPerPixel is 16, don't we want to support other depth sizes? like 24 for example? And why do we set the values to 0?
Comment 16 Philippe Normand 2019-02-21 03:02:02 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:110
>> +        EGL_SAMPLES, 0,
> 
> Why do you add EGL_SAMPLES too?

Perhaps it can be unset, I just followed the same code as weston's simple-egl.

>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:118
>> +    }
> 
> I don't understand this. 9 is EGL_STENCIL_SIZE and 11 is EGL_ALPHA_SIZE. Why do we only change them if bitsPerPixel is 16, don't we want to support other depth sizes? like 24 for example? And why do we set the values to 0?

stencil and alpha are disabled here so that we have a chance to reach 16bpp... Disabling alpha means we use an RGB format, like RGB565, which uses 16 bits for storage.
For 24bpp, I think additional changes might be needed, I haven't investigated this.
Comment 17 Carlos Garcia Campos 2019-02-21 03:06:51 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

>>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:110
>>> +        EGL_SAMPLES, 0,
>> 
>> Why do you add EGL_SAMPLES too?
> 
> Perhaps it can be unset, I just followed the same code as weston's simple-egl.

But 0 is the default for both depth size and samples.

>>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:118
>>> +    }
>> 
>> I don't understand this. 9 is EGL_STENCIL_SIZE and 11 is EGL_ALPHA_SIZE. Why do we only change them if bitsPerPixel is 16, don't we want to support other depth sizes? like 24 for example? And why do we set the values to 0?
> 
> stencil and alpha are disabled here so that we have a chance to reach 16bpp... Disabling alpha means we use an RGB format, like RGB565, which uses 16 bits for storage.
> For 24bpp, I think additional changes might be needed, I haven't investigated this.

That means we are exposing a setting to set the depth size, but that only works with value 16 and what it does is disabling alpha, instead of changing the depth size in the config. It's very confusing.
Comment 18 Philippe Normand 2019-02-21 07:18:39 PST
Comment on attachment 362385 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362385&action=review

>>>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:110
>>>> +        EGL_SAMPLES, 0,
>>> 
>>> Why do you add EGL_SAMPLES too?
>> 
>> Perhaps it can be unset, I just followed the same code as weston's simple-egl.
> 
> But 0 is the default for both depth size and samples.

I'll revert this change in the next patch iteration.

>>>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:118
>>>> +    }
>>> 
>>> I don't understand this. 9 is EGL_STENCIL_SIZE and 11 is EGL_ALPHA_SIZE. Why do we only change them if bitsPerPixel is 16, don't we want to support other depth sizes? like 24 for example? And why do we set the values to 0?
>> 
>> stencil and alpha are disabled here so that we have a chance to reach 16bpp... Disabling alpha means we use an RGB format, like RGB565, which uses 16 bits for storage.
>> For 24bpp, I think additional changes might be needed, I haven't investigated this.
> 
> That means we are exposing a setting to set the depth size, but that only works with value 16 and what it does is disabling alpha, instead of changing the depth size in the config. It's very confusing.

I think you're confusing pixel formats with depth testing... Anyway I'll let actual experts step in here :)
Comment 19 Carlos Garcia Campos 2019-02-21 08:16:57 PST
(In reply to Philippe Normand from comment #18)
> Comment on attachment 362385 [details]
> Patch
> 
> View in context:
> https://bugs.webkit.org/attachment.cgi?id=362385&action=review
> 
> >>>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:110
> >>>> +        EGL_SAMPLES, 0,
> >>> 
> >>> Why do you add EGL_SAMPLES too?
> >> 
> >> Perhaps it can be unset, I just followed the same code as weston's simple-egl.
> > 
> > But 0 is the default for both depth size and samples.
> 
> I'll revert this change in the next patch iteration.
> 
> >>>> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:118
> >>>> +    }
> >>> 
> >>> I don't understand this. 9 is EGL_STENCIL_SIZE and 11 is EGL_ALPHA_SIZE. Why do we only change them if bitsPerPixel is 16, don't we want to support other depth sizes? like 24 for example? And why do we set the values to 0?
> >> 
> >> stencil and alpha are disabled here so that we have a chance to reach 16bpp... Disabling alpha means we use an RGB format, like RGB565, which uses 16 bits for storage.
> >> For 24bpp, I think additional changes might be needed, I haven't investigated this.
> > 
> > That means we are exposing a setting to set the depth size, but that only works with value 16 and what it does is disabling alpha, instead of changing the depth size in the config. It's very confusing.
> 
> I think you're confusing pixel formats with depth testing... Anyway I'll let
> actual experts step in here :)

Yes, probably; this is indeed confusing. It's weird to expose a setting that allows setting any bpp but only works if 0 or 16 is passed. Whatever bpp means, it should be an enum and not an integer, I would say.
Comment 20 Miguel Gomez 2019-02-21 08:48:22 PST
A couple of comments related to what you're discussing:

- EGL_DEPTH_SIZE is not set because we don't use depth testing, so we want it to be 0 (the default value). Same with EGL_SAMPLES.

- EGL_STENCIL_SIZE needs to be 8 because we need it to do stencil clipping. That's not related to bpp, as it's not transferred to the display.

- When setting the values of EGL_RED_SIZE, EGL_GREEN_SIZE and EGL_BLUE_SIZE to 1 you are saying that you want the smallest size available for those. But that could be RGB444 if that's supported, and not the RGB565 that you're expecting. If you want RGB565 specifically, you should set those values to 5, 6, and 5 respectively.
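
For illustration, an attribute list that requests RGB565 explicitly (keeping the stencil size at 8 for clipping, as noted above) could look roughly like this; the array name is made up and this is only a sketch, not the exact list in GLContextEGL.cpp:

static const EGLint rgb565ConfigAttributes[] = {
    EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    // RGB565 channel sizes and no alpha. EGL treats these as minimums,
    // so the returned configs may still need filtering for an exact match.
    EGL_RED_SIZE, 5,
    EGL_GREEN_SIZE, 6,
    EGL_BLUE_SIZE, 5,
    EGL_ALPHA_SIZE, 0,
    // Still required for stencil clipping.
    EGL_STENCIL_SIZE, 8,
    EGL_NONE
};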
Comment 21 Miguel Gomez 2019-02-21 08:57:23 PST
Maybe instead of receiving the bpp, what we need to receive is a format. For example WPE_EGL_FORMAT, allowing RGB565 for starters, and then falling back to RGBA8888 if that's not supported. Later we can add more formats if we want to.
Comment 22 Philippe Normand 2019-02-21 14:04:10 PST
(In reply to Miguel Gomez from comment #20)
> Couple of comments related to what you're talking:
> 
> - EGL_DEPTH_SIZE is not set because we don't use depth testing so we want it
> to be 0 (default value). Same with EGL_SAMPLES.
> 
> - EGL_STENCIL_SIZE needs to be 8 cause we need it to do stencil clipping.
> That's not related to bpp as that's not transferred to the display.
> 
> - When setting the values of EGL_RED_SIZE, EGL_BLUE_SIZE and EGL_GREEN_SIZE
> to 1 you are saying that you want the smallest size available for those. But
> that could be RGB444 if that's supported and not the RGB565 that you're
> expecting. If you want RGB565 specifically, you should set those values to
> 5, 6, and 5 respectively.

But RGB444 isn't a 16bpp format, is it?
As I understand it, setting RGB channels to 1 constitutes a low-barrier for filtering on EGL_BUFFER_SIZE later on.

(In reply to Miguel Gomez from comment #21)
> Maybe instead of receiving the bpp, what we need to receive is a format. For
> example WPE_EGL_FORMAT and allow RGB565 for starters, and then fallback to
> RGBA8888 if that's not supported. Later we can add more formats if we want
> to.

Yes... that's an option too. Explicit is better than implicit... But I'm not sure there's a use for, say, 24bpp formats... As mentioned in previous comments, there are quite a few setups where 16bpp is the preferred format.

In this patch I basically copied the behaviour of weston's simple-egl client, which has a runtime option for 16bpp formats...
Comment 23 Carlos Garcia Campos 2019-02-21 23:11:03 PST
(In reply to Philippe Normand from comment #22)
> (In reply to Miguel Gomez from comment #20)
> > Couple of comments related to what you're talking:
> > 
> > - EGL_DEPTH_SIZE is not set because we don't use depth testing so we want it
> > to be 0 (default value). Same with EGL_SAMPLES.
> > 
> > - EGL_STENCIL_SIZE needs to be 8 cause we need it to do stencil clipping.
> > That's not related to bpp as that's not transferred to the display.
> > 
> > - When setting the values of EGL_RED_SIZE, EGL_BLUE_SIZE and EGL_GREEN_SIZE
> > to 1 you are saying that you want the smallest size available for those. But
> > that could be RGB444 if that's supported and not the RGB565 that you're
> > expecting. If you want RGB565 specifically, you should set those values to
> > 5, 6, and 5 respectively.
> 
> But RGB444 isn't a 16bpp format, is it?
> As I understand it, setting RGB channels to 1 constitutes a low-barrier for
> filtering on EGL_BUFFER_SIZE later on.
> 
> (In reply to Miguel Gomez from comment #21)
> > Maybe instead of receiving the bpp, what we need to receive is a format. For
> > example WPE_EGL_FORMAT and allow RGB565 for starters, and then fallback to
> > RGBA8888 if that's not supported. Later we can add more formats if we want
> > to.
> 
> Yes... that's an option too. Explicit is better than implicit... But I'm not
> sure there's a use for say, 24bpp formats... As mentioned in previous
> comments there's quite a few setups where 16bpp is the preferred format.
> 
> In this patch I basically copied the behaviour of weston's simple-egl
> client, which has a runtime option for 16bpp formats...

Could we simplify it then as use-rgba-egl-config (or something similar) in the public API?
Comment 24 Carlos Garcia Campos 2019-02-21 23:14:00 PST
This is what gtk does in gdkglcontext-wayland.c:

use_rgba = (visual == gdk_screen_get_rgba_visual (gdk_display_get_default_screen (display)));

if (use_rgba)
  {
    attrs[i++] = EGL_ALPHA_SIZE;
    attrs[i++] = 1;
  }
else
  {
    attrs[i++] = EGL_ALPHA_SIZE;
    attrs[i++] = 0;
  }

So, for the GTK port we could get the web view visual and send use_rgba to the web process without any new public API (I'm not asking to do this as part of this patch, though)
Comment 25 Philippe Normand 2019-02-22 01:01:53 PST
Disabling alpha is not enough because it might lead to RGB888 being selected.
Comment 26 Carlos Garcia Campos 2019-02-22 06:06:43 PST
(In reply to Philippe Normand from comment #25)
> Disabling alpha is not enough because it might lead to RGB888 being selected.

GTK also uses 1 for EGL_RED_SIZE, EGL_BLUE_SIZE and EGL_GREEN_SIZE
Comment 27 Philippe Normand 2019-02-25 04:19:34 PST
So which approach should I use? Env var or setting? I'd like to get this in WPE 2.24 if possible, so we need to settle on the approach :)
Comment 28 Carlos Garcia Campos 2019-02-25 04:37:04 PST
(In reply to Philippe Normand from comment #27)
> So which approach should I use? Env var or setting? I'd like to get this in
> WPE 2.24 if possible, so we need to settle on the approach :)

I think it's too late for new API in 2.24, especially when it's not clear enough how this should be exposed in the API.
Comment 29 Carlos Garcia Campos 2019-02-25 04:38:15 PST
(In reply to Carlos Garcia Campos from comment #26)
> (In reply to Philippe Normand from comment #25)
> > Disabling alpha is not enough because it might lead to RGB888 being selected.
> 
> GTK also uses 1 for EGL_RED_SIZE, EGL_BLUE_SIZE and EGL_GREEN_SIZE

And it allows setting a different visual per window, so maybe we should also allow using a different config per web view in the end.
Comment 30 Philippe Normand 2019-02-25 04:57:00 PST
Created attachment 362894 [details]
Patch
Comment 31 Philippe Normand 2019-02-25 04:59:17 PST
Comment on attachment 362894 [details]
Patch

Oops, this is incomplete
Comment 32 Philippe Normand 2019-02-25 05:02:09 PST
Created attachment 362895 [details]
Patch
Comment 33 Carlos Garcia Campos 2019-02-26 01:40:29 PST
Comment on attachment 362895 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362895&action=review

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:114
> +    const char* environmentVariable = getenv("WEBKIT_EGL_PIXEL_FORMAT");
> +    if (environmentVariable && *environmentVariable) {

I don't think we need to check it's not empty, you could simply:

if (const char* pixelFormat = getenv("WEBKIT_EGL_PIXEL_FORMAT")) {

}

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:120
> +            attributeList[3] = 5;
> +            attributeList[5] = 6;
> +            attributeList[7] = 5;
> +            attributeList[9] = 0;

This is confusing; please add a comment before every line saying which value it is for:

// EGL_RED_SIZE
attributeList[3] = 5;
// EGL_GREEN_SIZE
attributeList[5] = 6;
.....

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:122
> +            g_warning("Value supplied for WEBKIT_EGL_PIXEL_FORMAT should be RGB565 or the environment variable should not be set at all.");

Do not use g_warning() in this file; it is not GLib-specific. You can use WTFLogAlways. I would say something like: "Unknown pixel format %s, falling back to RGBA8888".
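
Putting the two previous suggestions together, the env-var handling could end up looking roughly like this (just a fragment/sketch; it assumes indices 3, 5, 7 and 9 of attributeList correspond to EGL_RED_SIZE, EGL_GREEN_SIZE, EGL_BLUE_SIZE and EGL_ALPHA_SIZE, as the hunk above suggests):

if (const char* pixelFormat = getenv("WEBKIT_EGL_PIXEL_FORMAT")) {
    if (!strcmp(pixelFormat, "RGB565")) {
        // EGL_RED_SIZE
        attributeList[3] = 5;
        // EGL_GREEN_SIZE
        attributeList[5] = 6;
        // EGL_BLUE_SIZE
        attributeList[7] = 5;
        // EGL_ALPHA_SIZE
        attributeList[9] = 0;
    } else
        WTFLogAlways("Unknown pixel format %s, falling back to RGBA8888", pixelFormat);
}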

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:141
> +    EGLint count;
> +    if (!eglGetConfigs(display, nullptr, 0, &count) || !count)
> +        return false;

I don't think we want all the display configs, only the ones matching the attributes. You can use eglChooseConfig() passing nullptr as configs to get the number of configs.

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:143
> +    EGLConfig* configs = reinterpret_cast<EGLConfig*>(fastCalloc(sizeof(EGLConfig), count));

Use unique_ptr here, this is leaked if eglChooseConfig fails below.

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:146
>      EGLint numberConfigsReturned;
> -    return eglChooseConfig(display, attributeList, config, 1, &numberConfigsReturned) && numberConfigsReturned;
> +    if (!eglChooseConfig(display, attributeList, configs, count, &numberConfigsReturned) || !numberConfigsReturned)
> +        return false;

I think we can keep using 1 config only for the non-RGB565 case, since we don't need to iterate configs in that case. We could do something like:

if (!eglChooseConfig(display, attributeList, isRGB565 ? configs : config,  isRGB565 ? count : 1, &numberConfigsReturned) || !numberConfigsReturned)
    return false;

if (!isRGB565)
    return true;

// Iterate configs to find the one matching.

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:150
> +        eglGetConfigAttrib(display, configs[i], EGL_BUFFER_SIZE, &size);

Is buffer size really enough? Couldn't there be other 16-bit formats that are not RGB565? I think it would be safer to get EGL_RED_SIZE, EGL_GREEN_SIZE, EGL_BLUE_SIZE, and EGL_ALPHA_SIZE, and check that they are 5, 6, 5 and 0.
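
A rough sketch of what that matching could look like (the function name and the use of WTF::Vector here are only illustrative, not the final patch):

static bool findRGB565Config(EGLDisplay display, const EGLint* attributeList, EGLConfig* config)
{
    // Ask how many configs match the attribute list before allocating storage for them.
    EGLint count = 0;
    if (!eglChooseConfig(display, attributeList, nullptr, 0, &count) || !count)
        return false;

    Vector<EGLConfig> configs(count);
    EGLint numberConfigsReturned = 0;
    if (!eglChooseConfig(display, attributeList, configs.data(), count, &numberConfigsReturned) || !numberConfigsReturned)
        return false;

    // Pick the first config whose channel sizes are exactly 5/6/5 with no alpha.
    for (EGLint i = 0; i < numberConfigsReturned; ++i) {
        EGLint red = 0, green = 0, blue = 0, alpha = 0;
        eglGetConfigAttrib(display, configs[i], EGL_RED_SIZE, &red);
        eglGetConfigAttrib(display, configs[i], EGL_GREEN_SIZE, &green);
        eglGetConfigAttrib(display, configs[i], EGL_BLUE_SIZE, &blue);
        eglGetConfigAttrib(display, configs[i], EGL_ALPHA_SIZE, &alpha);
        if (red == 5 && green == 6 && blue == 5 && !alpha) {
            *config = configs[i];
            return true;
        }
    }
    return false;
}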
Comment 34 Philippe Normand 2019-02-26 04:00:19 PST
Created attachment 362974 [details]
Patch
Comment 35 Philippe Normand 2019-02-26 04:06:57 PST
Created attachment 362975 [details]
Patch
Comment 36 Carlos Garcia Campos 2019-02-26 04:14:34 PST
Comment on attachment 362974 [details]
Patch

View in context: https://bugs.webkit.org/attachment.cgi?id=362974&action=review

> Source/WebCore/ChangeLog:3
> +        [WPE][EGL] Runtime support for RGB565 pixel layout

This is not WPE specific anymore

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:150
> +    auto configs = std::make_unique<Vector<EGLConfig>>(count);

Using Vector is a good idea, but then you don't need to use std::unique_ptr, just keep the vector stack allocated.

> Source/WebCore/platform/graphics/egl/GLContextEGL.cpp:162
> +        return (redSize == 5 && greenSize == 6 && blueSize == 5);

Don't we need to check alpha is 0 too?
Comment 37 Philippe Normand 2019-02-26 06:26:30 PST
Created attachment 362978 [details]
Patch
Comment 38 Philippe Normand 2019-02-26 07:13:51 PST
Committed r242084: <https://trac.webkit.org/changeset/242084>
Comment 39 Michael Catanzaro 2019-02-26 09:27:28 PST
(In reply to Philippe Normand from comment #27)
> So which approach should I use? Env var or setting? I'd like to get this in
> WPE 2.24 if possible, so we need to settle on the approach :)

We shouldn't add environment variables that must be set for WebKit to work properly, unless there's also corresponding API. If applications rely on setting the environment variable, they will -- almost guaranteed -- do so in an unsafe manner. But if we provide a corresponding API, then the environment variable becomes OK, since applications will very likely use the API, and the environment variable becomes just a convenience for debugging.

If there's really absolutely no way for WebKit to know the right pixel layout to use on its own, then we must add new API. That's better than requiring applications set an environment variable to work. But it would also be sad, because we try hard to avoid exposing low-level details like this in our API. Maybe it should be exposed only for WPE.
Comment 40 Philippe Normand 2019-02-26 09:38:20 PST
(In reply to Michael Catanzaro from comment #39)
> (In reply to Philippe Normand from comment #27)
> > So which approach should I use? Env var or setting? I'd like to get this in
> > WPE 2.24 if possible, so we need to settle on the approach :)
> 
> We shouldn't add environment variables that must be set for WebKit to work
> properly, 

s/must/may/
You once again exaggerated things.

This patch is about optimizing rendering; if the var is not set, rendering will still work, although on some platforms not optimally.

> unless there's also corresponding API. If applications rely on
> setting the environment variable, they will -- almost guaranteed -- do so in
> an unsafe manner. But if we provide a corresponding API, then the
> environment variable becomes OK, since applications will very likely use the
> API, and the environment variable becomes just a convenience for debugging.
> 
> If there's really absolutely no way for WebKit to know the right pixel
> layout to use on its own, then we must add new API. That's better than
> requiring applications set an environment variable to work. But it would
> also be sad, because we try hard to avoid exposing low-level details like
> this in our API. Maybe it should be exposed only for WPE.

I agree it doesn't make much sense to have GTK API for this.
Comment 41 Carlos Garcia Campos 2019-02-26 23:49:29 PST
(In reply to Michael Catanzaro from comment #39)
> (In reply to Philippe Normand from comment #27)
> > So which approach should I use? Env var or setting? I'd like to get this in
> > WPE 2.24 if possible, so we need to settle on the approach :)
> 
> We shouldn't add environment variables that must be set for WebKit to work
> properly, unless there's also corresponding API.

That's not the case here.

> If applications rely on
> setting the environment variable, they will -- almost guaranteed -- do so in
> an unsafe manner. 

Sure, you know it.

> But if we provide a corresponding API, then the
> environment variable becomes OK, since applications will very likely use the
> API, and the environment variable becomes just a convenience for debugging.

Please, read the other comments in this bug. The plan is to add API for this, but we haven't agreed on how to expose it yet. Phil needs this in the 2.24 branch, and I don't want to add new API in 2.24 at this point without being sure about the best approach to expose this. So, the env var allows us to add the functionality while we decide on the API for 2.26.

> If there's really absolutely no way for WebKit to know the right pixel
> layout to use on its own, then we must add new API.

Again, that's the plan.

> That's better than
> requiring applications set an environment variable to work.

Again, that's not the case.

> But it would
> also be sad, because we try hard to avoid exposing low-level details like
> this in our API. Maybe it should be exposed only for WPE.

In GTK we might try to guess the right pixel layout from the GdkWindow visual.
Comment 42 Michael Catanzaro 2019-02-27 06:45:19 PST
OK, that seems perfectly fine! We just need to follow up with new API for 2.26. And document it here:

https://trac.webkit.org/wiki/EnvironmentVariables

(In reply to Carlos Garcia Campos from comment #41)
> > We shouldn't add environment variables that must be set for WebKit to work
> > properly, unless there's also corresponding API.
> 
> That's not the case here.

It could be restated as: we shouldn't add environment variables that applications will be tempted to set.