[chromium/mac] Improve deviceDPI, rect, and availableRect computation
Created attachment 146427 [details] Patch
Created attachment 146428 [details] Patch
Comment on attachment 146428 [details] Patch
View in context: https://bugs.webkit.org/attachment.cgi?id=146428&action=review

> Source/WebKit/chromium/src/mac/WebScreenInfoFactory.mm:87
> +    float deviceDPI = 160 * deviceScaleFactor(view);

We have this 160 constant in ViewportArguments.cpp. Should we move it to a header where it can be shared?
Comment on attachment 146428 [details] Patch
View in context: https://bugs.webkit.org/attachment.cgi?id=146428&action=review

This looks reasonable to me, but you might want to wait for input from fsamuel.

> Source/WebKit/chromium/src/mac/WebScreenInfoFactory.mm:65
> +    if (window)
> +    {

Is this proper style? I'm not familiar with the style for .mm files.
Comment on attachment 146428 [details] Patch
View in context: https://bugs.webkit.org/attachment.cgi?id=146428&action=review

>> Source/WebKit/chromium/src/mac/WebScreenInfoFactory.mm:87
>> +    float deviceDPI = 160 * deviceScaleFactor(view);
> 
> We have this 160 constant in ViewportArguments.cpp. Should we move it to a header where it can be shared?

I don't know. Basing anything off DPI seems silly to me, but then again I don't do mobile stuff. (Does mobile have useful, real DPI data? I'm not sure why it would be less fake than on desktop, though.)

This is already duplicated in render_widget.cc. I think longer term it's better to get rid of the DPI stuff, especially if we remove that from the viewport tag like you hinted at on webkit-dev.
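For reference, the relation under discussion can be sketched as follows. This is a minimal sketch, not the actual WebKit code; the constant `kBaselineDPI` and the helper name are hypothetical stand-ins for the 160 literal currently duplicated in ViewportArguments.cpp, WebScreenInfoFactory.mm, and render_widget.cc.

```cpp
#include <cassert>

// Hypothetical shared constant: the nominal DPI that corresponds to a
// device scale factor of 1. WebKit currently hard-codes 160 in several
// places rather than sharing it from a header.
static const float kBaselineDPI = 160.0f;

// Sketch of the computation in WebScreenInfoFactory.mm: derive a nominal
// deviceDPI value from the platform-provided scale factor.
static float deviceDPIForScaleFactor(float scaleFactor)
{
    return kBaselineDPI * scaleFactor;
}
```

Moving the constant to a shared header would at least make the duplication explicit, though (as discussed below) the longer-term direction may be to drop the DPI units entirely.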
(In reply to comment #4)
> (From update of attachment 146428 [details])
> View in context: https://bugs.webkit.org/attachment.cgi?id=146428&action=review
> 
> This looks reasonable to me, but you might want to wait for input from fsamuel.
> 
> > Source/WebKit/chromium/src/mac/WebScreenInfoFactory.mm:65
> > +    if (window)
> > +    {
> 
> Is this proper style? I'm not familiar with the style for .mm files.

No, it's not. Fixed, thanks.
We've already removed it from the viewport. It seems possible that we can remove the concept of DPI from WebKit entirely. I'd need to grep the codebase to know for sure.
Created attachment 146430 [details] Patch
Created attachment 146431 [details] Patch for landing
Created attachment 146435 [details] Patch for landing
Comment on attachment 146435 [details] Patch for landing

Clearing flags on attachment: 146435

Committed r119786: <http://trac.webkit.org/changeset/119786>
All reviewed patches have been landed. Closing bug.
(In reply to comment #5)
> I don't know. Basing anything off DPI seems silly to me, but then again I don't do mobile stuff. (Does mobile have useful, real DPI data? I'm not sure why it would be less fake than on desktop though.)

Mobile does have useful DPI data. iOS has retina vs. non-retina. Android has DisplayMetrics.density [1], which is approximately equal to deviceDpi / 160, though it often gets snapped to the nearest 0.25 or 0.5. Other platforms have similar ways of dealing with high-dpi displays.

[1]: http://developer.android.com/reference/android/util/DisplayMetrics.html#density

> This is already duplicated in render_widget.cc. I think longer term it's better to get rid of the DPI stuff, especially if we remove that from the viewport tag like you hinted at on webkit-dev.

(In reply to comment #7)
> We've already removed it from the viewport. It seems possible that we can remove the concept of DPI from WebKit entirely. I'd need to grep the codebase to know for sure.

WebKit definitely needs to know about the deviceScaleFactor, which is based on device dpi (and typical viewing distance; see bug 88114). It's conceivable that we could just ask ports to provide the deviceScaleFactor directly instead of the device dpi, though it would be good to somehow make sure they calculate it correctly.
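The Android density snapping mentioned above can be sketched like this. This is an illustrative approximation only, not the real DisplayMetrics logic (which chooses from a set of density buckets); the helper name is hypothetical.

```cpp
#include <cmath>

// Hypothetical sketch: an Android-style density value, dpi / 160,
// snapped to the nearest 0.25 as described in the comment above.
// The real DisplayMetrics.density behavior is more nuanced.
static float snappedDensity(float dpi)
{
    float density = dpi / 160.0f;
    return std::round(density * 4.0f) / 4.0f; // snap to nearest 0.25
}
```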
(In reply to comment #13)
> > I don't know. Basing anything off DPI seems silly to me, but then again I don't do mobile stuff. (Does mobile have useful, real DPI data? I'm not sure why it would be less fake than on desktop though.)
> 
> Mobile does have useful DPI data. iOS has retina vs non-retina.

That's not general DPI, though; that's just 1 or 2.

> Android has DisplayMetrics.density [1], which is approximately equal to deviceDpi / 160, though it often gets snapped to the nearest 0.25 or 0.5. Other platforms have similar ways of dealing with high-dpi displays.
> 
> [1]: http://developer.android.com/reference/android/util/DisplayMetrics.html#density

This looks like a Java API (i.e. not exposed to the web). From what I can tell, the members computed here are currently used in two places (according to cs.chromium.org):

1. To compute the device scale factor, in render_widget.cc. This is done by `scale = dpi / 160;`. This is the part I need for Mac, and it could be done more directly by just giving this class a deviceScale member instead.
2. For viewport-related things. Adam says that no longer needs DPI.

> > This is already duplicated in render_widget.cc. I think longer term it's better to get rid of the DPI stuff, especially if we remove that from the viewport tag like you hinted at on webkit-dev.
> 
> (In reply to comment #7)
> > We've already removed it from the viewport. It seems possible that we can remove the concept of DPI from WebKit entirely. I'd need to grep the codebase to know for sure.
> 
> WebKit definitely needs to know about the deviceScaleFactor, which is based on device dpi (and typical viewing distance; see bug 88114). It's conceivable that we could just ask ports to provide the deviceScaleFactor directly instead of the device dpi, though it would be good to somehow make sure they calculate it correctly.

At least on Mac, that'd be a more convenient interface.
> WebKit definitely needs to know about the deviceScaleFactor, which is based on device dpi

Oh, definitely! It's just a question of units. I was wondering if we could get WebKit to use "device scale factor" units consistently rather than "DPI" units. If you trace these code paths, we end up multiplying and dividing by 160 a bunch of times, which is kind of silly. It's not a big deal either way.
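The round trip being criticized above can be sketched as follows. This is a simplified illustration with hypothetical function names, not the actual Chromium code paths: one side converts a scale factor into a nominal DPI, and the other immediately divides it back out.

```cpp
#include <cassert>

static const float kBaselineDPI = 160.0f;

// Roughly what WebScreenInfoFactory.mm does: scale factor -> nominal DPI.
static float dpiFromScaleFactor(float scaleFactor)
{
    return kBaselineDPI * scaleFactor;
}

// Roughly what render_widget.cc does: nominal DPI -> scale factor.
static float scaleFactorFromDPI(float dpi)
{
    return dpi / kBaselineDPI;
}
```

Since the two conversions cancel exactly, passing the device scale factor through directly would remove the 160 constant from the interface entirely.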
(In reply to comment #14)
> This looks like a Java API (i.e. not exposed to the web). From what I can tell, the members computed here are currently used in two places (according to cs.chromium.org):
> 
> 1. To compute the device scale factor, in render_widget.cc. This is done by `scale = dpi / 160;`. This is the part I need for Mac, and it could be done more directly by just giving this class a deviceScale member instead.

Agreed. But note that the deviceScale shouldn't necessarily be 1. On desktop Mac, when accurate dpi information is available, it should probably be max(1, true dpi / 96) in order to support high-dpi monitors (and comply with the definition of a CSS px); see bug 88114 for an explanation.

This assumes that deviceScaleFactor does something useful on desktop Mac; ideally it would increase pageZoomFactor proportionately, so that full-page zoom occurs. For example, on a theoretical "retina" MacBook, deviceScaleFactor would be 2, hence pageZoomFactor would be 2, and the initial containing block width would be half the width of the window in physical pixels, i.e. the width of the window in density-independent "UI" pixels.

(In reply to comment #15)
> I was wondering if we could get WebKit to use "device scale factor" units consistently rather than "DPI" units. If you trace these code paths, we end up multiplying and dividing by 160 a bunch of times, which is kind of silly. It's not a big deal either way.

Yeah, sounds sensible. Certainly most of the mobile ports seem to end up doing deviceScaleFactor = (systemDeviceScaleFactor * 160) / 160, which I agree is needlessly complicated (though if we change this, we should make sure to clearly comment how the deviceScaleFactor should be calculated, as it's not obvious).
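The desktop heuristic proposed above (max(1, true dpi / 96), per the CSS reference pixel) can be sketched like this. The helper name is hypothetical; this is a sketch of the suggested formula, not code that exists in WebKit.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical sketch: derive a desktop device scale factor from the
// monitor's true DPI, using the 96-dpi CSS reference pixel as baseline,
// and never scaling below 1 (low-dpi monitors are treated as 1:1).
static float desktopDeviceScaleFactor(float trueDPI)
{
    return std::max(1.0f, trueDPI / 96.0f);
}
```

On this sketch a conventional ~96-dpi monitor gets factor 1, while a double-density display gets factor 2, matching the "retina MacBook" example in the comment above.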
> Agreed. But note that the deviceScale shouldn't necessarily be 1. On desktop Mac, when accurate dpi information is available, it should probably be max(1, true dpi / 96)

I'm close to certain that it will always be 1 or 2 on desktop Macs. That's what all apps are going to assume. Macs don't try to work OK with all the hardware out there; instead, they try to work great with some of the hardware.