Bug 100964 - [Qt] Make sure that the WebView in flickable mode sends mouse events to the page rather than controlling the viewport with it
Summary: [Qt] Make sure that the WebView in flickable mode sends mouse events to the p...
Status: RESOLVED FIXED
Alias: None
Product: WebKit
Classification: Unclassified
Component: WebKit Qt
Version: 528+ (Nightly build)
Hardware: Unspecified Unspecified
Importance: P2 Normal
Assignee: Jocelyn Turcotte
URL:
Keywords: Qt
Depends on:
Blocks: 76773
Reported: 2012-11-01 09:16 PDT by Jocelyn Turcotte
Modified: 2012-11-29 03:15 PST

See Also:


Attachments
Patch (2.79 KB, patch)
2012-11-28 08:46 PST, Jocelyn Turcotte
kenneth: review+

Description Jocelyn Turcotte 2012-11-01 09:16:32 PDT
A workstation without a touch device should be able to use the WebView in flickable mode and still scroll with the wheel and see scrollbars, as in a desktop WebView component. It should also be able to select text and click links without touch adjustment kicking in, and no tap highlighting should be shown.

A workstation having both a touch device and a mouse should be able to select text with the mouse and flick the view with the touch device. Touch enabled web pages like Google maps should be able to take advantage of the touch device too.

We should set the correct behavior before we release 5.0 to make sure that we don't have to change it afterward.

Right now the only change that seems to be needed for this is removing QQuickWebViewFlickablePrivate::handleMouseEvent.
The only problem would be with touch screens on OSes that send both touch and mouse events (if those still exist). We have to test it properly.
Comment 1 Andras Becsi 2012-11-01 09:24:32 PDT
The work on this bug and the introduction of a desktop WebView component should also make it possible in the long term to remove the desktop (legacy) codepaths.
Comment 2 Sergio Villar Senin 2012-11-01 12:51:00 PDT
(In reply to comment #0)
> A workstation without a touch device should be able to use the WebView in flickable mode and still scroll with the wheel and see scrollbars, as in a desktop WebView component. It should also be able to select text and click links without touch adjustment kicking in, and no tap highlighting should be shown.
> 

Fantastic specification. I'd add that it should also be able to drag draggable elements inside the page (images, for example).

> A workstation having both a touch device and a mouse should be able to select text with the mouse and flick the view with the touch device. Touch enabled web pages like Google maps should be able to take advantage of the touch device too.
> 

Not sure at all about d&d in this case, looks complex.

> We should set the correct behavior before we release 5.0 to make sure that we don't have to change it afterward.
> 
> Right now the only change that seems to be needed for this is removing QQuickWebViewFlickablePrivate::handleMouseEvent.
> The only problem would be with touch screens on OSes that send both touch and mouse events (if those still exist). We have to test it properly.
Comment 3 Sergio Villar Senin 2012-11-13 17:29:48 PST
So, in order to keep this bug rolling, I think it is important to consider how events are managed in the Qt WK2 port. Although I have some experience with WebKit, I've just started working on WebKitQt, so please correct me if I'm wrong.

Basically once we get an event from the UI there are 2 high level code paths:

1) the QtWebPageEventHandler redirects the event to the WebProcess and then WebCore processes it and informs the UIProcess using the different available clients.

2) the QtWebPageEventHandler delegates the handling of the event to one of the QtGestureRecognizers available. The gesture recognizer could then decide to send the event back to the event handler or to the viewport controller for further processing.

Jocelyn suggests to remove QQuickWebViewFlickablePrivate::handleMouseEvent. What would that mean? Right now, if using the flickable version, all the events follow path 2. This means that for example all drag actions are considered a flick. If we remove that handler, then mouse events will follow path 1, which means that WebCore will directly handle them, and will consider the previous flick gesture as a drag start event.

So if we consider the typical case of a desktop browser using the flickable WebView (which will be the only one if I understood some of you correctly) the current architecture forces us to decide between having flickable events or the classical drag events. The drag start will also collide with some other events, for example a tap and hold.

How is the port supposed to handle these conflicts?
Comment 4 Jocelyn Turcotte 2012-11-14 01:37:21 PST
(In reply to comment #3)
> Basically once we get an event from the UI there are 2 high level code paths:
> 
> 1) the QtWebPageEventHandler redirects the event to the WebProcess and then WebCore processes it and informs the UIProcess using the different available clients.
> 
> 2) the QtWebPageEventHandler delegates the handling of the event to one of the QtGestureRecognizers available. The gesture recognizer could then decide to send the event back to the event handler or to the viewport controller for further processing.

The way I think it works right now (I'm no expert on this either) is that those two code paths are actually sequential.
Both touch and mouse events will first go through the web process (WebPage::touchEvent), and if the event wasn't handled by the page, it is returned to the UI process (WebPageProxy::didReceiveEvent), which will then give it back to QtWebPageEventHandler through PageClient::doneWithTouchEvent.

For mouse events however, WebPageProxy::didReceiveEvent doesn't push them back to the higher layers.

We might have to add shortcuts at some point to prevent a busy web process from delaying viewport interaction by sending events directly to the event handler (e.g. if we know the page doesn't have touch handlers at this position), but the logic should still apply.
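The sequential flow described above might be modeled like this (a simplified sketch: the Outcome struct and dispatchTouch function are illustrative; only WebPage::touchEvent and doneWithTouchEvent are real names from the comment):

```cpp
#include <functional>
#include <string>

// Hypothetical sketch of the sequential flow: the event first goes through
// the web process, and only a touch event the page did not handle is handed
// back to the UI-side event handler for gesture recognition.
struct Outcome { bool handledByPage; };

std::string dispatchTouch(const std::function<Outcome()>& sendToWebProcess) {
    Outcome o = sendToWebProcess();  // roughly WebPage::touchEvent in the web process
    if (o.handledByPage)
        return "consumed-by-page";   // the page had a touch handler for it
    return "returned-to-ui";         // roughly doneWithTouchEvent: gesture recognition
}
```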

> Jocelyn suggests to remove QQuickWebViewFlickablePrivate::handleMouseEvent. What would that mean? Right now, if using the flickable version, all the events follow path 2. This means that for example all drag actions are considered a flick. If we remove that handler, then mouse events will follow path 1, which means that WebCore will directly handle them, and will consider the previous flick gesture as a drag start event.
> 
> So if we consider the typical case of a desktop browser using the flickable WebView (which will be the only one if I understood some of you correctly) the current architecture forces us to decide between having flickable events or the classical drag events. The drag start will also collide with some other events, for example a tap and hold.
> 
> How is the port supposed to handle these conflicts?

The way I see it is that the WebView is only behaving as a flickable if you interact with it using touch events. To scroll the view using a mouse there are still the mouse wheel and scrollbars. Nobody (I assume) wants to flick using a mouse anyway.
That means that there would be no way of drag'n dropping using a touch device.
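As a rough sketch of the routing this proposal implies (the enum, the function, and the return values are illustrative, not actual WebKit API):

```cpp
#include <string>

// Sketch of the proposed input routing: after removing
// QQuickWebViewFlickablePrivate::handleMouseEvent, touch drives the
// flickable viewport while mouse and wheel events behave as on a desktop page.
enum class Input { Touch, Wheel, Mouse };

std::string flickableRoute(Input in) {
    switch (in) {
    case Input::Touch: return "gesture-recognizers"; // pan/flick the viewport
    case Input::Wheel: return "scroll-viewport";     // the wheel still scrolls
    case Input::Mouse: return "page";                // select text, click links, drag
    }
    return "page"; // unreachable; silences compiler warnings
}
```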
Comment 5 Sergio Villar Senin 2012-11-14 13:04:29 PST
(In reply to comment #4)
> (In reply to comment #3)
> > Basically once we get an event from the UI there are 2 high level code paths:
> > 
> > 1) the QtWebPageEventHandler redirects the event to the WebProcess and then WebCore processes it and informs the UIProcess using the different available clients.
> > 
> > 2) the QtWebPageEventHandler delegates the handling of the event to one of the QtGestureRecognizers available. The gesture recognizer could then decide to send the event back to the event handler or to the viewport controller for further processing.
> 
> The way I think it works right now (I'm no expert on this either) is that those two code paths are actually sequential.
> Both touch and mouse events will first go through the web process (WebPage::touchEvent), and if the event wasn't handled by the page, it is returned to the UI process (WebPageProxy::didReceiveEvent), which will then give it back to QtWebPageEventHandler through PageClient::doneWithTouchEvent.
> 
> For mouse events however, WebPageProxy::didReceiveEvent doesn't push them back to the higher layers.

Hmm, taking a look at the code, the flickable private object redirects all mouse events to QtWebPageEventHandler::handleInputEvent. This means that the event will be handled by the gesture recognizers on the UI side and nothing will be sent to the WebProcess, or am I wrong? As you said, if we remove that handler then all the events will be sent to the WebProcess directly.

> > So if we consider the typical case of a desktop browser using the flickable WebView (which will be the only one if I understood some of you correctly) the current architecture forces us to decide between having flickable events or the classical drag events. The drag start will also collide with some other events, for example a tap and hold.
> > 
> > How is the port supposed to handle these conflicts?
> 
> The way I see it is that the WebView is only behaving as a flickable if you interact with it using touch events. To scroll the view using a mouse there are still the mouse wheel and scrollbars. Nobody (I assume) wants to flick using a mouse anyway.
> That means that there would be no way of drag'n dropping using a touch device.

Ok, that looks like a good trade-off. I am not sure that nobody wants to flick using a mouse, though. For example, I was thinking about interfaces like the Wii, where you have a pointer device that effectively acts as a mouse, but IIRC the Opera browser it ships does some kind of panning (flick) instead of scrolling. The hybrid devices mentioned somewhere else could be another example.
Comment 6 Jocelyn Turcotte 2012-11-15 03:08:46 PST
(In reply to comment #5)
> Hmm, taking a look at the code, the flickable private object redirects all mouse events to QtWebPageEventHandler::handleInputEvent. This means that the event will be handled by the gesture recognizers on the UI side and nothing will be sent to the WebProcess, or am I wrong? As you said, if we remove that handler then all the events will be sent to the WebProcess directly.

Yep, exactly, that's why selecting text and drag-and-dropping aren't working. We treat mouse clicks as touch taps if the button was released without moving and send them to the web process.
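A minimal sketch of that click-to-tap rule (the function name and the slop threshold are assumptions for illustration, not values from the WebKit code):

```cpp
#include <cmath>

// Illustrative sketch: a mouse click is treated as a touch tap only when the
// button is released without (noticeable) movement. The 5 px slop is an
// assumption for the sketch.
bool clickCountsAsTap(double pressX, double pressY,
                      double releaseX, double releaseY,
                      double slopPx = 5.0) {
    // Distance moved between press and release; within the slop it's a tap.
    return std::hypot(releaseX - pressX, releaseY - pressY) <= slopPx;
}
```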

> Ok, that looks like a good trade-off. I am not sure that nobody wants to flick using a mouse, though. For example, I was thinking about interfaces like the Wii, where you have a pointer device that effectively acts as a mouse, but IIRC the Opera browser it ships does some kind of panning (flick) instead of scrolling. The hybrid devices mentioned somewhere else could be another example.

I think that this should then be patched in the distribution or enabled through a platform plugin. In other words, the platform should be dealing with this and not the application code IMO. If platform-specific applications want to enable this, we could expose a setting through a private API.
Comment 7 Simon Hausmann 2012-11-19 01:16:38 PST
(In reply to comment #0)
[...]
> Right now the only change that seems to be needed for this is removing QQuickWebViewFlickablePrivate::handleMouseEvent.
> The only problem would be with touch screens on OSes sending both touch and mouse events (if that still exist). We have to test it properly.

I believe that is, for example, the case on Mac OS X with a multi-point touchpad, isn't it?
Comment 8 Andras Becsi 2012-11-19 03:54:29 PST
(In reply to comment #4)
> (In reply to comment #3)
> > Basically once we get an event from the UI there are 2 high level code paths:
> > 
> > 1) the QtWebPageEventHandler redirects the event to the WebProcess and then WebCore processes it and informs the UIProcess using the different available clients.
> > 
> > 2) the QtWebPageEventHandler delegates the handling of the event to one of the QtGestureRecognizers available. The gesture recognizer could then decide to send the event back to the event handler or to the viewport controller for further processing.
> 
> The way I think it works right now (I'm no expert on this either) is that those two code paths are actually sequential.
> Both touch and mouse events will first go through the web process (WebPage::touchEvent), and if the event wasn't handled by the page, it is returned to the UI process (WebPageProxy::didReceiveEvent), which will then give it back to QtWebPageEventHandler through PageClient::doneWithTouchEvent.

Exactly. At this point the event is processed by the gesture recognizers and, depending on the gesture, we either control the flickable (QQuickWebView::handleFlickableMouse*) for pan gestures, synthesize a tap, act on a double tap, or scale the page item if the gesture was a pinch.
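That dispatch could be sketched as follows (only QQuickWebView::handleFlickableMouse* is a real name from the comment; the enum and return values are illustrative):

```cpp
#include <string>

// Sketch of the gesture dispatch: each recognized gesture maps to one of the
// UI-side actions described in the comment above.
enum class Gesture { Pan, Tap, DoubleTap, Pinch };

std::string onGestureRecognized(Gesture g) {
    switch (g) {
    case Gesture::Pan:       return "control-flickable"; // handleFlickableMouse*
    case Gesture::Tap:       return "synthesize-tap";    // send a tap to the page
    case Gesture::DoubleTap: return "zoom-to-target";    // act on the double tap
    case Gesture::Pinch:     return "scale-page-item";   // pinch-zoom the page item
    }
    return "ignore"; // unreachable; silences compiler warnings
}
```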

> 
> For mouse events however, WebPageProxy::didReceiveEvent doesn't push them back to the higher layers.
> 
> We might have to add shortcuts at some point to prevent a busy web process from delaying viewport interaction by sending events directly to the event handler (e.g. if we know the page doesn't have touch handlers at this position), but the logic should still apply.

Yes, I think the shortcutting is still something we want to do later, because as web pages have more and more event handlers our interaction might end up being sluggish, which we do not want.

> 
> > Jocelyn suggests to remove QQuickWebViewFlickablePrivate::handleMouseEvent. What would that mean? Right now, if using the flickable version, all the events follow path 2. This means that for example all drag actions are considered a flick. If we remove that handler, then mouse events will follow path 1, which means that WebCore will directly handle them, and will consider the previous flick gesture as a drag start event.
> > 
> > So if we consider the typical case of a desktop browser using the flickable WebView (which will be the only one if I understood some of you correctly) the current architecture forces us to decide between having flickable events or the classical drag events. The drag start will also collide with some other events, for example a tap and hold.
> > 
> > How is the port supposed to handle these conflicts?
> 
> The way I see it is that the WebView is only behaving as a flickable if you interact with it using touch events. To scroll the view using a mouse there are still the mouse wheel and scrollbars. Nobody (I assume) wants to flick using a mouse anyway.
> That means that there would be no way of drag'n dropping using a touch device.

Drag&drop on a touch device should, in my opinion, be implemented with a tap-and-hold gesture which would open a context menu and, where applicable, offer drag&drop.
I do not think that having this feature now is a high priority for us, though, since the tap-and-hold logic for context menus needs to be implemented first, which depends on client-side QML and is thus tied to the Desktop/Touch Components projects.
Comment 9 Andras Becsi 2012-11-19 04:02:35 PST
(In reply to comment #6)
> (In reply to comment #5)
> > Hmm, taking a look at the code, the flickable private object redirects all mouse events to QtWebPageEventHandler::handleInputEvent. This means that the event will be handled by the gesture recognizers on the UI side and nothing will be sent to the WebProcess, or am I wrong? As you said, if we remove that handler then all the events will be sent to the WebProcess directly.
> 
> Yep exactly, that's why selecting text and drag-dropping isn't working. We treat mouse clicks as touch taps if the button was released without moving and send them to the web process.

Incoming mouse events are sent directly through the gesture recognizers.
With this we worked around the issue that, on devices that send both touch and mouse events simultaneously, we would end up selecting text during panning.

> 
> > Ok, that looks like a good trade-off. I am not sure that nobody wants to flick using a mouse, though. For example, I was thinking about interfaces like the Wii, where you have a pointer device that effectively acts as a mouse, but IIRC the Opera browser it ships does some kind of panning (flick) instead of scrolling. The hybrid devices mentioned somewhere else could be another example.
> 
> I think that this should then be patched in the distribution or enabled through a platform plugin. In other words, the platform should be dealing with this and not the application code IMO. If platform-specific applications want to enable this, we could expose a setting through a private API.

I also think this is an issue of the driver/underlying platform, but as far as I know touchpads on Mac OS also send both events.
Comment 10 Simon Hausmann 2012-11-26 03:26:24 PST
ping, any update on this?
Comment 11 Jocelyn Turcotte 2012-11-26 04:01:48 PST
(In reply to comment #10)
> ping, any update on this?

I tried on Linux, and just removing QQuickWebViewFlickablePrivate::handleMouseEvent produces the same behavior as IE on Windows 8: mouse selects, and touch pans. If we can add a QML scroll bar to the WebKit2 examples I think this would be fine for 5.0.

I brought my Mac to work today; I want to try with the touch pad since you said it might produce both types of events.
My debug build failed though: it's trying to link against libQt5WebKitWidgets_debug but I only have libQtWebKitWidgets.5.dylib (no _debug suffix, and with "Qt" instead of "Qt5").
I'll try to finish my patch for setHtml and look at it afterward.
Comment 12 Jocelyn Turcotte 2012-11-28 08:46:30 PST
Created attachment 176491 [details]
Patch

I tested with the Mac trackpad and it's not sending touch events for a single touch. When using multiple fingers it doesn't send mouse move events. The only problem is that it sends wheel events at the same time as touch events when doing a two-finger scroll, but this is also the reason why they were disabled by default a month ago: https://qt.gitorious.org/qt/qtbase/commit/adb156e4dd64609ba6c0b172e9c428a79e306a7c .

Ideally, I guess we should just handle gestures directly for those devices rather than doing the gesture recognition ourselves. That would also help support things like the Magic Mouse, which I've read doesn't send individual touch events.

So anyway here is the patch. I'll do some more work to try showing scroll bars in MiniBrowser and maybe handle ctrl-wheel directly in WebView to make it easier to handle it with the mouse.
Comment 13 Andras Becsi 2012-11-28 09:03:46 PST
Comment on attachment 176491 [details]
Patch

Looks good.

I agree with you on the gesture recognition part, in that we would need some mechanism in Qt to have simple gesture events and be able to subscribe to them, instead of having decentralized gesture recognition. This sounds like an interesting research project.
Comment 14 Jocelyn Turcotte 2012-11-29 03:15:45 PST
Committed r136119: <http://trac.webkit.org/changeset/136119>