Interaction with a touch screen is far less precise than with a mouse. When the user taps a link with a finger, the single reported point of interaction often falls outside the link.
QTouchPoint gives a rect representing the area the finger covers on the screen. This rect should be used to find something the user can interact with.
To implement this, QtWebKit needs to sample points in the rect and find an element that accepts a click.
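A minimal sketch of the sampling idea described above. The `Point`/`Rect` structs are plain stand-ins for QPoint/QRect, and `isClickableAt` is a hypothetical hit-test callback standing in for WebKit's real hit testing; the grid size is an arbitrary choice, not anything specified in this bug.

```cpp
#include <functional>
#include <optional>

// Plain stand-ins for QPoint/QRect; real code would use the Qt types.
struct Point { int x, y; };
struct Rect { int x, y, width, height; };

// Sample a small grid of points inside the touch rect and return the first
// point for which the hit test reports a clickable element.
std::optional<Point> findClickablePoint(
    const Rect& touchRect,
    const std::function<bool(const Point&)>& isClickableAt,
    int samplesPerAxis = 3)
{
    // Try the center first: it is the most likely intended target.
    Point center{touchRect.x + touchRect.width / 2,
                 touchRect.y + touchRect.height / 2};
    if (isClickableAt(center))
        return center;

    // Fall back to a uniform grid over the whole rect.
    for (int i = 0; i < samplesPerAxis; ++i) {
        for (int j = 0; j < samplesPerAxis; ++j) {
            Point p{touchRect.x + touchRect.width * i / (samplesPerAxis - 1),
                    touchRect.y + touchRect.height * j / (samplesPerAxis - 1)};
            if (isClickableAt(p))
                return p;
        }
    }
    return std::nullopt; // nothing interactive under the finger
}
```

A grid is the simplest strategy; as suggested below, something smarter (e.g. reusing the spatial-navigation scoring) could rank candidates instead of taking the first hit.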
The rect is not always available; here are the cases to handle:
- QTouchPoint has a rect.
- A QTouchEvent is sent, but without a rect -> guess a rect.
- No QTouchEvent is sent; a QMouseEvent is sent instead on a platform without a mouse (Maemo, some Symbian, Windows Mobile) -> guess a rect if the platform is mobile.
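The two "guess a rect" cases above can be sketched as padding the bare event position with an assumed fingertip radius. The struct names and the radius values here are illustrative assumptions, not anything measured or specified in this bug.

```cpp
// Plain stand-ins for QPoint/QRect; real code would use the Qt types.
struct Point { int x, y; };
struct Rect { int x, y, width, height; };

// When the platform delivers only a point (a QTouchEvent without a rect, or
// a QMouseEvent synthesized on a mouse-less device), guess a touch rect by
// padding the point with an assumed finger radius.
Rect guessTouchRect(const Point& p, bool isMobilePlatform)
{
    // A fingertip covers very roughly 8-10 mm; at ~160 dpi that is on the
    // order of 50-60 px. A precise pointer gets a near-degenerate rect.
    const int radius = isMobilePlatform ? 28 : 1;
    return Rect{p.x - radius, p.y - radius, 2 * radius, 2 * radius};
}
```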
Antonio, your experience with spatial navigation might be of use here. Maybe something smarter than sampling points.
*** Bug 33169 has been marked as a duplicate of this bug. ***
*** Bug 30712 has been marked as a duplicate of this bug. ***
(In reply to comment #1)
> Antonio, your experience with spatial navigation might be of use here.
> Maybe something smarter than sampling points.
Right. I can certainly work on this after bug 29431, which is targeted for Monday on my schedule.
(In reply to comment #4)
> Right. I can certainly work on this after bug 29431, which is targeted for
> Monday on my schedule.
If you think this will take more than a couple of days (including testing with resistive and capacitive screens), please prioritize bug fixing instead.
If you think it can be easily done for WebKit 2.0, that would be great :)
Please note that this is also a problem with HTML elements other than links: it takes several touch events to activate elements such as multiselect lists and buttons.
Dup'ing it. This one will likely be fixed when bug 44089 is fixed.
*** This bug has been marked as a duplicate of bug 44089 ***