SWT/Multitouch support

Revision as of 06:25, 20 September 2010 by Gercan.acm.org (Talk | contribs)

There is growing interest in supporting touch events and gestures in SWT, and we want to consider adding them in the 3.7 release. The Bugzilla entry requesting this feature is bug 279884.

This page will be used as a collecting point for platform API references, other work to draw from, and, eventually, API.

References

Platform support for touch and gestures

MSDN guide to Windows Touch API

O'Reilly article on gesture handling for Windows 7

Android UI event handling

Android gestures

Mac OS X trackpad event handling

Blog posts on Gnome/GTK+ multi-touch (haven't seen links to API yet)

Flash 10.1/AIR 2.0 support for touch events and gestures

Qt 4.6 provides touch and gesture events with a cross-platform implementation

Currently only a few tablets and laptops have multitouch support built in. I found a cheap ($20) USB touchpad on Amazon that should allow development on a MacBook Pro running Windows via Boot Camp.

Java touch event projects

Project Kenai Mac multi-touch support - Apache licensed

Eclipse plugin based on Kenai work - EPL licensed

These are interesting because they work on Mac OS X 10.5 and 10.6, but the implementation relies on a private framework. The only official AppKit support for touch events is in 10.6.

Requirements

  • A collection of new Event subclasses and SWTEventListener subclasses will be defined
  • TouchEvent, GestureEvent, ....
  • TouchEventListener, GestureEventListener, ....
  • A TouchEvent has a collection of TouchState objects that correspond to the set of touch points that made up the TouchEvent. A TouchState represents one component of the TouchEvent. TouchStates have:
    • a phase that represents what the particular touch point is doing -- e.g., Touch_Start, Touch_End, Touch_Move, Touch_Resting
    • an identifier that is unique for a given sequence of touches.
    • a device ID that corresponds to the touchpad or trackpad that generated the event
    • device height and width in pixels
    • the location of the touch, represented as fractions of the distance from the device's edges (i.e., each coordinate normalized to the range 0.0–1.0)
    • a flag indicating a 'resting' or ignored touch.
  • A TouchEvent is sent when there is some change in state of the fingers on the touch pad. The type of the event reflects what changed for that event.
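The TouchState described above could be sketched as a plain Java class. All names and fields here are hypothetical, taken from this proposal's discussion rather than from any shipped SWT API:

```java
// Hypothetical sketch of the proposed TouchState; not a shipped SWT class.
public class TouchState {
    // Phase constants describing what this touch point is doing.
    public static final int TOUCH_START = 1;
    public static final int TOUCH_MOVE = 2;
    public static final int TOUCH_END = 3;
    public static final int TOUCH_RESTING = 4;

    public int phase;          // one of the phase constants above
    public long id;            // unique for a given sequence of touches
    public int deviceId;       // the touchpad/trackpad that generated the event
    public int deviceWidth;    // device width in pixels
    public int deviceHeight;   // device height in pixels
    public double x;           // fraction of the distance from the device's left edge (0.0-1.0)
    public double y;           // fraction of the distance from the device's bottom edge (0.0-1.0)
    public boolean resting;    // true if this is a 'resting' (ignored) touch

    // Convert the normalized location to device pixels.
    public int pixelX() { return (int) Math.round(x * deviceWidth); }
    public int pixelY() { return (int) Math.round(y * deviceHeight); }
}
```

A TouchEvent would then carry a collection (e.g., an array) of these objects, one per active touch point.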


  • Control adds new methods to add and remove TouchListeners.
  • GestureEvents are higher-level constructs that represent an interpretation of touch events.
  • A GestureEvent will have additional gesture-specific information associated with it. For example, a RotateGesture will have a number of degrees that the user rotated their fingers, a ZoomGesture will have a scale factor associated with the pinch/stretch, a SwipeGesture will have a direction, and a TapGesture will have the number of taps. The names are strictly for discussion. The SWT pattern would lean towards subclasses of GestureEvent with named fields as opposed to a more general dictionary of properties. That seems to be the case for all of the platform APIs.
  • A Control can get TouchEvents or GestureEvents, but not both at the same time
  • GestureEvent has 3 notifications
    • GESTURE_BEGIN -- gesture is about to happen
    • GESTURE_PERFORM -- gesture is happening
    • GESTURE_END -- gesture finished
  • Mac and Windows have specific gesture callbacks for pinch/zoom, rotate, and panning/swiping. Windows 7 has a distinct gesture for two-finger tap or tap and hold; Mac does not.
  • Touches always go to the deepest Control in the visual hierarchy.
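The gesture hierarchy and lifecycle described above might look like the following sketch. As noted, the gesture names are strictly for discussion; the constants and fields are illustrative assumptions, not a shipped API:

```java
// Hypothetical sketch of the proposed GestureEvent hierarchy with named
// fields per subclass, following the SWT pattern discussed above.
public class GestureEvent {
    // Lifecycle notifications.
    public static final int GESTURE_BEGIN = 1;   // gesture is about to happen
    public static final int GESTURE_PERFORM = 2; // gesture is happening
    public static final int GESTURE_END = 3;     // gesture finished

    public int detail; // one of the three lifecycle constants above
}

class RotateGesture extends GestureEvent {
    public double degrees; // how far the user rotated their fingers
}

class ZoomGesture extends GestureEvent {
    public double scale; // scale factor of the pinch/stretch
}

class SwipeGesture extends GestureEvent {
    public int xDirection, yDirection; // -1, 0, or +1 per axis
}

class TapGesture extends GestureEvent {
    public int count; // number of taps
}
```

A listener would typically switch on the `detail` field, doing setup work on GESTURE_BEGIN, incremental updates on GESTURE_PERFORM, and cleanup on GESTURE_END.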