There is growing interest in supporting touch events and gestures in SWT, and we want to consider adding this support in the 3.7 release. The Bugzilla bug requesting this feature is bug 279884.
This page will be used as a collecting point for platform API references, other work to draw from, and, eventually, API.
Platform support for touch and gestures
- Mac OS X trackpad event handling
- Flash 10.1/AIR 2.0 support for touch events and gestures
Java touch event projects
- Project Kenai Mac multi-touch support (Apache licensed)
- Eclipse plugin based on the Kenai work (EPL licensed)
These projects are interesting because they work on both Mac OS X 10.5 and 10.6, but their implementation relies on a private framework; the only official AppKit support for touch events exists in 10.6.
- A collection of new Event subclasses and SWTEventListener subclasses will be defined
- TouchEvent, GestureEvent, ...
- TouchEventListener, GestureEventListener, ...
- A TouchEvent returns a collection of Touch objects that correspond to the set of touch points that made up the TouchEvent. A Touch represents one component of the TouchEvent. It has a phase that represents what the particular touch point is doing -- e.g., Touch_Start, Touch_End, Touch_Move, Touch_Resting, and an identifier that is unique for a given sequence of touches.
- Generally speaking, a TouchEvent is fired when there is some change in state of the fingers on the touch pad.
- GestureEvents are higher-level constructs that represent an interpretation of TouchEvents. The SWT will make the necessary system calls to have the current TouchEvent interpreted for you, and will send GestureEvents as appropriate.
- A GestureEvent will have additional gesture-specific information associated with it. For example, a RotateGesture will have a number of degrees that the user rotated their fingers, a ZoomGesture will have a scale factor associated with the pinch/stretch, a SwipeGesture will have a direction, and a TapGesture will have the number of taps. The names are strictly for discussion. The SWT pattern would lean towards subclasses of GestureEvent with named fields as opposed to a more general dictionary of properties. That seems to be the case for all of the platform APIs.
- Mac and Windows have specific gesture callbacks for pinch/zoom and rotate. Mac has a Swipe gesture; Windows 7 does not. Windows 7 has a distinct gesture for two-finger tap or tap and hold; Mac does not.
- A GestureEvent will return the collection of TouchEvents that produced it.
- Control needs new methods to add and remove each kind of base listener (likely TouchEventListener and GestureEventListener).
- Touches always go to the deepest Control in the visual hierarchy.
- Clients only need to create a listener and add it to the Control; no additional coding is necessary.
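To make the discussion above concrete, here is a minimal sketch of what the touch side of the API could look like. All names here (TouchPhase, Touch, TouchEvent, TouchListener, the touchChanged method) are hypothetical placeholders for discussion, not proposed final API; the sketch simply simulates dispatching one event to a listener.

```java
import java.util.Arrays;
import java.util.List;

// Phase of an individual touch point (placeholder names, as in the text above).
enum TouchPhase { TOUCH_START, TOUCH_MOVE, TOUCH_END, TOUCH_RESTING }

// One component of a TouchEvent: a single touch point with a stable identifier.
class Touch {
    final long id;          // unique for a given sequence of touches
    final TouchPhase phase; // what this touch point is doing
    final int x, y;
    Touch(long id, TouchPhase phase, int x, int y) {
        this.id = id; this.phase = phase; this.x = x; this.y = y;
    }
}

// Fired whenever the state of the fingers on the touch pad changes;
// carries the full set of touch points that make up the event.
class TouchEvent {
    final List<Touch> touches;
    TouchEvent(List<Touch> touches) { this.touches = touches; }
}

// The listener a client would create and add to a Control.
interface TouchListener {
    void touchChanged(TouchEvent e);
}

public class TouchSketch {
    public static void main(String[] args) {
        // A listener that reports how many touch points are still down.
        TouchListener listener = e -> {
            long down = e.touches.stream()
                    .filter(t -> t.phase != TouchPhase.TOUCH_END)
                    .count();
            System.out.println("active touches: " + down);
        };

        // Simulate a two-finger event in which the second finger lifts.
        listener.touchChanged(new TouchEvent(Arrays.asList(
                new Touch(1, TouchPhase.TOUCH_MOVE, 10, 20),
                new Touch(2, TouchPhase.TOUCH_END, 30, 40))));
        // prints "active touches: 1"
    }
}
```

In the real API, Control would presumably hold the listener list and forward platform touch callbacks as TouchEvents, so a client's only job is the listener itself.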