Capturing gestures with accessibility features on (such as explore-by-touch)


http://stackoverflow.com – We have an application that captures gestures (currently via the onTouch event callback, which works great). Sadly, when accessibility features such as explore-by-touch are turned on, only some of the fingers are recognized by our application. We have good reason to believe this is not a bug in our code. The visually-impaired and blind populations are very important to us, and gestures are even more important for them. How can gestures be captured while accessibility features are enabled?
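Some background on why this happens: when explore-by-touch (TalkBack) is active, the system intercepts touch input and delivers it to views as hover events (`ACTION_HOVER_ENTER`/`ACTION_HOVER_MOVE`/`ACTION_HOVER_EXIT`) rather than touch events, so an `onTouch` callback only sees whatever the accessibility service passes through. A commonly suggested workaround is to override a view's hover handling (e.g. `dispatchHoverEvent`) and translate hover actions back into the touch actions the existing gesture code expects. The sketch below shows only the action-translation step in plain Java; the constants duplicate the values documented for `android.view.MotionEvent` so it compiles without the Android SDK, and the actual view subclass wiring is left as a comment.

```java
public class HoverToTouch {
    // Values mirror android.view.MotionEvent constants (per the platform docs),
    // duplicated here so this sketch compiles without the Android SDK.
    public static final int ACTION_DOWN = 0;
    public static final int ACTION_UP = 1;
    public static final int ACTION_MOVE = 2;
    public static final int ACTION_HOVER_MOVE = 7;
    public static final int ACTION_HOVER_ENTER = 9;
    public static final int ACTION_HOVER_EXIT = 10;

    /**
     * Maps a hover action to the touch action existing gesture code expects.
     * Returns -1 for actions that are not hover actions.
     */
    public static int translate(int hoverAction) {
        switch (hoverAction) {
            case ACTION_HOVER_ENTER: return ACTION_DOWN;
            case ACTION_HOVER_MOVE:  return ACTION_MOVE;
            case ACTION_HOVER_EXIT:  return ACTION_UP;
            default:                 return -1;
        }
    }

    // In a real Android view you would override dispatchHoverEvent(MotionEvent ev),
    // call ev.setAction(translate(ev.getAction())), and forward the event to the
    // existing onTouch logic. That wiring needs the SDK and is omitted here.
}
```

Note that this only recovers single-pointer gestures: under explore-by-touch the system reserves some multi-finger input for its own navigation gestures, which is consistent with only some fingers reaching the application.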