To support this feature, the input dispatcher now allows input
events to be acknowledged out-of-order. As a result, the
consumer can choose to defer handling an input event from one
device (because it is building a big batch) while continuing
to handle input events from other devices.
The InputEventReceiver now sends a notification when a batch
is pending. The ViewRoot handles this notification by scheduling
a draw on the next sync. When the draw happens, the InputEventReceiver
is instructed to consume all pending batched input events, the
input event queue is processed as fully as possible,
and then the ViewRoot performs traversals as usual.
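Roughly, the interaction looks like the sketch below. InputEventReceiver is
an internal framework class, so the names and signatures here are assumptions
based on the description above rather than public API.

    // Sketch only: InputEventReceiver is internal to the framework and the
    // method signatures below are assumptions based on this description.
    import android.os.Looper;
    import android.view.InputChannel;
    import android.view.InputEventReceiver;

    final class BatchDrainingReceiver extends InputEventReceiver {
        private final Runnable mScheduleDraw;   // e.g. the ViewRoot's "schedule a draw on the next sync"

        BatchDrainingReceiver(InputChannel channel, Looper looper, Runnable scheduleDraw) {
            super(channel, looper);
            mScheduleDraw = scheduleDraw;
        }

        @Override
        public void onBatchedInputEventPending() {
            // A batch is building up; don't consume it yet.  Ask the ViewRoot
            // to schedule a draw instead.
            mScheduleDraw.run();
        }

        // Called by the ViewRoot when the scheduled draw happens.
        public void drainBatchedEvents() {
            consumeBatchedInputEvents();   // deliver everything that was batched
            // ...the ViewRoot then processes the queue and performs traversals.
        }
    }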
With these changes in place, the input dispatch latency is
consistently less than one frame as long as the application itself
isn't stalled. Input events are delivered to the application
as soon as possible and are handled as soon as possible. In practice,
it is no longer possible for an application to build up a huge
backlog of touch events.
This is part of a series of changes to improve input system pipelining.
Bug: 5963420
Change-Id: I42c01117eca78f12d66d49a736c1c122346ccd1d
This change enables the input dispatcher to send multiple touch
events to an application without waiting for them to be acknowledged.
Essentially this is a variation on the old streaming optimization,
but it is much more comprehensive. Event dispatch will stall as
soon as 0.5 seconds of unacknowledged events have accumulated or a
focused event (such as a key event) needs to be delivered.
Streaming input events makes a tremendous difference in application
performance. The next step will be to enable batching of input
events on the client side once again.
This is part of a series of changes to improve input system pipelining.
Bug: 5963420
Change-Id: I025df90c06165d719fcca7f63eed322a5cce4a78
Since we will no longer be modifying events in place, we don't need
to use an ashmem region for input. Simplified the code to instead
use a socket of type SOCK_SEQPACKET.
This is part of a series of changes to improve input system pipelining.
Bug: 5963420
Change-Id: I05909075ed8b61b93900913e44c6db84857340d8
Fixed some issues where inconsistent streams of events could
be generated by the dispatcher, particularly when switching from
hovering with one device to hovering with another.
Fixed a bug where the touch pad would fail to generate a new
HOVER_MOVE following a tap event. As a result, the hover event
stream would not resume until the user touched the touch pad
again.
Change-Id: I444dce84641fb12e56a0af84c931520771d6c493
Added the concept of pointer properties in a MotionEvent.
This is currently used to track the pointer tool type to enable
applications to distinguish finger touches from a stylus.
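For example, a view can branch on the tool type roughly like this
(a sketch; handleStylus/handleFinger are hypothetical helpers):

    // Sketch: distinguishing a stylus from a finger via the new pointer
    // properties (MotionEvent.getToolType / TOOL_TYPE_STYLUS).
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int toolType = event.getToolType(event.getActionIndex());
        if (toolType == MotionEvent.TOOL_TYPE_STYLUS) {
            return handleStylus(event);   // hypothetical helper
        }
        return handleFinger(event);       // hypothetical helper
    }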
Button states are also reported to the application as part of touch events.
There are no new actions for detecting changes in button states.
The application should instead query the button state from the
MotionEvent and take appropriate action as needed.
A good time to check the button state is on ACTION_DOWN.
As a side-effect, applications that do not support multiple buttons
will treat primary, secondary and tertiary buttons identically
for all touch events.
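A sketch of that check inside a View subclass (showContextMenu() is just one
possible response):

    // Sketch: query the button state on ACTION_DOWN and react to the
    // secondary button; MotionEvent.getButtonState() and the BUTTON_*
    // constants are the new APIs.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            int buttons = event.getButtonState();
            if ((buttons & MotionEvent.BUTTON_SECONDARY) != 0) {
                showContextMenu();   // e.g. treat it like a right click
                return true;
            }
        }
        return super.onTouchEvent(event);
    }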
The back button on the mouse is mapped to KEYCODE_BACK
and the forward button is mapped to KEYCODE_FORWARD.
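An application that keeps its own navigation history can respond to the
forward button roughly as follows (canGoForward/goForward are hypothetical
helpers):

    // Sketch: handling the new KEYCODE_FORWARD mapping for the mouse's
    // forward button; canGoForward()/goForward() are hypothetical helpers.
    @Override
    public boolean onKeyUp(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_FORWARD && canGoForward()) {
            goForward();
            return true;
        }
        return super.onKeyUp(keyCode, event);
    }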
Added basic plumbing for the secondary mouse button to invoke
the context menu, particularly in lists.
Added clamp and split methods on MotionEvent to take care of
common filtering operations so we don't have them scattered
in multiple places across the framework.
Bug: 4260011
Change-Id: Ie992b4d4e00c8f2e76b961da0a902145b27f6d83
Some drivers report individual finger updates one at a time
instead of all at once. When 10 fingers are down, this can
cause the framework to have to handle 10 times as many events
each with 10 times as much data. Applications like
PointerLocation would get significantly bogged down by all
of the redundant samples.
This change coalesces samples that are closely spaced in time,
before they are dispatched, as part of the motion event batching
protocol.
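As with motion event batching generally, samples that end up combined into a
single MotionEvent reach the application as historical samples, so a consumer
that wants every remaining sample walks the history before reading the current
coordinates. A sketch (addSample() is a hypothetical helper):

    // Sketch: batched samples show up as historical samples inside a
    // single MotionEvent; addSample() is a hypothetical helper.
    void processSamples(MotionEvent event) {
        int pointerCount = event.getPointerCount();
        int historySize = event.getHistorySize();
        for (int h = 0; h < historySize; h++) {
            for (int p = 0; p < pointerCount; p++) {
                addSample(event.getHistoricalEventTime(h),
                        event.getHistoricalX(p, h),
                        event.getHistoricalY(p, h));
            }
        }
        for (int p = 0; p < pointerCount; p++) {
            addSample(event.getEventTime(), event.getX(p), event.getY(p));
        }
    }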
Increased the size of the InputChannel shared memory buffer so
that applications can catch up faster if they accumulate a
backlog of samples.
Added logging code to help measure input dispatch and drawing
latency issues in the view hierarchy. See ViewDebug.DEBUG_LATENCY.
Change-Id: Ia5898f781f19901d2225c529a910c32bdf4f504f
1. Single finger tap performs a click.
2. Single finger movement moves the pointer (hovers).
3. Button press plus movement performs click or drag.
While dragging, the pointer follows the finger that is moving
fastest. This is important if there are additional fingers
down on the touch pad for the purpose of applying force
to an integrated button underneath.
4. Two fingers near each other moving in the same direction
are coalesced as a swipe gesture under the pointer.
5. Two or more fingers moving in arbitrary directions are
transformed into touches in the vicinity of the pointer.
This makes scale/zoom and rotate gestures possible.
Added a native VelocityTracker implementation to enable intelligent
switching of the active pointer during drags.
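For reference, the existing Java-facing VelocityTracker API, which the new
native implementation mirrors, is used roughly like this; this is a sketch,
not code from this change:

    // Sketch: typical usage of the Java VelocityTracker API; not code from
    // this change.
    private VelocityTracker mTracker;

    void trackVelocity(MotionEvent event) {
        if (mTracker == null) {
            mTracker = VelocityTracker.obtain();
        }
        mTracker.addMovement(event);
        if (event.getActionMasked() == MotionEvent.ACTION_UP) {
            mTracker.computeCurrentVelocity(1000);   // pixels per second
            float vx = mTracker.getXVelocity(event.getPointerId(0));
            float vy = mTracker.getYVelocity(event.getPointerId(0));
            mTracker.recycle();
            mTracker = null;
        }
    }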
Change-Id: I5ada57e7f2bdb9b0a791843eb354a8c706b365dc
Added support for tracking the mouse position even when the mouse button
is not pressed. To avoid confusing existing applications, mouse movements
are reported using the new ACTION_HOVER_MOVE action when the mouse button
is not pressed.
Added some more plumbing for the scroll wheel axes. The values are
reported to Views but they are not yet handled by the framework.
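On the application side, both kinds of events arrive as generic motion events.
The sketch below uses the eventual public constants (ACTION_HOVER_MOVE,
ACTION_SCROLL, AXIS_VSCROLL) and a hypothetical updateHoverFeedback() helper;
it is illustrative, not code from this change:

    // Sketch: reacting to hover movements and vertical scroll wheel values
    // in a View; updateHoverFeedback() is a hypothetical helper.
    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_HOVER_MOVE:
                updateHoverFeedback(event.getX(), event.getY());
                return true;
            case MotionEvent.ACTION_SCROLL:
                scrollBy(0, (int) -event.getAxisValue(MotionEvent.AXIS_VSCROLL));
                return true;
        }
        return super.onGenericMotionEvent(event);
    }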
Change-Id: I1706be850d25cf34e5adf880bbed5cc3265cf4b1
This change enables the framework to synthesize key events to implement
default behavior when an application does not handle a key.
For example, this change enables numeric keypad keys to perform
their associated special function when numlock is off.
The application is informed that it is processing a fallback keypress
so it can choose to ignore it.
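A sketch of that check (handleKey() is a hypothetical helper):

    // Sketch: skip keys that the framework synthesized as fallback default
    // behavior, as indicated by KeyEvent.FLAG_FALLBACK.
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        if ((event.getFlags() & KeyEvent.FLAG_FALLBACK) != 0) {
            // Synthesized from an unhandled key; let the default behavior run.
            return false;
        }
        return handleKey(keyCode, event);   // hypothetical helper
    }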
Added a new keycode for switching applications.
Added ALT key deadkeys.
New default key mappings:
- ESC -> BACK
- Meta+ESC -> HOME
- Alt+ESC -> MENU
- Meta+Space -> SEARCH
- Meta+Tab -> APP_SWITCH
Fixed some comments.
Fixed some tests.
Change-Id: Id7f3b6645f3a350275e624547822f72652f3defe
Refactored ViewRoot, NativeActivity and related classes to tell the
dispatcher whether an input event was actually handled by the application.
This will be used to move more of the global default key processing
into the system server instead of the application.
Change-Id: If06b98b6f45c543e5ac5b1eae2b3baf9371fba28
Added the MotionEvent.FLAG_WINDOW_IS_OBSCURED flag, which is set by the
input manager whenever another visible window is partly or wholly obscuring
the target of a touch event, so that applications can filter touches
accordingly.
Added a "filterTouchesWhenObscured" attribute to View which can be used to
enable filtering of touches when the view's window is obscured.
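A sketch of both forms; the XML equivalent is
android:filterTouchesWhenObscured="true", and SecureButton is just an
illustrative name:

    // Sketch: a view that enables the new filtering attribute in code and,
    // alternatively, checks the flag directly on incoming touches.
    import android.content.Context;
    import android.view.MotionEvent;
    import android.view.View;

    public class SecureButton extends View {
        public SecureButton(Context context) {
            super(context);
            setFilterTouchesWhenObscured(true);   // drop touches when obscured
        }

        @Override
        public boolean onTouchEvent(MotionEvent event) {
            if ((event.getFlags() & MotionEvent.FLAG_WINDOW_IS_OBSCURED) != 0) {
                return false;   // another visible window overlaps this one
            }
            return super.onTouchEvent(event);
        }
    }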
Change-Id: I936d9c85013fd2d77fb296a600528d30a29027d2
Added several new coordinate values to MotionEvents to capture
touch major/minor area, tool major/minor area and orientation.
Renamed NDK input constants per convention.
Added InputDevice class in Java which will eventually provide
useful information about available input devices.
Added APIs for manufacturing new MotionEvent objects with multiple
pointers and all necessary coordinate data.
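The new per-pointer values and device info can be read roughly like this
(a sketch using the eventual public MotionEvent and InputDevice methods;
logPointerGeometry() itself is just an illustrative helper):

    // Sketch: reading the new touch/tool geometry axes and the reporting
    // device.
    void logPointerGeometry(MotionEvent event) {
        InputDevice device = InputDevice.getDevice(event.getDeviceId());
        Log.d("Input", "device=" + (device != null ? device.getName() : "unknown"));
        for (int i = 0; i < event.getPointerCount(); i++) {
            Log.d("Input", "pointer " + event.getPointerId(i)
                    + " touchMajor=" + event.getTouchMajor(i)
                    + " touchMinor=" + event.getTouchMinor(i)
                    + " toolMajor=" + event.getToolMajor(i)
                    + " toolMinor=" + event.getToolMinor(i)
                    + " orientation=" + event.getOrientation(i));
        }
    }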
Fixed a bug in the input dispatcher where it could get stuck with
a pointer down forever.
Fixed a bug in the WindowManager where the input window list could
end up containing stale removed windows.
Fixed a bug in the WindowManager where the input channel was being
removed only after the final animation transition had taken place
which caused spurious WINDOW DIED log messages to be printed.
Change-Id: Ie55084da319b20aad29b28a0499b8dd98bb5da68
And also:
- APIs to show and hide the IME, and control its interaction with the app.
- APIs to tell the app when its window resizes and needs to be redrawn.
- API to tell the app the content rectangle of its window (to layout
around the IME or status bar).
There is still a problem with IME interaction -- we need a way for the
app to deliver events to the IME before it handles them, so that for
example the back key will close the IME instead of finishing the app.
Change-Id: I37b75fc2ec533750ef36ca3aedd2f0cc0b5813cd
Target identification is now fully native.
Fixed a couple of minor issues related to input injection.
Native input enabled by default, can be disabled by setting
WindowManagerPolicy.ENABLE_NATIVE_INPUT_DISPATCH to false.
Change-Id: I7edf66ed3e987cc9306ad4743ac57a116af452ff
Provides the basic infrastructure for a
NativeActivity's native code to get an object representing
its event stream that can be used to read input events.
Still work to do, probably some API changes, and reasonable
default key handling (so that for example back will still
work).
Change-Id: I6db891bc35dc9683181d7708eaed552b955a077e
Added more tests.
Fixed a regression in Vector.
Fixed bugs in pointer tracking.
Fixed a starvation issue in PollLoop when setting or removing callbacks.
Fixed a couple of policy nits.
Modified the internal representation of MotionEvent to be more
efficient and more consistent.
Added code to skip/cancel virtual key processing when there are multiple
pointers down. This helps to better disambiguate virtual key presses
from stray touches (such as cheek presses).
Change-Id: I2a7d2cce0195afb9125b23378baa94fd2fc6671c
The old dispatch mechanism has been left in place and continues to
be used by default for now. To enable native input dispatch,
edit the ENABLE_NATIVE_DISPATCH constant in WindowManagerPolicy.
Includes part of the new input event NDK API. Some details TBD.
To wire up input dispatch, as the ViewRoot adds a window to the
window session, it receives an InputChannel object as an output
argument. The InputChannel encapsulates the file descriptors for a
shared memory region and two pipe end-points. The ViewRoot then
provides the InputChannel to the InputQueue. Behind the
scenes, InputQueue simply attaches handlers to the native PollLoop object
that underlies the MessageQueue. This way MessageQueue doesn't need
to know anything about input dispatch per se; it just exposes (in native
code) a PollLoop that other components can use to monitor file descriptor
state changes.
There can be zero or more targets for any given input event. Each
input target is specified by its input channel and some parameters
including flags, an X/Y coordinate offset, and the dispatch timeout.
An input target can request either synchronous dispatch (for foreground apps)
or asynchronous dispatch (fire-and-forget for wallpapers and "outside"
targets). Currently, finding the appropriate input targets for an event
requires a call back into the WindowManagerService from native code.
In the future this will be refactored to avoid most of these callbacks
except as required to handle pending focus transitions.
End-to-end event dispatch mostly works!
To do: event injection, rate limiting, ANRs, testing, optimization, etc.
Change-Id: I8c36b2b9e0a2d27392040ecda0f51b636456de25