Bug: 4364920
Velocity damping proved to be a bad idea because it would
cause a significant ramp in velocity at the beginning of
a gesture, instead of the desired smooth behavior. Oh well.
Change-Id: Ie631946f47ef2492bd71fbed1ab44bbb39a875a8
Added a new PointerIcon API (hidden for now) for loading
pointer icons.
Fixed a starvation problem in the native Looper's sendMessage
implementation which caused new messages to be posted ahead
of old messages sent with sendMessageDelayed.
Redesigned the touch pad gestures to be defined in terms of
more fluid finger/spot movements. The objective is to reinforce
the natural mapping between fingers and spots, which means there
must not be any discontinuities in spot motion relative to
the fingers.
Removed the SpotController stub and folded its responsibilities
into PointerController.
Change-Id: Ib647dbd7a57a7f30dd9c6e2c260df51d7bbdd18e
Replaced VelocityTracker with a faster and more accurate
native implementation. This avoids the duplicate maintenance
overhead of having two implementations.
The new algorithm requires that the sample duration be at least
10ms in order to contribute to the velocity calculation. This
ensures that the velocity is not severely overestimated when
samples arrive in bursts.
The new algorithm computes the exponentially weighted moving
average using weights based on the relative duration of successive
sample periods.
The new algorithm is also more careful about how it handles
individual pointers going down or up and their effects on the
collected movement traces. The intent is to preserve the last
known velocity of pointers as they go up while also ensuring
that other motion samples do not count twice in that case.
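As a rough sketch of the duration-weighted averaging described above
(not the actual native implementation; the 10 ms minimum comes from
the description, but the exact blending weight here is a stand-in):
    // Hypothetical sketch of a duration-weighted exponentially weighted
    // moving average. Samples shorter than MIN_DURATION_MS are accumulated
    // rather than used alone, so bursts of closely spaced samples do not
    // inflate the velocity estimate.
    final class DurationWeightedVelocity {
        private static final long MIN_DURATION_MS = 10;

        private float velocity;        // current estimate, pixels per millisecond
        private float pendingDelta;    // displacement accumulated over short samples
        private long pendingDuration;  // duration accumulated over short samples
        private float lastPosition;
        private long lastTimeMs;
        private boolean hasSample;

        void addMovement(long timeMs, float position) {
            if (!hasSample) {
                hasSample = true;
            } else {
                pendingDelta += position - lastPosition;
                pendingDuration += timeMs - lastTimeMs;
                if (pendingDuration >= MIN_DURATION_MS) {
                    float sampleVelocity = pendingDelta / pendingDuration;
                    // Weight the new sample by the duration it covers relative
                    // to the previous estimate (assumed blending rule).
                    float alpha = (float) pendingDuration
                            / (pendingDuration + MIN_DURATION_MS);
                    velocity = velocity * (1 - alpha) + sampleVelocity * alpha;
                    pendingDelta = 0;
                    pendingDuration = 0;
                }
            }
            lastPosition = position;
            lastTimeMs = timeMs;
        }

        float getVelocity() {
            return velocity;
        }
    }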
Bug: 4086785
Change-Id: I95054102397c4b6a9076dc6a0fc841b4beec7920
1. Single finger tap performs a click.
2. Single finger movement moves the pointer (hovers).
3. Button press plus movement performs click or drag.
While dragging, the pointer follows the finger that is moving
fastest. This is important if there are additional fingers
down on the touch pad for the purpose of applying force
to an integrated button underneath.
4. Two fingers near each other moving in the same direction
are coalesced as a swipe gesture under the pointer.
5. Two or more fingers moving in arbitrary directions are
transformed into touches in the vicinity of the pointer.
This makes scale/zoom and rotate gestures possible.
Added a native VelocityTracker implementation to enable intelligent
switching of the active pointer during drags.
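As a sketch of how such a tracker might be used from Java to follow
the fastest-moving finger during a drag (the helper class itself is
hypothetical; the VelocityTracker and MotionEvent calls are the
public SDK ones):
    import android.view.MotionEvent;
    import android.view.VelocityTracker;

    // Hypothetical helper that picks the pointer moving fastest, which a
    // drag handler could use to decide which finger the pointer follows.
    final class FastestPointerChooser {
        private final VelocityTracker tracker = VelocityTracker.obtain();

        void onTouchEvent(MotionEvent event) {
            tracker.addMovement(event);
        }

        int chooseFastestPointerId(MotionEvent event) {
            // Compute velocities in pixels per second for all tracked pointers.
            tracker.computeCurrentVelocity(1000);
            int fastestId = -1;
            float fastestSpeedSquared = 0;
            for (int i = 0; i < event.getPointerCount(); i++) {
                int id = event.getPointerId(i);
                float vx = tracker.getXVelocity(id);
                float vy = tracker.getYVelocity(id);
                float speedSquared = vx * vx + vy * vy;
                if (speedSquared > fastestSpeedSquared) {
                    fastestSpeedSquared = speedSquared;
                    fastestId = id;
                }
            }
            return fastestId;
        }

        void recycle() {
            tracker.recycle();
        }
    }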
Change-Id: I7b7ddacc724fb1306e1590dbaebb740d3130d7cd
Added the concept of pointer properties in a MotionEvent.
This is currently used to track the pointer tool type to enable
applications to distinguish finger touches from a stylus.
Button states are also reported to applications as part of touch events.
There are no new actions for detecting changes in button states.
The application should instead query the button state from the
MotionEvent and take appropriate action as needed.
A good time to check the button state is on ACTION_DOWN.
As a side-effect, applications that do not support multiple buttons
will treat primary, secondary and tertiary buttons identically
for all touch events.
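For example, an application might check these on ACTION_DOWN (an
illustrative listener using the MotionEvent button-state and tool-type
accessors; the handler logic is only a placeholder):
    import android.view.MotionEvent;
    import android.view.View;

    // Illustrative touch handler: checks the button state on ACTION_DOWN and
    // distinguishes stylus input from finger touches via the pointer tool type.
    final class ButtonAwareTouchListener implements View.OnTouchListener {
        @Override
        public boolean onTouch(View v, MotionEvent event) {
            if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                boolean secondaryPressed =
                        (event.getButtonState() & MotionEvent.BUTTON_SECONDARY) != 0;
                boolean stylus =
                        event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS;
                if (secondaryPressed) {
                    // e.g. show a context menu instead of starting a drag.
                    return true;
                }
                if (stylus) {
                    // e.g. start inking rather than scrolling.
                    return true;
                }
            }
            return false;
        }
    }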
The back button on the mouse is mapped to KEYCODE_BACK
and the forward button is mapped to KEYCODE_FORWARD.
Added basic plumbing for the secondary mouse button to invoke
the context menu, particularly in lists.
Added clamp and split methods on MotionEvent to take care of
common filtering operations so we don't have them scattered
in multiple places across the framework.
Bug: 4260011
Change-Id: Ie992b4d4e00c8f2e76b961da0a902145b27f6d83
First step of improving app screen size compatibility mode. When
running in compat mode, an application's windows are scaled up on
the screen rather than being small with 1:1 pixels.
Currently we scale the application to fill the entire screen, so
we don't use an even pixel scaling. Though this may have some
negative impact on the appearance (it looks okay to me), it has a
big benefit of allowing us to now treat these apps as normal
full-screen apps and do the normal transition animations as you
move in and out and around in them.
This introduces fun stuff in the input system to take care of
modifying pointer coordinates to account for the app window
surface scaling. The input dispatcher is told about the scale
that is being applied to each window and, when there is one,
adjusts pointer events appropriately as they are being sent
to the transport.
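A minimal sketch of that coordinate adjustment, assuming a per-window
scale factor (the helper and its names are hypothetical):
    // Hypothetical sketch: the dispatcher knows the scale applied to a compat
    // window's surface, so pointer coordinates are divided by that scale to
    // land in the app's own (unscaled) coordinate space before delivery.
    final class CompatPointerMapper {
        private final float windowScale;  // e.g. 1.5f when the surface is scaled up 1.5x

        CompatPointerMapper(float windowScale) {
            this.windowScale = windowScale;
        }

        float toWindowX(float screenX, float windowLeft) {
            return (screenX - windowLeft) / windowScale;
        }

        float toWindowY(float screenY, float windowTop) {
            return (screenY - windowTop) / windowScale;
        }
    }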
Also modified is CompatibilityInfo, which has been greatly
simplified to not be so insane and incomprehensible. It is
now simple -- when constructed it determines if the given app
is compatible with the current screen size and density, and
that is that.
There are new APIs on ActivityManagerService to put applications
that we would traditionally consider compatible with larger screens
in compatibility mode. This is the start of a facility to have
a UI affordance for a user to switch apps in and out of
compatibility.
To test switching of modes, there is a new variation of the "am"
command to do this: am screen-compat [on|off] [package]
This mode switching has the fundamentals in place: activities are
restarted when it is changed, though the state still needs to be
persisted and the overall mode switch cleaned up.
For the few small apps I have tested, things mostly seem to be
working well. I know of one problem with the text selection
handles being drawn at the wrong position because at some point
the window offset is being scaled incorrectly. There are
probably other similar issues around the interaction between
two windows because the different window coordinate spaces are
done in a hacky way instead of being formally integrated into
the window manager layout process.
Change-Id: Ie038e3746b448135117bd860859d74e360938557
Associate each motion axis with the source from which it comes.
It is possible for multiple sources of the same device to define
the same axis. This fixes a new API that was introduced in MR1.
(Bug: 4066146)
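For example, associating axes with their sources is what makes a
per-source range query like the following meaningful (a sketch; the
axis and source chosen here are arbitrary):
    import android.view.InputDevice;
    import android.view.MotionEvent;

    // Illustrative query: ask a device for the X axis range as reported by its
    // touchscreen source specifically, rather than whichever source happens to
    // define AXIS_X first.
    final class AxisRangeExample {
        static float getTouchscreenXMax(int deviceId) {
            InputDevice device = InputDevice.getDevice(deviceId);
            if (device == null) {
                return 0f;
            }
            InputDevice.MotionRange range = device.getMotionRange(
                    MotionEvent.AXIS_X, InputDevice.SOURCE_TOUCHSCREEN);
            return range != null ? range.getMax() : 0f;
        }
    }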
Fixed a bug that might cause a segfault when using a trackball.
Only fade out the mouse pointer when touching the touch screen;
ignore other touch pads.
Changed the plural "sources" to "source" in several places in
the InputReader where we intend to refer to a particular source
rather than to a combination of sources.
Improved the batching code to support batching events from different
sources of the same device in parallel. (Bug: 3391564)
Change-Id: I0189e18e464338f126f7bf94370b928e1b1695f2
Added some plumbing to enable the policy to intercept motion
events when the screen is off to handle wakeup if needed.
Added a basic concept of an external device to limit the scope
of the wakeup policy to external devices only. The wakeup policy
for internal devices should be based on explicit rules such as
policy flags in key layout files.
Moved isTouchEvent to native.
Ensure the dispatcher sends the right event type to userActivity
for non-touch pointer events like HOVER_MOVE and SCROLL.
Bug: 3193114
Change-Id: I15dbd48a16810dfaf226ff7ad117d46908ca4f86
Added API on InputDevice to query the set of axes available.
Added API on KeyEvent and MotionEvent to convert keycodes and axes
to symbolic name strings for diagnostic purposes.
Added API on KeyEvent to query if a given key code is a gamepad button.
Added a new "axis" element to key layout files to specify the
mapping between raw absolute axis values and motion axis ids.
Expanded the axis bitfield to 64 bits to allow for future growth.
Modified the Makefile for keyboard prebuilts to run the keymap
validation tool during the build.
Added layouts for two game controllers.
Added default actions for game pad button keys.
Added more tests.
Fixed a bunch of bugs.
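A small diagnostic sketch using the new InputDevice, KeyEvent and
MotionEvent query APIs described above (the logging wrapper itself is
illustrative):
    import android.util.Log;
    import android.view.InputDevice;
    import android.view.KeyEvent;
    import android.view.MotionEvent;

    // Illustrative diagnostics: dump the axes a device reports and describe a key.
    final class InputDiagnostics {
        private static final String TAG = "InputDiagnostics";

        static void dumpDevice(InputDevice device) {
            for (InputDevice.MotionRange range : device.getMotionRanges()) {
                Log.d(TAG, "axis " + MotionEvent.axisToString(range.getAxis())
                        + " range [" + range.getMin() + ", " + range.getMax() + "]");
            }
        }

        static void describeKey(int keyCode) {
            Log.d(TAG, KeyEvent.keyCodeToString(keyCode)
                    + (KeyEvent.isGamepadButton(keyCode) ? " (gamepad button)" : ""));
        }
    }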
Change-Id: I73f9166c3b3c5bcf4970845b58088ad467525525
This change makes it possible to extend the set of axes that
are reported in MotionEvents by defining new axis constants.
The MotionEvent object is now backed by its C++ counterpart
to avoid having to maintain multiple representations of the
same data.
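For example, once an axis constant is defined, its value can be read
generically (a sketch; AXIS_HAT_X is just one such axis):
    import android.view.MotionEvent;

    // Illustrative read of an arbitrary axis: the same generic accessor works
    // for any axis constant, including ones added after this change.
    final class AxisReader {
        static float readHatX(MotionEvent event) {
            // getAxisValue(axis) uses the first pointer; a pointer index can
            // also be supplied for multi-pointer events.
            return event.getAxisValue(MotionEvent.AXIS_HAT_X);
        }
    }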
Change-Id: Ibe93c90d4b390d43c176cce48d558d20869ee608
Use Vendor ID, Product ID and optionally the Version to
locate keymaps and configuration files for external devices.
Moved virtual key definition parsing to native code so that
EventHub can identify touch screens with virtual keys and load
the appropriate key layout file.
Cleaned up a lot of old code in EventHub.
Fixed a regression in ViewRoot's fallback event handling.
Fixed a minor bug in FileMap that caused it to try to munmap
or close invalid handles when released if the attempt to map
the file failed.
Added a couple of new String8 conveniences for formatting strings.
Modified Tokenizer to fall back to open+read when mmap fails, since
we can't mmap the sysfs files that contain the virtual key
definitions in /sys/board_properties/.
Change-Id: I6ca5e5f9547619fd082ddac47e87ce185da69ee6
Fixed a bug with dpad keys on external keyboards being rotated
according to the display orientation by adding a new input device
configuration property called "keyboard.orientationAware".
Added a mechanism for overriding the key layout and key character
map in the input device configuration file using the new
"keyboard.layout" and "keyboard.characterMap" properties.
Also added "trackball.orientationAware", "touch.orientationAware" and
"touch.deviceType" configuration properties.
Rewrote the configuration property reading code in native code
so that it can be used by EventHub and other components.
Added basic support for installable idc, kl, and kcm files
in /data/system/devices. However, there is no provision for
copying files there yet.
Disabled long-press character pickers on full keyboards so that
key repeating works as expected.
Change-Id: I1bd9f0c3d344421db444e7d271eb09bc8bab4791
Added the MotionEvent.FLAG_WINDOW_IS_OBSCURED flag which is set by the
input manager whenever another visible window is partly or wholly obscuring
the target of a touch event, so that applications can filter touches
accordingly.
Added a "filterTouchesWhenObscured" attribute to View which can be used to
enable filtering of touches when the view's window is obscured.
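For example, a security-sensitive view can opt in through the
attribute or check the flag directly (an illustrative helper):
    import android.view.MotionEvent;
    import android.view.View;

    // Illustrative use: enable built-in filtering, or inspect the flag manually
    // to ignore touches delivered while another window obscures this one.
    final class SecureTouchHelper {
        static void enableFiltering(View view) {
            // Equivalent to android:filterTouchesWhenObscured="true" in layout XML.
            view.setFilterTouchesWhenObscured(true);
        }

        static boolean shouldIgnore(MotionEvent event) {
            return (event.getFlags() & MotionEvent.FLAG_WINDOW_IS_OBSCURED) != 0;
        }
    }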
Change-Id: I936d9c85013fd2d77fb296a600528d30a29027d2
Refactored the input reader so that each raw input protocol is handled
by a separate subclass of the new InputMapper type. This way, behaviors
pertaining to keyboard, trackballs, touchscreens, switches and other
devices are clearly distinguished for improved maintainability.
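Conceptually, the split looks something like this (sketched in Java
for illustration only; the real InputMapper hierarchy is native C++
and the process method shown is a stand-in):
    // Conceptual sketch of splitting per-protocol behavior into mappers.
    interface RawEvent {}

    abstract class InputMapper {
        // Each mapper consumes raw events for one protocol and emits cooked input.
        abstract void process(RawEvent rawEvent);
    }

    class KeyboardInputMapper extends InputMapper {
        @Override void process(RawEvent rawEvent) { /* key down/up handling */ }
    }

    class TrackballInputMapper extends InputMapper {
        @Override void process(RawEvent rawEvent) { /* relative motion handling */ }
    }

    class TouchInputMapper extends InputMapper {
        @Override void process(RawEvent rawEvent) { /* absolute touch handling */ }
    }

    class SwitchInputMapper extends InputMapper {
        @Override void process(RawEvent rawEvent) { /* lid/headset switch handling */ }
    }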
Added partial support for describing capabilities of input devices
(incomplete and untested for now, will be fleshed out in later commits).
Simplified EventHub interface somewhat since InputReader is taking over
more of the work.
Cleaned up some of the interactions between InputManager and
WindowManagerService related to reading input state.
Fixed swiping finger from screen edge into display area.
Added logging of device information to 'dumpsys window'.
Change-Id: I17faffc33e3aec3a0f33f0b37e81a70609378612
This significantly re-works the native key dispatching code to
allow events to be pre-dispatched to the current IME before
being processed by native code. It introduces one new public
API, which must be called after retrieving an event if the app
wishes for it to be pre-dispatched.
Currently the native code will only do pre-dispatching of
system keys, to avoid significant overhead for gaming input.
This should be improved to be smarter, filtering for only
keys that the IME is interested in. Unfortunately IMEs don't
currently provide this information. :p
Change-Id: Ic1c7aeec8b348164957f2cd88119eb5bd85c2a9f
Added several new coordinate values to MotionEvents to capture
touch major/minor area, tool major/minor area and orientation.
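These values can be read per pointer with the corresponding
MotionEvent accessors (a small illustrative helper):
    import android.view.MotionEvent;

    // Illustrative per-pointer read of the new coordinate values.
    final class ContactShapeLogger {
        static String describe(MotionEvent event, int pointerIndex) {
            return "touch=" + event.getTouchMajor(pointerIndex)
                    + "x" + event.getTouchMinor(pointerIndex)
                    + " tool=" + event.getToolMajor(pointerIndex)
                    + "x" + event.getToolMinor(pointerIndex)
                    + " orientation=" + event.getOrientation(pointerIndex);
        }
    }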
Renamed NDK input constants per convention.
Added InputDevice class in Java which will eventually provide
useful information about available input devices.
Added APIs for manufacturing new MotionEvent objects with multiple
pointers and all necessary coordinate data.
Fixed a bug in the input dispatcher where it could get stuck with
a pointer down forever.
Fixed a bug in the WindowManager where the input window list could
end up containing stale removed windows.
Fixed a bug in the WindowManager where the input channel was being
removed only after the final animation transition had taken place
which caused spurious WINDOW DIED log messages to be printed.
Change-Id: Ie55084da319b20aad29b28a0499b8dd98bb5da68
The native code now maintains a list of all keys that may use
default handling. If the app finishes one of these keys
without handling it, the key will be passed back off to Java
for default treatment.
Change-Id: I6a842a0d728eeafa4de7142fae573f8c11099e18
Provides the basic infrastructure for a
NativeActivity's native code to get an object representing
its event stream that can be used to read input events.
Still work to do, probably some API changes, and reasonable
default key handling (so that for example back will still
work).
Change-Id: I6db891bc35dc9683181d7708eaed552b955a077e
Added more tests.
Fixed a regression in Vector.
Fixed bugs in pointer tracking.
Fixed a starvation issue in PollLoop when setting or removing callbacks.
Fixed a couple of policy nits.
Modified the internal representation of MotionEvent to be more
efficient and more consistent.
Added code to skip/cancel virtual key processing when there are multiple
pointers down. This helps to better disambiguate virtual key presses
from stray touches (such as cheek presses).
Change-Id: I2a7d2cce0195afb9125b23378baa94fd2fc6671c
The old dispatch mechanism has been left in place and continues to
be used by default for now. To enable native input dispatch,
edit the ENABLE_NATIVE_DISPATCH constant in WindowManagerPolicy.
Includes part of the new input event NDK API. Some details TBD.
To wire up input dispatch, as the ViewRoot adds a window to the
window session it receives an InputChannel object as an output
argument. The InputChannel encapsulates the file descriptors for a
shared memory region and two pipe end-points. The ViewRoot then
provides the InputChannel to the InputQueue. Behind the
scenes, InputQueue simply attaches handlers to the native PollLoop object
that underlies the MessageQueue. This way MessageQueue doesn't need
to know anything about input dispatch per se; it just exposes (in native
code) a PollLoop that other components can use to monitor file descriptor
state changes.
There can be zero or more targets for any given input event. Each
input target is specified by its input channel and some parameters
including flags, an X/Y coordinate offset, and the dispatch timeout.
An input target can request either synchronous dispatch (for foreground apps)
or asynchronous dispatch (fire-and-forget for wallpapers and "outside"
targets). Currently, finding the appropriate input targets for an event
requires a call back into the WindowManagerService from native code.
In the future this will be refactored to avoid most of these callbacks
except as required to handle pending focus transitions.
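Conceptually, each target carries something like the following
(illustrative only; the real structures are native and these field
names are hypothetical):
    // Conceptual sketch of the parameters carried per input target.
    final class InputTargetSketch {
        Object inputChannel;          // stands in for the target's InputChannel
        int flags;                    // e.g. synchronous vs. asynchronous dispatch
        float xOffset, yOffset;       // translate event coordinates into the window
        long dispatchTimeoutNanos;    // how long before the target is considered unresponsive
    }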
End-to-end event dispatch mostly works!
To do: event injection, rate limiting, ANRs, testing, optimization, etc.
Change-Id: I8c36b2b9e0a2d27392040ecda0f51b636456de25