First step of improving app screen size compatibility mode. When
running in compat mode, an application's windows are scaled up on
the screen rather than being small with 1:1 pixels.
Currently we scale the application to fill the entire screen, so we
don't use an even pixel scaling. Though this may have some
negative impact on the appearance (it looks okay to me), it has a
big benefit of allowing us to now treat these apps as normal
full-screen apps and do the normal transition animations as you
move in and out and around in them.
This introduces fun stuff in the input system to take care of
modifying pointer coordinates to account for the app window
surface scaling. The input dispatcher is told about the scale
that is being applied to each window and, when there is one,
adjusts pointer events appropriately as they are being sent
to the transport.
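For illustration only (a hypothetical sketch in Java; the real adjustment
happens in the native dispatcher code), the idea is to divide screen-space
pointer coordinates by the window's scale before delivery:

    // Sketch: if a window's surface is scaled up by 'scale' on screen,
    // divide incoming screen-space coordinates back down so the app sees
    // coordinates in its own (unscaled) surface space.
    static void scalePointerIntoWindow(MotionEvent.PointerCoords coords, float scale) {
        if (scale != 1.0f) {
            float invScale = 1.0f / scale;
            coords.x *= invScale;
            coords.y *= invScale;
        }
    }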
Also modified is CompatibilityInfo, which has been greatly
simplified to not be so insane and incomprehensible. It is
now simple -- when constructed it determines if the given app
is compatible with the current screen size and density, and
that is that.
There are new APIs on ActivityManagerService to put applications
that we would traditionally consider compatible with larger screens
in compatibility mode. This is the start of a facility to have
a UI affordance for a user to switch apps in and out of
compatibility.
To test switching of modes, there is a new variation of the "am"
command to do this: am screen-compat [on|off] [package]
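For example, to put a hypothetical package into compatibility mode from a
host machine (the package name is just a placeholder):

    adb shell am screen-compat on com.example.someapp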
This mode switching has the fundamentals of restarting activities
when the mode is changed, though the state still needs to be persisted
and the overall mode switch cleaned up.
For the few small apps I have tested, things mostly seem to be
working well. I know of one problem with the text selection
handles being drawn at the wrong position because at some point
the window offset is being scaled incorrectly. There are
probably other similar issues around the interaction between
two windows because the different window coordinate spaces are
done in a hacky way instead of being formally integrated into
the window manager layout process.
Change-Id: Ie038e3746b448135117bd860859d74e360938557
Associate each motion axis with the source from which it comes.
It is possible for multiple sources of the same device to define
the same axis. This fixes a new API that was introduced in MR1.
(Bug: 4066146)
Fixed a bug that might cause a segfault when using a trackball.
Only fade out the mouse pointer when touching the touch screen;
ignore other touch pads.
Changed the plural "sources" to "source" in several places in
the InputReader where we intend to refer to a particular source
rather than to a combination of sources.
Improved the batching code to support batching events from different
sources of the same device in parallel. (Bug: 3391564)
Change-Id: I0189e18e464338f126f7bf94370b928e1b1695f2
Added some plumbing to enable the policy to intercept motion
events when the screen is off to handle wakeup if needed.
Added a basic concept of an external device to limit the scope
of the wakeup policy to external devices only. The wakeup policy
for internal devices should be based on explicit rules such as
policy flags in key layout files.
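As an illustration of such a rule, a key layout file can tag a key with a
policy flag like WAKE (the specific line below is only an example, not
taken from any particular device's layout):

    key 116   POWER   WAKE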
Moved isTouchEvent to native.
Ensure the dispatcher sends the right event type to userActivity
for non-touch pointer events like HOVER_MOVE and SCROLL.
Bug: 3193114
Change-Id: I15dbd48a16810dfaf226ff7ad117d46908ca4f86
Added API on InputDevice to query the set of axes available.
Added API on KeyEvent and MotionEvent to convert keycodes and axes
to symbolic name strings for diagnostic purposes.
Added API on KeyEvent to query if a given key code is a gamepad button.
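A minimal sketch of how the new diagnostic APIs might be used together
(the helper name and log tag are arbitrary; assumes the usual
android.view and android.util.Log imports):

    static void dumpDevice(int deviceId) {
        InputDevice device = InputDevice.getDevice(deviceId);
        if (device == null) return;
        for (InputDevice.MotionRange range : device.getMotionRanges()) {
            Log.d("InputDebug", "axis " + MotionEvent.axisToString(range.getAxis())
                    + " range [" + range.getMin() + ", " + range.getMax() + "]");
        }
        Log.d("InputDebug", KeyEvent.keyCodeToString(KeyEvent.KEYCODE_BUTTON_A)
                + " gamepad button? " + KeyEvent.isGamepadButton(KeyEvent.KEYCODE_BUTTON_A));
    }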
Added a new "axis" element to key layout files to specify the
mapping between raw absolute axis values and motion axis ids.
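For example, a key layout file might map raw absolute axes to motion axes
with entries along these lines (illustrative only, not from a shipped
layout):

    axis 0x00 X
    axis 0x01 Y
    axis 0x02 Z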
Expanded the axis bitfield to 64bits to allow for future growth.
Modified the Makefile for keyboard prebuilts to run the keymap
validation tool during the build.
Added layouts for two game controllers.
Added default actions for game pad button keys.
Added more tests.
Fixed a bunch of bugs.
Change-Id: I73f9166c3b3c5bcf4970845b58088ad467525525
This change makes it possible to extend the set of axes that
are reported in MotionEvents by defining new axis constants.
The MotionEvent object is now backed by its C++ counterpart
to avoid having to maintain multiple representations of the
same data.
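On the Java side, any axis, including a newly defined constant, can be
read back off a MotionEvent with getAxisValue; a trivial sketch with a
hypothetical helper name:

    static float readAxis(MotionEvent event, int axis, int pointerIndex) {
        // Works for any axis constant, including ones defined later.
        return event.getAxisValue(axis, pointerIndex);
    }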
Change-Id: Ibe93c90d4b390d43c176cce48d558d20869ee608
Use Vendor ID, Product ID and optionally the Version to
locate keymaps and configuration files for external devices.
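For example, an external device might be matched by a key layout file
named along the lines of Vendor_045e_Product_028e.kl (the IDs here are
purely illustrative).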
Moved virtual key definition parsing to native code so that
EventHub can identify touch screens with virtual keys and load
the appropriate key layout file.
Cleaned up a lot of old code in EventHub.
Fixed a regression in ViewRoot's fallback event handling.
Fixed a minor bug in FileMap that caused it to try to munmap
or close invalid handles when released if the attempt to map
the file failed.
Added a couple of new String8 conveniences for formatting strings.
Modified Tokenizer to fall back to open+read when mmap fails since
we can't mmap sysfs files as needed to open the virtual key
definition files in /sys/board_properties/.
Change-Id: I6ca5e5f9547619fd082ddac47e87ce185da69ee6
Fixed a bug with dpad keys on external keyboards being rotated
according to the display orientation by adding a new input device
configuration property called "keyboard.orientationAware".
Added a mechanism for overriding the key layout and key character
map in the input device configuration file using the new
"keyboard.layout" and "keyboard.characterMap" properties.
Also added "trackball.orientationAware", "touch.orientationAware" and
"touch.deviceType" configuration properties.
Rewrote the configuration property reading code in native code
so that it can be used by EventHub and other components.
Added basic support for installable idc, kl, and kcm files
in /data/system/devices. However, there is no provision for
copying files there yet.
Disabled long-press character pickers on full keyboards so that
key repeating works as expected.
Change-Id: I1bd9f0c3d344421db444e7d271eb09bc8bab4791
Added the MotionEvent.FLAG_WINDOW_IS_OBSCURED flag, which is set by the
input manager whenever another visible window partly or wholly obscures
the target of a touch event, so that applications can filter touches
accordingly.
Added a "filterTouchesWhenObscured" attribute to View which can be used to
enable filtering of touches when the view's window is obscured.
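For example, a view can opt in with android:filterTouchesWhenObscured="true"
in XML (or setFilterTouchesWhenObscured(true) in code), or inspect the flag
itself; a minimal sketch of the latter inside a custom View:

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if ((event.getFlags() & MotionEvent.FLAG_WINDOW_IS_OBSCURED) != 0) {
            // Another visible window overlaps this one; drop the touch.
            return false;
        }
        return super.onTouchEvent(event);
    }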
Change-Id: I936d9c85013fd2d77fb296a600528d30a29027d2
Refactored the input reader so that each raw input protocol is handled
by a separate subclass of the new InputMapper type. This way, behaviors
pertaining to keyboards, trackballs, touchscreens, switches and other
devices are clearly distinguished for improved maintainability.
Added partial support for describing capabilities of input devices
(incomplete and untested for now, will be fleshed out in later commits).
Simplified EventHub interface somewhat since InputReader is taking over
more of the work.
Cleaned up some of the interactions between InputManager and
WindowManagerService related to reading input state.
Fixed swiping a finger from the screen edge into the display area.
Added logging of device information to 'dumpsys window'.
Change-Id: I17faffc33e3aec3a0f33f0b37e81a70609378612
This significantly re-works the native key dispatching code to
allow events to be pre-dispatched to the current IME before
being processed by native code. It introduces one new public
API, which must be called after retrieving an event if the app
wishes for it to be pre-dispatched.
Currently the native code will only do pre-dispatching of
system keys, to avoid significant overhead for gaming input.
This should be improved to be smarter, filtering for only
keys that the IME is interested in. Unfortunately IMEs don't
currently provide this information. :p
Change-Id: Ic1c7aeec8b348164957f2cd88119eb5bd85c2a9f
Added several new coordinate values to MotionEvents to capture
touch major/minor area, tool major/minor area and orientation.
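A brief sketch of reading the new per-pointer values (helper name and log
tag are arbitrary; assumes android.view.MotionEvent and android.util.Log):

    static void logPointerGeometry(MotionEvent event) {
        for (int i = 0; i < event.getPointerCount(); i++) {
            Log.d("InputDebug", "pointer " + event.getPointerId(i)
                    + " touch " + event.getTouchMajor(i) + "x" + event.getTouchMinor(i)
                    + " tool " + event.getToolMajor(i) + "x" + event.getToolMinor(i)
                    + " orientation " + event.getOrientation(i));
        }
    }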
Renamed NDK input constants per convention.
Added InputDevice class in Java which will eventually provide
useful information about available input devices.
Added APIs for manufacturing new MotionEvent objects with multiple
pointers and all necessary coordinate data.
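As an illustration of the manufacturing API (values are arbitrary; this
assumes the MotionEvent.obtain overload that takes pointer ids and
PointerCoords):

    static MotionEvent makeTwoPointerMove(long downTime, long eventTime) {
        int[] ids = { 0, 1 };
        MotionEvent.PointerCoords[] coords = new MotionEvent.PointerCoords[2];
        for (int i = 0; i < 2; i++) {
            coords[i] = new MotionEvent.PointerCoords();
            coords[i].x = 100 * (i + 1);
            coords[i].y = 200 * (i + 1);
            coords[i].pressure = 1.0f;
            coords[i].size = 0.5f;
        }
        return MotionEvent.obtain(downTime, eventTime, MotionEvent.ACTION_MOVE,
                2, ids, coords, 0 /* metaState */, 1.0f, 1.0f /* precision */,
                0 /* deviceId */, 0 /* edgeFlags */,
                InputDevice.SOURCE_TOUCHSCREEN, 0 /* flags */);
    }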
Fixed a bug in the input dispatcher where it could get stuck with
a pointer down forever.
Fixed a bug in the WindowManager where the input window list could
end up containing stale removed windows.
Fixed a bug in the WindowManager where the input channel was being
removed only after the final animation transition had taken place
which caused spurious WINDOW DIED log messages to be printed.
Change-Id: Ie55084da319b20aad29b28a0499b8dd98bb5da68
The native code now maintains a list of all keys that may use
default handling. If the app finishes one of these keys
without handling it, the key will be passed back off to Java
for default treatment.
Change-Id: I6a842a0d728eeafa4de7142fae573f8c11099e18
Provides the basic infrastructure for a
NativeActivity's native code to get an object representing
its event stream that can be used to read input events.
Still work to do, probably some API changes, and reasonable
default key handling (so that for example back will still
work).
Change-Id: I6db891bc35dc9683181d7708eaed552b955a077e
Added more tests.
Fixed a regression in Vector.
Fixed bugs in pointer tracking.
Fixed a starvation issue in PollLoop when setting or removing callbacks.
Fixed a couple of policy nits.
Modified the internal representation of MotionEvent to be more
efficient and more consistent.
Added code to skip/cancel virtual key processing when there are multiple
pointers down. This helps to better disambiguate virtual key presses
from stray touches (such as cheek presses).
Change-Id: I2a7d2cce0195afb9125b23378baa94fd2fc6671c
The old dispatch mechanism has been left in place and continues to
be used by default for now. To enable native input dispatch,
edit the ENABLE_NATIVE_DISPATCH constant in WindowManagerPolicy.
Includes part of the new input event NDK API. Some details TBD.
To wire up input dispatch, the ViewRoot receives an InputChannel object
as an output argument when it adds a window to the window session.
The InputChannel encapsulates the file descriptors for a
shared memory region and two pipe end-points. The ViewRoot then
provides the InputChannel to the InputQueue. Behind the
scenes, InputQueue simply attaches handlers to the native PollLoop object
that underlies the MessageQueue. This way MessageQueue doesn't need
to know anything about input dispatch per se; it just exposes (in native
code) a PollLoop that other components can use to monitor file descriptor
state changes.
There can be zero or more targets for any given input event. Each
input target is specified by its input channel and some parameters
including flags, an X/Y coordinate offset, and the dispatch timeout.
An input target can request either synchronous dispatch (for foreground apps)
or asynchronous dispatch (fire-and-forget for wallpapers and "outside"
targets). Currently, finding the appropriate input targets for an event
requires a call back into the WindowManagerService from native code.
In the future this will be refactored to avoid most of these callbacks
except as required to handle pending focus transitions.
End-to-end event dispatch mostly works!
To do: event injection, rate limiting, ANRs, testing, optimization, etc.
Change-Id: I8c36b2b9e0a2d27392040ecda0f51b636456de25