We now only consider a device to be a default keyboard if its name
contains "-keypad". A hack, but whatever.
Also add some debug logging for the input state to help identify such
issues in the future.
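A minimal sketch of the idea; classifyKeyboard() and CLASS_KEYBOARD_DEFAULT
are invented names for illustration and this is not the actual EventHub code:

    // Sketch only: treat a device as the default keyboard only when its
    // reported name contains "-keypad", and log the resulting state.
    #include <cstdio>
    #include <cstring>

    static const unsigned CLASS_KEYBOARD_DEFAULT = 0x0004;  // hypothetical flag

    static unsigned classifyKeyboard(const char* deviceName, unsigned classes) {
        if (strstr(deviceName, "-keypad") != nullptr) {
            classes |= CLASS_KEYBOARD_DEFAULT;
        }
        // The extra debug logging of the resulting input state.
        printf("input device '%s': classes=0x%08x\n", deviceName, classes);
        return classes;
    }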
If opening an input device fails, sleep for 100us and try to open it again,
up to a maximum of 10 attempts.
We need the retry logic because setting permissions on a new input device is
racy. The init process watches for new input devices (via uevent) and sets
the permissions on them in devices.c:make_device(). However, at the same
time, EventHub.cpp, running in the system_server process, watches for new
input devices and immediately tries to open them. I can't see a simple way
to avoid this race condition.
As best I can tell, this race condition has always existed.
Some recent timing change must be causing us to hit it much more often. See
the repro notes in the referenced bug.
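Roughly, the retry amounts to the following sketch; openInputDeviceWithRetry
is an invented name for illustration, not the actual EventHub.cpp code:

    // On failure, sleep 100us and try again, up to 10 attempts, to ride
    // out the window before init has fixed up the device node's permissions.
    #include <cerrno>
    #include <cstdio>
    #include <cstring>
    #include <fcntl.h>
    #include <unistd.h>

    static int openInputDeviceWithRetry(const char* path) {
        for (int attempt = 0; attempt < 10; attempt++) {
            int fd = open(path, O_RDWR);
            if (fd >= 0) {
                return fd;
            }
            fprintf(stderr, "could not open %s, %s; retrying\n",
                    path, strerror(errno));
            usleep(100);  // 100us, then try again
        }
        return -1;  // give up after 10 attempts
    }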
Bug: 2375632
This addresses a few parts of the bug:
- There was a small issue in the window manager where we could show a window
too early before the transition animation starts, which was introduced
by the recent wallpaper work. This was the cause of the flicker when
starting the dialer for the first time.
- There was a much larger problem that has existed forever, where moving
an application token to the front or back was not synchronized with the
application animation transaction. This was the cause of the flicker
when hanging up (now that the in-call screen moves to the back instead
of closing and we always have a wallpaper visible). The approach to
solving this (sketched just below) is to have the window manager go
ahead and move the app tokens (it must, in order to stay in sync with
the activity manager), but to delay the actual window movement: perform
the movement to front when the animation starts, and to back when it
ends. Actually, when the animation ends, we just go and completely
rebuild the window list to ensure it is correct, because there are ways
windows could be added while in this intermediate state and end up in
the wrong place once we do the delayed movement to the front or back.
And it is simply reassuring to know that every time we finish a full
app transition, we re-evaluate the world and put everything in its
proper place.
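For illustration, a rough sketch of the deferred reordering described in the
second item above. All of the names (AppToken, WindowList, onAnimationStart,
onAnimationFinished) are invented; this is not the actual window manager
code:

    #include <algorithm>
    #include <vector>

    struct AppToken { /* per-application window grouping */ };

    struct WindowList {
        std::vector<AppToken*> tokens;   // authoritative order, moved at once
        std::vector<AppToken*> windows;  // on-screen order, moved lazily

        void moveTokenToFront(AppToken* t) {
            // Keep the token order in sync with the activity manager now...
            tokens.erase(std::remove(tokens.begin(), tokens.end(), t),
                         tokens.end());
            tokens.push_back(t);
            // ...but leave 'windows' alone until the animation starts.
        }

        void onAnimationStart(AppToken* t) {
            // Perform the deferred move-to-front when the transition begins.
            windows.erase(std::remove(windows.begin(), windows.end(), t),
                          windows.end());
            windows.push_back(t);
        }

        void onAnimationFinished() {
            // When the transition ends, rebuild the window list wholesale
            // from the token order; this also covers moves to the back and
            // any windows added while we were in the intermediate state.
            windows = tokens;
        }
    };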
Also included in this change are a few little tweaks to the input system,
to perform better logging, and to completely ignore input devices that do
not have any of our input classes. There is also a little cleanup of the
configuration-change evaluation, so that we do not do more work than needed
when an input device appears or disappears, and only log a config change
message when the config is truly changing.
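A trivial sketch of those two tweaks, with invented names:

    // A device whose class bitmask ends up empty is not tracked at all.
    static bool shouldIgnoreDevice(unsigned classes) {
        return classes == 0;  // no keyboard/touchscreen/trackball/dpad bits
    }

    // Only report (and log) a configuration change when something in the
    // derived configuration actually differs.
    static bool configurationChanged(int oldKeyboard, int newKeyboard,
                                     int oldNavigation, int newNavigation) {
        return oldKeyboard != newKeyboard || oldNavigation != newNavigation;
    }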
Change-Id: Ifb2db77f8867435121722a6abeb946ec7c3ea9d3
The major things going on here:
- The MotionEvent API is now extended to include "pointer ID" information, so
applications can keep track of individual fingers as they move up and down.
PointerLocation has been updated to take advantage of this.
- The input system now has logic to generate MotionEvents with the new ID
information, synthesizing an identifier as new pointers go down and trying
to keep pointer ids consistent across events by looking at the distance
between the last and next set of pointers (see the sketch after this list).
- We now support the new multitouch driver protocol, and will use it instead
of the old one if it is available. We do NOT use any finger id information
coming from the driver, but always synthesize pointer ids in user space.
(This is simply because we don't yet have a driver reporting this
information on which to base an implementation.)
- Increase maximum number of fingers to 10. This code has only been used
with a driver that reports up to 2, so no idea how more will actually work.
- Oh and the input system can now detect and report physical DPAD devices.
This will be used to avoid unnecessarily listening to data from sensors
that function as event devices.
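The pointer id matching mentioned in the second item above can be sketched
roughly as follows. This is an illustrative greedy nearest-neighbor
assignment with invented names, not the actual input system code, which may
match points more carefully:

    #include <utility>
    #include <vector>

    struct Pointer {
        int id;
        float x, y;
    };

    // Given the previous pointers (with ids) and the next set of raw touch
    // coordinates, give each new point the id of the closest unused old
    // point, and hand out a fresh id when a new finger goes down.
    static std::vector<Pointer> assignPointerIds(
            const std::vector<Pointer>& last,
            const std::vector<std::pair<float, float>>& next,
            int& nextFreshId) {
        std::vector<Pointer> result;
        std::vector<bool> used(last.size(), false);
        for (const auto& p : next) {
            int best = -1;
            float bestDist = 0;
            for (size_t i = 0; i < last.size(); i++) {
                if (used[i]) continue;
                float dx = last[i].x - p.first;
                float dy = last[i].y - p.second;
                float dist = dx * dx + dy * dy;  // squared distance is enough
                if (best < 0 || dist < bestDist) {
                    best = (int) i;
                    bestDist = dist;
                }
            }
            int id;
            if (best >= 0) {
                used[best] = true;
                id = last[best].id;   // keep the id of the nearest old point
            } else {
                id = nextFreshId++;   // brand-new finger: synthesize an id
            }
            result.push_back({id, p.first, p.second});
        }
        return result;
    }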
Signed-off-by: Mike Lockwood <lockwood@android.com>
The kernel can now publish a property describing the layout of virtual
hardware buttons on the touchscreen. These are outside of the display
area (outside of the absolute x and y controller range the driver
reports), and when the user presses on one of them a key event will be
generated rather than a touch event.
This also includes a number of tweaks to the absolute controller
processing to make things work better on the new screens. For
example, we now reject down events outside of the display area.
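For illustration only, a sketch of the hit test such a layout enables; the
struct fields and function name are invented here and do not describe the
actual property format:

    #include <vector>

    struct VirtualKey {
        int scanCode;           // key to synthesize when the button is hit
        int centerX, centerY;   // center of the button, in touch coordinates
        int width, height;
    };

    // Returns the scan code of the virtual key under (x, y), or -1 if the
    // point does not fall on any of the published buttons.
    static int findVirtualKey(const std::vector<VirtualKey>& keys,
                              int x, int y) {
        for (const VirtualKey& k : keys) {
            if (x >= k.centerX - k.width / 2 && x <= k.centerX + k.width / 2 &&
                y >= k.centerY - k.height / 2 && y <= k.centerY + k.height / 2) {
                return k.scanCode;
            }
        }
        return -1;
    }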
Still left to be done is the ability to cancel a key down event,
so the user can slide up from the virtual keys to the touch screen
without causing a virtual key to execute.