This would happen if the window is made visible but the client hasn't rendered into it yet. This happens often with SurfaceView.
Instead of filling the window with solid black, SF would simply ignore it, which could lead to more disturbing artifacts.
In theory the window manager should not display a window before it has been drawn into, but it does happen occasionally.
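A minimal sketch of the intended behavior, using stand-in types (not the actual SurfaceFlinger classes):

    // Illustrative only: if nothing has been rendered into the window yet,
    // paint it opaque black instead of skipping the draw.
    struct Region { /* dirty/clip region, details omitted */ };

    struct Layer {
        bool hasPostedBuffer = false;   // set once the client has rendered a frame

        void onDraw(const Region& clip) {
            if (!hasPostedBuffer) {
                clearToBlack(clip);     // fill with solid black, no stale pixels
                return;
            }
            drawLatestBuffer(clip);
        }

        void clearToBlack(const Region&) { /* e.g. glClearColor(0,0,0,1); glClear(...) */ }
        void drawLatestBuffer(const Region&) { /* draw the client's latest buffer */ }
    };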
Apparently the problem is caused by the fact that A2dpAudioStreamOut::standby() calls a2dp_stop() after the headset has been powered down.
The workaround consists of indicating to the A2DP audio hardware that a close request is pending and that standby() must be bypassed.
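A rough sketch of that workaround, assuming a simple flag checked in standby(); member and function names are illustrative, not the exact A2dpAudioStreamOut code:

    #include <mutex>

    class A2dpAudioStreamOut {
    public:
        // Called when a close request arrives, possibly after the headset
        // has already been powered down.
        void setCloseFlag() {
            std::lock_guard<std::mutex> l(mLock);
            mClosing = true;
        }

        void standby() {
            std::lock_guard<std::mutex> l(mLock);
            if (mClosing) {
                return;          // bypass: do not call a2dp_stop() on dead hardware
            }
            // a2dp_stop(...);   // normal standby path
        }

    private:
        std::mutex mLock;
        bool mClosing = false;
    };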
This change makes SurfaceHolder.setType(GPU) obsolete (it's now ignored).
Added an API to android_native_window_t to allow extending the functionality without ever breaking binary compatibility. This is used to implement the new set_usage() API. This API needs to be called by software renderers because the default is to use usage flags suitable for h/w.
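A sketch of the extension mechanism this describes: a generic hook keyed by an operation code, so new operations such as "set usage" can be added without changing the struct layout. The types and values below are illustrative stand-ins, not the real android_native_window_t definition:

    #include <cstdint>

    enum { NATIVE_WINDOW_SET_USAGE = 0 };    // operation code (value is illustrative)

    struct native_window {
        // Extension point: variadic, so future operations can take any arguments
        // without breaking binary compatibility.
        int (*perform)(native_window* win, int operation, ...);
        uint32_t usage;                      // buffer usage flags currently requested
    };

    // A software renderer would call something like this before dequeueing
    // buffers, asking for CPU-accessible usage flags instead of the h/w defaults.
    inline int set_usage(native_window* win, uint32_t usage) {
        return win->perform(win, NATIVE_WINDOW_SET_USAGE, usage);
    }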
This is because the AudioFlinger duplicating thread is closed while the output tracks are still active, which causes the output track objects to be destroyed while they may still be in use by the destination output mixer.
The fix consists of adding the OutputTrack to the track list (mTracks) of its destination thread, so that a strong reference is held during mixer processing and the track is destroyed only when it is safe to do so, by the destination thread.
Also added detection of problems when creating the output track (e.g. no more tracks available in the mixer); in this case the output track is not added to the output track list of the duplicating thread.
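A simplified sketch of the lifetime fix, using std::shared_ptr in place of Android's sp<>; the class names are stand-ins for the AudioFlinger types:

    #include <algorithm>
    #include <memory>
    #include <vector>

    struct OutputTrack { /* feeds mixed audio into a destination output */ };

    struct MixerThread {                     // destination output mixer thread
        std::vector<std::shared_ptr<OutputTrack>> mTracks;

        void addOutputTrack(const std::shared_ptr<OutputTrack>& t) {
            // The strong reference held here keeps the OutputTrack alive while
            // the mixer is processing, even if the duplicating thread that
            // created it is being closed.
            mTracks.push_back(t);
        }

        void removeOutputTrack(const std::shared_ptr<OutputTrack>& t) {
            // The track is destroyed only once the destination thread drops
            // its reference, i.e. when it is safe to do so.
            mTracks.erase(std::remove(mTracks.begin(), mTracks.end(), t), mTracks.end());
        }
    };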
When changing the audio output stream sampling rate with setParameters(), make sure that all tracks have a sampling rate less than or equal to twice the new output sampling rate.
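A small sketch of that check (names are illustrative, not the exact AudioFlinger code):

    #include <vector>

    struct Track { unsigned sampleRate; };

    // Reject a new output sampling rate if any track would have to be
    // resampled by more than a factor of 2.
    bool canSetOutputSampleRate(const std::vector<Track>& tracks, unsigned newRate) {
        for (const Track& t : tracks) {
            if (t.sampleRate > 2 * newRate) {
                return false;
            }
        }
        return true;
    }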
Merge change 7419 from master, which may help eliminate the problem.
That change was for a different use case (disabling A2DP to switch the output to SCO), but without a repro case it is worth trying.
The BT headset detection now distinguishes between car kits and headsets, which can be used by the audio policy manager.
The headset connection is also detected earlier, that is, when the headset is connected rather than when the SCO socket is connected, as was the case before. This allows the audio policy manager to suspend the A2DP output while ringing if a SCO headset is connected.
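A sketch of how the audio policy manager could use this distinction; the device and state names are illustrative stand-ins:

    enum class ScoDevice { None, Headset, CarKit };

    struct AudioPolicy {
        ScoDevice scoDevice = ScoDevice::None;
        bool a2dpSuspended = false;

        void onPhoneRinging() {
            // If a SCO headset (not a car kit) is connected, the ring will be
            // routed to it, so suspend the A2DP output for the duration.
            if (scoDevice == ScoDevice::Headset) {
                a2dpSuspended = true;
            }
        }

        void onRingStopped() {
            a2dpSuspended = false;
        }
    };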
There was no guarantee that the corresponding thread destructor had already been called when exiting the closeOutput() or closeInput() functions.
This destructor could be called by the thread after the exit condition is signalled; as a consequence, closeOutputStream() could be called after
we had exited the closeOutput() function.
To solve the problem, the call to closeOutputStream() or closeInputStream() is moved into closeOutput() or closeInput().
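A simplified sketch of the fix, with stand-in types: the stream is now closed by closeOutput() itself, after the playback thread has fully exited, rather than by the thread's destructor:

    struct AudioStreamOut { /* hardware output stream */ };

    struct AudioHardware {
        void closeOutputStream(AudioStreamOut*) { /* release the h/w stream */ }
    };

    struct PlaybackThread {
        AudioStreamOut* mOutput = nullptr;
        void exit() { /* signal the thread loop and wait for it to finish */ }
        AudioStreamOut* clearOutput() {
            AudioStreamOut* s = mOutput;
            mOutput = nullptr;               // the destructor no longer touches it
            return s;
        }
    };

    void closeOutput(PlaybackThread* thread, AudioHardware* hw) {
        thread->exit();                               // thread is done before we go on
        hw->closeOutputStream(thread->clearOutput()); // guaranteed to happen before returning
        delete thread;
    }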
The function checkForNewParameters_l() is called with the ThreadBase mutex mLock held. In the case where the parameter change implies
an audio parameter modification (e.g. sampling rate), the function sendConfigEvent() is called, which tries to lock mLock, creating a deadlock.
The fix consists of creating a function equivalent to sendConfigEvent() that must be called with mLock held and that does not lock mLock.
Also added the possibility of having more than one set-parameter request pending.
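A sketch of the locking convention used for the fix: the public entry point takes mLock, while callers that already hold it (such as checkForNewParameters_l()) use a variant that does not. Names follow the AudioFlinger "_l" style, but the bodies are illustrative:

    #include <mutex>
    #include <string>
    #include <vector>

    class ThreadBase {
    public:
        void sendConfigEvent(const std::string& param) {
            std::lock_guard<std::mutex> l(mLock);
            sendConfigEvent_l(param);
        }

    private:
        // Must be called with mLock held; does not take the lock itself,
        // which is what avoids the deadlock described above.
        void sendConfigEvent_l(const std::string& param) {
            mPendingParams.push_back(param);   // more than one request may be pending
        }

        std::mutex mLock;
        std::vector<std::string> mPendingParams;
    };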
Use integers instead of void* as input/output handles in the IAudioFlinger and IAudioPolicyService interfaces.
AudioFlinger maintains an ever-increasing count of opened inputs and outputs, used as a unique ID.
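A sketch of the handle scheme; the type and method names here are illustrative, not the actual interface:

    #include <atomic>

    using audio_io_handle_t = int;           // integer handle crossing the interfaces

    class AudioFlinger {
    public:
        audio_io_handle_t openOutput(/* ... */) {
            // Each opened output (or input) gets the next value of an
            // ever-increasing counter, so handles are small unique integers
            // rather than raw pointers.
            return nextUniqueId();
        }

    private:
        audio_io_handle_t nextUniqueId() { return ++mNextId; }
        std::atomic<int> mNextId{0};
    };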
* changes:
update most gl tests to use EGLUtils
added two EGL helpers for selecting a config matching a certain pixel format or native window type
added NATIVE_WINDOW_FORMAT attribute to android_native_window_t
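A rough sketch of what such a config-selection helper does (not the verbatim EGLUtils code): choose the EGLConfig whose native visual ID matches the pixel format reported by the window, e.g. via the new NATIVE_WINDOW_FORMAT attribute:

    #include <EGL/egl.h>
    #include <vector>

    EGLConfig selectConfigForFormat(EGLDisplay dpy, const EGLint* attrs, EGLint format) {
        EGLint count = 0;
        eglChooseConfig(dpy, attrs, nullptr, 0, &count);        // how many configs match
        std::vector<EGLConfig> configs(count);
        eglChooseConfig(dpy, attrs, configs.data(), count, &count);
        for (EGLint i = 0; i < count; i++) {
            EGLint visualId = 0;
            eglGetConfigAttrib(dpy, configs[i], EGL_NATIVE_VISUAL_ID, &visualId);
            if (visualId == format) {
                return configs[i];           // first config matching the window's format
            }
        }
        return nullptr;                      // no matching config
    }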
The major things going on here:
- The MotionEvent API is now extended to include "pointer ID" information, for
applications to keep track of individual fingers as they move up and down.
PointerLocation has been updated to take advantage of this.
- The input system now has logic to generate MotionEvents with the new ID
information, synthesizing an identifier as new points go down and trying to
keep pointer ids consistent across events by looking at the distance between
the last and next set of pointers (see the sketch after this list).
- We now support the new multitouch driver protocol, and will use that instead
of the old one if it is available. We do NOT use any finger id information
coming from the driver, but always synthesize pointer ids in user space.
(This is simply because we don't yet have a driver reporting this information
on which to base an implementation.)
- Increase the maximum number of fingers to 10. This code has only been used
with a driver that reports up to 2, so it is unclear how well more will actually work.
- Oh and the input system can now detect and report physical DPAD devices.
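A simplified sketch of the id-matching idea mentioned above (illustrative, not the real input-system code): each point in the new event inherits the id of the closest unmatched point from the previous event, and points with no match are treated as fingers that just went down:

    #include <vector>

    struct Pointer { int id; float x, y; };

    std::vector<Pointer> assignIds(const std::vector<Pointer>& last,
                                   std::vector<Pointer> next, int& nextIdCounter) {
        std::vector<bool> used(last.size(), false);
        for (Pointer& p : next) {
            int best = -1;
            float bestDist = 0;
            for (size_t i = 0; i < last.size(); i++) {
                if (used[i]) continue;
                float dx = p.x - last[i].x;
                float dy = p.y - last[i].y;
                float d = dx * dx + dy * dy;            // squared distance is enough
                if (best < 0 || d < bestDist) { best = (int)i; bestDist = d; }
            }
            if (best >= 0) { p.id = last[best].id; used[best] = true; }
            else           { p.id = nextIdCounter++; }  // a finger that just went down
        }
        return next;
    }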
If the output stream handle passed in was not the A2DP output stream, the request was ignored instead of being forwarded downstream to the hardware interface.
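A sketch of the corrected dispatch, with simplified stand-in types, assuming the A2DP interface wraps the hardware one:

    #include <string>

    struct AudioHardwareInterface {
        virtual ~AudioHardwareInterface() = default;
        virtual int setParameters(int output, const std::string& keyValuePairs) = 0;
    };

    struct A2dpAudioInterface : AudioHardwareInterface {
        int mA2dpOutput = 1;                          // handle of the A2DP output stream
        AudioHardwareInterface* mHardware = nullptr;  // wrapped hardware interface

        int setParameters(int output, const std::string& keyValuePairs) override {
            if (output == mA2dpOutput) {
                // handle A2DP-specific keys here
                return 0;
            }
            // Not our stream: forward downstream instead of dropping the request.
            return mHardware ? mHardware->setParameters(output, keyValuePairs) : -1;
        }
    };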
There were several issues:
- when a surface was made non-current, the last frame wasn't shown and the buffer could stay locked
- when a surface was made current a second time, it would not dequeue a new buffer
Now, queue/dequeue are done when the surface is made current.
For this to work, a new query() hook had to be added to android_native_window_t; it allows retrieving some attributes of a window (currently only width and height).
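A sketch of the query() extension point described above, using stand-in types that mirror the android_native_window_t idea:

    enum { NATIVE_WINDOW_WIDTH = 0, NATIVE_WINDOW_HEIGHT = 1 };   // illustrative values

    struct native_window_t {
        int width;
        int height;
        int (*query)(const native_window_t* win, int what, int* value);
    };

    // Example query() implementation: EGL can ask the window for its current
    // dimensions when the surface is made current.
    inline int window_query(const native_window_t* win, int what, int* value) {
        switch (what) {
            case NATIVE_WINDOW_WIDTH:  *value = win->width;  return 0;
            case NATIVE_WINDOW_HEIGHT: *value = win->height; return 0;
        }
        return -1;   // unknown attribute
    }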