When disconnecting from a BufferQueue, we now drain the queue
except for the head. In the screenshot case this means we won't
have to block, but we might not have a buffer to show; this
will appear as an error in the log.
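A minimal sketch of the drain-except-head idea; the container and
BufferItem below are illustrative stand-ins, not BufferQueue's actual
members:

    #include <deque>

    // Illustrative stand-in for a queued frame; the real BufferItem
    // carries a slot index, fence, crop, and so on.
    struct BufferItem {};

    // Drop every pending buffer except the head, so a consumer that is
    // mid-acquire (e.g. a screenshot) can still get one frame without
    // the producer blocking. If even the head is gone, the consumer
    // logs an error instead of hanging.
    void drainQueueExceptHead(std::deque<BufferItem>& queue) {
        while (queue.size() > 1) {
            queue.pop_back();
        }
    }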
Bug: 8362363
Change-Id: If80989aac3c917beea2ebddf3cbb502849d394da
Use prctl(PR_SET_PDEATHSIG, SIGKILL) in each forked child to ensure
it dies when dumpstate dies.
This is important for two cases:
- dumpstate runs a timer for each child process. On expiration, it
sends SIGTERM to kill the process. Sometimes SIGTERM isn't enough
to kill a hung process, so the child lives on.
- When dumpstate is killed by the user before completing, outstanding
children continue to run and generate output.
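The fork-side pattern is roughly the following sketch; error handling
and the real dumpstate plumbing are omitted, and the getppid() re-check
is a standard guard for the prctl race rather than something this
change necessarily adds:

    #include <signal.h>
    #include <stdlib.h>
    #include <sys/prctl.h>
    #include <unistd.h>

    pid_t fork_and_exec(const char* path, char* const argv[]) {
        pid_t pid = fork();
        if (pid == 0) {
            // Child: have the kernel SIGKILL us when the parent
            // (dumpstate) dies, so we cannot outlive it.
            if (prctl(PR_SET_PDEATHSIG, SIGKILL) == -1) _exit(EXIT_FAILURE);
            // Close the race where the parent died before the prctl:
            // we would already have been reparented (to init).
            if (getppid() == 1) _exit(EXIT_FAILURE);
            execvp(path, argv);
            _exit(EXIT_FAILURE);  // exec failed
        }
        return pid;  // parent: track and kill the child as before
    }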
Change-Id: I96e0dc918c26d56c9fee53611980ba2edd238712
Older drivers that do implicit synchronization need this,
or they could deadlock.
Bug: 8341885
Change-Id: Icd980a6be16071678d6151e34725b3c1c547d7ee
We check that the order in which we destroy the GLConsumer,
relative to releasing the corresponding EGLSurface via
eglMakeCurrent (making it un-current), doesn't leak a buffer.
On at least two devices this test doesn't pass.
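The shape of one of the two orderings under test, sketched; the
display/context setup and the leak check come from the test fixture
and are assumed here:

    #include <EGL/egl.h>
    #include <gui/GLConsumer.h>

    using android::GLConsumer;
    using android::sp;

    // Order A: un-current and destroy the EGLSurface first, then drop
    // the GLConsumer. Order B swaps the two halves. Either way, every
    // buffer should end up released; a buffer still allocated
    // afterwards is the leak under test.
    void teardownOrderA(EGLDisplay dpy, EGLSurface surface,
                        sp<GLConsumer>& consumer) {
        eglMakeCurrent(dpy, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
        eglDestroySurface(dpy, surface);
        consumer.clear();  // last reference; frees the consumer side
    }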
Change-Id: I63ab83951b4b0a977f38571158f948cbd9dc7cec
When a binder service's main thread joins the thread pool,
it retains its name (whatever the exec name was), which is
very confusing in systrace.
We now rename that thread just like its friends in the
thread pool.
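The rename is a one-liner at pool-join time; this sketch uses bionic's
pthread_setname_np, and the "Binder_%X" format is illustrative of how
the pool threads are named:

    #include <pthread.h>
    #include <stdint.h>
    #include <stdio.h>

    // Rename the joining main thread to match the pool's naming
    // scheme, so systrace shows it alongside the other binder threads.
    void renameForBinderPool(int32_t seq) {
        char name[16];  // kernel limit: 15 chars + NUL
        snprintf(name, sizeof(name), "Binder_%X", seq);
        pthread_setname_np(pthread_self(), name);
    }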
Change-Id: Ibb3b6ff07304b247cfc6fb1694e72350c579513e
This isn't really right either, but it avoids having an extra buffer
that the consumer has to drain and might not be expecting.
To be correct, disconnecting a surface from a context should retain
the current buffer and continue using it when reconnected. The buffer
should only be canceled when the surface is destroyed. That will wait
for a later change.
Bug: 8320762
Change-Id: I5efa39c741193ca4f5612ea9de001ccbb683b345
- this gives us access to RefBase's refcounting debugging
- it doesn't cost much because GraphicBuffer already has a vtable
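What this buys, sketched; TrackedBuffer is a stand-in class, and
DEBUG_REFS is RefBase's compile-time reference-tracking switch:

    #include <utils/RefBase.h>

    using android::RefBase;
    using android::sp;

    // Deriving from RefBase gives sp<> refcounting plus the DEBUG_REFS
    // machinery, which can log every incStrong/decStrong with a stack
    // trace. RefBase brings a vtable, but a class that already pays
    // for one (like GraphicBuffer) gets this nearly for free.
    class TrackedBuffer : public RefBase {};

    void example() {
        sp<TrackedBuffer> buf = new TrackedBuffer();
        // incStrong/decStrong happen implicitly through sp<>.
    }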
Change-Id: I7f696e421fea14b14bfaeb83880689b83e96af4d
* changes:
Get rid of LayerBase.
Make LayerDim a regular Layer instead of a LayerBase
fold LayerBaseClient into LayerBase
Remove support for ScreenshotLayer
The functionality of LayerBase and Layer is folded
into Layer. There was no longer a need for this
abstraction.
Change-Id: I66511c08cc3d89009ba4deabf47e26cd4cfeaefb
Add an EGL extension to set a timestamp on a surface.
Also, fix the JNI encoding of "long" in glgen.
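If this is the EGL_ANDROID_presentation_time extension (which matches
the description), usage looks roughly like the sketch below; the
function is loaded through the usual eglGetProcAddress dance:

    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    typedef EGLBoolean (*PresentationTimeFn)(EGLDisplay, EGLSurface,
                                             EGLnsecsANDROID);

    // Queue a timestamp for a frame; it applies to the next
    // eglSwapBuffers on this surface.
    void setFrameTimestamp(EGLDisplay dpy, EGLSurface surf,
                           EGLnsecsANDROID when) {
        static PresentationTimeFn setTime = (PresentationTimeFn)
                eglGetProcAddress("eglPresentationTimeANDROID");
        if (setTime != nullptr) {
            setTime(dpy, surf, when);
        }
    }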
Bug: 8191230
Change-Id: I38b7334bade3f8ff02bffe600bb74469ef22c164
* changes:
implement display projection clipping in h/w composer
refactor how the crop region for hwc is calculated/set
apply the projection's viewport to the visible region passed to hwc
set correct crop rectangle in LayerBase::setCrop
Writing a NULL Surface was being read back as a non-NULL Surface
with a NULL mGraphicBufferProducer. Before the
SurfaceTextureClient -> Surface refactoring, you'd get a NULL
Surface, and some code relies on that.
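The usual fix for this class of bug is an explicit presence flag ahead
of the flattened object; a generic sketch of the pattern (not the
actual Surface parceling code):

    #include <binder/IBinder.h>
    #include <binder/Parcel.h>

    using android::IBinder;
    using android::OK;
    using android::Parcel;
    using android::sp;
    using android::status_t;

    // A presence flag lets the reader hand back a true null instead
    // of a hollow object wrapping a NULL producer.
    status_t writeNullable(Parcel& parcel, const sp<IBinder>& binder) {
        parcel.writeInt32(binder != nullptr ? 1 : 0);
        if (binder != nullptr) parcel.writeStrongBinder(binder);
        return OK;
    }

    sp<IBinder> readNullable(const Parcel& parcel) {
        if (parcel.readInt32() == 0) return nullptr;  // NULL written
        return parcel.readStrongBinder();
    }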
Bug: 8291161
Change-Id: I477bfe8882693e53a5f604a3d2c9e3cfe24473b4
- SurfaceFlinger now supports taking a screenshot
directly into an IGraphicBufferProducer
- reimplement the IMemoryHeap screenshot on top
of the above
- reimplement LayerScreenshot such that its
BufferQueue is directly used as the destination
of the screenshot. LayerScreenshot is now a thin
wrapper around Layer.
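Calling the new path looks roughly like the sketch below; the
captureScreen parameter list varied across releases, so treat the
exact signature as an assumption:

    #include <gui/IGraphicBufferProducer.h>
    #include <gui/ISurfaceComposer.h>
    #include <gui/SurfaceComposerClient.h>

    using namespace android;

    // Feed the screenshot straight into a producer we own, e.g. the
    // producer side of a BufferQueue whose consumer displays or
    // encodes the frame.
    status_t screenshotToProducer(const sp<IBinder>& display,
                                  const sp<IGraphicBufferProducer>& producer) {
        sp<ISurfaceComposer> composer = ComposerService::getComposerService();
        return composer->captureScreen(display, producer,
                                       0 /* reqWidth: native size */,
                                       0 /* reqHeight */,
                                       0 /* minLayerZ */,
                                       ~0U /* maxLayerZ */);
    }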
Bug: 6940974
Change-Id: I69a2096b44b91acbb99eba16f83a9c78d94e0d10
Temporary change to fix the weekend build, until we get the Nvidia code drop.
This reverts commit 9a867a8798
DO NOT MERGE
Change-Id: I7b5dbc4db46ef3d97dc8598057d5487d6971178b
When a display is added, initialize it to use an empty layer stack, so
if it is somehow visible it will show black. It will be assigned the
real layer stack -- along with a projection and other properties -- by
window manager soon. Normally a display remains blanked until window
manager has decided what to show on it, but for HDMI connected at boot
that isn't currently the case.
Bug: 7258935
Change-Id: Ic9bb25f7a9b8d9d3772b097ab1d6fa03bc8780a1