Jesse Hall 497ba0e085 Don't use implementation-defined format with CPU consumers
If the virtual display surface is being consumed by the CPU, it can't
be allowed to use HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, since there
is no way for the CPU consumer to find out what format gralloc chose.
So for CPU-consumer surfaces, just use the BufferQueue's default
format, which can be set by the consumer.

A better but more invasive change would be to let the consumer require
a certain format (or set of formats?), and disallow the producer from
requesting a different format.

Bug: 11479817
Change-Id: I5b20ee6ac1146550e8799b806e14661d279670c0
2013-11-04 16:43:03 -08:00
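The selection rule described in the commit message can be sketched as follows. This is a minimal illustration, not the actual SurfaceFlinger change: the function name `chooseSurfaceFormat` and the constant values are hypothetical, standing in for the real pixel-format enums and BufferQueue plumbing.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical constants standing in for Android's pixel format enums
// (values illustrative only).
constexpr uint32_t HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED = 0x22;
constexpr uint32_t HAL_PIXEL_FORMAT_RGBA_8888 = 1;

// Sketch of the rule above: a CPU consumer cannot query which concrete
// format gralloc picked, so when the producer asks for
// IMPLEMENTATION_DEFINED, fall back to the BufferQueue's default format
// (which the consumer is able to set) instead.
uint32_t chooseSurfaceFormat(bool cpuConsumer, uint32_t requestedFormat,
                             uint32_t bufferQueueDefaultFormat) {
    if (cpuConsumer && requestedFormat == HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED) {
        return bufferQueueDefaultFormat;
    }
    return requestedFormat;
}
```

Non-CPU consumers are unaffected: they can still receive IMPLEMENTATION_DEFINED and let gralloc pick whatever layout the hardware prefers.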
build                   Add Dalvik heap definition for 7" xhdpi devices.                             2013-05-06 15:06:20 -07:00
cmds                    Fix issue #10422349: Limit/change the battery history data in batterystats   2013-09-04 18:04:14 -07:00
data/etc                Adding PackageManager feature strings for stepcounter and stepdetector.      2013-10-13 14:53:45 -07:00
include                 Native counterpart of new power manager method                               2013-10-25 12:50:04 -07:00
libs                    Native counterpart of new power manager method                               2013-10-25 12:50:04 -07:00
opengl                  EGL: rename CallStack::dump into CallStack::log                              2013-10-25 17:13:57 -07:00
services                Don't use implementation-defined format with CPU consumers                   2013-11-04 16:43:03 -08:00
MODULE_LICENSE_APACHE2
NOTICE                  resolved conflicts for merge of adee6b35 to honeycomb-plus-aosp              2011-01-17 14:17:12 -08:00