Make sure not to use GL_TEXTURE_EXTERNAL when it's not supported
by the GL. The error was harmless, but annoying.
Change-Id: I571a9a9b05d35da51420950a6a6e95629067efd0
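An illustrative sketch of the check described above (not the actual
SurfaceFlinger code): query the GL extension string and only select
the external texture target when GL_OES_EGL_image_external is
advertised. The helper names below are assumptions.

    // Hypothetical helpers; assumes a current GLES context.
    // GLES2/gl2ext.h provides GL_TEXTURE_EXTERNAL_OES.
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>
    #include <cstring>

    static bool hasGLExtension(const char* name) {
        const char* exts =
                reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        // naive substring match, good enough for a sketch
        return exts != nullptr && std::strstr(exts, name) != nullptr;
    }

    static GLenum pickTextureTarget() {
        // Only use the external target when the extension is advertised;
        // otherwise fall back to GL_TEXTURE_2D and avoid the GL error.
        return hasGLExtension("GL_OES_EGL_image_external")
                ? GL_TEXTURE_EXTERNAL_OES
                : GL_TEXTURE_2D;
    }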
Simplified things a lot. The biggest change is that the concept
of "ClientID" is now gone; instead, we simply use references.
Change-Id: Icbc57f80865884aa5f35ad0d0a0db26f19f9f7ce
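An illustrative before/after sketch of the ClientID-to-references
idea above, using std::shared_ptr as a stand-in for the platform's
strong references; all names are hypothetical.

    #include <cstdint>
    #include <map>
    #include <memory>

    struct Client { /* per-client state */ };

    // Before: an integer ClientID handed to callers and resolved
    // through a table on every use (which can dangle or be recycled).
    std::map<int32_t, std::shared_ptr<Client>> gClientsById;
    std::shared_ptr<Client> lookupClient(int32_t clientId) {
        auto it = gClientsById.find(clientId);
        return it != gClientsById.end() ? it->second : nullptr;
    }

    // After: callers simply hold a reference to the Client;
    // no ID and no lookup table.
    void useClient(const std::shared_ptr<Client>& client) {
        // operate on *client directly
    }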
This change introduces R/W locks in the right places.
On the server side, it guarantees that setBufferCount()
is synchronized with "retire" and "resize".
On the client side, it guarantees that setBufferCount()
is synchronized with "dequeue", "lockbuffer", and "queue".
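A minimal sketch of this locking scheme, using std::shared_mutex as
a stand-in for the platform's RWLock; the class and method names are
illustrative, not the real API. The point is that the writer
(setBufferCount) excludes all per-frame operations, while those
operations do not block each other.

    #include <shared_mutex>

    class BufferQueueState {
        std::shared_mutex mLock;
        int mBufferCount = 2;
    public:
        // setBufferCount() changes the shape of the shared state, so it
        // takes the lock exclusively: no dequeue/queue/retire can run
        // concurrently with it.
        void setBufferCount(int count) {
            std::unique_lock<std::shared_mutex> lock(mLock);
            mBufferCount = count;
            // ... retire/resize bookkeeping would go here ...
        }

        // Per-frame operations only need a consistent view of the
        // state, so they take the lock shared.
        int dequeue() {
            std::shared_lock<std::shared_mutex> lock(mLock);
            // ... pick a free slot among mBufferCount buffers ...
            return 0;
        }
        void queue(int slot) {
            std::shared_lock<std::shared_mutex> lock(mLock);
            // ... mark 'slot' as queued for composition ...
            (void)slot;
        }
    };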
Get rid of the "fake RTTI" code and use polymorphism instead.
Also simplify how we log SF's state (using polymorphism).
Change-Id: I2bae7c98de4dd207a3e2b00083fa3fde7c467922
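An illustrative sketch of what the logging side looks like with
polymorphism: each layer type overrides a virtual dump() instead of
being dispatched through a hand-rolled type tag and a central
switch. Class and method names are assumptions.

    #include <string>

    class LayerBase {
    public:
        virtual ~LayerBase() = default;
        virtual const char* getTypeName() const { return "LayerBase"; }
        // Each subclass appends its own state; no switch on a type id.
        virtual void dump(std::string& out) const {
            out += getTypeName();
            out += "\n";
        }
    };

    class LayerDim : public LayerBase {
    public:
        const char* getTypeName() const override { return "LayerDim"; }
        void dump(std::string& out) const override {
            LayerBase::dump(out);
            out += "  alpha-blended dim layer\n";
        }
    };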
Since we're using the GPU for composition, don't use a texture for
dimming; instead, simply use an alpha-blended quad.
Also work around what looks like a GL driver bug by calling
glFinish() before glReadPixels().
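A minimal GLES 1.x sketch of the dimming approach: draw a black,
alpha-blended quad instead of a textured one. Coordinates and state
handling are simplified, and the function name is illustrative.

    #include <GLES/gl.h>

    void drawDimQuad(GLfloat alpha, GLfloat w, GLfloat h) {
        const GLfloat vertices[] = { 0, 0,  w, 0,  w, h,  0, h };
        glDisable(GL_TEXTURE_2D);          // no texture needed to dim
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glColor4f(0, 0, 0, alpha);         // black quad, alpha = dim level
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, vertices);
        glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
        glDisableClientState(GL_VERTEX_ARRAY);
        glDisable(GL_BLEND);
    }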
A window is created and the browser is about to render into it for
the very first time; at that point it does an IPC to SF to request
a new buffer. Meanwhile, the window manager removes that window
from the list, and the shared memory block it uses is marked as
invalid. However, at that point another window is created and is
given the same index (which was just freed) but a different
identity, and it resets the "invalid" bit in the shared block. When
we go back to the buffer allocation code, we're stuck because the
surface we're allocating for is gone, and we don't detect that it's
invalid because the invalid bit has been reset.
It is not sufficient to check the invalid bit; I should also check
that the identities match.
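A hypothetical sketch of the resulting check: the surface is only
treated as alive if the invalid bit is clear and the identity
recorded at creation time still matches the slot. The struct and
field names are assumptions, not the real shared-block layout.

    #include <cstdint>

    struct SharedSurfaceSlot {
        uint32_t flags;      // includes an "invalid" bit
        uint32_t identity;   // unique id of the window using this slot
    };

    static constexpr uint32_t eInvalidBit = 0x1;

    bool surfaceStillValid(const SharedSurfaceSlot& slot,
                           uint32_t expectedIdentity) {
        if (slot.flags & eInvalidBit)
            return false;    // explicitly marked invalid
        if (slot.identity != expectedIdentity)
            return false;    // the slot was recycled for another window
        return true;
    }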