Allow RAW_SENSOR to be used for any combination of CPU read/write and
camera read/write, instead of only camera->cpu or cpu->camera.
Change-Id: I032b9531e9069a202c1a3767b77975c808703285
Have gralloc_alloc select the appropriate pixel format given the
endpoints, triggered by the new GRALLOC_EMULATOR_PIXEL_FORMAT_AUTO
format. Currently supports camera->screen and camera->video encoder.
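A rough sketch of the kind of selection gralloc_alloc can now do; the
helper name and the exact formats chosen are illustrative, not the real
implementation:

#include <errno.h>
#include <hardware/gralloc.h>
#include <system/graphics.h>

/* Illustrative helper (hypothetical name): map AUTO to a concrete format
 * based on the producer/consumer usage bits.  The real gralloc_alloc()
 * may pick different formats for these endpoint pairs. */
static int resolve_auto_format(int usage)
{
    if (usage & GRALLOC_USAGE_HW_CAMERA_WRITE) {
        if (usage & GRALLOC_USAGE_HW_VIDEO_ENCODER)
            return HAL_PIXEL_FORMAT_YV12;        /* camera -> video encoder */
        if (usage & GRALLOC_USAGE_HW_TEXTURE)
            return HAL_PIXEL_FORMAT_RGBA_8888;   /* camera -> screen */
    }
    return -EINVAL;  /* endpoint combination not supported yet */
}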
Bug: 6243944
Change-Id: Ib1bf8da8d9184ac99e7f50aad09212c146c32809
A few ANativeWindow methods were updated to take a Sync HAL file
descriptor, and the existing methods were renamed with a _DEPRECATED
suffix. Since the emulator graphics acceleration doesn't yet support
the sync HAL, this change continues calling the deprecated functions
via their new names.
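A minimal sketch of how the calls look afterwards (not the exact
emulator code):

#include <system/window.h>

/* Sketch: until the sync HAL is supported, keep using the old non-fence
 * entry points under their new _DEPRECATED names; no fence fds are
 * produced or consumed here. */
static int present_frame(ANativeWindow* anw)
{
    ANativeWindowBuffer* buf = NULL;
    int err = anw->dequeueBuffer_DEPRECATED(anw, &buf);  /* was dequeueBuffer() */
    if (err != 0)
        return err;
    anw->lockBuffer_DEPRECATED(anw, buf);                /* was lockBuffer()    */
    /* ... render into buf via the host GL ... */
    return anw->queueBuffer_DEPRECATED(anw, buf);        /* was queueBuffer()   */
}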
Change-Id: I5b1760811fafb6723ede887e32e63f94cbaeffe5
Because of the way the SDK and Android system images are branched,
host code that goes into the SDK tools can't live in the same
repository as code that goes into the system image. This change keeps
the emugl host code in sdk.git/emulator/opengl while moving the emugl
system code to development.git/tools/emulator/opengl.
A few changes were made beyond simply cloning the directories:
(a) Makefiles were modified to only build the relevant components. Not
doing so would break the build due to having multiple rule
definitions.
(b) Protocol spec files were moved from the guest encoder directories
to the host decoder directories. The decoder must support older
versions of the protocol, but not newer versions, so it makes
sense to keep the latest version of the protocol spec with the
decoder.
(c) Along with that, the encoder is now built from checked-in
generated encoder source rather than directly from the protocol
spec. The generated code must be updated manually. This makes it
possible to freeze the system encoder version without freezing the
host decoder version, and also makes it very obvious when a
protocol change is happening that will require special
backwards-compatibility support in the decoder/renderer.
(d) Host-only and system-only code were removed from the repository
where they aren't used.
(e) README and DESIGN documents were updated to reflect this split.
No actual source code was changed due to the above.
Change-Id: I2c936101ea0405b372750d36ba0f01e84d719c43
The emulator GLES support has two interfaces: a host shared library
interface used by QEMU, and a protocol between the platform and the
host. The host library interface is not versioned; QEMU and the GLES
renderer must match. The protocol on the other hand must be backwards
compatible: a new GLES renderer must support an older platform image.
Thus for branching purposes it makes more sense to put the GLES
renderer in sdk.git, which is branched along with qemu.git for SDK
releases. Platform images will be built against the protocol version
in the platform branch of sdk.git.
Change-Id: Ie73fce12815c9740e27d0f56caa53c6ceb3d30cc
To enable multi-touch on a tethered device, allow a callback to be
registered with the OpenGL renderer. On every frame, the framebuffer
is read into system memory and provided to the callback, so it can be
mirrored to the device.
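The shape of the hook, with names that are assumptions rather than the
authoritative renderer API declarations:

/* Sketch only -- OnPostFn / setPostCallback are placeholders for whatever
 * the renderer library actually exports.  The renderer invokes the
 * callback after each frame with the freshly read-back framebuffer
 * pixels, which QEMU can then push to the tethered device. */
typedef void (*OnPostFn)(void* context, int width, int height,
                         unsigned char* pixels);

void setPostCallback(OnPostFn onPost, void* onPostContext);

static void onFramePosted(void* context, int width, int height,
                          unsigned char* pixels)
{
    /* forward the frame to the multi-touch mirroring code (hypothetical) */
}

/* QEMU side, at renderer init: setPostCallback(onFramePosted, mtState); */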
This change is co-dependent on Idae3b026d52ed8dd666cbcdc3f3af80175c90ad3
in external/qemu.
Change-Id: I03c49bc55ed9e66ffb59462333181f77e7e46035
All ten libraries can now be built as 64-bit versions named "lib64*" (*),
in addition to the original 32-bit form named "lib*".
Also, dlopen "lib64*.so" instead of "lib*.so" when running in 64-bit.
(*) e.g. on Ubuntu, they can all be built with the following command:
make out/host/linux-x86/lib/lib64OpenglRender.so \
out/host/linux-x86/lib/lib64EGL_translator.so \
out/host/linux-x86/lib/lib64GLES_CM_translator.so \
out/host/linux-x86/lib/lib64GLES_V2_translator.so
Rules to build the static libraries lib64log.a, lib64cutils.a, and
lib64utils.a that they depend on were added in other CLs.
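For reference, a hedged sketch of the dlopen() selection (the real
loader may key off a different condition than pointer width):

#include <dlfcn.h>
#include <stdint.h>

static void* open_render_lib(void)
{
#if UINTPTR_MAX > 0xffffffffu   /* 64-bit build: load the lib64* variant */
    return dlopen("lib64OpenglRender.so", RTLD_NOW);
#else                           /* 32-bit build: original library name   */
    return dlopen("libOpenglRender.so", RTLD_NOW);
#endif
}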
Change-Id: I3afb64de6dda1d55dbd1b4443d2dbc78a683b19f
1. "emugen" generates four *dec.cpp files containing code like this
to decode offset to pointer in stream
tmp = *(T *)(ptr + 8 + 4 + 4 + 4 + *(size_t *)(ptr +8 + 4 + 4));
If *dec.cpp are compiled in 64-bit, size_t is 8-byte and dereferencing of
it is likley to get wild offset for dereferencing of *(T *) to crash the
code. Solution is to define tsize_t for "target size_t" instead
of using host size_t.
2. Cast the pointer to "uintptr_t" instead of "unsigned int" for the 2nd
param of ShareGroup::getGlobalName(NamedObjectType, ObjectLocalName/*64bit*/).
3. Instances of EGLSurface, EGLContext, and EGLImageKHR are used as 32-bit
keys for the std::map< unsigned int, * > SurfacesHndlMap, ContextsHndlMap,
and ImagesHndlMap, respectively. Cast the pointer to uintptr_t and assert
the upper 32 bits are zero before passing it to map::find().
4. Instances of GLeglImageOES are passed to eglAttachEGLImage(), which
expects "unsigned int". Cast to uintptr_t and assert the upper 32 bits
are zero.
5. The 5th param to GLEScontext::setPointer is a GLvoid* but contains a
32-bit offset into the VBO if bufferName exists. Cast it to uintptr_t
and assert the upper 32 bits are zero.
6. Use %zu instead of %d to print size_t.
7. Cast pointers to uintptr_t in many other places.
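Sketch of the item-1 fix; offsets and types are illustrative, not
lifted from the generated *dec.cpp:

#include <stdint.h>

typedef uint32_t tsize_t;  /* "target size_t": size_t as laid out by the 32-bit guest */

/* The offset embedded in the guest stream is 4 bytes wide, so it must be
 * read through tsize_t; reading it through the host size_t (8 bytes in a
 * 64-bit build) picks up 4 extra bytes, yields a wild offset, and the
 * outer dereference then crashes. */
static uint32_t read_indirect_u32(const unsigned char* ptr)
{
    return *(const uint32_t*)(ptr + 8 + 4 + 4 + 4 +
                              *(const tsize_t*)(ptr + 8 + 4 + 4));
}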
Change-Id: Iba6e5bda08c43376db5b011e9d781481ee1f5a12
On Macs running OS X 10.6 and 10.7 with Intel HD Graphics 3000, some
screens or parts of the screen are displayed upside down. The exact
conditions/sequence that triggers this isn't known yet; I haven't
been able to reproduce it in a standalone test. This also means I
don't know whether it is a driver bug, or a bug in the OpenglRender or
Translator code that just happens to work elsewhere.
Thanks to zhiyuan.li@intel.com for a patch this change is based on.
Change-Id: I04823773818d3b587a6951be48e70b03804b33d0
Whenever a surface was attached to a context, it was dequeuing a new
buffer, and enqueuing it when detached. This has the effect of doing a
SwapBuffers on every detach/attach cycle, which is just wrong and
occasionally caused visible glitches (e.g. animations going backwards
for one frame). It also broke some SurfaceTexture tests which
(validly) depend on specific buffer production/consumption counts.
Change-Id: Ibd4761e8842871b79fd9edf52272900193cb672d
Pass the swap interval from eglSwapInterval to the native window so it
can enable/disable SurfaceTexture's async mode. Fixes the deadlock in
SurfaceTextureGLToGLTest.EglDestroySurfaceUnrefsBuffers.
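A simplified sketch of the forwarding (the real eglSwapInterval also
validates the interval against the config's swap-interval range):

#include <system/window.h>

/* Sketch: hand the interval to the native window; SurfaceTexture treats
 * interval 0 as async mode and >= 1 as synchronous mode (assumption
 * based on the behavior this fix relies on). */
static int forward_swap_interval(ANativeWindow* win, int interval)
{
    return win->setSwapInterval(win, interval);
}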
Change-Id: I19bf69247341f5617223722df63d6c7f8cf389c6
* EGLImageTargetRenderbufferStorageOES was incorrectly accepting
TEXTURE_EXTERNAL_OES as a target. Revert that; the host GL will
correctly reject it with INVALID_ENUM.
* Handle the REQUIRED_TEXTURE_IMAGE_UNITS_OES texparameter query.
* Validate texture parameters set on TEXTURE_EXTERNAL textures;
otherwise invalid parameters would work on the emulator but not on a
real device.
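A sketch of the added validation (parameter coverage in the real
translator may differ); OES_EGL_image_external restricts
TEXTURE_EXTERNAL_OES textures to GL_CLAMP_TO_EDGE wrapping and
non-mipmapped filtering:

#include <GLES2/gl2.h>
#include <stdbool.h>

/* Illustrative check for glTexParameter* on TEXTURE_EXTERNAL_OES targets. */
static bool isValidExternalTexParam(GLenum pname, GLint param)
{
    switch (pname) {
    case GL_TEXTURE_MIN_FILTER:
        return param == GL_NEAREST || param == GL_LINEAR;  /* no mipmap filters */
    case GL_TEXTURE_WRAP_S:
    case GL_TEXTURE_WRAP_T:
        return param == GL_CLAMP_TO_EDGE;
    default:
        return true;   /* other pnames are validated elsewhere */
    }
}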
Change-Id: I49a088608d58a9822f33e5916bd354eee3709127
The gralloc API assumes system-wide reference counting of gralloc
buffers. The host-GL accelerated gralloc maps buffers to host-side
ColorBuffer objects, but was destroying them unconditionally in
gralloc_free(), ignoring any additional references from
gralloc_register_buffer().
This affected the SurfaceTexture gralloc buffers used by the
Browser/WebView. For some reason these buffers are actually allocated
by SurfaceFlinger and passed back to the WebView through Binder. But
since SurfaceFlinger doesn't actually need the buffer for anything,
sometime after the WebView has called gralloc_register_buffer(),
SurfaceFlinger calls gralloc_free() on it. This caused the host
ColorBuffer to be destroyed long before the WebView was done using it.
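Sketch of the fix, with hypothetical helper names standing in for the
renderer's actual ColorBuffer refcounting calls:

#include <hardware/gralloc.h>

/* Hypothetical helpers: take/drop a reference on the host ColorBuffer
 * backing this handle; the host destroys the ColorBuffer only when the
 * last reference is dropped.  gralloc_unregister_buffer() would drop
 * the register reference symmetrically. */
void host_colorbuffer_ref(buffer_handle_t handle);
void host_colorbuffer_unref(buffer_handle_t handle);

static int gralloc_register_buffer(gralloc_module_t const* module,
                                   buffer_handle_t handle)
{
    host_colorbuffer_ref(handle);    /* e.g. the WebView's mapping */
    return 0;
}

static int gralloc_free(alloc_device_t* dev, buffer_handle_t handle)
{
    host_colorbuffer_unref(handle);  /* SurfaceFlinger's allocation reference;
                                        no longer destroys unconditionally */
    return 0;
}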
Change-Id: I33dbee887a48a6907041cf19e9f38a1f6c983eff
Copy changes faaf1553cf and
f37a7ed6c5 from the GLESv1 translator to
the GLESv2 translator. After this, both translators use the same logic
for glEGLImageTargetTexture2DOES().
Change-Id: I0a95bf2301df7b7428abc593f38170edf4cbda30
An off-by-two bug when removing textures from the tracking array could
overwrite malloc's memory-chunk metadata, usually resulting in a
heap-corruption abort on a later malloc/realloc/free.
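For illustration only (not the translator's actual data structure):
shifting the tail of a malloc'd array with a bound that overshoots by
two touches memory past the allocation and clobbers the allocator's
chunk header for the neighboring block.

#include <stddef.h>

/* Correct removal: the shift must stop before the last occupied slot. */
static void remove_at(unsigned* arr, size_t* count, size_t idx)
{
    for (size_t i = idx; i + 1 < *count; ++i)  /* buggy variant: i <= *count */
        arr[i] = arr[i + 1];                   /* ...reads/writes past the end */
    (*count)--;
}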
Bug: 5951738
Change-Id: I11056bb62883373c2a3403f53899347ff8cdabf2
The data pointer argument to glBufferData can be NULL; this
[re]allocates the buffer while leaving the contents undefined.
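Standard GL usage this fix enables on the emulator, e.g. buffer
orphaning before refilling:

#include <GLES2/gl2.h>

static void orphan_and_fill(GLuint vbo, GLsizeiptr size, const void* data)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, size, NULL, GL_DYNAMIC_DRAW);  /* allocate only */
    glBufferSubData(GL_ARRAY_BUFFER, 0, size, data);             /* then write contents */
}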
Bug: 5833436
Change-Id: Ia1ddf62e2cd2c59d3d631e01d23d7c557ca5a52e
* Disable verbose debug spam.
* Add missing GL enum to utility function. The default case was
returning the correct size, so this doesn't fix any bugs, just
removes some logcat spam.
* Comment and whitespace corrections.
Change-Id: I83fb8644331ae1072d6a8dae9c041da92073089f
The code that creates the GL-accelerated screen view wasn't converting
the upper-left-relative coordinates used within the emulator to the
lower-left coordinates used by the Cocoa APIs on OS X. Since most
skins have the screen view centered vertically, this often just
happened to work.
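The missing conversion, shown as a simplified C helper (the actual
change lives in the Objective-C/Cocoa skin code): Cocoa measures a
subview's frame from the parent's lower-left corner, while the skin
gives coordinates from the top-left.

static int cocoa_y(int parentHeight, int topY, int viewHeight)
{
    return parentHeight - (topY + viewHeight);
}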
Bug: 5782118
Change-Id: I2f96ee181e850df5676d10a82d86c94421149b40
The emulator EGL implementation tried to hold its own reference to
buffers acquired/released with dequeueBuffer/queueBuffer, but was
missing an incRef after dequeueBuffer during swapBuffers.
Since the native window holds a reference to the buffer between
dequeueBuffer and queueBuffer, the EGL reference isn't needed anyway.
Change-Id: I95e4f9f4faf59198f99939cdca6603fe176c56bc
The glBufferData, glBufferSubData, and glDeleteBuffers entry points
had interception routines in GL2Encoder that cache the data, but they
weren't hooked up. So when glDrawElements tried to retrieve the cached
data, it wasn't there.
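For context, a simplified illustration (names are not the actual
GL2Encoder members) of why the shadow copy matters: with an element
array buffer bound, the index data exists only in the encoder's cache,
and glDrawElements scans it to learn how many client-side vertex array
elements must be streamed to the host.

#include <GLES2/gl2.h>

/* Scan cached GL_UNSIGNED_SHORT indices for the highest vertex index;
 * attributes [0, max] then get sent alongside the draw call. */
static GLushort max_index_u16(const GLushort* indices, GLsizei count)
{
    GLushort m = 0;
    for (GLsizei i = 0; i < count; ++i)
        if (indices[i] > m)
            m = indices[i];
    return m;
}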
Change-Id: Iaed11fccaefab3186485be53a0f15c8ca0a255f9