opengl-es

Canvas is drawing too slowly

允我心安 submitted on 2019-12-11 08:01:27
Question: I am working on an app that allows stepping through video frames (bitmaps). It can also play the frames back, one after another, as though you were watching a video composed of the frames. I have one ImageView that shows one bitmap at a time. The problem I am having is that drawing the image to the ImageView (just a call to super.onDraw()) takes roughly 30 ms. Since the frames need to play back at their original framerate, this is a problem.
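
One common way around per-frame ImageView invalidation is to draw the bitmaps straight to a SurfaceView from a background thread. Below is a minimal sketch of that idea, assuming the frames are already decoded into a List&lt;Bitmap&gt;; the class and field names are illustrative, not taken from the question.

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.SurfaceHolder;
import java.util.List;

// Plays decoded frames on a SurfaceView's surface without going through ImageView.
public class FramePlayer implements Runnable {
    private final SurfaceHolder holder;   // obtained from SurfaceView.getHolder()
    private final List<Bitmap> frames;    // pre-decoded frames (assumption)
    private final long frameDelayMs;      // e.g. 33 ms for ~30 fps
    private volatile boolean running = true;

    public FramePlayer(SurfaceHolder holder, List<Bitmap> frames, long frameDelayMs) {
        this.holder = holder;
        this.frames = frames;
        this.frameDelayMs = frameDelayMs;
    }

    @Override
    public void run() {
        for (int i = 0; running && i < frames.size(); i++) {
            long start = System.currentTimeMillis();
            Canvas canvas = holder.lockCanvas();
            if (canvas == null) break;                 // surface no longer valid
            try {
                canvas.drawBitmap(frames.get(i), 0, 0, null);
            } finally {
                holder.unlockCanvasAndPost(canvas);
            }
            // Keep the original framerate by sleeping off whatever time is left.
            long elapsed = System.currentTimeMillis() - start;
            try {
                Thread.sleep(Math.max(0, frameDelayMs - elapsed));
            } catch (InterruptedException e) {
                running = false;
            }
        }
    }

    public void stop() {
        running = false;
    }
}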

Why does dispatch_semaphore_wait() return YES all the time even when I'm not scrolling?

眉间皱痕 submitted on 2019-12-11 07:55:57
Question: Brad Larson provided a solution for the CADisplayLink freeze issue that occurs while scroll views are scrolling. My OpenGL ES draw method is called by a CADisplayLink, and I tried Brad's technique but can't make it work. The core problem is that my OpenGL ES view is hosted by a UIScrollView, and when the UIScrollView scrolls, the CADisplayLink stops firing. The technique Brad described is supposed to let the CADisplayLink keep firing even during scrolling (by adding it to NSRunLoopCommonModes

Conflict drawing on OpenGLES on iOS

三世轮回 submitted on 2019-12-11 07:54:05
Question: I'm writing an app that allows free-style drawing (with a finger) and drawing an image. I posted one of my problems at "OpenGL ES glFragColor depend on if condition on Fragment shader in iOS". Thanks to the many suggestions, I solved that. Now I have another issue. I have two programs, with the ids PROGRAM_POINT (free-style drawing) and PROGRAM_POINT_0 (image drawing). Both are initialized. Each program has its own pair of shader files; PROGRAM_POINT uses point.vsh and point.fsh. For
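
When two programs are used in the same frame, uniform values and attribute locations belong to a specific program, so each draw call needs glUseProgram followed by that program's own uniform uploads and attribute bindings; state set up while the other program was current does not carry over. A minimal sketch of the per-frame switching, written with Android's GLES20 bindings purely for brevity (the handle and location fields are illustrative); the same GL calls exist in the iOS C API.

import android.opengl.GLES20;
import java.nio.FloatBuffer;

// Per-frame drawing with two programs: each draw re-selects its program and
// re-applies that program's own uniforms and attribute bindings.
public class TwoProgramRenderer {
    int programPoint, programPoint0;            // linked program handles (created elsewhere)
    int pointMvpLoc, pointPosLoc;               // locations queried from programPoint
    int imageMvpLoc, imagePosLoc, imageTexLoc;  // locations queried from programPoint0
    float[] mvpMatrix = new float[16];
    FloatBuffer strokeVertices, quadVertices;   // filled elsewhere
    int strokeVertexCount, imageTextureId;

    void drawFrame() {
        // 1) Free-style strokes with PROGRAM_POINT
        GLES20.glUseProgram(programPoint);
        GLES20.glUniformMatrix4fv(pointMvpLoc, 1, false, mvpMatrix, 0);
        GLES20.glEnableVertexAttribArray(pointPosLoc);
        GLES20.glVertexAttribPointer(pointPosLoc, 2, GLES20.GL_FLOAT, false, 0, strokeVertices);
        GLES20.glDrawArrays(GLES20.GL_POINTS, 0, strokeVertexCount);
        GLES20.glDisableVertexAttribArray(pointPosLoc);

        // 2) The image with PROGRAM_POINT_0, using its own uniforms and texture binding
        GLES20.glUseProgram(programPoint0);
        GLES20.glUniformMatrix4fv(imageMvpLoc, 1, false, mvpMatrix, 0);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, imageTextureId);
        GLES20.glUniform1i(imageTexLoc, 0);
        GLES20.glEnableVertexAttribArray(imagePosLoc);
        GLES20.glVertexAttribPointer(imagePosLoc, 2, GLES20.GL_FLOAT, false, 0, quadVertices);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glDisableVertexAttribArray(imagePosLoc);
    }
}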

Possible to change the alpha value of certain pixels on iPhone?

[亡魂溺海] submitted on 2019-12-11 07:47:51
Question: Is it possible to change just a portion of a Sprite's alpha in response to user interaction? A good example of what I mean is iFog or iSteam, where the user can wipe "steam" off the iPhone's screen. Swapping images out wouldn't be feasible because of the sheer number of places the user could touch and move... For example, say you have a simple app with a brick wall in the background that has graffiti on it, so there'd be two sprites, one of the brick wall, then one of the
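
One common approach is to keep a second, grayscale "mask" texture, paint into it wherever the finger moves (for example by rendering brush quads into an FBO-attached texture), and let a fragment shader multiply the sprite's alpha by the mask. The shader below is a minimal sketch of that idea, shown as a GLSL string constant the way it would typically be embedded in host code; the uniform names and the FBO-based mask update are assumptions, not something from the question.

// Fragment shader (ES 2.0) that erases the "steam" where the mask has been painted.
// u_steam is the steam sprite's texture; u_mask is a single-channel texture the app
// paints white into around each touch point. Both names are illustrative.
static final String ERASE_FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "uniform sampler2D u_steam;\n" +
        "uniform sampler2D u_mask;\n" +
        "varying vec2 v_texCoord;\n" +
        "void main() {\n" +
        "    vec4 steam = texture2D(u_steam, v_texCoord);\n" +
        "    float wiped = texture2D(u_mask, v_texCoord).r;\n" +
        "    // wiped == 1.0 where the user has rubbed; fade the alpha there\n" +
        "    gl_FragColor = vec4(steam.rgb, steam.a * (1.0 - wiped));\n" +
        "}";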

Adding transparency to a video from black and white (and gray) alpha information video images

坚强是说给别人听的谎言 submitted on 2019-12-11 07:27:44
Question: I'd like to create a video with transparency (and semi-transparency) in Android. Each image of the source video is split into two parts: the top one with the color information (it looks like the normal video image), and the bottom one with the alpha information (the same shapes, but only in black, white, and gray, where black means transparent). This is the solution for iOS: https://medium.com/@quentinfasquel/ios-transparent-video-with-coreimage-52cfb2544d54 What would be the best way to do this in Android?
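
The usual OpenGL ES route on Android is to render the video into a SurfaceTexture (an external OES texture) and have a fragment shader read the color from one half of each stacked frame and the alpha matte from the other. A minimal sketch of such a shader, written as a Java string constant; the uniform and varying names are illustrative.

// ES 2.0 fragment shader for "stacked" transparent video rendered from a SurfaceTexture:
// the decoded frame holds the color image in one half and the grayscale alpha matte in
// the other. v_texCoord spans the visible (color) image; depending on the
// texture-coordinate origin you may have to swap which half supplies color and which alpha.
static final String ALPHA_VIDEO_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "uniform samplerExternalOES u_videoFrame;\n" +
        "varying vec2 v_texCoord;\n" +
        "void main() {\n" +
        "    vec2 colorCoord = vec2(v_texCoord.x, v_texCoord.y * 0.5);\n" +
        "    vec2 alphaCoord = vec2(v_texCoord.x, 0.5 + v_texCoord.y * 0.5);\n" +
        "    vec3 rgb = texture2D(u_videoFrame, colorCoord).rgb;\n" +
        "    float a  = texture2D(u_videoFrame, alphaCoord).r;\n" +
        "    gl_FragColor = vec4(rgb, a);\n" +
        "}";

Blending has to be enabled when the quad is drawn (glEnable(GL_BLEND) with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)), otherwise the computed alpha has no visible effect.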

Compressed texture batching in OpenGL

泪湿孤枕 submitted on 2019-12-11 07:14:02
Question: I'm trying to create an atlas of compressed textures but I can't seem to get it working. Here is a code snippet:
void Texture::addImageToAtlas(ImageProperties* imageProperties) {
    generateTexture(); // delete and regenerate an empty texture
    bindTexture(); // bind it
    atlasProperties.push_back(imageProperties);
    width = height = 0;
    for (int i=0; i < atlasProperties.size(); i++) {
        width += atlasProperties[i]->width;
        height = atlasProperties[i]->height;
    }
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
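
For compressed formats the usual pattern is to allocate the full atlas once with glCompressedTexImage2D and then place each pre-compressed image with glCompressedTexSubImage2D at block-aligned offsets (multiples of 4 texels for ETC/S3TC-style 4x4 block formats). Whether sub-image updates are allowed depends on the format: plain ETC1 on ES 2.0 commonly rejects them, while S3TC and ETC2 generally accept them. A rough sketch of the two steps, using Android's GLES20 bindings for brevity (field names and block sizes are illustrative assumptions; the same calls exist in desktop GL and the iOS C API):

import android.opengl.GLES20;
import java.nio.ByteBuffer;

// Sketch of placing pre-compressed images into one atlas texture.
// Assumes a 4x4-block compressed format that permits sub-image updates.
public class CompressedAtlas {
    static final int BLOCK = 4;            // block size of the compressed format
    static final int BYTES_PER_BLOCK = 8;  // 8 for ETC1/ETC2 RGB and DXT1

    static int blockAligned(int texels) {
        return ((texels + BLOCK - 1) / BLOCK) * BLOCK;
    }

    // Allocate the atlas once, with zero-filled compressed data of the full size.
    static void allocateAtlas(int format, int atlasW, int atlasH) {
        int imageSize = (blockAligned(atlasW) / BLOCK)
                      * (blockAligned(atlasH) / BLOCK) * BYTES_PER_BLOCK;
        ByteBuffer empty = ByteBuffer.allocateDirect(imageSize);  // zero-initialized
        GLES20.glCompressedTexImage2D(GLES20.GL_TEXTURE_2D, 0, format,
                atlasW, atlasH, 0, imageSize, empty);
    }

    // Copy one compressed image into the atlas; the offsets must be multiples of the block size.
    static void addImage(int format, int xOffset, int yOffset,
                         int imgW, int imgH, ByteBuffer compressedData) {
        GLES20.glCompressedTexSubImage2D(GLES20.GL_TEXTURE_2D, 0,
                xOffset, yOffset, imgW, imgH, format,
                compressedData.remaining(), compressedData);
    }
}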

Create new texture from multitexture

喜夏-厌秋 submitted on 2019-12-11 07:13:09
Question: I'm struggling to understand how to create a new texture (with a new GL handle, if possible) after merging two others, in the same spirit as multitexturing. I'm doing this because I'm using ETC1 compressed textures, which don't support an alpha channel, so I want to load the pre-generated alpha channel from another file and "merge" the two after uncompressing. I know that multitexturing does this, but it implies rendering the two textures (compressed + alpha) together every time, right?
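
Baking the two inputs into a single new texture is usually done by rendering to texture: create an empty RGBA texture, attach it to a framebuffer object, and draw one full-screen quad with a shader that writes the ETC1 texture's RGB and the second texture's value into the alpha channel. The result is an ordinary texture with its own handle that can be drawn on its own afterwards. A minimal sketch with Android's GLES20 bindings; the quad/program setup is omitted and all names are illustrative.

import android.opengl.GLES20;

// Creates a new RGBA texture and fills it by rendering a combine pass into an FBO.
// drawFullScreenQuad(...) is an assumed helper that binds both input textures and
// draws with a shader along the lines of:
//   gl_FragColor = vec4(texture2D(u_color, v_uv).rgb, texture2D(u_alpha, v_uv).r);
public class TextureMerger {
    public static int merge(int colorTexId, int alphaTexId, int width, int height) {
        int[] ids = new int[1];

        // New texture that will hold the merged result.
        GLES20.glGenTextures(1, ids, 0);
        int mergedTexId = ids[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mergedTexId);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
                0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // FBO targeting the new texture.
        int[] fbo = new int[1];
        GLES20.glGenFramebuffers(1, fbo, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, mergedTexId, 0);

        // One combine pass; afterwards only the merged texture is needed at draw time.
        GLES20.glViewport(0, 0, width, height);
        drawFullScreenQuad(colorTexId, alphaTexId);

        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glDeleteFramebuffers(1, fbo, 0);
        return mergedTexId;
    }

    private static void drawFullScreenQuad(int colorTexId, int alphaTexId) {
        // Program and quad setup intentionally omitted in this sketch.
    }
}

Note that the merged result is an uncompressed RGBA texture, so it trades the memory savings of ETC1 for the convenience of a single texture with alpha.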

WebGL heightmap using vertex shader, using 32 bits instead of 8 bits

元气小坏坏 submitted on 2019-12-11 07:03:56
Question: I'm using the following vertex shader (courtesy of http://stemkoski.github.io/Three.js/Shader-Heightmap-Textures.html) to generate terrain from a grayscale height map:
uniform sampler2D bumpTexture;
uniform float bumpScale;
varying float vAmount;
varying vec2 vUV;
void main() {
    vUV = uv;
    vec4 bumpData = texture2D( bumpTexture, uv );
    vAmount = bumpData.r; // assuming map is grayscale it doesn't matter if you use r, g, or b.
    // move the position along the normal
    vec3 newPosition = position +
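
The "32 bits instead of 8 bits" in the title generally means packing the height into all four RGBA channels of the height map and reconstructing it in the shader, rather than reading a single 8-bit channel. Below is a sketch of the encode/decode pair, with the CPU-side packing shown in Java for consistency with the other sketches (the equivalent arithmetic applies in JavaScript when building the height texture); the method and constant names are illustrative.

// Packs a height value in [0, 1] into an RGBA8888 pixel so a shader can reconstruct
// it with far more than the 256 steps of one 8-bit channel. The packed texture must
// be sampled with NEAREST filtering; linear filtering would blend the bytes and
// corrupt the value.
public final class HeightPacking {
    public static int packHeightToArgb(double height01) {
        double v = Math.min(Math.max(height01, 0.0), 1.0);
        double x = v * 255.0;
        int r = (int) x; x = (x - r) * 255.0;
        int g = (int) x; x = (x - g) * 255.0;
        int b = (int) x; x = (x - b) * 255.0;
        int a = (int) x;
        return (a << 24) | (r << 16) | (g << 8) | b;  // android.graphics.Color ARGB order
    }

    // GLSL decode matching the packing above (most significant byte in .r).
    public static final String DECODE_GLSL =
            "float unpackHeight(vec4 rgba) {\n" +
            "    return dot(rgba, vec4(1.0, 1.0/255.0, 1.0/65025.0, 1.0/16581375.0));\n" +
            "}\n";
}

In the vertex shader, vAmount = unpackHeight(bumpData) would then replace the single-channel read of bumpData.r.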

Problem loading texture with transparency with OpenGL ES and Android

跟風遠走 submitted on 2019-12-11 07:00:21
Question: I'm trying to load an image with background transparency that will be layered over another texture. When I try to load it, all I get is a white screen. The texture is 512 by 512, and it's saved in Photoshop as a 24-bit PNG (standard PNG settings in the Photoshop Save for Web and Devices dialog). Any idea why it's not showing? The texture without transparency shows without a problem. Here is my loadTextures method:
public void loadGLTexture(GL10 gl, Context context) {
    //Get the texture
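
For reference, a typical GL10 loading path that preserves the PNG's alpha looks like the sketch below. It assumes the PNG actually contains an alpha channel (a 24-bit PNG has none; Photoshop's PNG with transparency option adds one) and that blending is enabled before the textured quad is drawn. Resource and variable names are illustrative.

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.opengl.GLUtils;
import javax.microedition.khronos.opengles.GL10;

public class TextureLoader {
    private final int[] textureIds = new int[1];

    public void loadGLTexture(GL10 gl, Context context, int resourceId) {
        // Decode without letting Android pre-scale the bitmap or drop the alpha channel.
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inScaled = false;
        options.inPreferredConfig = Bitmap.Config.ARGB_8888;
        Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resourceId, options);

        gl.glGenTextures(1, textureIds, 0);
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textureIds[0]);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);

        // GLUtils picks GL_RGBA for an ARGB_8888 bitmap, so the alpha survives the upload.
        GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, bitmap, 0);
        bitmap.recycle();
    }

    public void enableBlending(GL10 gl) {
        // Needed at draw time, otherwise transparent texels are not blended over the background.
        gl.glEnable(GL10.GL_BLEND);
        gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);
    }
}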

How to set PathModifier's coordinates randomly at the start of LoopEntityModifier?

ぃ、小莉子 submitted on 2019-12-11 06:48:55
Question: I created a live wallpaper service using the AndEngine library. On screen there is a bird Sprite that flies repeatedly from left to right. I'm using LoopEntityModifier and PathModifier for the solution. The bird should start at a random Y position every time it comes in from the left of the screen. The code is like this:
public class MyLiveWallpaperService extends BaseLiveWallpaperService {
    private AnimatedSprite birdSprite;
    ...
    public Scene onLoadScene() {
        ...
        float[] coordY =
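
Since LoopEntityModifier replays the same PathModifier (and therefore the same coordinates) on every pass, one workaround is to drop the loop and register a freshly built PathModifier with a new random Y each pass. The sketch below drives the restart with a repeating TimerHandler whose period matches one pass; treat it as an outline, since package paths follow the old org.anddev AndEngine branch and exact signatures vary between AndEngine versions.

import java.util.Random;

import org.anddev.andengine.engine.Engine;
import org.anddev.andengine.engine.handler.timer.ITimerCallback;
import org.anddev.andengine.engine.handler.timer.TimerHandler;
import org.anddev.andengine.entity.modifier.PathModifier;
import org.anddev.andengine.entity.modifier.PathModifier.Path;
import org.anddev.andengine.entity.sprite.AnimatedSprite;

// Re-registers a fresh PathModifier with a new random Y each pass, instead of
// looping one fixed PathModifier with LoopEntityModifier.
public class BirdFlightController {
    private final Random random = new Random();

    public void start(final Engine engine, final AnimatedSprite birdSprite,
                      final float cameraWidth, final float cameraHeight,
                      final float passDurationSeconds) {
        startPass(birdSprite, cameraWidth, cameraHeight, passDurationSeconds);
        // A repeating timer with the same period as one pass kicks off the next one.
        engine.registerUpdateHandler(new TimerHandler(passDurationSeconds, true,
                new ITimerCallback() {
                    @Override
                    public void onTimePassed(TimerHandler pTimerHandler) {
                        birdSprite.clearEntityModifiers();
                        startPass(birdSprite, cameraWidth, cameraHeight, passDurationSeconds);
                    }
                }));
    }

    private void startPass(AnimatedSprite birdSprite, float cameraWidth,
                           float cameraHeight, float passDurationSeconds) {
        float y = random.nextFloat() * (cameraHeight - birdSprite.getHeight());
        Path path = new Path(2)
                .to(-birdSprite.getWidth(), y)  // start just off the left edge
                .to(cameraWidth, y);            // fly off past the right edge
        birdSprite.registerEntityModifier(new PathModifier(passDurationSeconds, path));
    }
}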