opengl-es

How to detect the area of touch in OpenGL ES on Android?

Submitted by 左心房为你撑大大i on 2019-12-08 11:25:56
Question: I have designed a 3D model in .obj format and imported it with the min3d framework. It's a cube with differently colored faces, and I want to show a Toast with a message for a particular color when the face of that color is touched. How do I do this on Android 1.6?

Answer 1: I'm not sure about the particulars of Android/OpenGL ES, but in general: you have to calculate a ray that starts at the camera's position and passes through the screen coordinate that was pressed/clicked, then determine where that ray intersects your scene geometry …
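As a rough sketch of that unprojection step (this is not min3d API — the view/projection matrices and viewport size are assumed to come from your own renderer), the ray can be built with android.opengl.Matrix; intersecting it with each cube face then tells you which color to show in the Toast:

```java
import android.opengl.Matrix;

/**
 * Minimal sketch: turn a touch point into a world-space ray.
 * viewM/projM/viewport are assumed to describe your camera setup.
 */
public class TouchRay {
    /** Returns {originX, originY, originZ, dirX, dirY, dirZ}. */
    public static float[] rayFromTouch(float touchX, float touchY,
                                       float[] viewM, float[] projM,
                                       int viewportW, int viewportH) {
        float[] vp = new float[16];
        float[] invVp = new float[16];
        Matrix.multiplyMM(vp, 0, projM, 0, viewM, 0);   // projection * view
        Matrix.invertM(invVp, 0, vp, 0);

        // Touch point to normalized device coordinates (-1..1),
        // flipping Y because screen Y grows downwards.
        float ndcX = 2f * touchX / viewportW - 1f;
        float ndcY = 1f - 2f * touchY / viewportH;

        float[] near = { ndcX, ndcY, -1f, 1f };   // point on the near plane
        float[] far  = { ndcX, ndcY,  1f, 1f };   // point on the far plane
        float[] nearW = new float[4];
        float[] farW  = new float[4];
        Matrix.multiplyMV(nearW, 0, invVp, 0, near, 0);
        Matrix.multiplyMV(farW, 0, invVp, 0, far, 0);
        for (int i = 0; i < 3; i++) {             // perspective divide
            nearW[i] /= nearW[3];
            farW[i]  /= farW[3];
        }
        // Ray origin = near point, direction = far - near.
        return new float[] {
            nearW[0], nearW[1], nearW[2],
            farW[0] - nearW[0], farW[1] - nearW[1], farW[2] - nearW[2]
        };
    }
}
```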

OpenGL ES ReadPixels to bitmap from a texture larger than the screen

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-08 11:22:48
Question: I am implementing a screen for applying effects (grain, negative, etc.) to an image the user has taken with their camera or picked from their gallery. I can take the selected image and display it through OpenGL at full resolution (or scaled down while preserving the aspect ratio, depending on the device's maximum texture size and the size of the image). Selecting an effect and applying it to the texture also works completely fine, as does taking …
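Since glReadPixels reads from the currently bound framebuffer rather than from the screen, one way to grab the full-resolution result is to attach the texture to an offscreen FBO and read from that. A minimal sketch, assuming texId already holds the rendered, effect-applied image and that this runs on the GL thread:

```java
import android.graphics.Bitmap;
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/** Read back a texture larger than the screen via an offscreen framebuffer. */
public class TextureReader {
    public static Bitmap readTexture(int texId, int width, int height) {
        int[] fbo = new int[1];
        GLES20.glGenFramebuffers(1, fbo, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER,
                GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, texId, 0);

        ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4)
                .order(ByteOrder.nativeOrder());
        // glReadPixels reads from the bound FBO, so the size is limited by the
        // device's max texture/renderbuffer size, not by the screen size.
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);

        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glDeleteFramebuffers(1, fbo, 0);

        Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        buf.rewind();
        bmp.copyPixelsFromBuffer(buf);
        // Note: the result is vertically flipped relative to Android's
        // coordinate system and may need flipping before saving.
        return bmp;
    }
}
```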

Record GLSurfaceView on < Android 4.3

Submitted by 风流意气都作罢 on 2019-12-08 10:36:13
Question: I'm developing an app that applies effects to the camera image in real time. Currently I'm using the MediaMuxer class in combination with MediaCodec, but those classes were introduced in Android 4.3. Now I want to redesign my app and make it compatible with more devices. The only thing I found on the internet was a combination of FFmpeg and OpenCV, but I read that the frame rate is not very good at high resolutions. Is there any way to encode video in real time while …
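On devices below 4.3 the usual fallback is to read each rendered frame back with glReadPixels and feed the raw frames to a software encoder such as FFmpeg through JNI; the readback is typically what limits the frame rate at high resolutions. The sketch below shows only the capture half, called at the end of GLSurfaceView.Renderer.onDrawFrame; encodeFrame is a hypothetical hook, not a real API:

```java
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/** Capture half of a pre-4.3 recording path: read back the frame just drawn. */
public class FrameCapture {
    private final int width, height;
    private final ByteBuffer pixels;

    public FrameCapture(int width, int height) {
        this.width = width;
        this.height = height;
        this.pixels = ByteBuffer.allocateDirect(width * height * 4)
                .order(ByteOrder.nativeOrder());
    }

    /** Call at the end of onDrawFrame(), after the scene has been drawn. */
    public void grabFrame() {
        pixels.rewind();
        // Blocks until rendering finishes, so this is the main per-frame cost.
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
        encodeFrame(pixels); // hypothetical: hand RGBA data to FFmpeg via JNI
    }

    private void encodeFrame(ByteBuffer rgba) {
        // Placeholder: convert to the encoder's expected format (e.g. YUV)
        // and queue it on a separate encoding thread.
    }
}
```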

Min3d doesn't show anything (in sample from min3d wiki)

Submitted by 与世无争的帅哥 on 2019-12-08 10:17:38
Question: I'm trying to create a simple Android application using the min3d library. I previously downloaded an APK with min3d examples from the market, and every example in it works fine on a real phone, so the phone has no problems with 3D. I checked out the min3d source code with the examples and created a simple Android application with a single activity copy-pasted from the examples: import min3d.core.Object3dContainer; import min3d.core.RendererActivity; import min3d.objectPrimitives.Box …
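For comparison, the minimal min3d activity looks roughly like the following (reconstructed from memory of the min3d samples, so verify the class and method names against the examples you checked out). One frequent cause of a blank or black scene is that no Light was added while the object's lighting is enabled:

```java
import min3d.core.Object3dContainer;
import min3d.core.RendererActivity;
import min3d.objectPrimitives.Box;
import min3d.vos.Light;   // assumed package for Light in min3d

/** Roughly the min3d "hello world": a colored box plus a light. */
public class SimpleBoxActivity extends RendererActivity {
    private Box box;

    @Override
    public void initScene() {
        scene.lights().add(new Light()); // without a light, lit objects render black
        box = new Box(1f, 1f, 1f);
        scene.addChild(box);
    }

    @Override
    public void updateScene() {
        box.rotation().y += 1;           // rotate a little each frame
    }
}
```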

iPhone: rewrite GLPaint using Core Graphics

Submitted by 柔情痞子 on 2019-12-08 09:58:21
Question: I would like to rewrite Apple's GLPaint sample application, but using the Core Graphics library instead of OpenGL as in the example. I'm stuck on displaying the brush texture the way GLPaint does. This is my Core Graphics code: - (void) drawLineFromPoint: (CGPoint) fromPoint toPoint: (CGPoint) toPoint { UIGraphicsBeginImageContext(self.frame.size); CGContextRef context = UIGraphicsGetCurrentContext(); [drawImage.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)]; …

Copy a picture of the current screen just before the game pauses, blur it, and render it to the screen while the game is paused

Submitted by 最后都变了- on 2019-12-08 09:56:39
Question: I am having trouble implementing a pause-screen blur to render while the game is in its paused state. I need to capture the current screen (the running game) as a picture or texture at the moment the user pauses, perhaps store it in memory, blur it, and then render it to the screen to get a nice Gaussian-blur effect like popular games such as Angry Birds do. I tried implementing this with a screenshot, but the screen-capture process makes the game freeze for a few seconds, which is not …
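One way to avoid the slow screenshot path is to keep the pixels on the GPU: copy the current framebuffer into a texture with glCopyTexImage2D at the moment of pause, then draw that texture through a blur shader for as long as the game stays paused. A minimal sketch of the capture step only (run on the GL thread, right after the frame has been drawn); the blur shader itself is omitted:

```java
import android.opengl.GLES20;

/** Grab the just-rendered frame into a texture without a CPU readback. */
public class PauseSnapshot {
    public static int captureScreenTexture(int screenWidth, int screenHeight) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // Copies the currently bound framebuffer (the screen) into the texture,
        // entirely on the GPU, so there is no expensive glReadPixels stall.
        GLES20.glCopyTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB,
                0, 0, screenWidth, screenHeight, 0);
        return tex[0];
    }
}
```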

Strange blending when rendering self-transparent texture to the framebuffer

Submitted by 坚强是说给别人听的谎言 on 2019-12-08 09:14:35
Question: I'm trying to render self-transparent textures to a framebuffer, but I'm not getting what I expected: everything previously rendered into the framebuffer is ignored, and the texture blends with the colour I cleared my main canvas to. This is what I would like to get, but without using framebuffers: package test; import com.badlogic.gdx.*; import com.badlogic.gdx.graphics.*; import com.badlogic.gdx.graphics.g2d.*; public class GdxTest extends ApplicationAdapter { SpriteBatch batch; Texture img; …
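The usual culprit is the alpha written into the FBO's own alpha channel: with the default GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending applied both when drawing into the FBO and when drawing the FBO's texture to the screen, the alpha gets applied twice and previously drawn content appears to vanish into the clear colour. A common fix is to blend the alpha channel separately while rendering into the FBO. A sketch under those assumptions (setBlendFunctionSeparate needs a reasonably recent libGDX; "badlogic.jpg" is just a placeholder asset):

```java
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Pixmap;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.graphics.glutils.FrameBuffer;

public class FboBlendSketch {
    SpriteBatch batch;
    Texture img;
    FrameBuffer fbo;
    TextureRegion fboRegion;

    public void create(int width, int height) {
        batch = new SpriteBatch();
        img = new Texture("badlogic.jpg");   // any texture with an alpha channel
        fbo = new FrameBuffer(Pixmap.Format.RGBA8888, width, height, false);
        fboRegion = new TextureRegion(fbo.getColorBufferTexture());
        fboRegion.flip(false, true);         // FBO textures come out upside down
    }

    public void renderIntoFbo() {
        fbo.begin();
        Gdx.gl.glClearColor(0f, 0f, 0f, 0f); // clear the FBO fully transparent
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        batch.begin();
        // Blend colour normally, but write alpha with ONE / ONE_MINUS_SRC_ALPHA
        // so the FBO's alpha channel accumulates correctly.
        batch.setBlendFunctionSeparate(
                GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA,
                GL20.GL_ONE, GL20.GL_ONE_MINUS_SRC_ALPHA);
        batch.draw(img, 0, 0);
        batch.end();
        fbo.end();
    }

    public void renderFboToScreen() {
        batch.begin();
        batch.setBlendFunction(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);
        batch.draw(fboRegion, 0, 0);
        batch.end();
    }
}
```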

Can we use non-power-of-2 textures on OpenGL ES on Android?

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-08 08:59:41
Question: Does anyone know if it is possible to use NPOT (non-power-of-two) textures with Android's OpenGL ES renderer?

Answer 1: Yes. For OpenGL ES 2.0, NPOT textures are supported in the core specification, but only with CLAMP_TO_EDGE wrap modes and without mipmaps; those limitations are lifted by the GL_OES_texture_npot extension. OpenGL ES 1.x has no NPOT support in its core specification.

Answer 2: It appears that Froyo does have an NPOT implementation for OpenGL ES 1.1. However, so far I cannot get the UI to display with …
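In practice that means checking the extension string at runtime and, when only core ES 2.0 NPOT support is available, restricting the texture to CLAMP_TO_EDGE wrapping and non-mipmapped filtering. A small sketch (ES 2.0, GL thread):

```java
import android.opengl.GLES20;

public class NpotSupport {
    /** True if full NPOT support (mipmaps, REPEAT wrapping) is advertised. */
    public static boolean hasFullNpot() {
        String ext = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        return ext != null && ext.contains("GL_OES_texture_npot");
    }

    /** Parameters that are always legal for an NPOT texture on core ES 2.0. */
    public static void setSafeNpotParams() {
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR); // no mipmaps
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    }
}
```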

Positioning objects in a 3D Scene and then figuring out what the user clicked on

Submitted by China☆狼群 on 2019-12-08 08:47:32
Question: I'm building a cross-platform game in C++ using OpenGL ES 2.0; the target is iPhone at the moment. I'm a newbie to coding games, but not to coding in general. I'm confused about how to architect the game, and specifically about how to set up the objects needed to position models in the scene. I have an object that represents a scene; there are five scenes, and only one is shown at a time. A scene is like a game level. Each scene has all the code for rendering, game logic, mouse and …
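The positioning part usually comes down to giving each model its own model matrix and letting the scene combine it with the camera's view and projection matrices; "what did the user click on" is then the same ray-unprojection idea shown in the first question above. The question targets C++, but the structure is language-agnostic; a sketch of the shape in Java using android.opengl.Matrix, with all names purely illustrative:

```java
import android.opengl.Matrix;

/** Each model keeps its own transform; the scene supplies view/projection. */
public class SceneNode {
    public float x, y, z;          // position in the scene
    public float angleY;           // rotation about Y, in degrees
    public float scale = 1f;

    private final float[] model = new float[16];
    private final float[] mvp = new float[16];
    private final float[] tmp = new float[16];

    /** Builds this node's model matrix from its transform. */
    public float[] modelMatrix() {
        Matrix.setIdentityM(model, 0);
        Matrix.translateM(model, 0, x, y, z);
        Matrix.rotateM(model, 0, angleY, 0f, 1f, 0f);
        Matrix.scaleM(model, 0, scale, scale, scale);
        return model;
    }

    /** MVP = projection * view * model, passed to the shader as a uniform. */
    public float[] mvp(float[] view, float[] projection) {
        Matrix.multiplyMM(tmp, 0, view, 0, modelMatrix(), 0);
        Matrix.multiplyMM(mvp, 0, projection, 0, tmp, 0);
        return mvp;
    }
}
```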

Make a line thicker in 3D?

Submitted by 此生再无相见时 on 2019-12-08 08:34:44
Question: In reference to the question "Drawing a line between two points using SceneKit": I'm drawing a line in 3D and want to make it thicker with this code: func renderer(aRenderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: NSTimeInterval) { //Makes the lines thicker glLineWidth(20) } but it doesn't work on iOS 8.2. Is there another way? Update: from the docs https://developer.apple.com/library/prerelease/ios/documentation/SceneKit/Reference/SCNSceneRendererDelegate_Protocol …