textures

Using a texture for a triangle mesh without having to read/write an image file

Posted by 只谈情不闲聊 on 2019-12-02 00:46:24
This is a follow-up to a previous question (see "Coloring individual triangles in a triangle mesh on JavaFX"), which I believe is a topic of its own. Is there a way, with JavaFX, to avoid having to write an image file to disk (or an external device) in order to use it as a texture? In other words: can I use a specific texture without having to go through an Image? Since my color map will change at runtime, I don't want to have to write to disk every time I run it. Writing to disk might also be a security issue for someone using my app. José Pereda: As @Jens-Peter-Haack
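One possible route (a sketch under my own assumptions, not code from the answer above) is to build the color-map image entirely in memory with `java.awt.image.BufferedImage` and convert it to a JavaFX `Image` with `SwingFXUtils.toFXImage(img, null)`, so no file ever touches disk. The swatch sizes and colors here are made up for illustration:

```java
import java.awt.image.BufferedImage;

public class InMemoryPalette {
    // Build a horizontal strip of solid-color swatches entirely in memory.
    // No file I/O: the resulting BufferedImage can be turned into a JavaFX
    // Image with SwingFXUtils.toFXImage(img, null) (javafx-swing module)
    // and set as a PhongMaterial diffuse map for the mesh.
    static BufferedImage colorStrip(int[] argbColors, int swatch) {
        BufferedImage img = new BufferedImage(
                argbColors.length * swatch, swatch,
                BufferedImage.TYPE_INT_ARGB);
        for (int i = 0; i < argbColors.length; i++) {
            for (int x = 0; x < swatch; x++) {
                for (int y = 0; y < swatch; y++) {
                    img.setRGB(i * swatch + x, y, argbColors[i]);
                }
            }
        }
        return img;
    }
}
```

Because the strip is regenerated in memory, a color map that changes at runtime just means calling `colorStrip` again and re-setting the material's diffuse map.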

Android OpenGL ES2.0 Texture Swapping

Posted by 折月煮酒 on 2019-12-02 00:23:07
First off, I am new to OpenGL, but on my phone (Motorola Bionic) the following code works as intended.

    GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTowerTextureHandle);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE2);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mTowerNormalHandle);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE3);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mFrostTextureHandle);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE4);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mFrostNormalHandle);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE5);
    GLES20.glBindTexture
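For what it's worth, a frequent source of confusion when juggling several texture units like this is that `glUniform1i` for a sampler uniform takes the unit index (0, 1, 2, ...), not the `GL_TEXTUREi` enum itself. A small helper makes the relationship explicit (the value of `GL_TEXTURE0`, 0x84C0, is taken from the OpenGL ES 2.0 headers; the rest is illustrative):

```java
public class TextureUnits {
    // GL_TEXTURE0 as defined in the OpenGL ES 2.0 headers.
    static final int GL_TEXTURE0 = 0x84C0;

    // The texture-unit enums are consecutive, so the value to hand to
    // glUniform1i for a sampler uniform is simply the offset from
    // GL_TEXTURE0. Passing the raw enum (0x84C1 instead of 1) is a
    // classic multi-texturing bug.
    static int unitIndex(int glTextureEnum) {
        return glTextureEnum - GL_TEXTURE0;
    }
}
```

So after binding mTowerNormalHandle on GL_TEXTURE2, the matching sampler uniform should be set with `glUniform1i(location, 2)`, not `glUniform1i(location, GLES20.GL_TEXTURE2)`.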

OpenGL trying to Draw Multiple 2d Textures, only 1st one appears

Posted by 五迷三道 on 2019-12-02 00:05:57
I've got a problem: I'm trying to draw a solar system using textures (one texture for each planet), and as I draw my textures, only the first one appears. None of the rest do. My init function iterates through my files, saves the textures into objects, and then iterates through the objects. As it iterates, it generates the textures and binds each one to a name using the OpenGL calls.

    SharpGL.OpenGL gl = args.OpenGL;
    gl.Enable(SharpGL.OpenGL.GL_BLEND);
    gl.Enable(SharpGL.OpenGL.GL_TEXTURE_2D);
    gl.BlendFunc(SharpGL.Enumerations.BlendingSourceFactor.SourceAlpha, SharpGL.Enumerations

In XNA, what is the best method to dispose of textures I no longer need?

Posted by 本小妞迷上赌 on 2019-12-01 23:32:00
I started the project with the idea of reusing the same Texture2D objects continuously throughout the game, periodically loading new textures into them. Over time this proved a bad idea, as I kept running into System.OutOfMemoryException.

    bool loadImages(string image)
    {
        System.Diagnostics.Debug.WriteLine("beginning loading textures " + image);
        try
        {
            //image_a = null;
            //image_o = null;
            //image_o.Dispose();
            //image_a.Dispose();
            image_o = Content.Load<Texture2D>("images/" + image);
            image_a = Content.Load<Texture2D>("images/" + image + "_a");
            return true;
        }
        catch
        {
            System.Diagnostics.Debug.WriteLine("cannot

Problems texturing a cube

Posted by 末鹿安然 on 2019-12-01 22:50:58
I'm trying to make a cube with a different texture on each face. I have the front face and the rear face working now, and I'm adding the right face of the cube. But something is going wrong: the right face is in place, but the texture shows up with errors (it's stretched and shredded). There is something bad in my code and I don't know what. This is my code:

    public class Cube {
        private FloatBuffer vertexBuffer;    // vertices
        private FloatBuffer textureBuffer;   // texture coordinates
        private ByteBuffer indexBuffer;      // indices
        private int[] textures = new int[6]; // texture pointers
        private
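The question cuts off before the buffers are filled in, so only a general sanity check applies: the texture-coordinate buffer must contain one (u, v) pair per vertex, in exactly the order that face's vertices are listed. A stretched, "shredded" face is the usual symptom of UVs that are missing, reordered, or reused from a face with a different vertex order. A hypothetical per-face layout, as a sketch:

```java
public class FaceUVs {
    // One (u, v) pair per vertex of a quad face, listed in the same
    // order as that face's entries in the vertex buffer. If the side
    // face's vertices are listed in a different order than the front
    // face's, it needs its own UV set -- blindly reusing the front
    // face's pairs typically produces the stretched/shredded look.
    static float[] quadUVs() {
        return new float[] {
            0f, 1f,  // vertex 0
            1f, 1f,  // vertex 1
            0f, 0f,  // vertex 2
            1f, 0f   // vertex 3
        };
    }
}
```

Checking that textureBuffer holds exactly vertexCount * 2 floats, face by face, is a quick way to rule this class of bug in or out.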

LibGDX Saving textures to avoid context loss

Posted by 雨燕双飞 on 2019-12-01 22:42:23
I have a texture in my LibGDX-based Android app that is created procedurally through FrameBuffers, and I need to preserve it through context loss. It seems that the only efficient way to do this is simply to save the data out, whether as a full image or as raw data, and load it back in when the time comes. I'm struggling to find any way to achieve this, though, as every route I've taken has led to complete failure in one way or another. I've searched around quite a bit, but nothing I've come across has worked out. I am mostly just looking for a hint in the right direction, rather than the

OpenGL ES: How to tint texture with color

Posted by 自古美人都是妖i on 2019-12-01 18:07:56
I have a texture with alpha, and I want to tint it with some color so that the tint strength depends on the color's alpha value, but the overall opacity is defined only by the texture's alpha. This is similar to multi-texturing, but with a color instead of a second texture. How do I do it? (Update) I have tried to set up a texture combiner. The color is tinted fine, but there is a problem with the alpha: it doesn't take its value from the texture (like a mask). My code at this moment:

    glActiveTexture(GL_TEXTURE0);
    // do we need stage #1?
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE
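The desired combine can be written down as plain math, which also pins down exactly what the combiner (or a shader) has to do: the output RGB is an interpolation between texture color and tint color weighted by the tint's alpha, while the output alpha is the texture's alpha alone. A reference implementation of that formula (the GLSL and combiner notes in the comments are the standard equivalents, not code from the question):

```java
public class Tint {
    // Reference math for the combine described above:
    //   rgb   = mix(texture.rgb, tint.rgb, tint.a)
    //   alpha = texture.a            (mask behaviour)
    // In a GLES2 fragment shader this is:
    //   vec4 t = texture2D(uTex, vUV);
    //   gl_FragColor = vec4(mix(t.rgb, uTint.rgb, uTint.a), t.a);
    // In the fixed-function combiner, the RGB part is GL_INTERPOLATE
    // and the alpha part must be GL_REPLACE sourced from the texture.
    static float[] tint(float[] tex, float[] color) {
        float a = color[3];
        return new float[] {
            tex[0] * (1 - a) + color[0] * a,
            tex[1] * (1 - a) + color[1] * a,
            tex[2] * (1 - a) + color[2] * a,
            tex[3]  // alpha taken from the texture only
        };
    }
}
```

The mask problem described above is exactly the alpha line: the combiner's alpha stage has to replace with the texture's alpha, not combine it with the tint's.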

How much more efficient are power-of-two textures?

Posted by 我是研究僧i on 2019-12-01 17:56:55
I am creating an OpenGL video player using FFmpeg, and none of my videos are powers of two (as they are normal video resolutions). It runs at a fine FPS with my NVIDIA card, but I've found that it won't run on older ATI cards because they don't support non-power-of-two textures. I will only be using this on an NVIDIA card, so I don't really care about the ATI problem too much, but I was wondering how much of a performance boost I'd get if the textures were powers of two. Is it worth padding them out? Also, if it is worth it, how do I go about padding them out to the nearest larger power of two? datenwolf
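As for the padding itself, the usual recipe is: round each dimension up to the next power of two, allocate the texture at that size, upload the actual frame into a corner with glTexSubImage2D, and scale the maximum texture coordinates by width/potWidth and height/potHeight so the padding is never sampled. The rounding step is a standard bit trick:

```java
public class Pow2 {
    // Round n up to the next power of two (e.g. to pad a video frame
    // for POT-only hardware). Smears the highest set bit into all
    // lower positions, then adds one. Valid for 1 <= n <= 2^30.
    static int nextPowerOfTwo(int n) {
        if (n <= 1) return 1;
        n--;
        n |= n >> 1;
        n |= n >> 2;
        n |= n >> 4;
        n |= n >> 8;
        n |= n >> 16;
        return n + 1;
    }
}
```

For example, a 1280x720 frame would sit inside a 2048x1024 texture and be drawn with maximum UVs of 1280/2048 = 0.625 and 720/1024 ≈ 0.703.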

What does GL_UNSIGNED_BYTE mean for glTexImage2D?

Posted by 落爺英雄遲暮 on 2019-12-01 17:18:50
I want to load a byte array containing a texture in RGBA 8888 format. The OpenGL ES docs offer four constants to use: GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT_5_6_5, GL_UNSIGNED_SHORT_4_4_4_4, and GL_UNSIGNED_SHORT_5_5_5_1. In regular OpenGL, there is a value, GL_UNSIGNED_INT_8_8_8_8, that meets my needs, and the numbers are interpreted thus: "For example, if internalFormat is GL_R3_G3_B2, you are asking that texels be 3 bits of red, 3 bits of green, and 2 bits of blue." So GL_UNSIGNED_INT_8_8_8_8 must be 8 bits of R, 8 bits of G, 8 bits of B and 8 bits of A. But what does GL_UNSIGNED_BYTE mean on
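The short answer, stated as data: with format GL_RGBA and type GL_UNSIGNED_BYTE, each component is one unsigned byte, and the components simply follow each other in array order, R, G, B, A, texel after texel. Because the layout is byte-addressed rather than packed into a 32-bit word, it is independent of host endianness, which is one reason OpenGL ES could drop the packed GL_UNSIGNED_INT_8_8_8_8 type. A sketch of the layout:

```java
public class TexelLayout {
    // For glTexImage2D(..., GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, data),
    // `data` is just bytes in the order R, G, B, A for each texel.
    // There is no 32-bit packing step, so the memory layout does not
    // depend on host endianness (unlike GL_UNSIGNED_INT_8_8_8_8 on
    // desktop GL, whose component order follows the native int).
    static byte[] rgbaTexel(int r, int g, int b, int a) {
        return new byte[] { (byte) r, (byte) g, (byte) b, (byte) a };
    }
}
```

So an existing RGBA 8888 byte array can be handed to glTexImage2D as-is, provided its bytes are already in R, G, B, A order.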
