textures

Read and write integer 1-channel texture in OpenGL

Submitted by 岁酱吖の on 2019-12-21 21:32:31
Question: I want to: create a readable and writable 1-channel texture that contains integers; using a shader, write an integer "I" to the texture; use the texture as a source, sample it, and compare whether the sample is equal to the integer I. All this with core profile 3.3. This is what I've got so far. I create the texture like so: glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, width, height, 0, GL_RED, GL_INT, (java.nio.ByteBuffer) null); I've also tried GL_R8I and GL_RED_INTEGER, but that won't work. I bind this
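For context on what a working setup looks like: an integer texture must pair an integer internal format (e.g. GL_R32I) with the GL_RED_INTEGER pixel-transfer format, and must be read through an integer sampler rather than sampler2D. A minimal GLSL 3.30 sketch (uniform and variable names are assumed, not from the question):

```glsl
#version 330 core
// Assumes the texture was created with:
//   glTexImage2D(GL_TEXTURE_2D, 0, GL_R32I, width, height, 0,
//                GL_RED_INTEGER, GL_INT, data);
// Integer textures must be sampled with isampler2D, not sampler2D.
uniform isampler2D intTex;   // hypothetical uniform name
uniform int expected;        // the integer "I" to compare against
in vec2 vTexCoord;
out vec4 fragColor;

void main(void) {
    // texelFetch avoids filtering, which is unsupported for integer formats
    ivec2 coord = ivec2(vTexCoord * vec2(textureSize(intTex, 0)));
    int sampled = texelFetch(intTex, coord, 0).r;
    fragColor = (sampled == expected) ? vec4(0.0, 1.0, 0.0, 1.0)
                                      : vec4(1.0, 0.0, 0.0, 1.0);
}
```

Note also that integer textures need GL_NEAREST min/mag filters; the default GL_LINEAR filtering makes them incomplete.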

Threejs Custom Shader - Screen Tearing

Submitted by 让人想犯罪 __ on 2019-12-21 21:25:33
Question: To implement a tilemap using Three.js and Brandon Jones's tilemap method (found here), I am using a THREE.Plane geometry for each layer and painting the face with the following custom shaders. Vertex shader: var tilemapVS = [ "varying vec2 pixelCoord;", "varying vec2 texCoord;", "uniform vec2 mapSize;", "uniform vec2 inverseTileTextureSize;", "uniform float inverseTileSize;", "void main(void) {", " pixelCoord = (uv * mapSize);", " texCoord = pixelCoord * inverseTileTextureSize *
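The shader source is cut off above; based on Brandon Jones's tilemap write-up, the complete vertex shader is commonly along these lines (the final expression is a hedged reconstruction, not necessarily the asker's exact code):

```glsl
// Reconstruction of the truncated vertex shader, following Brandon Jones's
// tilemap method. uv, position, projectionMatrix, and modelViewMatrix are
// injected automatically by Three.js into ShaderMaterial sources.
varying vec2 pixelCoord;
varying vec2 texCoord;
uniform vec2 mapSize;
uniform vec2 inverseTileTextureSize;
uniform float inverseTileSize;

void main(void) {
    pixelCoord = uv * mapSize;
    texCoord = pixelCoord * inverseTileTextureSize * inverseTileSize;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```

Tearing-like seams in tilemaps are often filtering bleed rather than true screen tearing; setting the tile and map textures to THREE.NearestFilter is a common first check.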

Image2D in compute shader

Submitted by 孤街浪徒 on 2019-12-21 21:19:15
Question: I want to use image2D as 2D storage for vertices which will be modified by a compute shader, but things don't work. Create textures: glGenTextures(1, &HeightMap); glBindTexture(GL_TEXTURE_2D, HeightMap); glTexImage2D(GL_TEXTURE_2D, 0,GL_RGBA32F, 513, 513, 0,GL_RGBA32F, GL_UNSIGNED_BYTE, 0); Use and dispatch compute shader: glUseProgram(ComputeProgram); glActiveTexture(GL_TEXTURE0); glBindImageTexture(0, HeightMap, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F); glDispatchCompute(1, 1, 1 );
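Two things stand out in the excerpt: the glTexImage2D call passes GL_RGBA32F as the format argument together with GL_UNSIGNED_BYTE, but GL_RGBA32F is only valid as the internalformat; the pixel-transfer pair should be GL_RGBA / GL_FLOAT. Also, glDispatchCompute(1, 1, 1) launches a single work group, which cannot cover a 513x513 image unless the local size matches. A hedged compute-shader sketch of the read-modify-write the question describes:

```glsl
#version 430 core
// Minimal sketch (assumed names) operating on the height map bound with
// glBindImageTexture(0, HeightMap, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA32F).
// The allocation should be:
//   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 513, 513, 0,
//                GL_RGBA, GL_FLOAT, 0);
layout(local_size_x = 16, local_size_y = 16) in;
layout(rgba32f, binding = 0) uniform image2D heightMap;

void main(void) {
    ivec2 p = ivec2(gl_GlobalInvocationID.xy);
    vec4 v = imageLoad(heightMap, p);
    imageStore(heightMap, p, v + vec4(0.0, 0.1, 0.0, 0.0)); // e.g. raise y
}
```

With a 16x16 local size, covering 513x513 texels takes glDispatchCompute(33, 33, 1), i.e. ceil(513/16) groups per axis, followed by a glMemoryBarrier before the result is read.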

How do I improve Direct3D streaming texture performance?

Submitted by 心不动则不痛 on 2019-12-21 19:40:54
Question: I'm trying to accelerate the drawing of a full-screen texture which changes every frame. On my system, I can get around 1000 FPS using GDI and BitBlt(), but I thought I could improve performance by using Direct3D and dynamic textures. Instead I'm only getting around 250 FPS. I'm running on a Mac Pro with an ATI HD 4870 with current drivers. I've tried using dynamic textures, which gives me a small gain (~15 FPS), and I've tried using a texture chain to avoid pipeline stalls, and that has no

How to send multiple textures to a fragment shader in WebGL?

Submitted by 家住魔仙堡 on 2019-12-21 13:11:22
Question: In the JavaScript portion of my code, here is the snippet that actually sends an array of pixels to the vertex and fragment shaders, but I am only working with 1 texture when I get to those shaders. Is there any way that I can send two textures at a time? If so, how would I 'catch' both of them on the GLSL side of the code? if (it > 0){ gl.activeTexture(gl.TEXTURE1); gl.bindTexture(gl.TEXTURE_2D, texture); gl.activeTexture(gl.TEXTURE0); gl.bindFramebuffer(gl.FRAMEBUFFER, FBO2);} else{ gl
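On the GLSL side, each texture is "caught" by its own sampler2D uniform; on the JavaScript side you bind one texture per texture unit (gl.activeTexture(gl.TEXTURE0) / gl.TEXTURE1 plus gl.bindTexture) and point each sampler at its unit with gl.uniform1i(location0, 0) and gl.uniform1i(location1, 1). A minimal WebGL 1 fragment-shader sketch (uniform names assumed):

```glsl
// One sampler uniform per texture; the unit number set via gl.uniform1i
// decides which bound texture each sampler reads.
precision mediump float;
uniform sampler2D u_texture0; // reads the texture bound on unit 0
uniform sampler2D u_texture1; // reads the texture bound on unit 1
varying vec2 v_texCoord;

void main(void) {
    vec4 a = texture2D(u_texture0, v_texCoord);
    vec4 b = texture2D(u_texture1, v_texCoord);
    gl_FragColor = mix(a, b, 0.5); // e.g. blend the two textures equally
}
```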

Texturing using texelFetch()

Submitted by 旧巷老猫 on 2019-12-21 12:15:03
Question: When I pass non-max values into a texture buffer, rendering draws the geometry with colors at max values. I found this issue while using the glTexBuffer() API. E.g., let's assume my texture data is GLubyte: when I pass any value less than 255, the color is the same as that drawn with 255, instead of a mixture of black and that color. I tried on AMD and NVIDIA cards, but the results are the same. Can you tell me where I could be going wrong? I am copying my code here: Vert shader: in vec2 a_position;

Triangle texture mapping OpenGL

Submitted by 自闭症网瘾萝莉.ら on 2019-12-21 05:16:19
Question: I am working on a project using the Marching Cubes algorithm and turning the data into a 3D model. Now I want to use texture mapping in OpenGL for my 3D model. I have tried a simple example to begin with, which maps a picture onto a triangle. Here is my code: int DrawGLScene(GLvoid) // Here's Where We Do All The Drawing { glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear Screen And Depth Buffer glLoadIdentity(); // Reset The Current Matrix glTranslatef(1.0f,0.0f,-6.0f); // Move

Compiling OpenGL SOIL on Mac OS X

Submitted by 两盒软妹~` on 2019-12-21 03:57:29
Question: How would I link in or compile SOIL (http://lonesock.net/soil.html) into my C++ OpenGL project on Mac OS X? Answer 1: On newer versions of Mac OS X, such as Leopard, you'll have to edit the makefile and add '-arch i386 -arch x86_64' to the CXX macro of the Makefile. After compiling, you'll also have to link CoreFoundation.framework into your project. So your final build command might look something like gcc -Wall -lSOIL -framework OpenGL -framework GLUT -framework CoreFoundation Answer 2: There's
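Concretely, the makefile edit from Answer 1 might look like the fragment below (hedged: SOIL's stock makefile varies between releases, and the fat i386/x86_64 build only applies to these older OS X versions):

```makefile
# In SOIL's projects/makefile: build fat objects for 32- and 64-bit.
CXX = gcc -arch i386 -arch x86_64

# After `make` installs libSOIL.a, link the app against SOIL plus the
# frameworks it needs:
#   gcc -Wall main.c -lSOIL -framework OpenGL -framework GLUT \
#       -framework CoreFoundation
```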

Haskell opengl texture GLFW

Submitted by 妖精的绣舞 on 2019-12-21 02:47:38
Question: I have been trying to get a script that just displays a texture on a square using texcoords. If possible, can you edit the script so that it works? From there I can work out how you did it, as that's how I learn. import Control.Monad (unless, when) import Graphics.Rendering.OpenGL import qualified Graphics.UI.GLFW as G import System.Exit import System.IO import Texture import Data.IORef import Graphics.GLUtil import qualified Data.Set as Set main :: IO () main = do let errorCallback err
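The program body is cut off above. As a hedged sketch of just the draw step, assuming a TextureObject already loaded (e.g. via the question's Texture module) and a compatibility-profile context, fixed-function texturing with the Haskell OpenGL bindings looks roughly like this:

```haskell
-- Assumes: import Graphics.Rendering.OpenGL
-- Draws a textured unit square; `tex` is a hypothetical already-loaded
-- TextureObject. This is immediate-mode rendering, so it needs a
-- compatibility (non-core) GLFW context.
drawTexturedSquare :: TextureObject -> IO ()
drawTexturedSquare tex = do
  texture Texture2D        $= Enabled
  textureBinding Texture2D $= Just tex
  renderPrimitive Quads $
    mapM_ (\(u, v, x, y) -> do
             texCoord (TexCoord2 u (v :: GLfloat))
             vertex   (Vertex2   x (y :: GLfloat)))
      [ (0, 0, -0.5, -0.5)  -- each tuple: (u, v, x, y)
      , (1, 0,  0.5, -0.5)
      , (1, 1,  0.5,  0.5)
      , (0, 1, -0.5,  0.5) ]
```

The key pairing is the same as in C: a texCoord immediately before each vertex.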

How to load and display image in OpenGL ES for iphone

Submitted by 妖精的绣舞 on 2019-12-20 12:32:21
Question: I'm a newbie trying to display a sprite on my iPhone screen using OpenGL ES. I know it's far simpler and easier to do with cocos2d, but now I'm trying to code directly in OpenGL. Is there any simple yet efficient way to load and display sprites in OpenGL ES? What I've found so far is much more complex. :( Answer 1: Here is some code to load a PNG from the bundle: UIImage* image = [UIImage imageNamed:@"PictureName.png"]; GLubyte* imageData = malloc(image.size.width * image.size.height * 4);
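Answer 1's code is cut off after the malloc. A hedged continuation using Core Graphics to fill the buffer and then upload it as a texture (the texture-parameter choices are assumptions):

```objc
// Continuation of Answer 1: draw the UIImage into the malloc'd RGBA buffer
// with Core Graphics, then upload it as a GL_TEXTURE_2D.
UIImage *image = [UIImage imageNamed:@"PictureName.png"];
GLubyte *imageData = malloc(image.size.width * image.size.height * 4);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(imageData,
    image.size.width, image.size.height, 8, image.size.width * 4,
    colorSpace, kCGImageAlphaPremultipliedLast);
CGContextDrawImage(context,
    CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);

GLuint textureID;
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
// GL_LINEAR instead of the default mipmapped filter, since no mipmaps
// are generated here.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image.size.width, image.size.height,
             0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);
free(imageData);
```

Drawing the sprite is then a textured quad, as in any OpenGL ES 1.x setup. Note that OpenGL ES 1.x also requires power-of-two texture dimensions.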