textures

CUDA: how to create 2D texture object?

﹥>﹥吖頭↗ · Submitted on 2019-12-01 11:30:46
Question: I'm trying to create a 2D texture object, 4x4 uint8_t. Here is the code:

__global__ void kernel(cudaTextureObject_t tex) {
    int x = threadIdx.x;
    int y = threadIdx.y;
    uint8_t val = tex2D<uint8_t>(tex, x, y);
    printf("%d, ", val);
    return;
}

int main(int argc, char **argv) {
    cudaTextureObject_t tex;
    uint8_t dataIn[16] = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15};
    uint8_t* dataDev = 0;
    cudaMalloc((void**)&dataDev, 16);
    struct cudaResourceDesc resDesc;
    memset(&resDesc, 0, sizeof(resDesc));
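The question's code cuts off right after zeroing resDesc. One plausible completion, sketched from the CUDA runtime texture-object API; the use of cudaMallocPitch, the texture descriptor settings, and the launch configuration are assumptions, not taken from the question:

```cuda
#include <cstdio>
#include <cstring>
#include <cstdint>
#include <cuda_runtime.h>

__global__ void kernel(cudaTextureObject_t tex) {
    int x = threadIdx.x;
    int y = threadIdx.y;
    uint8_t val = tex2D<uint8_t>(tex, x, y);
    printf("%d, ", val);
}

int main() {
    uint8_t dataIn[16] = {0, 1, 2, 3, 4, 5, 6, 7, 8,
                          9, 10, 11, 12, 13, 14, 15};

    // A 2D texture backed by linear memory must respect the device's
    // texture pitch alignment, so cudaMallocPitch is used instead of a
    // flat 16-byte cudaMalloc.
    uint8_t* dataDev = nullptr;
    size_t pitch = 0;
    cudaMallocPitch((void**)&dataDev, &pitch, 4 * sizeof(uint8_t), 4);
    cudaMemcpy2D(dataDev, pitch, dataIn, 4, 4, 4, cudaMemcpyHostToDevice);

    // Describe the resource: pitched 2D linear memory, one 8-bit channel.
    cudaResourceDesc resDesc;
    memset(&resDesc, 0, sizeof(resDesc));
    resDesc.resType = cudaResourceTypePitch2D;
    resDesc.res.pitch2D.devPtr = dataDev;
    resDesc.res.pitch2D.width = 4;
    resDesc.res.pitch2D.height = 4;
    resDesc.res.pitch2D.pitchInBytes = pitch;
    resDesc.res.pitch2D.desc = cudaCreateChannelDesc<uint8_t>();

    // Describe sampling: unnormalized integer coordinates, element-type
    // reads so tex2D<uint8_t>() returns the raw bytes.
    cudaTextureDesc texDesc;
    memset(&texDesc, 0, sizeof(texDesc));
    texDesc.readMode = cudaReadModeElementType;
    texDesc.normalizedCoords = 0;

    cudaTextureObject_t tex = 0;
    cudaCreateTextureObject(&tex, &resDesc, &texDesc, nullptr);

    kernel<<<1, dim3(4, 4)>>>(tex);
    cudaDeviceSynchronize();

    cudaDestroyTextureObject(tex);
    cudaFree(dataDev);
    return 0;
}
```
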

How to normalize image coordinates for texture space in OpenGL?

爷,独闯天下 · Submitted on 2019-12-01 10:36:24
Say I have an image of size 320x240. Now, sampling from a sampler2D with integer image coordinates ux, uy, I must normalize from the range [0, size] (where size is the width or the height) to texture coordinates. I wonder whether I should normalize like this: texture(image, vec2(ux/320.0, uy/240.0)), or like this: texture(image, vec2(ux/319.0, uy/239.0)), because ux = 0 ... 319 and uy = 0 ... 239. The latter will actually cover the whole range [0, 1], correct? That means 0 corresponds to, e.g., the left-most pixels and 1 corresponds to the right-most pixels, right? Also I want to maintain filtering, so I

Textured Line in CG

喜你入骨 · Submitted on 2019-12-01 10:35:29
Question: I am working on a drawing app for iOS. In this app I need to draw textured lines, for which I am using this method. But the texture comes out too small and misshapen. I want to know what I did wrong and how I can fix it. Here is my updated code:

CGPoint mid1 = midPoint(previousPoint1, previousPoint2);
CGPoint mid2 = midPoint(currentPoint, previousPoint1);
[curImage drawAtPoint:CGPointMake(0, 0)];
CGContextRef context = UIGraphicsGetCurrentContext();
[self.layer renderInContext:context];

Quad Strip Texturing Distortion

时光怂恿深爱的人放手 · Submitted on 2019-12-01 09:11:00
I have a GL_QUAD_STRIP that I am texture mapping. The quad strip folds back on itself to form a UV sphere, so the strip is generally made not of rectangles but of trapezoids. I am getting texture distortion issues. The texture coordinates themselves are correct (for example, they line up nicely where they should). The issue is on the trapezoidal faces: I would expect texels toward the larger end of a trapezoidal face to subtend a larger area, yet each face seems broken into two triangles. It looks similar to affine texture mapping (but I don't think it is; it still seems

CUDA 1D texture fetch always return 0

杀马特。学长 韩版系。学妹 · Submitted on 2019-12-01 08:22:46
Question: I am trying to test the CUDA 1D texture with a piece of simple code. It is quite straightforward: first allocate global memory, then bind it to a texture reference; access the texture from within a kernel via tex1D(); print out the value returned by the texture fetch. The code is as follows:

#include "cuda.h"
#include "cuda_runtime.h"
#include <iostream>
#include <vector>
#include <cstdio>
using namespace std;

texture<float, cudaTextureType1D, cudaReadModeElementType> texX;

__global__ void
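The snippet is cut off, but the symptom described (every fetch returning 0) is the classic result of reading a texture bound to linear memory with tex1D() instead of tex1Dfetch(): tex1D() only samples cudaArray-backed textures. A sketch of the working pattern; the texture name follows the question's texX, while the kernel name and the data values are illustrative:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Texture reference bound to linear device memory, as in the question.
texture<float, cudaTextureType1D, cudaReadModeElementType> texX;

__global__ void readTex(int n) {
    // tex1Dfetch() is the correct fetch for textures bound to linear memory
    // with cudaBindTexture(); tex1D() silently returns 0 in that case.
    for (int i = 0; i < n; ++i)
        printf("%f ", tex1Dfetch(texX, i));
}

int main() {
    const int n = 4;
    float hostX[n] = {1.0f, 2.0f, 3.0f, 4.0f};

    float* devX = nullptr;
    cudaMalloc(&devX, n * sizeof(float));
    cudaMemcpy(devX, hostX, n * sizeof(float), cudaMemcpyHostToDevice);

    // Bind the linear allocation to the texture reference.
    cudaBindTexture(nullptr, texX, devX, n * sizeof(float));

    readTex<<<1, 1>>>(n);
    cudaDeviceSynchronize();

    cudaUnbindTexture(texX);
    cudaFree(devX);
    return 0;
}
```

Note that texture references were deprecated in favor of texture objects in later CUDA releases, so new code should prefer cudaCreateTextureObject.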

Can't call glGenTextures on multithreaded android app

﹥>﹥吖頭↗ · Submitted on 2019-12-01 08:20:41
Question: I'm making an OpenGL ES Android app using the Android NDK, expanding from Android's gljni example, which can be found here. It uses GLSurfaceView. Textures are initialized in a JNI function called from onSurfaceChanged() of GLSurfaceView.Renderer. When the user touches the screen, the app needs more textures. To create them, glGenTextures() is called in a JNI function invoked from onTouchEvent(). The problem is that the thread id (which gettid() returns) seems completely arbitrary and not always the


Texture recognition within a specific area in the pic

爷,独闯天下 · Submitted on 2019-12-01 06:39:22
I'm new to the texture recognition field, and I would like to know the possible ways to approach a texture problem in OpenCV. I need to identify the texture within a region of the picture and tell whether it is uniform and homogeneous across the whole area or not. More specifically, I need to be able to tell whether a possibly fallen person is a person (with many different kinds of textures) or something else, like a pillow or a blanket. Could anyone suggest a solution, please? Is there some already-made OpenCV code to adapt? Thanks in advance! Why not use Haralick features? In other words, they are called

Sharing the GLES20 context and textures between different GLSurfaceViews?

不想你离开。 · Submitted on 2019-12-01 05:32:39
Question: Is it possible to share the GLES20 context between different GLSurfaceViews (within one Activity)? Alternatively, how would one share a set of textures between different GLSurfaceViews? On iOS, if you want to conserve memory and reuse (large) textures in different CAEAGLLayer-backed UIViews, you can pass a single EAGLContext object between them or use different EAGLContexts that share a common EAGLSharegroup object. I wonder how to accomplish this on Android. Is there any equivalent technique