coordinate-transformation

How to make stroke width immune to the current transformation matrix

a 夏天 submitted on 2019-11-28 21:06:13
In SVG (and Canvas, Quartz, PostScript, ...), the transformation matrix affects both the path coordinates and the line width. Is there a way to make an adjustment so the line width is not affected? That is, in the following example the scale differs between X and Y, which turns the square into a rectangle (which is OK), but it also makes the stroke wider on two sides:

    <g transform="rotate(30) scale(5,1)">
      <rect x="10" y="10" width="20" height="20" stroke="blue" fill="none" stroke-width="2"/>
    </g>

I can see that this behaviour would be useful in many cases, but is there a way to opt out of it? I suppose I…
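For SVG specifically, the standard opt-out is the `vector-effect="non-scaling-stroke"` presentation attribute, which makes the stroke width be computed in the viewport (screen) coordinate system rather than in user space, so it is unaffected by the non-uniform scale:

```svg
<g transform="rotate(30) scale(5,1)">
  <rect x="10" y="10" width="20" height="20"
        stroke="blue" fill="none" stroke-width="2"
        vector-effect="non-scaling-stroke"/>
</g>
```

Note this is SVG-only; Canvas, Quartz and PostScript have no equivalent attribute, so there the usual workaround is to transform the path points yourself and stroke with an identity matrix.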

Is it possible to anchor a matplotlib annotation to a data coordinate in the x-axis, but to a relative location in the y-axis?

被刻印的时光 ゝ submitted on 2019-11-28 13:27:09
I have a plot where I'd like to annotate a specific location on the x-axis with an arrow and a label. The location of the tip of the arrow needs to be specified exactly in data coordinates. The arrow should be vertical, so the x-coordinate of the blunt end of the arrow (and the text label) should also be specified exactly in data coordinates. However, I would ideally like to specify the y-position of the blunt end of the arrow relative to the axes bounding box, rather than to the data. My current working solution involves specifying the locations of both the arrow tip and the…
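Matplotlib supports exactly this mix: `annotate` accepts a tuple for `xycoords`/`textcoords`, so the x value can be interpreted in one coordinate system and the y value in another. A minimal sketch (the axis limits, data point and label are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.set_xlim(0, 10)
ax.set_ylim(0, 100)

# Arrow tip pinned to the data point (5, 40); the blunt end and the
# label sit at x=5 in data coordinates but at 90% of the axes height.
ann = ax.annotate(
    "event",
    xy=(5, 40), xycoords="data",
    xytext=(5, 0.9), textcoords=("data", "axes fraction"),
    arrowprops=dict(arrowstyle="->"),
    ha="center",
)
fig.canvas.draw()
```

Because the y of the blunt end is an axes fraction, the label stays at the same relative height when you pan or zoom the data.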

Is it possible to get which surface of a cube was clicked in OpenGL?

痞子三分冷 submitted on 2019-11-28 12:47:01
I have already created a cube and it spins perfectly. My task is to detect which face of the spinning cube was clicked. For example, if you click on the red surface of the cube, you win; but I am not able to work out which surface was clicked. Edit: I want the surface at the point where I touch. Here is my renderer code:

    public void onDrawFrame(GL10 arg0) {
        // GLES20.glEnable(GLES20.GL_TEXTURE_CUBE_MAP);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        GLES20.glUseProgram(iProgId);
        cubeBuffer.position(0);
        GLES20.glVertexAttribPointer(iPosition, 3, GLES20.GL_FLOAT, false, 0, cubeBuffer);
        GLES20…
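One common approach (independent of the GL code above) is to unproject the touch point into a world-space ray, transform the ray into the cube's local space using the inverse model matrix, and intersect it with the cube; the face is identified by which slab the ray enters through. A minimal pure-Python sketch of the intersection step for an axis-aligned cube centred at the origin (the function name and ray values are illustrative):

```python
def pick_cube_face(origin, direction, half=1.0):
    """Return the face ('+x', '-x', '+y', ...) that a ray first hits on
    the cube [-half, half]^3, or None if it misses.  origin and
    direction are 3-tuples; direction need not be normalized."""
    t_near, t_far = float("-inf"), float("inf")
    near_axis, near_sign = None, 0
    for axis in range(3):
        o, d = origin[axis], direction[axis]
        if abs(d) < 1e-12:
            if abs(o) > half:
                return None              # parallel to the slab and outside it
            continue
        t1, t2 = (-half - o) / d, (half - o) / d
        sign = -1 if t1 < t2 else 1      # which of the two faces is entered
        t1, t2 = min(t1, t2), max(t1, t2)
        if t1 > t_near:
            t_near, near_axis, near_sign = t1, axis, sign
        t_far = min(t_far, t2)
        if t_near > t_far:
            return None                  # slabs do not overlap: miss
    if near_axis is None or t_far < 0:
        return None                      # cube is behind the ray
    return ("+" if near_sign > 0 else "-") + "xyz"[near_axis]

# A ray shot down the -z axis from in front of the cube hits the +z face.
print(pick_cube_face((0.2, 0.1, 5.0), (0.0, 0.0, -1.0)))  # -> '+z'
```

The alternative often used on GLES 2.0 is colour picking: render each face in a unique flat colour to an off-screen buffer and read back the pixel under the touch with `glReadPixels`.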

Transpose z-position from perspective to orthographic camera in three.js

纵然是瞬间 submitted on 2019-11-28 05:25:52
I have a scene where I want to combine perspective objects (i.e. objects that appear smaller when they are far away) with orthographic objects (i.e. objects that appear the same size irrespective of distance). The perspective objects are part of the rendered "world", while the orthographic objects are adornments, like labels or icons. Unlike a HUD, I want the orthographic objects to be rendered "within" the world, which means that they can be covered by world objects (imagine a plane…
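Three.js uses the standard OpenGL depth conventions, so one way to frame the problem is: project the point's eye-space z through the perspective camera to get its NDC depth, then solve the orthographic camera's (linear) depth mapping for the eye-space z that lands on the same NDC depth. A sketch of that math in Python (the near/far values are illustrative; eye-space z is negative in front of the camera):

```python
def persp_ndc_depth(z_eye, near, far):
    # Perspective projection depth: non-linear in z_eye.
    return (far + near) / (far - near) + (2.0 * far * near) / ((far - near) * z_eye)

def ortho_eye_z(z_ndc, near, far):
    # Invert the orthographic depth mapping
    #   z_ndc = -2/(far-near) * z_eye - (far+near)/(far-near)
    # to get the eye-space z that produces the same NDC depth.
    return -(z_ndc + (far + near) / (far - near)) * (far - near) / 2.0

near, far = 0.1, 100.0
z_eye = -10.0                              # point 10 units in front of the camera
z_ndc = persp_ndc_depth(z_eye, near, far)
z_for_ortho = ortho_eye_z(z_ndc, near, far)
```

Placing the adornment at `z_for_ortho` in the orthographic camera's eye space makes it pass the depth test exactly as if it sat at `z_eye` in the perspective world, so world objects can still occlude it.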

Android OpenGL ES 2.0 screen coordinates to world coordinates

蓝咒 submitted on 2019-11-28 04:44:44
I'm building an Android application that uses OpenGL ES 2.0 and I've run into a wall. I'm trying to convert screen coordinates (where the user touches) to world coordinates. I've tried reading and playing around with GLU.gluUnProject, but I'm either doing it wrong or just don't understand it. This is my attempt:

    public void getWorldFromScreen(float x, float y) {
        int viewport[] = { 0, 0, width, height };
        float startY = ((float) (height) - y);
        float[] near = { 0.0f, 0.0f, 0.0f, 0.0f };
        float[] far = { 0.0f, 0.0f, 0.0f, 0.0f };
        float[] mv = new float[16];
        Matrix.multiplyMM(mv, 0, mViewMatrix, 0…
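What `gluUnProject` does is small enough to write out by hand, which also makes the usual pitfalls visible: the window y axis must be flipped, and the result must be divided by its homogeneous w component. A NumPy sketch of the algorithm (matrix layout and names are illustrative, not the Android API):

```python
import numpy as np

def unproject(win_x, win_y, win_z, mvp, viewport):
    """gluUnProject equivalent: window coords + depth in [0, 1] -> world.
    mvp = projection @ view (@ model); viewport = (x, y, w, h).
    win_y is assumed already flipped to GL's bottom-left convention."""
    vx, vy, vw, vh = viewport
    ndc = np.array([
        2.0 * (win_x - vx) / vw - 1.0,
        2.0 * (win_y - vy) / vh - 1.0,
        2.0 * win_z - 1.0,
        1.0,
    ])
    world = np.linalg.inv(mvp) @ ndc
    return world[:3] / world[3]          # the perspective divide is essential

# Round-trip check: project a known point, then recover it.
def perspective(fovy_deg, aspect, near, far):
    t = 1.0 / np.tan(np.radians(fovy_deg) / 2.0)
    return np.array([
        [t / aspect, 0, 0, 0],
        [0, t, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],
    ])

mvp = perspective(60.0, 1.0, 0.1, 100.0) @ np.eye(4)   # identity view
point = np.array([0.5, -0.25, -5.0, 1.0])
clip = mvp @ point
ndc = clip[:3] / clip[3]
win = ((ndc[0] + 1) / 2 * 800, (ndc[1] + 1) / 2 * 800, (ndc[2] + 1) / 2)
recovered = unproject(win[0], win[1], win[2], mvp, (0, 0, 800, 800))
```

To turn a touch into a ray, unproject the same (x, y) at `win_z = 0` and `win_z = 1` and take the segment between the two results.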

Reference coordinate system changes between OpenCV, OpenGL and Android Sensor

梦想与她 submitted on 2019-11-27 19:29:33
I am working with OpenCV, Android and OpenGL on an Augmented Reality project. As far as I know, the OpenGL coordinate system and the OpenCV coordinate system are as pictured [images omitted]. When combining these with the Android sensors, how can I do the coordinate-system conversions and the [R|t] matrix conversion? Is there a good tutorial or documentation where all of this confusing stuff is explained? If you look at the picture, you see that both coordinate systems have the same handedness, but the OpenCV one is rotated by pi around the x axis. This can be represented by the following rotation matrix: 1 0 0…
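That rotation by pi around the x axis simply flips the y and z axes, and since it is its own inverse, the same matrix converts in both directions. A short sketch of applying it to an OpenCV pose (function name is illustrative):

```python
import numpy as np

# Rotation by pi around x: flips the y and z axes.  It is involutory,
# so it converts OpenCV -> OpenGL and OpenGL -> OpenCV alike.
CV_TO_GL = np.array([
    [1.0,  0.0,  0.0],
    [0.0, -1.0,  0.0],
    [0.0,  0.0, -1.0],
])

def cv_pose_to_gl(R_cv, t_cv):
    """Convert an OpenCV [R|t] camera pose into OpenGL conventions."""
    return CV_TO_GL @ R_cv, CV_TO_GL @ np.asarray(t_cv, dtype=float)
```

Remaining differences to watch for are column-major storage in OpenGL and the flipped image y axis when moving pixel coordinates between the two libraries.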

Acceleration from device's coordinate system into absolute coordinate system

本小妞迷上赌 submitted on 2019-11-27 06:12:54
From my Android device I can read an array of linear acceleration values (in the device's coordinate system) and an array of absolute orientation values (in Earth's coordinate system). What I need is to obtain the linear acceleration values in the latter coordinate system. How can I convert them? EDIT after Ali's reply in a comment: All right, so if I understand correctly, when I measure the linear acceleration, the position of the phone does not matter at all, because the readings are given in Earth's coordinate system. Right? But I just did a test where I put the phone in different positions…
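The conversion itself is a single matrix-vector multiply: build the device-to-world rotation matrix from the orientation (on Android, `SensorManager.getRotationMatrix` produces it) and apply it to the acceleration vector. A NumPy sketch, assuming `R` maps device axes to world axes as that API documents:

```python
import numpy as np

def device_to_world(accel_device, R):
    """Rotate a linear-acceleration vector from device axes into the
    Earth frame.  R is assumed to be the 3x3 rotation matrix from
    SensorManager.getRotationMatrix (device -> world)."""
    return R @ np.asarray(accel_device, dtype=float)

# Illustrative: device lying flat but rotated 90 degrees about the
# vertical axis; a push along the device's y axis is a push along the
# world's -x axis.
R_z90 = np.array([
    [0.0, -1.0, 0.0],
    [1.0,  0.0, 0.0],
    [0.0,  0.0, 1.0],
])
world_accel = device_to_world([0.0, 1.0, 0.0], R_z90)
```

Note that `TYPE_LINEAR_ACCELERATION` readings are in device coordinates, not Earth coordinates, which is why the test described in the question gives different readings for different phone positions.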

Determining UTM zone (to convert) from longitude/latitude

℡╲_俬逩灬. submitted on 2019-11-27 04:31:23
I'm writing a program which expects a number of lat/long points, and I convert them internally to UTM in order to do some calculations in metres. The range of the lat/long points themselves is quite small: about 200m x 200m. They can almost always be relied on to be within a single UTM zone (unless you get unlucky and straddle the border of a zone). However, the zone that the lat/longs are in is unrestricted. One day the program might be run for people in Australia (and oh, how many zones does even a single state lie across, and how much pain has that caused me already...), and another day…
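The longitudinal zone is a simple function of longitude: zones are 6 degrees wide, numbered 1 to 60 starting at 180° W, so `zone = floor((lon + 180) / 6) + 1`. The only irregularities are the widened zone 32 over southern Norway and the merged zones around Svalbard. A sketch with those exceptions included:

```python
def utm_zone(lat, lon):
    """UTM longitudinal zone for a point in degrees, including the
    standard exceptions for southern Norway and Svalbard."""
    if 56.0 <= lat < 64.0 and 3.0 <= lon < 12.0:
        return 32                      # zone 32 is widened over southern Norway
    if 72.0 <= lat < 84.0:             # Svalbard: zones 32, 34, 36 are unused
        if 0.0 <= lon < 9.0:
            return 31
        if 9.0 <= lon < 21.0:
            return 33
        if 21.0 <= lon < 33.0:
            return 35
        if 33.0 <= lon < 42.0:
            return 37
    # min() keeps lon == 180.0 in zone 60 instead of a non-existent 61.
    return min(int((lon + 180.0) // 6) + 1, 60)

print(utm_zone(-33.87, 151.21))  # Sydney -> 56
```

For the 200 m x 200 m use case, computing the zone from the centroid of the points and projecting everything into that one zone keeps the whole data set in a single consistent metric frame, even if a few points technically straddle a zone border.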