lwjgl

LWJGL VBO content is always drawn to the center of the screen (0,0,0)

廉价感情. Submitted on 2019-12-12 00:27:23
Question: I started following a tutorial about modern OpenGL rendering and adapted the C++ code from a VBO lesson to work with LWJGL. I initialized the VBO with the following code: int vbo = GL15.glGenBuffers(); GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo); GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW); Here "buffer" is initialized as FloatBuffer buffer = BufferUtils.createFloatBuffer(9) and then filled with {-0.5f, -0.5f, 0, 0.5f, -0.5f, 0, 0, 0.5f, 0} via buffer.put(val). My game…
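
A minimal sketch of the upload path from the excerpt, with the one step it omits: buffer.flip(). LWJGL reads a buffer from its position to its limit, so uploading an unflipped buffer sends zero bytes. Also note that without a model-view-projection transform in the vertex shader, positions like these are interpreted directly in clip space, which keeps the triangle pinned around the origin no matter how the camera moves. Class and method names below are placeholders.

    import java.nio.FloatBuffer;
    import org.lwjgl.BufferUtils;
    import org.lwjgl.opengl.GL15;

    public class VboSetup {
        public static int createTriangleVbo() {
            FloatBuffer buffer = BufferUtils.createFloatBuffer(9);
            buffer.put(new float[] {-0.5f, -0.5f, 0, 0.5f, -0.5f, 0, 0, 0.5f, 0});
            buffer.flip(); // without this, glBufferData sees an empty buffer

            int vbo = GL15.glGenBuffers();
            GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo);
            GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW);
            return vbo;
        }
    }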

glfwGetPrimaryMonitor is returning 0

时光总嘲笑我的痴心妄想 Submitted on 2019-12-11 23:29:02
Question: If I call long rslt = glfwGetPrimaryMonitor(); the result is 0. The GLFW library can't detect my monitors for some unknown reason. Also, if I try PointerBuffer rslt = glfwGetMonitors(); the result is null... My monitors work fine. I have seen suggestions that I should uninstall my monitors and let Windows reinstall the appropriate drivers, but that seems like an inadequate workaround: I can't expect users to do this just to run my application. I'd much prefer to at least get…
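
The most common cause of a 0 handle here is querying monitors before GLFW has been initialized (or from the wrong thread). A minimal sketch, assuming LWJGL 3's GLFW bindings, that also installs an error callback so GLFW can report what went wrong:

    import static org.lwjgl.glfw.GLFW.*;
    import org.lwjgl.glfw.GLFWErrorCallback;

    public class MonitorCheck {
        public static void main(String[] args) {
            GLFWErrorCallback.createPrint(System.err).set(); // print GLFW errors to stderr
            if (!glfwInit()) {
                throw new IllegalStateException("Unable to initialize GLFW");
            }
            long primary = glfwGetPrimaryMonitor(); // nonzero once GLFW is initialized
            System.out.println("Primary monitor handle: " + primary);
            glfwTerminate();
        }
    }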

Error when launching Java app on desktop using LibGdx/Eclipse

好久不见. Submitted on 2019-12-11 22:34:17
Question: I'm using LibGDX to develop a Java-based game. I haven't decided whether the game should be deployed on Android or PC, so I figured I can at least develop and test on desktop and then decide. I've used LibGDX before and it worked fine, but that must have been before I did a complete system wipe (as I like to do a few times per year). Anyhow, when I try to launch the game in Eclipse as a "Java Application" I get the following error: Exception in thread "LWJGL Application" com.badlogic.gdx.utils…
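
For reference, a minimal sketch of a desktop entry point for the LWJGL backend the error message points at; MyGame is a placeholder for the project's own ApplicationListener. After a system wipe, a known-good launcher like this helps separate launcher problems from workspace or missing-natives problems:

    import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
    import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

    public class DesktopLauncher {
        public static void main(String[] args) {
            LwjglApplicationConfiguration config = new LwjglApplicationConfiguration();
            config.title = "MyGame"; // placeholder title
            config.width = 800;
            config.height = 480;
            new LwjglApplication(new MyGame(), config); // MyGame extends ApplicationAdapter
        }
    }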

What is the simplest method for rendering shadows on a scene in OpenGL?

守給你的承諾、 Submitted on 2019-12-11 19:46:23
Question: I am using the LWJGL package and am able to create a basic scene, draw shapes (with or without textures), and move and rotate a custom 'Camera' object to render the scene accordingly. However, when it comes to creating shadows, I am at a loss. I can think of a basic algorithm for creating shadows: 1) render the scene from the camera's view as if everything were in shadow; 2) render the scene from the light's view, lighting up the visible part of the scene (maybe darkening the scene as it gets farther…
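
One common answer to "simplest" is shadow mapping: render the scene's depth from the light's point of view into a texture, then compare each fragment's light-space depth against it in the main pass. A hedged sketch of just the setup step (a depth-only framebuffer), assuming OpenGL 3.0-level FBO support; the texture size is arbitrary:

    import java.nio.ByteBuffer;
    import org.lwjgl.opengl.GL11;
    import org.lwjgl.opengl.GL14;
    import org.lwjgl.opengl.GL30;

    public class ShadowMap {
        public static final int SIZE = 1024;

        public static int createDepthFbo() {
            // Depth texture the light pass renders into
            int depthTex = GL11.glGenTextures();
            GL11.glBindTexture(GL11.GL_TEXTURE_2D, depthTex);
            GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL14.GL_DEPTH_COMPONENT16,
                    SIZE, SIZE, 0, GL11.GL_DEPTH_COMPONENT, GL11.GL_FLOAT, (ByteBuffer) null);
            GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);
            GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);

            // Framebuffer with only a depth attachment
            int fbo = GL30.glGenFramebuffers();
            GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
            GL30.glFramebufferTexture2D(GL30.GL_FRAMEBUFFER, GL30.GL_DEPTH_ATTACHMENT,
                    GL11.GL_TEXTURE_2D, depthTex, 0);
            GL11.glDrawBuffer(GL11.GL_NONE); // depth only, no color output
            GL11.glReadBuffer(GL11.GL_NONE);
            GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0);
            return fbo;
        }
    }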

Appearance of a triangle strip. Surface normals? Or windings?

我只是一个虾纸丫 Submitted on 2019-12-11 19:03:45
Question: Below is a picture of my current output. I am using flat shading and have put each vertex into its respective triangle object; I then use these vertices to calculate the surface normals. I have read that because my triangles share vertices, calculating the normals may be an issue, but to me this looks like a winding problem, given that every other triangle is off. I have provided some of my code below for anyone who wants to look through it and get a better idea of what the issue…
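
The every-other-triangle pattern is characteristic of winding: in a triangle strip the winding order flips on each successive triangle, so per-face normals computed identically for all of them point inward on every other face. A minimal sketch, assuming vertices are stored as (x, y, z) float triples:

    public class FaceNormal {
        // a, b, c are (x, y, z) triples; set flip = true for odd-indexed
        // triangles in a strip, whose winding is reversed.
        public static float[] normal(float[] a, float[] b, float[] c, boolean flip) {
            float[] u = {b[0] - a[0], b[1] - a[1], b[2] - a[2]};
            float[] v = {c[0] - a[0], c[1] - a[1], c[2] - a[2]};
            float nx = u[1] * v[2] - u[2] * v[1];
            float ny = u[2] * v[0] - u[0] * v[2];
            float nz = u[0] * v[1] - u[1] * v[0];
            if (flip) { nx = -nx; ny = -ny; nz = -nz; } // undo the reversed winding
            float len = (float) Math.sqrt(nx * nx + ny * ny + nz * nz);
            return new float[] {nx / len, ny / len, nz / len};
        }
    }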

Stop sprite from ghosting through another sprite

北城以北 Submitted on 2019-12-11 17:59:10
Question: OK, so I've just started learning Java (I usually program in Objective-C). My first game is similar to Pokémon, though obviously a lot simpler. The trouble I'm having is that I can't find a way to stop two sprites from 'ghosting' through each other. On screen I have borders set up (boundaries), a player sprite, and an enemy sprite. public void playerUpdate(GameContainer gc, int delta) throws SlickException { Input input = gc.getInput(); // Right Key Pressed if (input.isKeyDown…
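
A hedged sketch of the usual Slick2D approach: compute the move first, then only commit it if the player's trial bounding box does not intersect the enemy's. The rectangles and names are placeholders for however the sprites actually store their positions:

    import org.newdawn.slick.geom.Rectangle;

    public class Collision {
        // Returns true if moving the player by (dx, dy) would not overlap the enemy.
        public static boolean canMove(Rectangle player, float dx, float dy, Rectangle enemy) {
            Rectangle trial = new Rectangle(player.getX() + dx, player.getY() + dy,
                    player.getWidth(), player.getHeight());
            return !trial.intersects(enemy);
        }
    }

Inside playerUpdate, each key-down branch would then guard its position change with a canMove(...) check instead of applying it unconditionally.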

Implementation paradigm for efficiently streaming zoomable images from machine vision camera?

半世苍凉 Submitted on 2019-12-11 13:37:06
Question: I'm working on a machine vision project where I would like to use Java, despite the benefits of working in a native environment, because the long-term goal is to develop a plugin for a pre-existing framework written in Java. Currently I'm using a machine vision camera that streams 12 MP images over a USB 3.0 bus at approximately 7 FPS, and I may in the future move to higher-resolution cameras (>29 MP). The native libraries I'm using are in C, which I have already successfully been able…
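
One common Java-side pattern for this kind of pipeline is to have the native C library fill a direct ByteBuffer (so frames cross JNI without copying) and stream each frame into a pre-allocated OpenGL texture with glTexSubImage2D rather than reallocating per frame. A hedged sketch; the texture id, dimensions, and the single-channel pixel format are assumptions for illustration:

    import java.nio.ByteBuffer;
    import org.lwjgl.opengl.GL11;

    public class FrameUploader {
        private final int textureId;
        private final int width, height;

        public FrameUploader(int textureId, int width, int height) {
            this.textureId = textureId;
            this.width = width;
            this.height = height;
        }

        // Called once per camera frame; 'frame' is the direct buffer the
        // native capture library writes into.
        public void upload(ByteBuffer frame) {
            GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
            GL11.glTexSubImage2D(GL11.GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL11.GL_LUMINANCE, GL11.GL_UNSIGNED_BYTE, frame);
        }
    }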

LWJGL png texture transparency (textureColour.a white color instead of black)

南笙酒味 Submitted on 2019-12-11 12:27:50
Question: I have grass models textured with PNG images, and I get a white background color instead of the black one I want. Why is that, and what should I do to fix it? I am using LWJGL 3 and PNGDecoder.jar. Texture loader code: public int loadTexture(String fileName) { ByteBuffer buf = null; int tWidth = 0; int tHeight = 0; try { // Open the PNG file as an InputStream InputStream in = new FileInputStream("res/" + fileName + ".png"); // Link the PNG decoder to this stream PNGDecoder decoder = new…
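
Solid boxes around PNG foliage usually mean the alpha channel is loaded but never used. A minimal sketch of the blending state to enable before drawing the grass; in the fragment shader, a line like "if (textureColour.a < 0.5) discard;" additionally handles hard cut-out edges:

    import org.lwjgl.opengl.GL11;

    public class AlphaSetup {
        public static void enableBlending() {
            GL11.glEnable(GL11.GL_BLEND); // blend incoming fragments with the framebuffer
            GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
        }
    }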

LWJGL Drawing colored text to the screen issue

北城余情 Submitted on 2019-12-11 10:53:55
Question: I am trying to display text on the screen and Eclipse is telling me that the drawString method does not accept a Color variable. This is my code: import java.awt.Color; import java.awt.Font; import org.newdawn.slick.TrueTypeFont; public class Text { static TrueTypeFont font; public static void drawText(int x, int y, String text) { Font awtFont = new Font("Terminal", Font.BOLD, 24); font = new TrueTypeFont(awtFont, false); font.drawString(x, y, text, Color.yellow); //x, y, string to draw, color…
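
The mismatch is between the two Color classes: Slick's TrueTypeFont.drawString expects org.newdawn.slick.Color, while the code imports java.awt.Color (which is only needed for the AWT Font). A corrected sketch of the same class:

    import java.awt.Font;

    import org.newdawn.slick.Color;
    import org.newdawn.slick.TrueTypeFont;

    public class Text {
        static TrueTypeFont font;

        public static void drawText(int x, int y, String text) {
            Font awtFont = new Font("Terminal", Font.BOLD, 24);
            font = new TrueTypeFont(awtFont, false);
            font.drawString(x, y, text, Color.yellow); // Slick's Color, not AWT's
        }
    }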

Strange glVertexAttrib Behavior?

牧云@^-^@ Submitted on 2019-12-11 08:59:55
Question: I'm currently adding a shader system to my project and have run into a bit of a snag that I'm hoping someone can sort out. My project is written in Java, uses the LWJGL library, and targets OpenGL 2.1. From my understanding, the glVertexAttrib functions set an attribute that is meant to remain constant until a new value is passed. For example, the following call should make all of my geometry white until I change it: glVertexAttrib3f(shader.getAttributeLocation("in_Color"), 1.0f, 1.0f…
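
The usual gotcha with constant attributes is that glVertexAttrib* values only take effect while the attribute's array is disabled; if glEnableVertexAttribArray was called for that location, the pointer data wins and the constant is ignored. A minimal sketch, assuming the location comes from the shader wrapper mentioned in the excerpt:

    import org.lwjgl.opengl.GL20;

    public class ConstantAttrib {
        public static void setConstantColor(int location, float r, float g, float b) {
            GL20.glDisableVertexAttribArray(location); // use the constant, not the array
            GL20.glVertexAttrib3f(location, r, g, b);
        }
    }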