
I know two ways to do this:

1) Use the frame buffer: render all objects, each with a different color (no lighting, no texturing). When you click with your mouse, you read the 2D coordinates, look up the color of the pixel you clicked in the frame buffer, and then find the object according to that color.
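A minimal sketch of that color-picking idea in C with legacy OpenGL. The draw_object() callback, the pick_object() name and the one-byte-per-object color encoding are assumptions made for the example, not part of any existing code:

    /* Color-picking sketch (legacy OpenGL). draw_object() is a hypothetical
       callback that issues one object's geometry; the object index is encoded
       in the red channel, which limits this sketch to fewer than 256 objects. */
    #include <GL/gl.h>

    void draw_object(int index);   /* hypothetical: draws object "index" */

    void render_for_picking(int object_count)
    {
        glDisable(GL_LIGHTING);    /* flat, unlit colors only */
        glDisable(GL_TEXTURE_2D);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        for (int i = 0; i < object_count; ++i) {
            glColor3ub((GLubyte)(i + 1), 0, 0);   /* color 0 is the background */
            draw_object(i);
        }
        glFlush();
    }

    /* Returns the picked object index, or -1 for the background. mouse_y is
       flipped because window coordinates start at the top, while OpenGL's
       read origin is at the bottom. */
    int pick_object(int mouse_x, int mouse_y, int window_height)
    {
        GLubyte pixel[3];
        glReadPixels(mouse_x, window_height - mouse_y - 1, 1, 1,
                     GL_RGB, GL_UNSIGNED_BYTE, pixel);
        return pixel[0] == 0 ? -1 : (int)pixel[0] - 1;
    }

You would typically render this pass into the back buffer (or an off-screen buffer), call pick_object() before swapping, and then render the normal frame on top.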

2) Render your scene and read the clicked coordinates. Now you have to transfer the 2D point back into 3D - create a ray from the camera through the clicked point. Maybe you can use gluUnProject for this purpose. Or you can create the two 3D points yourself: take your screen coordinates (x, y), divide by the screen size and remap so the coordinates lie in the interval [-1, 1], and use a z coordinate of -1 for the starting point and 1 for the ending point. These are NDC (normalized device coordinates). Then multiply them by the inverse projection matrix and the inverse modelview matrix; the results should be the starting and ending points of the ray. Then do ray tracing and find the first object in the ray's path (ray-object collision). You can find something here - it will help you understand those transformations.
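Here is a sketch of the gluUnProject route, again in C. It assumes the matrices are still the ones you rendered with and the default depth range, where window depth 0 and 1 correspond to the NDC z values of -1 and 1 mentioned above; the click_to_ray() name is just for the example:

    /* Turn a mouse click into a world-space ray by unprojecting the click
       at the near plane (depth 0.0) and at the far plane (depth 1.0). */
    #include <GL/gl.h>
    #include <GL/glu.h>

    void click_to_ray(int mouse_x, int mouse_y,
                      GLdouble ray_start[3], GLdouble ray_end[3])
    {
        GLdouble model[16], proj[16];
        GLint viewport[4];

        glGetDoublev(GL_MODELVIEW_MATRIX, model);
        glGetDoublev(GL_PROJECTION_MATRIX, proj);
        glGetIntegerv(GL_VIEWPORT, viewport);

        /* Flip y: window coordinates are top-left based,
           OpenGL's are bottom-left based. */
        GLdouble win_x = (GLdouble)mouse_x;
        GLdouble win_y = (GLdouble)(viewport[3] - mouse_y - 1);

        gluUnProject(win_x, win_y, 0.0, model, proj, viewport,
                     &ray_start[0], &ray_start[1], &ray_start[2]);
        gluUnProject(win_x, win_y, 1.0, model, proj, viewport,
                     &ray_end[0], &ray_end[1], &ray_end[2]);
    }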

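For the ray-object collision step, one simple option (an assumption for this sketch, not the only way) is to approximate every object with a bounding sphere and keep the closest hit along the ray; the centers and radii arrays are hypothetical:

    /* Find the first object along the ray from ray_start to ray_end,
       approximating each object by a bounding sphere.
       Returns the object index, or -1 if nothing is hit. */
    #include <math.h>

    typedef struct { double x, y, z; } Vec3;

    static Vec3   vsub(Vec3 a, Vec3 b) { Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
    static double vdot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    /* Smallest non-negative t with |start + t*dir - center| = radius,
       or -1.0 if the ray misses the sphere. */
    static double ray_sphere(Vec3 start, Vec3 dir, Vec3 center, double radius)
    {
        Vec3 oc = vsub(start, center);
        double a = vdot(dir, dir);
        double b = 2.0 * vdot(oc, dir);
        double c = vdot(oc, oc) - radius * radius;
        double disc = b * b - 4.0 * a * c;
        if (disc < 0.0)
            return -1.0;
        double t = (-b - sqrt(disc)) / (2.0 * a);
        if (t < 0.0)                       /* ray starts inside the sphere */
            t = (-b + sqrt(disc)) / (2.0 * a);
        return t >= 0.0 ? t : -1.0;
    }

    int pick_nearest(Vec3 ray_start, Vec3 ray_end,
                     const Vec3 *centers, const double *radii, int count)
    {
        Vec3 dir = vsub(ray_end, ray_start);
        int best = -1;
        double best_t = 1e300;

        for (int i = 0; i < count; ++i) {
            double t = ray_sphere(ray_start, dir, centers[i], radii[i]);
            if (t >= 0.0 && t < best_t) {
                best_t = t;
                best = i;
            }
        }
        return best;   /* first object along the ray, or -1 */
    }

If you need exact picking rather than bounding-sphere picking, you would test the ray against the actual triangles of the best candidates instead.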