Touch Coordinates
Feature Request: Get Touch Coordinates
Proposal:
Implement the following command:
list [float x, float y, integer side, integer link_number] = llDetectedTextureCoordinates(integer num_detected);
This function returns the texture x and y coordinates from within a touch event.
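For illustration, here is a minimal sketch of how the proposed call might be used inside a touch event. llDetectedTextureCoordinates does not exist yet, so this is hypothetical and will not compile today; the list unpacking follows the return order proposed above.

// Hypothetical usage of the PROPOSED llDetectedTextureCoordinates call.
default
{
    touch_start(integer total_number)
    {
        integer i;
        for (i = 0; i < total_number; ++i)
        {
            list result = llDetectedTextureCoordinates(i);
            float x = llList2Float(result, 0);        // 0.0 .. 1.0 across the touched side
            float y = llList2Float(result, 1);        // 0.0 .. 1.0 across the touched side
            integer side = llList2Integer(result, 2); // side (face) number on the touched prim
            integer link = llList2Integer(result, 3); // link number within the linkset
            llOwnerSay("Touched link " + (string)link + ", side " + (string)side
                + " at <" + (string)x + ", " + (string)y + ">");
        }
    }
}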
Goal:
The goal is to get more detailed feedback about where exactly an object was touched. It will return the prim's link number within the linkset, the side of that prim, and the (x, y) coordinates of where that side was touched.
This will greatly help HUDs and other panels implement a user interface without using additional prims as buttons. You could draw 10 switches on a single texture, put it on one prim, and then use the llDetectedTextureCoordinates function to find out which button was pressed.
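As a sketch of that use case, assuming the proposed function exists and the touched side shows ten equal-width buttons in a single horizontal row, the x coordinate alone identifies the button:

// Sketch only: llDetectedTextureCoordinates is the proposed function, not an existing one.
default
{
    touch_start(integer total_number)
    {
        list result = llDetectedTextureCoordinates(0);
        float x = llList2Float(result, 0);
        // Ten equal-width buttons: button 0 covers x in [0.0, 0.1), button 1 covers [0.1, 0.2), and so on.
        integer button = (integer)(x * 10.0);
        if (button > 9) button = 9; // guard against x being exactly 1.0
        llOwnerSay("Button " + (string)button + " was pressed");
    }
}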
How to implement:
- Add support for the llDetectedTextureCoordinates LSL-command for the server and client.
- Change how client touches are handled so that the server can find out where the object was touched.
- To support scaling of the object, the x and y coordinates will be floats ranging from 0.0 to 1.0, relative values independent of the real prim dimensions. So x = 0.5 and y = 0.5 will mean the exact center of that side, as sketched below.
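Because the values are normalized, a script never needs to know the prim's size. A sketch (again assuming the proposed function) that detects a touch near the center of a side works unchanged on a 0.1 m prim and a 10 m prim:

// Sketch only: the proposed call returns normalized coordinates, so no prim dimensions are needed.
default
{
    touch_start(integer total_number)
    {
        list result = llDetectedTextureCoordinates(0);
        float x = llList2Float(result, 0);
        float y = llList2Float(result, 1);
        if (llFabs(x - 0.5) < 0.1 && llFabs(y - 0.5) < 0.1)
            llOwnerSay("Touched near the center of the side");
        else
            llOwnerSay("Touched at <" + (string)x + ", " + (string)y + ">");
    }
}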
Requested by: User:Fairlight Lake
Pros:
- greatly improves the user experience with HUDs
- reduces the number of prims used on operator panels and other objects that rely heavily on touch events to perform different tasks
Cons:
- possibly slightly more overhead in touch-event notifications from the client to the server
Please use the talk page to discuss this Feature Request.