Touch Coordinates
Revision as of 07:24, 26 January 2007
Feature Request: Get Touch Coordinates
Proposal:
Implement the following command:
list [float x, float y, integer side, integer link_number] = llDetectedSideCoordinates(integer num_detected);
This function returns the side x and y coordinates from within a touch event.
Goal:
The goal is to get more detailed feedback about exactly where an object was touched. The function will return the prim number within the linkset, the side of that prim, and the coordinates (x, y) where that side was touched.
This will greatly help HUDs and other panels to implement a user interface without using additional prims as buttons. You could draw 10 switches on a single texture, apply it to one prim, and then use the llDetectedSideCoordinates function to find out which button was pressed.
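The button-panel scenario above could be scripted roughly as follows. This is only a sketch: llDetectedSideCoordinates is the proposed function and does not exist yet, its list return format is taken from the signature above, and the 5x2 button grid is an assumed layout for illustration.

<pre>
default
{
    touch_start(integer total_number)
    {
        // Proposed function: returns [float x, float y, integer side, integer link_number]
        list hit = llDetectedSideCoordinates(0);
        float x = llList2Float(hit, 0);        // 0.0 .. 1.0 across the touched side
        float y = llList2Float(hit, 1);        // 0.0 .. 1.0 up the touched side
        integer side = llList2Integer(hit, 2);

        // Map the normalized coordinates onto a 5-column, 2-row button grid
        // painted on the texture of this side.
        integer col = (integer)(x * 5.0);
        integer row = (integer)(y * 2.0);
        if (col > 4) col = 4;   // clamp the x == 1.0 edge case
        if (row > 1) row = 1;   // clamp the y == 1.0 edge case

        integer button = row * 5 + col;        // button index 0 .. 9
        llOwnerSay("Pressed button " + (string)button
                   + " on side " + (string)side);
    }
}
</pre>

Because the coordinates are normalized, the same script works regardless of how the prim is scaled.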
How to implement:
- Add support for the llDetectedSideCoordinates LSL-command for the server and client.
- Change handling of client touching things so it is possible for the server to find out where the object was touched.
- To support scaling of the object, the x and y coordinates will be floats ranging from 0.0 to 1.0, as relative values independent of the real prim dimensions. So x=0.5 and y=0.5 will mean the exact center of that side.
Requested by: User:Fairlight Lake
Pros:
- greatly improve User Experience with HUDs
- reduce the number of used prims on operator panels and other objects that heavily rely on touch events to do different tasks
Cons:
- possibly slightly more overhead on touch event notification from client to server
Open questions:
- I am not sure whether issues would arise with slanted (tapered or sheared) sides.
Please use the talk page to discuss this Feature Request.