Talk:Touch Coordinates


General

What neither proposal addresses is Planar Mapping. I suspect that, regardless, implementing this will require changes to the Rendering Pipeline. Strife Onizuka 15:35, 2 April 2007 (PDT)

Proposal A

Might be better implemented as providing the texture U and V. This would keep code simple when a texture spans a few surfaces, and would make usage clearer with spheres, tori, etc. --Moshe Sapwood 09:31, 26 January 2007 (PST)

No problem with that, just as long as it's persistent through size changes, to relieve the script of having to get texture size, offset, prim size and all that and calculate the button positions dynamically all the time. --Fairlight Lake 19:10, 26 January 2007 (CET)

Sure, UVs normally range from 0 to 1. For the default texture mapping on a cube, the returned result would be identical whether face coordinates (as in the proposal) or texture coordinates were used. --Moshe Sapwood 08:30, 28 January 2007 (PST)
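For illustration, a minimal sketch of how a script might use such a call, assuming a hypothetical llDetectedTouchUV that returns the texture U and V of the touch as the x and y components of a vector (the name and signature are illustrative, not part of the proposal text):

    default
    {
        touch_start(integer total_number)
        {
            // Hypothetical call: texture UV of the touch as <u, v, 0.0>
            vector uv = llDetectedTouchUV(0);

            // Divide the texture into a 2x2 button grid, independent of prim size or stretching
            integer column = (integer)(uv.x * 2.0);
            integer row = (integer)(uv.y * 2.0);
            llOwnerSay("Pressed button at row " + (string)row + ", column " + (string)column);
        }
    }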

What about hollow prims? The inside of the prim can have several "sides", so the coordinates would be very unpredictable in both xy and uv. Also, UV points may not be helpful if the texture repeats multiple times across the same face. This idea just needs more thought, and then it should be well received in the feature proposal area. If this idea is proposed, please post a link. TxMasterG Ping 20:29, 30 March 2007 (PDT)

It'd be nice if it had some sort of visual feedback on the screen to show where you're pointing, like in Doom 3. In fact, text field support and reactive buttons would be even nicer, albeit far far more complicated to implement. Bbot Dmytryk 23:22, 1 April 2007 (PDT)

Proposal B

I've split the proposal; Proposal B addresses the issues of repeats and resizing. From my proposal I stripped the link number, since there is already a llDetectedLinkNumber. I did this because it's easier to handle a vector than a list (less CPU intensive). Strife Onizuka 15:26, 2 April 2007 (PDT)

A vector is made of three floats. Perhaps we should split this into something like llDetectedSide(blah blah) and make this proposal return <float x, float y, (float)0>. This would still mean that there is no list. --TxMasterG Ping 13:10, 14 April 2007 (PDT)
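As a minimal sketch of how that split might read in a touch handler (both llDetectedSide and llDetectedTouchCoord below are hypothetical names from this discussion, not existing functions):

    default
    {
        touch_start(integer total_number)
        {
            // Both calls are hypothetical names from this discussion
            integer side = llDetectedSide(0);       // which side/face was touched
            vector coord = llDetectedTouchCoord(0); // <x, y, 0.0> position on that side
            llOwnerSay("Side " + (string)side + " touched at " + (string)coord);
        }
    }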

General Discussion

I don't know if either plan should be tied to the texture scaling and offset. It seems like it could create more problems and lengthen the time to implementation. If it just returns the horizontal and vertical components on a scale of 0-1 (float) and the face, you can figure everything else out. You can get texture offset and scaling with other calls and perform the math. If it stretches across multiple faces, then you check the face number returned. --Anthony Reisman 11:56, 28 September 2007 (PDT)
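As a sketch of the math being described, assuming a hypothetical call that returns the horizontal and vertical face components as <s, t, 0.0> in the 0-1 range, plus a hypothetical per-touch face lookup; llGetTextureScale and llGetTextureOffset are existing functions, and the conversion assumes the usual centred-repeat texture convention (texture rotation is ignored):

    default
    {
        touch_start(integer total_number)
        {
            integer face = llDetectedTouchFace(0); // hypothetical at the time of this discussion
            vector st = llDetectedTouchST(0);      // hypothetical: face coordinates in 0-1

            // Existing calls: per-face texture repeats and offset
            vector scale = llGetTextureScale(face);
            vector offset = llGetTextureOffset(face);

            // Face coordinates -> texture coordinates (rotation via llGetTextureRot ignored)
            float u = (st.x - 0.5) * scale.x + 0.5 + offset.x;
            float v = (st.y - 0.5) * scale.y + 0.5 + offset.y;
            llOwnerSay("Texture coordinates: " + (string)u + ", " + (string)v);
        }
    }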

There is currently a feature request at SVC-519.

Also, there is visual feedback with the pointer, and in mouselook you can turn on crosshairs from the Client menu.

The client support for this already exists: gLastHitUCoord and gLastHitVCoord are set whenever a pick is performed with gPickFaces == TRUE. The plan has always been to expose this to scripts at some point. --Richard Linden 12:24, 12 October 2007 (PDT)