Talk:Web Textures

This is a talk page associated with the Open Source Portal. Changes to all pages like this one can be tracked by watching the "Related Changes" for "Category:Open Source Talk Page".
Please sign comments you leave here by putting four tildes (~~~~) at the end of your comment. For more guidelines, see the Talk Page Guidelines.

Argent, let's try to keep these article-like, rather than discussion-like. I integrated your comments.

On the white/black list pros and cons, instead of commenting on them, put a subpoint under them to say what would mitigate that con, or just take the con out if you think it's completely bogus. Gigs Taggart 09:54, 26 January 2007 (PST)

Done -- Argent Stonecutter 12:18, 26 January 2007 (PST)

Gigs, the discussion of privacy concerns needs to be in a separate section. Argent Stonecutter 06:42, 31 January 2007 (PST)

Making page moves

Please make a request to make a page move rather than making moves by cutting/pasting (just request on User talk:Rob Linden for now...we'll come up with a more streamlined process later). It's really a pain to merge histories after a cut/paste/redirect move, and much simpler to move the history/discussion and everything in one move. Thanks -- Rob Linden 10:40, 26 January 2007 (PST)

Turns out there is a move tab I overlooked. I'll use that from now on. Gigs Taggart 11:37, 26 January 2007 (PST)

Proxy

If a professional can't type in an IP address and port number, they aren't much of a professional. Gigs Taggart 16:51, 26 January 2007 (PST)

Have you ever used an anonymizing proxy chain? -- Argent Stonecutter 13:38, 28 January 2007 (PST)

Streaming Media

Streaming music is not specific to the "Do Nothing - Just Implement It" section. I've restored that section. Comments about the relationship between streaming media and texture bugs need to be outside any specific section. Here are my thoughts on it:

Yes, streaming media is an issue. There have been scares in the past over streaming media, and on those I've been on the side of "don't worry about it". Why?

Streaming media differs from texture bugs:

  • A significant number of people do not use streaming media in SL because:
    • The overhead of streaming media has an impact on performance.
    • A lot of people have really bad music on their land.
  • Streaming music is limited to one parcel. Texture bugs can be anywhere.
  • The bandwidth costs of any large streaming-media-based collection scheme are significant.

The result is that it's a far less significant exposure: fewer people are impacted, and even for those people you can only collect data in a small number of places. -- Argent Stonecutter 13:49, 28 January 2007 (PST)

Sorry, I undid some of your changes before I saw this. I reorganized the article some to reflect this suggestion. Gigs Taggart 12:00, 29 January 2007 (PST)

With regard to LSL Implementation

Implementation could be simplified. Instead of the original code:

// Proposed signatures (none of these functions exist in current LSL):
llSetTextureURL(integer face, string url);
llRefreshTextureURL(integer face);

// Example usage; llUrlEncode() (also proposed) escapes unsafe
// characters in the user-supplied text parameter:
llSetTextureURL(1, "http://example.com/getpng.php?text=" + llUrlEncode(text) + "&font=futura&rez=512x512");
llRefreshTextureURL(1);

We can make use of llSetTexture()'s ability to be supplied with either a string OR a key.

llSetTexture("http://example.com/example.png",ALL_SIDES);

Since the LSL/Mono VM already checks whether the string supplied for the texture name is a UUID, and then whether it matches an item in the object's contents, the VM could simply add a third check: whether the string is a valid URL.
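
To make that lookup order concrete, here is a script-level sketch of the dispatch. This is illustration only: the real checks would happen inside the VM, and the URL branch is the proposed addition, not current LSL.

// Illustrative sketch; the URL fallback below does not exist in LSL today.
applyTexture(string name)
{
    if ((key)name) // a string tests TRUE as a key only if it is a valid UUID
        llSetTexture(name, ALL_SIDES); // treat it as an asset UUID
    else if (llGetInventoryType(name) == INVENTORY_TEXTURE)
        llSetTexture(name, ALL_SIDES); // treat it as an inventory texture
    else if (llSubStringIndex(name, "http://") == 0)
        llSetTexture(name, ALL_SIDES); // proposed: fetch it as a web texture
}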

I like that idea. This could also be applied to llRenderHTML and llRenderText in Programmatic Textures by allowing the string to be well-formatted XML or the name of a notecard in the object's inventory. -- Argent Stonecutter 06:31, 31 January 2007 (PST)

An alternate method, if Linden Lab were to host/cache the images, so that the Asset system information could be used to let copyright owners stop their content from being used:

llSetTexture(llExternalAsset("http://example.com/example.png"),ALL_SIDES);

llExternalAsset() would take any valid URI and return a UUID that could be used with the appropriate existing LSL function. Any header information found in the file could be applied to the Asset system, so if a web page/image/sound/animation/etc. had creator/author/producer/developer attributes attached to it, the Asset Server would store that information and display it in the UI (as opposed to crediting the uploader, as happens now). This would also give copyright holders an avenue to pursue DMCA claims, as the hypothetical system would indicate that they are the owner, making things easier except in cases where a less scrupulous Resident did something like run the image through the GD libraries to strip out the metadata.
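
For illustration, a minimal sketch of how a script might call the proposed function. llExternalAsset() is hypothetical, and the sketch assumes the caching described below, so that repeated calls with the same URL return the same UUID rather than minting duplicate assets.

// Sketch only: llExternalAsset() is a proposed function, not current LSL.
default
{
    state_entry()
    {
        key tex = llExternalAsset("http://example.com/example.png");
        llSetTexture((string)tex, ALL_SIDES); // the returned UUID works with the existing function
    }
}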

The llExternalAsset() implementation would also benefit greatly from caching, for two reasons:

  1. You'd obviously not want a new asset to be created every time the script was restarted.
  2. Lots of people might use the same resource, and
    1. this would help lower the amount of duplicate information being stored
    2. this would speed up subsequent queries
      1. In cases where the Asset Server indicates that the external asset already exists in the system, it'll have the option of attempting to receive an HTTP 304 (Not Modified) response from the external source, or
      2. If one location for the asset is bogged down, the current simulator's local asset server can make the call and supply that image to the local simulator.
        • In this instance, after the local simulator re-downloaded the texture, binary comparison would have to be executed, and if the local sim's texture is different to the other sim's textures, then the other sim's asset servers would replace the old asset with the freshly downloaded one.

SignpostMarv Martin 07:20, 29 January 2007 (PST)

Creating assets for every frame of dynamic content isn't really even an option. A web cam might be streaming 5 images a second through this. You are assuming that this would be for static images. That defeats the entire point of it. They can just upload the image if it's static. This is not some "save 10L on the upload fee" gimmick; the entire point of this is scriptable dynamic content. Gigs Taggart 11:54, 29 January 2007 (PST)
Yeah, but someone streaming a webcam would be better off using the parcel media stuff.
Would you be willing to pay for each asset you create with this method? L$10, L$20, L$30?
SignpostMarv Martin 12:43, 29 January 2007 (PST)
Linden Labs has already ruled out any scripted generation of assets. -- Argent Stonecutter 06:32, 31 January 2007 (PST)
I'll second Gigs' point; this isn't about saving a few cents, it is about the immediacy of near-live or live content, and as such no client should be seeing cached content, otherwise we're back to working out how to get multiple residents to 'see' the same display concurrently (which imho is the reason for needing web xxx on a prim). --Alison Wheels 03:53, 28 March 2007 (PDT)
I'd like to avoid conflicts between inventory names & URLs, and to avoid adding new functions. Since pipe is an unusable character in inventory names, I suggest it be used at the start of URIs. llSetTexture("|http://www.example.com/example.jp2", ALL_SIDES); Strife Onizuka 11:37, 28 March 2007 (PDT)
By not having new functions we remove the complications of what llGetTexture/llGetPrimitiveParams should return. But it is probably better to add new functions. Strife Onizuka 21:24, 28 March 2007 (PDT)
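
As a script-level illustration of why the pipe works as a marker (this adds no new functionality, it just shows the check being unambiguous):

// "|" cannot appear in inventory names, so a leading pipe can safely
// mark a texture string as a URL rather than an inventory name.
integer isTextureURL(string name)
{
    return llGetSubString(name, 0, 0) == "|";
}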

Possible problem with feature

The way an image has to be interpreted relies on header data. This header data tells the viewer/interpreter things like how much data is being sent, the pixel depth, etc. Deliberately corrupted headers could pose new and unpredictable problems. Hiding executable data inside an image might also create new types of threats.

I'd like it if someone with some insight on this issue would describe which implementations of this feature would pose the most risk. --Vernes Veranes 05:59, 28 March 2007 (PDT)

You'd be surprised how little data validation the server does on uploaded textures. This is really not a new issue; the client must handle corrupted and even malicious textures right now, even without web textures. Gigs Taggart 21:19, 10 April 2007 (PDT)

Texture/Bandwidth Theft

One problem with allowing unrestricted access to any (viable) texture on the web is that of using the materials and bandwidth of unsuspecting hosts. One solution is to require that any texture to be used must have, in the same directory or one higher up the parent tree, a text file of a particular name (such as "secondlife.txt") in much the same way that the "robots.txt" file is used.

Such a file could also contain information restricting what objects can use that texture in the form of a ban or allow list. References could include a UUID along with Owner, Creator, Group, or to get really specific, a particular Role in a Group. Failure to match these rules results in the 404 texture.
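
For illustration, such a file might look something like this. The syntax is entirely hypothetical (the proposal doesn't specify a format), and the UUID and names are placeholders:

# secondlife.txt -- hypothetical syntax, by analogy with robots.txt
allow owner 00000000-0000-0000-0000-000000000001
allow group "Texture Consumers" role "Officers"
deny all

Requests from objects matching no "allow" line would receive the 404 texture, per the rules above.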

This is a hot-linking issue. It's simply a case of the viewer adding an appropriate value to the Referer header. That would require no extra text files, and would make the "bandwidth theft" prevention methods 100% compatible with every existing installation of a hot-link-prevention script, whereby the referrer must match a specific URL or domain.
SignpostMarv Martin 14:15, 10 April 2007 (PDT)
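On the hosting side, this would be the standard mod_rewrite hot-link rule. A sketch, with one loud assumption: that the viewer would send a Referer value beginning "secondlife://" (no such value is specified anywhere):

# Assumes the SL viewer sends a Referer starting with "secondlife://"
# when fetching web textures; allow that plus this site's own pages.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^secondlife:// [NC]
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jp2|tga|png)$ - [F]

A texture request with any other referrer would get a 403 Forbidden instead of the image.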
This is completely not an issue. The chance that someone is hosting a power of 2 texture that is a TGA or Jpeg2000 is nil. You can't just use random images off the net for a web texture, they will have to be specially crafted for this purpose. Gigs Taggart 21:18, 10 April 2007 (PDT)

Literature

Another use which springs to mind is that of literature. Presently, providing a magazine, book or other publication in-world requires rendering each "page" into a texture and uploading these individually. I can foresee SL "publishers" using Perl/PHP cgi-bin scripts to convert page-sized chunks of text and/or render pages from PDF files. Though issues of copyright are of concern, the benefits to SL residents would be enormous. -- Num Skall 13:21, 18 April 2007 (PDT)
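
As a sketch of the reading side, assuming the proposed URL-capable llSetTexture() and a hypothetical getpage.php renderer on the publisher's server (both are assumptions, not existing features):

integer gPage = 1;

default
{
    touch_start(integer n)
    {
        ++gPage; // each touch turns the page
        // getpage.php is hypothetical: a server-side script that renders
        // one page of the publication as an image.
        llSetTexture("http://example.com/cgi-bin/getpage.php?book=demo&page=" + (string)gPage, ALL_SIDES);
    }
}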

Server Side Black List

There still remains the possibility of disallowing the loading of textures from IP-only hostnames, and of using a server-side domain filter to block known attackers. -- Robaato Yoshikawa 05:18, 23 April 2007 (PDT)

Trusted hosting partners

I do not think it will become a nightmare if allowed hostnames must be registered first, reminding the host owner of the SL TOS; that way any privacy violation can be better pursued by law. -- Robaato Yoshikawa 05:18, 23 April 2007 (PDT)

/me is confused

Can someone remind me how this is separate from HTML on a prim?

SignpostMarv Martin 10:23, 23 April 2007 (PDT)


One would require Mozilla to do the rendering of the texture; the other wouldn't. But I can understand the confusion. Strife Onizuka 06:36, 24 April 2007 (PDT)


  • Web textures, like HTML-on-a-prim, are part of another idea for implementation. At this point, it is all just under discussion, to cover every aspect of each individual idea. The idea of materials on a prim is the implementation side. Dzonatas Sol 07:56, 24 April 2007 (PDT)

"Privacy Problems"

How realistic is it to consider that there might actually be an issue here? Everyone using SL is using the internet, and everyone using SL will also be looking at websites where they are already leaving an IP address trail, so why should there be any difference between something within SL and something external? Yes, the texture might be hosted on someone's personal webspace where they will see the IP address appear in the logs and could tie it to an avatar, but exactly the same happens when the user visits a myspace site, a blog, most non-commercial websites; in each case individuals will be getting the information and, for the most part (99.999%, imho), nothing will be done with that information as there is no benefit to trawling through the data. I believe that a simple education campaign comparing what SL is doing to how the web works generally will set users' minds at rest. --AlisonW 09:34, 24 April 2007 (PDT)

Good question. There would be a conflict over a false sense of security. We have to focus more on the medium being transferred rather than on who or where it networks with from one point to another. While the infrastructure is based at SL, we do not need to be so focused on the medium at this time. As the implementation moves more external, those bits will need to be considered. The security of the medium is best left for a new page. Dzonatas Sol 11:05, 24 April 2007 (PDT)