Talk:Web Textures

This is a talk page associated with the Open Source Portal. Changes to all pages like this one can be tracked by watching the "Related Changes" for "Category:Open Source Talk Page".
Please sign comments you leave here by putting four tildes (~~~~) at the end of your comment. For more guidelines, see Talk Page Guidelines.

Argent, let's try to keep these article-like rather than discussion-like. I integrated your comments.

On the white/black list pros and cons, instead of commenting on them, put a subpoint under them to say what would mitigate that con, or just take the con out if you think it's completely bogus. Gigs Taggart 09:54, 26 January 2007 (PST)

Done -- Argent Stonecutter 12:18, 26 January 2007 (PST)

Gigs, the discussion of privacy concerns needs to be in a separate section. Argent Stonecutter 06:42, 31 January 2007 (PST)

Making page moves

Please make a request to make a page move rather than making moves by cutting/pasting (just request on User talk:Rob Linden for now...we'll come up with a more streamlined process later). It's really a pain to merge histories after a cut/paste/redirect move, and much simpler to move the history/discussion and everything in one move. Thanks -- Rob Linden 10:40, 26 January 2007 (PST)

Turns out there is a move tab I overlooked. I'll use that from now on. Gigs Taggart 11:37, 26 January 2007 (PST)

Proxy

If a professional can't type in an IP address and port number, they aren't very much of a professional. Gigs Taggart 16:51, 26 January 2007 (PST)

Have you ever used an anonymizing proxy chain? -- Argent Stonecutter 13:38, 28 January 2007 (PST)

Streaming Media

Streaming music is not specific to the "Do Nothing - Just Implement It" section. I've restored that section. Comments about the relationship between streaming media and texture bugs need to be outside any specific section. Here are my thoughts on it:

Yes, streaming media is an issue. There have been scares in the past over streaming media, and for that I've been on the side of "don't worry about it". Why?

Streaming media differs from texture bugs:

  • A significant number of people do not use streaming media in SL because:
    • The overhead of streaming media has an impact on performance.
    • A lot of people have really bad music on their land.
  • Streaming music is limited to one parcel. Texture bugs can be anywhere.
  • The bandwidth costs of any large streaming-media-based collection scheme are significant.

The result is that it's a far less significant exposure: fewer people are impacted, and even for those people you can only collect data in a small number of places. -- Argent Stonecutter 13:49, 28 January 2007 (PST)

Sorry, I undid some of your changes before I saw this. I reorganized the article some to reflect this suggestion. Gigs Taggart 12:00, 29 January 2007 (PST)

With regards to LSL Implementation

Implementation could be simplified as follows. Instead of the original code:

// proposed function signatures:
llSetTextureURL(integer face, string url);
llRefreshTextureURL(integer face);

// example usage:
llSetTextureURL(1, llUrlEncode("http://example.com/getpng.php?text=" + text + "&font=futura&rez=512x512"));
llRefreshTextureURL(1);

We can make use of llSetTexture()'s ability to be supplied with either a string OR a key.

llSetTexture("http://example.com/example.png",ALL_SIDES);

Since the LSL/Mono VM already checks whether the string supplied for the texture name matches an item in the object's contents and whether it is a UUID, it could fall through to checking whether it is a valid URL.

I like that idea. This could also be applied to llRenderHTML and llRenderText in Programmatic Textures by allowing the string to be well-formatted XML or the name of a notecard in the object's inventory. -- Argent Stonecutter 06:31, 31 January 2007 (PST)
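For illustration, a minimal hypothetical LSL sketch of how the overloaded call might look to a script. The first two calls show existing behaviour (the inventory name and UUID here are arbitrary examples); the URL form and the fallthrough described above are assumptions of the proposal, not current functionality.

default
{
    state_entry()
    {
        // existing behaviour: texture identified by inventory name or by UUID
        llSetTexture("brick_wall", ALL_SIDES);
        llSetTexture("89556747-24cb-43ed-920b-47caed15465f", ALL_SIDES);

        // proposed behaviour: a string that is neither an inventory item
        // nor a UUID would be treated as a URL and fetched from the web
        llSetTexture("http://example.com/example.png", ALL_SIDES);
    }
}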

An alternate method, if Linden Lab were to host/cache the images so that the Asset system information could be used to let copyright owners stop their content from being used:

llSetTexture(llExternalAsset("http://example.com/example.png"),ALL_SIDES);

llExternalAsset() would take any valid URI and return a UUID that could be used with the appropriate existing LSL function. Any header information found in the file could be applied to the Asset system: if a web page/image/sound/animation/etc. had creator/author/producer/developer attributes attached to it, the Asset Server would store that information and display it in the UI (rather than attributing the content to its uploader, as happens now). This would also give copyright holders an avenue to pursue DMCA claims, since the hypothetical system would identify them as the owner; it makes things easier except in cases where a less scrupulous Resident did something like run an image through the GD libraries to strip out the metadata.
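A hypothetical usage sketch of the proposal, assuming llExternalAsset() existed, resolved the URI synchronously, and returned the UUID of the newly registered (or previously cached) asset:

default
{
    state_entry()
    {
        // llExternalAsset() is the proposal above, not an existing LSL function:
        // it would fetch the URI, register it with the asset system, and
        // return the resulting UUID (or the cached one on later calls)
        key tex = llExternalAsset("http://example.com/example.png");
        llSetTexture(tex, ALL_SIDES);
    }
}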

The llExternalAsset() implementation would also benefit greatly from caching, for two reasons:

  1. You'd obviously not want a new asset to be created every time the script was restarted.
  2. Lots of people might use the same resource, and
    1. this would help lower the amount of duplicate information being stored
    2. this would speed up subsequent queries
      1. In cases where the Asset Server indicates that the external asset already exists in the system, it will have the option of attempting to receive an HTTP 304 (Not Modified) response from the external source (see the sketch after this list), or
      2. If one location for the asset is bogged down, the current simulator's local asset server can make the call and supply that image to the local simulator.
        • In this instance, after the local simulator re-downloaded the texture, a binary comparison would have to be executed, and if the local sim's texture is different from the other sims' textures, then the other sims' asset servers would replace the old asset with the freshly downloaded one.
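The HTTP 304 revalidation mentioned above would happen inside the asset server, but the mechanism can be sketched from a script's point of view with LSL's own HTTP API; the URL and the If-Modified-Since date are placeholders, and the asset server would do something equivalent internally.

key req;

default
{
    state_entry()
    {
        // ask the remote host whether the texture has changed since the
        // cached copy's timestamp (placeholder date); 304 means "not modified"
        req = llHTTPRequest("http://example.com/example.jp2",
            [HTTP_METHOD, "GET",
             HTTP_CUSTOM_HEADER, "If-Modified-Since", "Sat, 01 Dec 2007 00:00:00 GMT"],
            "");
    }

    http_response(key request_id, integer status, list metadata, string body)
    {
        if (request_id != req) return;
        if (status == 304)
            llOwnerSay("Cached copy is still current.");
        else
            llOwnerSay("Fetched a fresh copy, HTTP status " + (string)status);
    }
}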

SignpostMarv Martin 07:20, 29 January 2007 (PST)

Creating assets for every frame of dynamic content isn't really even an option. A web cam might be streaming 5 images a second through this. You are assuming that this would be for static images. That defeats the entire point of it. They can just upload the image if it's static. This is not some "save 10L on the upload fee" gimmick; the entire point of this is scriptable dynamic content. Gigs Taggart 11:54, 29 January 2007 (PDT)
Yeah, but someone streaming a webcam would be better off using the parcel media stuff.
Would you be willing to pay for each asset you create with this method? L$10, L$20, L$30?
SignpostMarv Martin 12:43, 29 January 2007 (PST)
Linden Labs has already ruled out any scripted generation of assets. -- Argent Stonecutter 06:32, 31 January 2007 (PST)
I'll second Gigs' point; this isn't about saving a few cents, it is about immediacy of near-live or live content, and as such no client should be seeing cached content, otherwise we're back to working out how to get multiple residents to 'see' the same display concurrently (which imho is the reason for needing web xxx on a prim) --Alison Wheels 03:53, 28 March 2007 (PDT)
I'd like to avoid conflicts between inventory names and URLs, and avoid adding new functions. Since the pipe is an unusable character in inventory names, I suggest it be used at the start of URIs: llSetTexture("|http://www.example.com/example.jp2", ALL_SIDES); Strife Onizuka 11:37, 28 March 2007 (PDT)
By not having new functions we remove the complications of what llGetTexture/llGetPrimitiveParams should return. But it is probably better to add new functions. Strife Onizuka 21:24, 28 March 2007 (PDT)
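A hypothetical sketch of the pipe-prefix convention, assuming llSetTexture accepted such strings and llGetTexture handed them back unchanged (neither is current behaviour):

default
{
    touch_start(integer total_number)
    {
        // the leading pipe marks the string as a URI rather than an inventory name
        llSetTexture("|http://www.example.com/example.jp2", ALL_SIDES);

        // under this convention a script could tell a web texture apart
        // from an inventory texture by inspecting the first character
        string tex = llGetTexture(0);
        if (llGetSubString(tex, 0, 0) == "|")
        {
            llOwnerSay("face 0 uses the web texture " + llGetSubString(tex, 1, -1));
        }
    }
}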

Possible problem with feature

The way an image has to be interpreted relies on header data. This header data tells the viewer/interpreter things like how much data is being sent, the pixel depth, etc. Deliberately corrupted headers could pose new and unpredictable problems. Hiding executable data inside an image might also create new types of threats.

I'd like it if someone with some insight on this issue would describe which implementations of this feature would pose the most risk.--Vernes Veranes 05:59, 28 March 2007 (PDT)

You'd be surprised how little data validation the server does on uploaded textures. This is really not a new issue, the client must handle corrupted and even malicious textures right now, even without web textures. Gigs Taggart 21:19, 10 April 2007 (PDT)

Texture/Bandwidth Theft

One problem with allowing unrestricted access to any (viable) texture on the web is that of using the materials and bandwidth of unsuspecting hosts. One solution is to require that any texture to be used must have, in the same directory or one higher up the parent tree, a text file of a particular name (such as "secondlife.txt") in much the same way that the "robots.txt" file is used.

Such a file could also contain information restricting what objects can use that texture in the form of a ban or allow list. References could include a UUID along with Owner, Creator, Group, or to get really specific, a particular Role in a Group. Failure to match these rules results in the 404 texture.

This is a hot-linking issue. It's simply a case of adding an appropriate value to the referrer header. That would require no extra text files to be added, and it would make the "bandwidth theft" prevention methods 100% compatible with every existing installation of a hot-link prevention script, whereby the referrers must match a specific URL or domain.
SignpostMarv Martin 14:15, 10 April 2007 (PDT)
This is completely not an issue. The chance that someone is hosting a power-of-2 texture that is a TGA or JPEG2000 is nil. You can't just use random images off the net for a web texture; they will have to be specially crafted for this purpose. Gigs Taggart 21:18, 10 April 2007 (PDT)
I think the point is that you could steal bandwidth from other people hosting web textures for SL. Argent Stonecutter 14:17, 31 December 2007 (PST)

Literature

Another use which springs to mind is that of literature. Presently, providing a magazine, book or other publication in-world requires rendering each "page" into a texture and uploading these individually. I can foresee SL "publishers" using Perl/PHP cgi-bin scripts to convert page-sized chunks of text and/or render pages from PDF files. Though issues of copyright are of concern, the benefits to SL residents would be enormous. -- Num Skall 13:21, 18 April 2007 (PDT)
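For example, a page-turning "book" prim could lean on such a server-side script. This sketch assumes the overloaded llSetTexture from the LSL discussion above and a hypothetical book.php renderer; neither exists today.

integer page = 1;

default
{
    touch_start(integer total_number)
    {
        // each touch advances one page; the (hypothetical) CGI script renders
        // that page of the publication as an image at a web-texture URL
        ++page;
        llSetTexture("http://example.com/book.php?page=" + (string)page, ALL_SIDES);
    }
}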

Server Side Black List

There remains the possibility of disallowing the loading of textures from IP-only hostnames, and of using a server-side domain filter to block known attackers. -- Robaato Yoshikawa 05:18, 23 April 2007 (PDT)

Trusted hosting partners

I do not think it will become a nightmare if allowed hostnames must be registered first, reminding the host owner about the SL TOS; that way any privacy violation can be better pursued legally. -- Robaato Yoshikawa 05:18, 23 April 2007 (PDT)

/me is confused

Can someone remind me how this is separate from HTML on a prim?

SignpostMarv Martin 10:23, 23 April 2007 (PDT)


One would require Mozilla to do the rendering of the texture; the other wouldn't. But I can understand the confusion. Strife Onizuka 06:36, 24 April 2007 (PDT)
Web textures and HTML-on-a-prim are part of other ideas for implementation. At this point, it is all just under discussion, to cover every aspect of each individual idea. The idea of materials on a prim is the implementation side. Dzonatas Sol 07:56, 24 April 2007 (PDT)
Okay, now remind me why we need something else other than Gecko to render things other than JPEG2000 images on the side of a prim.
SignpostMarv Martin 00:13, 25 April 2007 (PDT)
Only reason I can think of is resources; Web Textures don't need any of the features that Gecko would implement. -- Strife Onizuka 02:56, 25 April 2007 (PDT)
Gecko is a very heavyweight software component. If you walked into a mall containing dozens of vendors each using HTML-on-a-prim, it would be like opening a couple of dozen web pages at once... even on my Core Duo that's enough to freeze Gecko, even without SL in the picture. A couple of dozen vendors using web textures or other lightweight programmatic textures would be like opening a web page with a couple of dozen images... much better. -- Argent Stonecutter 08:24, 25 April 2007 (PDT)

"Privacy Problems"

How realistic is it to consider that there might actually be an issue here? Everyone using SL is using the internet, and everyone using SL will also be looking at websites where they are already leaving an IP address trail, so why should there be any difference between something within SL and something external? Yes, the texture might be hosted on someone's personal webspace where they will see the IP address appear in the logs and could tie it to an avatar, but exactly the same happens when the user visits a myspace site, a blog, or most non-commercial websites; in each case individuals will be getting the information and, for the most part (99.999% imho), nothing will be done with that information, as there is no benefit to trawling through the data. I believe that a simple education campaign comparing what SL is doing to how the web works generally will set users' minds at rest. --AlisonW 09:34, 24 April 2007 (PDT)

Good question. There would be a conflict over a false sense of security. We have to focus more on the medium being transferred rather than on who or where it networks with from one point to another. While the infrastructure is being based at SL, we do not need to be so focused on the medium at this time. As the implementation moves more external, those bits will need to be considered. The security issues of the medium are best left for a new page. Dzonatas Sol 11:05, 24 April 2007 (PDT)
It's not a new issue; IP addresses can already be associated with SL names through creative uses of land-based video streams. --Strife Onizuka 11:11, 24 April 2007 (PDT)
It's the difference between having it happen automatically and having it happen by choice. A user has the choice of visiting a myspace site, a blog, and so on... or not. They can do it at their discretion, on their schedule, without leaving a trail connecting their avatar and their IP address. If you don't enable streaming and don't visit web pages in world or via profiles or links in messages (which are *all* optional) then there's nothing anyone other than Linden Labs can see to tie your avatar in world to an IP address. For some people it might be a "false sense of security". For others, it's the only way they can deal with stalkers. -- Argent Stonecutter 08:29, 25 April 2007 (PDT)
UPDATE: There have been landowners complaining on the Blog about LL disabling QuickTime, because they are routinely using it to track alts in order to ban griefers by IP. W-Hat has already built up and published a database mapping UUIDs to avatar names, and if they could routinely track people by IP the way large landowners can, do you think they'd stop for a minute before building up a database like that? -- Argent Stonecutter 14:05, 31 December 2007 (PST)

Other Issues

There are a few issues I see that don't seem to be addressed yet. They mostly focus on ways the feature could be abused, and while I realize that every feature has negative consequences, I don't see how web textures have enough positives to outweigh the issues below (in addition to the ones raised by others above).

Accountability

There would no longer be any way to determine who is responsible for offenses that involve web textures. The owner of the prim/item may simply have purchased the item and be completely innocent of, or oblivious to, any wrongdoing. Even the original creator wouldn't necessarily have control over the web texture.

Without any way to determine who is truly responsible for the abuse, no action can be taken (or the wrong party will end up punished), and it's open season for griefers.

Legal issues

There would need to be an easy way to see the URL of a web texture so that, when a texture is infringing, a DMCA notice can be sent to the proper party. The downside is that it would facilitate bandwidth theft (see below).

Concealing illegal content

It would be trivial to serve different web textures to different IPs: everyone sees a texture of a "cute cuddly bear", while a select few would see highly illegal (pornographic) content instead.

This should probably be combined with griefing. Argent Stonecutter 14:16, 31 December 2007 (PST)

Griefing

Ad-plot extortionists could now target neighbours with offensive and sickening images which couldn't even be AR'ed because if a Linden were to investigate, they'd see the texture of a "cute cuddly bear".

Sold or transferred prims can have their textures altered after purchase by replacing the texture at the source (someone is hired to make a custom build, gets paid, transfers the build, then replaces the textures and extorts for more money).

Bandwidth theft is definitely an issue because the URL of in-world web textures will be easily obtainable (see Legal Issues, or the fact that the viewer could easily be modified to display the URL if it's not available by default).

Availability

The web is unreliable: servers can go down at any time, content gets moved, sites close down, data transfer limits are reached, etc.

Content that uses web textures (including scripts) would need to be easily identifiable so that no one naively buys anything that relies on web textures, only to find a month later that the creator has left SL and removed all the textures, leaving the buyer with now-worthless content.

The solution to this could simply be for anything that uses web textures (including scripts) to automatically be marked 'no transfer'.

The "no-dialog whitelist" option on the article would make it obvious that web textures were in use. Argent Stonecutter 14:16, 31 December 2007 (PST)

Kitty Barnett 00:13, 18 July 2007 (PDT)