Talk:2010 Wikimedia Study of Controversial Content: Part Two

"User-Controlled Viewing Options"

The recommendations in the section User-Controlled Viewing Options are suggesting a clear violation of NPOV. --Yair rand 22:22, 21 September 2010 (UTC)

Just as a reminder, not all projects actually have NPOV. See COM:NPOV for example. Steven Walling (talk) 00:26, 22 September 2010 (UTC)
How is user-initiated opting-out of viewing images a violation of NPOV? If I close my eyes when looking at the Mohammad article, am I violating NPOV? Apparently. Kaldari 21:58, 22 September 2010 (UTC)
Noting that hiding images from users is an oft-rejected proposal on en.Wikipedia, we do also have a help page for not seeing images, though it currently requires modding your CSS or JS page. I'm pretty sure the proposal is referring to, essentially, easier tools to do the latter. Nifboy 00:19, 28 September 2010 (UTC)
It occurs to me that there may be a difference here between the opinions of Wikipedians and those of visitors. TeunSpaans 17:54, 2 October 2010 (UTC)

Comment on "Intent to arouse"

That is a lot of thoughtful words, but only a few will be controversial. Namely, these:

We are suggesting they be deleted because they are out of scope – they serve no “reasonable educational purpose”...their intent is to arouse, not inform...It is our belief that the presence of these out of scope images in Commons is potentially dangerous for the Foundation and Community because they reduce the overall credibility of the projects as responsible educational endeavors, and thus call into question the legitimacy of the many images of sexual content and “controversial” sexual content that must remain on Commons for the projects to fulfill their mission. And, although not our primary motivation in making this recommendation, it must be noted that they are offensive to many people, men and women alike, and represent with their inclusion a very clear bias, and point of view – that of woman as sexual object. They are far from neutral. We would never allow this point of view untrammeled and unreflexive presence on any Wikipedia site as a clear violation of NPOV– we should not allow it on Commons either. (emphasis added)

Some comments:

  • The intent of the photographer or uploader is unknown and perhaps unknowable. In effect, this turn of phrase is a pure projection--to use the clinical term--of the image-viewer's biases about the image-uploader's intentions. It is a category error: there is no 'intent to arouse' without someone's arousal; in other words, this image is controversial because it makes someone horny. (Or in the case of violence, because it makes someone squeamish or horrified.) In truth, there is no other metric apart from our own individual reactions. So just say it: we are censoring these images because many people will be turned on by them or shocked by them. And that is okay, or at least tolerable, but only insofar as necessary to advance the encyclopedia's core mission.
  • Wikipedia is not Google or YouTube or Facebook. It makes no profit. It has no goal except to deliver the sum total of human knowledge to its users. This is not a polite mission. It is a radical mission. I don't want to give cover to 'prurient' trivialities, but there is a sense in which today's "icky" images always mark the boundaries of what society is willing to explore. The frontiers of human knowledge, whether in sex, science, violence, or other taboos, are never comfortable. Yet this is where knowledge is expanded and discovered. There is a clear friction between the twin goals of presenting knowledge comfortably and allowing its growth beyond those boundaries.
  • Per many Wikipedia policies, Wikipedia is neither censored nor a crystal ball. It reflects the world as it is. But that is 'the thing' with controversial images: they exist. They happen. Sex happens. Ejaculation happens. Dildos happen. Beatings happen. Massacres happen. And so on. You make the fair point that these things can be shown without being endlessly replicated--and that people should have the 'option' to exclude these images if they wish. That seems fair. But don't do it under the guise of intent.
  • Do the images which show off women's bodies cross a line because they 'portray women as sexual objects'? I don't know, do they arouse you? Women are sexual objects, if you find them sexual and you objectify them. But again, that is in the eye of the beholder. Photos of women are not objectifying unless that is the lens through which they are viewed. This is a naive view, but it is also true.
  • A broader and more subversive or post-modern critique is that the socially-constructed definitions of what is sexual or controversial should not apply to an encyclopedia at all. For these things happen at the interface between object and subject, content and viewer, and they are not attributes of things in themselves. At the least, they are subjective, and as such cannot be applied across broad categories. Is that hand-tying? Yes, but if you want to be free to make choices which censor, as any decision to eliminate content will do, then you have to say clearly that you are making socially supported choices, biased by social attitudes. It's not a very Wikipedia thing to say, but if it's true, better to say it: "Lots of users don't like seeing these images, so we are going to limit the number of them to what is minimally needed."
  • I realize that justifying 'more boobies' with highfalutin social criticism is a risk--and a middle path might need to be charted which you have begun to do. But let's choose the words carefully, lest we weave a web that will ensnare us as well.

Ocaasi 09:44, 22 September 2010 (UTC)

Standing ovation. :) --Cyclopia 15:44, 22 September 2010 (UTC)
Well at the very least he has my applause. TheDJ 20:17, 22 September 2010 (UTC)
Some agreement, though where others see thoughtfulness I see only post hoc justification. As for photographers' point of view, well, why do so many people upload photographs of waterfalls? Because they're beautiful. Why so many seashores? Because they're beautiful. Why so many naked women? You figure it out. There's nothing all that sophisticated or surprising going on here, and it doesn't express a point of view beyond that of any photographer who bothers to pick up the camera. Wnt 20:27, 22 September 2010 (UTC)
I don't think he's referring to The Birth of Venus here. He's talking about porn. Kaldari 22:12, 22 September 2010 (UTC)
It's funny you mention her, since she apparently fits into the following categories: [Standing nude women] [Topless women] [Women with shaved genitalia] [Nipples] [Public nudity] [Nude redheads] and [Nudity in nature]. I fully get your point. Do you get mine? What if I'm turned on by the painting, does that mean it had an 'intent to arouse'? What is the difference between that and a picture of a woman masturbating? You probably think my asking reveals everything you need to know, but it also reveals things you may assume which may not be true, at least not for everyone. Ocaasi 06:02, 23 September 2010 (UTC)
Can you really not tell the difference between a classical masterpiece and a photo of a woman masturbating? These are specious arguments without merit. For reference, per the UK definition, "an image is pornographic if it is of such a nature that it must reasonably be assumed to have been produced solely or principally for the purpose of sexual arousal." [1] Art like the "Birth of Venus" was not produced and sold as a masturbation aid. Masturbation aids existed then, too, and the Birth of Venus was not one of them. --JN466 06:49, 23 September 2010 (UTC)
I can, but what if I don't? What if I'm that prudish? Then my objection should not deprive the rest. Alternately, what if images you find vile others find simply educational, or poignant in their depiction of human behavior, or beautiful, or interesting, or... arousing (but not in a bad way). Should you then deprive the rest? There are reasonable distinctions about purpose, number, age, legal status, etc. But I'm not sure I buy your easy categorization. It's not really relevant whether I do or not, though; the question is about the people who don't. Ocaasi 01:47, 24 September 2010 (UTC)
The intent of the photographer or uploader is unknown and perhaps unknowable: In a good number of sexual media uploads, the filenames, file descriptions, files' provenance, even sometimes the uploaders' user names, leave no doubt at all that the image's ability to arouse was a prime motivating factor in creating or uploading it. And bearing in mind the role Internet media play in 21st-century human sexual arousal, it would be strange indeed if this role were not reflected in the motivations of uploaders. (Many such images are routinely deleted even today, under COM:PORN.) --JN466 22:42, 22 September 2010 (UTC)
A few comments:
  • I am as much concerned with the language as the actual filtering. "Intent to arouse" is a thought-crime. It is a Kafka-esque road that we should not go down. Culling tits by legislating people's motivations is a devil's bargain. It's not just about specific acts when you make law, but about setting precedents for guilt and innocence. "I didn't do it!" ..."But you thought about doing it!" is no world to live in. We shouldn't encourage it here.
  • Of course I'm talking about porn, if you can define it. Surely some people's porn is other people's breakfast, and vice-versa. Besides, as Cyclopia also mentioned, 'porn' is real. It exists too. Arguably, it should not be presented any less vividly and thoroughly than pictures of trains or architecture. So some people get off on it or have to avert their eyes. If you like it, look. If you don't like it, don't. Our job is to do a good job of collecting it.
  • If I had to choose between post-hoc and pre-censored, I think I'd take post-hoc. Once the images stop even showing up, there's nothing to rationalize--you don't even know what you missed.
  • Again, just be clear about it: "We're offended by lots of pictures of naked women doing sexual things and random shots of men masturbating and we're not going to allow it to expand beyond a necessary minimum. It's porn, and we think too much of it is bad for the encyclopedia." If that holds up to critique, then you'll be on much firmer ground.
  • Either way, if you still want to censor bodies and sex acts, I don't think there's much particular need for any grand controversial image policy. Just use existing guidelines about redundancy and replication. If you have one great picture of a vulva, allow in 4 others for variety, but not 60 just for kicks. Call it curating rather than censoring. But please drop the 'intent to arouse' bit. Ocaasi 00:08, 23 September 2010 (UTC)

Definition of controversial

The authors say: "[Images] would receive this designation [controversial] when there is verifiable evidence that they have received this designation by the communities the projects serve. This evidence would include: existing legislative and legal frameworks around the distribution of said images; wide public debates in responsible forums concerning these types of images; articles and editorial opinion expressed about these types of images in notable journals and newspapers, etc. In our view, these tests will ensure that the bar to be admitted to the category of controversial image will be high (much higher than it is for the Wikipedias), and should remain so. In effect, our surmise is that only three categories of images can currently pass this test -- sexual images, violent images, and certain images considered sacred by one spiritual tradition or another."

This is absurd. Almost everything is "controversial" if the only bar for that is that there is some public debate or articles and editorial opinion in newspapers. Thus there is no end to the damage that such a policy would create. I am sure there are a lot of newspapers and public forums where exposure to atheism is deemed dangerous to children: should we consider images of atheists controversial? Given the huge debate on creationism, should we consider evolutionist content "controversial"?

There is no end to the amount of censorship and POV pushing that could arise from that. It is grotesque that so-called experts do not recognize that.

Also this is highly worrying: "and the intent of the image, to a reasonable person, is merely to arouse, not educate". Who is a "reasonable person"? Why can't something arousing be educative as well (for example: to illustrate, effectively, what people find arousing)? Should we take it that fetishes and fetishists do not fall under "reasonable persons" (and if so, I think the authors are insulting a non-trivial part of the population)? And why should "reasonable persons" decide what is educative and what is not, when we should simply stick to whether an image/video properly documents a notable thing or behaviour, regardless of what happens in our readers' genitals?

I am thoroughly worried and disgusted. --Cyclopia 15:42, 22 September 2010 (UTC)

Ssst, you are supposed to not ask and not to tell. TheDJ 20:19, 22 September 2010 (UTC)
The problem is you think these things are encyclopedic. Your understanding of the word "education" lacks anything resembling the actual use of the word and is, for lack of a better word, absurd. Ottava Rima (talk) 00:11, 25 September 2010 (UTC)
The scope of Commons is bigger than the encyclopedia; it is an individual project. TheDJ 11:05, 25 September 2010 (UTC)
Sorry, but penis images are not going to be used on Wikiversity, Wikisource, Wikiquote, or the other sister projects. They are only used on Wikipedia. Your background and experience as both an editor of Wikipedia and of the sister projects is lacking, so I do not expect you to understand the matter. By the way, "commons" is not a project. It is a subsection of Wikimedia, the actual project. Wikimedia is a project that serves as a hub between the actual projects. I think the major problem here is that we have too many statements from those not involved in the areas that are most vital, thus we lack appropriate representation of the people who need to be heard in the matter. Ottava Rima (talk) 12:59, 25 September 2010 (UTC)
What evidence do you have that penis images won't be used on Wikiversity or Wikibooks or Wikipedia, or even off-wiki, since Commons is also a free repository of images for people outside of the projects? Concretely, what about a book on... Penises, an article on circumcision, a course on Sexual exhibitionism, a tome about changing attitudes about nudity over time, an article about cross-cultural sexual behavior; the list is almost infinite. They're penises. Every single man on the planet has one. Don't underestimate their scope; this stuff is important. Also, arguably, it's nothing to be ashamed of or bothered by. Arguably, it's meritorious even if a decent picture has no purpose but the appreciation of human bodies and their diversity. It's the primary sexual organ for half of the species. Just because it's 'naughty' somewhere doesn't really stand as grounds for anything. Ocaasi 13:06, 25 September 2010 (UTC)

A predictable error

The central error of these recommendations is that they imagine that Wikipedians can agree on what is sexual or non-sexual, violent or non-violent, controversial or non-controversial. This will not happen. The reason why censorship is imposed by dictatorial entities, whether they be authoritarian governments, authoritarian political parties, or corporations, is that there is no correct way to make these decisions. There must be one lone dictator arbitrarily ordering all the decision making, with no one permitted to call much attention to his inevitable inconsistencies.

Suppose we consider an image like File:Leaning on Barn Doors.png. No doubt there is some potential for arousal; but then again some will say it is art. There are, after all, less puritanical countries where breasts are simply one of the many aesthetically pleasing features of beautiful women, and not the object of some taboo. It is ironic indeed that you would say that to violate this taboo expresses a POV that a woman is a sexual object, when in fact the exact opposite is true. If you show a photograph of an Afghan woman's face, does that mean you regard her as a sexual object?

But here's the rub: suppose I go ahead and edit a category of the above image to say that no, this is not a sexual image, but simply an artistic one. Are you going to ban me from editing? I mean, I suppose you'd have to — otherwise your little rating system wouldn't be much of a system, now would it? You'd better be ready to have a thousand new admins, to arbitrate whether it is an offense against Wikipedia to tag Tom & Jerry cartoons as violence, or images of people smashing shop windows in a riot as nonviolence, or photos of low-cut jeans as sexual content, or images of cartoon characters humping as non-sexual. You'll need whole reams of new policy about how to characterize images of invertebrate sexual acts, and rule on whether the destruction of a Death Star is an act of terrorism. I suggest you set up a whole new namespace for this stuff, because you might find that most of the edits to Wikipedia are being made there.

You can pretend that this is an NPOV crusade, but that's a lie. There may indeed be many pictures of "you and your friends" on Commons, and no doubt many of these can be deleted. But by announcing a crusade against 3000 bare-breasted photos, you're sending a message that there's something wrong with women's breasts but not with men's breasts. Which is not merely a POV, but a wrong one. Also note that your statement that "ethnographic" images will be spared from review conceals an old and well-worn racism, rooted in the idea that because non-white women are not valid sexual partners, their images should not be viewed as pornographic. How is the image I linked above any less "ethnographic" than an image from darkest Africa, if the races are viewed as equals?

In response to your various points: 1. The central error of these recommendations is that they imagine that Wikipedians can agree on what is sexual or non-sexual, violent or non-violent, controversial or non-controversial. If your point is to question how Wikimedians make decisions, and to note that there are differences of opinion among Wikimedians about things, I would guess that would place them among hundreds of other similar issues within the community, all of which use the various means that the community has devised over the years to mediate them. If your point is that such definitions are impossible to craft, I think that is not true. There can be, and are, definitions of sexual and violent content throughout the world. The proposed policy on Sexual content for Commons, for example, attempts such a definition. You say that not all Wikimedians can agree on such a definition. You may be right. But would not all Wikimedians agree that a depiction of human sexual intercourse, for instance, would be defined as a sexual image? I think they probably would. The fact that some content in the world resists easy definition does not mean that definitions are impossible. Let's talk within the community about your Leaning on Barn Doors image. Let's discuss how artistic it is (with the understanding that aesthetically pleasing and artistic are not synonyms). Nowhere have we suggested that current procedures around Requests for Deletion should be changed. In any case, our point of departure is the belief that Wikimedians can agree on what is educational – that is the definition we’re trying to establish and protect. If they cannot, then Commons:Scope lacks much meaning.
2. You can pretend that this is an NPOV crusade, but that's a lie. There may indeed be many pictures of "you and your friends" on Commons, and no doubt many of these can be deleted. But by announcing a crusade against 3000 bare breasted photos, you're sending a message that there's something wrong with women's breasts but not with men's breasts. Which is not merely a POV, but a wrong one. “Right” and “wrong” have nothing to do with designations of content within Commons. Educational and non-educational are the relevant criteria. It may be true that some people morally object to some of the images we’re suggesting be deleted from Commons. That does not mean that that is the rationale we are using for their deletion. If that were the case, we would more likely be suggesting the deletion of images of cock rings, ejaculating penises, shaved vulva, bondage – and we are not suggesting the deletion of any of these. And the reason is because we believe they have educational purpose – although the things they educate people about are disturbing to some. Morality is not the issue when it comes to application of scope. Educational purpose is. We have nominated certain kinds of images, which we have attempted to define as accurately as possible, as non-educational. Can that designation be challenged? Of course. But educational value is the only true test of inclusion – this, of course, is not our suggestion; it is current policy on Commons. If you want to make the argument that Wikimedians cannot agree on the definition of educational (as you have made above that they cannot agree on sexual, violent, or controversial), then we’re looking at quite a different set of challenges, I would think. Robertmharris 04:24, 23 September 2010 (UTC)
I should follow up with a current example of the contentiousness of categorization — even though it has no effect on the display of an image. There's a thread currently at w:User talk:Jimbo Wales#Latuff cartoons (it'll soon be archived) in which a poster says "People have been hurt because of this dispute (some are blocked now) and I fail to understand why we couldn't come yet to a fair and rational solution." This concerns a debate at Commons:Commons:Village pump#Using categories anti-semitism and anti-zionism in Latuff cartoons in which people can't agree on whether an octopus in an Israeli flag attacking a Gaza-bound relief ship is anti-Semitic. Now I actually think that debate could be solved, by relying on reliable sources to make the decision — but only if we have sources to weigh in on the subject, and only if we accept the possibility of a category of images in which sources disagree on whether something is anti-Semitic or not. If we were deciding only "yes" or "no" for an image-hiding scheme, and dealing with an image for Wikipedia that hasn't been published and discussed widely, then there would be no solution. (Yes, I know that this isn't one of the three types of images you described; I don't know how you'd decide if it's 'controversial', though I'd assume U.S. opinions will outweigh Iranian opinions...) Wnt 20:26, 5 October 2010 (UTC)

A modest counterproposal

Some basic ideas to salvage this debacle (not guaranteeing it is a sufficient set of demands):

  • All categories used for showing/hiding images in articles should be in userspace, i.e. Category:Wnt/Sexual images. These categories should not be edited by anyone other than the originator without permission, except to place them into other categories (so that someone can put my category inside theirs or vice versa). A solution will be needed to resolve looped categories gracefully (see the sketch after this list), since some people will intentionally form rings so that any editor's list is included in all, and a bot will be needed to remove overcategorization from the original images (when one person's category includes another but both have categorized the image). I'm not saying this is very workable - just more workable than your idea.
  • The "single switch" to set viewing of such images implies that one single set of criteria has a special status. This could be established by a "featured filter" in which people choose a single category (originating from a single user, which may contain categories from other users in a hierarchy). By choosing a single category to feature, recognizing that the content of that category is up to the user, you can avoid individual debates over whether X is violent.
  • Anyone with a preferences page on Wikipedia should be able to change (and of course remove entirely) the category he wishes to use for this purpose - the choice should not be just filtered vs. unfiltered, but "filtered by Grace" versus "filtered by Ahmed" and so on. If this is not done, then Wikipedia puts itself in the position of making a site "safe" for one culture's taboos, but not for others.
  • There must remain a URL that feeds through to a completely unbowdlerized version of all content, so that third party sites can provide direct links to the desired target. This must work even for obscure things like discussion pages and historical versions of articles - i.e. the site should work exactly the same way for two URLs, except for one all this image-hiding is omitted by default. (While the proposal here doesn't explicitly say that the intent is to hide images by default for unregistered readers, I can't imagine it won't come to that).
  • The plan to remove taboo images of breasts should be abandoned. If not abandoned, it should include all random snapshots of friends and significant others, regardless of the clothing. In any case, a full list of images to delete must be publicized in advance, so that private photo archives are alerted to take these images and reproduce them.
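On the looped-categories point in the first bullet above: resolving rings gracefully is a standard graph-traversal problem, and the gist is just to remember which categories have already been expanded. Here is a minimal sketch in Python, under a purely hypothetical data model (each filter category maps to its member images and to the other filter categories it includes); none of this is existing MediaWiki functionality.

```python
def resolve_filter(category, members, includes, _seen=None):
    """Return the flattened set of images covered by a filter category.

    Loops (A includes B, B includes A) are broken by tracking which
    categories have already been expanded, so every category in a ring
    simply yields the union of the ring's members.
    """
    if _seen is None:
        _seen = set()
    if category in _seen:          # already expanded: break the loop here
        return set()
    _seen.add(category)
    images = set(members.get(category, ()))
    for included in includes.get(category, ()):
        images |= resolve_filter(included, members, includes, _seen)
    return images

# Example: two users whose filter categories deliberately include each other.
members = {
    "Wnt/Sexual images": {"File:A.jpg"},
    "Grace/Filtered": {"File:B.jpg"},
}
includes = {
    "Wnt/Sexual images": {"Grace/Filtered"},
    "Grace/Filtered": {"Wnt/Sexual images"},
}
print(sorted(resolve_filter("Wnt/Sexual images", members, includes)))
# -> ['File:A.jpg', 'File:B.jpg'], despite the loop
```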
In response to this counter-proposal (which we welcome, as well as any others), I will just note that if we thought images should be hidden by default, we would have said so. In fact, we said the opposite as clearly as we knew how. Anything may come to anything, but to suggest that a proposal to do “x” is the same as a proposal to do the opposite of “x” is not entirely reasonable. Robertmharris 04:24, 23 September 2010 (UTC)


"Active curation"

This proposal suggests "that consideration be given by Commons editors and administrators to adopt policies of active curation within categories of images deemed controversial (sexual, violent, sacred) which would allow for restriction of numbers of images in a category, active commissioning of images deemed needed, but absent from the category (line drawings of sexual positions, eg.)". Now "active curation" here is a clear euphemism for censorship - I mean, can you imagine walking into an art gallery and finding the curator deciding that there are too many paintings of one type, so he's throwing some in the trash? Wnt 19:26, 22 September 2010 (UTC)

The actual wording here says "consideration". Well, people have considered these ideas before. And the response has always been, "Hell no" — we're not deleting images just because they're 'controversial', whether or not there are other images that can vaguely be described as similar. We should watch to see whether Part 3 interprets "consideration" as "mandate". Wnt 20:38, 22 September 2010 (UTC)

You don’t have to wait for Part 3. Consideration means consideration; nothing more. It is just an idea we are suggesting be applied to the collections. And, as you note, it is an idea that has been brought up before, and we believe for a reason. To use your art gallery example, there might not be any curators throwing artworks in the trash, but I would be surprised if there was a curator in the world who, when approached by a potential donor, would not assess his or her current collection to see whether that collection could benefit from the donor’s offerings, and perhaps turn them down if they did not. To call that “censorship” seems to me to be straining at the definition of the word. And to take a case closer to home, we counted more than 1,000 images of penises in Commons, and unless we missed one, they were all white. A curated Commons might actively try to correct this imbalance, to make the collection more representative. We made the recommendation to increase the likelihood that this right might be given to Commons editors and administrators. Robertmharris 04:24, 23 September 2010 (UTC)
Your comments here sound reassuring, and if the point you mean to argue is that existing policies are sufficient to deal with these issues, I'll not disagree with you. Policies like Commons:COM:SCOPE, which specifically addresses surplus genitalia in a carefully worded section (Commons:COM:CENSOR), do address these issues. A purge by Jimbo Wales et al. in Commons was a bit heavy-handed, but did end up removing a few hundred uneducational images that people still haven't restored. If you think that some or all of 3000 photos of topless women are outside the educational scope of the project, you can post the list to Commons and encourage people to go through and start AfDs on the ones that don't belong, all without any change to policy, or acting as anything but an ordinary editor of Wikipedia. Wnt 14:25, 23 September 2010 (UTC)
I think, here, we've also got to consider how far the analogy stretches. Our hypothetical "curator", in this case, has a limited amount of space in his museum and there are a limited number of copies of the painting being donated (perhaps only that one). By turning down the donor, he gives the chance for the donor to find another place to donate the painting, where it might be useful in the limited amount of display space in that museum.
Conversely, Commons has an (effectively) unlimited amount of display space (and deleted images get saved, so deletion doesn't help with space anyway), and since the images are by definition freely licensed, they can be copied anywhere. If the person donating the image wants to also display it on Flickr and their own website, they can do that as well—unlike our physical museum and physical piece of art example, exhibiting the photo on Commons is not mutually exclusive with exhibiting it elsewhere at the same time.
Also, our hypothetical curator is running a museum, which serves not only to hold on to the art, but also to make exhibits from it. In this case, Commons is not the museum; it's just the museum's storage area. The "exhibits", be they on Wikipedia, Wikibooks, what have you, can then select images from this repository if and when they are needed for an "exhibit", and they are responsible for placing the image in its appropriate framework and context. But in our case, since the images are freely licensed, the entire world, not just our own "museums", can utilize what we store at Commons.
There's nothing wrong with your suggestion, such as in the "black penises" example, that we should seek out a greater variety of content when it seems to be skewed in one way or another. But to say that we should accomplish a wider variety of content by removing content is nonsensical. If we've got far more pictures of monarch butterflies than of a rare species from the Amazon, that doesn't mean we delete most of the monarchs to make it "even"—it just might mean saying "Hey, can anyone get a picture of Amazonias Rareashellis? We could use some more of those." Seraphimblade 18:08, 25 September 2010 (UTC)

Comments by TheDJ on recommendations

  • Recommendations: Controversial Text: I fully agree on all counts. The one thing I will say is that a Wikijunior is a difficult thing to create, as has been proven in the past. This is largely because the target group is not capable of doing all the legwork as in many other wikis. That means that teachers and other editors will need to be heavily involved. It's a worthy cause to aim for a Wikijunior, but it will take a lot of focus, focus that will impose a burden upon the Foundation. Perhaps a "sister foundation" might be a better idea. We could donate some resources to them (server usage etc.), but there would be a separate focus to managing the project, promotion, cooperation with educators, etc.
  • Recommendations on “Controversial” Images
    • 4 This is the most problematic part of all the recommendations, as already explained very well, I have to say, by Wnt and Ocaasi. It is based on archaic traditionalist ideas and protectionism. It totally falls apart when we start comparing it with images of unveiled Afghan ladies, for instance. It is a Western viewpoint on "decency", and per my own principles (as a person) on explicit content it is not acceptable.
    • 5 Fully agreed
    • 6 So we should limit the amount of pussy, if I understand this correctly, and encourage the creation of line drawings. I think we already do that; the set of sexual-content images we let in is already converging towards "we want high quality material that is useful for illustrations". In my opinion that works much better than setting fixed numbers.
  • Recommendations on User-Controlled Viewing Options
    • I have to start with the fact that I was expecting these recommendations. I even partly argued for them myself over the past years. Filtering is totally acceptable to me, if the projects at large support that desire and I'm not bothered by them (meaning: default off). I believe in making the choice that you don't want to see something. I have significant worries about the amount of time and resources that would have to be diverted towards such processes (both technical and community-wise, man-hours and pure cost). It adds to the bureaucracy and complexity of the website and its managing systems, and the question is whether we can handle that now that editor numbers are slipping. I'm not sure. But like I said, I can accept it as long as I don't have to participate in it.
    • 7 As long as I'm not bothered by it, it doesn't inhibit printing, and it doesn't break Google: no problem.
    • 8 This is the "click to reveal" routine. I note that this has significant accessibility risks (think mobile devices and the blind) which require appropriate attention in development. My other requirement is that individual projects (with the exception of Commons) reserve the right to "opt out" of such schemes for anonymous users. (I have serious doubts whether the Dutch community wants their articles on the sexual revolution defaced by "click to show this image" blank spots. As a matter of fact, I think they and the Germans would seriously consider forking the projects in that case.)
    • 9 I'm confused on this one. Are we talking individual images, or just individual choices for "pre-made" (by Commons editors) sets of controversial content?
    • 10 I read this recommendation a few times and unfortunately it is still not clear to me what your intent is with it. As far as I know, it wouldn't be hard to create lists of images to censor already. I think this is more an issue of the skills of the censors :D
    • 11 I cannot support this suggestion. Least astonishment has nothing to do with shielding the reader from the "surprise" of getting something he might not have been prepared for when he googled "vagina". It is a purely stylistic element of writing a comprehensible article. It is about dosage, accuracy, to-the-pointness and the order of elements; it has nothing to do with inclusion and protecting feelings.

I share many of the concerns of Wnt. This is gonna be a warzone. There will be more discussion than useful contributions. And if such measures go too far, editors and readers will leave; some already left last time. These are the people who read, build and maintain this encyclopedia. Why do we care about including the fringe groups, who do not participate here, and endangering the participation of our core members? TheDJ 21:40, 22 September 2010 (UTC)

Judging by your comment on 8., I believe you may have misread 7 and 8. As written, they say,
  • "7. That a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images (organized using the current Commons category system) into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command (“under 12 button” or “NSFW” button).
  • 8. That no image be permanently denied any user by this regime, merely its appearance delayed, to prevent unexpected or unintentional exposure to images.
The way I read that, it means that the regime becomes activated after the user has clicked the opt-out button. --JN466 22:52, 22 September 2010 (UTC)
I'm not entirely sure; I find the section slightly confusing. That's why I stated that I cannot support hiding content by default. TheDJ 00:44, 23 September 2010 (UTC)
Just to be clear, what we are recommending is that, unlike images in sexual and violent categories, which are collapsible by readers (non-registered, general-public users of the projects) by the selection of a simple command visible on all project pages, other images deemed controversial (by the tests we’ve proposed – let’s take images of Muhammad as an example) can only be collapsed by registering as a user and making choices from a set of options presented on your user page (what those options should be – category by category, image by image – we haven’t said). However, we believe that the option to collapse an image should not be extended to all images in the projects, but limited to those we have, in effect, pre-selected for potential filtering (because they meet our definition of controversial). Same with potential third-party filters. We believe that we should not tag every image or bit of content in the projects in a way that would let third-party filters easily delay or delete content we believe should be open. We believe that if the bias on the projects is towards freedom, we should not be fashioning tools for people to restrict that freedom, for themselves or anyone else, except in clearly-defined exceptional cases. Robertmharris 00:32, 23 September 2010 (UTC)
Sounds like a good compromise solution to me. Allow everyone, at their own discretion, to filter out sexual and violent stuff, if they don't want to see it, and allow registered users limited scope to filter out additional controversial content, like images of Muhammad that they don't want to see. --JN466 02:03, 23 September 2010 (UTC)
This sounds comparatively mild, except that I just read about "...policies of active curation within categories of images deemed controversial (sexual, violent, sacred)...", which are clearly not limited to what a single user sees.
Furthermore, there's a huge inconsistency between this and the Part I idea that controversy is objective and initiated by user complaints. What you seem to be saying here is that selected Western hot-button items will be concealed for non-registered users, but issues like Muhammad cartoons can never reach that level of treatment. That doesn't sound particularly stable, but for the time being it expresses a cultural bias. There are countries like India that have a very bad record on sexual censorship but don't think twice about posting photos of the dead from a train crash on the Web for everyone to look through and identify. See, e.g., external link from w:Gyaneshwari Express train derailment to the Indian government's site, which some people wanted to censor out of that article entirely due to gruesome content. — The preceding unsigned comment was added by Wnt (talk)
Well put. TheDJ 13:47, 23 September 2010 (UTC)
Wnt and TheDJ, Wnt said,
  • "What you seem to be saying here is that selected Western hot-button items will be concealed for non-registered users".
Robert said, above,
  • "I will just note that if we thought images should be hidden by default, we would have said so. In fact, we said the opposite as clearly as we knew how."
What recommendations 7 and 8 propose is a user-selected regime whereby readers can, if they so wish, click a "SFW" button or "Under-12" button which will then conceal these images. Until they do, all images will be visible. Is that correct, Robert? --JN466 16:06, 23 September 2010 (UTC)
Yes. Robertmharris 16:15, 23 September 2010 (UTC)
Note, however, that the draft says "NSFW button", not "SFW button". I'm also a bit confused how show-by-default would reconcile with the suggestion of raising a "principle of least astonishment" to policy status. Wnt 20:06, 23 September 2010 (UTC)
Believe it or not, the people "who read, build and maintain this encyclopedia" includes many people who are offended by some images that are legal in the US. Such people are not a "fringe group" battling in a "warzone". If you view things that way I somehow doubt that you are interested in broadening Wikimedia's appeal beyond twenty-year-old white male Americans. Also, you should keep in mind that Wikimedia includes more than just "this encyclopedia", by which I assume you mean en.wiki. We are dealing with many people from many different cultures, backgrounds, age groups, etc. on several hundred different projects. Kaldari 00:20, 23 September 2010 (UTC)
As I said, those are my personal principles; they do not concern how we actually run the projects, but how I think we should run the projects. It's my opinion; I'm professing it. As much as Privatemusings has been professing his opinions for years now. And yes, some of my ideals do trump participation of a wider audience I guess, but so be it. We started the project on ideals; I see no reason to change that. TheDJ 00:34, 23 September 2010 (UTC)
Once you threw out the term "Western", you invalidated your argument. Muslim cultures are not "Western" and represent 1 billion people in the world. They have an even more strict view of pornography than you would ever be accepting of. China, also over 1 billion people, has tough restrictions on porn. Even Japan, a relatively libertine culture, relies on censor bars and other measures. If you wish to make a point, please don't begin with misleading claims that are beyond laughable. Wikimedia's problem is that they do not immediately ban those beginning with such arguments, as they cannot be seen as productive, credible, or done for any reasonable purpose. Ottava Rima (talk) 00:15, 25 September 2010 (UTC)
I think you didn't read it correctly. I specifically said exactly this. Ours is a *traditional* Western viewpoint (as opposed to my liberal north European viewpoint). You could say that here it is argued that we choose to "pick" this as a safe point. Now a thought experiment: I shift that safe point a tad to the side of certain Muslim cultures. Now the safe point says: we should limit all images of unveiled women. My point is that for others to tell me that certain images of nudity should not be included, based on a certain "decency" argument that I do not share, is as offensive as someone telling you that you should limit the number of images of unveiled women due to their viewpoints on decency. We can make that decision; it's a self-imposed, limited censorship of our own community. I just like to demonstrate that in that decision, I'm losing part of my possible "Intellectual Freedom" in participating in this community. In the first situation (the one Harris suggests), you are choosing to insult certain Islamic cultures, by not giving them a censor button for this issue (at least that is how it is looking now, just for actual sexuality and Mohammed images), and you insult me, because I don't get a "view" button for the deleted content. If we don't limit the category we are insulting a few more Westerners, but at least they get a censor button to make a choice themselves. In practice there is not much difference between the options, but you might understand better how I feel if we moved the safe point to women without a veil.
I would say that a safe point for inclusion that is at the traditional Western viewpoint is so culturally biased, and so based on the cultural background of our own community, that it is not a valid safe point. A real culturally neutral safe point should, if anything, probably be even more restrictive. Since that is clearly not desired (well, perhaps by you), the idea of a safe point is thus in my eyes an invalid idea in the first place. Which makes me wonder why I need to be bothered with it. TheDJ 11:53, 25 September 2010 (UTC)
Do not claim others didn't read when you were the one with the problem. My point was that the "Western viewpoint" you are talking about is the same view as held by Middle Easterners and Asians. Your attempt to dismiss a moderate view is an old trick used to justify the most extreme and libertine views in the whole world. Then your attack on Muslim requirements of veils (in public) shows a cultural insensitivity and misunderstanding that I find very inappropriate. We have many Persian editors (including a steward) who are from one of the most conservative Muslim countries who have made it clear that the excessive porn needs to go, and that is it. Putting forth a false slippery slope to justify a lack of rules is intellectually dishonest and shows a lack of willingness to acknowledge the actual situation. Ottava Rima (talk) 13:04, 25 September 2010 (UTC)
I could give a proper reply, but I won't because you don't listen, so it would be a demonstration in pointlessness. TheDJ 13:28, 25 September 2010 (UTC)
TheDJ, as demonstrated you have inappropriately attacked two thirds of the world's population with highly condescending language while not having any real background in the area to verify your usage of such vulgar and demeaning language. Ottava Rima (talk) 19:03, 25 September 2010 (UTC)

Image deletion is not permanent

Image deletion is not permanent. Images can be undeleted, just like wiki pages. This was not the case when Commons was started, but has been for several years now. -- Duesentrieb 21:58, 22 September 2010 (UTC)

Good point. I do note that Commons deletion in general is seen as rather disruptive because images are automatically removed from many projects. Restoring an image is easy, but restoring the way an image was used is more difficult (and gets harder as time progresses due to changing article content). TheDJ 22:01, 22 September 2010 (UTC)
They can only be undeleted by admin staff; and unlike earlier article revisions in a Wikipedia, they can only be seen by admins as well. --JN466 22:54, 22 September 2010 (UTC)
Earlier versions of articles and images can be seen by anyone. Deleted versions of articles and images can only be seen, and undeleted, by admins. It's exactly the same. -- Duesentrieb 06:58, 23 September 2010 (UTC)
Yes, this is indeed an important difference: while references to a deleted page are generally kept, references to a deleted image usually are not. Bot-aided removal of dead image links is generally considered a feature, but perhaps we should think of a mechanism that also makes it easy to undo such removals. -- Duesentrieb 06:58, 23 September 2010 (UTC)
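For what it's worth, the "mechanism that makes it easy to undo" could be as simple as having the link-removal bot keep a structured log of what it removed and where. A minimal sketch in Python, assuming a hypothetical bot and log format (this is not an existing MediaWiki or bot feature):

```python
import json
import time

LOG = "image_link_removals.jsonl"  # hypothetical log kept by the removal bot

def record_removal(page, image, removed_wikitext):
    """Append one removal to the log so a restore bot can replay it later."""
    entry = {
        "page": page,
        "image": image,
        "removed_wikitext": removed_wikitext,
        "timestamp": time.time(),
    }
    with open(LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

def removals_for(image):
    """List every page the bot stripped this image from, with the old markup."""
    with open(LOG) as f:
        entries = [json.loads(line) for line in f]
    return [e for e in entries if e["image"] == image]

# When File:Example.jpg is undeleted, removals_for("File:Example.jpg") tells a
# restore bot where the link used to live and what wikitext to re-insert.
```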
The relevant passage in the document is "unlike text, a deletion of an image in Wikimedia projects is permanent". When we delete controversial text (profanities/vandalism, BLP issues, undue issues etc.), we don't usually delete the entire article. We edit the controversial text out, to make the article compliant, and the controversial text usually remains in the archived history. (When we delete articles, this is because of notability and lack of reliable sources, not because of their containing controversial text). I think this is what the comparison tries to get at, and the point is valid. --JN466 16:43, 23 September 2010 (UTC)
Then indeed the point is valid, but the wording is misleading. I suggest something like this: "deletion of an image in Wikimedia projects is much more disruptive and harder to undo than removing an offending section from the text of an article". -- Duesentrieb 07:12, 24 September 2010 (UTC)

I disagree with Jayen466. Images can be edited. For example File:Ministro da Cultura Juca Ferreira se encontra com a embaixadora de El Salvador Rina del Socorro.jpg is an edited version of an originally taller picture that contained a non-free copyrighted painting. The non-free version was deleted (deletion log), but the smaller edited version was kept. It could be a problem if this whole report is based on wrong representations of how Wikimedia Commons actually works. Teofilo 10:16, 17 August 2011 (UTC)

No reason to delete "controversial" images

So, recommendation 4 calls for a purge of images meant to "arouse, not educate". That recommendation will not work.

I'm even more deeply disturbed by the justification given:

And, although not our primary motivation in making this recommendation, it must be noted that they are offensive to many people, men and women alike, and represent with their inclusion a very clear bias, and point of view – that of woman as sexual object. They are far from neutral. We would never allow this point of view untrammeled and unreflexive presence on any Wikipedia site as a clear violation of NPOV – we should not allow it on Commons either.

This statement makes some huge assumptions that I find very controversial. Firstly, "women as sex object" carries an unspoken assumption that the images represent a "male gaze" of a "female object". I assume, and would hope, that our Commons has infinitely greater diversity than this. I hope that it reflects the totality of human diversity in such images-- every manner of gaze pointed at every conceivable subject. That's what the internet is all about.

I am a Wikipedian; I don't actually know the people who have shared their own Creative Commons 'arousing' images. But these are real people, who decided to take time to upload images to Commons. These people are not necessarily sleazy exploitative capitalists. It's all well and good to imagine that the only people who make "controversial images" are the Larry Flynts of the world, but realistically, knowing nothing about the Commons community and the editors who have uploaded these images, I'm inclined to assume good faith, and to suspect that they're generally happy people who aren't 'objects' but instead are active participants who have chosen to appear here of their own volition.

That this recommendation so easily and automatically assumes the opposite is a bad sign.

Recommendation 4 is really the heart of it. Who is going to decide what to delete-- the normal Commons deletion process, or a mandate from on high? --Alecmconroy 08:52, 23 September 2010 (UTC)

Do you think we should retain material which isn't educational, though? Assuming you support the concept that we should only keep material with a realistic educational purpose, could you suggest some examples of material you would feel would fail that test? Perhaps that's the heart of it too ;-) Privatemusings 09:30, 23 September 2010 (UTC)
"Not realistically useful" is a fine standard because people know how to debate it. Unfortunately, the standard here is calls upon us to guess the "intent to arouse, not educate". This sets up a very weird and false dichotomy-- an artist's "intent to arouse" is completely unrelated to whether or not their images are educational. One of our jobs is to document images that may have been intended to arouse.
It's not really about me finding images that I think fail the test-- rather, my concern is that someone will use "arouse, not educate" to justify deleting an image that is, in fact, useful. And I'd rather 99 potentially-useless images get kept than lose 1 truly useful image to prudishness. --Alecmconroy 09:44, 23 September 2010 (UTC)
It’s clear we have caused something of a misunderstanding of what we are recommending with the language we used in Recommendation 4. First off, let us say that we are not recommending that anything other than the normal Commons deletion process be applied to the discussion of the images defined there. We made this recommendation in the hope that it might provide some focus for precisely that discussion.
Secondly, we are not suggesting that the question to be asked in that discussion is the intent of the image producer or uploader. In fact, one could argue that those intentions are completely irrelevant. We agree with you and previous posters that to introduce those kinds of questions of intent of photographer or uploader would be both impossible to determine and a direction with serious negative consequences. It is the intent of the image we are questioning. And you may ask how in the world the intent of an image is to be determined, or why we chose that language to determine potential educational value. Here’s the situation we were trying to isolate. If I take a picture of the piano in my living room, we might determine that the educational value of the image lay in the documentation of the particular type of piano, or the documentation of the use of a piano to decorate a twenty-first century living room. If, however, we added to this image a picture of a young woman, perfectly made up, nude, posed on the bench with her legs splayed apart, facing away from the piano, looking directly at the camera, it adds a completely different element to the picture, one that fundamentally changes – we would say reduces – its educational validity. And the reason for that is the image of the young woman has no informational value. Her presence is purposeless except to create an emotional response, in this case, sexual arousal. (Whether it succeeds or not is irrelevant.) You may ask how we know for what purpose her presence (or his, if it were a male) is included in the photograph. Obviously, we don’t “know” definitively, but a “reasonable person”, we believe, might come to this conclusion by asking questions such as: “What purpose, other than arousal, does the image of the woman fulfill?”, “What other reason may be adduced for its presence?” or “What educational value does the figure of the woman add to the image?” Note that we do not think these are necessarily easy questions to answer, which is why we believe the normal delete process is beneficial to their discussion. We chose “intent to arouse” as an attempt to isolate the qualities of the images we thought were questionable for an educational resource. But, again, not the intent of the photographer, or uploader – rather the educational intent of the image – the educational potential of the image. Robertmharris 15:13, 23 September 2010 (UTC)
Let's say that the images in question were Blue Tits rather than Bare Breasts. Is it really credible for Commons to have 3000 Blue Tit images, with more being uploaded every day? At some point someone is going to say "haven't we got enough Tits on Commons already?", and somehow I don't think that many would find that statement controversial. --194.193.183.253 12:39, 23 September 2010 (UTC)
And when do we have enough images of war victims, in your opinion? TheDJ 13:54, 23 September 2010 (UTC)
How many photos of severed limbs, bullet-riddled torsos, and crushed heads do you need? At some point, other than to the fetishist, it becomes more of the same and nothing extra is actually being added. --194.193.183.253 14:57, 23 September 2010 (UTC)
We actually have 203 images of Blue Tits currently (Commons:Category:Cyanistes caeruleus). Commons is a big place. But this isn't really the valid comparison. Commons:Category:Women is the appropriate comparison, but I don't see an easy way to count how many files are in its 31 subcategories (and their subcategories etc.), and I suspect many images containing women aren't actually categorized in any of them. But I bet there are millions of them. Now if, betraying the direction of your mind, you binary sort "Category:Women" into "Category:Women with exposed breasts" and "Category:Women without exposed breasts", how can you argue that millions of one are no excess, but thousands of the other are an unacceptable surfeit? Wnt 14:37, 23 September 2010 (UTC)
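As an aside, the counting itself is scriptable: the public MediaWiki API exposes category members, so a recursive walk can total the files in a category tree. A rough sketch in Python (parameter names are taken from the current categorymembers API module; treat the details as assumptions to verify against the API documentation):

```python
import requests

API = "https://commons.wikimedia.org/w/api.php"

def category_members(title, cmtype):
    """Yield member titles of one category, following API continuation."""
    params = {
        "action": "query", "list": "categorymembers", "format": "json",
        "cmtitle": title, "cmtype": cmtype, "cmlimit": "max",
    }
    while True:
        data = requests.get(API, params=params).json()
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])

def count_files(title, seen=None):
    """Recursively count files in a category tree; category loops are skipped."""
    if seen is None:
        seen = set()
    if title in seen:  # Commons category loops do exist, so guard against them
        return 0
    seen.add(title)
    total = sum(1 for _ in category_members(title, "file"))
    for subcat in category_members(title, "subcat"):
        total += count_files(subcat, seen)
    return total

print(count_files("Category:Women"))  # fair warning: a very large, slow crawl
```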
Presumably the non-bare-breasted photos have something else to recommend them other than being bare breasted - no? 203 images of Blue Tits is most likely to have a lot of redundancy as far as educational value is concerned. I could upload a few hundred myself, several hundred Common Blue damselflies, hundreds of photos of the same species of fly, beetle, duck, etc. Perhaps a stock photo agency would disagree, but all are redundant unless they are exhibiting some additional behaviour pattern. --194.193.183.253 14:57, 23 September 2010 (UTC)
Such reasoning endangers our cooperation with photo archives. It comes back to the issue of "are we collecting free images as an image bank of educational material (as defined by GLAM), or are we just a slave to the other Foundation projects when it comes to educational value?" A scope issue that has still not been settled, but is at the very core of the whole controversial-images issue. I have personally heard, in deletion discussions, the argument: "If you write an article about it, it can stay in Commons", only because it was a controversial topic. Such things worry me greatly as a Commons participant. TheDJ 17:02, 23 September 2010 (UTC)
If anyone thinks 203 pictures of Blue Tits is too many, I encourage him to look at those photos again - this time, look at the feet. There are tits clinging upside down to the bottom of a feeder, tits on the side of a feeder, tits clinging with their feet on a 90-degree edge, tits on a surface beneath them, tits perched on vertical branches, tits perched on horizontal branches... Now imagine, just imagine for one moment that you weren't just looking for a picture of a Blue Tit because you don't know what one looks like, but because you're writing anything from a term paper to a PhD thesis where you want to make the statement "The Blue Tit's feet have evolved to allow it to..." And you don't want to look like an idiot because you read something about the bird in American Naturalist but you didn't really understand what they meant - you actually want to see the bird making this exact motion so you know it's for real, and maybe even include a pretty Wikipedia photo (with credit I hope) so that your readers aren't as confused reading your paper as you were reading the last person's.
The great defining feature of even the deletionist, let alone the censor, is a lack of imagination, and the lack of appreciation for what useless information can become. Wnt 19:57, 23 September 2010 (UTC)Reply
The great defining feature of the inclusionist is a lack of ability to discriminate. I have numerous images like these: some of the same scene from different angles, at different moments of the kill, or of Scathophaga with a different prey species. Add in all the photos of Scathophaga species on different types of grasses, because someone somewhere might want to do a paper on how their feet grasp the various supports, etc. Of course the photos will be totally useless, as prevailing wind speeds would need to have been recorded along with a number of other variables, but at least there'll be 3000+ images of Scathophaga on a grass stem.
In those two photos, did you notice the variation in pigmentation on the notum? The first fly has a nearly uniform pigmentation, while the second has a quite visible "trident" pattern akin to that known from some Drosophila strains. So these pictures also have quite an interest from the point of view of demonstrating variations in pigmentation. But in any case I don't see why you would mind accumulating a collection of quality images of these kills. Wnt 16:44, 24 September 2010 (UTC)Reply
These are just two images, and you can always pick out a number of differences between two individuals, but my point was about the 100s, or 1000s, of similar photos – and remember that for each of those two photos I have a number of others from the same session. These two may even be different species; there are a number of different types in my area, and you can't tell unless you dissect them – it is almost impossible to differentiate most of them by photographs. Colour differences are of little interest, as that is going to depend on the light and in particular what tweaks I did to the levels, colour saturation, and a host of other changes in Photoshop. They show predation by Scathophaga, which is different from, say, Empididae, and there is an argument to include examples of ethology for different families, but 100s, 1000s? There is a point where, just as with the Blackbird eggs and Bare Breasts, enough is enough.--62.49.31.176 18:55, 24 September 2010 (UTC)Reply
Maybe there is a point where we have enough images and no more are needed. But where is that point? How many is too many? There's a very difficult way to measure that, with deletion discussions and arguments and policies — but the outcome of such a process will essentially be random, skewed by invalid arguments and prejudices. It would be much easier to develop an automated process — every year, each image has a 50% chance of deletion. That way contributors would eventually learn that the work they contribute will inevitably be held only by private image collections, available only as sold in (slightly) modified form under their copyright, and would learn better than to contribute it for public distribution, and the glut would eventually end.
The solution I prefer is to trust that if and when new images contribute no new value, people will stop bothering to upload them. Wnt 01:26, 25 September 2010 (UTC)Reply
If hoping and wishing were all it took, then we wouldn't have any vandals, sock puppets, or other problematic users. Your ideas, if enacted, would completely destroy all of the WMF, and I hope to god no one takes your statements seriously. Ottava Rima (talk) 02:05, 25 September 2010 (UTC)Reply
Apart from rare exceptions like Jimbo Wales' famous purge, Commons pretty much doesn't "weed out" images. That's what works in both theory and practice. Wnt 05:17, 25 September 2010 (UTC)Reply
That is only true if you pretend that Images for Deletion does not exist. There is even a bot that cleans up images deleted on Commons, which would seem ridiculous if anything you've stated so far were true. Wnt, you have no history or background. From what I can see, you've basically just appeared for this debate. That troubles me greatly. Ottava Rima (talk) 13:07, 25 September 2010 (UTC)Reply
"Images for deletion" is still, thankfully, a small fraction of the total, as deletionism is still, thankfully, a disease largely confined to en.wikipedia. Though in truth, if I spoke with a good insect photographer like the one taking the fly images above, I can't say for sure that I'd tell him now to put all his work out there for unrestricted use on Commons, when he might store it more safely on Flickr, perhaps even reserving it for noncommercial use only so that he might later profit on its use in textbooks. I suppose this is a great victory for you, and a sign of Commons' growing weakness and declining morale. As for me, I was invited with other users from Wikipedia and Commons to come here for debate; I think I'd happily leave Meta alone if it would happily leave us alone. I don't want to become involved in ad hominem arguments with you, though I believe I'd have the better end of the bargain. Wnt 16:46, 25 September 2010 (UTC)Reply
Outdent: "deletionism is still, thankfully, a disease..." Proof why Wnt's statements cannot be taken as anything other than disruptive. Wnt has very few edits on any of the sister projects let alone at Commons. His pronouncements are based on no actual experience and cannot be seen as based on anything credible. Ottava Rima (talk) 19:05, 25 September 2010 (UTC)Reply
There is an endless supply of people posting photos of their penis or their partner's anus on flickr. Also over on flickr there are 100s of people each posting 100s and 100s of photos of feet, or elbows, or particular brands of socks. OK, those have a sexual fetish connection, and speak more about fetish than about the subject of feet or elbows themselves, but does Commons need 1000s of photos of ugg boots in order to document the fetish? One may want a photo of each design, but 20 blurry shots of the same boots being worn by some unaware woman walking down a London street? Collections need to be curated, otherwise they might just as well be a jumble of similar photos in a box. Take these in Commons of The Green Man: there are about a thousand different examples in the UK, but unless you know where they are found and their age, and can relate them to other sculptures of a similar date and region, their use is severely limited --62.49.31.176 11:21, 25 September 2010 (UTC)Reply
Decades ago, when I was about 11, we moved from London to the edge of a midlands city, and one of the new friends I made was an Egger. He could walk across fields, hedgerows, and through woods pointing out birds' nests every 20 or 30 yards. How he did it I never knew; most of the time you couldn't see a thing until he pulled back a few branches or smoothed back a tussock of grass. For a city kid it was a wonder. One day we went around to some kid's house to see his collection of bird eggs. Out came these cigar and shoe boxes, and other containers, full of nothing but Blackbird eggs – there were 100s of them. We just stood there and stared, then Davey up and punched this other kid straight in the face. He walked off saying 'one is enough'. We never went egging again.--62.49.31.176 18:45, 23 September 2010 (UTC)Reply
I have to agree with Alec. These are very conservative viewpoints on nudity, based on how some people grew up. And while we are giving all the freedom to people who want to use images of Mohammed, are we saying that we should not give such freedoms to people who are comfortable with nudity, by denying them the same leeway that uploaders of pictures of Mohammed have? That troubles me. Equality matters, and my western(-European) freedoms affect me more than the freedoms of people in Iran, so I do not see why I should give up some of those freedoms on these projects while other societies get to keep theirs. That doesn't mean I would no longer be willing to participate in the project; it means that I think it would be unequal treatment, ultimately the result of a smear campaign by Jana from Fox News. Such things bother me deeply. We already have the Sexual content guideline almost finished, something that I fully support, because it does not limit the scope any further than already defined. We still have no idea what our scope exactly is (archive of culture or imagehost for other WMF projects), but we can discuss that on a case by case basis until we do figure it out. TheDJ 14:17, 23 September 2010 (UTC)Reply
I'm surprised that I haven't seen anyone question the suggestion that a picture of a piano inherently has more informational content than a picture of a naked young lady in a pornographic pose. Surely if one serves as illustrative of a piano then the other serves as illustrative of a pornographic pose? I can understand that there may already be plenty of pictures that cover one or other of those informational uses, or indeed that the WMF or the community may choose not to educate on the subject of pornography (or of pianos), but the claim that one or other of the pictures lacks informational content seems arbitrary. I don't think it can seriously be disputed that one can be educated on the subject of pornography, a field that I would suspect is much more diverse than the study of pianos (but maybe that shows my ignorance of pornography or pianos or both). 217.28.12.199 16:03, 25 September 2010 (UTC)Reply
I don't disagree with you that educating people about what pornography looks like is a valid instructional goal, and that certain images are valid for inclusion on Commons for that reason. My point is that some images contain more than one kind of content -- in my made-up image of the young woman there is a second form of content (its arousing potential) that, unlike the first (its educational potential), does not warrant inclusion on Commons. So these images aren't exactly analogous to images which have only educational value, and therefore demand a more careful consideration for inclusion. Robertmharris 18:26, 25 September 2010 (UTC)Reply
Though your piano/woman example is well thought out, I'm not sure the same standard is applied to many other images on Commons which don't contain exclusively 'purposeful' or 'educational' content. Though W:OTHERSTUFFEXISTS tells us that two wrongs don't a right make, and these images have the additional negative impact of bad press, social backlash, or 'prurient interest', I'm not sure it's worth the distinction. Naked women exist and are a part of the world. So are birds and trains. We don't make value judgments about whether trains are shocking representations of capitalism or if birds are horrific reminders of the fall of dinosaurs. We just don't care, because no one raises objections. But I don't think that even if they did it should matter. Are nudity and sexuality really that bothersome, that inherently offensive, that meaningless, that unavoidable, that different, that we should create a separate regime to handle them? Ocaasi 04:37, 26 September 2010 (UTC)Reply
Thanks, I think that I understand you now. I mistook what you were saying as amounting to "a picture of a piano has informational value whereas a picture of a naked woman in a pornographic pose doesn't". I think this was purely misreading on my part, and you actually meant something more like "If a picture is intended to be useful as a picture of a piano then including a naked lady in the picture is likely to detract from its usefulness as a picture of a piano, as people will be distracted from the main content". Is that right? If so, then that seems reasonable enough. I guess instead of a naked lady you could have a bullfight going on alongside the piano, or a firework display, or someone dressed as Darth Vader, or the house the piano is in might be on fire, as other distracting elements that would likewise detract from its usefulness as a picture of a piano. 217.28.12.199 10:20, 27 September 2010 (UTC)Reply

Replying to above


Replying to earlier post by Robertmharris

It’s clear we have caused something of a misunderstanding of what we are recommending with the language we used in Recommendation 4. First off, let us say that we are not recommending that anything other than the normal Commons deletion process be applied to the discussion of the images defined there. We made this recommendation in the hope that it might provide some focus for precisely that discussion.

That's very reassuring. A recommendation that's literally just a recommendation isn't a concern at all :) . I think a lot of us were worried that that 'recommendation', in this context, might be the first sign of a 'new policy' that would take precedence over the normal consensus-based conclusions in Commons.

Secondly, we are not suggesting that the question to be asked in that discussion is the intent of the image producer or uploader. In fact, one could argue that those intentions are completely irrelevant. We agree with you and previous posters that to introduce those kinds of questions of intent of photographer or uploader would be both impossible to determine and a direction with serious negative consequences. It is the intent of the image we are questioning. And you may ask how in the world the intent of an image is to be determined, or why we chose that language to determine potential educational value.

Here’s the situation we were trying to isolate. If I take a picture of the piano in my living room, we might determine that the educational value of the image lay in the documentation of the particular type of piano, or the documentation of the use of a piano to decorate a twenty-first-century living room. If, however, we added to this image a picture of a young woman, perfectly made up, nude, posed on the bench with her legs splayed apart, facing away from the piano, looking directly at the camera, it adds a completely different element to the picture, one that fundamentally changes – we would say reduces – its educational validity.

Whether a piano or a naked woman is more educational depends, I think, on the context of the student. Someone studying the piano will probably prefer the piano alone. Someone studying contemporary art/porn might find the latter more educational.
But the situation you try to isolate is, basically, a 'straw man', in the sense that we are never, in reality, forced to make this kind of choice. If we were given two pictures and forced to choose the "more educational" of the two, that would be a monumental task indeed. But that's never the choice we have to make.
We don't get two images and have to pick one. Instead, we get one image and have to decide whether to keep sharing it with people who want to see it or whether we should delete it outright. The deletion doesn't even save us disk space, I'm told, as deleted images are still stored on the servers, accessible to admins.
We don't choose between two images of varying usefulness – we choose between a potentially useful image and a 'file not found' message. Once you realize that, the image is almost always going to be more 'educational' than an error message.
So here is the real scenario: if someone comes looking for a particular image, we can either give them the image they want, or we can say, in essence, "Sorry. While we do have a copy of the exact picture you're looking for, and even though it's 100% legal, some people have found it to be more controversial than educational, so we're considering that image 'deleted' and 'off limits'".
There are lots of valid reasons for deletion, but controversialness isn't one of them. Two human beings are trying to communicate – if everything is truly legal and ethical, why should a third party intervene to object and try to impede that communication? No more deletions are necessary; just let people filter themselves. --Alecmconroy 14:54, 1 October 2010 (UTC)Reply
I decided when this wiki went up that I would confine my comments to ensuring that our recommendations and the reasoning behind them were clearly understood, not to try and convince anyone to hold one opinion or another. And it's possible that you and I see the same things, but just disagree on how to interpret them. Fair enough. But in case that's not the case, I'll note that we agree that "there are lots of valid reasons for deletion, but controversialness isn't one of them." We're not suggesting that images be deleted because they're controversial. We're suggesting that they be deleted because they lack educational value. Perhaps it's at the definition of educational value that we part company. I'll grant you your point that "pornographic" images display "pornography", in the same way that images that are out of focus display the nature of focus. Having some pornographic images (images that have no other educational value than that they display pornography) in Commons is valid so that we can answer the implied question "I've heard of pornography and pornographic images. What do they look like?". A few out of focus images surely belong in Commons for the same reason, to answer the potential question "I've heard about images that are out of focus. What does out of focus look like?" But in both cases, I would say that these exceptions do not prove the rule. The presence of a few such images is not an argument for an unlimited number of them, because they both represent images that have no real educational content. In the case of the "pornographic" image it's because it has no other instructional value than to demonstrate itself, (and the nature of pornography is to arouse rather than instruct, so that means it has no instructional value other than to demonstrate arousing imagery). In the case of the out of focus image, same thing, it demonstrates the process of being out of focus, not the thing it was supposed to represent. (And I know, of course, that you have not even hinted that we should house a catalogue of out of focus images in Commons, I'm just trying to establish an analogy to demonstrate what I mean about an image having no intrinsic educational value.) Not every image is educational, no matter whether it is demanded by a potential user or not. That may be an arguable proposition, but as I said above, I'm not trying to convince anybody, just explain where we were coming from. Robertmharris 01:50, 2 October 2010 (UTC)Reply
"Not every image is educational, no matter whether it is demanded by a potential user or not." Very true, but a lot of people wont understand that. Ottava Rima (talk) 12:51, 2 October 2010 (UTC)Reply
"if everything is truly legal and ethical why should a third party intervene to object" Controversial, by definition, means that something is either not "truly legal" or not "truly ethical". Things that are sound are not controversial. Ottava Rima (talk) 12:51, 2 October 2010 (UTC)Reply

Two main concerns


I agree with "least astonishment" and allowing users to switch off areas and types of content for themselves (and not by default). But I have two big concerns:

Image bank use


My main concern starts at the point where the text acknowledges Commons' role as an image bank, but then suggests deleting images of nudity where "the intent of the image, to a reasonable person, is merely to arouse, not educate".

I think this misses the point. Commons is for both educational and reference use – as a free image bank it legitimately contains images whose purpose (to face the issue head on) is to "arouse". A person constructing a sexually themed book has every right to expect Commons to be a relevant image bank, as much as a person writing a book about steam locomotives or the carbon atom.

That's not to open the door to "pornopedia", but it is to say that we do not judge end users or their uses, nor "target" certain groups as not worthy of image bank access. The focus here should be on removal due to effective duplication and poor/ordinary quality - if we have many crap or duplicate pics, that defeats our goals by making it impossible for users to find the ones that meet their needs - whatever those needs are.

There is a logical fallacy also. Say we have 10000 legitimate penis or Mohammed images out of 30000 total. Is the problem then that the world can accept 10000 explicit sexual images but not 30000? Or what? Those who would object to 30000 images will surely still object to 10000.

The criteria for removal need to be built much more around usefulness and duplication – i.e. if many ordinary images are uploaded in some area, then set a higher standard for the images kept, and delete images when they are either substandard, or when we already have exceptional coverage in an area and the new image is ordinary and unlikely to add much, if anything.

FT2 (Talk | email) 19:52, 23 September 2010 (UTC)Reply

I agree with that. The images that have the greatest potential to damage our credibility are the "here is my girlfriend's butt, ready for entry"-type snapshots, or the blurry penis shots. A well-done glamour photograph is much less of a problem, and in fact within scope.
What to do when we don't have good pictures within a given category is more difficult. Editors often argue, "We don't have anything better to illustrate this, so let's keep it." I tend to take the opposite view and feel that a poor picture, in particular of sex acts, comes across as somehow demeaning, doing as much harm by its lovelessness and lack of skill as it may add value in terms of pure informational content. YMMV. --JN466 02:32, 24 September 2010 (UTC)Reply
"Somehow demeaning, doing as much harm by its lovelessness". Again, I appreciate your evaluation, but I don't know if it should be taken as universal. I much prefer quality images to poor ones, but don't know if we should apply something as value-laden as 'demeaning' and 'loveless' to our analysis. It is possible that images you find demeaning were created without disgust or with love. It's even more likely that some viewers will find them that way. I'm not just talking about the 100th rear-entry photo of a guy and his girlfriend--those are easy to limit on the basis of quality and redundancy--I'm talking about whatever act or subjects falls outside of someone's comfort zone and is rejected on that basis alone. Ocaasi 05:15, 24 September 2010 (UTC)Reply
"those are easy to limit on the basis of quality and redundancy" isn't that what is being proposed? Above a certain number of images, of a particular subject, more becomes less. The 'controversy' over deleting porn images is simply because it is porn images as in most cases the sole reason to keep the thing is because it is a porn image. To go back to an earlier example, if I search for "blue tit" on Getty images I get 143 examples, Commons has 33% more, and as more and more are added the dross will overwhelm the good. In fact the Blue Tits is already highly redundant with a number of examples of the same or similar images. Note: a few of the nest shots are illegal too, no naturalist would use them, and nor would any respectable publication. --62.49.31.176 10:20, 26 September 2010 (UTC)Reply
FT2, you make statements about Commons which are new and unique to you. Where you got the idea, I do not know. Looking at your contributions, it is not based on expertise in what makes educational or encyclopedic content. You have made the same claims on IRC without having any real grounds or basis for them. We do have standards here, regardless of whether you believe that "we do not judge end users or their uses". We always have, and we always will. Then you make claims like this: "A person constructing a sexually themed book has every right to expect". We are not here to provide people with the source material for their books. Commons exists to provide a central location for the various -projects-. That means we can have translations with images on the various Wikipedias without having to upload each locally. To suggest that we are to cater to some unknown author is to justify adding every possible image, no matter how useless or crappy. Do we allow Wikipedia to be filled with a page on every topic no matter how unnotable? Of course not, and people who suggested that were normally laughed off the Wiki or banned for trolling. Ottava Rima (talk) 00:21, 25 September 2010 (UTC)Reply
The deletions for "unnotability" are arbitrary and codify an extremely strong cultural bias. For example, Wikipedia just deleted an article about the people killed in recent Kashmir protests — even though, as I pointed out in the discussion, these people individually have been the focus of major public protests, and the circumstances of their deaths have been individually published in the regional press. Wikipedia maintains separate articles for each of the four people killed in the Kent State shootings. I feel that Wikipedia has sent a clear message that the life of someone in Kashmir is worth less than 1/25 of the value of a life in the United States. I am so very tired of hearing about the good selective work done by deletionists, when all they do is stamp their prejudices on history. Wnt 01:34, 25 September 2010 (UTC)Reply
The only cultural bias is that of a small group of Europeans who wish to ignore the over 4 billion people in the world who want something more restrictive than "anything possible". We are already blacklisted by many nations and schools because of people like you trying to justify unnecessary penis images that do nothing more than promote some strange political feeling you wish to impose upon the WMF. The people in Kashmir don't want pornography like we have here, yet you failed to mention their thoughts when you defended thousands of images that serve no honest purpose. You are selective in your examples to the point of hypocrisy, and I am disturbed that you felt your statements were appropriate to post. Ottava Rima (talk) 02:08, 25 September 2010 (UTC)Reply
Ottava Rima, while I respect your disagreement, I don't think it holds up to scrutiny. If everything on Wikimedia were limited to a majority vote about its controversiality, we would lose a lot of content. The scope of this mission is not to satisfy any culture's preferences. While we shouldn't make it hard for them to do so, we should not in any way limit content which might be useful to 2 million, or even 2 thousand people because 4 million people don't want it. That's not what we do here. That's not what an encyclopedia is. An encyclopedia is not safe or friendly or polite. You seem to equate encyclopedia with education, as in 'something you might learn in school'—but the truer definition of encyclopedia is a worthwhile collection of all things that exist. Some of those things are seen as vile or immoral by many. Politely, so what? Ocaasi 12:23, 25 September 2010 (UTC)Reply
Lose content? And that is a bad thing? Oh no! Here is the thing – nothing that matters ever disappears by the majority's say. The worst argument around is the fringe claiming that their ultra-important item must be preserved at all costs, even if it is blatantly ridiculous. And an encyclopedia is not a collection of all things. Whoever told you that is ridiculous. We do not have separate articles on all of the Pokemon anymore. An encyclopedia is an educational resource on -what matters-. It is not about the little stuff but the big stuff. I've never heard of Britannica being protested for being immoral or vile, but I've also never seen them host hundreds of thousands of pornographic images that are only used by a small group of people to wank to each other. Ottava Rima (talk) 13:11, 25 September 2010 (UTC)Reply
Good God, and you wonder why anti-censorship people are such hard-liners! We merge a couple of Pokemon articles and you use it to justify an all-out crusade against a vast collection of content! What wouldn't you demand if we were willing to go along with that? Wnt 16:09, 25 September 2010 (UTC)Reply
Censorship? No, it's called appropriateness. To use the word censor is to admit that you lack a real argument. Ottava Rima (talk) 19:06, 25 September 2010 (UTC)Reply
Appropriateness is a vague and context-dependent concept. Appropriate for whom? You have some a priori ideas of what is and isn't appropriate, but not everyone shares them. Again, just go a little further down your censorship slope and recommend censoring the faces of women: surely, some people think it inappropriate to see them in public. This proposal is discussing censorship, no matter how you justify it. That just means that if we are going to censor, the justification has to be narrow and well thought out, and the execution has to be flexible and minimally intrusive. Ocaasi 69.142.154.10 23:24, 6 October 2010 (UTC)Reply

"Reducing credibility"


I also do not agree that this content is "dangerous" or "reduces credibility". Once the shock ("omg Wikimedia hosts sexual images!") dies down, the world will continue to use our remaining 99.99% of content simply because it is good. So gradually the world will acclimatise to the notion that Wikimedia Commons hosts sexual pictures. Whoopee-doo. I give them 3-5 years to get bored of the issue, if that long, and that, more than anything else, will probably advance our mission of education, because people will tire of the sex pics, the people who want them will use them, and we will have taught people to handle controversial knowledge outside their personal norms a little more sensibly, rather than emote over it. It's in the news now. Will it be in 4 years? Unlikely. Will our reputation be diminished in 4 years if we keep our usual standards but say "sorry, ignoring your concerns, use us or not as you please"? Probably no real impact once this stops being "news".

FT2 (Talk | email) 19:37, 23 September 2010 (UTC)Reply

Sex being what it is, there is at least one sense in which people will not "tire of the sex pics". They are our most frequently viewed pages, making up a considerable part of overall Commons traffic today: http://stats.grok.se/commons.m/top This is not likely to change, nor is pornography likely to become entirely uncontroversial any time soon.
What hurts our credibility is if we feature crude, "cheap and nasty" sexual content. In deletion discussions, editors feeling the need to keep censorship at bay have at times tended to argue against deleting any media featuring sexual content, almost regardless of quality. I believe making sure that the quality of the pictures we host is of a high standard – technically, aesthetically – can do much to convince the general public of the legitimacy of our hosting them, and counter the impression that we are a webhost for geeks' amateur porn, or a platform for exhibitionists. --JN466 02:50, 24 September 2010 (UTC)Reply
@FT2, This is a really important point, not just about Commons in general but about 'shock' itself. The first time you see a picture of [a bound and gagged naked man] or [a bloody corpse], the only reaction is shock. Systematic disgust. There is no capacity for reasoned evaluation let alone appreciation or even simple disinterest, because a person's core values are being threatened. But that fades. It fades with individual images, and given time and the right justification, it will fade with Commons as a whole. Even the broader social outrage at certain kinds of images will fade over time, provided moderation and explanation are handled thoughtfully.
Some people never want that disgust to go away, because they think that not to be shocked is to sanction such images and to lose a part of their moral beliefs. We have to ask whether we want the sensibilities of religious observers or cultural warriors or even the reasonably prudish – really, anyone with any kind of agenda or personal bias – limiting the scope of the resource just to avoid this shock reaction.
This is a turning point. The Study authors have suggested we make concessions now to protect our reputation. The alternate approach is at least as reasonable: we hold fast until the outrage ends, give users the option to exclude images as they see fit for their personal use, moderate based on quality and redundancy, and get back on with the business of the project. Ocaasi 05:15, 24 September 2010 (UTC)Reply
The world is not using our material, as we are still blacklisted by many schools and even countries. FT2, you know that and have had it brought to your attention multiple times. It is not very respectful of you to proceed with such an argument when you have been corrected on the matter so many times before. Ottava Rima (talk) 00:23, 25 September 2010 (UTC)Reply
I think this is a good argument but ultimately limited in its implications. It is not an issue of whether there are lots of images of penises and violence on Wikimedia sites, only whether there is a reasonable option to filter or opt out of them. So I can see this as an argument for the practical recommendations of the study regarding user-side image selection, but not for any broader exclusion of content. Ocaasi 12:49, 25 September 2010 (UTC)Reply
We weren't always blacklisted, you know. So we could delete the excessive stuff and achieve the same results without having to implement a new system. Ottava Rima (talk) 13:12, 25 September 2010 (UTC)Reply
I think that culling content to try and be more in line with traditional cultures (assuming most of the 2 million you disagree with are less 'traditional') is going in the wrong direction. I don't want to impose my values on others, nor do I presume that a moral free-for-all is best, but the direction of history has been towards greater freedom of information and broader exposure to what was once taboo. Some people see this as corrupting, and I respect the desire to not want to see any of it. But I really don't think that Wikipedia's remit is to line up with those values. In my opinion, if nothing in an encyclopedia offends someone, the encyclopedia is probably not broad enough. I want Muslim countries or Chinese students to have access to Wikipedia, but not at the expense of its content. Ocaasi 14:35, 25 September 2010 (UTC)Reply
The content serves no honest purpose. The only people who would "suffer" from not having the thousands of porn images are those who get off on the images. They can easily get off by hosting them on flickr. It isn't about "offending" or "not offending"; it is about having standards. The images deleted have -no- purpose. There is never a proven purpose. You do not need thousands of duplicated images, especially when medical textbooks and other things that are "educational" do not have this extent of images sitting around doing nothing. Ottava Rima (talk) 19:10, 25 September 2010 (UTC)Reply
Wikipedia is an industrial production facility for content, not a retail boutique. We make all this content free, even for commercial reuse. If some country blacklists Wikipedia, then it is possible for private enterprise to make a profit by creating a mirror site and having someone go over it and "cull content"; if the censors are willing to tolerate a few accidents, it is also possible for someone to make a profit (perhaps at public expense under w:CIPA) by intruding into the user's connection and blocking service of just a few Wikipedia pages.
It is far more productive for Wikipedians to focus their efforts on making an encyclopedia so vast and complete that no country is willing to do without it, and to set a standard so fair and unflinching that no one can claim that allowing a particular type of content is a deliberate affront, than to focus efforts on a censorship regime that only encourages further demands. Wnt 16:06, 25 September 2010 (UTC)Reply
Agree. FT2 (Talk | email) 01:01, 26 September 2010 (UTC)Reply
Sorry, but when you have thousands of penis images, people don't want to bother helping out to make the place complete. Our reputation is in the gutter in order to appease a few dozen people who are exhibitionists. Ottava Rima (talk) 19:10, 25 September 2010 (UTC)Reply
Let's put your "thousands of penis pictures" in a more accurate frame. As of today, Commons has 7,434,776 media files. Commons category:Human penis has a comparatively tiny number of images - go look, then look at the diversity of sub-categories involved that a person might want to look for (and consider that many pictures will be in multiple categories, adding up will over-count them). When Commons has 7.5 million files and 0.5 million of them are human genitalia, then we might have a problem. We don't, or at least not in the sense you think of it. FT2 (Talk | email) 01:00, 26 September 2010 (UTC)Reply
A tiny number would be one or two. It is not that. A penis is one object. Having thousands of images of it is excessive. You know that. You've admitted it before many times. You honestly know better than what you are saying right now, which makes what you are saying now very questionable. You have defended some of the weirdest, most fringe stuff for no reason and have admitted to not having a reason, so I can't help but think that you are just messing with people with your posts now. After all, you stated that you "enjoy the argument". This isn't about arguing. This is about encyclopedic integrity. Ottava Rima (talk) 03:46, 26 September 2010 (UTC)Reply
I am trying to remember when someone last put so many views and stances in my mouth that I didn't say. Please don't do it. Thanks. FT2 (Talk | email) 09:57, 26 September 2010 (UTC)Reply
I have the IRC logs, so everyone can see that I am not lying. They can also see some of the most ridiculous and fringe POVs FT2 has clogged the IRC channel with as part of his "I enjoy arguing". Ottava Rima (talk) 12:45, 26 September 2010 (UTC)Reply
Please stop with the ad hominem arguments already. Even if it mattered, how would I even know if only one person used FT2 as an IRC nick? Actually, looking just now, I think the main categories of penises on Commons have about 300 images, quite a few of which have very obvious educational relevance (circumcision, circumcision repair, hypospadias, piercing, skin bridges, demonstrating erection and ejaculation). Even among the unannotated penises one can hardly say that "a penis is one object". Some guys have weirdly schismed penises where it looks like the corpora cavernosa are about to declare their independence. Foreskins sit differently from one to another. If someone later wants to write about such distinctions, it helps to have a public domain image resource to fall back on to be able to document the point. There is educational use here, if a person is willing to look for it. Wnt 07:42, 27 September 2010 (UTC)Reply
Thank you for revealing to the world that you know very little about how IRC, a very common internet chat system, works. This just verifies that you are not qualified in general when it comes to internet matters, including Wiki. Your claims that "penis repair", "hypospadias", "piercing" are all needed have no basis in fact. Any credible medical textbook or guide has only one or two -drawn- images and does not need any such thing as the above to differentiate between non-notable penis statuses. One image of a penis can be rationalized as educational. Any more? No. Ottava Rima (talk) 13:50, 27 September 2010 (UTC)Reply
What are you claiming about IRC? I just went to www.mibbit.com, typed in "FT2" and "#wikipedia", and there I was, nick FT2 on #wikipedia. What more is needed? Wnt 18:28, 27 September 2010 (UTC)Reply
The nickname FT2 is registered on Freenode in the IRC channel. You would have received "-NickServ- This nickname is registered. Please choose a different nickname, or identify via '/msg NickServ identify <password>'." before being booted from the name. FT2 has a cloak on IRC, which means that his account on IRC is directly connected to his WMF account and those who work the IRC channel have verified it. This is just more proof that you might have a compulsive issue when stating things that are not true. Ottava Rima (talk) 18:39, 27 September 2010 (UTC)Reply
Alright, I'll concede the point. Wikipedia has no official IRC channel, but it's apparent from w:Wikipedia:IRC that Freenode's #wikipedia is the official unofficial Wikipedia IRC channel. Mibbit put me on a different #wikipedia. I had underestimated how much effort people would put into talking about Wikipedia in a non-Wiki format; the only reason I've ever gone to such channels was to ask about outages. Wnt 22:08, 27 September 2010 (UTC)Reply
Likewise one green check icon (Commons has 75 green check icons), one image of a Roman Denarius (Commons has 312), one image of the Statue of Liberty (Commons has enough to need several subcategories), one image of a heart (we have hundreds of anatomical drawings, including 8 images of arrhythmogenic right ventricular cardiomyopathy and 5 of left ventricular non-compaction alone), one image of a liver (Commons has about 100, including 4 frames of Hepatocellular carcinoma alone), one image of an arm (Commons has several hundred, including 25 arm X-rays alone), or one image of a nose?
Neutrality is not a euphemism for "selectively target the areas the writer has strong feelings about". The category sizes are probably broadly comparable. FT2 (Talk | email) 16:23, 27 September 2010 (UTC)Reply
We aren't blacklisted in schools and in other countries because of unjustifiable checkmarks. You know that and you know that argument. Stooping to such a level is disruptive. Ottava Rima (talk) 18:39, 27 September 2010 (UTC)Reply
It's clear that there isn't going to be an agreement here on your view. Every other class of image has multiple images - often multiple images of even extremely narrow subcategories (8 images of arrhythmogenic right ventricular cardiomyopathy for the heart). Your view is that although we have a hundred (or hundreds) of images for everything from Roman coins to many human body parts, and for biological parts we often have many images showing different types and conditions, the entirety of the human race and Wikimedia Commons users should (in your view) consider one single image of a genital organ sufficient for all uses - medical, social, diversity-educational, and all others. In fact your view is that "Any credible medical textbook or guide only has one or two -drawn- images". I think the authors of medical textbooks might disagree. So might the authors of sex education, adult information, and sexual health information. I think you're onto a no-winner, but that's your choice. FT2 (Talk | email) 21:58, 27 September 2010 (UTC)Reply
You've only successfully argued for, at most, 10 total penis images. Furthermore, you have not demonstrated the need for -real- images as opposed to drawings, especially when many traditional medical books (i.e. educational works) use drawings and diagrams in order to avoid issues of race and the rest. But as to the current reality, we do not have images of black people, Asian people, etc, but only white people according to the study (there was -one- black penis). In essence, you are defending something that is not actually present or true and it is something that you know of because you, of course, read the page this is the talk page of, right? So, what is it? Did you forget that there was no representation of anything but white penises? Did you ignore it because it did not make your point? Or did you not read what you are supposedly responding to? Either way, it shows that your statements are problematic and I would ask you again to refrain from such practices. Ottava Rima (talk) 23:41, 27 September 2010 (UTC)Reply
I would like to see more Black and Asian genitalia. There. Also, real images are phenomenally, well, real. We don't ask for 'cartoons' of bees and trains, because there is meaning and content and beauty in reality. People should know what things actually look like. It helps the stigma go away, aside from being more factual and more interesting. Ocaasi 69.142.154.10 00:00, 7 October 2010 (UTC)Reply

A suggestion for an approach


User preferences only affect a logged-in user, and cookies can be removed inadvertently during cleanup.

I'd like to see simple downloadable browser software (extensions, add-ons, userscripts, etc., as required for each main browser) that in effect adds a browser setting for a default show/hide of such images, affecting all Wikimedia content in that browser.

My feeling is that a far better way to ensure "reputation" than any degree of censorship is:

  1. Make default-hide or remove accessible as a browser add-on - Making image default-hide available both as a user preference and also via a browser plug-in/add-on (affecting all access through the given browser),
  2. Tightening image selection/retention/quality policy for areas with large-scale redundant imagery - Setting a policy of selecting for high quality and non-redundancy in areas where we have plenty of images,
  3. Make clear our stand and don't "cop out" - Making clear that we stand against external pressure to change our content for good reasons, as an integral part of editorial neutrality and image bank usage, and making clear why that is so; and that we do not target users by their end use. Steps are being taken to provide those who do have such objections with a way to remove them from their own viewing or their own computers and to reduce over-stocking of redundant imagery, but Wikimedia itself will remain uncensored.

This would probably be the most desired solution to complement user preferences, for a very wide range of concerned users, so that places that offer computer access professionally (church study rooms?) or homes whose parents deem the images inappropriate can set these images to "hide by default" or not show on their own computer.
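To make the add-on half of this concrete, here is a minimal userscript-style sketch in TypeScript of the default-hide behaviour described above. It is only an illustration of the idea, not a proposal for any particular extension; the placeholder wording, sizing, and the page-load hook are all assumptions:

```typescript
// Minimal userscript-style sketch of the "default-hide" idea, assuming a
// browser extension or user-script context. Placeholder wording, sizing,
// and the load event hook are illustrative assumptions.
function hideImagesByDefault(): void {
  document.querySelectorAll<HTMLImageElement>("img").forEach((img) => {
    const placeholder = document.createElement("button");
    placeholder.textContent = "Image hidden by your settings – click to show";
    placeholder.style.cssText = `width:${img.width || 180}px;height:${img.height || 120}px;`;
    // Clicking restores the original image element.
    placeholder.addEventListener("click", () => placeholder.replaceWith(img));
    img.replaceWith(placeholder); // swap each image out as the page loads
  });
}

// A real add-on would consult a stored preference (browser storage rather
// than cookies, which, as noted above, get cleared) before running at all.
window.addEventListener("DOMContentLoaded", hideImagesByDefault);
```

The point of putting this client-side is exactly the one made above: the content stays untouched on the servers, and the decision to hide is made, and reversible, on the reader's own machine.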

Time will take care of the drama and the rest.

FT2 (Talk | email) 06:53, 24 September 2010 (UTC)Reply

Other than your suggestion for a browser add-on, this is, as far as I can see, essentially what we have recommended. (The latter is in our first two principles in Part 1.) I don't see what censorship has to do with this. We said pretty categorically that no content deemed appropriate by Wikimedia editors on Wikimedia sites, by current, existing tests of appropriateness, should be removable by any user. Delayed, yes; removed, no. Robertmharris 23:50, 24 September 2010 (UTC)Reply
With respect, it's not. Here are the key differences:
  1. Recommendation 4 is centered on "arousing but non-educational" as the criterion for deletion. I think that's inappropriate because 1/ we do or may in future host material that would not commonly be understood as "educational" but is within mission as "providing knowledge" or "informative"; 2/ if it's non-educational in the broad sense (i.e. "providing knowledge") then we shouldn't host it anyway and it should be removed for that reason; 3/ we should not target content or users by their end use; 4/ it's hard to imagine an announcement that we have culled some sexual images placating any critic – it will merely strengthen the attacks on the holding of any other content. The focus should be on culling by higher standards and "duplication of ordinariness", not estimates of what is "arousing" v. "educational".
  2. We should be much firmer and more outspoken in fully supporting our position of open hosting, believe in it, and present it as a positive point of value, not excuse it or imply it is open to debate out of "reluctance".
  3. I feel that concerns that we will be negatively impacted by a "wave of censorship" are unfounded or poorly conceived. History suggests the opposite: those who placate an implacable demand find they have given 50% as a compromise only to have weakened the discussion on the other 50%, their opposition taking the give-away as weakness. I suspect this will largely blow over in a short time (will it still be headlines in 3-4 years? Doubtful) and people will find us too useful; the world will vote with its feet if we instead say "this is our belief, we're prepared to provide tools for individual use but as for anything more, no – you may use us or not as you please".
  4. We should provide software plug-ins to allow default hiding of certain material on a given machine (i.e. its browser generally) – I think you've understood this as an additional suggestion.
Some of these are perhaps thoughts you expressed but slightly differently nuanced. Hope this clarifies. FT2 (Talk | email) 01:22, 26 September 2010 (UTC)Reply
Thanks for the clarification. Let me start where we agree – your point #2. I think the notion that the project’s support for open hosting should be expressed positively, not negatively, is a good one. I will find a way to include this notion in the report.
To your point #3 about “give-away as weakness”, a couple of things. One, it’s hard for either of us to predict how people may or may not react in the future to certain actions. My only point is that the ability and willingness of the world to use censoring tools on the Internet is increasing (and that’s documentable; see this recent book http://opennet.net/accessdenied) and that therefore our current commitment to user respect might be challenged in a different way in the future than it has been in the past. However, as per your statement to users, I definitely agree with the need for it, but I would turn it around to make it a “half full, not half empty” notice, i.e. “We provide an open information resource, as our mandate, and we believe it is appropriate for us to do so, but we also respect your right to decide what you and/or your family might choose to view on our projects, so we’ve provided you that option” – very clunky, but I hope you get the idea.
The reason you feel (and you may be right) that we disagree more than we agree centres, I think, on our Recommendation #4, about the deletion of certain classes of sexual images. We do disagree here, somewhat, I think, basically because we feel that the images we described in #4 fail your test of “educational”, even using your wider definition of “providing knowledge.” Yes, we are willing to admit that a few representative examples of such images do provide knowledge (of what pornographic images look like, for example), but our view is that those images are the exception, not the rule (and therefore the argument that we should have a few such images is not an argument that we should have an unlimited amount). To us, these images lack informational content, provide only emotional content, and are therefore not educational. However, that is an argument to be made, we understand, and remember that we are only recommending, in the end, that such images be considered for deletion using the current methods of Request for Deletion, which provide an opportunity for the community to decide these things on a communal basis. However, we stand by our analysis – these images, we believe, like other classes of images, don’t meet current standards for inclusion on Commons.
And, finally, that, of course, is another fundamental question that your remarks address – are there any images that should not be in Commons? This fundamental question centres on the equally basic question (raised in differing contexts by you and others on this page) of whether there is a difference between “educational” and “informational” images, and if there is, whether Commons should restrict itself to one or the other, or both. This is a bigger question than we were asked to study, but it entered our discussions so many times that we have addressed a few remarks to the question in Part 3. Robertmharris 15:45, 27 September 2010 (UTC)Reply
Okay, I guess I wasn't misunderstanding you originally after all. You really are saying that a photo of a naked person can contain less informational content than other photos purely on account of its potential to arouse. That doesn't make sense to me at all. If I have two pictures of an attractive young lady, one attractively dressed and the other naked, then can the nakedness really remove information? What information is lost? Information about her clothes, I guess...? This just seems bewildering. Is the information lost for everyone, or only for those actually aroused? 217.28.12.199 21:40, 27 September 2010 (UTC)Reply


Here here, especially on #3. I want a wikimedia that is unabashedly proud of its stand against censorship. --Alecmconroy 08:09, 24 September 2010 (UTC)Reply
(with huge apologies for being a pedant, it's 'hear hear' - as in 'hear him') - I've previously linked to Ting Chen's posts to foundation-l on this matter which explain, I feel, very well why diverting this discussion into one fundamentally about 'censorship' is a bad idea. The worst aspect, which I noticed here and elsewhere, is that it's a label which unfortunately acts as a bit of a 'thought stopper' - tending to polarise and limit discussion instead of helping it. fwtw. Privatemusings 09:22, 24 September 2010 (UTC)Reply
OK, we can call it self censorship that we impose upon our readers, if that suits you more. TheDJ 11:59, 24 September 2010 (UTC)Reply
I don't feel that we need #2. For what? To make censorship proponents happy? It's not a really good compromise – to propose a big unbiased indiscriminate flush instead of a little biased flush. Are there rational reasons to tighten the current inclusion rules? I feel that in practice such tightening will result only in upsetting and driving away contributors, wasting nerves and time in unnecessary deletion discussions, linkrot, and deletion of useful, good-quality content (of course – DRs on Commons usually do not get enough community review, and there are a lot of mistakes because of it). Trycatch 12:49, 24 September 2010 (UTC)Reply
"... places that offer computer access professionally (church study rooms?) or homes whose parents deem the images inappropriate, can set these images to "hide by default" or not show on their own computer": FT2 has put forward some useful ideas here. --JN466 15:35, 24 September 2010 (UTC)Reply
There has recently been some controversy in the press about Wikipedia's inclusion of spoilers. It occurs to me that this too is something that we could have a user-selected regime for -- spoiler text could be tagged, and users who don't want to have spoilers visible on their computer would be able to have that text greyed. --JN466 13:02, 25 September 2010 (UTC)Reply
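Mechanically, that would be the same pattern as user-side image hiding: if spoiler passages carried a tag or class in the page markup (hypothetical here, e.g. class="spoiler"), a few lines of user script and injected CSS could grey them until the reader clicks to reveal. A rough sketch, with all class names assumed for illustration:

```typescript
// Rough sketch of opt-in spoiler greying, assuming spoiler passages carry
// a (hypothetical) class="spoiler". Injected CSS greys the text; a click
// on a passage reveals it.
const style = document.createElement("style");
style.textContent = `
  .spoiler { color: transparent; background: #ccc; }        /* greyed out */
  .spoiler.revealed { color: inherit; background: none; }   /* shown */
`;
document.head.append(style);

document.querySelectorAll(".spoiler").forEach((el) =>
  el.addEventListener("click", () => el.classList.add("revealed"))
);
```

As with image hiding, nothing is removed from the page itself; the reader's own settings decide what is initially visible.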
That's certainly not happening on the English Wikipedia; that discussion has already flown. But it's also probably beside the point. What is the point, here, is that it's being suggested Commons should "cull" images we have "enough of". But hard drive space is cheap. Why have "enough" at all? If I'm looking for a photo of an owl, maybe I need one of an owl exhibiting a very specific trait or caught in a very specific act, to illustrate a specific facet of owl behavior. In that case, the more photos of owls we have, the better, and the more likely I can find what I need. If the concern is that someone who just needs a really good image of an owl may not be able to find one, I would propose that the featured content system be used (and perhaps expanded, to highlight excellent-quality photos from each category). But regardless of photographic quality, if an image doesn't contain a depiction of exactly what I'm illustrating, it's useless to me.
Commons is intended to be a free content repository. By all means, let users "opt out" of images they personally would not like to see. We shouldn't be out to force a reader to read or view something, only to present it and allow anyone who wants to go and have a look. But we shouldn't be culling content on the grounds of "arousing" (what if I'm illustrating an article on "Erotica" or "Pornography"? Isn't that exactly what I need to illustrate those?), nor on the grounds that they may be offensive or shocking to some people (if I'm illustrating an article on a specific sex act, and someone has provided us a free photo of that act, the most useful thing to be done is to show that photo—we don't resort to "line drawings" of things elsewhere, we show the real thing, even when it may disturb some people, though of course excluding images it would be illegal to have at all in the US.) "Not censored" demands no less—that we do not delete free content images just for PR reasons. And after watching the image deletion issue, it was very clear to me that there is no consensus within Commons to "delete excess images" of a sexual nature, let alone any other nature. That's as it should be. Commons is intended to be a repository.
Finally, the concept of images being deleted to protect someone's religious superstitions is odious beyond even that. No religious group should ever expect those not of the religion to care a bit about what their religion says anyone may and may not do. And regardless of anything else, such "shuttering" settings should always default off, and be turned on only if the user specifically chooses to do so. Seraphimblade 16:37, 25 September 2010 (UTC)Reply
"Commons is intended to be a free content repository" This is a twisting of what the phrase means. The repository was for the sister projects, and the standard is use on those projects. Most of the images have no use or purpose. We are not flickr or a substitute for it. Ottava Rima (talk) 19:11, 25 September 2010 (UTC)Reply
Ottava, I just illustrated one purpose, above (an article on "Erotica" or "Pornography" would be a totally appropriate place for images that are, well, designed to titillate—that's what such a thing would be about!). Other examples would be articles or books on human sexuality, sexual behavior, or anatomy, or a Wikiversity course on the same. It's very hard to find an image that has no conceivable use whatsoever, and while it may make some people squeamish, sexuality and violence are legitimate areas of study and legitimate things to write about in a reference work. We should use free images to augment those writings just like we do for everything else. Seraphimblade 20:46, 25 September 2010 (UTC)Reply

In the News

As always, Gregory Kohs runs his mouth. TheDJ 10:57, 24 September 2010 (UTC)Reply

WikiJunior

There is a Simple English Wikipedia (http://simple.wikipedia.org) and a proposal to create Simple Wikipedias in all the languages which have as many articles as the English Wikipedia had when the Simple English Wikipedia was created. I hope that everyone who believes beginning readers should have access to the sum of all human knowledge will support that proposal. 71.198.176.22 03:25, 25 September 2010 (UTC)Reply

We also have an actual Wikijunior: http://en.wikibooks.org/wiki/Wikijunior. Geni 03:11, 26 September 2010 (UTC)Reply

Simple English Wikipedia is not intended to have its content limited for children; only its language is simple. LtPowers 20:49, 29 September 2010 (UTC)Reply

No confidence in the recommendations; suggestions for improvement

I have questions for the recommendations' authors:

1. Why are sexuality and violence identified as worthy of exclusion while religious and superstitious ideas and imagery are not?

2. There is ample evidence that exposure to religious images and ideas -- and to the tribalism, faith-based anti-reality viewpoints, and palpable incitement to violence they often entail -- causes measurable harm. There is little or no actual evidence that exposure to sexual or violent images causes any actual harm or legal liability to the Foundation. To what extent was this evidence considered when formulating these recommendations?

3. Public school officials wishing to have a version of Wikipedia compatible with their nonsectarian educational materials requirements will be much less likely to attain one unless the Foundation actively supports production of nonsectarian editions of the dumps and image bundles. Do you support a nonsectarian edition of Wikipedia? Why or why not?

4. Why is Steven Walling identified as the author of the recommendations when Robert Harris and his daughter are purportedly authoring them? 71.198.176.22 03:39, 25 September 2010 (UTC)Reply

To take your questions in reverse order, Steven helped us put the study up, and did so from his account. He has authored none of this, but that's why his name appears in the History logs. I'm not sure, I'm afraid, of exactly what you mean by a non-sectarian edition of Wikipedia, so I can't effectively answer that question at the moment. The question of harm you raise in your question 2 may be debatable by some, but actually, the question of "potential harm" was not one we considered -- not because it is not important, but because we agree with the basic notion that Wikimedia projects have designated themselves as educational first and foremost, so questions of educational validity must take precedence over all others. We have suggested that service and respect to readers is also a Wikimedia principle (not necessarily protection from harm, which we would say is something a bit different), so we have made suggestions to integrate these two principles where we felt it necessary, using rules of engagement we defined in Part 1.
As for your first question, it is a very good one, and if you can wait a day or two, we will be addressing the answer in Part 3. But remember that we have suggested a regime that would allow individual users the ability to control their usage of some religious images, although it is a different regime than the one we have suggested for sexual and violent images. The basic reason is that it is our observation that concern about certain forms of sexual imagery and images of violence is quite widespread in many different societies and cultures, whereas concern about religious imagery is not -- or if it is, quite different images are the object of the concern, so a one-size-fits-all approach seemed very problematic. And remember, all of the "restrictions" we are suggesting on imagery are user-controlled and user-initiated. We are offering our readers the option to collapse images in certain categories for their own use; we are forcing this option on no one. Robertmharris 05:21, 25 September 2010 (UTC)Reply
You say you have suggested a way to allow individual users to control the appearance of some religious images, but you do not refer to sacred or religious imagery in your recommendations 7 through 11 at all. You say that the restrictions are controlled and initiated by users, and aren't being forced, but your recommendations 4 and 6 specifically refer to restricting numbers of images and deleting them, respectively. Am I missing something, or have you just misrepresented your own text?
Would you please list the authorities you relied upon for determination of actual harm and legal liability? 71.198.176.22 20:14, 25 September 2010 (UTC)Reply
Recommendation 9 specifically refers to other content deemed controversial, which we had previously identified as including certain images deemed sacred by one group or another. When we said that restrictions are controlled and initiated by users, we did not mean to imply that we were suggesting that requests for deletion based on scope should be abolished. We meant to say that restrictions on content determined to be in scope on Wikimedia projects should be initiated and controlled by users. Our Recommendation no. 4, as Wnt has pointed out, is nothing more, in effect, than a normal Request for Deletion of a group of images, with a rationale presented in advance of why we feel that request is justified using current policy on educational scope. We have noted our interpretation of the Foundation's legal liability in the Introduction to Part 2. This position is supported by the Foundation's legal counsel. Robertmharris 22:12, 25 September 2010 (UTC)Reply
I have no problem with restricting child pornography, which is the extent of legal liability mentioned in the introduction to Part 2, but the recommendations here go quite a bit further than that. What was the standard of harm you used to support the mass deletions and restrictions requested?
I'm not sure whether an official Foundation-sponsored effort can ever possibly be seen as a "normal" request for deletion, but that's beside the point. I was hoping that this was going to be an analytic effort, supported by respected sources measuring actual harm and legal liability. I fear it has become merely a lengthy essay without a research basis, supporting a moral panic whitewash. Relying on the careful studies of actual harm and legal liability that I've seen (and shared!) would not result in removing as much, or the same kind of, controversial content which periodically causes criticism of the Foundation in the popular right-wing press. But since there hasn't been any evidence that studies of actual harm or detailed examinations of legal liability were involved in formulating the recommendations here at all, it ends up looking to me like nothing more than pandering to the interests of the puritanical popular press.
Don't get me wrong: I've been arguing for removal of redundant images and sexual images with publicly identifiable people in them at commons:Commons:Sexual content, so I'm probably as much on the side of those who want to shield the Foundation from serious trouble as anyone. Without a solid basis in the facts of actual harm and legal liability, I predict this will turn into a counterproductive fiasco at best, and could backfire, further estranging commons users from Foundation interests, at worst. 71.198.176.22 06:47, 27 September 2010 (UTC)Reply
Exposure to religion doesn't increase any violent tendencies, as there are just as many violent atheists. By the way, I don't think an article on St Augustine has the same legal ramifications as hosting naked pictures of a 16-year-old girl, or even hosting images of a 25-year-old who did not give permission for the image. By the way, "nonsectarian"? We've never had a problem regarding it, so this is a fantastic claim. Ottava Rima (talk) 13:16, 25 September 2010 (UTC)Reply
Do you have cites for the claim that violent tendencies are similar in occurrence between atheists and others, and for your implied claim that atheists have generally had less exposure to religion than other people have? (I'm not disputing either necessarily, but I would be interested in reading the research supporting them.) 217.28.12.199 15:43, 25 September 2010 (UTC)Reply
Do you have any cites saying otherwise? It is odd that you hide your identity, make bold claims, then demand proof from others. Ottava Rima (talk) 19:12, 25 September 2010 (UTC)Reply
No, I don't, but I'd be interested in them too if you have them. I'm at a loss as to how you feel I've hidden my identity -- I haven't even tried to conceal my IP address. I don't think I've made any claims in this context, not in the post you responded to anyway, bold or otherwise. Likewise, I find it hard to imagine how my words could have been read as demanding -- but I apologise that they somehow came across that way. 217.28.12.199 09:26, 26 September 2010 (UTC)Reply
Don't take it personally. It's not at all odd that you are an IP, and it's fully within your rights. Nor does it take away from your argument or put it on a different level, simply because you are not registered (or logged in). It's also your right to ask questions, particularly in complex debates, even if they are bold. You're not on a crusade; I wouldn't worry too much about defending yourself from insinuations that you are. Ocaasi 09:44, 26 September 2010 (UTC)Reply
Many of us are involved in Open Educational Resources projects which we hope will some day soon include materials providing free content meeting nonsectarian public school textbook standards. The treatment of, and requirements for, religious subjects under such standards are perhaps more involved than you realize. There are also common law liability issues with the fact that images of Muhammad have been getting people killed. My personal belief is that no religion is free from superstitious mythology contradicting empirical physical facts, while I understand full well that the adherents to any religion are willing to argue to various extents that their religion is the correct one. 71.198.176.22 07:03, 27 September 2010 (UTC)Reply
IPs can make up just about anything. That is why you should log into your account, as this is an obscure area and only people with long-term experience in the topic would know of its existence. If you are too embarrassed to make the same claims under your Wiki name, then that shows that there is a problem with the claims. Ottava Rima (talk) 13:51, 27 September 2010 (UTC)Reply
I stay logged out as a matter of personal preference, because I want a better understanding of how logged-out people are treated and what extra hurdles they face, among other reasons. I've been using this IP number on this topic for months now. I have no idea which area you are calling obscure, but the issues involved with state boards of education's textbook standards are well understood in the OER community, of which the Foundation is only a part. 71.198.176.22 10:21, 28 September 2010 (UTC)Reply
That is a bogus reason. I think it is obvious that this page needs to be semi-protected to stop the logged-out editing. Ocaasi is doing it and so are others. Ottava Rima (talk) 14:03, 28 September 2010 (UTC)Reply
Ottava, I think it's clear that you don't have to be on an IP to make things up. The arguments either stand or they don't. That said, I should note, in case one of the IPs hasn't heard about it, that there is a "global unified login" system so that your account on Wikipedia automatically gets you a login here at Meta and on other projects. You just go to "My preferences" at the top of your Wikipedia page and you'll see your "global account status", where you can get started. Wnt 18:07, 27 September 2010 (UTC)Reply

Contentious discussion

I sincerely object to how Ottava is making this discussion much more personal than it has been so far -- calling people out on what they stand for and basically flipping them off, saying that people like FT2 have no relevant experience, and that Wnt, a seasoned contributor to Wikimedia projects, has no right to speak here on Meta, simply because he hasn't contributed much to Meta, apparently? He calls my thought experiment an attack on Muslims and my entire opinion dishonest. FT2, Wnt and I are all long-time Commons and English Wikipedia contributors. Ottava, however, was thrown out of the English Wikipedia, does not participate in Wikimedia Commons and was almost thrown out of that project as well.

I can overlook that, and I have no problem discussing things with Ottava, but it does need to be a discussion, and if he continues on the path that he has chosen so far on this page, then I refuse to participate here. Where I have defended my ideals and values, I have done so on personal grounds, and have never refused to acknowledge that large numbers of people probably don't agree with me, which might even result in certain restrictions within the project; I have even been a cautious supporter of filtering software.

I have, however, no interest in having to defend my cultural heritage against his personal opinion about what my culture is worth to him, nor should I have to. TheDJ 20:08, 25 September 2010 (UTC)Reply

Removed my comment because this is clearly not the right forum, and TheDJ knows it. TheDJ has a long history of pushing a fringe agenda, edit warring, being incivil, and other problematic behaviors. His attack on Greg Kohs for no reason is just one blatant example on this page. His ideas are invalidated based on his lack of experience and his inappropriate dismissal of two-thirds of the world. His post above also contains blatant unfactual statements, which are part of a greater trend within his posts across all projects. The fact that a WMF Board Member put up two of my pages a few months ago, which are some of the best-written pages on Wikipedia, shows that I was not "kicked off" any project. I have received two FAs, many GAs, and many DYKs over the past few months, things that TheDJ has never attained because he does nothing more than push a few scripts. Ottava Rima (talk) 21:58, 25 September 2010 (UTC)Reply
I agree that there's no need to accuse people. Ottava, saying someone is "pushing a fringe agenda", has a "lack of experience", and makes "blatant unfactual statements, which are part of a greater trend within his posts" really isn't helping anything, and I'm pretty sure it won't be let stand if it continues. If you can't make your point without attacking those who disagree with you, then maybe take some time to reconsider or do something else for a while. There are lots of people who disagree with your view (and many who agree). That doesn't make either side idiots, and you need to find or contribute to consensus rather than trying to prosecute your enemies and condemn their perspective.
(In fairness, the wording of the link to Mr. Kohs could be more neutrally phrased.) Also, to try and tone down the discord, I retitled this thread 'contentious discussion' rather than putting a single editor's name directly in it.
As for this discussion, which has been remarkably civil across the past few months, if anyone can't find a way to have it constructively, then take a break and go elsewhere, please. Ocaasi 02:59, 26 September 2010 (UTC)Reply
Sorry, but testimony is based on the character and experience of those involved. We do not give everyone equal say; after all, "experts" were chosen for this study. TheDJ has a long, negative history of problematic behavior connected to pushing a fringe opinion. The ramifications of his pushing for such ridiculousness have put the WMF at ethical and legal risk, such as his repeated attempts to defend the "right" of an admitted pedophile to edit. Such behavior is not that of one who wants to honestly participate for the betterment of the project. Ottava Rima (talk) 03:48, 26 September 2010 (UTC)Reply
The debate at w:WP:Child protection was well attended, and the outcome demonstrates the sort of bad policy that emerges when consensus policy-making is forbidden in favor of bans from the top. The strongest argument for allowing self-professed pedophiles whose comments and actions do not threaten other users to edit is that Wikipedia has a duty to the community to disseminate information that can help end child molestation. Our society doesn't imprison people for pedophilia, only for overt acts; someone has to persuade them to seek and obtain effective treatment rather than giving in to "the inevitable". For them to want to obtain treatment, they have to be able to learn what it involves first — such is the role of an educational resource. But will you or I be going to the library, visiting the psychiatrist, asking for information and self-help books about pedophilia? I don't think so. So Wikipedia actually needs pedophiles, who are honestly seeking effective therapy, in order to save children from abuse. This is as deadly serious a business as trying to protect children from being set up for abuse here on the Wiki, and the number of children affected is potentially much greater.
The policy actually forced on the community, at the link above and in w:WP:BLOCK, contains one paragraph to protect children and another paragraph to protect pedophiles. It prohibits people from calling attention to pedophile editing patterns in public forums. Everything is left to the increasingly concentrated power of the ArbCom, and whenever they decide to misuse that power on behalf of some trusted colleague, Wikipedia may be put at severe financial risk. Now tell me -- is that truly a better or safer policy than leaving it up to the community to decide community policy? Wnt 18:13, 26 September 2010 (UTC)Reply
Our society does imprison people for pedophilia. You don't like it, but you hold a fringe view and push it in a highly disruptive manner. You defend the most reprehensible things and it is nigh impossible for you to do so out of good faith. Ottava Rima (talk) 22:44, 26 September 2010 (UTC)Reply
So far as I know, a person could walk into a police station in Florida and tell them that his only sexual interest involves fantasies of molesting children, and the most they could do to him is give him the number of a psychiatrist. But Wikipedia can do a lot more than that. Wnt 07:29, 27 September 2010 (UTC)Reply
Banning someone from Wikipedia is the same as what they would do above -- they would ban the guy from schools, libraries, and other areas with children. We have children who even administer on this website. Once a person reveals themselves as a predator, they are legally banned from interactions with kids. Ottava Rima (talk) 14:00, 27 September 2010 (UTC)Reply
I checked online to be sure, and to be fair, I found that yes, there have been a handful of public library systems that tried to ban convicted sex offenders after their release from prison.[2] However, the ACLU successfully argued to have this overturned on constitutional grounds.[3] Wnt 17:50, 27 September 2010 (UTC)Reply
The difference between what I said and that ruling was simple -- I mentioned children. The "wholesale" ban does not differentiate between such. The Supreme Court has ruled that sex offenders can be held in jail even after their punishment is over, simply to keep them away from children. Admitted pedophiles, rapists, molesters, and all the rest have very few "rights" within the US, as they are a clear and major threat to the safety of children. Ottava Rima (talk) 18:46, 27 September 2010 (UTC)Reply
Wikipedia is much like a public library, with a healthy mixture of adults and children perusing the stacks. I suspect that public libraries draw more children, if anything, because they are allowed to lend copyrighted popular books, movies and music, from which Wikipedia has been prohibited. The Supreme Court ruling favoring w:civil confinement concerned sex offenders with a demonstrated lack of control over their own behavior, and ordered their imprisonment in a mental hospital rather than a jail. It is true, of course, that some sex offenders upon release from prison are still subject to substantial parole restrictions, but that is completely distinct from any prohibition on people who merely suffer from the psychiatric condition. Wikipedia should recognize that there is neither a legal requirement nor any practical capability to exclude such people from the project, and rather than making a cosmetic effort to do the undoable, it should make a deeply serious effort to help those believed to be beyond help. The only relevance to the current controversy is that once again Wikipedia is being pressured to compromise core principle in order to make a cosmetic effort, and once again this would be a mistake. Wnt 22:30, 27 September 2010 (UTC)Reply
The SCOTUS ruling in general said that pedophiles are dangerous and that the "probability" of them acting on it doesn't matter at all, as even a slim possibility is enough to warrant denying them any "rights" they can claim. This means that libraries are allowed to prohibit people who are blatant pedophiles from having access to children. We are held to a much higher standard than libraries, as we are putting children into positions of power and have a high level of trust we have to uphold with both schools and families. Ottava Rima (talk) 23:44, 27 September 2010 (UTC)Reply
I've already provided direct links to my sources and to the relevant article, which directly contradict your claims; I'll not waste more space on this digression. Wnt 03:07, 28 September 2010 (UTC)Reply
You provided links that don't say what you wished for them to say. That was pointed out multiple times. Ottava Rima (talk) 14:04, 28 September 2010 (UTC)Reply
Right, but this is not a trial, and your arguments are thoroughly weakened by stooping to attack editors rather than their ideas. Better yet, don't attack anything, just share your point of view and see how it can be integrated into a discussion.
I don't know the details of the DJ's history, but rest assured that this is not the place to bring it up. If you want to create a thread at AN, contact the WMF, or have an RfC about a specific issue, go ahead, but it is poor form to raise the specter of someone's alleged character faults and political history every time they don't share your opinion. I'm an outsider in your dispute, and, though I was initially opposed to your perspective, I can see the merit in making sure Wikipedia is not preemptively banned in places with more traditional or restrictive cultures. But when you keep raving at other editors, it reflects poorly on your perspective and on the many millions of people whom you might speak for.
Freedom is a radical idea, and I support the DJ in trying to defend it. I'm sure there are borderline cases that feel squeamish or risk crossing lines. But it's an old saw that if you're not willing to defend the rights of those whose behavior you would despise, then freedom means little. Censorship shares that quality. You disagree, fine -- you're not a crusader for freedom, maybe for other things. Still, stick to your argument, express it persuasively, and it will be noticed. Ocaasi 04:50, 26 September 2010 (UTC)Reply
To underline this, see this AN thread where a significant part of the community (myself included) endorsed the right to edit of a user who appeared to be a self-declared radical racist and hatemonger, and opposed a ban, because here we don't ban people for their off-project opinions but only for their on-project actions. Without knowing theDJ's history, I would back Ocaasi's stance here. We don't diss users for having radical ideas about freedom or about who may edit. We might not agree every time, but the Wikimedia tradition is largely one of radical openness. FT2 (Talk | email) 10:36, 26 September 2010 (UTC)Reply
What you and FT2 don't understand is that this is not a ban proposal. This is about project governance. Those with experience and putting forth honest arguments are the only ones listened to and deemed credible for such a thing. We are not your platform for promoting "freedom". We have many standards that we must be following, and it is nice that you are at least admitting that you are here to further a personal POV rather than do what is best for the WMF's integrity as an educational project. Ottava Rima (talk) 12:49, 26 September 2010 (UTC)Reply
Your experience is valuable. It will speak for itself and does not require dismissing or devaluing other, perhaps less experienced, editors' views. If you know more and know why a proposal will or won't work, you should be able to back it up with better evidence and better explanations. In fairness, you can't really be judge and jury when it comes to labeling theDJ's arguments as 'dishonest'. Why are your proposed changes better? How could your goals be accomplished without sacrificing the freedoms that other editors are seeking? Why should a more limited view of controversy, usefulness, or education take precedence over a broader one -- merely because some or even many object to it? When it comes to limiting content, I think the burden of proof is on those who want to exclude it.
More importantly, this study on controversial content had the potential to supersede Commons policy (or recommend doing so), so even existing policy isn't definitive for these kinds of changes. And as you know, even policy changes. So while this may wind up one day being a technical matter about project governance, any dramatic changes to mere process always involve dipping back into the foundational principles. Free access to free knowledge is about as foundational as it gets, so to the extent that this proposal raises that issue, it's not just fair game, it's the game. Trying to reframe this debate from philosophy or principles to pragmatics doesn't address the big issues.
As for my "personal POV" as opposed to "what is best for the WMF's integrity as an educational project"... again, you are assuming your conclusion, that your approach is what's best for the WMF's integrity as an educational project. Perhaps I object to your idea about the scope of what is educational. Commons:Project_scope says:
  • Wikimedia Commons is a media file repository making available public domain and freely-licensed educational media content (images, sound and video clips) to all. It acts as a common repository for the various projects of the Wikimedia Foundation, but you do not need to belong to one of those projects to use media hosted here.
  • The expression “educational” is to be understood according to its broad meaning of “providing knowledge; instructional or informative”.
  • We hold many high quality images of species-identified birds, and there is no realistic educational use for a small, blurry, poorly composed snapshot of an unidentified and unidentifiable bird. Of course, there is always room for another educationally distinct image, for example illustrating some aspect of bird behaviour that we do not currently cover, even if the image is perhaps not of the highest quality.
Those are not censorship-prohibiting free-for-alls, but they provide pretty wide latitude. When systematic re-examinations happen, big questions re-open and many try to sway discussions in the direction of their preference. I might be guilty of that to a degree, unapologetically speaking against censorship. You are claiming that you represent the verbatim policy-view, and that others are mere lobbyists for moral anarchy. But I don't think either side of that is true. I think I see use for some discrimination. I think you want a dramatically cleaner Commons. Ok, fight for it. But don't then say you're just doing it for policy and everyone else is just axe-grinding. Ocaasi 14:19, 26 September 2010 (UTC)Reply
"Good arguments" aren't enough because people lack the time and resources to differentiate between long term disruptors and honest posters. "n fairness, you can't really be judge and jury " I can. I have the ability to point it out like everyone else. "Why are your proposed changes better?" I didn't propose any. My purpose is just to point out the disruption for others to see and act accordingly. "without sacrificing the freedoms that other editors are seeking" Wikimedia isn't about freedom. There is no such thing as freedom here. Making arguments about freedom show a lack of understanding about what we do. "merely because some or even many object to it?" We aren't an anarchy. We have standards. "Perhaps I object to your idea about the scope of what is educational. " I have years of experience in both real life and working on educational projects here. I also deal with many schools, colleges, and universities who use our projects, so I know what our reputation is and the problems that exist because of a small fringe pushing a libertine agenda. Ottava Rima (talk) 22:54, 26 September 2010 (UTC)Reply
Ottava, to be quite blunt with you (and you, of all people, ought to have no issue with bluntness), there's one party in this discussion I've already seen kicked off a project for continually causing disruption, and that person is you. Yet you insist on accusing everyone who disagrees with you of disruption. It's kind of the same principle that, if you encounter ten assholes on the road every morning, one of them is probably you. That being said, you really do need good arguments, even if you legitimately believe everyone but you intends only to disrupt. Dismissing an argument based on who makes it is an ad hominem; an argument is good or bad based upon its merits and the evidence that backs it up, not based upon who puts it forth.
On to your characterizations of being "libertine". This project is founded with, really, pretty "libertine" ideals—free content, knowledge presented without bias or favor toward anyone or anything, not being censored, and basically, a collection of knowledge not restrained by people taking offense to it, not restrained by copyright restrictions, not restrained by any of the normal issues we find in such things. So referring to something as "libertine" in the context of Wikimedia is, really, complimentary toward it—Wikimedia projects are intended to be libre. You need a good argument to argue that restrictions should be imposed, since the status quo is "We don't like imposing restrictions, and will impose as few as possible". Would you like to propose a restriction? Then please clearly state what you want the restriction to be, why the good it would do would outweigh the harm to our ideals of a free and open project, and how you propose to implement it. Continually casting aspersions against those who don't agree with you, while refusing to engage in real debate or put forth concrete ideas with concrete reasons, is what's really disruptive here. Seraphimblade 03:50, 27 September 2010 (UTC)Reply
Seraphimblade, once you tried to claim I was "kicked off a project", you invalidated your own argument. The project is not libertine, but it is obvious that you have a fringe POV, and with your claims above you disqualified yourself from anything related to a credible response. It is funny how you try to confuse two different roots with each other. The color "red" is a homonym of "read" (like the one you use above), and you read Wikipedia, so I guess you use the color too as part of the process? You made up a statement above, revealed a very fringe POV, and made arguments that are not based in rational process. Ottava Rima (talk) 14:00, 27 September 2010 (UTC)Reply
Without good arguments, your appeal to experience amounts to an appeal to authority. Why is your view better suited to the mission? You keep pointing out the disruption as if it were unquestionable fact. But people question it. It is debatable what is controversial, what should be controversial, what kinds of controversies are tolerable, and how to handle them. All of these things are being discussed. "There is no such thing as freedom here" sounds a bit totalitarian. "We have standards" assumes I said we didn't. Maybe people interpret certain standards differently or argue about which ones should govern going forward. Your years of experience, again, may add important background, but they are not definitive (except for you); they don't reflect all universities, they don't reflect all concerns. They're useful but not sufficient.
Also, to put this in historical context, "a small fringe pushing a libertine agenda" is probably why universities even exist. I don't know where you have worked or what problems Wikipedia has caused, but you still haven't explained how Wikipedia causes them problems, what the solutions would be, why those solutions are tolerable (though Wikimedia may censor internally, it doesn't censor due to outside institutional rules), and how to accomplish those solutions with minimal interference to those who want to use the content you want to exclude.
Wikipedia wasn't designed to be convenient to govern, it wasn't designed for all universities to like, it wasn't designed for anyone's individual preferences. Also, as I said before, this is not a proposal about how to manage Commons. It is broader than that; it is a foundation-wide recommendation about handling controversial content. You can keep appealing to your work on Commons or with outside educational projects, but if you can't both detail them and address broader objections, I don't think your view will make headway. Simply saying you're right isn't persuasive. Simply saying x causes problems doesn't get at the bigger issues of whether and how and why to solve them. Simply saying others are pushing a fringe agenda is trying to label away the argument. And it's a meaningless statement, because anyone could just retort that it's you pushing a fringe agenda. Neither is an argument. Ocaasi 03:59, 27 September 2010 (UTC)Reply
Authority and experience always matter. Wikipedia and all the sister projects recognize it. By the way, Wikipedia was designed to be for schools and universities, as Jimbo has always made it clear that he wanted to aid education and always targeted traditional understandings of the word when he used it. You want to radically re-envision everything and lack any real experience in knowing what works and what doesn't work. You haven't put forth any legitimate objections, or anything else except that you have a fringe POV that actually detracts from what WMF is and is supposed to be. Ottava Rima (talk) 14:00, 27 September 2010 (UTC)Reply
Of course authority and experience matter, but not until they are translated into reasons. Wikipedia was designed to be free so that schools could use it. Nowhere was it suggested that Wikipedia was designed merely to replace the Encyclopedia Britannicas in middle-school libraries. There are always levels of maturity suggested for different audiences, and levels of sexuality or violence which different audiences seek to avoid. Wikipedia might make it convenient for those who want to self-censor to do so, but it should not do it systematically except for the most obvious and extreme cases.
Pragmatically, I think it's important that there are at least versions of Wikipedia which can be used by 'more sensitive' audiences, but this is something to derive from the main branch, either as a separate project, a filtered version, or a full fork. As someone else mentioned, Wikipedia is totally copyable, and anyone can scrape its content and repackage it for educational purposes if they see fit.
So back to where this all came from. Just because you want more restrictive cultures and institutions not to block Wikipedia doesn't mean Wikipedia should fundamentally change at its roots. Instead, offer some customization, full disclosure, and let the market of ideas and demands do its work to create products that people feel comfortable using. Until then, keep on expanding and developing the most comprehensive and accessible encyclopedia ever made. Ocaasi 11:00, 28 September 2010 (UTC)Reply
My point in being here was not to produce my own suggestions but to point out problematic ones put forth by those who lack experience and have a history of disruption regarding the same fringe views. That was stated multiple times and you could see that. Since you refuse to acknowledge it, there is a problem. Now stop editing logged out, or I will ask for a CU to verify that it is actually you signing as your name on that IP. Ottava Rima (talk) 14:06, 28 September 2010 (UTC)Reply
It sounds like you're saying: "I'm not here to make suggestions or discuss the ideas in general but just to tell other people why they're wrong because I know what's best and I've already determined others are either too ignorant to know what I do or too radical and dishonest to be trusted."
As for the IP situation, I do it regularly without qualms. My talk pages all link up, and the edit histories overlap. Ask for a CU if that will allow you to have the discussion without concern. Ocaasi 17:07, 28 September 2010 (UTC)Reply

A pragmatic approach -- is this a perception problem or a shared-resource problem?

Sometimes divisive issues are easier to resolve when discussed from a pragmatic stance. Compromising on principles in the abstract is very difficult, but pragmatic concerns often lead towards the middle ground.

So, it looks as if there's a growing understanding that the project will continue to host notable, educational images even if they are pornographic, shocking, blasphemous, etc. I _think_ basically everyone agrees that any legal image should be kept if it's being used on any Wikipedia article in any language?

Difficult standard: "How many images?"

It sounds like the remaining point of contention, then, is "how many images are necessary to illustrate a subject?". As the discussion about blue tit feet above illustrates, it's very, very difficult to decide precisely how many images a "free-image bank" needs to illustrate a topic. It's a nearly impossible task, because we can't imagine ahead of time all the potential applications of free information. You just can't take a given topic and reasonably guess how many pictures are "enough". A dozen? A million? It's hard to say. And it's not at all clear to me at this point that having "more than enough" is actually a bad thing.

Better standard: "How much shared-resource usage?"

But surely there has to be some limit. People give money to Wikimedia for it to share _all_ the world's information, not just the world's porn. If images of potentially lesser value are really consuming a disproportionate amount of our shared resources, I could easily support the effort to cut back on them.

Is there any reason for us to believe the "controversial images" represent a noticeable resource drag? If we were to try to guess how much it costs to host the 'controversial images' we currently host -- just in terms of direct, actual hosting costs -- what fraction of our operating costs would it be?

If the answer is "noticeable", then that's good reason to justify cutting back. We don't want erotic art using more than its 'fair share' of resources-- something that's going to be even more true for video than for images.

If the answer is "infinitesimal", then that's important to know too. If it's not a practical concern, then we can focus purely on the bigger question of what values WM wants to embody.

Do we know the answers to these questions? Is there really a big problem with the sheer number of images on Commons? Or is the real problem that people are saying bad things about us because these images exist on Commons at all? --Alecmconroy 16:28, 26 September 2010 (UTC)Reply

There is a valid point here. There are practical limits to what the project can hold, and we shouldn't waste resources unnecessarily. While I've been a very strong opponent of sexual content deletions, I actually did propose one file for deletion (Commons:Commons:Deletion requests/File:Explicit wmf.OGG) because someone had simply used a program to puff up a gallery of small images into a 26 megabyte video. I should emphasize, however, that we do already have policies regarding deletion of duplicate images and images that don't add anything — these policies generally address individual items, not categories of content singled out as controversial.
Considering these images in terms of raw download bandwidth should be done with some philosophical consideration. One might argue, after all, that the image that is never downloaded is the true waste of project resources. A frequently downloaded image of female genitalia might be being abused for self-abuse ... then again, it is also certainly a topic of some very significant academic and practical curiosity. After all, American schools have a whole class about sex education. Wnt 18:34, 26 September 2010 (UTC)Reply
From looking at what's going on, I believe that the issue is more of a PR and usability one. Nothing in the study even mentions resource allocation. I would hope, if this were an issue, the feedback of our developers and system admins would've been sought as a part of the study as it was being done. Given that Commons alone hosts terabytes of content, and only a small fraction of that content fits the description of "controversial" as defined here, I doubt that resource allocation is a significant issue. These days, you can get a terabyte hard drive at retail for well under $100 -- and I sincerely doubt that WMF is purchasing hardware at retail prices! As such a major website, we could easily negotiate very competitive terms on hardware.
Given this, storage is probably not a major issue, and regardless, deletion would not help if it were—deleted content remains in the database, it just has a "deleted" flag set on it so that it is not visible to non-admins. Deleting something does not reduce our storage requirements by one bit. If it turned out that storage were a major issue, some type of "permanent deletion" mechanism would have to be made available, where something deleted is actually removed permanently and irrevocably from the database. As no such thing has been proposed, and nothing technical has been part of this study at all, I don't think resource limitations are of serious concern. Bandwidth of course is, but I would venture a guess that we use more bandwidth to service English Wikipedia requests in an hour than we do for Commons images (let alone a small subset of Commons images) in a week. I of course could be quite wrong in that, but again, if that's an issue, we need hard numbers and recommendations from our developers and server admins. Again, since nothing in this study even asked for their input, I don't imagine that bandwidth usage is a major issue here. If it is, we need to hear from the devs and admins—what is the issue, how far out of hand is it, and what do they recommend doing to get things under control? Seraphimblade 18:57, 26 September 2010 (UTC)Reply
I seriously doubt that resources are an issue; I have about 100,000 images sitting on a 500 GB drive. The issue is one of curating. Take the Blue Tits: there are these nine images (5% of the total) -- 1 2 ... 8 9 -- all the same, screencaps from the video. It is not the only redundant sequence in Blue Tits; there are a number of others. Everyone looking for Blue Tit images has to bypass those 9 and the other duplicate images. Their presence in a resource is what I meant above by "more is less". Trainspotters record every engine/locomotive/carriage that they see, plane spotters do the same with aircraft, truck spotters do the same with, say, every "Eddie Stobart" vehicle that passes. They can probably tell you that vehicle such-and-such broke down on the A214 on February 23rd 2008, or whatever -- fascinating stuff, but one hardly needs an image of it along with photos of the rest of the fleet. Same with 1000 photos of pricks: they aren't all necessary. NOTE: I was once in a village hall where someone was doing a slideshow on the history of a local company -- 90 minutes of photos of cement wagons. --62.49.31.176 19:51, 26 September 2010 (UTC)Reply
That's why we have categories and subcategories. We can drop images as far down into the tree as is needed to sort them out so that you can always find the one you're looking for. True, there is room for improvement in the category system, and improvement will be needed to help the database remain as navigable with billions and eventually trillions of images as it is with millions. But devising new features to hide or delete images only takes away from the effort spent toward that needed innovation. Wnt 07:16, 27 September 2010 (UTC)Reply
It is my belief that when one has to create a category "Eddie Stobart trucks that broke down in February 2008", one has gone too far. If Commons is to be some form of image bank, then the category system is broken. Directing everything into the most specific categories is a woeful choice. For example, if I'm looking for dragonfly pictures, I wouldn't expect to have to drill down through several taxonomy layers in order to find a photo I want. If I search for "blue damselfly", I'd expect to see more than one Azure in the list, and perhaps a few White-legged ones too. The way it is constructed at the moment, one pretty much needs to know exactly what one wants. LOL -- I just searched for "Queen Anne furniture", got poor results, so I relaxed it to just "furniture", and what do I get? An OTK spanking photo. --194.193.183.253 12:22, 27 September 2010 (UTC)Reply
I can readily imagine fixing much of the problem you describe with some sort of "best in category" feature, akin to Featured Images on a much smaller and more casual scale. There are also "article pages" on Commons that accomplish this a different way; they're essentially hand-made directories of the content (see Commons:Furniture). We should not adapt the project's mission to accommodate its software limitations, but fix the software limitations to accommodate the mission. As for furniture, I looked at both the "article page" Furniture and the Category:Furniture on Commons and didn't see any spanking at the top level. The proposed Commons:Commons:Sexual content emphasizes that (like all images on Commons) sexual content should not be "overcategorized", i.e. it should be placed in the most specific applicable category and thus not into a broad category like Furniture. Wnt 18:00, 27 September 2010 (UTC)Reply
I didn't say I'd looked in the category Furniture, but that I'd searched for "furniture". The spanking picture is delivered up because it has "furniture" in the title, and now I'm just a few clicks away from CBT images -- and all I wanted was a "Queen Anne dresser". This is the classic problem of NOT flagging content as sexual, or violent, or whatever: it leaks out and shows up where it really isn't wanted at all.
I'm not entirely sure how the search comes up with its results. But I'm also not sure that the current proposals would affect this anyway. There's a bit in the Commons sexual content proposal about naming sexual images appropriately, and perhaps there is some more specific term for that contraption than "bondage furniture", so that by renaming it appropriately it would be less likely (I assume) to come up in the search results. I don't know what you'd call it though. Wnt 22:42, 27 September 2010 (UTC)Reply

[emphasizes that (like all images on Commons) sexual content should not be "overcategorized"]

As I was saying, if the idea is that Commons is a general image bank rather than a repository for encyclopaedic images, then things should be overcategorized. Things should be categorised "blue" and "damselfly", so that someone searching for "blue damselfly" gets some Argia species too.

Sorry, I should have explained: Commons:Commons:Categories gives a technical definition of overcategorization, which is when you put an image into a category directly, when it is already present in a subcategory of that same category. Since blue and damselfly are not subsets of one another, the image can be placed in both (or it can be placed in "blue damselfly" and that category can be placed in both). Wnt 22:42, 27 September 2010 (UTC)Reply

I don't know art but I know what I like

Recommendations on “Controversial” Images

It is recommended:
[...] 5. That historical, ethnographic and art images be excluded from such a review, and be considered in virtually all cases to be in scope.

So, what qualifies as an "art" image, exactly? Must the artiste be well-known or acclaimed? Where would MET ART or analogous images fall? I smell a loophole that needs to be accounted for; one merely has to dress one's image up as "artsy" to get it approved, just as some porn once had to be dressed up as medically educational. --Cybercobra 18:18, 26 September 2010 (UTC)Reply

There's been some statement about this elsewhere, but I don't think there's any reasonable debate, in the abstract, about whether or not "educational" will include art. "Educational" can't just mean the three Rs; it has to be interpreted widely -- "the sum of all the world's knowledge" absolutely includes art. --Alecmconroy 13:47, 1 October 2010 (UTC)Reply
I think you may have misunderstood. My point is about how one defines art in order to draw the art vs. porn distinction which the study authors propose. What MET ART is may not be obvious from the name. --Cybercobra 01:46, 5 October 2010 (UTC)Reply

Western cultural and other biases showing

Setting up a wiki for children is a nice idea, and I don't think that creating a mechanism to "opt out" (if it cannot be hijacked by other organisations) is really controversial either. But the meat of the study is in recommendations 4 and 5. Here, a clear bias towards a modern, educated, Western view of the world is visible:

  • breasts, genital areas (pubis, vulva, penis) and/or buttocks: Why those? There are lots of arousing pictures of lips, or of bodies without any of the above visible. There is a non-negligible part of humanity for whom an exposed female leg or even female hair would count as sexually arousing. So why mention these three? Why not more? Or, for that matter, why not fewer? From a biological point of view, buttocks and breasts are not really sexual, so why include these? What counts as "nudity" is culturally defined, and the study suggests using one specific definition -- namely the one currently used in the West, and specifically the US -- and elevating it over others that are just as valid.
  • historical: Same problem here. Why does porn stop being stuff made to arouse once it ages a few years? Is the intent of the Marquis de Sade any less clear simply because it is some centuries old? Is a Playboy picture from the 50s any different from one from today? And who knows whether all those antique statues were not simply commissioned so some rich Roman senator could jerk off? Again, the bias is "antique is good, modern is bad", ingrained in every Western pupil since the Renaissance.
  • art: Same as above. Defining art is nigh-impossible. Are you saying that placing a chunk of fat on the ground and letting it rot is not art? Some (highly paid, famous, respected) German artist begs to differ. There simply is no objective way to define art. So in the end, whoever is to make the decision would use his personal guide to what art is (in the case of Commons editors, most likely, again, heavily biased towards the contemporary Western definition).

Why do these biases show? Because the study is trying the impossible: to objectively justify removal of images that people dislike (for various reasons). But which pictures people dislike depends 100% on their cultural upbringing; it is entirely subjective. Any attempt to pin it down will only ever succeed in pinning it down to one culture-specific set, in this case a mostly Western one (or, as it seems, one culturally specific to those who complained the loudest). --Xeeron 17:08, 28 September 2010 (UTC)Reply

1. Claiming a "Western bias" is ignoring the billion Muslims and over a billion Chinese individuals who are against porn and have laws that are far more conservative than the "Western" countries. If anything, the Western bias is to be too friendly to porn. 2. Porn is always porn and doesn't become art with age. 3. It is very easy to see what is porn and what is art. Art historians do it quite regularly. Your statement questioning arousing images of genitals as singled out shows that there is a problem within your statements. It seems like you would rather push a fringe POV no matter what and use bad arguments and misleading claims like others before. As a side note, Germany is part of the Western world if you haven't noticed. Ottava Rima (talk) 19:03, 28 September 2010 (UTC)Reply
Did you even read anything of my post apart from the header? I explicitly stated that several parts of the world have more conservative views, and that to use the three body parts mentioned, and not more, is part of the Western bias. --Xeeron 21:29, 28 September 2010 (UTC)Reply
The three body parts are the only ones universally acknowledged as part of the "sex organs", which have, in all cultures, a different connotation than any other body part. As such, your attempt to correct me falls flat. There is no mass subjectivity as you are trying to go on about. Ottava Rima (talk) 01:17, 29 September 2010 (UTC)Reply
You are contradicting yourself. First you claim that billions of Muslims and Chinese are more conservative; now you claim that these three organs are universally accepted. Apart from the contradiction, without any serious scientific backup your broad claim is just that: a claim, which I don't believe. --Xeeron 09:50, 29 September 2010 (UTC)Reply
Ottava, as you know, even "the three" are not universal, and many indigenous cultures find breasts completely innocuous. Many European countries allow exposed breasts in public or on television. It's not that every place feels that way, but to portray those body parts as universal is not quite accurate. Xeeron didn't identify the ethnographic bias, but that is the other one, aside from the ancient-bias. We tolerate nudity from the 'backwards' folk because it's 'ethnographic'. This is a pervasive kind of bias, but it is also revealing: all of those merely ethnographic people -- they don't count for real or anything -- have no problem with the parts of their body which are regularly exposed.
Xeeron did not suggest that Western bias was correct per se, so your objection about Chinese and Muslims is a straw man; he never suggested the Western way was right, merely that the Western way is not the only way. If anything, that would support your point of view. In fact, Xeeron's point is fundamentally proven by your persistent objection: different people and places do in fact have different standards about what is sexual/arousing/pornographic/taboo/controversial.
We can try to form pragmatic consensus, but not under the guise of some universal standard of decency, which is apparently not so universal. Also, all of those Muslims and Chinese, though I want them to learn everything they can, are not exactly the shining examples of freedom and discourse for which Wikipedia was created. I don't think they are the poster-children for open, collaborative values, free information, and full exposure to knowledge. It's not the fault of many of them that their cultures suppress this kind of knowledge, but it sounds like you're saying we should censor ourselves a little to accommodate the people who censor themselves a lot. I don't care for that reasoning, except for the fruitful end of getting more information to whomever is seeking it. Ocaasi 11:06, 29 September 2010 (UTC)Reply
Ocaasi, as I said to Xeeron, I will say to you -- don't try to claim words are saying something different. "Did not suggest that Western bias was correct per se"? That is disruptive. As I stated, I accused him of saying that the Western bias was wrong by saying that there is no such thing as "Western" here. This is something that, for all your words, you don't actually respond to. Furthermore, the three organs are universally known as sex organs. It is impossible to claim otherwise. The penis will always be the male sex organ and the vagina will always be the female sex organ. It is biology 101. Stop with the disruptive behavior already. Ottava Rima (talk) 19:38, 29 September 2010 (UTC)Reply
"First you claim that billions of Muslims and Chinese are more conservative, now you claim that these 3 organs are universally accepted" as pornographic. Please do not manipulate my words and claim things that are opposite. This is a serious problem with your claims and is highly inappropriate behavior. Ottava Rima (talk) 19:38, 29 September 2010 (UTC)Reply
Can you explain, then? Aren't you saying that 4 billion people, a good majority of whom are Chinese or Muslim, have stricter ideas about sexuality, and that breasts/penises/vaginas are universally sexual? You have repeated those basic points several times, so it would be good if your meaning weren't being misinterpreted. Ocaasi 69.142.154.10 23:54, 6 October 2010 (UTC)Reply

Pragmatic insight 2: if this is mostly a "public image"/PR issue, let's take a stand and use this as an opportunity to teach about intellectual freedom

Our preference for a free-content collection of the world's information is a radical idea. If it's controversial in some circles, then this is Wikimedia's chance to teach the world what free information is really like.

As discussed above, it seems to me that no one is seriously claiming arousing images are getting a disproportionate share of resources. While we speak of controversial images potentially impeding our educational mission, the damage is seen as indirect: harming the 'reputation' of the projects via negative press attention.

But, so long as it's not an excessive resource drain, I don't think we can seriously consider deleting any images just because they are arousing or pornographic. As long as we are sure that an image is 100% legal, by both the letter and the spirit of the laws, and as long as we are sure the image is Creative Commons Share-Alike licensed, then we should probably keep it.

An image bank is an image bank. Inclusion in an image bank does not reflect any value judgment about the quality or morality of the images it hosts. The purpose of Wikimedia is to help nurture a free-content image-collection community. Deleting will almost never help advance our mission -- because ANY picture is more educational than no picture at all.

Instead, after all this thinking, I believe FT2 has said it best above. Enforce standards to avoid unnecessary resource use by screening for redundancy, ultra-low-quality, nonsense, etc. And then just accept that Commons, like the internet, will have porn images on it, and that's okay.

The sky will not fall in; armed mobs will not assemble and plot to burn copies of Wikipedia. Everything will keep on running just like it has been. Our volunteers and our editors both know we have 'controversial' images, and both groups are okay with it. People will keep donating and editing.

We don't have to impose Sharia law or Christian Puritanism on Wikipedia. Right now, there's a lot of comedy in the US over a Senate candidate who once described masturbation as "sinful", while others say that's a "fringe" view. And it occurs to me that, beneath it all, we're having that same debate here. Everyone's gotten all in a tizzy over the possibility that someone, somewhere, might be using images they found in Wikimedia to masturbate, and that if Wikimedia is in any way connected to potential masturbation, we won't be taken seriously and it will damage 'our reputation'.

We don't need to be stressed over this. Society has made its peace with the fact that the internet has 'controversial' images on it. Our readers know that Wikimedia projects sometimes have 'controversial' images, and they love us for it. Our donors know that Wikimedia projects sometimes have 'controversial' images, and they paid us to keep it that way.

The American Library Association is sponsoring Banned Books Week, in which public libraries celebrate living in a time and place with First Amendment rights. Wikimedia will not start banning books from the planet's free-content library -- and that's something for us to be proud of.

By all means, enforce legality with an iron fist. Sure, implement all manner of voluntary flagging and self-filtering. But don't cop out on the basic principle: we don't delete legal content just because it's controversial, period. --Alecmconroy 20:20, 29 September 2010 (UTC)Reply

We are not about promoting what is free, but about promoting a free encyclopedia, free educational resources, etc. This is not what you are saying above. Instead, you are positing a political agenda and then assuming that the WMF must cater to it. We are not a mouthpiece for "free culture". Encyclopedias have standards. Yes, there is a resource drain in having thousands of inappropriate and useless images that get us blacklisted by hundreds of scores and many countries. Our donors are donating for an encyclopedia, which implies encyclopedic standards. We do not keep every crappy page someone ever comes up with, but strive to have the best. Your philosophy would destroy anything credible, which is why it is a fringe POV that has been crushed by reasonable people time and time again. Wikipedia is not an anarchy. Ottava Rima (talk) 00:36, 30 September 2010 (UTC)Reply
P.S., I don't know of any credible library that has shelves full of pornography, nor do I know of any that would keep such material in a place available to children. The banned books, for the most part, were legitimate stories banned for silly reasons. There is no "pornography is great" celebration by libraries. They have too much self-respect for such things. Ottava Rima (talk) 00:37, 30 September 2010 (UTC)Reply

@ Ottava Rima: I'm only asking, but could you please list all the countries (with sources) and some "scores" (educational establishments?) that block Wikimedia projects solely because of "thousands of inappropriate and useless images that get us blacklisted by hundreds of scores and many countries"?

About not knowing "any credible library that has shelves full of pornography": have you ever heard of Legal Deposit, meaning that in some countries everything that is published (even porn) goes to at least the national library, if not some other libraries, where it is available to patrons? See for example what happened with the Swedish National Library, which, after putting some magazines containing child pornography out of public access (even if possibly still accessible to researchers), said in a statement that its collection of "regular" pornography "should once again be available for viewing sometime this autumn" -- and, might I add, to every one of its patrons (they must be 18 or older). So there are credible libraries that have shelves full of pornography, like the Kinsey Institute for Research in Sex, Gender, and Reproduction at Indiana University, possibly home to the world's largest pornography collection.

Also, please read this history about the NYPL, especially the part where Caroline Oyama, manager of public relations at the NYPL, says that the library "does not ask adult patrons to leave, stop what they are doing or move to another computer if another patron doesn't like the Web site he or she is viewing. Instead, we make every attempt to move the user who is offended to another computer where he or she doesn't have to see what the other person is viewing." What are your comments about this policy of the NYPL? Thank you. Tm 04:39, 30 September 2010 (UTC)Reply

If you have ever been to the Library of Congress, you would quickly find out that you can't just ask for porn and skim through their collection. Any national collection has all of the various materials for copyright reasons, not for easy access and availability. And having to stoop to using Sweden as an example shows that no credible library system bothers with such material and only fringe groups do. Your reliance on people using computers, as somehow equivalent to library holdings, is equally baffling and an admission that there is no argument. Legitimate libraries bother with legitimate works. They aren't porn shops. Academia isn't a porn shop. Wikipedia and the rest are based on being educational, and thus part of academia. They have standards, basic requirements, and the rest. This is not the anarchist free-for-all the fringe wants to turn it into. Ottava Rima (talk) 04:49, 30 September 2010 (UTC)Reply
lol. Promoting a "Creative Commons culture" isn't tantamount to anarchy -- it's part of our mission. But given how inexpensive the 'controversial' images are estimated to be in terms of resources, maybe it makes more sense to start looking at this from an "Internet Archive" perspective, where mere inclusion in the archive doesn't imply that the archive endorses the opinions expressed in the material it hosts.
Since Commons _has_ to be an image bank for technical reasons -- accepting any image that any project wants to use -- maybe the 'practical' way out is to embrace the role of a content-neutral, free-licensed image archive. --Alecmconroy 09:06, 30 September 2010 (UTC)Reply
We were around before Creative Commons. We also followed the GFDL long before we ever adopted CC. And resources include time and respected individuals, and the porn cost us a lot of both. Furthermore, Commons doesn't accept non-freely-licensed images, so it does not "accept any image that any project wants to use". Your understanding of Wiki, history, etc., is severely lacking. Ottava Rima (talk) 14:38, 30 September 2010 (UTC)Reply
As encyclopedists we are heir to the tradition of Diderot and support free inquiry and free access to information for the purpose of advancing human freedom and knowledge; we oppose ignorance, superstition, and tyranny over the human mind, and uphold the dignity and worth of the individual. But none of this has anything to do with hardcore pornography. Hardcore pornography is a profitable industrial business which addicts, enslaves the mind, degrades the dignity of the individual, damages the healthy development of young people, and in particular has both the intent and the effect of normalizing the degradation of women. Why should we support that? And please do not conflate this with Sharia or whatever. I am a Unitarian and a Social Democrat, and so are many others who await your consignment to the dustbin of history. Herostratus 18:27, 30 September 2010 (UTC)Reply
Britannica isn't a porno collection, so there is no argument that can be made that the porn is encyclopedic. Diderot also did not support large collections of porn. You are a POV pusher from a fringe group that is the reason why the Netherlands has constant riots and protests. People in your own country are tired of your anarchist ways, and the last refuge you have is an attempt to take over Wiki. Ottava Rima (talk) 20:15, 30 September 2010 (UTC)Reply

I think we may be talking at cross-purposes here due to confusion over who is being addressed. I was responding to the original poster in this section. Herostratus 04:22, 1 October 2010 (UTC)Reply

Sorry, I didn't mean to make it a response to yours, but a follow-up. Ottava Rima (talk) 14:37, 1 October 2010 (UTC)Reply
Wikipedia covers a lot of things you won't find in Britannica.
And as for Diderot, he started off writing erotica[4] which "mixes philosophy, pornography, comedy and social satire in a way that is still startling"[5] and in Le Reve de D'Alembert he wrote of homosexuality with "sympathy and understanding", to "strike an astonishingly modern note in his plea for common sense and tolerance".[6] Wnt 01:31, 18 November 2010 (UTC)Reply
Under US law (anti-voyeur and anti-upskirt laws at both the state and federal level), legitimate pornography hosting requires you to have a submitted statement that the individual in the image gave consent for the image. This allows anyone whose image is stolen to sue and to make the uploader criminally liable. An additional requirement of proof that the models are over the age of 18, as per US law, would also be a good match. It is also a matter for which the WMF could theoretically be liable (regardless of a lawyer's opinion) if it does not require this, since interstate transmission of such pornography (upskirt images and child pornography) is against federal law, thus voiding Section 230 immunity, and it would be gross negligence on the WMF's part not to have a preventive measure in place. It would seem that this would be one of the few basic steps that would be necessary to adopt. OTRS can easily handle such releases. Ottava Rima (talk) 14:44, 30 September 2010 (UTC)Reply

These all sound reasonable. But, do you really think the WMF legal counsel hasn't considered those issues? Mr. Godwin seems pretty up on things. Did they just overlook this, or are you suggesting things which are already covered by policy? Ocaasi 17:32, 30 September 2010 (UTC)Reply
This has been a major point of contention at Commons:Commons:Sexual content. First, bear in mind that consent cannot possibly be obtained for previously published photos, and sometimes was entirely lacking for such photos, e.g. Commons:Category:Abu Ghraib prisoner abuse. Next, consider that the meat of the proposal is to require that for every existing self-posted sexual image, the uploader has to be tracked down and forced to submit a little blurb saying the content really has consent. But invasion of privacy is already illegal, and illegal images are already against WMF policy. Lying on the little form is also against WMF policy, but lacks any serious legal implication. So requiring a "statement of consent" from the photo uploader is just a ruse to dump content by people who aren't watching Commons.
It is true that the current upload screen may need a stronger warning than "Compromising or embarrassing images of non-public people taken without their knowledge are often problematic. Use good judgment." But that's just a courtesy warning for uploaders — WMF is not their lawyer.
There is a separate discussion of the "w:2257" regulations that has occurred at the above page and several other places. The consensus seems to be that these are regulations affecting commercial porn producers, and that while the bill was written to be as ambiguous as humanly achievable, it's not being used seriously as an attempt to censor every naked image on the Internet via crushing paperwork, however much certain elements might want to use it that way. The EFF is on the case.[7] Personally, I think the WMF belongs in court over this law -- as a plaintiff. Wnt 17:58, 30 September 2010 (UTC)Reply
The law makes it clear that the images must focus on the genitals, so your use of Abu Ghraib is silly, unless you are saying that the WMF must host uncensored genitalia of prisoners, which is a BLP violation. You do realize that BLP is one of our most important and WMF-supported policies, right? The WMF, by definition, is a distributor and would be violating federal law through distribution. This is why Yahoo and Craigslist were both forced to shut down the parts of their companies that were allowing illegal actions. Your attempt to say that regulating age and verifying consent is "censorship" is disruptive and ridiculous. Ottava Rima (talk) 20:18, 30 September 2010 (UTC)Reply
Photos like File:Abu ghraib nakedhang 06.jpg do include uncensored genitalia, as do many of the copies on the Web. They are reproduced on quality search engines and deeper-delving news sources. Wikimedia should stock the genuine article. BLP is supposed to be a policy against unverifiable negative information, not a ban on saying anything bad about people — though admittedly I know it's not often interpreted that way. As for political ploys about text ads for prostitution on a personals site... this has to do with us how? The only connection I see is you people shoot one hole in the First Amendment and you dance around singing that it's dead. I'm not willing to give it up for dead and I don't want any more holes shot in it. Wnt 21:03, 30 September 2010 (UTC)Reply
And you just linked to a major BLP violation, making you a violator of BLP. You do not have the right to distribute such negative images like that of living people. By the way, all other images used on Wikipedia blur out the genitals specifically because of the BLP violation. Ottava Rima (talk) 22:56, 30 September 2010 (UTC)Reply
I reiterate that it is never supposed to be a BLP violation to tell the truth and report the facts. This is an encyclopedia, not make-believe. Wnt 01:58, 1 October 2010 (UTC)Reply
"Facts" are not enough when it comes to BLP. Fringe material, lack of objective portrayal, and the rest are involved when it comes to BLP. Presentation matters, and putting forth naked pictures of people without consent is not only against the law (per distributing illegal images) but also damages people's reputations. Most subjects don't even come close to notable, so there would be no justification at all. This is just a fraud to put forth as much porn as possible without any regulations. Ottava Rima (talk) 03:01, 1 October 2010 (UTC)Reply
Let's be clear, facts are enough to reveal negative information about a living person--so long as they are sourced and not a privacy violation. This case arguably is a privacy violation, and only on those grounds might Wikipedia blur the genitals in the image. Also, many of the faces in the abuse images are already obstructed or covered, so it's not necessarily a BLP issue. Do you give any consideration to the availability of those images through other very reputable sources like major newspapers? Ocaasi 69.142.154.10 23:35, 6 October 2010 (UTC)Reply

Request for clarification/expansion re sacred images

Robert, thank you for your work. I wonder if you (or others) can provide an expansion. You said "In effect, our surmise is that only three categories of images can currently pass this test [of being controversial] -- sexual images, violent images, and certain images considered sacred by one spiritual tradition or another." By "certain images considered sacred" do you mean certain images considered sacrilegious (which I guess is actually the opposite meaning)? Can you give some examples of what you have in mind? Are you mainly just thinking of portraits of Muhammad? What are some examples of images that we host or have hosted where this stricture would apply vis-a-vis Christianity? Judaism? Hinduism? Buddhism? Islam? Herostratus 04:57, 1 October 2010 (UTC)Reply

Your "sacred", "sacrilegious" distinction basically lies at the heart of the issue (we talk about this more in Part 3, which should be posted momentarily.) As far as I know, other than the images of Muhummad controversy, this has been an issue in Wikipedia two other times -- once in regards to the Sacred Temple Garments of Mormons and their depiction, and the other surrounding the inclusion of a picture of the founder of Baha'i on the page devoted to that religion. (Followers of Baha'i don't believe in the public exhibition of his photo) Robertmharris 11:45, 1 October 2010 (UTC)Reply

"User-controlled viewing options" Part II

Robert, interesting and thought-provoking work. This is what I come away with, stripping the details and secondary efforts. You are proposing that:

  1. The people at Commons are to be somehow persuaded (through what magic I don't know) to tag some images as controversial.
  2. And the various Wikipedias etc. will implement a filter. (Presumably this will be easier to do since, I suppose, the Foundation has some degree of leverage and/or good working relationship with the software creators.)

And there was mention of a WikiJunior. WikiJunior may be a fine thing, but it will not be a significant player in addressing these problems, for three reasons:

  1. A main (or maybe the main) entry into Wikipedia is through Google. Wikipedia articles rise to the top, and WikiJunior articles will not, for a long time if ever.
  2. Wikipedia is pretty famous, and a second main entry into Wikipedia is by people having heard of it and going directly there to search, I guess. WikiJunior will not become as famous for a long time, if ever.
  3. Kids do not like to be told to use the kiddie toys, and "it's for your own good" is not a notoriously successful way to get kids to do something voluntarily.

As for the tagging/filter, this seems a reasonable way to deal with the situation. I mean, people who don't like it can turn it off, so it seems the least objectionable way to address the issue. There is one very important point about the filter: it must be opt-out (that is, on by default). If it's not opt-out, then it's not just useless, it's worse than useless, because it provides the illusion of a solution.

Regardless of the filter setting, I'm confident that well over 90% of the people using Wikipedia will be unaware of the existence of the filter. If you add a large message "FILTER EXISTS, CLICK HERE TO CHANGE SETTINGS" to every page, I'll drop that to 80%, but only if it's blinking red. Really. Please do not overestimate how savvy or interested the average person is, or how aware of the details of how their computer programs work. (Think of the "flashing 12" -- and there were many, many flashing-12 households, easily the majority, I would guess. It's not stupidity; it's more complicated than that.)

You will have the devil's time convincing many Wikimedians of this, for a couple of reasons:

  1. They skew highly technically literate, and don't really connect with the mindset of people who aren't.
  2. It's more convenient for them not to believe it.

Also, if the filter is opt-in, how are you going to deal with the many (probably the majority) of our readers who come in as anonymous read-only users from search engines (thereby skipping the main page)? Plus, how are you going to get kids to set the filter on for their own good?

I don't know why there would be much objection to this. Anybody who is able to edit this page to reply is perfectly capable of turning the filter off, so what's your beef? The principle of least harm indicates an opt-out filter. Herostratus 06:06, 1 October 2010 (UTC)Reply

You raise many interesting points. Let me respond to just a couple. First off, Commons editors haven't tagged any images as controversial, but they have organized them into Categories, and while there may be resistance to the notion that the Category "Photographs of people having Intercourse" may be controversial, I would hope that there might be some consensus that it could be labeled "sexual". Secondly, in terms of your opt-in, opt-out notion, we have recommended an opt-in filter, for several reasons, but one of which has to do with your "Flashing-12" phenomenon. I meet many people (I was one of them myself) who are shocked to learn that the default setting for Google Search is "moderate" and that, therefore, their Google Image searches have been "censored" for them without their knowledge or approval. This is what we are trying to avoid with our recommendation of an opt-in filter. I don't disagree with you that many people will be unaware of its presence; but we are trying to serve those people who are actively looking for such a solution. Up to now, to someone who has said to us, "I want to choose what I, or my child sees, on Wikimedia projects", we've basically said -- "we won't let you." With our recommended filters, we will be able to say -- "Do this". It's not how many people use it, as much as providing an option for all those who actively seek such a possibility. Robertmharris 12:11, 1 October 2010 (UTC)Reply
So, as I think you know -- I'm heavy in the "deletion bad, voluntary filtration good" camp. Out of curiosity, what happens when there is disagreement about whether an image is to be considered "controversial"? If we set a high bar for being considered "controversial", then we risk offending people with mislabeled images. Alternatively, if we filter too many, the filter will render the site useless. So, this setup will lead to a new source of debate -- whether an image should bear the label "controversial". Artists won't want their images filtered, etc. Whenever someone disagrees with how we've labelled an image, they'll try to make hay about us "approving or disapproving" of opinions expressed in the image.
We could try to do it through consensus, but do we really want to be having conversations about controversialness ratings on a regular basis? Do we want to stand behind a group process that will, at times, reach inconsistent conclusions-- why is THIS controversial and not THAT? etc etc.
What if, instead, there were a completely automated process -- "click whatever you think should be filtered" -- and people could then choose what to hide or show automatically, based on how many readers have objected?
The advantage of using an automated process to resolve what is or isn't controversial is that it's truly culturally neutral, which is important to us. And since it's an automated process, rather than a human-mediated decision, we can never be accused of being too strict or too lenient -- we're just a mathematical reflection of what people have flagged as controversial. Most importantly, we wouldn't waste time arguing over what is and isn't a "badimage" anymore -- let the readers who want filtered images flag things on their own; don't make us argue about it on a case-by-case basis, only to get criticized by all sides.
If you use filter mode, you can flag images. Every account can set a "filter based on how often the image gets flagged" or even a "filter based on a badimages list available at some unaffiliated website". That is, to build a filtration system, maximize end-user personalization and don't make people argue on the merits about "controversialness" -- it will just lead to fights. Let people share their opinions on controversialness, and use those opinions in aggregate to filter. --Alecmconroy 14:10, 1 October 2010 (UTC)Reply
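As an illustration of the aggregate-flag idea, here is a minimal sketch in TypeScript, assuming a purely hypothetical setup: the ImageFlags record, the per-user threshold, and the shouldHide helper are all invented for this sketch and are not an existing MediaWiki feature.

```typescript
// Hypothetical sketch of aggregate, reader-driven filtering.
// All names here are invented for illustration; this is not an
// existing MediaWiki API.

interface ImageFlags {
  views: number; // how many readers have seen the image
  flags: number; // how many of them clicked "I'd filter this"
}

interface UserFilterPrefs {
  enabled: boolean;
  threshold: number; // hide if at least this fraction of viewers flagged it
}

// Decide, per reader, whether to shutter an image -- no editor ever
// has to rule on "controversialness"; the site only stores raw counts.
function shouldHide(img: ImageFlags, prefs: UserFilterPrefs): boolean {
  if (!prefs.enabled || img.views === 0) return false;
  return img.flags / img.views >= prefs.threshold;
}

// Example: a reader who filters anything flagged by half its viewers.
const prefs: UserFilterPrefs = { enabled: true, threshold: 0.5 };
console.log(shouldHide({ views: 200, flags: 130 }, prefs)); // true
console.log(shouldHide({ views: 200, flags: 20 }, prefs));  // false
```

The design point is that the decision to hide happens entirely in each reader's own settings; the project itself records only raw flag counts.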
You raise an interesting option, one that has been expressed to us before. Let me address your comments in a couple of ways. First, to leap to the future, one assumes that if some version of these recommendations were to be accepted (by the Board and the community both), a process of constructing the filters would begin where the questions you raise could and would be addressed. Secondly, let me suggest that we are not as far apart as you might think. We both, I think, are seeking an objective, rather than a subjective, manner of determining the meaning of "controversial". Your "automated" way, if I read you correctly, is merely a technical way of saying that "controversial" will be defined by what images people actually choose to manage -- "let people share their opinions on controversialness, and use those opinions in aggregate to filter," as you put it.
In effect, I would say that's more or less what we've recommended, although it might seem quite different. We have said that the designation "controversial" should be applied to images when there is demonstrable public debate over such images -- through laws being applied to their distribution, studies and reports being written about them, or discussions about them in widespread media forums. To us, that's just another way of determining "people sharing their opinions on controversialness" -- maybe yours is more exact and scientific, but both are very much the same, we would argue. The reason we chose our way is that we were concerned that we not create a system that was too complex for the average Internet user, the kind of person who might want to avail themselves of the service the filter represents. But perhaps there's no reason to suggest that your system would be difficult to use. And, by the way, we are just as concerned as you are about setting the bar too high, or too low. That's why we specifically exempted art works from this whole regime, for example, and I guess it speaks to another one of our concerns about a completely user-defined set of "controversial" images. I know Wikimedians value freedom -- it is a defining characteristic of the projects and the community -- but do we really want to give people the right to label as controversial Goya's "Naked Maja" or "The Birth of Venus"? Personally -- speaking personally now -- I don't think the community should give people that right, not for themselves, and certainly not for someone else, like their children or their students (maybe your suggested system would prevent that; I'm not trying to put ideas into your proposal that aren't there). Our view, reflected in the study, is that the projects are dedicated to openness and freedom of information, and that therefore there are certain limits we should place on what is filterable within the projects, even if that limitation takes away some freedom from our audiences to completely choose what they may or may not view. I guess it's a way of saying that, for us, freedom is an ultimate value in the Wikimedia universe, but it's the freedom to see things and be educated, not the freedom not to see things and be uneducated. Robertmharris 15:18, 1 October 2010 (UTC)Reply
Sigh. As I say, an opt-in system will only make things worse. As someone who occasionally advocates on the English Wikipedia for the redaction of at least those images which are most harmful to young people, this will move my task from "extremely difficult" to "utterly impossible", as the AfterAllTheyCanAlwaysOptOut argument will swamp all other discussion. In fact, with AfterAllTheyCanAlwaysOptOut, all bets will be off as to image content, and I expect the introduction of images even worse than the ones we have. Thanks a lot. Pending future discussion, my inclination is to oppose any filter system at all. Herostratus 18:11, 1 October 2010 (UTC)Reply
I know -- I have negative feelings about proposed deletions, but the filtration you propose is very close.
And yeah, one of the benefits of an automated system is that it's culture-neutral, so that people are just as free to censor Goya as they are Muhammad or Playboy. In this way, Wikipedia would remain 'neutral' in the overall debate about what should be filtered and what shouldn't be. Maybe other people won't share my priorities, but I would prefer a content-neutral and culturally neutral filter system.
My concern with a single, monolithic filtration scheme is that it leads to us having to spend time deciding what that system should be. We won't agree with each other, and outside parties will always find fault with our classification.
For example, the first big debate we'll have is that the people who find sexual images controversial are going to find homosexual images more controversial than heterosexual ones. Other people are going to think that it's wrong for us to discriminate. If we have to decide, through discussion, what's controversial and not controversial, then Commons will develop, in essence, an 'official list' of what's controversial and what's not -- and every conservative with a blog will just keep attacking us all the same whenever our values don't mirror theirs.
The ideal solution would be for third parties to rate our images, and each visitor could select the screener that most meets their values.
But when someone comes to us and says "why is this image controversial and filtered, while that image is unfiltered", the answer can't have anything to do with the actual content of the images or our opinions about them -- if you think people are mad that we host stuff, wait till you see how mad they get when we take something they find controversial and don't list it as controversial. A consensus-based in-house filtering system could create more problems than it solves. Everyone has their own ideas on what should be filtered and what shouldn't be, and every person loves to argue endlessly about how their particular menu of taboo information is the correct one.
Right now, Wikipedia articles are so successful because the editors put aside their philosophical, religious, and moral differences and try to focus just on making a good article. But for the community at large to decide controversialness through debate -- I can't imagine it will be pretty, or a good use of our time.
Still-- filtration is better than deletion. --Alecmconroy 19:19, 1 October 2010 (UTC)Reply
This is getting at the crux of the difficulty. Once you open up this box, everyone will want to take out their favorite controversy. Which means that you either create overly broad categories (like "all photographic images with exposed nipples, genitals, or anuses"), or you leave Commons open to continual debate (e.g. in the case of the category 'sexual content').
Also, Robert asked about censoring the Birth of Venus, and said that he personally would never want users to have that option. Well, in fairness, what if I think users should never have the option to censor what an actual human body looks like? Or, on the other side, what if someone thinks that your Western art is just more puerile trash that corrupts a sacred culture, Birth of Venus included? In other words, this situation is phenomenally complicated and ripe for controversy, and you have to make sure that whatever suggestions you make, you ask the simple question: will this new regime -- pros and cons -- be better or worse than the current one, with all of its issues? Identify which problems and solutions each situation has, measure the change between them, and do some heavy and subjective public calculation. Ocaasi 08:30, 2 October 2010 (UTC)Reply

Comment on the above

I would emphasize that filtering has the potential to -- and may well -- raise anger and add fuel rather than diminish it. It needs extremely careful thought about how people in the real world will respond.

  1. People will have strong views on what should, and should not, be in filtered categories. The strength of views in POV wars shows how divided people generally are on matters of opinion and belief. There will be people who feel that filtering shows we admit we're wrong but have done a token or half-hearted job to try and cover our asses (LookAtAllTheseImagesWhichAren'tFiltered), or who will increase their demands (OtherStuffIsn'tYetFiltered). There will be people who feel we are categorizing material that we shouldn't (ThisIsLudicrousAndCensorship).
  2. Different cultures have different views, and more conservative cultures may feel we haven't done a proper job. Rather than the present position of grudgingly acknowledging over time that Wikimedia has a non-censorship stance (a valuable learning that will gradually percolate into general knowledge), they can now feel real anger that we agree we should censor but aren't doing so, or aren't doing so properly.
  3. There will be edge cases, and more debates and anger over them, since we are now "fighting for the decision to include or exclude from a filter", which adds impetus and motivation to argue (think of the Virgin Killer debate for an example -- art, historical imagery, or child porn?).
  4. The category overlap cases have the potential to turn a theoretically simple idea into a nightmare. Classical art with nudes? Pop art with nudes? Book and album covers with nudes? Historical, artistic and medical images of children? Images with incidental "controversial content" that isn't their main focus being tagged as a "controversial image"?
  5. Those who want more censorship will be strengthened, not weakened. They will have a "see, we were right; if we keep fighting we'll get the rest too" belief in their pocket. Look at the Internet Watch Foundation case -- because we stood firm and argued for our side, the general internet blogosphere of public opinion was broadly positive, and the IWF backed down instead. Had we not done so, they are on record as saying they would have targeted Amazon next. So much for appeasement and placation. When others will not compromise and will take any concession as a sign of weakness or of future ground to be given away, one must be extremely careful about giving any ground away at all, and about the manner of doing so. If on principle some ground should be given, it must be hammered home hard that it is from principle that the line is now drawn where it is.

All in all, very careful thought is needed here. Before anything is done, people may agree or not, but they know where we stand, and it's a straightforward position (we broadly keep images that are felt to have value by the community, delete those that don't, and those we keep aren't censored). If filtering is undertaken, then we change that position. We will have admitted (it will be felt) that some of our content is not appropriate, and the battle will be on over how much. Once we make that admission (if we do), we cannot take it back.

As a corollary, if anything is done, it must be done strongly and with a very strong (and very positively backed) line stated and drawn.

Ideally it would make no concession, but merely state that, as part of our current development work, while Wikipedia remains uncensored, we appreciate that a number of users wish to have control, and tools are now available to prevent the display of images in certain categories for those who would like them; see <how-to link>. FT2 (Talk | email) 13:12, 2 October 2010 (UTC)Reply

On the topic of filtering, a lot depends on how exactly the filter is implemented. I'll sketch 3 methods, of which I find the third to be the best. I'll assume that all filters work by filtering categories.
  1. One filter, determined by consensus
    This basically means editors come together on the talk page and discuss whether a picture should be filtered or not. Very simple to implement, and it uses consensus. However, the problems are obvious: there will always be disagreement about some picture. So, in the end, such a filter will always be too strong for some and not strong enough for others.
  2. Automatic filtering
    E.g. by having people rate pictures and using an individual "if more than 50% of viewers found this controversial, don't display" setting. This gets rid of all the debate and employs the "wisdom of the masses" instead of relying on a small group of editors. However, it does not address the fact that different groups might find different matters controversial. So, no matter how you set your individual level, you would always have some stuff filtered that you'd rather see, and some stuff not filtered that you'd rather not see. Oh, and I shudder to think what a coordinated group of vandals (think 4chan) could turn any automated system into...
  3. External Lists
    My favorite. Enable every editor to point to a list hosted somewhere that says "don't display any picture in these x, y, z categories". Everyone can make their own list (or, more realistically, point to the list of a trusted group with like-minded ideas). All controversy disappears: you think your list is not strong enough? Switch to another. Still don't like it? Make your own. (See the sketch below.)
As a side note: all the negative points about method 1 automatically apply to any "opt-out" filter. Therefore I strongly prefer "opt-in". --Xeeron 10:35, 7 October 2010 (UTC)Reply
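To make the external-lists option concrete, here is a minimal sketch in TypeScript. Everything in it is an assumption for illustration -- the list URL, the JSON shape, and the function names are invented, not an existing interface:

```typescript
// Hypothetical sketch of the "external lists" option (method 3).
// The endpoint, JSON shape, and category names are invented.

interface FilterList {
  name: string;
  blockedCategories: string[]; // e.g. ["Nudity", "Graphic violence"]
}

// Fetch a filter list the reader has chosen to trust.
async function loadFilterList(url: string): Promise<FilterList> {
  const resp = await fetch(url);
  if (!resp.ok) throw new Error(`Could not load filter list: ${resp.status}`);
  return (await resp.json()) as FilterList;
}

// An image is shuttered if any of its categories appears on the
// reader's chosen list; the project never decides what is
// "controversial" -- only the list maintainer does.
function isShuttered(imageCategories: string[], list: FilterList): boolean {
  return imageCategories.some((c) => list.blockedCategories.includes(c));
}

// Usage: point at a list maintained by a group you trust.
// loadFilterList("https://example.org/my-groups-list.json")
//   .then((list) => console.log(isShuttered(["Nudity"], list)));
```

Note how all editorial judgment lives in the external list: switching lists, or publishing a new one, requires no on-wiki debate at all.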

Recommendations discussion

It might be helpful to have discussion focused recommendation by recommendation here. Summaries of the points made above, listed recommendation by recommendation, would be awesome too. -- phoebe | talk 03:50, 15 October 2010 (UTC)Reply

1: no changes be made to current editing and/or filtering regimes surrounding text

  • Agree with this. Text should never be filtered. If you go to an article on anal sex, you should expect to read about, well, exactly what it says on the tin. If you'd rather not read about it, why in the world would you go to such an article? Seraphimblade 04:35, 15 October 2010 (UTC)Reply

2: no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children

  • Agree with this. "Wikipedia contains many different images, some of which are considered objectionable or offensive by some readers. For example, some articles contain graphical depictions of violence, or depictions of human anatomy." Wikimedia projects are not intended to be for the use of children. Seraphimblade 04:35, 15 October 2010 (UTC)Reply

3: creating Wikijunior

  • Fine with this. If we want to create a project which is explicitly a project for the use of children, we can establish appropriate rules there. However, the general Wikimedia projects should remain uncensored. Seraphimblade 04:36, 15 October 2010 (UTC)Reply
  • Creating Wikijunior sounds nice, but when the time comes to make it, whose notions of appropriateness will prevail? For example, will it be an anti-gay Wikijunior or an anti-Christian Wikijunior? There's a risk that squabbles breaking out there will harm other projects. Wnt 12:01, 17 November 2010 (UTC)Reply

4: review images of nudity on commons

  • Disagree with any "special" review. Any editor on Commons can review any image they like, but nudity or sexual images should not be subject to special criteria (either by rule or in fact) that other images are not expected to meet, either individually or in aggregate. If we can have 5000 images of a butterfly species, we shouldn't argue against 5000 images of a human sex organ. Seraphimblade 04:35, 15 October 2010 (UTC)Reply
  • Haven't most of these images already been subjected to deletion reviews? The community decision-making process already used should not be overruled. Wnt 12:01, 17 November 2010 (UTC)Reply

5: historical, ethnographic and art images be excluded from review

  • Since I don't support any "special" review at all, this point is moot for me. However, if a "special" review were to be decided upon, any image which is even possibly in these categories should be excluded from it. Seraphimblade 04:35, 15 October 2010 (UTC)Reply
  • To avoid opening more cans of worms, the term "ethnographic" must be considered in a non-racist way, to be as applicable to images of whites as it is to images of other races. Wnt 12:01, 17 November 2010 (UTC)Reply

6: active curation within controversial categories

  • "Curation" is a vague term here. If it means "removal", then no. If it means "appropriate categorization and description", then that should be done for all images. Seraphimblade 04:35, 15 October 2010 (UTC)Reply
  • I agree that "curation" must not mean censorship. If what you mean is that someone who knows the category system makes sure that a naked girl spraying herself with whipped cream doesn't go directly under "People eating" but goes to a sexual fetishism category, then that is sensible enough. Wnt 12:01, 17 November 2010 (UTC)Reply

7: user-selected shuttered images (NSFW button)

  • Fully agreed. While we should not censor ourselves by not offering free images of any given thing, we should also not force any user to view an image they do not wish to see. Users should be allowed to "shutter" images by category. Seraphimblade 04:35, 15 October 2010 (UTC)Reply
Agreed, but only if this is outside software, neither developed by the Foundation nor integrated into any Foundation project; our attitude to this should be purely passive, and we should neither assist nor hamper it, any more than we would any other legal outside program that used our data.
Agreed, though I would marginally prefer a one-time, user-account-linked opt-in to viewing the relevant content, per the defaults at Google, Flickr, etc. --JN466 02:13, 11 November 2010 (UTC)Reply

8: no image be permanently denied with such shuttering

  • Fully agreed. It is not our job to try and be a "Net Nanny". Users can install filtering software at their end if they so desire. Our job is to try and prevent the user from viewing content that the user does not want to see, not content that someone else does not want them to see. Seraphimblade 04:35, 15 October 2010 (UTC)Reply

9: allow registered users the easy ability to set viewing preferences

  • Somewhat agreed, but not enough. Both registered and anonymous users should have the ability to set viewing preferences. This could be easily enough implemented through cookies or the like. Again, we want to offer relevant images in all cases, but never force someone to view them. Seraphimblade 04:35, 15 October 2010 (UTC)Reply

10: tagging regimes that would allow third-parties to filter Wikimedia content be restricted

  • Largely irrelevant. Third parties could filter by category anyway. Seraphimblade 04:35, 15 October 2010 (UTC)Reply
  • Such a regime must not involve people arguing on Wikipedia to decide what ends up being "filtered" by the third party. That can involve using serial numbers only, interpreted by third party software by its own standards, or it can involve lists of files in userspace not subject to argument by outsiders; but it must not involve us deciding here what image goes in the PG-13 category. Wnt 12:01, 17 November 2010 (UTC)Reply

11: principle of least astonishment to be codified

  • It already is. But since "codified" seems to instead mean "implemented", I think it depends on what it's to mean. A reader should not be astonished to go to an article about a sex act and find information, including images and media, about that sex act, nor to the article about the Jyllands-Posten cartoon controversy and find the controversial cartoons there. They would reasonably be astonished to go to the article on George Washington and find media depicting sex (even if there were media of George Washington engaged in sex). If that's what we mean, sure. If we mean "We must never risk shocking anyone anywhere", absolutely not. Seraphimblade 04:36, 15 October 2010 (UTC)Reply
  • This is a generally unworkable idea. Some people are astonished to find pictures of gonorrhea under gonorrhea.[8] No one will agree on what to expect, and it will just lead to endless arguments. Wnt 12:01, 17 November 2010 (UTC)Reply

fully support the Study

A general problem on Wiki projects is that a well-thought-out contribution attracts many critics but gets apparently little support, because other supporters either agree with only 90% of what was proposed, or feel a "me too" comment that doesn't provide further argument would be a waste of space. In this case I am concerned that the Study authors may not be familiar with this dynamic and may conclude that they have no support. That's not true. There is something of a populism problem here, in that just as public policy experts have more influence in representative democracies than in direct democracies, the authors here will be better received by the Foundation than by Wikipedians at large. --Bdell555 22:45, 21 October 2010 (UTC)Reply

Serving the reader

I support allowing readers to choose for themselves whether to view NSFW content, whether through collapsible boxes defaulted to a closed state or through a link at the top of an article leading to a text-only version of it.

The question is whether we should be serving readers or putting up the content that we want with no thought to what readers might want or not want to see. Saying, "I'm against censorship" is fine as a general ideal, but we live in a real world, in which people can get in trouble at work or school for having NSFW images on their computer, even if they were not out to look for that kind of thing. A person ought to be able to access the text of an article on, say, pornography, without loading the content that could get them in trouble, or that they might just not want to see. At the same time, we should preserve the ability to view "controversial content" for those who do want to see it.

To date, the prevailing opinion among Wikipedia editors has been that readers should not have a real choice when it comes to NSFW images or other content they might not want to see in an article they are otherwise interested in. There should be no warnings, no collapsible boxes, nothing to allow readers to make their own decisions about what they want to see. That, to me, is an attitude along the lines of: "We're going to do what we want, no matter what you, the reader, might want, and if it gets you in trouble, screw you, we don't care, don't use Wikipedia at all then. We have a 'Disclaimers' link at the bottom of the page, so we're washing our hands of this matter."

I recognize there may be disputes about what content should be subject to this kind of proposal, but the alternative of doing nothing is like sticking up a middle finger to the world. -- Mwalcoff 01:49, 13 March 2011 (UTC)Reply

I agree. I could easily imagine a few of our editors who routinely work with articles or images about pornography, for example, choosing to have a separate account for use at work/school/public computers, and ticking the 'not getting myself fired and/or hauled into the office to talk about sexual harassment if someone happens to look over my shoulder at the wrong moment' box in that account. At the moment, if this editor knows that work/school/public computer is a hostile environment, then the editor's only realistic choice is "don't go to any Wikimedia website there". WhatamIdoing 15:31, 28 April 2011 (UTC)Reply

Flashing images omitted?

I am disappointed that flashing images, which might trigger seizures in people with photosensitive epilepsy, didn't get mentioned at all. Somehow, the harm of people not only being physically injured, but also possibly dying from en:status epilepticus seems to have been completely overlooked, while the discussion focuses on hurting people's feelings with images related to sex, dead bodies, and religion.

Is it really reasonable to expect some of our disabled users to permanently (for the rest of their lives) suppress 100% of images, even images that are extremely likely to be benign, because there's no way for them to know which are probably safe? Are we serving the public and educating users by making all the images on Commons inaccessible to this group of disabled users?

I know these folks aren't represented on Commons, since it's a small population, and since they currently can't use Commons in any meaningful sense because of the complete lack of filtering options. Couldn't we extend any classification scheme to address this point as well? WhatamIdoing 15:45, 28 April 2011 (UTC)Reply

That can be done in the browser by blocking animated .gif images. It's not worth the risk to try to filter them on the site's side, given transient content (files can be re-uploaded over existing ones on wikis, for instance), if it's a problem for someone. wikihow:Stop-Animated-Images-in-a-Browser Kylu 17:12, 28 April 2011 (UTC)Reply
Sure, but the problem is bigger than that. I've been told that some high-contrast optical illusions are a problem for some people, as are real-life places (a row of regularly spaced trees seems to be a typical example). WhatamIdoing 01:18, 29 April 2011 (UTC)Reply
I think a fair approach to a lot of personal image issues would be to allow users to alter a user setting which controls the size of thumbnails displayed, and which overrides the NNNpx settings in the wiki code. Presumably a very small thumbnail, even of a flickering or regularly spaced image, should be harmless, and likewise the personal shock and awe from seeing a genital or a Muhammad picture would be shrunk in due proportion. Of course, this doesn't satisfy censorious urges... Wnt 05:28, 28 May 2011 (UTC)Reply
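A minimal sketch of such a thumbnail-size preference, assuming a hypothetical user script: the 60px cap and the "img.thumbimage" selector are illustrative assumptions, not an existing MediaWiki preference.

```typescript
// Hypothetical user-script sketch: cap the display size of article
// thumbnails, overriding per-article pixel settings. The cap value
// and the CSS selector are assumptions for illustration.

const MAX_THUMB_WIDTH = 60; // user-chosen cap, in pixels

function capThumbnails(maxWidth: number): void {
  document.querySelectorAll<HTMLImageElement>("img.thumbimage").forEach((img) => {
    if (img.width > maxWidth) {
      // Shrink while preserving the aspect ratio.
      img.height = Math.round(img.height * (maxWidth / img.width));
      img.width = maxWidth;
    }
  });
}

capThumbnails(MAX_THUMB_WIDTH);
```

Since the cap applies uniformly to every image, no one has to classify anything; the reader simply trades detail for safety across the board.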

Deliberateness

I wonder...

We keep saying that if you deliberately go to an article about anal sex, then you get what you deserve.

But what if you ended up at the article as a result of clicking on Special:Random? What if you clicked on http://en.wikipedia.org/w/index.php?oldid=426197774 on some other website and ended up at the article as a kind of en:Rickrolling? What if you actually went to an article on a perfectly benign subject, like en:Wheat, which had just been vandalized to include a full-screen image of someone's genitals, or to have a bunch of potty-mouth text inserted?

I don't think it would be entirely unreasonable to let users use account settings to reduce the likelihood of these "oops" events. WhatamIdoing 16:51, 28 April 2011 (UTC)Reply

It'd be nice, but I don't know if the benefits are worth the potential risks. Reducing the likelihood of such events is possible, but enabling such a feature would make users unfamiliar with its limitations feel secure when such security isn't warranted, which puts the Foundation and the volunteers in jeopardy ethically, possibly even legally. Similar to how the various anti-vandalism bots and workers are constantly fighting sneakier vandalism, the flip side of a feature set like this would be an escalating war of attrition against people wanting to insert shock material. I imagine you still get the occasional email advertising "v1@gr4", right? Such a battle against spam has been waged far longer than Wikipedia has been around, and yet it's not really any closer to being won, despite an entire industry having formed to fight its spread. Kylu 17:07, 28 April 2011 (UTC)Reply
Um, actually, I think it's been several years since I've seen any spam about medications. WhatamIdoing 01:18, 29 April 2011 (UTC)Reply
I'll go out on a limb and guess you're using peer-moderated systems like Google and Yahoo mail (like most everyone else, anymore). One person gets a spam message, then another, and they mark it as spam. This all assumes the mails are mass-sent, of course. The anti-spam system then marks it as spam for everyone else, and nobody other than the initial reporters notices. If Gmail, for instance, were perfectly spam-free, there'd be no "Report Spam" button. It's still being sent, you're still receiving it, it's ending up in your spam inbox, probably alongside emails you wouldn't have marked as spam. My point wasn't that you're getting mails for anything specific, but that spam is still there and still a problem, and that we're not going to be able to simply hire a really spectacularly nifty developer and get an extension anytime soon that can prevent this sort of foolishness.
If we want to boil down the entire argument, really, it's that our defensive measures are inherently going to need constant modification, which implies failure on the part of each measure. Why? Because Wikimedia uses reactive defensive measures to protect its content: anti-vandal accounts (bots and humans), blacklisting certain files from being displayed outside their articles, etc. Proactive measures are possible too, but generally they're antithetical to Wikimedia's purposes for a project. You don't see wmf: getting vandalized much, for instance, nor otrs-wiki. :)
Best of luck with your suggestions, though. Kylu 16:42, 30 April 2011 (UTC)Reply
One of my e-mail accounts is on a server run by a friend in his home. I don't know what method he's using to control spam, but there's no "report spam" button, there's no spam folder, and I'm still not seeing drug spam. (Instead, all the spammers seem to be trying to sell me electronic devices; it's both the only account that sees more than one spam message a month, counting things that get dumped to spam folders, and the only account whose address has been published in plain text on the web.)
I'd like to clarify that my proposal above is probably most appropriate for logged-in users, who ought to be allowed to deliberately restrict automatic image and video viewing, if that's what they want to do. I don't think that WMF should continue to refuse to permit people the option of voluntarily restricting these images. WhatamIdoing 18:48, 30 April 2011 (UTC)Reply

sexual content: lack of universality

In hundreds of cultures worldwide, female breasts are not usually covered (nor buttocks, for that matter), and their mere exposure does not usually arouse males. In fact, breasts are not universally recognized as sexual organs.

On the other hand, in some other cultures, an uncovered female neck is recognized as a body part that usually arouses males. With such diversity in human sexuality, I find it hard to believe that Wikimedia can come up with a comprehensive, objective, universal criterion for classifying images that are meant to arouse.

There is hardly a "common denominator" here. Or, if there is, it must be very hard to apply objectively. If a systematic classification is carried out, I expect it will be heavily biased towards a US-centric point of view. And since such definitions aren't verifiable or objective, there is plenty of room for misclassifying borderline cases.

I agree that it is impossible to categorize universally and objectively what constitutes "intent to arouse" for all cultures. However, I think that, practically speaking, a large plurality - perhaps even a majority - of Wikipedia users, as well as the groups producing the majority of images with intent to arouse, tend to agree on a similar interpretation, which matches the criteria as described in the guidelines. These are the "common denominator". As long as the consequences of misclassification are small, I see no reason not to attempt to classify images according to such guidelines. MDuo13 01:12, 22 August 2011 (UTC)Reply

Another, entirely different concern I have is the reliance on "intent". Intent is also hard to assert objectively, if at all. When this is combined with the exemption for "art", things explode: many contemporary nude photographs can be considered either art or pornography, and have both the intent to arouse and the intent of producing fine art.

So, the whole problem is: there is no universal definition of what is art.

A notable example, pointed out by others, is W:MET ART: they take nude photographs, and their public includes both pornography consumers and people who otherwise appreciate that kind of art. Many of them are actually somewhere in between. It may be that the "pornography component" is largely responsible for its commercial viability, but it still does not seem clear to me that the intent to arouse is meaningful as a test for determining whether an image is out of scope, especially for borderline cases like that.

This is also the case for some historical images (such as some that Jimbo unilaterally deleted). One can reasonably argue that some of them had the intent of arousing and, at the same time, were fine art. Under this proposal, I suppose they would be kept by two different criteria (historical and artistic). But the problem is where to draw the line - essentially, defining what is art.

The United States has judicial procedures for that, because obscenity laws can't bar images that have artistic value (see w:Miller test). And, unsurprisingly, the test depends on community standards, which are viewed as evolving over time. I don't think it could be otherwise, since you need some anthropological context to define artistic value. I hope US court judgments won't be used as evidence that something isn't art, because Wikimedia projects aren't restricted to the United States.

The inherent difficulty of classifying sexual content is compounded by the vagueness of the wording of the proposal in this report. Specifically, it can be argued that relying on a hypothetical "reasonable person" might lead to broader tagging than clear, objective, and verifiable criteria would (this is also a problem with the Miller test, according to William O. Douglas).

There are a lot of pornography filters on the web, but they are hardly universal or unbiased. On the contrary, they are designed to match the sensibilities of their target public. To avoid embarrassment, they typically err on the filtering side, so the rate of false positives is usually high (see the sketch below). Wikimedia projects can't and shouldn't work like that, because of their universality.

And the problem is, they probably will. The current "referendum" on its implementation is so biased towards acceptance that the people in charge of it must be pushing hard for it.

--187.40.227.156 04:05, 19 August 2011 (UTC)Reply
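As an aside on the "err on the filtering side" bias described above, here is a toy score-threshold filter with entirely fabricated scores. It shows only the general mechanism: a vendor that fears missing objectionable images lowers the blocking threshold, and the count of harmless images wrongly blocked rises. No real filter product is modeled.

```python
# Toy illustration with fabricated scores: a filter blocks any image
# whose "objectionability" score meets the threshold. Erring on the
# filtering side (a lower threshold) trades missed objectionable
# images for more harmless images wrongly blocked.

def evaluate(threshold, items):
    """items: (score, is_objectionable) pairs; score is a hypothetical
    filter confidence in [0, 1]."""
    false_pos = sum(1 for s, bad in items if s >= threshold and not bad)
    false_neg = sum(1 for s, bad in items if s < threshold and bad)
    return false_pos, false_neg

sample = [(0.9, True), (0.7, True), (0.6, False),
          (0.5, False), (0.4, True), (0.2, False)]

for t in (0.8, 0.5, 0.3):
    fp, fn = evaluate(t, sample)
    print(f"threshold={t}: {fp} harmless blocked, {fn} objectionable missed")
# threshold=0.8: 0 harmless blocked, 2 objectionable missed
# threshold=0.3: 2 harmless blocked, 0 objectionable missed
```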

Two points:
  • The "people in charge of" the referendum are not "pushing hard for" the filter: they have been directly ordered by their employer to create an image filter whether they like it or not.
  • It is not necessary for the filter to be perfect. In fact, it is acknowledged that it will occasionally produce flawed results, with both wanted images being hidden and unwanted images being shown. This is a "reasonable effort" project, not a "perfect, indisputable, objective, ideal classification for 100% of images" project. Disputes will be resolved the same way disputes are always resolved, which is through discussion by interested parties. WhatamIdoing 17:46, 23 September 2011 (UTC)Reply

First two footnotes take you to the same place

edit

At present, footnotes/references 1 & 2 take you to the same article, the list of controversial articles. Was/is there a general article on what is controversial? Ileanadu 02:51, 21 August 2011 (UTC)Reply

Image of Kenny G

edit

In the section "Images of the 'sacred'", the report says:

With images of this sort, rather than confronting a series of responses arrayed on the same scale of values, different groups of people judge the image using different scales of values, and the key to reconciling them is for each group to absorb and understand the values of the other, after which, and only after which, a determination of action around these images may be taken.

So the study is trying to sell us a model of "different scales of values", with the aim of opening a discussion about taking actions around (i.e., removing or hiding) pictures that some religious groups think are inappropriate to show.

In the year 2000, another musician accused Kenny G of playing the "worst music on earth", etc. Though at first one might suspect some kind of personal problem with Kenny G as the motive for such public statements, the musician later explained his aversion to Kenny G extensively and in purely musical terms, the ultimate reason being that it was "bad taste" and "disrespectful" of Kenny G to overdub, in one of his recordings, a song by the late Louis Armstrong.

Since then, at the latest, the persona of Kenny G seems to serve in some circles as a general vanishing point for aversions. The report says:

Objective criteria have been established and explained in both these articles to define and test for the concept “controversial”, and they seem perfectly adequate to us. Two things should be noted in regards to these definitions and articles. The first is, that the list deemed “controversial” in Wikipedia is relatively long. As might be expected, it contains a healthy list of articles dealing with sexual issues, but also includes the article on “smooth jazz”. Because text and image are different, the list of “controversial” text articles is much larger than the similar list for images (unless you object to a photo of Kenny G.)

Perhaps this is just meant as a joke, and Mr. Gorelick will have to live with appearing in a Wikimedia study as the stock example of someone whose photo will cause objections.

However, there seems to be a deeper reasoning hidden behind this remark. The report seems to believe that today, for many people, modern art or music occupies the place that religion traditionally held. So the report seems to believe that a controversy about Kenny G is comparable to religious emotions. The reader who personally shares or understands objections to Kenny G or his music is supposed to think: "we seculars in the West are not really different from, or more advanced than, those religious groups who want to remove pictures from Wikipedia articles".

In my opinion, this way of reasoning is misguided multiculturalism. --Rosenkohl 19:08, 23 August 2011 (UTC)Reply
