Wikipedia:Meinungsbilder/Einführung persönlicher Bildfilter/en
This page contains an English translation of de:Wikipedia:Meinungsbilder/Einführung persönlicher Bildfilter as of the start of the poll at 18:02, 25 August 2011. It is provided for informational purposes only and is not the actual opinion poll itself.
Wikipedia: Opinion poll/Implementation of personal image filter
The opinion poll started on 25 August 2011 at 6:00 p.m. and ends on 15 September 2011 at 6:00 p.m.
Initiators and supporters
Initiators
Supporters
- de:User:Don-kun 10:25, 17 August 2011 (CEST)
- de:User:Niabot 11:14, 17 August 2011 (CEST)
- de:User:Joyborg 13:35, 17 August 2011 (CEST)
- de:User:Chaddy 15:58, 17 August 2011 (CEST)
- de:User:Michileo 16:42, 17 August 2011 (CEST)
- de:User:Blogotron 08:51, 18 August 2011 (CEST)
- de:User:Widescreen 10:36, 18 August 2011 (CEST)
- de:User:Trockennasenaffe 10:40, 18 August 2011 (CEST)
- de:User:Carbidfischer 12:01, 18 August 2011 (CEST)
- de:User:Rosenkohl 13:39, 18 August 2011 (CEST)
- de:User:Eingangskontrolle 15:01, 18 August 2011 (CEST)
- de:User:Re probst 13:04, 19 August 2011 (CEST)
- de:User:Zietz 21:50, 19 August 2011 (CEST)
- de:User:23PowerZ 23:20, 19 August 2011 (CEST)
- de:User:Florian Blaschke 15:15, 20 August 2011 (CEST)
- de:User:Sk!d 03:08, 22 August 2011 (CEST)
- de:User:Ianusius 18:19, 22 August 2011 (CEST)
(10 supporters eligible to vote needed; check voting eligibility)
The supporters are responsible for ensuring that this opinion poll starts only if it is suitable for a vote. If the opinion poll has been reworded after your entry so that it is at risk of starting in an unsuitable state, you should withdraw your entry.
Problem description
At the end of May 2011, in the course of its Resolution: Controversial content,[1] the Board of Trustees of the Wikimedia Foundation (WMF) decided to develop a software feature for personal, optional image filtering and to implement it on all of its projects:
- “We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable readers to easily hide images hosted on the projects that they do not wish to view, either when first viewing the image or ahead of time through preference settings.”[2]
This is based, among other things, on the results and recommendations of a 2010 study commissioned by the Wikimedia Foundation, known as the Harris-Report.[3] The Harris-Report proposed a form of personal image filtering in its recommendations 7 and 9.[4]
The proposed filter, the filter categories and the method of implementation are disputed among the users of Wikipedia.
Planning by the Wikimedia Foundation for the implementation
The precise functioning of the image filter has not yet been finalized. So far the developers have created an informal working draft with sample pictures. The following is currently being planned:
The filter is designed to enable readers of Wikipedia to hide files on their own screen, at their own request, according to specific criteria. The filter settings can be adjusted personally. Files with potentially controversial content will get an additional button “Hide image” that changes the filter settings so that the file is hidden. Files that are already hidden by the filter will get a button “Show image” that changes the filter settings and then shows the file. In addition, the page header always displays a link leading to the filter settings. Although the Foundation speaks of an “image filter”, according to the rough draft all categorized media will be included, e.g. video or audio files.
To enable filtering, readers of Wikipedia shall be able to specify criteria like “sexually explicit”, “medical” or “depiction of violence” for the files they do not want to see.[5] The files shall be filtered using corresponding filter categories to be created within the existing system of categories on Commons.[6] Corresponding filter categories shall be set up on the local Wikimedia projects, such as the German Wikipedia, for locally hosted files.[7]
In the period from 15 August to 30 August 2011 the Wikimedia Foundation is holding a Wikimedia-wide referendum, the “Image Filter Referendum”, to gather opinions on the design and use of the filter function to be developed. Registered authors with at least 10 contributions, developers of MediaWiki, staff and contractors of the Wikimedia Foundation, and members of the Board of Trustees and the Advisory Board of the Wikimedia Foundation are eligible to participate. They may submit a numerical vote on six statements.[8]
The referendum does not offer the possibility to explicitly vote for or against the implementation of the image filter function.
Proposal
Personal image filters (filters which hide illustrative files on the basis of Wikipedia categories and can be switched on and off by the reader; see the preliminary design of the Wikimedia Foundation) should not be implemented in the German Wikipedia, despite the decision of the Board of Trustees of the Wikimedia Foundation, nor should filter categories be set up for files stored locally on this Wikipedia.
Arguments
Arguments for the proposal
- Wikipedia was not founded to hide information but to make it accessible. Hiding files may remove important information presented in a Wikipedia article. This could limit enlightenment and the perception of context. For example, articles about artists, artworks and medical topics may lose substantial parts of their information, whether the reader intends this or not. This would jeopardize the aim of presenting a topic neutrally and in its entirety.
- The categorization of content according to its suitability contradicts the principle of en:Wikipedia:Neutral point of view. The American Library Association (ALA) argues along the same lines: it strictly rejects labelling content in libraries according to non-neutral points of view and even considers it a “censor's tool” when such labels attempt to give readers certain recommendations or to warn them about content.[9] The ALA has laid down corresponding guidelines in its Library Bill of Rights.[10]
- Taking into account the interests or preferences of individual readers or groups is not the job of an encyclopedia (en:Wikipedia:Five pillars). Readers are themselves responsible for their requirements regarding the selection of pictures (e.g. by configuring the software on their own device accordingly or by using their own filtering software).
- Opponents of the proposed filter categories and file filter consider their use to be censorship of content, which contradicts the mission of an encyclopedia.[11] In particular, they emphasize the risk that, once an appropriate infrastructure is created, the filtering could be extended or even made mandatory (e.g. requiring a login to disable the filter).
- Depictions of violence or explicit sexuality are found in textbooks, including school textbooks. (This is not disputed.)
- Schools could advise their students to enable the filter when using, for example, the German Wikipedia and not to make use of the option to show images. For students who follow these instructions and do not have Internet access elsewhere, this would amount to censorship. The same is possible for other organizations that provide Internet access, such as libraries, political parties and religious communities.
- The filter is neither a content block nor parental control software. It is active only on the projects of the Wikimedia Foundation and can easily be switched off by each reader. Since the filter does not block access to content, it gives operators of content filters no reason to scale back their measures (domain blocks, blocks of articles) or to remove Wikipedia from their index. Effectively there would be double filtering.
- The implementation of image filters does not replace the debate about which images are suitable for a certain reader. Moreover, the classification of content into certain exclusion categories can serve different interests and ideas. It is absurd to classify content into filter categories over and over again, because it is not foreseeable that clear guidelines can be developed about which files belong in the various exclusion categories.
- The notion of what is regarded as offensive or undesirable can vary by user, cultural background and language version. The use of globally valid filter categories is therefore not meaningful, since the aim of creating filters that do justice to all readers and all cultures is not technically feasible. A reader who wants to decide for himself what he does and does not want to see would first have to look at the pictures, which results in a contradiction.
- The task of setting up filter categories and keeping them up to date is transferred to the users, who need to expend time and effort. These resources could instead be used for other technical and substantive improvements. (Re-using existing Commons categories as filter categories is possible only to a limited extent: for example, the category Violence and its subcategories contain not only depictions of violence but also images of memorials, demonstrations, portraits, critical cartoons, etc.)
- It is also questionable whether Wikipedia should operate the filter at all, since the content was created by its own volunteers according to the en:Wikipedia:Five pillars and recognized as worth keeping in a collective process. An option to suppress its own content therefore appears paradoxical.
- A boomerang effect could occur: following the slogan “there are filters, after all”, editors would be less inhibited about putting potentially offensive material into articles.
- The professional skills and methodological approach of Robert Harris and his daughter Dory Carr-Harris, the two authors of the Harris-Report on which the decision of the WMF Board of Trustees significantly relies, are questioned. Robert Harris is a radio broadcaster and journalist who worked for 30 years at the en:Canadian Broadcasting Corporation (CBC), including the production of several series on classical music. He has also written several introductory books on classical music. At the CBC, Harris worked for 17 years with the current Wikimedia CEO Sue Gardner and was recruited by the Wikimedia Foundation as a consultant in 2010. It is unclear what, apart from journalistic experience, qualifies Harris as a consultant on controversial content for Wikimedia. Harris compares Wikipedia with the CBC,[12] but a scientific and general-education encyclopedia has different goals and methods than a journalistic institution. The report ignores the critical discourse on the evaluative labelling of media, as conducted e.g. by the American Library Association.
- The Foundation's argument (en:Principle of Least Surprise), which assumes that people prefer to experience few surprises, is taken from computer science (the ergonomics of computer programs). Both from a psychological point of view and in the communication sciences, a contrary position is advocated. In the press, for example, photos are used for elucidation as well as for raising interest (see e.g. en:Photojournalism#Ethical and legal considerations).
- The filter violates encyclopedic secularity. The Harris-Report recommends special treatment of what it calls “images of the sacred”[13] but expressly proposes the filter only for sexual and violent images. The resolution of the Board of Trustees assumes that users may be offended not only by sexual and violent but also by religious content, and also refers to the diversity of age, origin and values among user groups.[14] The rough draft of the Foundation now explicitly plans a filter for the category “pictures of the prophet Mohammed”.[15] But filtering for religious preferences contradicts the religious neutrality and the universal educational claim of an encyclopedia.
Arguments against the proposal
- Readers who, for example, find the presentation of violence or sexuality offensive, feel hurt by it, or do not want to be surprised by it can hide files categorized accordingly.
- At work or in a public library, it can be disadvantageous for a user of Wikipedia to display images perceived as inappropriate or offensive on the screen. The filter would be a means to avoid such situations.
- A larger number of readers could be reached and additional authors won over, because some readers and potential contributors would no longer shun Wikipedia or certain articles on account of presentations perceived as offensive.
- The wind is taken out of the sails of attempts to completely remove potentially offensive content (for example by deletion of image files).
- The implementation of the filter could reduce publicly expressed reservations about the Wikipedia, which are based on the presentation of allegedly questionable content.
- It is not censorship, since it is expressly a matter of personal filters (see #Problem description), which are activated only at the user's request. The free choice of each user will be ensured by several functions:
- The user is informed of the ability to filter content he finds uncomfortable.
- The user decides whether to activate the filter (en:opt in).
- The user can switch off the filtering at any time or show hidden pictures individually.
- This is currently planned for unregistered users too [2]. Depending on the outcome of the referendum (“how important it is […] that the feature be usable by both logged-in and logged-out readers”), it is also possible that there will be no filter at all for unregistered users.
- It is unclear whether opting out of the filter feature will be technically possible at all. If it is “permanently installed”, a German Wikipedia with a filter ban would cut itself off from further developments of the MediaWiki software and would have to maintain a parallel version on its own.
- The effectiveness of the proposed “prohibition” of the filter is questionable, since it is technically simple to circumvent: the main filter categories containing most pictures will be located on Wikimedia Commons. They could be accessed both by external “censors” and by additional software such as a browser plugin, which could replicate the filter exactly. A ban on filter categories in the German Wikipedia could, if necessary, be bypassed through a filter category system operated by a third party. A “filter ban” could thus provoke third-party solutions that are no longer controllable and could provide censorship mechanisms.
- Logged-in users can already hide specific content through their personal CSS settings (see the sketch after this list); so not all users see the same content anyway.
- The Wikimedia Foundation also justifies the introduction of the filter with the “principle of least astonishment”, which applies, for example, in the English Wikipedia.[16] This means that the content of a page is presented to readers in a way that respects their expectations.[17]
- The Harris-Report recommends allowing users to place some images (of sexuality and violence) “in collapsible galleries so that children (for example) might not come across these images unintentionally or unexpectedly”.[18]
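As an illustration of the personal CSS mentioned in the argument above, the following is a minimal, hypothetical sketch of rules a logged-in user could add to their personal common.css to hide thumbnail images. The selectors are assumed to match the standard MediaWiki thumbnail markup and serve only as an example; they are not part of the proposed filter.

 /* hypothetical personal CSS: hide all thumbnail images (example only, not the proposed filter) */
 div.thumb img,
 img.thumbimage {
     display: none;
 }

Removing or commenting out these rules makes the images visible again, so the effect remains entirely under the control of the individual user.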
Voting rights
Voting is open to registered users who were eligible to vote at the start of the opinion poll. Here you can find out whether you are entitled to vote. The time in this tool is automatically given in UTC. UTC is CEST (the time zone used in Germany, Austria and Switzerland during the period of the opinion poll) minus 2 hours.
The voting rights are thus more restrictive here than in the “referendum”, but in accordance with long-standing practice here.
Formal validity
If 50 percent + 1 accept this opinion poll, it counts as formally valid. An absolute majority also applies in the substantive vote. In both cases abstentions count as votes not cast.
I accept this opinion poll
I reject this opinion poll
Abstention
Substantive vote
The current status quo is that filters can be implemented as a result of the Foundation's decision. If the proposal receives a majority, however, content on the German Wikipedia shall continue to be presented without filters and no filter categories will be introduced (“internal project status quo”).
Personal image filters and filter categories should not be introduced in the German Wikipedia
Personal image filters and filter categories can be introduced in the German Wikipedia (status quo)
Abstention
Result
Both the opinion poll itself and its proposal were accepted. Contrary to the decision of the Board of Trustees of the Wikimedia Foundation, personal image filters are not to be introduced in the German-language Wikipedia, and categories for these filters may not be created for files stored locally on this Wikipedia.
260 of 306 users (84.97 percent) accepted the poll as formally valid. 357 of 414 users (86.23 percent) rejected the introduction of personal image filters and filter categories in the German Wikipedia.
External Links
Zeit Online, 23 August 2011: Wikipedia-Autoren stimmen über Filter ab (Wikipedia authors vote on filters. With a referendum, the Wikimedia Foundation is promoting a filter against potentially offensive content in Wikipedia. Opposition is growing among the authors.)
Notes
- ↑ Wikimedia Foundation Board of Trustees: Resolution: Controversial content, 29 May 2011
- ↑ Wikimedia Foundation Board of Trustees: Resolution: Controversial Content, 29 May 2011
- ↑ Robert Harris and Dory Carr-Harris: 2010 Wikimedia Study of Controversial Content, 2010
- ↑ “It is recommended:
7. That a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images (organized using the current Commons category system) into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command (‘under 12 button’ or ‘NSFW’ button). […]
9. That a series of additional user-selected options be created for other images deemed controversial to allow registered users the easy ability to manage this content by setting individual viewing preferences.”,
Robert Harris and Dory Carr-Harris: 2010 Wikimedia Study of Controversial Content: Part Two, 2010
- ↑ example figure from the rough draft
- ↑ Commons:Categories
- ↑ Category Equivalence Localization
- ↑ On a scale of 0 to 10, the voters can specify the degree of their approval of the following statements: It is important
“for the Wikimedia projects to offer this feature to readers.
that the feature be usable by both logged-in and logged-out readers.
that hiding be reversible: readers should be supported if they decide to change their minds.
that individuals be able to report or flag images that they see as controversial, that have not yet been categorized as such.
that the feature allow readers to quickly and easily choose which types of images they want to hide (e.g., 5–10 categories), so that people could choose for example to hide sexual imagery but not violent imagery.
that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial).”
Wikimedia Foundation: What will be asked
- ↑ Labels and Rating Systems. An Interpretation of the Library Bill of Rights. American Library Association, 19 January 2005, retrieved 23 August 2011 (English).
- ↑ Library Bill of Rights. American Library Association, 23 January 1996, retrieved 24 August 2011 (English).
- ↑ [1]
- ↑ “The CBC is an interesting place. Like Wikipedia, it is a powerful (in its world) and respected information-providing institution, dedicated to public service and the provision of unbiased (the analog equivalent of NPOV) news and information to the Canadian public. However, like your projects, the power of the institution, and its public-service character, make it the focus of intense and perfectly legitimate discussions over content, balance, mandate, and the need to serve different publics simultaneously.”, Robert Harris, meta:2010_Wikimedia_Study_of_Controversial_Content/Archive
- ↑ meta:2010 Wikimedia Study of Controversial Content: Part Three#Images of the “sacred”, Harris-Report, 2010
- ↑ “Some kinds of content, particularly that of a sexual, violent or religious nature, may be offensive to some viewers; […] We recognize that we serve a global and diverse (in age, background and values) audience, and we support access to information for all”, Resolution, 29 May 2011
- ↑ sic, Personal image filter, Overview of this system, MediaWiki
- ↑ en:Wikipedia:Writing better articles#Principle of least astonishment
- ↑ “We support the principle of least astonishment: content on Wikimedia projects should be presented to readers in such a way as to respect their expectations of what any page or feature might contain”, Resolution, 29 May 2011
- ↑ “The major recommendation we have made to deal with children and their parents is our recommendation to allow users (at their discretion, and only for their personal use) to place some images (of sexuality and violence) in collapsible galleries so that children (for example) might not come across these images unintentionally or unexpectedly. As we noted in our section on basic principles, we did so because we believed it would show some basic respect and service to one group of our users (those worried about exposure to these images) without compromising the different needs and desires of another (those desiring, even insisting, the projects stay open).” Children, Harris-Report, 2010