Commons:Village pump/Proposals

From Wikimedia Commons, the free media repository

Shortcuts: COM:VP/P • COM:VPP

Welcome to the Village pump proposals section

This page is used for proposals relating to the operations, technical issues, and policies of Wikimedia Commons; it is distinguished from the main Village pump, which handles community-wide discussion of all kinds. The page may also be used to advertise significant discussions taking place elsewhere, such as on the talk page of a Commons policy. Recent sections with no replies for 30 days and sections tagged with {{Section resolved|1=--~~~~}} may be archived; for old discussions, see the archives; the latest archive is Commons:Village pump/Proposals/Archive/2024/01.

Please note
  • One of Wikimedia Commons’ basic principles is: "Only free content is allowed." Please do not ask why unfree material is not allowed on Wikimedia Commons or suggest that allowing it would be a good thing.
  • Have you read the FAQ?

 
SpBot archives all sections tagged with {{Section resolved|1=~~~~}} after 5 days and sections whose most recent comment is older than 30 days.

Restrict webp upload?

https://commons.wikimedia.org/w/index.php?sort=create_timestamp_desc&search=filemime%3Awebp

I suggest restricting the upload of WebP files to autopatrolled users (as with MP3), because WebP uploads are very often copyvios taken from the internet or previews of SVG logos. RZuo (talk) 14:07, 22 November 2023 (UTC)

 Strong support, seconding the motion, per @Yann, Abzeronow, and Glrx: et al. Examples of my autogenerated messages about WebP copyvios: this, this, and this. And I can still remember the very first WebP file I encountered here, which is itself a copyvio: Commons:Deletion requests/File:Beijing Skyline.webp. JWilz12345 (Talk|Contrib's.) 08:17, 26 November 2023 (UTC)
  •  Support Would reduce copyvios for sure; I'm not sure the proportion is as high as some have mentioned based on spot checking, but I usually check the ones that look obvious, so it's not exactly a random sample. Gnomingstuff (talk) 23:05, 29 November 2023 (UTC)
  •  Oppose I think that, in general, discriminating on filetype is a bad direction (the same goes for MP3). It further complicates and obfuscates the upload process, and it doesn't stop copyright violations; it stops contributors. Most of these can easily be spotted by filtering the upload list on new contributors. Or we can just ban SVGs as well, because most logos are copyvios. —TheDJ (talkcontribs) 18:46, 30 November 2023 (UTC)
    If we had enough people checking unpatrolled uploads, we would not need such filters. Unfortunately, we do not have enough people checking uploads and edits, and we therefore need tools to reduce the workload. GPSLeo (talk) 19:31, 30 November 2023 (UTC)
    I think that creating these kinds of non-transparent and highly confusing roadbumps is part of the reason WHY we don't have enough people. That's my point. And I note that just two posts below this we already have someone getting tripped up by the SVG translator software because of a similar rule (#File overwriting filter blocks SVG Translate). It's one of those "a small cut doesn't seem so bad, until there are a thousand cuts" kinds of problems. Considering how much people complain about UploadWizard, stuff like this isn't helping lower the barrier to entry either. —TheDJ (talkcontribs) 11:07, 9 December 2023 (UTC)
    Plus, we could just make patrolling itself easier by having uploads sorted by date; a single patroller could simply take a few minutes to patrol all new ".webm" files (a query sketch of this idea follows at the end of this section). Do this for every file type and we don't need to exclude people from uploading. If a patroller only wants to patrol videos, sounds, PDFs, etc., they currently have to go through all uploads, but by making it easy to filter and making these pages easily accessible to everyone and transparent (like OgreBot's Uploads by new users), we could easily patrol everything with fewer people. --Donald Trung 『徵國單』 (No Fake News 💬) (WikiProject Numismatics 💴) (Articles 📚) 11:55, 9 December 2023 (UTC)
 Support. Very few cameras or image editing tools output WebP images; when one is uploaded, it's almost always because it was downloaded from a web site which automatically converts images to that format for display (and, thus, it is very likely to be a copyright violation). We already have abuse filters which block other types of uploads from new users that are overwhelmingly likely to be problematic, like MP3 files (Special:AbuseFilter/192), PDFs (Special:AbuseFilter/281), and small JPEGs (Special:AbuseFilter/156). Omphalographer (talk) 04:25, 3 December 2023 (UTC)
  •  Oppose, per TheDJ. Additionally, this would exclude a lot of people who contribute to other Wikimedia websites but aren't necessarily active here; a user could be a trusted user, an admin, a prolific contributor, etc. on another Wikimedia website and "a noob" at Wikimedia Commons. They could have good knowledge of how video files work and which ones are and aren't free, but they would find that they can't upload anything here. If we keep making Wikimedia Commons more exclusive, we will fail at our core mission to be for all Wikimedians. If new users are more likely to have bad uploads, then we should have a page like "Commons:Uploads by unpatrolled users by filetype/.webm/2023/12/09" (which includes all users who aren't auto-patrolled); this proposal will simply exclude too many people. We can't know which people and uploads we exclude, because a user with a free video file will come here, attempt to upload, see "You have insufficient privileges for this action", and then never return (without ever telling anyone what they wanted to upload and why they didn't). "Anyone can contribute" is the core of every Wikimedia website; the moment you compromise this, you lose whatever made this place Wikimedia. --Donald Trung 『徵國單』 (No Fake News 💬) (WikiProject Numismatics 💴) (Articles 📚) 11:49, 9 December 2023 (UTC)
  •  Strong oppose, outlawing a file format will just lead to such files being converted into a different format and uploaded in a different way, but now with fewer possibilities to scan and patrol for them. This is classic prohibition: by outlawing X, users of X will find new ways to still do it, but in places where it can no longer be observed easily. I'm not even arguing on behalf of the allegedly "just" 10% of .webp images that are in fact okay, which is a valid concern as well in my opinion. So: use this helpful file format to scan more efficiently for copyvios, rather than outlaw it and have the copyvios enter Commons nonetheless, but via still-uncharted routes. --Enyavar (talk) 15:25, 18 December 2023 (UTC)
  •  Comment Given that WebP files are essentially Google's replacements for JPGs, PNGs, and GIFs, we cannot restrict WebP uploads to autopatrolled users until we restrict the uploads of those three formats too (as well as SVG, even for own works), because if non-patrolled users were restricted from WebP uploads, they would simply convert the WebP files to PNG or JPG as a way to upload the images to Commons. We should find a way to close the loophole of new users converting WebP files to a different image format before we restrict WebP uploads to users with autopatrol rights, even for their own WebP uploads. Yayan550 (talk) 15:33, 2 January 2024 (UTC)
    • @Yayan550: I think you are missing the point here. Of course, if they know what they are doing, they can convert the file. The idea here is sort of a "speed bump" for a pattern that usually indicates someone who is ignorantly uploading a copyright violation. - Jmabel ! talk 19:24, 2 January 2024 (UTC)
      Precisely. And, as I noted above, we already have AbuseFilter "speed bumps" for other types of uploads, like MP3 files, which are particularly likely to be copyvios. We're aware that users can bypass the filter and upload those files after conversion, but we can explain why an upload is being blocked in the AbuseFilter message (cf. MediaWiki:abusefilter-warning-mp3), and we can review the filter logs to see if users are deliberately bypassing the filter for infringing content. Omphalographer (talk) 21:24, 9 January 2024 (UTC)
  •  Support Infrogmation of New Orleans (talk) 20:01, 9 January 2024 (UTC)
  •  Support The issue seems similar to MP3 files. It's about a practical approach based on experience. No one, I assume, has anything against MP3 or WebP as file types in principle, but it's just a matter of fact that Commons uploads of these file types tend to be copyvios more often than others, so a measure similar to the MP3 upload restriction already in place seems only sensible. The proposal is not about "outlawing" the format but about restricting it to autopatrolled users. Gestumblindi (talk) 14:22, 14 January 2024 (UTC)
    •  Comment As I detailed above, this will only result in circumvention by occasional users (those who upload ~20 files once and never again). So yes, it will bring the DETECTED number of copyvios down. --Enyavar (talk) 10:11, 29 January 2024 (UTC)
 Support It will bring the number of copyvios down. Only <20% of copyright-violating users will actively evade it by using an online file converter. Also, I doubt that many competent users would use WebP as a file format; most would use PNG/JPG/SVG. —Matrix(!) {user - talk? - useless contributions} 16:53, 23 January 2024 (UTC)
 Support Yes, per Yann. Hide on Rosé (talk) 08:01, 4 February 2024 (UTC)
 Neutral Enyavar and TheDJ made good points, and I suggest these two points be addressed by support-voters. ( Oppose banning filetypes just because they're often copyvios; otherwise in 50 years all filetypes will be banned, and maybe PNGs are next. Instead, I'd suggest the WMF use its millions in funds to get a bot working that detects "most-likely & likely copyvios to review" using TinEye/Google reverse image search and similar methods.  Support On the other hand, I don't know how often WebP files are copyvios or why that is, and the practicality etc. brought up by support voters is a big point that must be considered. I just suggest that if the filetype is banned, it's only for a certain duration and/or this is revisited after some months or so. One could then try to see whether it brought the actual (not just detected) number of copyvios substantially down and whether it made them harder to detect.) Note that, for example, all images in open-access CC BY Nature studies seem to be WebP files. Prototyperspective (talk) 12:21, 6 February 2024 (UTC)
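
The patrolling-by-filetype idea raised above can be illustrated with a query against the Commons database replica. This is only a hedged sketch, not an endorsed tool: it assumes the replica's image, actor, and user_groups tables with their usual columns, and it assumes the autopatrol group is literally named "autopatrolled"; both the schema and the group name should be verified before relying on it.

    -- Sketch: recent WebP uploads by users who are not autopatrolled,
    -- newest first, so a patroller can work through them by file type.
    -- Assumes the usual replica schema (image, actor, user_groups) and
    -- the group name 'autopatrolled'.
    SELECT img_name, img_timestamp, actor_name
    FROM image
    JOIN actor ON actor_id = img_actor
    LEFT JOIN user_groups
           ON ug_user = actor_user
          AND ug_group = 'autopatrolled'   -- assumed group name
    WHERE img_major_mime = 'image'
      AND img_minor_mime = 'webp'
      AND ug_user IS NULL                  -- uploader lacks the group
    ORDER BY img_timestamp DESC
    LIMIT 200;

Swapping the MIME values (e.g. img_major_mime = 'application', img_minor_mime = 'pdf') would give equivalent per-filetype worklists for other formats.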

Disabling talk pages of deletion requests

While there now exists Template:Editnotices/Group/Commons talk:Deletion requests, which notifies users to make comments on the deletion request pages themselves, it is evidently ignored, as seen in 54conphotos' comments on the talk page of Commons:Deletion requests/File:KORWARM2.jpg (which I transferred to the main page) and in Amustard's comment on a Turkmen deletion request (which I subsequently transferred to the main page). As it is very evident that the edit notice is being ignored, I am proposing that the "Talk" namespace be disabled for all pages with the prefix "Commons:Deletion requests/". This should be a permanent solution to incidents that could have been better avoided. For existing talk pages of deletion requests with comments, the comments (including mine, if I ever responded to uploaders there) should be transferred to the deletion request pages themselves, taking into account the dates of the comments or inputs. JWilz12345 (Talk|Contrib's.) 09:10, 26 November 2023 (UTC)

 Support At least, the use of DR talk pages should be restricted to power users (admins, license reviewers?). Yann (talk) 09:37, 26 November 2023 (UTC)
@Yann that may be OK: restricted to admins and license reviewers. Or the talk pages could still exist visually, but those without the relevant user rights, even autopatrolled users, would be barred from editing them and presented with a boilerplate notice that they do not have the right to edit talk pages and should instead comment on the main discussion page, with a link to the DR itself in the notice (do not expect many new users to comprehend what they are reading in the notices). JWilz12345 (Talk|Contrib's.) 10:09, 26 November 2023 (UTC)
 Support --Krd 11:23, 26 November 2023 (UTC)
 Support Christian Ferrer (talk) 11:56, 26 November 2023 (UTC)
Thank you for pointing out this Template:Editnotices/Group/Commons talk:Deletion requests location in Wikimedia. This was not ignored as you said in your comment; it simply was nowhere to be found at the time I commented. It's a shame it's too late to place a comment there, as I would have done so. Even your notes to me are very confusing, as the names of the comment pages do not match up so that I can find them, as are all the previous notes received from others. Being new to this platform, I have found it very confusing to find the things that are suggested when seeing comments by others.
Hopefully I will have the hours to research and better understand the workings of Wikimedia Commons in the future. Thanks again! 54conphotos (talk) 13:32, 26 November 2023 (UTC)
 Support or, if it's easier, systematically turn them into redirects to the relevant project page. - Jmabel ! talk 21:56, 26 November 2023 (UTC)
 Support --Adamant1 (talk) 00:35, 27 November 2023 (UTC)
 Support. Some good ideas above from Yann and Jmabel. We could also explore autotranscluding them to the bottoms of the DR subpages themselves.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 00:49, 27 November 2023 (UTC)
 Support. Yes, good idea, esp. with Jmabel’s and Yann’s additions. -- Tuválkin 11:34, 27 November 2023 (UTC)
 Support restricting it to anyone with autopatrol; I think these users are knowledgeable enough to know that the talk page isn't for discussing the deletion. We must create an informal and easy-to-understand AF notice, though. -- CptViraj (talk) 12:19, 9 December 2023 (UTC)
Another one: this misplaced comment by ApexDynamo, which I have transferred to the main nomination page. CptViraj, I don't think even autopatrolled users are knowledgeable enough to know that talk pages are not the proper place to comment. Example: these misplaced comments by Exec8 (which I also transferred soon after initiating this proposal). I suggest the use of those talk pages be restricted to admins/sysops and license reviewers. JWilz12345 (Talk|Contrib's.) 09:38, 14 December 2023 (UTC)
Still, those are rare cases for autopatrollers. IMHO we shouldn't unnecessarily take away the power completely; the problem is mainly caused by newbies/non-regulars. -- CptViraj (talk) 18:13, 23 December 2023 (UTC)
 Support I have never used a talk page of a DR, nor have I seen one being used. The DRs are usually also frequented by very few editors, and the comments can easily be distinguished from one another. Paradise Chronicle (talk) 22:13, 30 December 2023 (UTC)
One more problematic use, by @Balachon77: (see this). JWilz12345 (Talk|Contrib's.) 01:00, 8 January 2024 (UTC)
Another problematic use, by SiCebuTransmissionCorrecter (talk · contribs) – Commons talk:Deletion requests/File:Line construction of Hermosa-San Jose Transmission Line. The line constructs above Hermosa-Duhat-Balintawak transmission line.png. JWilz12345 (Talk|Contrib's.) 00:10, 9 January 2024 (UTC)
no no no no no no! SiCebuTransmissionCorrecter (talk) 01:12, 10 January 2024 (UTC)Reply[reply]
Commons_talk:Deletion_requests/File:Afrikan_och_Afrikanska_x_Ingel_Fallstedt.jpg ? DS (talk) 14:50, 22 January 2024 (UTC)
@DragonflySixtyseven the discussion should have been held at COM:VPC or on the concerned admin's talk page. Ping @Holly Cheng and De728631: for attention. JWilz12345 (Talk|Contrib's.) 21:35, 7 February 2024 (UTC)
I was the one who started the discussion about the undeletion date. That's the type of thing that makes sense to do on the DR's talk page. holly {chat} 21:43, 7 February 2024 (UTC)
@Holly, do you think this was useful enough to you that you would be opposed to making this change? I don't see a lot of loss if you had to do something like this directly on the DR page. I realize we normally don't touch DRs once they are closed, but we do add categories to them (for example) and I've seen a closing admin go back and add to their rationale. It's also what we typically do on a DR for a single image if the image is kept, then later nominated again for deletion. This seems similar to that, though I think you'd want to put the new content below the {{Delf}} template. - 23:45, 7 February 2024 (UTC) — Preceding unsigned comment added by Jmabel (talk • contribs)
@Jmabel: If we stick the new content below the {{Delf}} and then it gets nominated for deletion again, won't that mess up the bot that does the archiving? In the case of undeletions due to expiration, that would probably be extremely rare. I suppose if that does happen, someone can always manually archive it. I guess I'm not opposed. The benefits seem to outweigh the possible risks. holly {chat} 17:16, 12 February 2024 (UTC)
@Holly Cheng: I'm not even certain what archiving you are referring to. There's nothing I'm aware of that any bot does to the individual DR page, and I'm not sure what Krdbot's rule is for removing the transclusion of the individual DR from the page of DRs for a particular day. It's obviously able to cope with categories below the {{Cfdf}}, but maybe not with other text. User:Krd, I presume you could answer this. - Jmabel ! talk 01:07, 13 February 2024 (UTC)
I meant the moving of individual DR subpages from the day page to the day archive page. holly {chat} 17:58, 13 February 2024 (UTC)
Exactly. And I don't know User:Krdbot's rule for that, which probably only User:Krd can answer, unless someone feels like doing a bunch of research. - Jmabel ! talk 20:16, 13 February 2024 (UTC)
"Only Categories" is the answer, though I'm not sure to which exact question. Krd 05:42, 14 February 2024 (UTC)Reply[reply]
That's too bad. Then it sounds like Holly's use case might be reason enough to keep the talk pages for these. Does anyone see a good workaround? - Jmabel ! talk 05:52, 14 February 2024 (UTC)Reply[reply]
I think the use case is invalid, as the objection would better be raised on a user talk page. Does anybody watch deletion requests? I don't. Krd 06:08, 15 February 2024 (UTC)
@Krd I only watch DRs that I made, or that are of significant interest to me and in which I am heavily involved, e.g. responding to comments or questions every now and then. Though I have heavily encouraged everyone to comment on the DR pages themselves, not on the DR pages' talk pages. On DRs that I am watching, I make the respondent (typically the uploader of the nominated file) reply on the nomination page proper by moving their comment from the talk page to the nomination page and responding to their question or protest there, adding a reminder for them to comment or respond on the nomination page proper. JWilz12345 (Talk|Contrib's.) 07:40, 15 February 2024 (UTC)
 Strong support I see no reason to nominate someone's talk page for deletion unless there is a strong reason, i.e. vandalism or a purely disruptive nature. However, I think this applies to only a very small number of them, and it should be handled by senior users. --A1Cafel (talk) 10:18, 15 February 2024 (UTC)

Computer generated images used in the contests

Hello, it was suggested that I open a new topic here (this question was previously asked at the Help desk). I have read that there is a topic which discusses AI images here, but it does not speak about using them in contests. As an amateur photographer, I like to join contests. I use my own photographs taken with my Canon camera. I would like to make sure that only our own images, taken as "humans" and not generated by AI, are participating in the contests. - Is there anyone among the admins or moderators who vets the pictures in the contests? - Are we all assured that no AI pictures are becoming part of the list of pictures that compete in the contest? - What happens if some of us check the pictures and see that there are some AI pictures in the contest? - Can we report them, or are those pictures fully allowed in the competition? (I believe not, but I ask just in case.) Wikimedia does not explicitly forbid the usage of AI, but I found an implicit statement, as you can see in the "Photo Challenge" page info. It speaks of "own work" or "pictures taken by a common user"; hence my question: can we set an "explicit" rule on the Wikimedia Commons contests instead? Thanks for the info, which I believe is quite useful to know. Oncewerecolours (talk) 20:30, 20 December 2023 (UTC)

This amounts to a proposal to block AI images from being entered into contests, and therefore from winning.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 20:36, 20 December 2023 (UTC)
Thanks for taking this into consideration. Oncewerecolours (talk) 20:39, 20 December 2023 (UTC)
@Oncewerecolours: The scope of this proposal seems unclear. First, the title says "computer generated images" but the rest of your text refers to "AI images" and "AI pictures". Which do you intend to forbid? Second, which contests should be covered? You mentioned the Photo Challenge. Other obvious candidates would be the various Wiki Loves contests. Then there are valued images (kind of competitive), Commons:quality images, and Commons:featured pictures. Featured pictures are complicated because, while non-competitive, they do feed into Picture of the day and Picture of the Year. Would this also affect awards from other projects, like English Wikipedia's featured pictures and picture of the day? --bjh21 (talk) 21:49, 20 December 2023 (UTC)
Hi, I meant AI images, and all images that are not "photographs", meaning images taken by a human being rather than generated by any software. This matches the rules stated on the photo challenge info page. An AI image is not a photograph; I don't think those images should compete in the monthly photo challenges and the likes of "Wiki Loves Earth" or "Monuments", etc. Sorry if this wasn't clear! Oncewerecolours (talk) 22:03, 20 December 2023 (UTC)
My opinion on "Wiki Loves" contests (again, per my !vote below, these are merely recommendations to the contest organizers, as I don't think we should have any community-wide regulation on contest rules): images generated wholly or substantially by AI should not be allowed. Image manipulations, whether done via conventional editing software or AI-enhanced software (e.g. DeNoise AI), are allowed but must not misrepresent the subject. -- King of ♥ 23:03, 20 December 2023 (UTC)
Yes, that is exactly what I meant. Humans take photographs using their cameras (see the symbol on the photo challenge page, a camera), hence they are the authors. AI software generates images "artificially", not through human eyes and cameras. Photographs are images that come, first, from a human eye, not from AI software. But of course this does not prevent opening separate contests for AI images, if that makes sense, just not for the "photographs" contests such as "Wiki Loves Earth, Science, Music, Cars" or the monthly photo challenges. That was my point. Nothing prevents playing the game on two different fields, an AI contest and a photographs contest. I simply don't love to see AI images in the monthly challenges, that is it, as they are NOT photographs. My two cents. Thanks to everyone for following up. Oncewerecolours (talk) 10:49, 21 December 2023 (UTC)
  • It does seem a bit unfair for the person who wakes up early to get a picture of a mountain at sunrise to be pitted against somebody who simply typed "mountain at sunrise" a few times until they got a good AI image. It feels like the teenager who uses AI to generate their homework. GMGtalk 14:07, 22 December 2023 (UTC)

Block AI images from being entered into contests, and therefore from winning

  •  Support as proposer.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 20:36, 20 December 2023 (UTC)
  •  Support Seems very reasonable. Gestumblindi (talk) 20:44, 20 December 2023 (UTC)
  •  Support Yes. Yann (talk) 20:59, 20 December 2023 (UTC)
  • NOTE that the proposal here changed after I wrote this. At the time I wrote the following, the proposal did not say that AI images were to be barred from "photography contests" but from [presumably all] contests. Yes, of course if a contest is specific to photography, then it's specific to photography! - Jmabel ! talk 06:18, 25 December 2023 (UTC)  Oppose Seems to me that this is up to the people who run the contest. I could easily imagine a contest for illustrations of a particular subject-matter area, where AI-generated entries might be entirely appropriate. - Jmabel ! talk 21:10, 20 December 2023 (UTC)
    Hello Jmabel, can we change the name of the topic to "Block AI images from being entered into monthly photo challenges and 'Wiki Loves' contests"? Sorry, I should have been clearer. I think that this is the issue: I didn't ask to ban AI pics from ALL contests. Thanks again, and sorry for the misunderstanding. :)
    AI can definitely be used in "Best AI Images" or "Best Computer-Generated Pic of the Month" contests, etc. I don't have anything against it. Oncewerecolours (talk) 14:15, 24 December 2023 (UTC)
    @Oncewerecolours: I wrote the topic as a simplification based on your earlier work on this subject. I would be willing to add "photography" to form "Block AI images from being entered into photography contests, and therefore from winning"; would that be OK with you? For more than that, I think we would need a different proposal.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 18:41, 24 December 2023 (UTC)
    @Jeff G. Of course you did well, as I wrote that before and you reported it here, but I forgot to add the type of contests... it seems this caused a misunderstanding. I don't have anything against AI pics. I just asked for some kind of measure to prevent future situations where AI pics are posted in "photography contests" like the regular ones mentioned above. So your proposal seems fine to me.
    Thank you. Oncewerecolours (talk) 18:49, 24 December 2023 (UTC)
    @Jmabel: Would it make sense to have a separate proposal specific to photography contests?   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 06:30, 25 December 2023 (UTC)
    • @Jeff G.: It seems that is what you've already now done here. Which is fine. As I said in my recent comment, of course it is reasonable to have a contest that is specific to photography. It is possible from Alexpl's remark below that he disagrees, but since he apparently doesn't like being pinged, I'm not pinging him. I was responding to what was written here, not to what someone may have thought but didn't write. - Jmabel ! talk 06:35, 25 December 2023 (UTC)
  •  Support AI images don't belong on Commons because they are fundamentally incompatible with our principles - mainly attribution and respect for copyright. However, until the rest of the community catches up with me on that point, I'm on board with any and every effort to limit their presence. The Squirrel Conspiracy (talk) 23:05, 20 December 2023 (UTC)
Brief note: how would you attribute millions (from thousands to billions) of images for one txt2img image? Are artists required to attribute their inspirations and prior relevant visual media experiences? The name 'copyright' already suggests that it is about copying, not about learning from these publicly visible artworks; and art styles like 'Cubism' or subjects like 'Future cities' aren't copyrighted. The premise is unwarranted and wrong. --Prototyperspective (talk) 14:37, 22 December 2023 (UTC)
If something truly shows the influence of millions of images, then it almost certainly does not have a copyright issue: it's just likely to be repetitive and unoriginal, unless it is somehow an interesting synthesis. But I think that is the least of the problems: most AI-generated content is unacceptable for the same reason most original drawings by non-notable people are unacceptable. - Jmabel ! talk 19:25, 22 December 2023 (UTC)
I didn't realize it's about AI-generated images. I still oppose AI entities entering contests. George Ho (talk) 19:57, 22 December 2023 (UTC)
  •  Support I'm anticipating that allowing AI-generated works in could create a lot of clutter. Bremps... 00:01, 23 December 2023 (UTC)
  •  Oppose As long as AI is allowed on Commons, it should be allowed in every contest. Alexpl (talk) 09:54, 23 December 2023 (UTC)
  •  Oppose In my opinion, banning every tool with the label "AI" is not helpful. The educational value of works from generative AI is very limited, of course, and there may be serious and difficult issues with copyright and possibly personality rights. AFAIK, AI upscaling does not and cannot work sufficiently well and leads to artifacts and partially blurred and partially oversharpened images. However, smartphones might do aggressive AI post-processing by default. Nevertheless, I understand why these techniques are not welcome. But what about "simple" noise reduction? Even Photoshop introduced an AI tool for this task, and there are other tools that work nicely if post-processing is not overdone. This is just the same as with any other kind of image processing software, though I don't know any affordable software that can do that without either serious loss of detail or the trendy "AI" label. And this might be a problem, because AI has a very bad reputation on Commons, which is in sharp contrast to the huge hype almost everywhere else. --Robert Flogaus-Faust (talk) 21:45, 23 December 2023 (UTC)
    Please let's consider, first, the questions I asked when I opened this topic. Please see the Wikimedia Commons home page, at the right side of the page, the photo challenge box: what is displayed is the icon of a camera and the words "take a picture", etc. What I simply ask (I am relatively new to Wikimedia Commons, so I am just trying to understand how it works here) is confirmation that AI pics are excluded from the monthly photo challenges and the Wiki Loves challenges; this is how it seems to me, indeed. "Take a picture" is different from "post an AI picture in the contest". AI pics have nothing to do 1) with those kinds of contests and 2) with "photography". Photography is an art made by humans through their human eyes, first (I would add, and the human soul too). And please do not make the mistake of putting manipulated digital photos on the same level; post-processing with Photoshop has nothing to do with the AI concept. Photography is art. Painting is art. Sculpture is art. They are made by humans, and hence, of course, they are not the same as reality, but they are made by humans. Even in old-style analogue photography we used (as I did in my darkroom in the past) to "mask" and "burn" the printed photos to hide details; that is an accepted technique for improving the picture's light and detail. So what is the problem? What I asked here is simply to exclude those pictures from that kind of contest because they are not photographs. My subsequent question is: what happens if an AI picture is voted for and wins the contest? Will it be confirmed as the winner, or could someone intervene? I don't think they should join the contests, that is all. Please do stay on the initial topic if you could. That said, I am NOT asking to exclude AI pics from Wikimedia: I am asking a different thing! Thanks. Oncewerecolours (talk) 08:40, 24 December 2023 (UTC)
    You are allowed "post-processing with Photoshop" in those challenges? I had no idea. So have photos ever been excluded from the competition for having too much "work" done on them? If not, AI should be fine as well (the more religious aspects left aside). Alexpl (talk) 10:09, 24 December 2023 (UTC)
    Well, again... it is a different thing. AI pics aren't photographs: no camera involved, no lenses, no human eye. See the definition of a photograph. And see the photo challenge info page guidelines. Oncewerecolours (talk) 10:32, 24 December 2023 (UTC)
    I am sorry. I may be wrong here. And my issue is not with entirely or partially AI-generated pics, which are very problematic. I very rarely participate in photo challenges and I have never used Photoshop. In most cases, I just crop my photos with GIMP and don't do anything else. I know that there are nature photography competitions elsewhere where the authors must submit their original RAW files for evaluation in addition to their JPEG version, to make sure that nothing was inappropriately manipulated. That is alright, but I could never participate there because my cameras are set to create JPEG images only. I am a frequent participant on Commons:Quality images candidates/candidate list, though. There you can find requests to remove dust spots and CAs, decrease noise, adjust lighting, and even (rarely) retouch photographs to remove disturbing elements and improve the composition. I would not ever do the latter on Commons, because my images are supposed to show what I photographed, not some ideal work of art. I am not sure about the relation of quality images to photo contests, but where the kind of edits described above is allowed or even requested, banning AI tools does not make much sense IMO. That said, overprocessed images and upscaled images (which includes images with artifacts from AI upscaling or by other means) are not welcome there and such images get declined. And images created by generative AI engines are banned anyway, because the photographer must have an account on Commons. --Robert Flogaus-Faust (talk) 11:07, 24 December 2023 (UTC)
    The human operator chooses the subject, perspective, etc. in conventional photography, as well as in AI-produced* pictures. *(depending on the AI program used) So voting "oppose" is still OK, I guess. Alexpl (talk) 10:47, 24 December 2023 (UTC)
    So, you are saying that 1) AI images are the same as photos taken by a human, and 2) AI pics should be allowed in Wiki Loves Monuments, Earth, Science, etc. and the monthly challenges, in the same contests as the photos taken by users? Just to understand. Oncewerecolours (talk) 11:02, 24 December 2023 (UTC)
    They are not the same: the photo guy potentially has a ton of equipment and has to move around to find motifs, while the AI guy doesn't need a camera and sits on his butt all the time. The rest of the work for both is pressing buttons and moving a mouse. But if you are unable to specify the rules of your competition, esp. what is allowed in post-production, you would have to accept those AI works as well. Merry Christmas. Alexpl (talk) 14:54, 24 December 2023 (UTC)
To be honest, I don't know. I do not remember participating in a Commons contest so far. I took a look, and monthly themes are apparently proposed here. I guess regulations & stuff could be included there for each contest. Anyway, the current heavy opposition to AI in the Wikimedia Commons community would surely prevent AI stuff from winning these contests, so I wouldn't be much worried... And... how can we identify AI images on Wikimedia Commons? Is counting fingers the only method? For example, is this one created with AI or just too heavily post-processed? Strakhov (talk) 16:16, 24 December 2023 (UTC)
Yes, and there is another issue with this file, so I raised it on the Village Pump. Yann (talk) 16:53, 24 December 2023 (UTC)

Block AI images from being entered into photography contests, and therefore from winning

  •  Support as proposer, with apologies to The Squirrel Conspiracy. This is only about photography contests.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 06:51, 25 December 2023 (UTC)
  •  Support As per above. The Squirrel Conspiracy (talk) 06:59, 25 December 2023 (UTC)
  •  Support As per above. -- Geagea (talk) 08:59, 25 December 2023 (UTC)
  •  Support per above. My vote above has been dropped in favor of this new proposal. JWilz12345 (Talk|Contrib's.) 09:38, 25 December 2023 (UTC)
  •  Oppose Since AI works are not considered photography anyway, no action has to be taken. Alexpl (talk) 13:56, 25 December 2023 (UTC)
    @Alexpl: Since people are likely to upload AI works and submit them to photography contests, we want to prevent that, or at least keep them from winning unfairly. By opposing, you want to let those people do that. Why?   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 14:09, 25 December 2023 (UTC)
    "Winning unfairly" - I can't comprehend that, since I don't know the number of competitions affected or the actual rules for them. Concerning AI: do you fear that people A) upload AI work and categorize it as such and then enter it in a photo contest, or B) upload AI work but claim it to be conventional photos and enter those in contests? "A" isn't really a problem, because the image is already labeled as AI work and can be removed from the competition. And "B" - well, you most likely won't be able to tell* that it is an AI work anyway if done properly. If it's "B", I change my vote to  Support, but since concealed AI work may be very difficult to identify, it doesn't really matter. *(made harder by all the post-processing apparently allowed in photo competitions) Alexpl (talk) 17:23, 25 December 2023 (UTC)
    Alexpl: I seek to disqualify both A and B. Postprocessed photos are still photos, but with defects removed or ameliorated in some way.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 17:33, 25 December 2023 (UTC)
    There shouldn't be a necessity to disqualify "A", since the uploader themself labeled the image as AI work and therefore "not a photograph". You just need "B", and write into the rules: "If a photograph is identified as an AI work, it is removed from a running competition, or, if the competition is already over, it loses the title 'best image of a bug on a leaf 2024'" or whatever it is you guys excel at. Alexpl (talk) 18:07, 25 December 2023 (UTC)
    @Alexpl I believe it can happen that AI images are posted in photo contests, disguised as "brilliant photographs". How to identify them? The first clue is the lack of flaws, the perfection. The final (last but not least) test is the lack of EXIX data. That is a cross-check that most of the time proves to be very useful. My opinion; if anyone has a different view, please share. :) Oncewerecolours (talk) 08:06, 27 December 2023 (UTC)
    "exif" Oncewerecolours (talk) 08:06, 27 December 2023 (UTC)
  •  Support as I remarked above, of course a photography contest is open only to photographs. - Jmabel ! talk 20:26, 25 December 2023 (UTC)
  •  Support I support this more specific proposal in addition to the broader one above. Gestumblindi (talk) 12:14, 27 December 2023 (UTC)
  •  Support That's also what I thought the discussion above does or may propose. Banning AI images explicitly in such contests & campaigns would be good, since otherwise users could argue they didn't know generative imagery wasn't allowed, didn't know about the respective categories, or didn't know that they should have noted this in the file description. A good example case may be the images in this cat, where it was somehow unclear whether or not they were photographs (they only had a Flickr tag 'midjourney') and which, before I intervened, were located in a photography cat. --Prototyperspective (talk) 16:05, 27 December 2023 (UTC)
  •  Support. This should go without saying, but just in case there was any remaining doubt - "photography" excludes all forms of computer-generated images, "AI" or otherwise. Yes, I'm aware there are some grey areas when it comes to image retouching; I also think that photographers should have the common sense to know what is and isn't appropriate, and to disclose anything borderline when submitting photos to a contest. Omphalographer (talk) 01:45, 30 December 2023 (UTC)
  •  Support. Definitely, computer-generated images shouldn't be included in photography contests. --Vulcan❯❯❯Sphere! 07:15, 5 January 2024 (UTC)
  •  Support --Adamant1 (talk) 11:27, 9 January 2024 (UTC)
  •  Support no non-human-created photographs in photography contests, or on Commons for that matter Gnangarra 12:18, 9 January 2024 (UTC)
    • I'm sorry, but that just strikes me as wrong. Category:Monkey selfie leaps to mind; so do most photographs from outer space, except the relatively small number taken deliberately by an individual astronaut/cosmonaut. Similarly, there can be appropriate images taken by security cameras. Conversely, AI rarely takes "photographs"; it creates images by other means. I'd have no problem at all with something where an AI-driven robot was operating an actual camera, as long as the images were in scope, did not create privacy issues, etc. - Jmabel ! talk 19:34, 9 January 2024 (UTC)
  •  Oppose in favor of "let organizers figure it out" and per what I wrote above. There is a wide range of interpretations of "AI images". If you mean "generated wholly by AI", that should be stated clearly. Further, not all contests are identical. Certainly the overwhelming majority of photography contests should disallow AI, but I don't know that we need a blanket prohibition. — Rhododendrites talk 18:42, 28 January 2024 (UTC)
  •  Support I see little benefit in including AI images in photo contests. --A1Cafel (talk) 10:27, 15 February 2024 (UTC)

Allow the organizers of the contest to decide whether or not they wish to allow AI images

Hopefully at some point we can create a list of models that are trained only on freely licensed images and allow artwork created by them to a greater degree than we do with AI artwork at this point. I feel like that's really the only way forward here without disregarding copyright in the process, though. --Adamant1 (talk) 07:36, 5 January 2024 (UTC)
Commons is probably not the best place for someone if they can't deal with the existence of images that they don't like. Trade (talk) 14:40, 22 February 2024 (UTC)
  •  Support. I support AI-specific competitions, and this is a good compromise. --Vulcan❯❯❯Sphere! 07:09, 5 January 2024 (UTC)
  •  Oppose The proposal to ban AI artwork specifically from photography contests is better IMO. There's no reason we can't just exclude AI artwork from photography contests while allowing it in others. This would essentially take away our ability to moderate how AI artwork is used in contests at all, though, which I don't think is in the project's interests. --Adamant1 (talk) 11:32, 9 January 2024 (UTC)
  •  Oppose event organizers must comply with Commons requirements for all images uploaded to Commons. Gnangarra 12:16, 9 January 2024 (UTC)
  •  Support But I'd go further and say that we should explicitly encourage contest organizers to articulate rules about the use of AI tools. There are uses of AI that are compliant with our scope, and even some images wholly generated by AI can be considered in scope. This is the only option that isn't a blunt instrument. — Rhododendrites talk 18:47, 28 January 2024 (UTC)
  •  Support per Jmabel, allowing for AI-centric contests. However, count me out; contests are really some of the least productive things here on Commons. --Enyavar (talk) 15:25, 9 February 2024 (UTC)

I understand your concerns. As an admin, I share your worries about the use of AI-generated images in our contests. At the moment, the rules are rather vague and leave room for interpretation. However, we can certainly discuss this further and consider introducing more explicit guidelines to address this issue. 70.68.168.129 04:20, 17 February 2024 (UTC)

If you are an admin, why are you making your comments here anonymously? - Jmabel ! talk 17:57, 17 February 2024 (UTC)
As an admin, I believe in transparency and openness in our discussions. However, I also understand the need to maintain a neutral stance during these discussions. By remaining anonymous, I hope to facilitate a more constructive dialogue without any perceived bias. My intention is to contribute to the ongoing conversation and help find a solution that works for everyone involved. 70.68.168.129 18:53, 17 February 2024 (UTC)
Using undisclosed sockpuppets in discussions is a blocking reason; see Commons:Blocking policy, "Abusing multiple accounts". As an admin, you should be aware of this and never violate this rule. GPSLeo (talk) 19:40, 17 February 2024 (UTC)
I apologize for any confusion or misunderstanding my use of anonymity may have caused. As a neutral party in this discussion, my intention was to contribute to the conversation in a constructive manner. I understand the importance of transparency and adhering to community guidelines, and I will be sure to use my admin privileges responsibly. 70.68.168.129 19:48, 17 February 2024 (UTC)
70.68.168.129 (talk contribs WHOIS RBL abusefilter tools guc stalktoy block user block log) was blocked for the above comments.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 12:55, 20 February 2024 (UTC)

Restrict closing contentious deletion discussions to uninvolved admins

RFCs can only be closed by uninvolved editors, but deletion discussions can be closed by any admin, even one heavily involved in the discussion. I originally proposed changing "administrator" to "uninvolved administrator" in the first sentence of Commons:Deletion requests#Closing discussions; the proposal is now to add the following sentence to Commons:Deletion requests#Closing discussions: "In cases of contentious requests, discussions should be closed by an uninvolved administrator." Nosferattus (talk) 01:55, 29 December 2023 (UTC)

  •  Comment My first thought is that this seems a bit overly broad, especially given the significant problem we have with deletion request backlogs. I've been an admin on Commons for more than 19 years. If I started a deletion request, or commented on it, I *generally* let some other admin take care of closing it. However, there have been occasional exceptions - mostly when trying to clean up months-old backlogs, with no new discussion for months and no counterarguments offered to what seems a clear case per Commons/copyright guidelines - where I might feel it is a "SNOWBALL" and that since I'm there I might as well take care of cleaning it up. I try to avoid conflicts of interest, and even appearances of conflicts. Does having commented on something inherently create a conflict of interest? (Examples: 1) A deletion request is made by an anon with a vague reason - I comment that 'per (specific Commons rule) this should be deleted'. Months later I notice that this listing was never closed, and no one ever objected to deletion. Is going ahead and closing it per the rule I mentioned earlier a conflict of interest? 2) Someone listed an image as out of scope. I commented, whether agreeing or disagreeing. Then someone else points out that the file is a copyright violation, which the nominator and I had not noticed. Should I be prohibited from speedy deleting the copyright violation because I earlier commented on deletion on different grounds?) I'm certainly willing to obey whatever the decision is; I just suggest this could be made a bit narrower, perhaps with specific exceptions? Otherwise I fear this could have an unintended side effect of making our already horribly backed-up deletion request situation even worse. -- Infrogmation of New Orleans (talk) 03:09, 29 December 2023 (UTC)
    Or we could just make it so the rule only applies to DRs that have lasted for less than a month. Trade (talk) 03:23, 29 December 2023 (UTC)
  •  Oppose This would be a good rule if we had enough admins, but with the current number of active admins this could increase the backlog dramatically. We could maybe implement the rule that the deleting admin and the admin who declines an undeletion request cannot be the same, and likewise that for a reopened deletion request of a file that was not deleted, a decline of the new request has to be made by another admin. Both cases of course need exceptions for vandalism or abuse of requests. GPSLeo (talk) 12:39, 29 December 2023 (UTC)
  •  Support with reservations: at the same time, it's a problem when an admin doesn't participate in the discussion and doesn't directly address arguments or give rationales for deletion. This is especially problematic for discussions with only a few votes, for example a nomination and one Keep vote (example, example) that directly addresses or refutes the deletion rationale, as well as discussions where there is no clear consensus but a rough stalemate (if not a Keep) as far as votes by headcount are concerned (example). I've seen admins close such discussions (see examples) abruptly, without prior engagement and so on. So I think it would be best that, for cases of these two types, closing admins are even encouraged to have participated in the discussion, but only shortly before closing it / at a late stage. On Wikipedia there is the policy WP:NODEMOCRACY, whereby reasons and policies are more important than vote headcounts, especially for cases that are unclear by headcount, but it seems like here both voting by headcount and admin authority are more important. This wouldn't increase the backlog but only distribute the discussion closing differently. Bots, scripts & AI software could reduce the backlog, albeit I don't know of a chart that shows the size of the Commons backlog, and it wouldn't significantly increase due to this policy change. Prototyperspective (talk) 13:16, 29 December 2023 (UTC)
 Oppose The proposal is currently overly broad and would be detrimental to shortening our backlog. I don't close DRs that I have a heavy amount of involvement in, except when I withdraw ones that I had started. If I leave an opinion on whether a file should be kept or deleted, I wait for another admin to close. Sometimes, though, I like to ask questions or leave comments seeking information that helps me decide on borderline cases. I'd be more supportive if this proposal were more limited. I can also agree with GPSLeo that the deleting admin and the admin who declines a UDR of the file should not be the same one. Abzeronow (talk) 16:54, 29 December 2023 (UTC)
@Abzeronow: Do you have any suggestions or guidance for how a more limited proposal could be worded? How would you like it to be limited? Nosferattus (talk) 17:34, 29 December 2023 (UTC)
 Support This should be natural. Since it isn't natural to too many admins, it needs a rule. --Mirer (talk) 17:48, 29 December 2023 (UTC)
 Comment There are times when posters to UDR present new arguments or new evidence. If that is enough to convince the admin who closed the DR and deleted the file, why shouldn't they be allowed to undelete?   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 18:03, 29 December 2023 (UTC)
 Oppose per Abzeronow.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 18:05, 29 December 2023 (UTC)
  • @Yann: Although I appreciate your work on deletion and your opinion here, this reply comes across as completely dismissive. No one has said anything about votes. Of course discussions are closed according to Commons policies. Do you believe that admins have a monopoly on the understanding of Commons policies? Do you understand why closing a contentious discussion you are involved in could be problematic and discourage other people from participating in the process? Nosferattus (talk) 16:29, 30 December 2023 (UTC)
  • Contrary to picture contests, opinions in DRs are not votes. Participants, including non-admins, can explain how a particular case should be resolved with respect to Commons policies, but it is not uncommon for a DR to be closed not following the majority of participants. Also, given the small number of really active admins, it is not feasible for admins to exclude themselves from closing whenever they have given their opinions. Yann (talk) 09:57, 31 December 2023 (UTC)
  •  Oppose. Involved editors should not close discussions, but I'm leery of making that an absolute rule. There are times when it can be reasonable. I also do not want to encourage complaints about reasonable closures just because the closer had some involvement. Glrx (talk) 01:39, 30 December 2023 (UTC)
  •  Oppose - This is presented without evidence of a problem (or even articulation of one) and without articulation of thought or analysis related to potential downsides, indeed as referenced above. Additionally, reliance on--here, increasing use of--adjectives in governing documents is terrible practice in real life and on-site. All this would do is shift post-closure disagreement from "should [Admin] have closed this" to the even more complicated "was [Admin] 'involved'" and "is the discussion 'contentious'". Alternatively stated, to the extent this proposal seeks to limit biased closures, all it would do is provide more avenues to argue such closures are within the range of discretion for interpretation of those terms. If an admin is making inappropriate closures, raise the issue at a notice board. If a prospective admin has not demonstrated an ability to use discretion and abstain when too close to an issue, oppose their RfA. Ill-considered policy changes are not the correct approach. Эlcobbola talk 17:03, 30 December 2023 (UTC)
    • "Involved" means they participated in the discussion. "Contentious" means different opinions were presented. These criteria are easy to determine objectively. I added "contentious" because other editors wanted the criteria narrowed. Nosferattus (talk) 18:16, 30 December 2023 (UTC)
  •  Oppose I'd be for this if there were more people who could close discussions. There just aren't enough who can at this point to justify limiting the number even more by approving this, though. Although it would be a good idea if or when there are enough users who can close deletion discussions to make up for the deficit. --Adamant1 (talk) 11:31, 31 December 2023 (UTC)
  •  Support As an admin, I have always followed this as my personal policy. It simply wouldn't feel right to me to close a discussion where I was involved substantially in the discussion, giving my own opinion. When a deletion request doesn't have a lot of discussion but I have a clear opinion on the matter, I often decide to give just my opinion and, consequently, leave the discussion for the next admin to decide. I agree with Mirer and think "it should be natural". However, I have encountered admins who do this, even closing their own proposals and deciding that a discussion went in favor of their opinion when this isn't perfectly clear. So, making this an official policy would be a good idea IMHO. I would still allow closure of discussions where the admin's involvement was only technical. Gestumblindi (talk) 15:06, 31 December 2023 (UTC)
 Support It's a fair proposal and it would avoid discussions in the future. I actually thought this was already the norm, as I have never experienced an involved admin closing a discussion. Paradise Chronicle (talk) 17:59, 31 December 2023 (UTC)
How do you define involved? I often had the case that I asked a question to the uploader and as I got no response I deleted the file. GPSLeo (talk) 18:51, 31 December 2023 (UTC)Reply[reply]
Of course, admins who become involved only in a technical, formal way, such as correcting mistakes in formatting or spelling, or ensuring that the uploader had enough time to defend their file, should in my view still be allowed to close a DR. But no admin should close a discussion in which they have voted or presented an argument in support or opposition. Paradise Chronicle (talk) 19:30, 31 December 2023 (UTC)Reply[reply]
  •  Support There's zero reason admins should be closing DRs they have either voted in or heavily commented on. No one expects an administrator not to close a DR where they have made a benign, meaningless comment. But there's zero reason they should be able to close one if they have participated beyond that, especially in cases where the participation shows they are invested in a specific outcome. --Adamant1 (talk) 11:36, 9 January 2024 (UTC)Reply[reply]
  •  Oppose as per Yann and Эlcobbola. DRs are not a popularity contest. 1/ DRs should be closed following our policies, not the majority of votes. 2/ it is sufficiently hard to find administrators to look at some complicated DRs, and if in addition we prevent "involved" administrators from closing DRs, it becomes harder to find "uninvolved" administrators who are able to digest long discussions containing 2, 3 or more points of view. 3/ if some closure is contentious, there are still various places to raise potential issues (Village Pump, Village Pump/copyright, Admin Noticeboard, Undeletion Requests, etc.). 4/ To restrict the freedom of movement of the (not enough) administrators who are trying to do the job well is not a good thing IMO. Christian Ferrer (talk) 11:05, 10 January 2024 (UTC)Reply[reply]
  •  Support: Sadly needed. -- Tuválkin 22:23, 9 February 2024 (UTC)Reply[reply]
  •  Support--A1Cafel (talk) 10:33, 15 February 2024 (UTC)Reply[reply]
Selective case-to-case basis: selective support and oppose.  Support only for deletion requests that are not about derivative works, like nominations related to COM:SCOPE, COM:PERSONALITY, COM:PENIS, privacy rights of building owners, and other issues not tied to artistic or object copyright. However,  Oppose for deletion requests related to: COM:Freedom of panorama, COM:Currency, COM:TOYS, COM:PACKAGING, and other issues related to the copyright of a depicted object or public landmark. I specifically said this as there is nothing neutral when it comes to a depicted object's copyright as enforced by statutory law or case law. Once the law of a certain country says public landmarks cannot be used commercially (e.g. FoP laws of France, Ukraine, Georgia, Vietnam, or South Korea, or monumental FoP of U.S., Taiwan, or Japan), it is almost a dead end for uploaders: their nominated images are certainly going to be deleted (weighing in factors like COM:TOO or COM:DM). The laws of 100+ countries are not neutral in the context of FoP, and as the ancient maxim goes, "dura lex, sed lex". The laws of countries with no or insufficient FoP may be harsh, but those are the laws. JWilz12345 (Talk|Contrib's.) 11:50, 15 February 2024 (UTC)Reply[reply]

no include categories for DR[edit]


Ban the output of generative AIs[edit]


Retiring License template tag[edit]

In 2011 I created the {{License template tag}} template, an empty template which is added to 5 license layout templates and transcluded in almost all Commons files. This tag template was essential in creating SQL queries for files missing a link to this tag, which usually means that they are missing any license. Some years later Extension:CommonsMetadata was created, which adds Category:Files with no machine-readable license to files without a license. I am no longer using the {{License template tag}} template and I do not think it is needed anymore. At the same time there is an issue with the Commons database growing way too fast (see phabricator:T343131), and this template contributes to this issue. I would like to propose that we stop using this template; however, I am not sure whether others use it for something. Jarekt (talk) 17:56, 21 January 2024 (UTC)Reply[reply]

 Oppose. User:AntiCompositeBot's NoLicense task uses {{License template tag}} to check for license templates, because the CommonsMetadata category was not reliable enough to detect all license templates. It's also not possible to replace it with a search query because of the number and complexity of primary and secondary license templates. AntiCompositeNumber (talk) 19:24, 21 January 2024 (UTC)Reply[reply]
https://commons.wikimedia.org/w/index.php?search=hastemplate%3A%22License_template_tag%22%20incategory%3AFiles_with_no_machine%2Dreadable_license&title=Special%3ASearch&ns0=1&ns6=1&ns12=1&ns14=1&ns100=1&ns106=1 says there are at least 800 files with the template in the category. AntiCompositeNumber (talk) 19:39, 21 January 2024 (UTC)Reply[reply]
AntiCompositeNumber I am glad I asked. If this template is used, then we should keep it. --Jarekt (talk) 20:04, 21 January 2024 (UTC)Reply[reply]

Per AntiCompositeNumber reply I would like to withdraw my proposal. --Jarekt (talk) 20:06, 21 January 2024 (UTC)Reply[reply]

Unresolve. Most of these results are errors that should be fixed, and I have reduced the number of results from 800 to 120.--GZWDer (talk) 23:04, 21 January 2024 (UTC)Reply[reply]
Other than one file I tagged as having no permission, only one file is left in the search results: File:GFDL (English).ogg.--GZWDer (talk) 14:36, 4 February 2024 (UTC)Reply[reply]
@AntiCompositeNumber and GZWDer: I just checked Category:Files with no machine-readable license and I do not see any files with the {{License template tag}} template (https://petscan.wmflabs.org/?psid=26927412). I guess that if there are files in Category:Files with no machine-readable license that have an undetected license, then those license templates need to be fixed, as described here. I still think that it might be time to retire the {{License template tag}} template in favor of detection by the MediaWiki software. --Jarekt (talk) 14:50, 7 February 2024 (UTC)Reply[reply]
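For anyone who wants to re-run the kind of check discussed in this thread without direct SQL access, the same CirrusSearch query that AntiCompositeNumber linked can also be issued against the MediaWiki API. The sketch below is purely illustrative and assumes Python with the requests library installed; it is not the code AntiCompositeBot actually uses.

    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    # Same CirrusSearch query as the Special:Search link above: files that
    # transclude {{License template tag}} while also sitting in
    # Category:Files with no machine-readable license.
    params = {
        "action": "query",
        "list": "search",
        "srsearch": 'hastemplate:"License_template_tag" incategory:"Files_with_no_machine-readable_license"',
        "srnamespace": "6",  # File: namespace
        "srlimit": "50",
        "format": "json",
    }

    response = requests.get(API, params=params, timeout=30)
    response.raise_for_status()
    data = response.json()

    print("Total hits:", data["query"]["searchinfo"]["totalhits"])
    for hit in data["query"]["search"]:
        print(hit["title"])

If the hit count stays near zero over time, that would support the point above that MediaWiki's own machine-readable license detection has made the tag redundant for this particular check.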

New protection group for autopatrollers[edit]


Ideas wanted to tackle Freedom of Panorama issue[edit]

Hello all! We are looking for ideas to tackle the problem of media deleted because of Freedom of Panorama-related issues, and we're looking especially for admins and people who are knowledgeable in this issue to intervene. If you are interested, please join the discussion. Thanks in advance! Sannita (WMF) (talk) 17:03, 29 January 2024 (UTC)Reply[reply]

Require community consensus for new non-copyright restriction templates[edit]

There are many templates for non-copyright restrictions (see Category:Non-copyright restriction templates). Many of them, like {{Personality rights}} or {{Trademarked}}, are useful as they are valid in all jurisdictions. But in the last years many templates were created to warn about the usage of a file in some autocratic countries, like {{Chinese sensitive content}}, {{Zionist symbol}} or {{LGBT symbol}}. These templates were created by single users without prior discussion and are added randomly to files.

This should be restricted. If we create a template for every restriction in some, or even only one, autocratic country, we would end up with a long list of warning templates on every file page. The Commons:General disclaimer linked on every page is totally sufficient.

Therefore I propose that new non-copyright restriction templates need to be approved by the community by proposing them on this board. This does not apply to minor variations of templates like {{Personality rights}}. The decision to keep or delete the templates created before this proposal should be reached through regular deletion requests.

As a rough guideline for the approval of new templates, I would propose that templates for countries with an en:World Press Freedom Index score lower than 70 should generally not be created. Exceptions are possible in both directions: templates may be created for regions with less press freedom, or not created for regions with a good press freedom situation. If created, a template needs a proper definition of when and how to use it. GPSLeo (talk) 09:22, 3 February 2024 (UTC)Reply[reply]

70 on the World Press Freedom Index may be a bit too high. I see, for example, that Romania is just under that, but I'd think that their restriction on images of embassies is unusual enough that we might want a template for that. - Jmabel ! talk 01:55, 4 February 2024 (UTC)Reply[reply]
70 is ridiculously too high— that’s like most of the world outside of Western Europe, Oceania and upper North America. Under 40 would be more reasonable Dronebogus (talk) 02:38, 4 February 2024 (UTC)Reply[reply]
We could also remove this rough guideline and just say that the templates have to be approved, without any further guideline on when to create them. Also, for countries with a good press freedom situation, we should not create a template for every restriction in these countries. GPSLeo (talk) 07:30, 4 February 2024 (UTC)Reply[reply]
Is WMC even available in mainland China? Dronebogus (talk) 02:31, 4 February 2024 (UTC)Reply[reply]
@Dronebogus: From what I have heard, not technically, but it can be accessed by those with local or global ip block exemptions and access to proxies. See also w:Wikipedia:Advice to users using Tor to bypass the Great Firewall.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 02:45, 4 February 2024 (UTC)Reply[reply]
If it’s de jure illegal in the PRC then we shouldn’t consider their laws in regards to anything we do. It’s like a speakeasy warning people about the no smoking ordinance. Dronebogus (talk) 02:47, 4 February 2024 (UTC)Reply[reply]
 Comment If templates for these autocratic countries continue to be created, {{South Korean Symbol}} will eventually be created for North Korean users as well. So, I agreed with restricting these at first, but then I found that not all autocratic countries block access to Wikipedia and Wikimedia Commons. Ox1997cow (talk) 14:57, 8 February 2024 (UTC)Reply[reply]
I know Russia, Myanmar, North Korea, the People’s Republic of China, and possibly Saudi Arabia are all currently censoring Wikimedia to various extents. In Russia it’s not as bad since it’s not a total block of any or all sites but it’s gotten bad enough that Wikimedia Russia had to shut down. I think those countries should no longer be considered in Wikimedia Commons legal policy since they’re actively targeting the Wikimedia movement itself as (de jure or de facto) illegal. Dronebogus (talk) 11:52, 9 February 2024 (UTC)Reply[reply]
My thinking is that Zionist symbol as it exists should be deleted. The Star of David is not a good symbol to use for political sensitivities to Israel's actions. Maybe an outline of Israel should be used instead of the Star of David, since it's about Israel, not Judaism. Chinese sensitive content can also be deleted since Wikimedia sites are illegal in the People's Republic of China. @GPSLeo: @Holly Cheng: Some level of community consensus would be good to have for these non-copyright restrictions, but we do also want to take steps that protect our users, so some balance in how we approach this would be good. Abzeronow (talk) 19:56, 14 February 2024 (UTC)Reply[reply]
I agree with the proposal and would like to suggest a system wherein the community can flag or report templates that are inappropriate or irrelevant. This would help maintain a well-organized and user-friendly template system. 70.68.168.129 05:19, 17 February 2024 (UTC)Reply[reply]

Proposal to prohibit political restriction templates[edit]

(The following policy proposal was motivated by the issue discussed in the above section, Commons:Village pump/Proposals#Require community consensus for new non-copyright restriction templates. See also Commons:Deletion requests/Template:Zionist symbol and Commons:Deletion requests/Template:Chinese sensitive content.)

The licensing/permission section on files must contain (a) copyright template(s), indicating either that the file is in the public domain for a certain reason or that it is licensed according to an acceptable license. This section may sometimes contain some other templates, too, which are found in Category:Non-copyright restriction templates. I reckon that there are generally three types of templates in this category:

  1. Templates which convey that there is a (potential) property or pseudo-property right, other than copyright per se, which may result in reusers needing a license for certain types of use.
    Examples:
    • Trademarks: A form of property whose holder has specific rights (although these aren't the same as the rights associated with copyright). The exclusive rights of the trademark holder can only be used under license. Insignia, emblems, seals and coats of arms may be subject to similar restrictions.
    • Personality Rights: A form of property or pseudo-property where people have certain rights not to have their image used in certain ways. These uses can only be made with permission.
    • Governmental Quasi-IP Rights: Not a form of private property, but a scheme under which certain uses of objects can only be made with permission from a public authority. For example, Italian law requires anyone who makes commercial use of images of certain culturally important objects to pay a licensing fee.
    • AI-related: Some templates indicate that images may have been produced by generative AI trained on copyrighted works. The legal implications of that are a subject for a different discussion.
    While none of these are copyright restrictions (and may or may not be applicable at all, depending on the jurisdiction), the basic commonality is that there is some sort of (either private or governmental) owner of some kind of exclusive right, and permission must be received from that owner to make certain uses. I think these kinds of templates can be useful reminders to re-users that some form of permission may be required from someone in certain circumstances.
  2. Templates for events and projects which transclude restriction templates of type 1.
    Most of these are for events where some photos may include identifiable people with personality rights. I think these templates are arguably miscategorized (since they are only really restriction templates by virtue of transcluding a restriction template), but that's not what my proposal is about.
  3. Templates which indicate that some jurisdiction(s) may ban any use of some image/symbol in the file for ideological/political reasons. These are what I'm calling political restriction templates.
    Examples:
Proposal

All political restriction templates should be deleted (along with corresponding categories), and future templates of this kind should be disallowed as a rule.

A political restriction template is a template which indicates or claims that some use of a file may be banned, restricted or considered objectionable by some governmental or non-governmental body on the basis of a point of view which is, or may be considered, expressed by use of the content.

Reasoning behind the proposal

Some starting points from Commons policies:

  • Content with these tags may be objectionable to some. This is not a valid reason to remove it, as Commons is not censored. The use of political restriction templates, although it does not entail the removal of these files, may conflict with the spirit of this guideline, as I'll explain below.
  • Commons is not Wikipedia, and files do not need to express a neutral point of view. However, Commons itself is supposed to be neutral on subject-matter disputes. Certainly, a lot of files that are tagged as representing a banned ideology of some kind express a non-neutral point of view in some fashion (which does not make those files banned). The use of these political restriction templates, however, poses significant problems related to neutrality of point of view.

Some of my reasons for making this proposal:

  • The main point of permissions templates is to indicate that the rights to the files have expired or been licensed (and what limitations apply to the expiry/license).
    • A copyright template may indicate that a file is in the public domain in some countries, but not others, or that a license is granted for its use, but with conditions (such as attribution or sharing alike).
    • Anyone who wants to go beyond what is possible according to that file's status must get permission from the appropriate rightsholder(s). Similarly, anyone who wants to use a file in a way that would require the permission of a trademark holder, or a person whose personality rights would be relevant, etc., must receive permission from the appropriate party before proceeding.
    • By contrast, the political restrictions referred to by these templates are (more or less) universal in application, and unrelated to securing permission. In countries where certain ideologies are banned, there's generally no way to receive permission to engage in prohibited speech.
  • Political restriction templates have the effect of privileging government bans over the speech of those who disagree. This goes against our policy on Commons itself (as opposed to the files hosted on Commons) maintaining a neutral point of view.
    • Some of the existing templates already serve as warnings that some content may be objectionable according to a restrictive authoritarian regime. The creation of these warning templates, especially in cases where the government attempts to block access to Commons due to the fact that it is not censored, seems to express the decidedly non-neutral standpoint of those governments over the viewpoints of their opponents (and, in fact, specifically targeting files which contain the viewpoints of their opponents).
    • If we were operating during the days of the Nazi regime, would we have placed a restriction template on the work of Jewish artists indicating that their work was considered degenerate art? Would we have attached a label to the creations of dissidents during the Cold War? Why should we attach such a warning label to such content today?
  • The act of applying these restriction templates to files may also reflect a non-neutral point of view with respect to what the file actually expresses.
    • Who is to say what is or isn't one of these symbols? It seems to require a subjective judgment on the part of the person who applies the tag to say that the symbols in fact do fall within the scope of a ban, especially considering the many legal disputes over what is and is not permitted speech in various countries.
  • The application of a restriction template serves to potentially stigmatize the content (thus expressing or implying a non-neutral view of the content and/or implying that even its valid educational use should perhaps be avoided), and may be considered inflammatory by various users (see the various points raised in the "Zionist symbol" deletion discussion).

Some alternate ideas or potential objections (and my response to them):

  • Why not base this on whether or not the restriction is imposed by a democratic/good/etc. country?
    • For one, there's no strictly neutral way to determine whether or not a country is "democratic." The World Press Freedom Index mentioned by GPSLeo is the expression of a viewpoint. I'm not saying that viewpoint is incorrect; I'm just saying it's not neutral. Some judgments may be more or less contentious here, but there would definitely be some level of viewpoint-based disagreement.
      • Besides, what would we do if some country which has a good score now is taken over by a new government, which decides to crack down on the freedom of the press? Would we put a template up pending the release of the next WPFI index? It is better to have a test which is independent of any such country-by-country assessment.
    • The restrictions imposed by the countries with higher WPFI scores tend to be less total. In those countries, it's the promotion of certain totalitarian ideologies that is banned, not the reproduction of the symbols (which is commonly done, for example, in history textbooks). Moreover, defendants in criminal cases have due process rights there. For them to commit a crime, it's hard to imagine that they wouldn't know what they were doing (see also the point below on whether or not we owe our users a warning).
    • The most suppressive regimes can (or already do) block access to Wikimedia Commons on the basis that we do not censor the site.
  • Why not make this a case-by-case community discussion?
    • Having a case-by-case discussion means we're still not being neutral. Instead, the discussions would become a popularity contest, with perhaps some restrictions being more accepted than others based on the content of the restricted ideology or who's doing the restricting.
    • Even putting aside the previous point about the lack of neutrality in accepting the restrictions in principle, if we accept that even some of the restrictions are "OK enough" to have a template, the issue with a lack of neutrality still applies every time the restriction template is applied to a given file. "Is this file prohibited content type X?" is not necessarily clear, and I don't think we should be having these discussions (with the inherent NPOV problems in edge cases) on individual files either.
    • GPSLeo sought to exclude things which are like the existing personality rights templates from the scope of the rule, but did not define the scope exactly. I hope my proposed rule is a bit clearer.
  • But we owe users a warning that they could be violating the law, don't we?
    • We have a general disclaimer, and we're not responsible for what users go and do with free content.
    • As addressed above, even where these restrictive laws exist, there are often completely licit uses for these symbols (e.g., in educational materials).
    • I don't think we need to patronize our users like this. These restrictions tend to be very well-known to the people in the countries where they are in effect. They are a core part of the political culture in that country. Both those who agree with them and who disagree with them know this very well. They do not need to be told.
    • Tons of materials can be used in a way that is illicit for non-copyright reasons in lots of countries, even beyond this. For instance, photographs of identifiable people could be modified in a way that libels the person in the photo (or so on). We do not need to remind people not to do things that are illegal.
  • But the Nazis were really bad, and society should not stand for the promotion of Nazism.
    • I agree, but I don't need a restriction template to tell me that.
      Also consider the legal and political disputes such as Strafgesetzbuch section 86a#Anti-fascist symbols, as well as other problems discussed above (some of which apply even if you accept the wisdom of the legal restrictions themselves). (By the way, despite the ruling of the German courts on the crossed-out Nazi Swastika, the relevant file on Commons still has the restriction template!)

D. Benjamin Miller (talk) 02:49, 6 February 2024 (UTC)Reply[reply]

Votes and discussion[edit]

  •  Support as proposer. D. Benjamin Miller (talk) 02:56, 6 February 2024 (UTC)Reply[reply]
  •  Support These templates are unnecessary cruft. Nosferattus (talk) 03:12, 6 February 2024 (UTC)Reply[reply]
  • Strong  Oppose. How strong? If we drop {{Nazi symbol}} and do not provide some equivalent, I will resign as an admin and possibly reduce my other involvements in Commons. - Jmabel ! talk 03:41, 6 February 2024 (UTC)Reply[reply]
    1. Why?
    2. I don't see what equivalent could exist which is not simply a renamed version of the same thing. D. Benjamin Miller (talk) 04:11, 6 February 2024 (UTC)Reply[reply]
      1) I don't think I owe anyone an explanation, given that this was taken straight to a vote with no prior discussion stage and that (below) you've shown that anything anyone says here in opposition simply becomes another place for you to challenge them.
      2) Precisely. If there is no equivalent of this, that will be my course of action. - Jmabel ! talk 19:23, 6 February 2024 (UTC)Reply[reply]
  •  Oppose This feels like a solution in search of a problem. "Cruft" can describe a lot of things that get put on file pages. Do people really need to see banners that an image was selected as an FP? Quality Image? Media of the day? Do they need to know an image was acquired by Commons due to a partnership between an external repository and a Wikimedia chapter? Do they need to know a picture depicts a UNESCO World Heritage Site? Until someone can come up with a convincing argument for why these specific templates are disruptive or harmful to the project, I don't see any reason to get rid of them. The Squirrel Conspiracy (talk) 04:15, 6 February 2024 (UTC)Reply[reply]
    The proposal is not saying that they should be deleted due to being cruft. (Another person said that, yes.) There is no issue with the number of templates, and the reasoning given in the proposal would not apply to any of the other kinds of templates you mention. And if you do not believe there is any actual dispute here, see Commons:Deletion requests/Template:Zionist symbol, as well as the other section above the original proposal. D. Benjamin Miller (talk) 04:30, 6 February 2024 (UTC)Reply[reply]
  •  Oppose. Suppose you get your way and some college student in Germany illustrates a paper on WWII including a swastika downloaded from Commons, and gets thrown into jail for it because there was no warning. Are you going to defend them? Are you going to bail them out? Are you going to apologize to their parents? Multiply the likelihood of that by the number of college students in Germany on any given day. We try to protect our reusers, not hang them out to dry.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 04:18, 6 February 2024 (UTC)Reply[reply]
    This is not a scenario that actually happens. It's not illegal for a college student in Germany to use a swastika in a history report about World War II. (Can you imagine how absurd it would be to prohibit using pictures of the Nazi era in history reports about World War II?) The symbols of the Nazi party are included in images in virtually every German school textbook about World War II, just as they are included in textbooks about World War II around the entire world. They are also totally legally included in works of art, such as historical movies. See Strafgesetzbuch section 86a. D. Benjamin Miller (talk) 04:24, 6 February 2024 (UTC)Reply[reply]
    @D. Benjamin Miller: Sorry, I had not read that article.   — 🇺🇦Jeff G. please ping or talk to me🇺🇦 04:39, 6 February 2024 (UTC)Reply[reply]
    Even in Germany, where the restrictions on the use of symbols of anti-constitutional organizations (including the Nazi party, but also ISIS, the Kurdish People's Defense Units in Syria and various other groups) are fairly strict, there are exceptions for (among other things) use in an educational context, use in opposition to those groups, research, art, reporting, etc. It is hard to conceive of a scenario in which a user in Germany accidentally engages in unlawful conduct because of Commons.
    Likewise, the goal of Commons itself (to store images for educational purposes) is legal in Germany. In fact, many of the images of Nazi Germany come from the Bundesarchiv (see Category:Images from the German Federal Archive); these images are distributed by the German government for educational purposes. D. Benjamin Miller (talk) 05:13, 6 February 2024 (UTC)Reply[reply]
  • Selective  Support and  Oppose. Support deleting templates that are purely tied to geopolitics, such as {{Indian boundaries}} and {{Chinese boundaries}}. Every country always gets offended if they see any maps being used on Wikipedia with boundaries that they deem incorrect or inappropriate, but it is not the job of Wikimedia Commons to please their territorial interests. I am actually a bit "surprised" that, even though it is highly offensive here to depict China's "Nine-dash line", there is no equivalent {{Philippine boundaries}} template, but it is not the job of Wikimedia Commons to please our territorial interests either. But oppose deleting templates related to political history as well as racial/cultural politics such as those related to Nazi symbol and Falun Gong, in accordance with current arguments by The Squirrel Conspiracy and Jeff G. as of this writing. JWilz12345 (Talk|Contrib's.) 04:47, 6 February 2024 (UTC)Reply[reply]
    What is the difference between geopolitics and political history? Do you just mean templates specifically related to maps? D. Benjamin Miller (talk) 04:51, 6 February 2024 (UTC)Reply[reply]
    @D. Benjamin Miller: yes. Such templates only add needless "dirt" on the description pages of map images. JWilz12345 (Talk|Contrib's.) 04:53, 6 February 2024 (UTC)Reply[reply]
    If these templates are a solution that avoids the project being blocked in India or China, why not have them? It is a much lesser evil than a block affecting billions of users. Yann (talk) 10:17, 7 February 2024 (UTC)Reply[reply]
    @Yann Commons hosts a couple of maps that really offend the Philippine authorities because these include the Spratly Islands and other South China Sea / West Philippine Sea features in Chinese territory, such as File:China prefectural-level divisions and administrative divisions (PRoC claim).png (used in an English Wikipedia article), File:China-map ko-kore.svg, and File:North and South China Partition Map.png. Perhaps you are aware of the current tensions between Manila and Beijing that have existed since the 2010s. Still, the PH authorities haven't issued any order blocking access to either Commons or even Wikipedia because of instances of maps that show the whole South China Sea region as part of China, and I think there is little likelihood of Commons itself being blocked because of the presence of such maps. English Wikipedia would be first in line for the PH government's censorship if ever, but there is low probability as of now. I do not agree with hosting templates tackling boundary disputes (like {{Chinese boundaries}} and {{Indian boundaries}}); these should be taken down. The users are responsible for their actions (as stated in our general disclaimer page); more so, English Wikipedia editors are responsible for ensuring that their articles do not cross the line of fire of the PH/Indian/Chinese authorities with regards to insertion of such contentious maps in articles that may discuss territorial disputes. Territorial disputes are English Wikipedia's responsibility, not Wikimedia Commons'. JWilz12345 (Talk|Contrib's.) 22:56, 14 February 2024 (UTC)Reply[reply]
    Yes, I am aware of the dispute between China and the Philippines. It came up in the news in France. I won't fight for these templates, but I still feel that some information in each of these maps is needed. Only "(PRoC claim)" in the title doesn't really seem sufficient. So if it is not in the form of templates, how should it be done? Yann (talk) 23:05, 14 February 2024 (UTC)Reply[reply]
    @Yann: we do not need to provide extensive information about territorial claims, other than using the description fields of {{Information}}. It is the job of individual Wikipedias to provide such information. For other users, it is at their own risk if they offend our government by using Commons-hosted maps of China, the Chinese government by using Commons-hosted maps of the Philippines, and the like. We have COM:General disclaimer, which is repeatedly referenced here. In most cases, we only deal with copyright restrictions (FoP, license plates, stamps, toys, etc.) in safeguarding other users, but with non-copyright restrictions, these are the users' own risks. JWilz12345 (Talk|Contrib's.) 07:54, 22 February 2024 (UTC)Reply[reply]
    I'll wager the vast majority of countries do not throw people in jail for using a version of a map they don't like Trade (talk) 14:10, 22 February 2024 (UTC)Reply[reply]
  •  Oppose This is far too broad and needs exceptions. I think the assumption about what the neutral point of view means for the project is wrong. The NPOV only applies when it comes to the decision of which photo to use and how to describe and categorize content. But for meta topics we are not neutral; there we have the goal of making the project better. GPSLeo (talk) 07:12, 6 February 2024 (UTC)Reply[reply]
    That's not the position taken in the essay Commons:Disputed territories. For example:

    Categorization should either be neutral (ideally), or double. e.g. most of these files will be in the simple Category:Geography of Golan Heights (neutral), which itself is a subcategory of both Category:Geography of Israel and Category:Geography of Syria (double). This will work with all subcategories too. Don't add Category:Flora of Israel. Make a category called Category:Flora of the Golan Heights, then it can be a subcat of both Category:Flora of Israel and Category:Flora of Syria.

    I'll ask you: how could we possibly make exceptions in a way that is not based on whichever viewpoints are popular or not? For instance, should we decide whether to keep the Indian or Pakistani border depiction warnings based on which receives more support? Should we keep Template:LGBT symbol? Template:Chinese sensitive content?
    As far as I can see it, there are three paths we can go:
    • We don't allow for any of these templates — which is content-neutral.
    • We allow for all restriction templates (as we currently do). We've seen contentious back-and-forth editing where these templates are used to stigmatize content (including, in some cases, according to 魔琴, content which isn't even actually banned even under the various authoritarian regimes). Template:Zionist symbol will continue to be stuck as a "badge of shame" (as Mx. Granger described it) on various pictures with stars of David, Template:Chinese sensitive content will be stuck on images of the Dalai Lama and Tsai Ing-Wen, etc. By this standard, someone could create a template, such as "American Imperialist Symbol," and slap it on all images of an American flag, commenting on how it cannot be flown freely in Iran and North Korea. I think these labels can be inflammatory and highly undesirable — and are inherently prejudiced towards the view of the banning party over the view of the banned party.
    • We allow for some, but not all, and the determinations end up based on the popularity of the banned viewpoint. Also, political flame wars ensue over every controversial subject to determine whether or not it should be given the mark of shame. I don't think this outcome is desirable either.
    D. Benjamin Miller (talk) 07:39, 6 February 2024 (UTC)Reply[reply]
    Alternatively, here's the other issue. You mentioned earlier that you do not feel it makes sense to have restriction templates for the legal restrictions created by undemocratic regimes, but that it must be OK to have some for legal restrictions created by democratic regimes.
    (The following statements are very much not viewpoint-neutral. My personal opinions are contained below.)
    I agree with you — sort of. I think that there are things that are morally wrong — say, because they run counter to my concept of justice (which has democracy as a component). I think it is worth condemning and stigmatizing those things. Nazism is one such thing.
    But Nazism's wrongness in no way originates from the fact that its symbols are banned by the German government. It was wrong when it was first formulated, it was wrong when the Nazis were in power and it is still wrong now. When the Nazis killed my relatives and millions of others for "crimes" such as being Jewish, they did so with the authority of government.
    Government legislation is not a source of morality. Governments can do evil things. Even a bad government has real power over people, and bad governments today can and do subject people to punishment for reasons that are fundamentally unjust.
    As far as I am concerned, the worst reason to not be a Nazi is because it is punishable by law. If the only thing that keeps someone from promoting Nazism is a legal penalty, that is incredibly sad.
    To me, it feels wrong for these warning labels to be mere acknowledgments of the fact that some set of governments has condemned something. The way this is done right now is what I'd call pseudo-neutral. While I think using political restriction templates at all is inherently non-neutral (see above), accepting them indiscriminately is being neutral with respect to which state-sponsored prohibitions warrant mention. However, this means that you are opening the door to include political restriction templates based on the edicts of the most vicious and wrongheaded governments.
    The alternative you suggest — having some templates but not others — inherently involves adopting some set of political ideals. Even just deciding which states are "democratic" (and thus are worth paying attention to for the purpose of restrictions) requires this. After all, the North Korean party line says that the North Korean regime is democratic, though I certainly wouldn't concur.
    Especially given the role of these values themselves, rather than any state identified as sharing them, in determining what is right and wrong, if you're going to have any anti-Nazi (or anti-anything) template, it should be based on the fact that Nazism, etc., conflicts with these core values themselves, not the fact that there is a government out there that imposes some sort of penalty for some use. That would be the reflection of adopting, as Commons and/or Wikimedia, some number of official political positions as a community.
    The real question is what to do when you get to the more contentious templates in the group — really, you get beyond an anti-Nazi stance, every other subject probably elicits significantly less agreement. And I just don't feel it's realistic or necessarily productive for Commons/Wikimedia to adopt official community stances on political issues which don't have to do with copyright, free media, etc. The procedure for proposing and approving such motions sounds like it would be a nightmare.
    D. Benjamin Miller (talk) 11:01, 6 February 2024 (UTC)Reply[reply]
    As I already wrote: We should not be neutral when it comes to the usability of our project. And we cannot be neutral when it comes to the en:Universal Declaration of Human Rights. Therefore we should accept warning templates that are based on laws made to support these human rights. GPSLeo (talk) 20:35, 6 February 2024 (UTC)Reply[reply]
    Thanks. That is a good point and feels like a better starting point than choosing a particular cutoff from a particular press freedom ranking. I am certainly not neutral towards the values of the UDHR; I support those values. And as of 2021, the WMF has adopted support of the UDHR as a position. So that seems something which could be built on.
    The WMF also has adopted a Universal Code of Conduct in a similar vein. One point of this policy is a rule against the "use of symbols, images, categories, tags or other kinds of content that are intimidating or harmful to others outside of the context of encyclopedic, informational use. This includes imposing schemes on content intended to marginalize or ostracize."
    The presence of such symbols within the appropriate educational context is allowed (which nobody disputes). But my reading of this policy (a policy which adopts a non-neutral stance towards intimidation and hatred itself) is part of why I feel the tags are problematic.
    Putting aside for the moment the issue of whether or not we are making accurate determinations about what is or isn't a Nazi symbol (which I think is problematic in some cases), I don't think that it is really debatable whether or not Nazism is an ideology that is counter to the human rights stance of the WMF. It obviously is; the UDHR itself was formulated specifically in response to Nazism, so there can be no ambiguity about whether or not it is included within the scope.
    Allowing for restriction templates only relating to laws which target ideologies and political views which are counter to the UDHR is a more precise distinction, and I appreciate your suggesting it.
    My difficulty is that, while it is clear that Nazism is counter to the UDHR (I don't think there's any other way to interpret it, given the specific context in which it was written), a lot of these restrictions have to do with things which are claimed to be against the UDHR (but not universally accepted as such).
    For example:
    • Zionist symbol — Many people and governments have characterized Zionism as inherently racist. I don't agree with that assessment — nor do the governments of Israel (obviously), Germany and a number of other countries. But many governments do characterize it as such. From the 1970s to the 1990s, this was a position taken by a UN resolution. South Africa has brought a case against Israel accusing it of genocide in the ICJ. So there are many people who would say bans on "Zionist symbols" target an ideology counter to the UDHR.
    • Chinese bans — China claims to support and implement the UDHR. The Chinese government claims its restrictions on speech are necessary to preserve a public order that supports human rights. I and many Western governments and commentators find these claims dubious, but they do make them.
    • Russian bans — Russia has claimed that Ukraine is run by Nazis and that its war against Ukraine is motivated by a desire to de-Nazify Ukraine. Nazism is obviously the paradigmatic anti-UDHR ideology. The issue here is that the Russian claim that Ukraine's leadership are Nazis is an implausible factual allegation.
    And so on. My question is:
    • Do we want to put ourselves in the situation of having to determine by consensus which ideologies violate the UDHR and who really subscribes to such ideologies?
    • What is the level of consensus needed? Must there be virtually universal assent that the target of the legislation is anti-UDHR? Would this standard of consensus be higher than the usual standard of consensus for other questions?
    D. Benjamin Miller (talk) 22:37, 6 February 2024 (UTC)Reply[reply]
    Yes, we are the ones to decide, as this is our project. Consensus is formed as for every proposal or scope-related deletion request. GPSLeo (talk) 06:54, 7 February 2024 (UTC)Reply[reply]
    Well, I admire your optimism and I hope you're right to think that it would go smoothly if it were the rule. D. Benjamin Miller (talk) 11:19, 7 February 2024 (UTC)Reply[reply]
  •  Oppose No need to object to a neutral statement of facts that informs users about works they should be careful using.--Prosfilaes (talk) 20:45, 6 February 2024 (UTC)Reply[reply]
  • Rather  Oppose. More information is better than less or no information. Some of these templates may be too strongly worded (or unnecessarily display a strong warning), yet they offer information pertinent to some users. I would support more neutral templates (not using red warnings, etc.), but deletion isn't a solution. Yann (talk) 10:08, 7 February 2024 (UTC)Reply[reply]
  •  Oppose per Yann and Prosfilaes. --Prototyperspective (talk) 11:06, 7 February 2024 (UTC)Reply[reply]
  •  Oppose per Jeff and others. All it takes is some rogue prosecutor in a country that doesn't have free speech as a guaranteed right, and a re-user could be jailed for using one of our images. Warning them of these laws should be a thing we do. I agree with Yann that some warnings should have a more neutral tone, but generally warning of non-copyright restrictions is good. Abzeronow (talk) 17:08, 7 February 2024 (UTC)Reply[reply]
  •  Oppose, while I am not a fan of the existence of these political restrictions, I think that we have a moral duty to report to potential re-users what restrictions exist outside of copyright-related rights. We shouldn't be providing less information about the consequences of using files uploaded here, especially since some of the fines and penalties are really serious (like desecrating the name or image of a "hero of the People's Republic of China", which can land a person 3 (three) years in prison). I don't think that anyone here is actually a fan of the existence of these restrictions, but warning people of potential consequences doesn't enforce the positions of these unfree governments, it simply informs re-users that there are limitations beyond the copyright ©️ of a file. --Donald Trung 『徵國單』 (No Fake News 💬) (WikiProject Numismatics 💴) (Articles 📚) 06:33, 8 February 2024 (UTC)Reply[reply]
    My feeling, however, is that the speech restrictions of some countries (like the PRC) are really pretextual. If you have an authoritarian government, then they're going to censor you or prosecute you however they want. To @Abzeronow's point, I'm not sure that such prosecutors would really be "rogue."
    Besides this, some of the tags we've seen have been inaccurate (or misleading). For example, Tsai Ing-wen's photo was tagged as Chinese sensitive content — but she is in the Chinese news; there is certainly no ban on acknowledging that she exists. The problem would be "advocating for Taiwan separatism." Similarly, defaming (by whatever arbitrary and capricious standard might be applied) a hero of the PRC may cause jail time, but the image of such a person would not be defamatory in itself. So we could say a lot of the files aren't problematic in themselves, but the subject depicted is one which could cause problems for people (depending on the viewpoint expressed about the subject).
    Not to mention, if we really go down this road, we could end up tagging all pictures of Winnie-the-Pooh as Chinese sensitive content, or all pictures of Salman Rushdie as Iranian sensitive content. And who knows what might draw the ire of a censor tomorrow?
    As a number of other people have mentioned, the PRC blocks access to Wikimedia projects anyway. They are not the only one to have done so, and a bunch of the censorship templates we have refer specifically to the laws of these countries. If someone is accessing the site from the PRC, they know they're circumventing a block to begin with. If we are worried about inadvertent problematic use, I think we can presume anyone bypassing their country's ban on the site altogether isn't going to be making such a use accidentally. D. Benjamin Miller (talk) 17:17, 8 February 2024 (UTC)Reply[reply]
    Co-signed. Zanahary (talk) 21:15, 21 February 2024 (UTC)Reply[reply]
  •  Support prohibiting anything related to the PRC, which has both famously abysmal press freedom and blocks Wikimedia websites; also support prohibitions for Myanmar, Russia, and North Korea for similar reasons;  Oppose any broad prohibition of political warning templates. I do think some of them should be deleted as frivolous and largely unused but that doesn’t require a policy change. Dronebogus (talk) 14:03, 8 February 2024 (UTC)Reply[reply]
  •  Oppose deletion for templates of democratic countries (Like {{Communist Symbol}} for South Korean users. In South Korea, symbols related to North Korea are prohibited under the South Korean National Security Act. {{Communist Symbol}} can be used for files with symbols related to North Korea.)  Support deletion for templates of autocratic countries (Like {{South Korean Symbol}} for North Korean users. Wikimedia Commons cannot be accessed in North Korea.) --Ox1997cow (talk) 15:29, 8 February 2024 (UTC)Reply[reply]
  •  Oppose for the way-too-broad proposal, but also  Weak support for the general idea: this is a sensitive issue. In moderation, "political restriction templates" have their use, which should not be prohibited, as such a prohibition is itself a restriction of the freedom on Commons. Communazi symbols are frowned upon in most parts of the world, and Commons should be a platform for education, not propaganda. For that reason, files that outright show fascist or authoritarian propaganda (especially without educative texts to explain the display) should get a disclaimer to show that Commons does not share the authoritarian views promoted in the picture itself. The Chinese-borders template is another example: any map showing any part of the SCS but not the 9Ds is basically illegal in China, but their authoritarian stance should receive blowback here on Commons: Nine-dashed maps are authoritarian propaganda, and planting a template is therefore deserved. Most maps on Commons don't have these 9Ds, anyway. On the other hand, we should not obediently place a "political restriction template" on all other maps, warning our PRChinese users that China considers these maps illegal. Naturally, the previous commenters here have already taken action and DR-nominated templates discussed here, without waiting for consensus on the debate. Now: If a political warning template would have to be plastered onto hundreds of thousands of files (if fully executed), then something might be wrong with the definition of the template; especially if there is nothing offensive to be seen. On the other hand, if a political warning template (like the Nazi-symbol disclaimer) gets plastered onto hundreds of thousands of files with no offending symbols (the text deals with inheritance law issues), then something might be wrong with the application of the template.
    tl;dr: "Political restriction templates" should be used with common sense, and we do indeed need project-wide agreements on how to use them. --Enyavar (talk) 17:16, 9 February 2024 (UTC)Reply[reply]
  •  Support.
  1. most of these nonsense templates only started appearing in recent years.
  2. the internet is not a kindergarten. users are expected to assume their own risks and not be baby-fed such warnings/reminders about whatever restrictions there may be in any country.
  3. quoting Professor James Duane https://www.youtube.com/watch?v=d-7o9xYp7eE&t=310 , there are 10000 different ways you can get convicted by US federal law. there is just an infinite number of crimes in the penal codes of the 200+ jurisdictions in the world which can relate to certain files hosted on commons. as shown in the list, some countries alone need multiple templates because they outlaw porn/maps/blasphemy/lèse-majesté... where does this end?--RZuo (talk) 14:05, 13 February 2024 (UTC)Reply[reply]
  • And also, the Nazi symbol is not a reason to create nonsense templates. It will draw our attention to nonsensical issues and will bring users from all over the world to Commons for political issues, which is not within the scope of Commons.-- Geagea (talk) 19:46, 17 February 2024 (UTC)Reply[reply]
 Support. The disclaimer covers what needs to be covered. These templates create needless edit-wars that really add nothing to the commons (how valuable is the Zionist symbol template that it's worth fighting with disruptive editors to have it taken off of pictures of menorahs and cookies?). They also put commons users in the position of interpreting various international laws, many of which have never been transparently enforced. This is also a needless slope to roll down; lots of territories legally suppress imagery and text. These suppressions are often vague and thinly-explained, and don't lend themselves well to creating a template that says "The law here says x". Zanahary (talk) 21:14, 21 February 2024 (UTC)Reply[reply]
 Comment Most of these seem ridiculous and offensive. Putting an LGBT warning on every file with a rainbow in it is patently ridiculous. The Nazi warning seems fine to me, but the rest feel like someone saw a slippery slope and grabbed their toboggan. Bawolff (talk) 07:14, 22 February 2024 (UTC)Reply[reply]
I agree on the LGBT matter, but at the same time... while many are saying here that we don't need even warnings over cultural sensitivity, others are proposing to delete hundreds of images over cultural sensitivity. And it seems to me that a warning tag is a good compromise between deletion and nothing. - Jmabel ! talk 22:44, 22 February 2024 (UTC)Reply[reply]
@Jmabel: the case you mentioned seems to relate to non-copyright restrictions (cultural rules from the museum), and not necessarily cultural sensitivity. JWilz12345 (Talk|Contrib's.) 23:12, 22 February 2024 (UTC)Reply[reply]
The museum itself had licensed the photos on their own account, using one of the usual irrevocable CC licenses. They have now decided for reasons of cultural sensitivity that they wish to suppress the images. The license itself was clearly valid in copyright terms. Yes, the basis on which they want it deleted is a non-copyright restriction; similarly, none of the warnings discussed here related to copyright. - Jmabel ! talk 23:27, 22 February 2024 (UTC)Reply[reply]

Revert policy change for "Overwriting existing files"[edit]


Promoting steward elections[edit]

The Steward elections are on and there were comments about a low voter turnout. I suggest making them a bit more visible on Commons, similar to the sysop elections, which are quite discreet in my opinion. If not similar to the sysop elections, then a banner which is made visible between 5 and 10 times per cycle (a cycle is a day, I was told) is also an option. The current announcement of the Steward elections on Commons disappears really fast in my opinion, and I originally became aware of them only because I had the user page of a Steward watchlisted. Krd (a former Steward) suggested that I post this here. Paradise Chronicle (talk) 10:22, 10 February 2024 (UTC)Reply[reply]

I don't really know what happened, but now I can see the announcements on, I believe, all pages, or at least most of the time when I open a page. So to me the issue is solved. Paradise Chronicle (talk) 09:49, 19 February 2024 (UTC)Reply[reply]

Chinese and Japanese characters as disambiguation?[edit]

for category titles, sometimes there are chinese or japanese names that are written in many different ways but have the same pronunciation. some pronunciations are shared by so many people that it's possible to end up with multiple people with the same occupation (such that "cat:john doe (writer)" is not enough to distinguish them).

here's what i'm pondering. can we use these names' forms in hanzi or kanji as the qualifier in parentheses? very often they are different. it also helps users navigating these categories because they can immediately identify the persons with the actual forms of the names in the native languages.

this idea obviously only applies to logograms, among which only chinese and japanese are popularly used.

examples: Category:Lu Xun (Tang dynasty) can become Category:Lu Xun (陸勳), and Category:Lu Xun (Wu) can become Category:Lu Xun (陸遜). RZuo (talk) 14:28, 13 February 2024 (UTC)Reply[reply]

I would guess that a far larger number of our users can understand "Tang dynasty" than "陸勳". Do you have reason to think otherwise? - Jmabel ! talk 20:21, 13 February 2024 (UTC)Reply[reply]
  1. as I said, "some pronunciations are shared by so many people that it's possible to end up with multiple people with the same occupation (such that "cat:john doe (writer)" is not enough to distinguish them)." by using all these indirect prompts, it's hard even for me to immediately connect the category title to the person. "Lu Xun (Wu)" is a pretty well-known figure, but at a first glance at this title I can't make out what "Wu" means, which can refer to a dozen different states in history or a dozen different places, historical or present.

    again, these are just examples. there are also people of the same era that have the same occupations.

  2. most people who have to deal with these categories can read c/j chars.
  3. allowing the use of c/j chars is not the same as requiring only c/j chars to be used as disambiguation. it only gives one more obvious and convenient option of words to use for disambiguation, when strictly following latin-only titles creates unnecessary confusion.
RZuo (talk) 20:45, 13 February 2024 (UTC)Reply[reply]