
Talk:Edit check

From mediawiki.org

So glad to see!


@ESanders (WMF), @PPelberg (WMF), I'm very glad to see that this project exists! If implemented well, I think it will have a strong positive impact on the new editor experience. Please let me (and others!) know when you're looking for community feedback, and I'll be happy to share thoughts on specific ideas/prototypes! Cheers, {{u|Sdkb}}talk 18:29, 24 January 2023 (UTC)

hi @Sdkb! I'm glad that you're glad to see the Editing Team prioritizing work on this ^ _ ^

Please let me (and others!) know when you're looking for community feedback, and I'll be happy to share thoughts on specific ideas/prototypes!
Thank you for offering as much; we're going to need it!

In fact, since we're here, I wonder: can you think of specific Senior Contributors or groups of Senior Contributors that you think we ought to prioritize reaching out to first? I've added a bit more context below about the project and what we're currently looking to learn at this stage.

All of the above aside, thank you for dropping by! You doing so was the reminder I needed to prioritize posting a status update about this work which I'll do on the project page before this week is over.

Seeking Input from Seniors
Okay, this is the "context" I referred to above...
In the coming weeks, I expect us to be ready to share an initial idea for the user experience of the first "check" we're prioritizing work on: a check that will present people who are attempting to add new content without a corresponding reference with a call to action to do just that. This design work is happening in phab:T325711.

While we'll be keen to hear what Senior Contributors think of the user experience we're proposing to present to newcomers and Junior Contributors, we're thinking some of the most valuable things we have to learn from y'all in this moment are things like:
  1. "In what – if any – ways do Senior Contributors think the proposed mobile UX flow could be adjusted/augmented to increase the ease and efficacy with which they can review edits people will be making with Edit Check?"
  2. "Which parts of the feedback system Edit Check is proposing would Senior Contributors value being able to configure on a per-project basis?" [i][ii]
---
i. This "configuration" bit is particularly top-of-mind for the team as we're seeing Edit Check as having two distinct, and complementary, dimensions: 1) the user experience newcomers and Junior Contributors will see and 2) the configuration tools/options Senior Contributors will have access to that will enable them to customize this user experience so that it guides people to take actions that align with project policies and conventions.
ii. For more details about how we're thinking about consulting with experienced volunteers, please see phab:T327330. PPelberg (WMF) (talk) 22:33, 24 January 2023 (UTC)
...thank you for dropping by! You doing so was the reminder I needed to prioritize posting a status update about this work which I'll do on the project page before this week is over.
Done: Edit Check Update: 27 January 2023 PPelberg (WMF) (talk) 19:32, 27 January 2023 (UTC)
@PPelberg (WMF), thanks for sharing all that! I reviewed the project page, phabricator tickets, and Figma mockups for desktop and mobile. I responded on Phabricator to some of your prompts there.
I think you're on the right track with who to reach out to — w:WT:Teahouse and w:WT:Growth Team features are good options, and I'd add somewhere more general like w:WP:VPI to bring in some folks who might have good advice but who we might not have thought to ask.
Regarding your questions:
  1. If I'm understanding this right, you're looking to make sure that the mobile implementation of Edit Check doesn't make mobile editing annoying for experienced editors. Well, when I need to make an edit on mobile, my workflow (and that of most other experienced editors I know) is to just switch to desktop mode, since the mobile mode is so deeply lacking in functionality that it's preferable to suffer through the tiny fonts/buttons of desktop on mobile. I know Clovermoss has been interested in mobile editing, so they might be able to offer more.
  2. There are lots of things we might want to configure. We care more about referencing for certain types of articles, such as those on living people and those on medical topics, so we'd likely want a more aggressive implementation for those than others. Of note for you all, there's currently a tag for new users adding unreferenced material to a BLP. Noting this comment you made on Phabricator — Iterate on the "Citoid" edit card to include some kind of guidance that prompts people to reflect on whether the source they're considering adding is likely to be one other volunteers deem acceptable — we'll absolutely want control over the source list there, so that we can modify it as RSP changes, and ideally we'll want to be able to provide context/specific conditions for (sometimes-)unreliable sources, as it's far from just a binary reliable/unreliable switch. And we'll want to be able to tweak all the guidance text that shows up at the different steps.
    Going more broadly, I think the true wiki way would be to allow us to come up with custom filters (just as we currently do abuse tags) and provide custom feedback. This would probably mean leaving out some of the more advanced functionality, but I think the community would be able to come up with many good uses that'd be more effective than our current approach of expanding guidance whenever we notice a persistent error (leading to instruction creep).
One other thought (for now): Many editors who are contributing uncited material are providing original research, so we'll want to identify and provide guidance to them. The Upload Wizard comes to mind for its approach to a similar problem. When you're contributing a local file, it asks you whether it's a free work, in a fair use category, or doesn't fit either of the categories above. If you choose the last option, it comes back with "Please don't upload it as it's very likely a copyright violation." For here, the question we want to make folks adding information answer is "How do you know this?" If it's from a particular source, we of course want the source. But if not, we want to give editors a way to express that by saying "it's something I know from my personal experience". And we want to then explain to them why that's original research and why they need to either find a source for it, or if it's private information that doesn't have a public source, why they shouldn't contribute it.
Cheers, {{u|Sdkb}}talk 06:47, 28 January 2023 (UTC)
First off, thank you for investing the time and energy into thinking about these prompts and making this thinking easy for us to understand and engage with, @Sdkb!
I'm going to respond to the specific points you raised in discrete comments so that we can more easily explore the distinct points you are raising here.
Before that, thank you for being patient with me.
Note: I've also seen the comments you posted on T327330 (thank you!); I'll respond there directly. PPelberg (WMF) (talk) 00:57, 11 February 2023 (UTC)
I think you're on the right track with who to reach out to — w:WT:Teahouse and w:WT:Growth Team features are good options, and I'd add somewhere more general like w:WP:VPI to bring in some folks who might have good advice but who we might not have thought to ask.
Excellent. You affirming the instincts we had leads me to think something like, "Okay, we're on the right track." PPelberg (WMF) (talk) 00:58, 11 February 2023 (UTC)
If I'm understanding this right, you're looking to make sure that the mobile implementation of Edit Check doesn't make mobile editing annoying for experienced editors.
While we're open to hearing input on the user experience, at this stage, we're primarily seeking to learn things like the following from experienced volunteers:
  1. How can you envision this general approach going wrong? What fundamental assumptions/constraints might this project be at risk of "running up against"?
  2. What logic/heuristic do experienced volunteers think ought to cause the reference check to get triggered? Thank you for responding to this in T327330#8566397; I'm going to follow up with you there.
  3. What kinds of editors do experienced volunteers think the reference check should be enabled for? E.g. All logged out editors? All logged out editors *and* editors who have made fewer than (insert number here) cumulative edits? Etc.
  4. How might experienced volunteers value being able to evaluate/audit how Edit Check is performing and the impact it is having so that they can make improvements to it?
  5. What other kinds of checks can experienced volunteers imagine being useful? I see you shared a collection of wonderful ideas in T327330#8581224 (thank you!). Similar to "2.", I'm going to respond in Phabricator.
Note: we do NOT anticipate that the in-editor Edit Check user experience will disrupt experienced editors, because we:
  • Assume experienced editors are unlikely to use the visual editor on mobile to edit. See data
  • Assume experienced editors are likely to intuitively add references when they are adding new content to the wikis and thus, be unlikely to trigger the reference check
  • Think we might start by only enabling Edit Check for people who are logged out and have made fewer than, let's say, 500 edits or something like that. Exact number TBD. Although, writing this out led me to realize we need a task to hold ourselves accountable for deciding this: T329340.
PPelberg (WMF) (talk) 01:01, 11 February 2023 (UTC)
Responding to the numbered questions:
  1. I think there is some risk for false positives. E.g. if the tool tells people to add citations for the plot section of a film, that'll violate w:MOS:PLOTCITE. There is also some risk for false negatives. This could happen if newcomers become used to the tool, and then start expecting it to let them know whenever they need to add a reference. Then, when the tool misses an instance, they might falsely think that they don't need to add anything because edit check didn't flag it.
    More generally, I think the difficulty of this project is that it's trying to communicate Wikipedia's policies/guidelines succinctly at points when newcomers need them, but those guidelines are both inherently complex and not designed to be machine-readable. Because of that, identifying when they come into play will be a challenge.
  2. (See Phabricator discussion)
  3. We always want even the newest of newcomers to be incentivized to create an account, so I would not want to see it enabled only for logged-out editors. Once the tool is mature, I could see it being enabled for everyone if it's good enough about not having false positives — but of course there would be an option in the settings for anyone to turn it off. For the beta stage, I think something like fewer than 15 days or 100 edits might capture most of the target user base. I think you should also make it an opt-in beta feature, though, since some experienced editors like myself are going to want to try it out to provide feedback.
  4. This could be tricky, as our normal way of checking things is to look at edits, but this tool of course intervenes before an edit is made. I think trying it out for ourselves will be very helpful, as we'll be able to see how accurate its suggestions are.
  5. (See Phabricator discussion)
Cheers, {{u|Sdkb}}talk 18:42, 11 February 2023 (UTC)
Well, when I need to make an edit on mobile, my workflow (and that of most other experienced editors I know) is to just switch to desktop mode, since the mobile mode is so deeply lacking in functionality that it's preferable to suffer through the tiny fonts/buttons of desktop on mobile.
Understood. I appreciate you naming the differences in what functionality is and is not available within the desktop and mobile editing experiences.
While we are aware of this, the Editing Team is not currently prioritizing work on feature parity.
I recognize that me saying the above offers no relief. Still, it is important to me that, at a minimum, you know what the Editing Team is and is not focusing on in the near-term. PPelberg (WMF) (talk) 01:03, 11 February 2023 (UTC)
Yeah, certainly. I definitely didn't mean it as an accusation that you ought to focus more on mobile editing. (And indeed, I'm not sure how fruitful that might be — there are some tasks just inherently not well-suited to mobile.) {{u|Sdkb}}talk 18:46, 11 February 2023 (UTC)
There are lots of things we might want to configure. We care more about referencing for certain types of articles, such as those on living people and those on medical topics, so we'd likely want a more aggressive implementation for those than others.
Understood, and I feel energized hearing that you anticipate people wanting to be able to configure the checks themselves!
But back to what you shared, two follow-up questions for you:
  1. Would it be accurate for me to think that you'd expect to be able to configure the check based on the categories the article someone is editing belongs to?
  2. Can you say a bit more about what you mean by "aggressive implementation" in this context? Asked another way: can you share the kinds of 'rules' you'd imagine wanting the reference check to operate on and that you think volunteers would value being able to 'tune' based on the category an article belongs to?
PPelberg (WMF) (talk) 01:03, 11 February 2023 (UTC)
On (1), yes. We might also want to configure the check based on project tags (which add categories to the talk pages) or even Wikidata information.
Another way we might want to configure is based on article quality, which would also be reflected in talk page categories. For instance, we might decide to tone down the suggestions on a featured article, which has already received extensive human review.
On (2), I'm thinking of it in terms of the confidence level of the tool. So, for example, if the tool thinks there's an 80% chance that adding a reference is needed, we might decide that that's high enough that we want to display the check at a BLP page, but not if it's a normal page. {{u|Sdkb}}talk 18:51, 11 February 2023 (UTC)
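(An aside, for illustration only: the per-category threshold idea described above could be sketched roughly as follows. The category names, threshold values, and function are invented for this example and are not part of any actual Edit Check configuration.)

```python
# Hypothetical sketch of per-category confidence thresholds for the
# reference check. Lower threshold = more aggressive: the check fires
# at lower model confidence. All names and numbers here are invented.
THRESHOLDS = {
    "Living people": 0.5,       # BLPs: show the check even at moderate confidence
    "Medicine": 0.5,            # medical topics: likewise
    "Featured articles": 0.95,  # heavily reviewed pages: mostly stay quiet
}
DEFAULT_THRESHOLD = 0.85        # fallback for articles in no special category

def should_show_check(article_categories, model_confidence):
    """Return True if the reference check should be displayed.

    The strictest (lowest) threshold among the article's categories
    wins, falling back to the default when no special category applies.
    """
    applicable = [THRESHOLDS[c] for c in article_categories if c in THRESHOLDS]
    threshold = min(applicable) if applicable else DEFAULT_THRESHOLD
    return model_confidence >= threshold
```

Under this sketch, an 80%-confidence prediction would trigger the check on a BLP (threshold 0.5) but not on an ordinary article at the 0.85 default, matching the example in the comment above.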
Of note for you all, there's currently a tag for new users adding unreferenced material to a BLP.
Oh, great spot. I've added this tag/filter (Special:AbuseFilter/history/686) to the Edit Check project page. Although, please boldly edit that page if I've linked to a tag/filter that differs from the one you had in mind. PPelberg (WMF) (talk) 01:05, 11 February 2023 (UTC)
Noting this comment you made on Phabricator — Iterate on the "Citoid" edit card to include some kind of guidance that prompts people to reflect on whether the source they're considering adding is likely to be one other volunteers deem acceptable — we'll absolutely want control over the source list there, so that we can modify it as RSP changes, and ideally we'll want to be able to provide context/specific conditions for (sometimes-)unreliable sources, as it's far from just a binary reliable/unreliable switch.
Great call; I'm glad you named the nuance and configurability such feedback would require.
I've added a note to T276857 (the ticket where we'll explore adding feedback of this sort) so that I can hold us accountable to revisiting this conversation when we prioritize work on it. PPelberg (WMF) (talk) 01:05, 11 February 2023 (UTC)
And we'll want to be able to tweak all the guidance text that shows up at the different steps.
Understood. PPelberg (WMF) (talk) 01:06, 11 February 2023 (UTC)
Going more broadly, I think the true wiki way would be to allow us to come up with custom filters (just as we currently do abuse tags) and provide custom feedback. This would probably mean leaving out some of the more advanced functionality, but I think the community would be able to come up with many good uses that'd be more effective than our current approach of expanding guidance whenever we notice a persistent error (leading to instruction creep).
I feel inspired reading the above! Primarily because it sounds like you and us (the Editing Team) are seeing similar potential in this work…
Where – to double confirm – "potential" in this context refers to the ability for experienced volunteers to write custom filters/checks (e.g. similar to those you've listed in T327330#8581224) in ways that:
  • New and inexperienced volunteers will intuitively understand and be equipped with the tools/workflows they need to apply them
  • Experienced volunteers can audit and iterate upon

I'd value knowing if you sense any gaps between the potential you articulated above and what I responded with.

Note: I still think there is a lot for us to collectively execute and validate before I'd feel comfortable "promising" the potential we seem to both see, but yes, the Editing Team is definitely thinking about this first check as a proof of concept for an open-ended system that we can collaboratively shape and expand. PPelberg (WMF) (talk) 01:08, 11 February 2023 (UTC)
Yep, that's correct! {{u|Sdkb}}talk 18:52, 11 February 2023 (UTC)
One other thought (for now): Many editors who are contributing uncited material are providing original research, so we'll want to identify and provide guidance to them.
Before inquiring more deeply about the Upload Wizard, would it be accurate for me to understand what you're describing above as the following?
In the event that someone declines to add a source, the edit check user experience ought to present them with next steps (that experienced volunteers at individual projects ought to be able to configure) that align with the best practices outlined in WP:No original research.
In the meantime, I've filed T329406. PPelberg (WMF) (talk) 01:09, 11 February 2023 (UTC)
Yep, that's correct! {{u|Sdkb}}talk 18:53, 11 February 2023 (UTC)
@Sdkb, are you saying that if someone adds material without adding a source, then it's likely impossible for anyone to find a reliable source that could support the new material? That hasn't been my experience, but perhaps it varies by subject area. For example, consider this edit that turned up in my (volunteer) watchlist the other day. Would you call that "original research"? Whatamidoing (WMF) (talk) 21:11, 14 February 2023 (UTC)
@Whatamidoing (WMF), I'm not saying that. It's certainly always easiest for the person adding info to provide the source when they add it, but others can always do so later. For the category of editors who don't add a source, there are two subcategories: the ones where there is a public source but it's just not provided, and the ones where there is no public source (e.g. "I know Jane Smith lives in Glendale because she's my neighbor"). It's the latter subgroup that is likely original research. {{u|Sdkb}}talk 21:56, 14 February 2023 (UTC)
@Sdkb, I am thinking about this: "Many editors who are contributing uncited material are providing original research".
Original research is "material—such as facts, allegations, and ideas—for which no reliable, published sources exist". "Exist" means they're available to be found in the real world. So if you write "Jane Smith lives in Glendale" because she's your neighbor, but it happens that Jane Smith has posted on her Facebook account that she lives in Glendale, then that's not original research. Jane's Facebook post is a reliable, published source for her residence, and therefore a reliable, published source exists for this material, even though it's not cited. (If, on the other hand, Jane is like me in shunning all anti-social media, then it might or might not be OR.)
Looking at it from the POV of the software, I think that OR is not a useful lens for evaluating content. The software can identify whether material is cited, but it won't be able to identify whether uncited material either should be (WP:V) or could be (WP:OR) cited. Consequently, I'm thinking that it's simpler to focus on the concept of cited/uncited instead of "policy violations". Whatamidoing (WMF) (talk) 22:28, 16 February 2023 (UTC)
If it helps to have a real-world example, this is from earlier today.
If I understand correctly, you're saying that it's very hard for the software to determine whether or not a reliable source exists for content contributed without a source, so it's therefore hard to say whether or not it's original research. That makes sense.
Basically, I imagine the flow for editors who decline to provide a source to be something like this. First, we nag them and say, "you should really provide a source, it'll make you less likely to be reverted". For the portion that still decline, we want to know why. For some, they might be unwilling to be bothered. For some, there might be a source but the novice editor can't manage to find it (we want to divert them to somewhere where experienced editors can help with that). And for some, there might be no source available because they're contributing original research. I think you make a good point that it can be hard for a novice editor to know whether they're in the second-to-last group or the last group. But we want to find ways to provide appropriate guidance to everyone. {{u|Sdkb}}talk 22:45, 16 February 2023 (UTC)
I don't think it would be possible for the software to differentiate that kind of edit from someone doing a basic copyedit. It's pretty easy to identify "added a whole paragraph". It's kind of hard to identify "added a new sentence" (How many sentences is "In 1939 Mr. Smith went to Washington, D.C."?). It's really hard to differentiate "added new information" from "fixed grammar". Whatamidoing (WMF) (talk) 00:02, 18 February 2023 (UTC)

Commenting since I was pinged (thanks Sdkb!). PPelberg (WMF), if you're interested in my thoughts about mobile editing, I wrote a piece for the Signpost last month. [1] There are some interesting comments on the related talk page from other Wikipedians. If there's anything you want to ask me, feel free. Clovermoss (talk) 15:20, 28 January 2023 (UTC)

hi @Clovermoss – it's great to arrive in a conversation with you after appreciating some of the work you've done from a distance. Now, as someone who is experienced with and has developed nuanced thoughts around Wikipedia's mobile editing experience, I'd value hearing what you think of the mobile experience we're envisioning for Edit check.
I expect us to have some mockups to share in the next week or so (maybe sooner)...I'm thinking I'll ping you here once we've published them along with some prompts/questions to serve as feedback guides. How does that sound to you? PPelberg (WMF) (talk) 00:23, 15 February 2023 (UTC)
@PPelberg (WMF): Feel free to reach out whenever you have something you'd like to share. I still have access to my old phone so I can actually observe what it looks like from two devices if you think that'd be useful. Clovermoss (talk) 04:33, 15 February 2023 (UTC)

I responded at the ticket and have found this discussion, which is perhaps a better place for it. Mathglot (talk) 00:13, 12 February 2023 (UTC)

hi @Mathglot! Based on what you've written on Phabricator, I have the sense that we (you, @Sdkb, and the broader Editing Team) are thinking about Edit check in similar ways, which I feel quite energized about!
You can expect direct responses from me to the feedback you shared in T327330#8607428 before this week is over. In the meantime, I wanted you to know that the feedback you shared is by no means too late. In fact, it's arriving at just the right time ^ _ ^ PPelberg (WMF) (talk) 00:18, 15 February 2023 (UTC)
Thanks for the update. Just wanted to confirm that I am interested, and to let you know that I'm less experienced here at mw:, so if you ping here and don't hear from me within a day or two, definitely try an en-wiki ping; I won't knowingly ignore responses from you (or anyone) here. Mathglot (talk) 23:06, 16 February 2023 (UTC)
Understood! And just so you know, the response I committed to sharing before today was over will need to come next week instead...today got away from me ^ _ ^ PPelberg (WMF) (talk) 01:52, 18 February 2023 (UTC)

Above somewhere you asked about who might be contacted, and I know that User:Cullen328 has a page on smartphone editing and may be interested in this topic. (It's too far up to try to figure out the right indent/spot to add it, so just tacking it on here.) Mathglot (talk) 23:37, 16 February 2023 (UTC)

Thanks for the ping, Mathglot. I have been an active editor on English Wikipedia for almost 14 years and 98% of my contributions for the past 12 years have been made using smartphones. I have been an administrator for 5-1/2 years and have carried out many thousands of administrative actions on my phone. My strongly held perspective is that all sites and apps offered by the WMF for the purpose of editing should be fully functional. Wikipedia is a collaborative project and new editors become productive and committed to the projects when they are fully able to collaborate and interact with their colleagues. In my view, all of the software offered to mobile editors is inadequate in a variety of well-documented ways. That is why I use the so-called "desktop" site on my phone, because it is fully functional and enhances collaboration. "Desktop" in this context, in my view, is an inaccurate moniker that discourages mobile editors from using readily available fully functional collaborative software on their phones. I am aware that my view of this matter is often castigated and ridiculed, but I believe that my own long record of contributing high quality, well-referenced content, and fully participating in many "behind the scenes" areas of the encyclopedia serves as a refutation of those criticisms. To be frank, I am not much interested in placing band-aids on software that I believe impedes collaboration. If these fixes help a little bit, that is fine, I suppose. I take a much broader view. Cullen328 (talk) 00:37, 17 February 2023 (UTC)
@Mathglot, @Cullen328, and anyone else: You should see a "Subscribe" button at the top of this thread. If you click it, you'll get Echo/Notifications for every new (signed) comment added. Whatamidoing (WMF) (talk) 00:04, 18 February 2023 (UTC)

Dark Continent


The project has a focus on editors from Sub-Saharan Africa. But there seem to be several issues including:

  1. As noted, "people from Sub-Saharan Africa represent only 1% of active unique editors".
  2. Sub-Saharan Africa is different in nature to the West, being lower in resources, development and infrastructure.

The proposition is to add a feature to the Visual Editor to nag editors about adding references. But how does this address the geographical issue? Sub-Saharan Africa is not richly endowed with libraries, museums, quality press and other GLAM institutions which will help with formal citations, is it? Instead, it has different traditions such as oral history. So, nagging Sub-Saharan Africa about expectations based upon the experience of the more privileged 99% doesn't seem appropriate.

For example, consider a topic in the English Wikipedia: Furra -- a legendary queen. This was started by a BBC journalist with Ethiopian heritage. I assisted her in getting this started at a BBC 100 event and then did some follow-up work, taking it through the DYK process. I was able to find some sources to cite in our customary way but it wasn't easy and I suppose that the work might seem quite difficult or strange to people in that region. There might be some good sources in the Sidama language and oral tradition but they will not be easy to access here.

So, the proposal doesn't seem a good fit with the objective. Perhaps it should be turned around so that the 99% of our editors in the developed world, like me, get the nagging, while those in Sub-Saharan Africa are exempted from this.

Andrew Davidson (talk) 13:12, 23 February 2023 (UTC)

You raise quite an interesting point about how people will reach out to libraries or reliable sources: on par with Sub-Saharan editors, we in ja, or in the global North but in the East, also struggle to find good sources. We are gifted with decent libraries in each prefecture, dotted across Japan in 47+ cities. But until you learn how to hunt for books in libraries, numbers are just numbers, not treasure troves.
I am green with envy that Sub-Saharan peers are given the chance to explore this five-pillar-supported effort and try exploring reliable sources with peers. And I am eager to see how things turn out, as the Sub-Saharan experience would be a great forerunner for us on jawp.
One thing aside from how many libraries you have near you: isn't this an opportunity to demonstrate that you can't write unsourced material on Wikipedia? Our weak point on jawp is that newer editors need to learn that, and the Five Pillars, and realize we are not on a chat board or social media. Of course, some topics such as anime, voice actors, and pop singers attract newer editors; they come jumping in to write or change whatever they have heard or seen, and such pages naturally end up in editing wars. We senior editors fail to show them how a digital encyclopedia should be handled. Or we old-school editors don't understand the best route to find backing information for those very new and developing artists/genres of entertainment.
One question.
Seriously, should such reliable sources be physically located on the continent? Or documented in the local language only? Or written in the local language but, by some chance, stored across the ocean? My search for good ukiyo-e woodblock prints of 18th-19th century Japan often leads me to museums/collections in the States or the UK. Then I need to correct captions on Commons, as references on each artist are a bit richer in ja, and misunderstandings can be resolved with reliable sources.
Well, replace the prints with sculpture or costume held abroad, and the references with local ones, and that formula might apply to any part of the world. Human history had many explorers and scholars coming to the edge of the world (on the map projection they used), picking up whatever they found curious, and bringing those evidences/collections home to the global North, to the West. That is not all bad: in Tokyo alone, we too had collected anthropological items from around Japan, then lost many of them to fires and earthquakes in the 20th century. Or hymns of the 16th-century Jesuits were orally handed down until the 1860s among the hidden Christians in southern Japan ("orrassio" in the local term), in a 16th-century Catalan dialect.
Let's hope we have backed up important human knowledge in material form at distant places, and that local wisdom or non-material knowledge is still handed down among us somewhere on this planet, including among Sub-Saharan peers. Or let's invite unwritten wisdom to be put into text. Are there any more dreamers like me? Omotecho (talk) 13:12, 24 February 2023 (UTC)
[Image caption: "Evening Snow on the Heater" from New York's Metropolitan Museum of Art. What is happening here -- can you guess?]
Good points. Me, I live in London, which has many GLAM institutions and so I recently started an article after seeing an exhibit at our Science Museum. But multi-cultural topics can be challenging. For another example, I started an article to explain a ukiyo-e print when it was featured as a baffling Japanese picture on the English Wikipedia (right) without any narrative to explain the subject. That has now been expanded into a detailed explanation of the series -- an elaborate parody or pastiche of Chinese art. People in one culture may find it difficult to understand such intricate conventions, culture and history of another... Andrew Davidson (talk) 14:05, 24 February 2023 (UTC)
Wonderful to encounter somebody who cares about ukiyo-e, and I appreciate the focus on Harunobu, a kind of mysterious artist to me.
I imagine it takes a very deep understanding that many art forms in Japan entertained a tongue-in-cheek kind of resistance against the Bakuhan system. Or putting brocade on the back of a cotton kimono, to cheat the luxury bans imposed by the government/feudal lords stockpiling for wars that would never break out.

Then, that kind of everyday people's spirit against the rulers would be a global theme, wouldn't it? How about in the Sub-Saharan area?
I am thrilled to imagine that had happened somewhere...
FYI, regarding the image of the cotton on those lacquerwares, I will scan the NDL biblio database. Kindly, Omotecho (talk) 13:43, 25 February 2023 (UTC)Reply
Some of the major interest areas in Africa seem to be pop culture (like everywhere else), current politics (like everywhere else) and businesses information. As a result of the interest areas, the sources tend to be available online. Whatamidoing (WMF) (talk) 22:06, 28 February 2023 (UTC)Reply
But not always. For example, consider the town of Azia which is in Nigeria. I picked this up on prod patrol and then had some difficulty defending it. The best source is a book. To access this myself, I had to visit the British Library. That's one of the best reference libraries in the world but it's in London and so not readily available to most editors. The original editor used the book but then got burnt by using it too faithfully. This is a common issue -- damned if you do and damned if you don't.
Note, by the way, that one of the editors involved in this incident -- SpinningSpark -- has recently died. My impression is that the original corps of veteran editors who established much of the current thinking about such matters is now dying off or being driven off. There is much that can and has been said about this. I don't want to digress but it may be relevant as background.
Andrew Davidson (talk) 23:24, 28 February 2023 (UTC)Reply
Picked up on the term British Library: @Whatamidoing (WMF), hi, do you think we could expand along the following lines, and I wonder which team under the WMF umbrella would do the best job? In my case, I'd depend much on the Wikipedia Library:
  • could we ask the WL sponsor establishments to set up a number of temporary accounts on the Wikipedia Library, which would expire automatically on the due date of each initiative?
  • Or any matching page where those needing research in a book/journal could be paired with those WM editors who are lucky enough to hold accounts, especially at publishers whose resources focus on each initiative? Even if not writing or proofreading content, book research itself is an area where some people are very resourceful.
  • The Wikipedia Library has partner establishments which set the number of total library cards or free-of-charge log-in accounts. That is why editors queue for vacancies, and those who hold accounts are encouraged to offer and do book research when requested. I have encountered only two jawp editors who were aware of the WL, so I translated the page into jawp. As Wangari Muta Maathai has said, Mottainai! (a waste of resources)
Omotecho (talk) 04:58, 1 March 2023 (UTC)Reply
Mottainai!? It's a small world as that was a remarkably controversial topic on the English Wikipedia -- see Talk:Mottainai and its archive. Andrew Davidson (talk) 10:16, 1 March 2023 (UTC)Reply

Where will this sit in the publish process?


Timing-wise, is Edit check going to run before the contributor attempts to publish, or will it interrupt the publish process? Other items in this workflow are pre-publish items like disambiguation resolution, and mid-publish items like CAPTCHA and AbuseFilter. Xaosflux (talk) 15:44, 28 February 2023 (UTC)Reply

I think the answer is "yes". Nothing's written in stone (or even in code ;-) yet, but the notion is that it could trigger:
  • after you've added some content (e.g., a whole paragraph) and/or
  • when you click the Publish changes… button.
The first option runs the risk of interrupting you. The second option runs the risk of people cancelling their completed edits.
I think that the very first version will not trigger frequently. (If there's something totally broken about the experience, you don't want it interfering with all the edits.) So – on the first day – it might not trigger on a page like w:en:User:WhatamIdoing/Christmas candy, because it contains just one short paragraph (115 bytes long) plus some lists. But eventually I would expect it to trigger when a short paragraph is added. Whatamidoing (WMF) (talk) 21:37, 28 February 2023 (UTC)Reply
From the early mockups on Figma, it also looked like there may be a difference with mobile vs. desktop, with mobile interrupting the publish process whereas desktop coming up as you edit. {{u|Sdkb}}talk 19:01, 3 March 2023 (UTC)Reply
The team decided recently that the mobile and desktop versions are probably going to be different. It's more work, but it could produce a more natural feel in the different platforms. Whatamidoing (WMF) (talk) 18:57, 24 March 2023 (UTC)Reply
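To make the "trigger after you've added a paragraph" idea above concrete, here is a minimal sketch of such a heuristic. Everything here is invented for illustration (the function name, the 150-character threshold, the regex); it is not the actual Edit Check implementation, just the shape of the rule being described:

```javascript
// Hypothetical sketch: fire the "add a citation" check only when the
// newly added text contains at least one paragraph that is long enough
// to look like real content and has no <ref> tag in it.
// The threshold and names are illustrative assumptions, not Edit Check's code.
function needsReferenceCheck(addedText, minLength = 150) {
  const paragraphs = addedText.split(/\n{2,}/);
  return paragraphs.some(
    (p) =>
      p.trim().length >= minLength && // long enough to be substantive
      !/<ref[\s>]/i.test(p) // no <ref> or <ref name=...> present
  );
}
```

Under this kind of rule, a short note or a paragraph that already cites a source would not trigger the check, which matches the "very first version will not trigger frequently" expectation above.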

This was discussed at the meeting today. I liked the suggestion that the process might be a background activity rather than an intrusive dialog. This would highlight issues in the text as the user edited or created it. Background shading or annotation would appear, drawing attention to issues such as spelling errors, missing citations, plagiarism, grammar and so on. An example of a tool which works in this way is Grammarly. I've not used it myself but it seems successful and so may be familiar to our users. We might copy the method of operation of such popular tools so that it seems intuitive. Andrew Davidson (talk) 22:49, 3 March 2023 (UTC)Reply

It looks like the first version will be more like "intrusive dialog" and less like "background activity", but the hope is to adjust that in the future. Whatamidoing (WMF) (talk) 02:12, 17 March 2023 (UTC)Reply

I liked the suggestion that the process might be a background activity rather than an intrusive dialog. This would highlight issues in the text as the user edited or created it. Background shading or annotation would appear, drawing attention to issues such as spelling errors, missing citations, plagiarism, grammar and so on.
— Andrew Davidson

@Andrew Davidson, Whatamidoing (WMF), and PPelberg (WMF): Suffusion of Yellow brings up an excellent point, below, regarding the privacy implications of this sort of approach.
A background task inherently means evaluating unpublished data — that's a privacy minefield, unless the analysis is performed entirely client-side without ever sending even a single byte of content back to the Wikimedia servers. (Which, for certain of those suggestions — e.g. plagiarism — is basically impossible.)
If data is evaluated server-side, then it would have to be immediately discarded because retaining users' private data for analysis / review purposes is a huge red flag.
Spiriting away unpublished content from the client edit interface would probably also run afoul of the GDPR, unless the user is given a (clearly-explained) choice and explicitly affirms their opt-in to data-sharing, before any such sharing takes place.
Even if the data collected were to be managed in a fully-anonymized fashion with all identifying metadata removed, it wouldn't make any difference. As Suffusion of Yellow points out, what we can't know for sure is that the content itself contains no sensitive personal data. So it just can't be collected or stored, full stop. Not anywhere, not for any length of time.
Storing potentially-sensitive data would be a bad idea, even if the only person with access was PPelberg (WMF) — forget about making any of it semi- or totally-public. Not having access to any of the actual content processed by Edit Check seems like it would hamper analysis of the tool's accuracy/performance.
All of these issues go away, once the user has signaled their intent to publish the edit interface's contents by hitting the "Publish" button. But for anything that occurs prior to that point, it's a completely different story.
Have any privacy experts (and/or the Evil WMF Lawyers™) been involved in discussions about Edit Check's features and design? The time to invite them to the table is yesterday. FeRDNYC (talk) 08:54, 4 May 2023 (UTC)Reply
This objection seems excessive. If users have such extravagant privacy requirements then these should be addressed at the outset rather than waiting for the Publish interaction, which is not a good place for it, as discussed below. Andrew Davidson (talk) 09:16, 4 May 2023 (UTC)Reply
I would not consider this an "extravagant" privacy requirement. Do you expect the recipients of your emails to be able to read them before you click "Send"? Suffusion of Yellow (talk) 21:04, 4 May 2023 (UTC)Reply
I use Gmail and expect that service to save drafts frequently while I'm working on them. They are saved into the cloud and that seems quite normal for such services now. This is not a bug, it's a feature, as losing a draft version because it hasn't been saved is quite annoying. If people have extravagant privacy requirements then they shouldn't be using internet services like Wikipedia and should be warned off at the earliest opportunity so that they don't waste their time. Andrew Davidson (talk) 10:25, 22 June 2023 (UTC)Reply
As noted below, we already are sending the content of unsaved edits to servers. Open up your browser's network monitor, and look for "stashedit". But as far as I know, that's discarded after one minute. I see no problem with sending back to user, and only the user, the result of some evaluation. It's allowing anyone but the user access to this result, or the text that was evaluated, that is the problem. Suffusion of Yellow (talk) 21:01, 4 May 2023 (UTC)Reply
I could check in with the Legal team about this, if/when that ever happens. While storing the data might be useful during development (e.g., to see whether the evaluation is happening correctly), I'm less certain that it would be wanted in the long-term.
As for whether it's legally acceptable: Do users in Europe see autocomplete search suggestions on major web search engines? Do you remember being "given a (clearly-explained) choice" or "explicitly affirm[ing] their opt-in to data-sharing" for that? AIUI sending the data back to the servers for evaluation is how that happens, and if it's okay for Google to do that without clearly explaining the mechanism and letting you opt out, then it's going to be okay for other websites to do that, too. I therefore doubt that it is a legally insurmountable problem. Whatamidoing (WMF) (talk) 18:33, 5 May 2023 (UTC)Reply
Re "storing the data might be useful during development": sure, on a separate wiki (beta or whatever) where people have a lower expectation of privacy, so long as it's clearly disclosed. On a production wiki? At a minimum, it should only be visible to NDA'd developers, and deleted-for-real (not "archived") when you're finished debugging. Yes, someone, somewhere at Google, has access to your autocomplete suggestions. But imagine if one day Google started sharing that history with all your contacts. That goes beyond "legally acceptable or not". That's creepy.
But each wiki is going to have its own highly customized "edit checks", right? Each time the equivalent of a local EFM updates one of the "edit checks", they need some way to know they haven't made a mistake. Looking for an NDA'd developer to go sifting through a log obviously does not scale. Hopefully, it will be enough just to look at the finished edits. Suffusion of Yellow (talk) 20:03, 5 May 2023 (UTC)Reply
@Suffusion of Yellow: "Hopefully, it will be enough just to look at the finished edits." Ooh, that just made me realize something. Because the Edit Check data is eventually working towards published content, it would be really easy to de-anonymize. Say the Edit Check log shows that some anonymous user was writing some text, which Edit Check flagged and suggested they change in some way. They do so, and the updated text becomes part of their edit when published. Well, the information on who made that edit is public, which means we know who received that suggestion, and we've now tied the input data back to them as well. Yeah, there's no such thing as anonymous Edit Check data. FeRDNYC (talk) 19:34, 7 May 2023 (UTC)Reply
(...For published edits. If the user ultimately decides not to publish their edit, then they haven't "outed" themselves as the same person who interacted with Edit Check.) FeRDNYC (talk) 19:40, 7 May 2023 (UTC)Reply
They could still out themselves by the content of the edit, e.g. "as I just told you, Bob", or "I made [this diff] because", etc. Absolutely nothing must be logged anywhere, except "this finished edit matched this edit check", which is something anyone with time on their hands could have worked out for themselves anyway. Suffusion of Yellow (talk) 19:45, 7 May 2023 (UTC)Reply
@Whatamidoing (WMF): That's a good question, regarding autocomplete. AFAIK it does work in Europe, although (just like in the US) you can opt out of it. However, Google sets strict limits on how that data is managed — it's quickly aggregated, so that the only strings their servers store are the ones that have been entered by thousands of users (if not more), which no longer makes it personal data. They've also been sued, more than once, over the results of those autocompletions.
And the enhanced spell checking feature in Chrome (which sends your typed input back to their servers) is explicitly default-off and opt-in only, even though it promises to anonymize and not retain the data, all out of privacy concerns. (Much greater ones than are involved here, of course, since it has access to everything you type in any input field. Including our editing interfaces.) FeRDNYC (talk) 19:11, 7 May 2023 (UTC)Reply
This is slightly off-topic. The issue of sending the text to servers is moot. The "stash edit" feature has been doing this for at least half a decade. It's why huge pages sometimes seem to save instantly; the server had already quietly parsed the page and even checked it against edit filters, and was just waiting for your confirmation. The reply tool with which you typed this response was sending data to the servers so it could show you a live preview. And of course "show preview" and "show changes" were server-side from the very beginning, and never explicitly disclosed that fact. The issue is storing the data, and worse, sharing it. Suffusion of Yellow (talk) 19:33, 7 May 2023 (UTC)Reply
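For the curious, the request Suffusion of Yellow describes uses the action API's stashedit module. The sketch below shows roughly what such a payload looks like; the field values are illustrative, and the token here is only the anonymous-session placeholder, so treat the exact parameter set as an assumption rather than a spec:

```javascript
// Sketch of the kind of payload the editor already sends while you type
// (look for "stashedit" in your browser's network monitor, per the
// discussion above). All values below are illustrative assumptions.
const stashPayload = new URLSearchParams({
  action: "stashedit",
  title: "Talk:Edit check",            // page being edited
  section: "new",
  sectiontitle: "Example",
  text: "Draft text the user has not yet published",
  contentmodel: "wikitext",
  contentformat: "text/x-wiki",
  format: "json",
  token: "+\\",                        // placeholder; a real CSRF token in practice
});
// POSTing this to /w/api.php lets the server parse and cache the draft so a
// later "publish" is fast; per the discussion above, the stash is short-lived.
```

The point of the sketch is that the transport already exists; the debate above is about retention and sharing, not about whether text leaves the browser.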
I agree that it's probably possible to do this (send information to the servers, and even to store it briefly) legally. I can ask Legal to particularly look at that.
But the bigger question, and one that Legal might not accept, is sharing the information. But if editors can't see the information, that might limit their ability to test their own checks. The underlying software could be made as open-ended as Special:AbuseFilter, and it could be made as pre-packaged as Growth's Newcomer Home Page. But if it's open-ended, I'd hope that editors are verifying that it's really catching what it should be. There is always a risk that when you try to prevent one bad edit, you will accidentally lose two good edits as well. If you can't see what's happening, you might not know that a given check is causing more harm than benefit. Whatamidoing (WMF) (talk) 22:32, 20 May 2023 (UTC)Reply
My objection goes beyond "is this legal?" See my example about Alice and Bob below. It's not just questionably legal, it's toxic. Either you have to be worried at every keystroke that your innermost thoughts are going to be shared, or you get yelled for something you never intended to publish. Suffusion of Yellow (talk) 22:56, 20 May 2023 (UTC)Reply
I agree that it's a significant worry. It's one thing to have a couple of people with a contractural obligation to maintain confidentiality who are checking something on many edits, for a particular purpose ("Your call may be recorded for quality purposes...").
It's quite another thing to have an editor with no obligations, and who might be in a dispute with you, reading the words you typed before you publish them, including things you did not publish.
I would not like to see this be possible. But: There are consequences for this decision. Being able to see what editors do before they publish (e.g., right now, they can count how many editors click on a button in the editing toolbar) helps developers know whether a tool is working (e.g., nobody clicks on that button; people who click on that button abandon their edits; clicking on that button results in fewer reversions...). Having that visibility into pre-publication actions improves the tools (e.g., nobody clicks on that button → change the label on the button, so users know what the button does).
Perhaps if editors can't (and they normally shouldn't!) have the information that is necessary to find out whether the tool is working, then they also shouldn't be creating brand-new tools that ought to use this sometimes-private information for testing purposes.
As a simple example, imagine that someone wants to encourage new articles to be well-sourced. This well-intentioned person creates a "check" that guesses whether the article is about a living person and requires the citation of six sources, and that only sources from a pre-approved list are counted.
What they will notice is:
  • All new articles about living people contain at least six citations from sources on the pre-approved list. 🙌
They might even notice that:
  • Fewer articles are created.
  • Fewer new articles end up deleted during the first 30 days.
What they will not necessarily see is:
  • Fewer articles about dead people are created, because the heuristic for detecting whether the subject is alive is poor.
  • Fewer articles that aren't about people at all are created, because the heuristic for detecting whether the subject is about a person is poor.
  • Articles about living people don't get created, because the editors couldn't figure out how to format the citations in a way that was recognizable to the new "check".
  • Articles about living people don't get created, because not all of the good sources were included in the pre-approved list.
  • The pre-approved list of sources is focused on English-speaking content, so the creation of articles about people from non-English-speaking countries is particularly badly affected.
  • More articles about certain borderline subjects (e.g., minor politicians, newly hired athletes, first-time actors) are being created.
  • It doesn't trigger when a redirect is turned into an article, so editors bypass the new "check" by creating articles as redirects first and only then adding article content.
...and so forth.
When the Editing team creates software, they look for these kinds of bad outcomes. This sometimes requires more information than is visible in a diff. If you don't feel that you could trust volunteer devs to have the pre-publication information that is necessary to notice such problems, do you also think we should limit the "check" system so that they can't (independently) write their own, potentially inadequately tested, versions? Whatamidoing (WMF) (talk) 18:56, 23 May 2023 (UTC)Reply
As in, have to file a phab ticket every time we think a "check" might need tweaking? And wait the usual one week to fifteen years to get it fixed? No, that would be terrible in its own way. It would effectively give the WMF a channel to tell users the right way to write content. Traditionally, there's been a "separation of powers" and that would be a major breach. I think a combined approach will work:
  • First, on every popup, have a prominent "report error" button. Not some ad hoc system like we have with edit filters, but built right into the extension. When they click that, it's clearly disclosed that now the contents of the edit form will be made public (and also CC-BY-SA, you are editing logged out, blah blah blah). They can click "cancel" if there's something private or copyrighted.
  • Second, as discussed above, a log of those checks which still match when the user clicks "publish". I understand the worry that this won't be adequate, but really, from working with warn-only edit filters, I think that maybe 50% of the time, even when the filter is correct, people just ignore the warning. And it's probably about 95% when the filter is wrong. In your example above, nearly everyone creating an article about a dead person will think, "oh, stupid buggy system", ignore the popup, and publish anyway. And then we'll have the data we need.
And no, I don't want either of these logs to be limited to "volunteer devs" or any special group. With edit filters, we value transparency; and only mark filters private if there's a good reason to do so. We certainly don't mark "good-faith" edit filters (like en:Special:AbuseFilter/869) private. Suffusion of Yellow (talk) 19:39, 23 May 2023 (UTC)Reply
I don't think you were around for the Article Feedback Tool, which was removed by community demand in 2014. I can recommend reading w:en:User:Risker/Risker's checklist for content-creation extensions in general (FYI @Risker), and I'm sure that a feedback tool inside the editing window won't result in as much extra work for oversighters as the ones shown to readers did, but it could still be a lot.
Also, feedback tools usually have a pretty poor signal:noise ratio. Some people will report errors on correctly functioning software (e.g., spammers), and most newcomers just trust that the software is correct, or they'll click past it and decide that it's not worth their time to report the problem. The false positives and false negatives are both significant.
And, of course, this is assuming that the problem is fairly obvious to the end-user. If the check says "This appears to be an article about a living person, and it appears to have zero sources. Sources are required for all regular Wikipedia articles" with options like Not a living person and Already cites sources and This is a disambiguation page, then that might result in people getting past it when the check is buggy.
But if it says instead "This article must have more sources. You cannot publish this unless you add citations to reliable sources", with the only possible response being Add sources, then I would expect a very different reaction from the users. Whatamidoing (WMF) (talk) 18:23, 24 May 2023 (UTC)Reply
With edit filters, the tradition is to almost never disallow an edit unless it is disruptive. Certainly I wouldn't want Edit Check to be any different. It should offer suggestions, and nothing more. If something really should be disallowed, we can continue to use AbuseFilter. Suffusion of Yellow (talk) 18:39, 24 May 2023 (UTC)Reply
I could imagine editors wanting Edit Check to "disallow" the addition of any URL that matches the spam blacklist, and once "disallow" is an option, it's an option that could be used against (the English Wikipedia's) tradition. (Different wikis have different traditions.) Whatamidoing (WMF) (talk) 16:46, 13 June 2023 (UTC)Reply

Clarify please: warning for SpamBlacklist warning


Done As it says, "Warn when the URL added as a reference is registered in the SpamBlacklist, and thus prevent the warning from appearing when saving the page": does this mean the SpamBlacklisted URL is removed automatically? Omotecho (talk) 09:56, 4 March 2023 (UTC)Reply

@Omotecho, that might be possible in the future, but I think it would probably just show a message and (I hope) highlight the place where the link is located. Whatamidoing (WMF) (talk) 02:11, 17 March 2023 (UTC)Reply
I see, and case closed. Thank you helping me (: Omotecho (talk) 03:24, 17 March 2023 (UTC)Reply

Privacy


It's suggested that we are going to be able to Review the edits people who are shown Edit Check are making. But the user hasn't clicked "Publish" yet. Publish means "make public". Until they click that, it's private. The contents of the edit form might contain all sorts of private information: cut & paste fails of passwords or personal info, ill-thought-out personal attacks, copyrighted text, and so on. The user hasn't agreed for it to be shown to anyone, even users with special rights. So how will we be able to find and adjust broken filters/checks? Suffusion of Yellow (talk) 22:27, 13 April 2023 (UTC)Reply

Editing a complex text like an article will naturally tend to require some interaction. For example, I commonly use the visual editor to generate citations from URLs and it is good practice to do this as one goes along. If such interaction causes privacy concerns then these are best dealt with at the outset, before the user starts editing. The user should be warned of any such implications in advance so that they don't waste their time or risk their privacy. It's no good waiting for a final publish interaction. As that may come at the end of a lengthy editing session, the user will tend to be weary at this point and disinclined to read through detailed formalities and legal warnings. If these are important, they are best done when the user is fresh. Andrew Davidson (talk) 09:09, 4 May 2023 (UTC)Reply
It's not the information being sent to the servers that I'm worried about. stashedit has been doing that for years. But a stashed edit is kept for what, one minute? And even if it's kept for longer, only some NDA'ed developer is ever going to see it. But what's being proposed here would seem to be making public (or at least sharing with some huge group, like admins) the contents of the edit box, before the user intended to make it public.
For example, suppose Alice types Bob, you're a total moron, regrets that a few seconds later, hits backspace, and types Bob, I respectfully disagree. Should Bob be able to access Alice's original comment?
A disclaimer won't help here; we're talking about fundamentally changing the way we interact with Wikipedia. To start, it'll no longer be possible to preview anything you don't want public. It'll no longer be possible to keep copyrighted text in the edit form, for easy reference. We'll have to be on edge every time we hit CTRL-V.
I can see a few solutions, though:
  • Only log the edit if the problem was still there when they clicked "publish". That would be incomplete data, but possibly enough.
  • Provide a "report problem" button. When the user clicks that, warn them then that they're about to make a public post, and they should remove any copyrighted material, etc.
Suffusion of Yellow (talk) 20:50, 4 May 2023 (UTC)Reply
In the early days, "review" will mean "review via Special:RecentChanges". The first step will be a new item in Special:Tags (which was recently created, but which just picks up the addition of any 50 characters, so not really functional right now) that shows finished edits that EditCheck should have triggered on. We'll be able to make sure that it's (usually) picking up the right things before the tool is shown to any editors at all. Whatamidoing (WMF) (talk) 18:37, 5 May 2023 (UTC)Reply
@Andrew Davidson, @FeRDNYC, and @Suffusion of Yellow: thank you for opening up this conversation about the relationship between Edit Check configurability/extensibility and the Movement's/Foundation's commitment to protecting peoples' privacy. Also, thank you for being patient with me in responding here.

Below is an attempt to address the points I currently understand y'all to be raising in this discussion and the discussion @Xaosflux started above that evolved to encapsulate questions around privacy.

Of course, if you think I've misunderstood or missed anything about what y'all are expressing here, please let me know. PPelberg (WMF) (talk) 23:08, 16 June 2023 (UTC)Reply
As Suffusion of Yellow points out, what we can't know for sure is that the content itself contains no sensitive personal data. So it just can't be collected or stored, full stop. Not anywhere, not for any length of time. via FeRDNYC
@FeRDNYC, as Edit Check grows more nuanced, there may be a point where the software will need to send unpublished data people input during an edit session to servers to be evaluated. Note: this data will not be automatically stored or shown to anyone.
Although, as @Suffusion of Yellow pointed out, stashing provides a precedent for this. Before implementing functionality of this sort we will ensure it adheres to the Foundation's established privacy standards. Here is the ticket @Whatamidoing (WMF) suggested to hold the Editing Team accountable for delivering on this by consulting with WMF legal staff: T339316. PPelberg (WMF) (talk) 23:10, 16 June 2023 (UTC)Reply
"But what's being proposed here would seem to be making public (or at least sharing with some huge group, like admins) the contents of the edit box, before the user intended to make it public." via Suffusion of Yellow
@Suffusion of Yellow: the Editing Team has no plans now, or in the future, to share the contents of an unpublished edit publicly or with other volunteers, regardless of the rights/permissions volunteers are afforded.
Can you please share what you've seen that might've contributed to you perceiving us as planning this kind of functionality so that I can update it to make this commitment explicit/unambiguous? PPelberg (WMF) (talk) 23:11, 16 June 2023 (UTC)Reply
I don't recall seeing any place where this was implied. Just trying to head off any bad ideas at the pass. :-) Suffusion of Yellow (talk) 02:07, 17 June 2023 (UTC)Reply
@Suffusion of Yellow, understood and thank you. The entire Editing Team continues to be grateful for how proactive you're being about naming potential issues the idea of Edit Check brings to mind. We're eager to hear what you think of the prototype that will be ready in the next few days. PPelberg (WMF) (talk) 15:52, 20 June 2023 (UTC)Reply
But each wiki is going to have its own highly customized "edit checks", right? Each time the equivalent of a local EFM updates one of the "edit checks", they need some way to know they haven't made a mistake. Looking for an NDA'd developer to go sifting through a log obviously does not scale. via Suffusion of Yellow
Two things:
1. Yes, we are designing the Edit Check system in ways that enable individual projects to configure existing checks and potentially author new ones.
2. And we agree with you in thinking volunteers will need a way to verify the checks they've authored/configured are working as they expect without needing an NDA'd developer to intervene.
With this in mind, @Suffusion of Yellow: are there existing systems on-wiki whose debugging capabilities you think we should look to as a source for inspiration? We started gathering a few ideas in T327959. PPelberg (WMF) (talk) 23:16, 16 June 2023 (UTC)Reply
The most important thing is, when the user ignores the check and chooses to publish anyway, everything about the (final) edit is saved, just like at Special:AbuseLog. So long as we can query that from the API, we can build tools. Suffusion of Yellow (talk) 02:17, 17 June 2023 (UTC)Reply
everything about the (final) edit is saved, just like at Special:AbuseLog.
@Suffusion of Yellow, the point you're raising above feels important...can you please say more/be explicit about what "everything about the (final) edit is saved" means in the context of Special:AbuseLog? PPelberg (WMF) (talk) 23:40, 23 June 2023 (UTC)Reply
I mean, just look at an example AbuseLog entry. Along with the filter pattern, everything you need to work out why the filter matched is there. So:
  1. Alice adds an unreferenced paragraph
  2. Edit Check asks Alice for a reference
  3. Alice declines, and does not choose to make a public report.
  4. Alice makes some more edits, but leaves the unreferenced paragraph there.
  5. Alice clicks "publish"
At this point, another check is run on whatever Alice is trying to publish, and if it matches any Edit Check, the result is saved in something very similar to Special:AbuseLog. Of course this isn't perfect, because we can't see what the text was at step 2. But it's the best we can do. Suffusion of Yellow (talk) 21:05, 25 June 2023 (UTC)Reply
@Suffusion of Yellow you sharing this example was the prompt I needed to look more closely at all of the information that is provided alongside each abuse log entry...thank you. I've added this example to T327959. PPelberg (WMF) (talk) 22:36, 26 June 2023 (UTC)Reply
So long as we can query that from the API
I'm going to raise this topic of an API in a new thread to increase visibility. PPelberg (WMF) (talk) 23:42, 23 June 2023 (UTC)Reply
...have to file a phab ticket every time we think a "check" might need tweaking? …No, that would terrible in its own way. | source
@Suffusion of Yellow, agreed. A system design that would require tweaks to Edit Check be made by MediaWiki developers making changes to code that needs to ride a weekly deployment train does not seem attractive or feasible to me. PPelberg (WMF) (talk) 23:17, 16 June 2023 (UTC)Reply
First, on every popup, have a prominent "report error" button. Not some ad hoc system like we have with edit filters, but built right in to the extension. When they click that, it's clearly disclosed that now the contents of the edit form will be made public (and also CC-BY-SA, you are editing logged out, blah blah blah). They can click "cancel" if there's a something private or copyrighted. | source
+1 to the idea of including an option for people to express that they think Edit Check is appearing in error.
To start, we're thinking we'll expose this option to people who decline to add a reference when invited to do so.
The work to implement this is happening in T329593. We've yet to converge on the language for this choice, so if ideas come to mind, please let us know as much by commenting on the ticket. Note: you can see a work in progress design pictured below.
Working design. More in phab:T329593
As to the idea you're raising about sharing the cases where people believe Edit Check to have appeared in error, we're not yet sure – technically or socially – how to go about doing this. We filed T338909 as a placeholder. So if ideas emerge, I'd value you sharing them there. In the meantime, I've linked the comment above on that ticket so that we can refer to it when we prioritize this work. PPelberg (WMF) (talk) 23:22, 16 June 2023 (UTC)Reply

Where can I find a live example?

[edit]

It's written that test wiki, en wiki and fr wiki should already have this feature deployed. However, I tried to add changes with more than 50 characters and still nothing (changes published without asking for citations). Can you tell when this will be deployed and whether it's possible to see this feature live now? Kanzat (talk) 13:57, 19 June 2023 (UTC)Reply

hi @Kanzat – thank you for writing to ask! A couple of responses to the questions you posed...
Can you tell when this will be deployed and whether it's possible to see this feature live now?
If all goes according to "plan," the initial reference Edit Check will become available as a beta feature at an initial set of wikis before July is over.
With the above said, an initial prototype of the mobile experience will be available before tomorrow is over.
It's written that test wiki, en wiki and fr wiki should already have this feature deployed.
Can you please share what you read that contributed to the impression you shared above? I ask this wanting to update the language you're referencing so it's clear that Edit Check is not yet available in production on any wiki. PPelberg (WMF) (talk) 19:22, 20 June 2023 (UTC)Reply
Thanks for reply. I probably misread status update from May 12th. It's written there about prototype in test wiki. And then something about new tag in frwiki and enwiki, so I thought functionality should be there too. Kanzat (talk) 20:32, 20 June 2023 (UTC)Reply
Understood, okay! Thank you for following up, @Kanzat ^ _ ^ PPelberg (WMF) (talk) 20:55, 20 June 2023 (UTC)Reply

Seeking Feedback: Edit Check Prototype

[edit]

hi y'all! Below are instructions for trying and sharing feedback about the mobile prototype for the first Edit Check.

Edit Check demo (mobile)

If you find anything about the instructions below confusing/unclear, please comment here so that we can try to help.

For context, this first Edit Check will prompt newcomers who are contributing new content without including a corresponding reference to consider doing so. PPelberg (WMF) (talk) 22:22, 21 June 2023 (UTC)Reply

Try the prototype
  1. On a mobile device, visit an article page on the patch demo wiki. E.g. https://patchdemo.wmflabs.org/wikis/dbe9212625/w/index.php?title=Douglas_Adams&mobileaction=toggle_view_mobile
  2. Tap any edit pencil you see on the page
  3. Decide whether you'd like to Edit without logging in or create a temporary account by tapping the Sign up button to switch editing modes.
  4. Ensure you are using the visual editor. If not, please switch to it by tapping the button
  5. Add a couple of new sentences to the article you arrived at in "Step 1." without including a reference
  6. Tap the > button to proceed to publishing
  7. Follow the prompts you're presented with to publish the changes you made in "Step 5."
  8. ✅ You are now ready to share feedback.
PPelberg (WMF) (talk) 22:26, 21 June 2023 (UTC)Reply
Share feedback
  1. Click here to start a new topic on the talk page.
  2. Share what you think as responses to the following prompts [i]:
    1. What did you find unexpected about the prototype?
    2. What do you like about the prototype?
    3. What do you wish was different about the prototype?
    4. What concerns you about this prototype?
    5. What questions does this prototype bring to mind?
  3. Click the blue "Add topic" button.
  4. ✅ You are done! Thank you!
---
i. Please do NOT feel limited by the prompts above. The Editing Team is interested in any and all feedback. So, if you feel like what you want to share does not fit neatly within these prompts, don't worry, please share the feedback you have to offer in whatever way feels natural to you. PPelberg (WMF) (talk) 22:29, 21 June 2023 (UTC)Reply
Do we need to use a mobile useragent for this, or is there a mobilefront end url? (Or is using Minerva enough for this test?) Xaosflux (talk) 15:35, 22 June 2023 (UTC)Reply
@Xaosflux, great question. At present, you'll need to use the prototype on a mobile device.
Before next week is over, we plan for the desktop prototype to be ready. Tho, we're not quite there yet.
Note: in the meantime, I've updated the "Try the prototype" instructions to make this explicit. PPelberg (WMF) (talk) 18:45, 22 June 2023 (UTC)Reply
Pinging a few people who I think will have valuable feedback and insight to offer about the Edit Check mobile prototype: @Andrew Davidson, @Clovermoss, @FeRDNYC, @Kanzat, @Mathglot, @Omotecho, @Sdkb, @Suffusion of Yellow, @Xaosflux PPelberg (WMF) (talk) 22:35, 21 June 2023 (UTC)Reply
When I click "Publish changes" button, the dialog has such a small height, that I can't see anything. Kanzat (talk) 23:28, 21 June 2023 (UTC)Reply
@Kanzat, what you described does not sound ideal...can you please share a screenshot of what you're seeing? PPelberg (WMF) (talk) 23:35, 21 June 2023 (UTC)Reply
test of WMF feature
My result seemed similar.. Here's a screenshot. Andrew Davidson (talk) 08:40, 22 June 2023 (UTC)Reply
@Andrew Davidson, can you please try the prototype again using a mobile device?
And thank you for saying something about this and sharing this screenshot, @Andrew Davidson...you doing so helped prompt me to update the testing instructions above 🙏🏼 PPelberg (WMF) (talk) 18:48, 22 June 2023 (UTC)Reply
I've made three edits, two of which were minor and one that was slightly more substantial. None of them have given me the prompt. Am I not adding enough? I was hoping the most recent one with a few sentences would trigger it. I would like to note that I did receive a prompt about using the suggested edits feature after the third edit and this message had a bunch of numbers in it? I could email a screenshot of what that looked like if you are interested. Clovermoss (talk) 12:30, 22 June 2023 (UTC)Reply
hi @Clovermoss! Thank you for trying out the prototype with such swiftness 🙏🏼
I've made three edits, two of which were minor and one that was slightly more substantial. None of them have given me the prompt. Am I not adding enough?
Assuming this edit is the "more substantial" one you were referring to, can you please try making another edit (with a similar amount of text: 1-2 sentences) and ensure you are using the visual editor to do so?
At present, Edit Check is only activated for people using the visual editor. Tho, I'm now realizing the instructions as previously written did not make this clear. I've updated them to make this explicit. I'm sorry for the confusion. PPelberg (WMF) (talk) 00:47, 23 June 2023 (UTC)Reply
I did try again and I recieved the prompt this time. I was impressed by the feature. :) Is there anything else you were looking for/wanted to ask? It looks like you've put a lot of thought into how this works so I'm not sure there's really anything that I'd suggest you do differently. Clovermoss (talk) 08:40, 24 June 2023 (UTC)Reply
I did try again and I recieved the prompt this time.
Wonderful, okay! Thank you for giving it a second try.
I was impressed by the feature. :)
Were there particular aspects of the experience that you noticed? No worries if there aren't details you can recall. Knowing what you thought about the overall experience is valuable unto itself.
Is there anything else you were looking for/wanted to ask?
Outside of the overall impression the prototype left you with, I wonder what – if any – questions you'd value being addressed before you'd be confident in the feature being offered in production? PPelberg (WMF) (talk) 22:49, 26 June 2023 (UTC)Reply

Feedback: Sdkb

[edit]

I may come back for more thorough feedback later, but for now, when I dismissed the suggestion to add a citation, it felt like follow-through was lacking. When someone selects "this info is already cited elsewhere," it should prompt them to reuse a ref, and when someone selects "I don't know what citation to add" it should offer up info about citations to try to help them out. It might also be good for research purposes to have the "other" answer give the opportunity to provide feedback. {{u|Sdkb}}talk 06:18, 22 June 2023 (UTC)Reply

Thank you for taking the time to promptly review the prototype, @Sdkb.
When someone selects "this info is already cited elsewhere," it should prompt them to reuse a ref, and when someone selects "I don't know what citation to add" it should offer up info about citations to try to help them out.
Oh! Interesting ideas. I've taken a first pass at articulating these in ticket form here: phab:T341535.
Meta: I appreciate how both of the ideas you're describing above share this notion of offering people ways to go "deeper" / "learn more."
It might also be good for research purposes to have the "other" answer give the opportunity to provide feedback.
Agreed; here's a ticket for this idea: phab:T341536.
Zooming out a bit, would it be accurate for us to think you did not see any issues with the current "No" suggestions themselves?
Reasons for declining to add a reference
PPelberg (WMF) (talk) 23:27, 10 July 2023 (UTC)Reply
@PPelberg (WMF), the current suggestions look good, but they don't quite capture all the common decline reasons an editor might have.
One option to add to the list might be "I added info that does not need a citation". In most cases, the response will be, "yes, it does, because w:WP:Verifiability," but it would be helpful to know when users are thinking this. There's a bit of a risk that, by including such an option, we give users the impression that sometimes citations are not needed, but that would be mitigated if the interface returned an explanation on verifiability whenever someone chose that option.
Another common use case would be the "I just can't be bothered" one, where an editor knows that a citation would be preferred, but just doesn't think it worthwhile to put in the effort to provide it. The current suggestions don't capture that, but adding an option like "I don't think a citation is important" might. The response to it from the interface would be to emphasize that uncited information is much more likely to be reverted, so if they want their edit to stick, they should really provide one. {{u|Sdkb}}talk 14:31, 12 July 2023 (UTC)Reply
I disagree that if the same information appears twice in an article, both instances should have a citation to one source. Citing the same source several times should be done when different pieces of information are obtained from the same source. 24.62.188.162 13:29, 12 July 2023 (UTC)Reply
The same information isn't generally appearing twice in an article, other than once in the lead and once in the body. But often two pieces of information from the same source appear, and in that case the general expectation at en-WP is to reuse the citation. Anything else creates undesired redundancy. {{u|Sdkb}}talk 14:07, 12 July 2023 (UTC)Reply
I'm concerned that this feedback may be rather enwiki-centric, but even at the English Wikipedia, w:en:Wikipedia:When to cite#When a source or citation may not be needed (the standard for FAs) does not require duplicate citations for most repeated information, and the guideline at w:en:Wikipedia:Citing sources#Consecutive cites of the same source says the same thing.
The same information can appear many times in an article. The fact that COVID-19 is caused by a virus (e.g., rather than a bacterium or a parasite) appears in w:en:COVID-19 a few dozen times. It is a direct claim only a handful of times (e.g., lead sentence, infobox, ==Cause== section), but a sentence like "People remain contagious for up to 20 days and can spread the virus even if they do not develop symptoms" also contains an indirect claim that the disease is caused by a virus. Whatamidoing (WMF) (talk) 19:42, 22 August 2023 (UTC)Reply

Feedback: Andrew Davidson

[edit]

I didn't get a functional dialog. Here's a screenshot.

test of WMF feature

A couple of other observations.

1. I made some other edits of the Adams article as an IP. I repeatedly got demands for CAPTCHA -- a painful process which I'm not used to as a registered editor. When I made the test edit to add sentences, I didn't get the CAPTCHA, iirc, which indicates that the new feature is cutting in first. This seems the wrong way round.

2. When I made a screenshot of the issue above, I then faced the challenge of inserting it here. I used the upload feature and this took me to Commons which I usually avoid as I prefer to directly upload to the English Wikipedia using their simpler interface. The Commons process was painful too, with a tiresome Wizard which I mostly ignored and then a maze of inputs for WikiData and whatever. This seemed quite excessive for just a simple screenshot. The issue is that multiple layers of well-meaning but tiresome demands and nagging are a big turnoff when you just want to get something simple done. If we are adding yet another layer, there should be monitoring of how many users start the process but then give up in disgust or frustration.

3. The test article is a biography of Douglas Adams. This is entertaining and I am strongly reminded of the Sirius Cybernetics Corporation. For more cheap entertainment of this sort, I recommend Are you a robot? which, by coincidence, resurfaced here recently when I was tidying up. "Share and Enjoy!"

Andrew Davidson (talk) 09:15, 22 June 2023 (UTC)Reply

@Andrew Davidson: thank you for trying out the prototype and being patient with me here. Responses to what you shared below...
I didn't get a functional dialog. Here's a screenshot.
Can you please try again using this updated version of the prototype https://patchdemo.wmflabs.org/wikis/9a60b4369e/wiki/Douglas%20Adams which we've updated to work on desktop machines?
When I made the test edit to add sentences, I didn't get the CAPTCHA, iirc, which indicates that the new feature is cutting in first. This seems the wrong way round.
To make sure I'm understanding: are you suggesting that it seems more intuitive to you that people be prompted to complete the CAPTCHA before being prompted to add references? If so, can you say a bit more about what's leading you to think as much?
The issue is that multiple layers of well-meaning but tiresome demands and nagging are a big turnoff when you just want to get something simple done. If we are adding yet another layer, there should be monitoring of how many users start the process but then give up in disgust or frustration.
I hear you on seeing the potential in streamlining the Commons upload experience. In fact, it looks like there is a team at the Foundation who will be focusing on this very thing in the coming months. See phab:T337466.
Sirius Cybernetics Corporation
Ha. This is the first time I'm hearing of this – thank you for sharing :) PPelberg (WMF) (talk) 22:53, 17 July 2023 (UTC)Reply
I don't think you can ask for a CAPTCHA in the pre-ref state, because you don't know whether the ref will contain a URL. Whatamidoing (WMF) (talk) 19:47, 22 August 2023 (UTC)Reply

Imagining an Edit Check API

[edit]

Based on what you know about Edit Check so far, and other projects like Abuse Filter that Edit Check draws inspiration from, what kinds of information / functionality can you imagine being useful to have access to through an API?

We're asking the above in response to @Suffusion of Yellow raising the idea of an API in the context of a broader conversation about how we might create ways for volunteers to independently audit, configure, and potentially author Edit Checks (T327959).

Note: please do not feel urgency around the above. This discussion is very much exploratory as the Editing Team remains focused on first validating the initial reference Edit Check we're in the midst of developing and seeking feedback about. PPelberg (WMF) (talk) 23:50, 23 June 2023 (UTC)Reply
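For concreteness: the kind of queryability @Suffusion of Yellow describes already exists for AbuseFilter, whose log can be read through the Action API's list=abuselog module. A hypothetical Edit Check API could expose something analogous. The sketch below only builds a request URL and sends nothing; the endpoint is an example value, while the parameter names come from the AbuseFilter extension's API module:

```python
from urllib.parse import urlencode

# Endpoint of a wiki running the AbuseFilter extension (example value).
API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def abuselog_query(filter_id=None, limit=10):
    """Build an Action API request URL for Special:AbuseLog entries.

    Parameter names (list=abuselog, aflprop, aflfilter, afllimit) are
    those of the AbuseFilter extension's API module. Note the "details"
    property is only returned to accounts with the relevant permission.
    """
    params = {
        "action": "query",
        "list": "abuselog",
        "aflprop": "ids|user|title|action|result|timestamp|details",
        "afllimit": limit,
        "format": "json",
    }
    if filter_id is not None:
        # Restrict results to hits from one filter.
        params["aflfilter"] = filter_id
    return API_ENDPOINT + "?" + urlencode(params)

# Example: the 10 most recent hits for filter 30 (no request is sent here).
url = abuselog_query(filter_id=30)
```

An Edit Check equivalent would presumably want similar properties: which check fired, what the person chose (e.g., the decline reason), and the resulting revision ID.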

Feedback: Suffusion of Yellow

[edit]

So, I'm not sure what the point of this prototype is. It only prompts me after I begin the publish process; is that intended? Because I was under the impression that I would get some sort of reaction before that; hence all my concerns about privacy. So far, you've mostly reinvented Special:AbuseFilter's "warn" option under a new name. I say "mostly" because you've added one nifty feature that edit filters don't have: the ability to highlight the problematic text in the edit window. So why not patch AbuseFilter to allow something like that, instead of building a new extension from scratch? Suffusion of Yellow (talk) 20:52, 25 June 2023 (UTC)Reply

Took the words right out of my mouth. My assumption has been that it works like (and will replace) the disambiguation link warning. I get that you wouldn't want a prompt about citation every key stroke or every few seconds, but that's exactly why the prompt is a baffling choice for a demo. Nardog (talk) 21:09, 25 June 2023 (UTC)Reply
I get that you wouldn't want a prompt about citation every key stroke or every few seconds, but that's exactly why the prompt is a baffling choice for a demo.
hi @Nardog – can you please say a bit more about the above? Asked another way: can you please describe what's leading you to think the current experience could cause people to become frustrated/distracted?
Note: I think we're aligned in thinking people would become distracted and frustrated were the interface to prompt them repeatedly to add a citation while they're in the midst of drafting new content. That's why we've started with an initial approach that presents people with a prompt once we're fairly certain they're finished typing. PPelberg (WMF) (talk) 23:37, 17 July 2023 (UTC)Reply
You seem to have completely misconstrued my comment. Let me rephrase it hopefully more clearly: My assumption has been that the edit check will work like the disambiguation link warning in that it gives you notifications as you edit (i.e. T315072). If it gives you prompts after you click Publish then there's little utility in it because that function already exists in AbuseFilter. That's why I find the prompt about citation a baffling choice for a demo, because citation is not something that needs to be prompted about as you edit. Nardog (talk) 05:46, 18 July 2023 (UTC)Reply
Thank you for clarifying what you meant, @Nardog.
Responses and a follow up question in response below.
Note: I think some of what's contained below applies to some of the points @Suffusion of Yellow raised below. Although, I'll respond to that comment directly in a subsequent message.
"My assumption has been that the edit check will work like the disambiguation link warning in that it gives you notifications as you edit (i.e. T315072). If it gives you prompts after you click Publish then there's little utility in it because that function already exists in AbuseFilter."
I think we are aligned in wanting to avoid allocating resources implementing functionality that already exists.
With the above said, it seems you perceive Edit Check, as currently implemented, to be undifferentiated from Abuse Filter and as a result, see the former as being redundant. Of course, please tell me if I've misunderstood you.
The Editing Team on the other hand sees Edit Check to be different enough from Abuse Filter to consider the two projects as distinct and therefore not duplicative and potentially, uniquely valuable. More on the "valuable" piece below.
As to why the Editing Team thinks this…
The Editing Team sees Edit Check and Abuse Filter as similar in so far as they both seek to offer volunteers an on-wiki way to specify that certain messages be shown when an edit meets certain criteria.
The Editing Team sees Edit Check and Abuse Filter as different in the extent to which the messages/feedback they present are integrated into the editing experience…
Edit Check will present people feedback that is paired with calls to action and features that invite them to immediately improve the edit they're making, without needing to consult documentation or exert effort locating the specific part of the change the feedback is relevant to.
Now, I think it's important to say explicitly that we have NOT yet proven that the points of distinction I named above amount to real impact. We've shared how we're thinking about evaluating the impact of this project so that we, staff and volunteers, can converge on an evaluation framework that encompasses its nuances.
In fact, writing the above out leads me to wonder: what do you think about potentially running a test that looks something like this…
Of the people who are attempting to add new content without a source, ⅓ are shown an abuse filter, ⅓ are shown an Edit Check, and ⅓ are shown nothing, and then comparing how people's behavior varies between these three groups?
Note: I'm not sure if the above is technically feasible. Tho, I wonder if it is a start to us arriving at a way of understanding how impactful these different experiences are for improving the quality of edits people make. PPelberg (WMF) (talk) 23:28, 23 August 2023 (UTC)Reply
My main objection is not that it duplicates AbuseFilter (although it does), but that you claim this project is about "Offer[ing] people actionable feedback about Wikipedia policies while they are editing" (emphasis in original) yet you're NOT building it.
As demonstrated by the disambiguation link warning and as discussed in T315072, there's tremendous potential and utility in parsing the source and offering warnings and error messages while the user is editing, so as to reduce the significant number of volunteer hours spent on fixing lint errors, template errors, inappropriate sources, MoS violations, etc., which is currently done mainly through tracking categories, preview warnings, tags (via AbuseFilter), and special pages.
You may choose not to work on such a thing, but at least be honest and accurate about what you're actually building (i.e. offering feedback after submitting changes, and only in Visual Editor), so that others who might want to work on T315072 won't mistakenly think it's already happening. Nardog (talk) 15:11, 24 August 2023 (UTC)Reply
The time in between clicking the edit button and the change being published is "while editing", right? There is:
  • before you are editing – before you click the edit button
  • while you are editing – after you click the edit button and before you publish your edit
  • after you are editing – after you publish your edit
From minimum viable product to more complex product
The current prototypes are closer to the skateboard end of this spectrum than to the car. A skateboard is faster than walking, but it's not yet self-propelled, we don't yet have a comfortable place to sit down, and we'll still get wet if it rains. We might be on the right path, but this isn't the finished product.
The Editing team has talked about providing near-instant feedback, but it's the nature of a first testable iteration that not every feature is there yet. It's more sensible to try something small and see whether it's a worthwhile concept at all.
In particular, figuring out the right time to provide real-time feedback in an instantaneous system, without interrupting people, is really difficult. Much larger teams have tried and failed at that. You need to provide the feedback before the person moves on to another task, but after they're definitely done. There's no obvious way to differentiate between pauses that mean "I'm stuck and don't know what to do next – some help would be nice" and the ones that mean "Stop interrupting me, I only paused to grab a copy of the URL for the source that you were telling me to add!" And, of course, the whole system has to work on a wide variety of devices, some of which may not be able to support checking the contents as the editor types. The challenges might be worth addressing, but if the whole idea turns out to be unhelpful, then we could save a lot of time and trouble by finding that out first ("fail quickly", if you like the Silicon Valley jargon). Whatamidoing (WMF) (talk) 17:58, 24 August 2023 (UTC)Reply
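The "right time to interrupt" problem described above can be made concrete with a minimal sketch. The heuristic and the threshold below are invented for illustration – this is not how Edit Check or the visual editor is implemented:

```python
import time

class IdleTrigger:
    """Fire a check only once the editor has stopped typing for a quiet period."""

    def __init__(self, quiet_seconds=5.0):
        self.quiet_seconds = quiet_seconds
        self.last_keystroke = None

    def on_keystroke(self, now=None):
        # Record the most recent typing activity.
        self.last_keystroke = time.monotonic() if now is None else now

    def should_prompt(self, now=None):
        # Prompt only after a full quiet period with no keystrokes.
        if self.last_keystroke is None:
            return False
        now = time.monotonic() if now is None else now
        return now - self.last_keystroke >= self.quiet_seconds
```

The weakness is exactly the one described above: a fixed quiet period cannot tell "I'm stuck and would welcome help" apart from "I only paused to copy the source URL", so any real implementation would need richer signals than elapsed time alone.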
One could argue, but that's not what one thinks of when one hears "while editing". I'm just asking for clarification on what the scope of this project is. I was interested in this project because of what the top of the infobox said it was about. If the project is not what I thought it was, fine, I don't object to the project per se, but I strongly suggest you be clearer in the description.
You seem to be implying that this project will support offering feedback before "Save changes" is clicked, but I haven't seen anyone in the Editing team saying it will. Are you speaking for the Editing team when you imply that?
Also, does the Editing team have any intention to make it work with other editors? If not, while I don't object to the project like I said, I do think the same resources could be better spent making something that works for all (or most) editors. Nardog (talk) 18:37, 24 August 2023 (UTC)Reply
I know that the Editing team has talked about offering feedback while you're still typing, but I don't know – and AFAICT nobody actually knows – whether that will happen in the foreseeable future.
The visual editor is used by more than half of desktop-based new editors for their first edit. The percentages vary significantly by community. For example, at the English Wikipedia, 45% of newcomers' first four edits to articles are in the visual editor. At many large Wikipedias, including German, French, Spanish, Hebrew, and Arabic, it's around 70%. At the Japanese and Korean Wikipedias, it's 55%. Turkish is 60%. Portuguese is 75%. Chinese is 40%. Globally, the overall score is that the first few edits are made 55% visual editor, 37% in the 2010 wikitext editor, 4% switching between visual and wikitext modes, 3% using something else (e.g., the 2003/non-Javascript editor or a script), and a tiny fraction using the 2017WTE.
As a result, even if they only built this for the visual editor, it would still be available to more than half of newcomers.
(I can pull mobile numbers if you want, but overall, I think mobile visual editor is less visible, less used, and less capable [=less space for toolbars/buttons], making it more important to provide additional help there.) Whatamidoing (WMF) (talk) 19:54, 24 August 2023 (UTC)Reply
@PPelberg (WMF): Can you (or anyone who's actually working on this project) respond to my request for making the project description clearer? Nardog (talk) 12:30, 28 September 2023 (UTC)Reply
hi @Nardog – can you please help me understand what remains unclear in your mind? Might this confusion stem from us holding different understandings for what constitutes "while someone is editing"? If so, I wonder whether that difference in perspective is notable considering we seem aligned on the spirit of that language...
Offering people feedback in moments: A) when they are likely to act on said feedback and B) before changes are published that would likely result in experienced volunteers needing to allocate time/effort to improving them.
You raising this again signals to me that this is important to you and I do not mean to doubt that. Tho, at present, I'm needing some help understanding where you're coming from. PPelberg (WMF) (talk) 16:32, 28 September 2023 (UTC)Reply
@PPelberg (WMF): As I said above, I find this important because currently this project appears to be cookie-licking T315072.
we seem aligned on the spirit of that language
I'm at a loss as to how you came to this conclusion. I don't consider "after clicking 'Save changes'" to be what one thinks of upon reading "while someone is editing". Now that you've said it, "before changes are published" seems like a good alternative.
And that wasn't my only concern. If this is just about VisualEditor, then please say so in the description. Nardog (talk) 16:48, 28 September 2023 (UTC)Reply
The initial version will only be in the visual editor (=Extension:VisualEditor's visual mode, not even in its wikitext mode). It is being extended to the mobile visual editor.
The team recently talked about what it would take to make this work in the 2010 wikitext editor (largely a difference between ContentEditable and HTML textarea). I do not believe that they can extend it to the 2003 wikitext editor (=non-Javascript), and I do not believe that they are likely to extend it to wikEd or other editors that are not officially supported. I do not know whether they intend to extend it to the mobile wikitext editor.
You don't use the visual editor, so you might be unaware that, unlike in most of the wikitext editors, clicking the blue 'Publish changes...' button in the visual editor is never the end of the editing process. In the visual editor, Edit check will trigger before people write their edit summaries or check the diff. Whatamidoing (WMF) (talk) 19:43, 28 September 2023 (UTC)Reply

The team recently talked about what it would take to make this work in the 2010 wikitext editor

@PPelberg (WMF): And the conclusion was? And I ask for the nth time: please change "while editing" in the project description to something to the effect of "before the edit is published" as long as you're not building something that offers feedback while the editable content is being edited. Nardog (talk) 01:57, 16 November 2023 (UTC)Reply
@PPelberg (WMF): Asking once again. Nardog (talk) 13:44, 19 May 2024 (UTC)Reply
hi @Nardog thank you for the ping. To start, Edit Check will be offered only within the visual editor.
With regard to updating the language we use to describe the project, we're going to keep it as-is for the reason you described [i]: we plan to work on a check in the next fiscal year that will offer people feedback while they're editing.
I hope this helps and please let me know if anything about the above brings new questions to mind.
Also: thank you for being patient with me. I've been on leave.
---
i. as long as you're not building something that offers feedback while the editable content is being edited. PPelberg (WMF) (talk) 21:51, 20 May 2024 (UTC)Reply
Also! You asking this was the prompt I needed to update the FAQ section of Help:Edit check. PPelberg (WMF) (talk) 21:56, 20 May 2024 (UTC)Reply
@Suffusion of Yellow, hi – thank you for taking a close look at the prototype and allocating the time to share what the experience brought up for you. Some comments in response below...
It only prompts me after I begin the publish process; is that intended? Because I was under the impression that I would get some sort of reaction before that...
Can you please say a bit more about when you had expected the prompt to appear? And if you can recall, what assumptions/past experience(s)/etc. might be informing that expectation?
For context: the prompt appearing after you initiate the save process by clicking/tapping the publish button is intentional.
...hence all my concerns about privacy.
Are you able to share what – if anything – about the current experience causes you to worry about privacy?
Fwiw, the Editing Team does not have any plans to share the contents of an unpublished edit with anyone, as noted above.
So far, you've mostly reinvented Special:AbuseFilter's "warn" option under a new name. I say "mostly" because you've added one nifty feature that edit filters don't have: the ability to highlight the problematic text in the edit window. So why not patch AbuseFilter to allow something like that, instead of building a new extension from scratch?
We agree with you in seeing Edit Check as an evolution of Abuse Filter. In fact, it's one of the features that helped inspire the project and continues to inform how we approach it.
Further, I think the distinction you highlighted above between Abuse Filter and Edit Check [i] represents one of the core assumptions this project rests on: presenting people with feedback and actions about a specific change they're making while they still have the editing interface open will increase the likelihood that they act on the suggestions they're being presented with and, subsequently, make edits they're proud of and projects value. We still have some experimenting to do to learn the extent to which this proves true.
So why not patch AbuseFilter to allow something like that, instead of building a new extension from scratch?
Inside of the question you're posing above I see a concern for efficiency and a value of leveraging the momentum/behavior an existing feature has garnered over years of use. Assuming this doesn't stray too far from what you meant, then I think we are aligned.
It's with the above in mind that we're not developing Edit Check as a new extension. Rather, we are building Edit Check as a set of improvements to the visual editor codebase that will, hopefully, make it easier for other teams and volunteers to plug directly into and customize the editing interface in ways that have historically been difficult to do.
---
i. Edit Check introduces the ability to present people with feedback and actions about a specific change they're attempting to make in moments when they are, ideally, motivated and equipped to act. PPelberg (WMF) (talk) 23:30, 17 July 2023 (UTC)Reply
Can you please say a bit more about when you had expected the prompt to appear?
I had expected it to appear as I was typing.
And if you can recall, what assumptions/past experience(s)/etc. might be informing that expectation?
The infobox at Edit Check says "Offer people actionable feedback about Wikipedia policies while they are editing." (emphasis in original)
Are you able to share what – if anything – about the current experience causes you to worry about privacy
Nothing. You seem to be running the check client-side, in fact.
presenting people with feedback and actions about a specific change they're making while they still have the editing interface open
But that's not what you're doing. You're "presenting people with feedback" after they begin the publish process. That's what, to a very close approximation, AbuseFilter is already doing, and if we wanted to warn people for adding an unreferenced paragraph, we would already be doing so. The fact that we're not should be a very big clue that this, as Nardog says, is a poor choice for a demo.
Rather we are building Edit Check as a set of improvements to the visual editor codebase that will, hopefully, make it easier for other teams and volunteers to plug directly into and customize the editing interface in ways that have historically been difficult to do.
I'm unclear as to what that means. Is this going to be a VisualEditor-only thing? Because a quick glance at en:Special:RecentChanges suggests that on enwiki only about 10% of edits are made through VE. Is the plan to leave out the other 90%? Or to create two parallel systems, carefully kept in sync, one for VE and the other for the classic editor? Suffusion of Yellow (talk) 20:26, 18 July 2023 (UTC)Reply

Feedback: Joe Roe


Thanks for sharing this, it's really cool. Here's my feedback from the point of view of the English Wikipedia:

  • As I think your team already knows, most but not all material on enwiki needs an inline citation. Notably, your video demo appears to show someone adding text to the lead section of an article, which is one of those exceptions. Obviously the tool can't accommodate all these nuances, and I appreciate that users can always click no. What seems to be lacking is guidance (especially for new users) on when citations are and aren't required; when it's okay to click "no" without worrying about your edit being reverted. Ideally the finished tool should make the prompt text locally configurable, so we can add that project-specific guidance.
  • Relatedly, I think it's important that there is a way to turn this off for mainspace pages that don't use citations (on enwiki that includes disambiguation pages and certain types of list). Perhaps with a magic word? Then we could incorporate into templates like en:Template:Disambiguation etc.
  • I don't like the follow-up questions very much. The first two options sound like reasonable reasons not to add a reference, but actually aren't consistent with enwiki policy, so it feels like you're teaching people the wrong lessons. The second option would make more sense if it said "my changes are already cited later" (per LEADCITE above). But more to the point, I don't think we should be asking editors to explain themselves to interfaces. It's annoying and infantilising. The text says "other editors would value learning..." but a) it's not clear how this information is going to be conveyed to other editors and b) knowing other editors, they probably don't value it at all.
  • When you go through all the prompts, but then go back to editing the page before publishing, you're re-prompted about the same text. I can also see that becoming quite frustrating.

Joe Roe (talk) 14:47, 30 June 2023 (UTC)Reply

@Joe Roe, hi! Thank you for making the time to try the prototype and thinking critically about it. I'm going to post responses to what you shared below as individual comments so they're [hopefully] easier to discuss one-by-one...
Before that: thank you for being patient with me 🙏🏼 PPelberg (WMF) (talk) 01:01, 25 July 2023 (UTC)Reply
...most but not all material on enwiki needs an inline citation. Notably, your video demo appears to show someone adding text to the lead section of an article, which is one of those exceptions.
Great spot. To equip projects to configure Edit Check to accommodate the LEADCITE exception you're naming here, we're planning to make it possible for volunteers, on a per project basis, to:
  1. Decide what – if any – sections Edit Check ought to ignore edits to and
  2. Implement said preferences on-wiki independent of the Editing Team.
...what do you think about the plan for on-wiki configuration that I described above? What – if any – aspects of it lead you to question the extent to which it could be effective? PPelberg (WMF) (talk) 01:02, 25 July 2023 (UTC)Reply
That sounds useful in the specific case of LEADCITE. Joe Roe (mobile) (talk) 10:29, 22 August 2023 (UTC)Reply
What seems to be lacking is guidance (especially for new users) on when citations are and aren't required; when it's okay to click "no" without worrying about your edit being reverted.
Mmm, understood and I'm glad you named this. Two resulting questions...
1) Are you aware of a page (or set of pages) that you think are effective at equipping people with the information they need to decide whether proceeding without accompanying the new content with a reference is likely to result in that edit being reverted?
2) More broadly, what prompted this thought? Asked in another way: what do you worry could happen were Edit Check not to offer guidance of the sort you're describing above?
For context: we've been thinking that in a future release, we'd offer people who decline to add a reference when prompted follow-up actions and learning opportunities of the sort [I think] you're describing here. See: T341535. PPelberg (WMF) (talk) 01:03, 25 July 2023 (UTC)Reply
1: There's en:WP:NOTCITE, but it's a bit neglected. Specific examples of things that don't need a citation are scattered across various project pages (e.g., off the top of my head, en:WP:LEADCITE, en:WP:BLUESKY, en:WP:CALC, en:WP:LISTVERIFY). There are also specialised types of article that aren't expected to have citations, like disambiguation pages or certain types of set indexes. And there are lots of unwritten conventions in specific topic areas. I imagine we could come up with an information page that summarises the major exceptions, though at the end of the day it's possible to argue that anything that doesn't fit the en:WP:BURDEN criteria doesn't need a reference.
2: I'm worried that Edit Check could reinforce a simplistic and rules-oriented reading of Wikipedia's verifiability policy, which is that everything has to have an inline citation, even when that citation brings no value to readers or editors. The actual policy is considerably more nuanced than that and like all Wikipedia policies is intended to guide and develop editorial judgement, not replace it. The kind of situation I'm thinking of is when an editor (new or not) correctly declines to provide a reference. In that situation the software shouldn't hassle them further, flag their edit for attention, or otherwise condition them into thinking they've done something bad. The prompt system described in that ticket seems to assume that if an editor declines to add a reference, they are wrong and need to be corrected. Joe Roe (mobile) (talk) 11:06, 22 August 2023 (UTC)Reply
Ideally the finished tool should make the prompt text locally configurable, so we can add that project-specific guidance.
Great call and agreed. I've updated phab:T327959 and phab:T341535 to include this.[1] [2]
Relatedly, I think it's important that there is a way to turn this off for mainspace pages that don't use citations (on enwiki that includes disambiguation pages and certain types of list). Perhaps with a magic word? Then we could incorporate into templates like en:Template:Disambiguation etc.
Great call and agreed. We've been thinking this configuration, like the other Edit Check configurations we've been talking about in this discussion, would happen in a centralized place on-wiki. Where exactly, we're not yet sure. So, if you have ideas about places where these sorts of settings would make sense, please do share! PPelberg (WMF) (talk) 01:04, 25 July 2023 (UTC)Reply
The first two options sound like reasonable reasons not to add a reference, but actually aren't consistent with enwiki policy, so it feels like you're teaching people the wrong lessons. The second option would make more sense if it said "my changes are already cited later" (per LEADCITE above).
Understood. The edits you're suggesting above read, to me, as added reason for the suggestion you made above about the tool making the prompt text locally configurable. Please let me know if you're seeing this differently.
But more to the point, I don't think we should be asking editors to explain themselves to interfaces. It's annoying and infantilising. The text says "other editors would value learning..." but a) it's not clear how this information is going to be conveyed to other editors and b) knowing other editors, they probably don't value it at all.
I appreciate you being clear and direct about this...
I think we're aligned in wanting to avoid an experience that causes people to feel frustrated or infantilized. While we're not yet aware of people experiencing the feature, as it's currently implemented, in this way, I think it's important that we do our best to verify this remains true post-deployment.
To this end, I've filed phab:T342589 to hold us accountable to reporting on the above.
As to making it clear how the response people provide will be used by other editors, we agree with you in thinking this needs to be made clear in the design. I've updated the requirements to make this doubly clear.
Now, about whether other volunteers are likely to value the context that people who decline to add a reference when prompted provide: what you're suggesting might be accurate. Though, we've also heard people sharing how they think this information would be useful. Ultimately, my gut is telling me this is the kind of thing we'll need to try out and see... PPelberg (WMF) (talk) 01:04, 25 July 2023 (UTC)Reply
To expand on why I find it annoying and infantilising, there are two questions that cross my mind when software initiates a dialogue like this:
  1. Who is asking? Is my response going to be used to calculate aggregate statistics? Recorded in some sort of log? Transmitted to a human? Personally, I'd say the first two purposes are a disrespectful use of volunteer time, while the third can be defensible, if there's a genuine benefit to both parties. So yes, it absolutely needs to be made clear what is going to happen to these responses. But again speaking from 12+ years of experience of how Wikipedians engage with newcomers, if other editors use this information, it'll be to more efficiently patrol and revert edits, not engage in any positive way. It doesn't feel honest to suggest that others are going to "value" it.
  2. Why am I being asked to provide this information? If it's to decide which help page or FAQ answer to direct me to, then I again have to question the assumption that answering "no" to the first prompt is something that needs to be corrected. If it's to provide explanation, then we already have edit summaries: why not simply prompt the user, non-interactively, to explain why they're not including a reference there?
Of course I can't speak for others, but this is the kind of thing that would make me disable the tool and never try it again. I really urge you to reconsider whether asking someone to justify a decision that they've already made and already told you they're sure about meets your goal of "empowering" users. Joe Roe (mobile) (talk) 11:52, 22 August 2023 (UTC)Reply
> speaking from 12+ years of experience of how Wikipedians engage with newcomers
@Joe Roe, do you think this should only be visible to newcomers? I've been thinking that it should treat everyone equally. Editors like me probably shouldn't be dropping in whole unsourced paragraphs often enough to be irritated by a reminder about adding the ref before saving. Whatamidoing (WMF) (talk) 19:57, 22 August 2023 (UTC)Reply
I agree, it should be the same for everyone, whether that's opt-in or opt-out. But Edit check#Primary audience says it's aimed at people learning the basics of contributing to Wikipedia and I was asked to comment here via enwiki new page patrol, so I've responded with that in mind. Joe Roe (mobile) (talk) 07:20, 23 August 2023 (UTC)Reply
When you go through all the prompts, but then go back to editing the page before publishing, you're re-prompted about the same text. I can also see that becoming quite frustrating.
Oh, this sounds like a bug. We're going to file a ticket so that we can address it.
...thank you for saying something PPelberg (WMF) (talk) 01:07, 25 July 2023 (UTC)Reply
EDIT: it looks like @RYasmeen (WMF) already filed a ticket for – what looks like – this same issue: phab:T342311! PPelberg (WMF) (talk) 01:10, 25 July 2023 (UTC)Reply

14 July Community Call


hi y'all – this Friday, 14 July (15:30 to 19:00 UTC) we would like to invite you to a virtual meeting about Edit Check.

We're planning to use this time to:

  1. Talk about the Edit Check prototype @Xaosflux, @Kanzat, @Andrew Davidson [i], @Clovermoss, @Sdkb, @Suffusion of Yellow [i], and @Joe Roe [i] have shared comments about above. Specifically, what you all think might need to be added, removed, and/or changed before the feature is offered to people in production.
  2. Brainstorm ideas for additional "edit checks" we ought to consider building in the future.

We'll also hold space for general Q&A about anything related to the visual editor and/or DiscussionTools.

If the above brings any questions to your mind, please ping me so that I can try to answer. In the meantime, this MediaWiki page should contain all the information you need to join Friday's conversation.

---

i. @Andrew Davidson, @Suffusion of Yellow, and @Joe Roe: I still need to process and respond to the feedback that you shared (thank you for being patient with me). PPelberg (WMF) (talk) 22:22, 11 July 2023 (UTC)Reply

what you all think might need to be added, removed, and/or changed before the feature is offered to people in production Um, sorry, but ... everything? That's a nice mockup, but I gather it's only a mockup, right? Let me repeat what I said above: you have reinvented AbuseFilter under a new name. Only without the ability to fix FPs (see below).
Brainstorm ideas for additional "edit checks" we ought to consider building in the future ... who is "we"? I thought communities were going to be able to create their own checks, as with AbuseFilter? Have y'all started building a mechanism for us to do this, or is the current plan just for the WMF to maintain each check? Keep in mind that the AbuseFilter source is over 45,000 lines of PHP (though parts of it could be re-used, I suppose), and this is arguably a more complex task. This seems like putting the cart before the horse. Suffusion of Yellow (talk) 00:30, 12 July 2023 (UTC)Reply
@PPelberg (WMF): Oh wait. Misread that. You're not asking for feedback here, you want us join a video chat on Google. What's wrong with discussing these issues right here, on this page? Suffusion of Yellow (talk) 00:46, 12 July 2023 (UTC)Reply
Different people have different preferences. Also, in video chats, you can say things like "Hey, can you screenshare the most recent version and show me what happens if...", which is a lot harder to do in text. Written feedback is extremely valuable to the team, but it's not always the best for people who have complicated questions for the team. Whatamidoing (WMF) (talk) 19:59, 22 August 2023 (UTC)Reply

Features that are likely to increase interaction and understanding?


Hi all,

What are the "features that are likely to increase interaction and understanding" in the encourage interaction section that I see on the community conversations page? (That is, the phrasing suggests they're targeted at the reviewer portion, not the inexperienced creator portion of users?) Nosebagbear (talk) 18:36, 13 July 2023 (UTC) [With a request to ping on reply, as I never think to check mediawiki watchlist]Reply

I was thinking the same thing. Maybe some way to remind reviewers to attempt to find a source before reverting an edit? (per en:WP:ROWN). Bart Terpstra (talk) 18:40, 13 July 2023 (UTC)Reply
If it's something like that, it's going to need some fancy footwork to get the "false positive" rate of reminders down a long way, so as not to aggravate users who are removing it for any other reason (or have indeed already tried to find a source) Nosebagbear (talk) 18:42, 13 July 2023 (UTC)Reply
Could that be addressed in the same way as en:template:failed verification? failed to find source. Bart Terpstra (talk) 18:56, 13 July 2023 (UTC)Reply
Great question, @Nosebagbear...
The first feature we're introducing that's intended to encourage interaction and understanding is:
  1. Presenting people who decline to add a source when Edit Check prompts them to do so with a way to share why they made this decision and
  2. Showing/sharing the rationale people provide in a place where people who are reviewing edits are likely to see it. This "place" (to start) will be diffs and history pages, by way of new edit tags that correspond to the decline response people share.
Does the above bring up any questions/ideas/concerns in your mind? Also: can you think of revisions that you think would be worthwhile to consider making to the decline responses people are presented with (pictured below)?
Mockup of potential decline responses
PPelberg (WMF) (talk) 23:43, 27 July 2023 (UTC)Reply

Opting out


Will it be possible for any wiki to opt out of this whole system? Suffusion of Yellow (talk) 20:41, 18 July 2023 (UTC)Reply

@PPelberg (WMF): Not sure if you missed this. Suffusion of Yellow (talk) 23:51, 1 August 2023 (UTC)Reply
hi @Suffusion of Yellow: by "opt-out of this whole system" are you referring to opting out of any and all checks/features that are introduced as part of the Edit Check project? Something else?
...having a better sense of what you're referring to will help me address the question you're posing. PPelberg (WMF) (talk) 22:56, 8 August 2023 (UTC)Reply
@PPelberg (WMF): Yes, this is what I mean. Suffusion of Yellow (talk) 21:04, 9 August 2023 (UTC)Reply
Generally, features like this are configurable per wiki, and the Editing team generally does not deploy features that the local community objects to.
I expect the English Wikipedia to be one of the most enthusiastic adopters (of this or anything else that encourages editors to add inline citations), but the German Wikipedia has a different approach to sourcing, and they might not want it at all, and a Wikipedia in a very small language might not want to put up any perceived barriers to contributions. Whatamidoing (WMF) (talk) 20:06, 22 August 2023 (UTC)Reply
@Whatamidoing (WMF): Would you (or anyone else with a (WMF) account) be willing to state that again, but replace the "generally" with an "absolutely"?
I am frankly getting more and more disturbed by this project every day. So far, this is nothing other than a WMF-controlled "warn" edit filter. Yes, citations are good, m'kay, but it's not up to the WMF to decide what should and shouldn't be sourced, how those sources are to be formatted, etc. etc. Most importantly, this is the WMF "getting its foot in the door" with respect to micro-managing content; even if you believe this check is innocuous, what about the next one?
Of course, if the WMF is willing to say "no, of course, this is just a feature we're offering y'all, and if you don't want it, we absolutely won't try to force it on you", that's different. But I will probably still be objecting to its deployment on enwiki; that depends on what, precisely, is being deployed. I won't be starting any RFC yet because it's not finished; who knows, maybe it will be much more configurable than I imagine. But if there's no way for us to deal with the "unknown unknowns" except file a phab ticket and wait, consider me opposed.
But if this is going to be forced on us, MediaViewer-style, I do not plan to wait. I will be drawing this to a wider audience shortly. Suffusion of Yellow (talk) 21:12, 23 August 2023 (UTC)Reply
I suggest instead that if anyone associated with the WMF tells you that "absolutely, all features are always configurable per wiki", then you should not believe them. Some features are deployed everywhere for (e.g.) legal or performance reasons. However, AFAIK this feature isn't one of those.
I do not think that this is something you need to worry about for this feature. I say this because I've worked with the Editing team for more than 10 years and with its current product manager since he was hired ~4.5 years ago. He has never forced a feature on any community that formally objected to it, and he has delayed deployments over a single editor's objection more than once. I can also tell you that not only has the team never even hinted at requiring this particular feature, but they have also already discussed not deploying it to some Wikipedias.
That said, I want to point out that volunteer-me has been a registered editor at the English Wikipedia for 17 years, and I've made more than 100,000 edits there. Few editors can claim to have spent more time with the w:en:WP:RFC process than me, and you will find my name all over the archives of all of the English Wikipedia's policies and guidelines about sourcing. And I suggest to you, based on my not-inconsiderable experience of that particular community (not to mention the comments in and linked to Editing team/Community Conversations) that if you go to that community with a question about whether they want software to prompt editors to add citations, the answer will be "yes, absolutely, right now".
Consequently, there is no real chance of it being forced on the English Wikipedia (even if the team were inclined to do such a thing, which they aren't) because that particular community is extremely likely to want it.
The scenarios that worry me are much closer to the team concluding that it is net harmful (e.g., because the extra step results in good edits being abandoned) and enwiki still demanding that it be deployed anyway. Whatamidoing (WMF) (talk) 02:58, 24 August 2023 (UTC)Reply
I do not doubt your experience; you grok the true meaning of "rv unsourced" better than anyone. But again, I'd feel more comfortable if words like "AFAIK" weren't in there; I would like to hear someone say straight out "No, this feature will not be deployed against consensus".
As to the outcome of a RFC, the devil is in the details. The question will not be "Should we sprinkle some magic pixie dust to make everyone add references". It will be "should we deploy this specific feature, to do this specific thing, that might encourage people to add references". If said feature starts yelling at people for using Harvard references, or adding unsourced plot summaries, or reordering text, or building disambig pages, or doing any of a million other things that I haven't thought of yet, and if the solution to those problems is "file a bug report on phab" as opposed to "fix it ourselves", I expect the answer just might be "no". Suffusion of Yellow (talk) 21:12, 24 August 2023 (UTC)Reply

Risks of this project and approach to validating it


I noticed here that the Risks section of the product documentation is empty. It seems obvious to me that the primary risk here is that these edit check notices might either confuse or discourage new editors.

The strategy notes that "This project is built on the belief that by surfacing relevant guidance in the precious moments when people are in the midst of making a change to Wikipedia and equipping them with the know-how and tools necessary to apply this guidance they will make changes they are proud of and that experienced volunteers value". This belief can be tested scientifically by administering a randomized test (i.e. an A/B test).

@PPelberg (WMF) are there plans to test new edit check notices and prove whether or not they A) discourage new editors from successfully completing edits they started B) decrease the revert rate of edits completed successfully? Steven Walling (talk) 01:47, 24 July 2023 (UTC)Reply

hi @Steven Walling – I'm glad you acted on the instinct to ask these questions. You doing so was the reminder I needed to update the project page with the outcomes of T325838 and T325851. I expect to have this done before next week is over and I'll ping you in this thread once this thinking is in a place for you to review.
From there, I think we ought to start a wider conversation about what – if any – changes/additions we ought to consider making to:
  1. How we evaluate the impact of this project
  2. The risks we ought to hold ourselves accountable to monitoring throughout this project.
In the meantime, responses to the specific questions you raised...
It seems obvious to me that the primary risk here is that these edit check notices might either confuse or discourage new editors.
We agree with you in thinking this is a key risk we need to hold ourselves to managing.
This belief can be tested scientifically by administering a randomized test (i.e. an A/B test).
Great call and we plan to do exactly as you described: use A/B tests to help evaluate the extent to which the checks we're introducing are causing the impact they've been designed to cause.
are there plans to test new edit check notices and prove whether or not they A) discourage new editors from successfully completing edits they started B) decrease the revert rate of edits completed successfully?
Per the above, definitely; I'm eager to talk through the specifics of the experiments we're planning and the metrics we'll use to help us evaluate them. PPelberg (WMF) (talk) 00:29, 1 August 2023 (UTC)Reply
You doing so was the reminder I needed to update the project page with the outcomes of T325838 and T325851. I expect to have this done before next week is over and I'll ping you in this thread once this thinking is in a place for you to review.
@Steven Walling: I'll need to do the above next week...this week got away from me :o PPelberg (WMF) (talk) 00:18, 12 August 2023 (UTC)Reply
@Steven Walling ok! The project page should now be updated with the risks and desirable outcomes I was alluding to above as well as the methods we'll use to gather the information needed to determine the impacts Edit Check is causing.
What – if any – questions/concerns/uncertainties/ideas/etc. does what the Editing team is planning bring up for you? PPelberg (WMF) (talk) 23:22, 17 August 2023 (UTC)Reply
This is perfect, thank you for all the detail and thinking this plan through. Steven Walling (talk) 00:36, 25 August 2023 (UTC)Reply
Wonderful and you bet. PPelberg (WMF) (talk) 16:20, 30 August 2023 (UTC)Reply

Feedback: Sir Amugi


it's super cool working with it. 154.160.24.204 18:26, 7 August 2023 (UTC)Reply

We're glad to hear this! If there are particular aspects that you appreciated and/or were uncertain about, we'd value hearing ^ _ ^ PPelberg (WMF) (talk) 23:09, 17 August 2023 (UTC)Reply

Feedback: SIR SUCCESS (NAA JAHINFO)


1. What I found unexpected is it asking me to add a citation.

2. What I like about the prototype is that it draws attention to how important it is to add a source of information, to help readers know where the information is found


3. I wish it could also prompt me to add inter links when possible

4. I tried adding a section/heading but couldn't find it. Quite apart from that, I think the prototype will help to improve the quality of the Wikimedia content and make it more reliable

5. What if we actually had this tool on the Wikipedia platforms to help new editors know the importance of adding citations?

Thank you. 154.160.26.192 12:32, 10 August 2023 (UTC)Reply

Thank you for taking the time to try out the prototype and come here to share what your experience with it was like!
A couple of follow-up questions in response to what you shared...
1. What I found unexpected is asking me to add a citation. 2. What I like about the prototype is that it draws attention to how important it is to add a source of information to help readers know where the information is found.
Would it be accurate for me to understand the combination of the two points you named above as something like the following?
"Despite it being unexpected to you that the interface asked you to add a citation, you ended up finding it useful seeing as how important sources are to Wikipedia."
3. I wish it could also prompt me to add inter links when possible
Oh, interesting. Can you please say more about this? What do you find attractive about the prototype that led you to imagine it being useful for adding wiki links?
4. I tried adding a section/heading but couldn't find it.
Do you recall whether you were using a desktop or mobile device to try the prototype?
I think the prototype will help to improve the quality of the Wikimedia content and make it more reliable
This is encouraging to hear ^ _ ^
What if we actually have this tool on the Wikipedia platforms to help new editors know the importance of adding citations?
Assuming it's accurate for me to understand the above as you asking if/when the functionality the prototype demonstrates will be made available on Wikipedias, then the answer is: we are planning for Edit Check to be available at an initial set of partner wikis within the next month or so. PPelberg (WMF) (talk) 23:08, 17 August 2023 (UTC)Reply

This is wrongheaded

[edit]

Across all Wikipedias, new content edits that include a reference are ~2x less likely to be reverted (6.2%) than edits that do not include a reference (11.5%)

Duh. That doesn't automatically mean prompting newcomers to add citations is going to improve the quality of articles. What it means is that it's harder for patrollers to sort out welcome and unwanted contributions because contributions with bogus citations are much, much harder to spot than contributions with no citations. You cannot just look at the number of reverted edits and measure success because that doesn't account for what should be reverted but isn't. Nardog (talk) 03:00, 12 August 2023 (UTC)Reply

That doesn't automatically mean prompting newcomers to add citations is going to improve the quality of articles. What it means is that it's harder for patrollers to sort out welcome and unwanted contributions because contributions with bogus citations are much, much harder to spot than contributions with no citations.
@Nardog: great observations! Responses to the points you raised below...
As you're naming here, the reference Edit Check, as currently implemented, does not offer newcomers and Junior Contributors guidance about how likely people who are reviewing edits are/are not to consider a given source reliable.
This could, as you adeptly noted, lead to a scenario wherein Edit Check is causing lower quality edits that are more difficult for people reviewing edits to detect and moderate.
We also agree with you in thinking that looking solely at the number of reverted edits would be an incomplete and unreliable way of evaluating the impact of this project.
Two questions for you with the above in mind:
  1. What information do you think would be helpful in measuring the success of the reference check feature?
  2. Can you think of ways we might detect whether Edit Check is causing, as you put it, more "bogus citations" to make their way onto the wikis?
In the meantime, here are three things the Editing Team is doing/thinking about in response to the risk you named:
  1. Exploring how we might iterate upon the initial reference check design so that it provides people who are adding citations feedback about how likely people reviewing said citations are to perceive a source as reliable. This work is happening in T325414.
  2. Researching what would need to be in place in order to deliver the functionality T325414 is describing. This work is happening in T276857.
  3. Cataloguing the risks/undesirable outcomes the Edit Check project could cause and identifying what information we'll use to detect if/when we enter into such a scenario. This work relates to the question @Steven Walling posed above; I'll be sharing a proposal for it before this week is over.
PPelberg (WMF) (talk) 22:43, 15 August 2023 (UTC)Reply
The only outcome of the reference check I would see as a success is its not happening. You have yet to address the points made by Suffusion of Yellow above and until then I believe this project should not proceed as is. Nardog (talk) 23:48, 16 August 2023 (UTC)Reply
To be clear, while I believe the reference check should not happen at all (and would support turning it off on projects I'm active on should it be imposed on them), I don't care as strongly about Edit Check as a whole (unless it provides a solution to T315072—which this project page claims it does, but what's being built does not). Nardog (talk) 15:27, 24 August 2023 (UTC)Reply
> You cannot just look at the number of reverted edits and measure success because that doesn't account for what should be reverted but isn't
Good point. Also, it doesn't account for what shouldn't be reverted, but is anyway (e.g., the reviewer who didn't notice that it's a simple summary of content already cited elsewhere in the article, or whose personal standards are out of step with the rest of the community). Whatamidoing (WMF) (talk) 20:11, 22 August 2023 (UTC)Reply

Feedback: Dnshitobu

[edit]

@PPelberg (WMF) I have tried the demo several times and I have the following feedback based on the check up questions: What did you find unexpected about the prototype?

I realized that the prompt came when I tried to publish my edits. However, even with minor edits, I still got prompts.

What do you like about the prototype?

The prompts are good

What do you wish was different about the prototype?

I really wish that there was a video link to guide people on how to go about adding citations, or some textual documentation.

What questions does this prototype bring to mind?

Wikipedia is an aggregator of sources and this will help make the platform more credible and help new editors learn about how to add citations and the very reasons for the citation

Dnshitobu (talk) 22:34, 19 August 2023 (UTC)Reply

@Dnshitobu, would something like this be helpful? Whatamidoing (WMF) (talk) 20:14, 22 August 2023 (UTC)Reply

Suffusion of Yellow's demo

[edit]

I've thrown together a quick demo of what might be a better approach. Steps to try it out:

  1. On enwiki, enable the CodeMirror syntax highlighter (the highlighter pen in the editing toolbar, not any gadget)
  2. Install en:User:Suffusion of Yellow/wikilint.js in your common.js
  3. Open your sandbox (or any page), using the 2010 text editor.
  4. Type Lorem ipsum.<ref>https://foxnews.com</ref>

Within a second or so, "foxnews.com" should have a squiggly underline, and a little triangle should appear on the right. Hover over either, and you should see the text "Reliability depends on contributor or topic".

This should work with any source from Headbomb's unreliable sources list.

Also note that at the bottom right, there's an option to show problems already on the page (like a conventional linter). This is off by default, but can be turned on if you are bored and looking for something to fix. You can also turn the whole thing off, if desired.

As I said, this is just a demo. The style is the CodeMirror default. And the CodeMirror 5 linter is really simplistic; I understand y'all are going to switch to CodeMirror 6 soon, and I don't want to dig too much through the version 5 manual, just to rewrite this later.

The main point is that we aren't interrupting the user. There's a gutter marker which they can investigate as they choose. Or they can continue typing.

Also, elsewhere there came up the question of which user groups these checks should apply to. That's AbuseFilter thinking. The checks should apply to all users who haven't turned them off. When it comes to actual abuse, of course, you can't make the filter voluntary, so you have to guess based on edit count, etc. But there's no need to guess here; just let the user choose. Suffusion of Yellow (talk) 01:01, 24 August 2023 (UTC)Reply

How would you use this approach to identify "negative" problems? That is, how do you identify that the problem is the non-existence of the ref? Whatamidoing (WMF) (talk) 18:05, 24 August 2023 (UTC)Reply
Not yet implemented; see your skateboard analogy above. :-) Just highlighting unsourced paragraphs in the text would be trivial; more challenging is figuring out whodunnit: were they adding an unsourced paragraph or modifying one. (something something edit distance) I went with "does the new text match any of these regexes" because it is a trivial check; the real purpose of this demo is to show how the message might be presented: the same way we already show problems in JavaScript, CSS, JSON, and even edit filters. Suffusion of Yellow (talk) 20:52, 24 August 2023 (UTC)Reply
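The "does the new text match any of these regexes" approach the demo describes can be sketched in a few lines. The rule list below is a hypothetical, trimmed-down stand-in for the much larger unreliable-sources list the gadget actually uses:

```python
import re

# Hypothetical rules in the spirit of the unreliable-sources list;
# the real script ships far more patterns with finer categories.
RULES = [
    (re.compile(r"\bfoxnews\.com\b"),
     "Reliability depends on contributor or topic"),
    (re.compile(r"\bexample-tabloid\.test\b"),
     "Generally unreliable source"),
]

def lint_added_text(added_text):
    """Return (matched_text, message) pairs for every rule that the
    newly typed or pasted text trips."""
    hits = []
    for pattern, message in RULES:
        for match in pattern.finditer(added_text):
            hits.append((match.group(0), message))
    return hits

lint_added_text("Lorem ipsum.<ref>https://foxnews.com</ref>")
# → [("foxnews.com", "Reliability depends on contributor or topic")]
```

The editor's linter would then translate each hit into an underline and gutter marker; the point is only that matching of this kind is cheap enough to run as the user types.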
@Suffusion of Yellow: what a neat demo!
Wikilint Gadget drawing attention to the reliability of a source
I appreciate how Wikilint.js is presenting feedback: A) immediately after you make a change and B) in immediate spatial proximity to the content it is relevant to.
there's an options to show problems already on the page (like a conventional linter). This is off by default, but can turned on if you are bored and looking for something to fix. You can also turn the whole thing off, if desired.
I also find this idea to be quite powerful and it's leading me to think about a future wherein suggested edits appear within/alongside the editor you have open.
The reflections above aside, I wonder: what inspired you to create this gadget? What energizes you about the approach you're demonstrating here? PPelberg (WMF) (talk) 00:13, 18 October 2023 (UTC)Reply
Well, we already have something similar for nearly every other content model: Javascript, CSS, Lua, JSON, etc. It seems odd that on a wiki we aren't doing the same for wikitext. If nothing else, a few built in warnings for unclosed tags and comments, malformed links, and maybe even nonexistent templates would be universally applicable. Warnings about content issues would of course have to be fully customizable. Suffusion of Yellow (talk) 22:16, 26 October 2023 (UTC)Reply
Well, we already have something similar for nearly every other content model: Javascript, CSS, Lua, JSON, etc.
To be doubly sure I'm accurately understanding you: by "something similar" are you referring to data validation?
If so, that framing resonates with me! I find it quite clarifying.
A resulting question: are there ways you've seen community-driven validation systems implemented that you find to be particularly useful or, by contrast, unhelpful?
...I ask the above thinking there are patterns from these systems that we can learn from/avoid. PPelberg (WMF) (talk) 21:59, 30 October 2023 (UTC)Reply
Also, elsewhere there came up the question of which user groups these checks should apply to. That's AbuseFilter thinking. The checks should apply to all users who haven't turned them off. When it comes to actual abuse, of course, you can't make the filter voluntary, so you have to guess based on edit count, etc. But there's no need to guess here; just let the user choose.
Oh, and regarding the above, we've designed Edit Check in a way that empowers volunteers, on a per-project basis, to decide which people a given edit check applies to. See more in Edit_check/Configuration.
Are there ways you can see this approach going wrong? If so, I'd be eager to hear... PPelberg (WMF) (talk) 00:15, 18 October 2023 (UTC)Reply
Why not both? I mean, yes, allow wikis to decide which groups to apply the checks to, but still allow users in those groups to opt out. Because not everyone's brain works the same way, and no matter what we do, some people will find the feedback so irritating as to discourage them from contributing at all. A good interface design will reduce that number, but not to zero. Suffusion of Yellow (talk) 00:07, 27 October 2023 (UTC)Reply
Because not everyone's brain works the same way, and no matter what we do, some people will find the feedback so irritating as to discourage them from contributing at all.
@Suffusion of Yellow: I agree with you in thinking there is a possibility some people will find the feedback disruptive to the point that they'd rather not edit at all.
Having named that, I'm thinking we'd benefit from way(s) of detecting whether the scenario you described is occurring and the extent to which it is occurring.[i]
Question: can you think of a way we might go about detecting this? One thought: we could look to see if there is a significant increase in the percentage of people who abandon the edit they were attempting to make after being presented with feedback from Edit Check.
Knowing the above, I think, will help us decide what interventions we want to experiment with to relieve people of this frustration.[ii]
---
i. E.g. are there any patterns in who is reacting to the feedback Edit Check is presenting in this way? Are there particular facets of the experience that they find to be especially frustrating/demotivating/etc.?
ii. E.g. Do we revise the user experience such that people can bypass the feedback prompts? Do we, as you suggested, provide a way to dismiss feedback forever? etc. PPelberg (WMF) (talk) 21:50, 30 October 2023 (UTC)Reply

How should we evaluate this project?

[edit]

Hi y'all – on 11 October, Edit Check became available to newcomers for the first time at an initial set of wikis.

This means we're a step closer to running an A/B test to evaluate the impact the first Edit Check is causing.

Before this A/B test can start, we – staff and volunteers – need to align on how we'll evaluate the impact of this project.

It's with the above in mind that we would value you all reviewing the measurement proposal published on the project page and sharing what you think about it.

While we are interested in any and all feedback, here are some specific questions that we need your help answering:

  1. If this project is successful, and Edit Check becomes a valuable tool for your community, what measurable impact might you hope it will have? What metrics/numbers could we monitor to evaluate that impact?
  2. If this project fails, and ends up disrupting your community, how might you know this is happening? What could we monitor to detect whether Edit Check is causing harm?
  3. How might we evaluate the extent to which Edit Check is a tool experienced volunteers use to help moderate/improve the quality of edits?

Of course, if there is any additional information you think would help you provide the kind of feedback we're seeking, please let me know; we're eager to share.

CC @Sdkb, @Andrew Davidson, @Xaosflux, @Suffusion of Yellow, @Omotecho, @Novem Linguae, @Joe Roe, and @Steven Walling. Y'all have commented/raised questions about the broader implications of this project leading me to think you may be well-positioned to respond to the questions above. PPelberg (WMF) (talk) 23:09, 19 October 2023 (UTC)Reply

If I viewed the mockups correctly, EC processes prior to AF; so for #3 a reduction in AF hits could be a measure. Xaosflux (talk) 23:15, 19 October 2023 (UTC)Reply
...a reduction in AF hits could be a measure.
Oh, yes. Great spot, @Xaosflux! Are there specific AbuseFilters that you think we should monitor at en.wiki?
A couple that immediately looked relevant to me:
PPelberg (WMF) (talk) 00:11, 20 October 2023 (UTC)Reply
For #1, since Edit check is designed to reduce the number of bad edits, a lower revert rate would be a key indicator. For the narrower implementation of Edit check that is just about references, I think the number of times filters 833 and 869 are tripped looks good.
For #2, the main way I could see the project causing harm is if it introduces too much friction to the editing process, causing people not to make edits they otherwise would have. This is tricky to observe as volunteers, since we can't see edits not happening. But if you're able to measure it, the percentage of people who open the edit window who ultimately publish an edit would be a key metric to track. If Edit check makes it substantially lower, that'd be a red flag. However, if combined with positive indicators from #1, it might mean the feature is working well — people who would make a bad edit are realizing the edit would be bad and then choosing not to make it. If that data is unavailable, then looking at whether the number of edits overall decreases in the sample group would be a proxy.
For #3, I would just check in on relevant talk pages once the test has commenced, and ask editors how their experience is coming across edits tagged with the Edit check tag. {{u|Sdkb}}talk 04:45, 20 October 2023 (UTC)Reply
For #1, since Edit check is designed to reduce the number of bad edits, a lower revert rate would be a key indicator.
Understood and agreed. Assuming what's currently written in row "1." of the "Desirable Outcomes" table looks good to you, I think we're aligned on this.
For the narrower implementation of Edit check that is just about references, I think the number of times filters 833 and 869 are tripped looks good.
Wonderful. Have you seen existing tools/methods that visualize (read: graph) abuse filter hits over time? I'm thinking something of this sort would be helpful for enhancing our collective awareness of a particular Edit Check's effectiveness. See T343166.
For #2, the main way I could see the project causing harm is if it introduces too much friction to the editing process, causing people not to make edits they otherwise would have. This is tricky to observe as volunteers, since we can't see edits not happening. But if you're able to measure it, the percentage of people who open the edit window who ultimately publish an edit would be a key metric to track. If Edit check makes it substantially lower, that'd be a red flag. However, if combined with positive indicators from #1, it might mean the feature is working well — people who would make a bad edit are realizing the edit would be bad and then choosing not to make it. If that data is unavailable, then looking at whether the number of edits overall decreases in the sample group would be a proxy.
Great spot and I agree with all that you described. It seems like rows "3." and "4." of the "Risks (Undesirable Outcomes)" table could benefit from adding the following to each: "...without a corresponding increase in edit quality."
The revision I'm proposing based on what you shared would cause them to read:
  • "3. Edit completion rate drastically decreases without a corresponding increase in edit quality."
  • "4. Edit abandonment rate drastically increases without a corresponding increase in edit quality."
…how does the above sound to you?
For #3, I would just check in on relevant talk pages once the test has commenced, and ask editors how their experience is coming across edits tagged with the Edit check tag.
Simple enough. Makes sense. We'll prioritize doing the above in the newly-created T349878.
And hey, thank you for engaging with these questions so thoroughly, @Sdkb. PPelberg (WMF) (talk) 22:46, 26 October 2023 (UTC)Reply
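For what it's worth, the funnel metrics discussed in this thread (completion, abandonment, and a revert-adjusted quality signal) amount to a tiny calculation. The numbers below are purely hypothetical, for illustration only:

```python
def funnel_metrics(opened, published, reverted):
    """Summarize an edit-session funnel: of the sessions that opened
    the editor, how many ended in a published edit, and how many of
    those edits survived review."""
    completion = published / opened
    return {
        "completion": completion,
        "abandonment": 1 - completion,
        "good_edit_rate": (published - reverted) / opened,
    }

# Hypothetical A/B numbers: the test group publishes fewer edits,
# but fewer of those edits end up reverted.
control = funnel_metrics(opened=10_000, published=4_200, reverted=500)
test = funnel_metrics(opened=10_000, published=3_900, reverted=250)
```

If a completion drop comes with a proportional drop in reverts, the feature may simply be deterring edits that would have been reverted anyway, which matches the reading Sdkb suggests above.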

Next check: Reference Reliability

[edit]

The next Edit Check will prompt people to replace a source when they attempt to cite a domain a project has deemed to be spam.

Edit Check (Reference Reliability) design explorations

Where "spam" for this first iteration means the domain someone is attempting to cite is listed on a project's Special:BlockedExternalDomains page.
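As a sketch of what that lookup might involve, a check could normalize the cited URL's host and match it against the list, including subdomains. The JSON shape below is an assumption made for illustration; treat the exact schema of a project's BlockedExternalDomains page as something to verify:

```python
import json
from urllib.parse import urlparse

# Hypothetical entries in the style of a BlockedExternalDomains.json
# page; the exact field names here are an assumption.
blocked_json = '[{"domain": "spam.example", "notes": "link spam"}]'
BLOCKED = {entry["domain"] for entry in json.loads(blocked_json)}

def is_blocked(url):
    """True if the cited URL's host is a blocked domain or a subdomain
    of one (hosts compared case-insensitively)."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in BLOCKED)

is_blocked("https://www.spam.example/article")  # → True
is_blocked("https://example.org/article")       # → False
```

The subdomain check matters because citations often carry a www. or similar prefix while blocklists tend to name the bare domain.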

Zooming out, the Editing Team sees this as a first step towards a potential future where editing interfaces can use the consensus stored in pages like en:WP:RSP to offer people feedback about the source they are attempting to cite. See more in phab:T346849.

The designs we are exploring are pictured here.

We now need your help...

  1. Which approach do you favor most?
  2. What about that approach do you appreciate?
  3. What questions/concerns do these approaches bring to your mind?

PPelberg (WMF) (talk) 15:46, 16 November 2023 (UTC)Reply

This seems to have issues, see T366985. Count Count (talk) 06:07, 8 June 2024 (UTC)Reply

Cite Q

[edit]

Does the reference checking recognise Cite Q (en:Template:Cite Q on en.Wikipedia, and equivalent on other projects)? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:42, 9 March 2024 (UTC)Reply

Any way of emulating <ref> tags (including #tag:ref) is recognized as a reference call. This being said, newcomers aren't likely to use Cite Q firsthand. :) Trizek_(WMF) (talk) 14:45, 4 April 2024 (UTC)Reply

Talk:Edit check/False positives/Reports

[edit]

Hello, as far as I can see, there has only been vandalism on this page [2], without any valid report. I am concerned about whether this page is useful, or whether the link to this page is placed in the wrong place. Thanks. SCP-2000 (talk) 13:18, 2 May 2024 (UTC)Reply

Hello @SCP-2000
Thank you for your message, and also for patrolling the report page. The link to this page is shown on edits triggering Edit check. It was a test to see if false positives would be reported by users who don't know team members. I'll discuss the problem with the Editing team.
Trizek_(WMF) (talk) 14:43, 2 May 2024 (UTC)Reply
@Trizek (WMF): Hello, are there any updates on this matter? Perhaps that page can be semi-protected first, as the problem still exists. Thanks. SCP-2000 (talk) 18:08, 18 August 2024 (UTC)Reply
@SCP-2000 it is still a topic to discuss. I'll put it up for our next meeting. Trizek_(WMF) (talk) 14:28, 26 August 2024 (UTC)Reply

Edit Checks as you are typing

[edit]
Technical demo showing what it could be like for Edit Checks to appear within/alongside the editable surface.

People like @Andrew Davidson, @Nardog, and @Suffusion of Yellow have raised the idea of Edit Check showing people Checks while they are typing. 1, 2, 3

With this initial technical prototype (desktop), the Editing Team is exploring what it could be like for Checks to:

  1. Appear within/alongside the editable document, in the moment when people make a change that causes a Check to be activated
  2. Respond when people make a change that impacts the Check that they activated

We need help

[edit]

What do you think about this idea? What questions/concerns/ideas/etc. does it bring to mind?

Please note: what you're seeing is very much a sketch. As such, please don't hold back feedback! Being this early in the design and development process means there are still many aspects of the experience that have not yet been defined. PPelberg (WMF) (talk) 21:12, 7 June 2024 (UTC)Reply

The UI design of this is good, but how helpful it is really depends on how smart you can make the suggestions. Since it triggers as you are typing, it has more potential to be noisy. Where is the definition for how the warnings are determined? Steven Walling (talk) 19:15, 9 June 2024 (UTC)Reply
...how helpful it is really depends on how smart you can make the suggestions. Since it triggers as you are typing, it has more potential to be noisy.
Big +1, @Steven Walling. I think a risk we need to manage is, as you described: Edit Check intervening in ways that people experience to be disruptive, irrelevant, unhelpful, etc. Further, I think a single, let's say, "poorly implemented" Check has the potential to get in the way of the positive impact people experience other Checks to have.
Where is the definition for how the warnings are determined?
The Editing Team is currently prioritizing building Checks that have the potential to reduce the likelihood that an edit someone publishes will cause a negative consequence for them, for people who read Wikipedia, and for the experienced volunteers who work to make Wikipedia a resource people can depend on. Does this offer the sort of clarity you were seeking? See more in phab:T367897.
And hey, I'm glad you asked! You doing so was the prompt I needed to add this thinking to the project FAQ. PPelberg (WMF) (talk) 22:54, 30 September 2024 (UTC)Reply
I think the most important thing is the ability to opt out. Different people react in different ways to moving or changing elements. Some people will appreciate this. For others, it might be so distracting that they'll give up editing. There probably isn't a "sweet spot" where we can ensure 100% of people will notice and 100% won't be aggravated. Our brains just react differently to motion. So give the user control. Suffusion of Yellow (talk) 23:24, 13 June 2024 (UTC)Reply
Offering the ability for individual people to opt-out of Edit Check is not something we've considered yet, @Suffusion of Yellow...thank you for raising this.
Can you think of an experience that offers the kind of "opt-out" functionality you might've had in mind when you suggested the above?
No worries if nothing comes to mind. I realize it's taken me >3 months to respond :o
Reason I ask: I'm thinking about the tradeoff between ease and disruption. E.g. remove too much friction from the publishing process (by way of offering an easy opt-out mechanism) at the "expense" of increasing the likelihood that someone publishes edit(s) that negatively impact readers and create more work for experienced volunteers.
In parallel, I've created phab:T376086 to track this work. PPelberg (WMF) (talk) 23:16, 30 September 2024 (UTC)Reply
I agree with Sj's suggestion. It is also recommended that there is a "dismiss" button to ignore the messages. The Edit Check (i.e. Robot) icon is a bit strange and perhaps you could use the checklist icon (e.g. [3]) etc? Thanks. SCP-2000 (talk) 14:44, 24 June 2024 (UTC)Reply
One more thing: There is no separation between the content area and the warning area. Perhaps separate them (like Grammarly and the Microsoft Word Editor Pane)? Thanks. SCP-2000 (talk) 03:35, 9 July 2024 (UTC)Reply

Less ⚠️ warning ⚠️, more helpfulness

[edit]

Our internal templates and notifications have become very judgmental and negative -- talk pages are full of warnings and alerts about pending challenges, deletions, and problems; banners at the top of pages are almost uniformly negative content warnings. That's the opposite of good narrative + game design; let's try to reverse that~

These notices to the side could be shorter, more along the lines of a helpful tutorial step ("use this button to expand references"), and sparing in suggesting optional style guide directives (even I who like much of the style guide wouldn't appreciate an automated "don't use adjectives" popup).

Text/code completion and writing-checks do a fine job, polished over decades, of showing people things they might want to revisit without getting in their way while writing. Maybe we can adopt some of those techniques: subtle underlining (with expandable details that default to collapsed) or faint proposed rewriting/correction.

And I would prefer to see people encouraged to save early and often. Like a good tutorial, only showing people one core thing at a time helps them save each step and iteratively learn to do more things, in sequence or in the same edit. Overleaf has a nice way of showing you how many potential errors and alerts you could expand and try to resolve, so that you can revisit your past writing once you have time to focus on that -- that could be a separate sort (or color?) of notice-count. Sj (talk) 22:09, 12 June 2024 (UTC)Reply

[edit]

Hello, thanks for your hard work in developing Edit Check, which is a great feature for newcomers and the community! It is recommended that:

  1. Include a link to the spam-blacklist false positive reporting page (e.g. en:MediaWiki_talk:Spam-blacklist) in the warning message when users attempt to link to a blocked domain. If possible, allow the community to customize the link, and even the entire interface message.
  2. Add a log entry to record the action of attempting to link to a blocked domain. Currently, if the edit triggers the spam-blacklist, this would be recorded in the log (e.g. en:Special:Log/spamblacklist). Edit Check should also have the same feature for debugging.

SCP-2000 (talk) 05:13, 21 June 2024 (UTC)Reply

Thank you for your appreciation and the suggestions @SCP-2000.
I documented point 1 as T368150
Regarding the second idea, what if we tag the changes in the log? For instance $date $user caused a spam block list hit on $page by attempting to add $URL #linkcheck. I need to check on the feasibility.
Trizek_(WMF) (talk) 13:41, 21 June 2024 (UTC)Reply
@Trizek (WMF): Hello, thanks for your response. I think $date $user caused a spam block list hit on $page by attempting to add $URL #linkcheck is okay for me. The community needs to know what URL the user was attempting to add. SCP-2000 (talk) 03:30, 22 June 2024 (UTC)Reply
Thanks, I documented it! Trizek_(WMF) (talk) 17:34, 25 June 2024 (UTC)Reply
Thank you for your appreciation and the suggestions @SCP-2000.
+1, @Trizek (WMF)...thank you, @SCP-2000.
Regarding adding, ...a log entry to record the action of attempting to link to a blocked domain... a couple of follow-up questions for you:
  1. When people attempt to link to a domain present within meta:Spam_blacklist, it sounds like you would expect that attempt to be logged on Special:Log/spamblacklist. Now, what about when people attempt to link to a domain present within en:MediaWiki:BlockedExternalDomains.json or MediaWiki:Spam-blacklist, where would you expect those attempts to be logged?
  2. Would it be accurate for us to think the edit tag @Trizek (WMF) proposed above (#linkcheck, or some variation of it) would sufficiently differentiate hits triggered within Edit Check from other types of hits?
  3. Once we become clear about the above, might you be able to recommend a few people and/or places where we can invite other volunteers to review what we're proposing before moving forward with implementation? Of course, if you would like to help out with inviting feedback, we'd welcome it ^ _ ^
PPelberg (WMF) (talk) 22:38, 21 August 2024 (UTC)Reply
@PPelberg (WMF): Hello, thanks for your question:)
  1. I would expect it to work the same as the present system works, i.e. attempts to link to a domain within MediaWiki:Spam-blacklist and meta:Spam_blacklist will be logged on Special:Log/spamblacklist; within MediaWiki:BlockedExternalDomains.json will be logged on Special:Log/abusefilterblockeddomainhit.
  2. Yes, (#linkcheck) this kind of tag is enough.
  3. It is an interesting question:) As it is not a major change, maybe using the Tech News to invite others to review is fine?
SCP-2000 (talk) 04:22, 23 August 2024 (UTC)Reply
@SCP-2000: all that you described sounds great to me.
Before posting an invitation for feedback in Tech News, can you please review the newly-created Requirements section of T368438 and share what – if anything – about it you think ought to be changed? PPelberg (WMF) (talk) 21:43, 23 August 2024 (UTC)Reply
@PPelberg (WMF): Hello, it is fine for me. Thanks! SCP-2000 (talk) 11:48, 25 August 2024 (UTC)Reply
Wonderful – thank you for taking a look, @SCP-2000. We'll work to include an invitation for other volunteers to review and share feedback about the proposed approach in next week's Tech/News. PPelberg (WMF) (talk) 19:45, 26 August 2024 (UTC)Reply

AbuseFilter

[edit]

This is probably related to #Edit_Checks_as_you_are_typing. It would be good to have (some? all?) tags exposed as variables in AbuseFilter, so if the user decides to ignore the warning, the community can block it before saving. Strainu (talk) 09:45, 27 September 2024 (UTC)Reply

hi @Strainu – can you please give the below a read? In what ways (if any) does what I've written not align with what you have in mind?
Proposal
"Each time an Edit Check (of any sort) gets activated and someone subsequently declines to act upon it, expose this information to AbuseFilter so that volunteers can script/specify what happens in response when someone proceeds to publish changes made in edit sessions of this sort."
Assuming the above is somewhat accurate, can you share a bit more about what prompted you to ask? E.g. might there be a particular pattern you're noticing or could foresee happening? PPelberg (WMF) (talk) 22:21, 3 October 2024 (UTC)Reply
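To make the proposal above concrete, a filter along those lines might look like the sketch below. It is written in ordinary AbuseFilter rule syntax, but the `editcheck_dismissed_checks` variable is invented purely for illustration – no such variable exists today; exposing something like it is exactly what this proposal asks for:

```
/* Hypothetical sketch: "editcheck_dismissed_checks" does not exist yet
   and stands in for whatever variable this proposal would introduce.
   page_namespace and added_lines are existing AbuseFilter variables. */
page_namespace == 0 &
"references" in editcheck_dismissed_checks &
added_lines rlike "\w{200,}"
```

A community could then attach a warn, tag, or disallow action to such a filter, per local policy.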
Thanks @PPelberg (WMF). The text is aligned with my proposal. I asked in the context of a certain pattern of translations I observed lately: people copy the page from enwp, then translate only the text, arguing they will add links and sources "later". Sometimes they do, other times they don't. A discussion on the subject can be found here. Strainu (talk) 04:11, 4 October 2024 (UTC)Reply

Paste Check

Paste Check mobile user experience walkthrough.

The Editing Team is actively working on a new Edit Check: Paste Check.

Paste Check will prompt people pasting text into an article to confirm whether they did or did not write the content they are attempting to add.

There is an initial proposal for the user experience ready: video walkthrough (4 minutes).

Now, we need help answering some questions about the experience:

  1. Content Source: By default, should we suppress the Paste Check if the text comes from Microsoft Word, Google Docs, LibreOffice, etc.?
  2. Paste Size: By default, how much text does a newcomer need to have pasted in order for the Paste Check to appear? E.g. one sentence? One paragraph? Multiple paragraphs? Something else...?
  3. Resolution Paths: What – if anything – concerns you about offering people who indicate they did not write the text the option to Delete or Rewrite the pasted text?
  4. Rewrite Threshold: By default, what % of the text do you think people would need to have changed in order to mitigate copyvio risk?
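
As a concrete reference point for question 4, one common way to measure "how much of the text was changed" is a sequence-similarity ratio. The sketch below uses Python's standard-library difflib; it is only an illustration of the idea – the eventual Edit Check implementation may well use a different similarity measure (word-level diffing, shingling, etc.):

```python
import difflib

def rewrite_fraction(pasted: str, published: str) -> float:
    """Estimate the fraction of the pasted text that was changed,
    as 1 minus difflib's character-level similarity ratio."""
    matcher = difflib.SequenceMatcher(None, pasted, published)
    return 1.0 - matcher.ratio()

# An unchanged paste scores 0.0; a complete rewrite approaches 1.0.
original = "The quick brown fox jumps over the lazy dog."
print(rewrite_fraction(original, original))  # → 0.0
print(rewrite_fraction(original, "A fast russet fox leaps above a sleepy hound."))
```

A threshold could then be compared against this score, though note that character-level ratios are easy to game with superficial edits.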

PPelberg (WMF) (talk) 20:57, 7 November 2024 (UTC)Reply

A few months ago, I posted literally the same idea in the wishlist: m:Community Wishlist/Wishes/Warn when large amount of content has been copy-pasted (more ideas in the discussion). Full support for this initiative! --Matěj Suchánek (talk) 14:58, 12 November 2024 (UTC)Reply
What a delight to discover this alignment, @Matěj Suchánek! I'm glad you decided to say something here.
Now, some responses to the points you raised as well as some follow-up questions for you...
Detecting the source of the pasted content
The Editing Team agrees with you in thinking that the source of content someone is pasting ought to be information we use to help decide whether to present a Check or not.
In fact, through the investigation @ESanders (WMF) did in T376306, we learned we can differentiate text pasted from document editors from other sources. This is leading us to think: by default, people should not see Paste Check in cases when they are pasting content from a document editor. Does that sound like a sensible default to set?
Related to the above: what's prompting you to think the Paste Check should not appear when someone is pasting text from another Wikipedia article?
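For what it's worth, the kind of differentiation investigated in T376306 typically relies on heuristic markers in the pasted clipboard HTML. The sketch below illustrates the idea in Python; the marker strings are commonly seen in Google Docs and Microsoft Word pastes, but the actual detection logic in T376306 may differ:

```python
def looks_like_doc_editor_paste(clipboard_html: str) -> bool:
    """Heuristic: does this pasted HTML look like it came from a
    document editor? Google Docs pastes commonly carry a
    'docs-internal-guid' id; Word pastes often carry Office schema
    URNs or 'MsoNormal' class names. Illustrative only."""
    markers = (
        "docs-internal-guid",
        "urn:schemas-microsoft-com:office",
        "MsoNormal",
    )
    return any(marker in clipboard_html for marker in markers)

print(looks_like_doc_editor_paste('<b id="docs-internal-guid-1a2b">Draft text</b>'))
print(looks_like_doc_editor_paste('<p>plain web paste</p>'))
```

Under the proposed default, a paste matching such markers would not trigger the Check.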
Patroller Awareness
...on publishing adds a tag to the edit, so that patrollers are aware of a possible problem.
Can you please say a bit more about how you imagine tagging to work in this context? Asked another way: When do you imagine this tag being applied to edits?
For example, do you imagine a tag being applied to edits where someone decides to publish the text they pasted without modifying it?
In the meantime, I've created T379843 to track work on this. PPelberg (WMF) (talk) 22:53, 13 November 2024 (UTC)Reply
Does that sound like a sensible default to set? Yes. It is sometimes presented as an acceptable way of preparing an article for submission (e.g., here). Note that it could be interesting to measure how often it happens (client-side analytics). Maybe the users could also be prompted to share why they prefer doing this (possible ideas for improving visual editor, etc.).
what's prompting you to think the Paste Check should not appear when someone is pasting text from another Wikipedia article? Let me make myself clearer. Currently, I can think of three situations that potentially motivate users to copy-paste content within a wiki:
a) Splitting an article. It doesn't happen often, and with proper attribution it is harmless.
b) Pasting from a draft/sandbox. An acceptable, common practice on my wiki.
c) Duplicating an article to achieve a title change. Unacceptable.
So it's up to discussion whether it should or should not happen.
For example, do you imagine a tag being applied to edits where someone decides to publish the text they pasted without modifying it? Yes, but I understand this is tricky. Does changing formatting (boldface, italics) or adding a link cancel copyright? Does a simple copy edit cancel copyright? Not really. On the other hand, you can paste in copyrighted content and then rewrite or paraphrase it within the wiki editor. It's questionable, but only the publish action counts. Again, some prior analytics could tell us whether this ever happens.
--Matěj Suchánek (talk) 21:03, 14 November 2024 (UTC)Reply