Constructive activation experimentation
Experiments to increase the percentage of new account holders who edit constructively
As part of its work on the 2024/2025 Annual Plan, the Growth team will explore various ways to increase constructive activation on mobile.
This project page documents Growth team experimentation related to the Wikimedia Foundation 2024-2025 Annual Plan, specifically the Wiki Experiences 1.2 Key Result.
Current Status
- June 2024 - planning and sharing the initial hypothesis
- September 2024 - user testing of design prototypes
- November 2024 - start community consultation with our pilot wikis
- December 2024 - conduct a temporary test at pilot wikis
- Next - conduct A/B test at pilot wikis
Summary
Current full-page editing experiences require too much context, patience, and trial and error for many newcomers to contribute constructively. To support a new generation of volunteers, we will increase the number and availability of smaller, structured, and more task-specific editing workflows (e.g. Edit Check and Structured Tasks). The Growth team will primarily focus on Structured Tasks, while working closely with the Editing team to ensure our work integrates well with Edit Check.
This project aims to address the following user problem:
Getting started editing on Wikipedia is difficult and especially frustrating on mobile devices. I want the editing interface to provide the in-the-moment policy and technical guidance I need, so my initial efforts aren't reverted.
This project aims to achieve the following user outcome:
As a new Wikipedia volunteer, I feel confident and enthusiastic about contributing to the Wikimedia movement by editing Wikipedia articles. The tools provided guide me step-by-step, limit distractions, and allow me to learn progressively so I can successfully contribute on my mobile device.
Background
How does this work fit into the Wikimedia Foundation's Annual Plan?
[edit]Wiki Experiences 1: Contributor experience Objective
Under the Wikimedia Foundation's Infrastructure Goal, within the group of objectives focused on Wiki Experiences, there is an objective related to improving the experience of contributors:
- Wiki Experiences 1: Contributor experience Objective - Both experienced and new contributors rally together online to build a trustworthy encyclopedia, with more ease and less frustration.
Wiki Experiences 1.2 Key Result
Under the Contributor experience objective, one key result focuses on increasing newcomer constructive activation on mobile:
- Wiki Experiences 1.2 (WE1.2) Key Result - Widespread deployment of interventions shown to cause a 10% relative year-over-year increase in the percentage of newcomers who publish ≥1 constructive edit in the main namespace on a mobile device, as measured by controlled experiments.
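As a rough illustration of how the 10% relative year-over-year target would be checked, here is a minimal Python sketch; the function and the example rates are hypothetical, and the real measurement comes from controlled experiments:

```python
def relative_yoy_increase(baseline_rate: float, current_rate: float) -> float:
    """Relative year-over-year change in the constructive activation rate.

    Rates are fractions of newcomers who publish >= 1 constructive
    mainspace edit on a mobile device (e.g. 0.042 means 4.2%).
    """
    return (current_rate - baseline_rate) / baseline_rate

# Hypothetical example: a baseline of 4.2% rising to 4.7% is a ~11.9%
# relative increase, which would meet the WE1.2 target of 10%.
print(relative_yoy_increase(0.042, 0.047) >= 0.10)  # True
```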
Several Wikimedia Foundation teams are committed to working on projects under the WE1.2 Key Result; see the Draft Hypotheses.
Growth Team Hypotheses
The Growth team's initial focus is surfacing Structured Tasks to newcomers in new ways:
| Growth team hypothesis | Timeline | Phabricator main task |
|---|---|---|
| Wiki Experiences 1.2.3: If we conduct user tests on two or more design prototypes introducing Structured Tasks to newcomers within/proximate to the Visual Editor, then we can quickly learn which designs will work best for new editors, while also enabling engineers to assess technical feasibility and estimate effort for each approach. | - | T362584 |
| Wiki Experiences 1.2.6: If we introduce new account holders to the “Add a Link” Structured Task in Wikipedia articles, we expect to increase the percentage of new account holders who constructively activate on mobile by 10% compared to the baseline. | 2024-10-01 - 2025-03-31 | T368187 |
Wikimedia Foundation teams are approaching annual planning more iteratively this year, so rather than committing to larger year-long projects, our first hypothesis is fairly narrow in scope. This should allow us to deliver value in smaller increments throughout the year, while also ensuring we have the flexibility to pivot as we learn.
WE1.2.3 is just a first step; the WE1.2.6 hypothesis covers implementing what we learn from the WE1.2.3 user tests. The initial experiment will focus narrowly on new account holders with zero edits. Later in the fiscal year we may develop new types of structured tasks.
How are we defining constructive activation?
For WE1.2 we are focusing on brand-new account holders on mobile, so constructive activation is defined as a newcomer making at least one edit to an article in the main namespace of a Wikipedia project on a mobile device within 24 hours of registering on a mobile device, with that edit not being reverted within 48 hours of publication. This will be measured on a per-platform basis (we will measure mobile web and mobile app activation separately).
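For illustration only, the definition above could be expressed roughly as the following Python sketch; the event fields and data structures are hypothetical and do not reflect the actual instrumentation:

```python
from datetime import datetime, timedelta

def is_constructive_activation(registration: datetime,
                               registered_on_mobile: bool,
                               edits: list[dict]) -> bool:
    """True if a newcomer constructively activated on mobile.

    Each edit dict is assumed to carry: 'timestamp' (datetime),
    'namespace' (int, 0 = article/main namespace), 'on_mobile' (bool),
    and 'reverted_within_48h' (bool).
    """
    if not registered_on_mobile:
        return False
    window_end = registration + timedelta(hours=24)
    return any(
        edit["namespace"] == 0
        and edit["on_mobile"]
        and registration <= edit["timestamp"] <= window_end
        and not edit["reverted_within_48h"]
        for edit in edits
    )
```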
What research and data inform these experiments?
This work is guided by the following observations and associated data and research:
The majority of new account holders on Wikipedia never complete even an initial edit.
Task-specific, structured workflows cause more newcomers to publish a constructive edit.
- Structured Task research - Add a link & Add an Image
- Talk page research - Reply Tool & New Topic Tool
- Newcomer tasks experiment analysis
Newcomers struggle with noticing, understanding, and applying the policies that shape Wikipedia.
Fewer new editors are registering on Wikipedia.
- Wikimedia Stats: New registered users
- Investigate drop in New Editors T351759
The majority of Wikipedia pageviews are from mobile.
Community Discussion
We have discussed the broader concept behind this project with communities as part of the WMF annual plan Product & Technology OKRs discussion, and we collected feedback from Wikimania attendees. We will soon initiate a more detailed community consultation with our pilot wikis (Egyptian Arabic, French, and Spanish Wikipedia). (T372957)
Design
We have explored three different approaches to surfacing structured tasks and considered which moments during the editing journey are the most appropriate for presenting an edit suggestion. We decided to test and move forward with two approaches surfaced during the reading experience (screenshots 1 and 2) and one during the editing process, inside the Visual Editor (screenshot 3).
(Screenshots 1, 2, and 3 illustrate the three design approaches.)
We ran unmoderated user tests with participants who were familiar with Wikipedia but had limited experience editing or contributing to it. The main objectives were to:
- Understand which of the three designs was the most promising to users,
- Determine whether users would be more or less likely to consider editing in the future knowing that this kind of help is available, and
- Identify which copy options resonated best with participants.
User testing insights
Overall, the idea of surfacing an edit suggestion alongside/within the Visual Editor was well received. All participants (6/6) expressed that they would likely consider editing knowing that this kind of help is available, and that they would feel motivated to complete an edit. In terms of the different approaches, approach one was the most promising for most participants (4/6).
"First experience is better, it provides in-line help with editing."
However, many questions came up about what the yellow tag actually meant. Participants assumed different meanings, for example that the tag indicated a broken link, a bot, or recently edited text.
For approach two, participants appreciated the difficulty + time estimate tags.
"If it only takes 2 to 3 minutes, not a big deal. It says it's easy, then it makes me more likely to do it."
Approach three was not as successful, but participants still offered positive feedback.
"I didn’t find [the modal] confusing or intrusive. I think it was actually encouraging".
"I would feel inclined [to edit], especially if it was a situation where I forgot that I was editing or I haven't finished editing as yet. So it's definitely a good reminder."
“I would be more inclined to click on ‘Help add a link’ because I think it would almost do it for me, and I feel like I achieved something.”
In terms of copy, we presented four options and asked participants which version of the text felt the most inviting. These were the two most successful options, described as captivating, concise, and engaging:
“Should this text be linked? This article comes with a simple edit suggestion for adding links that guides you through the process step by step. Would you like to review it?”
“Connect knowledge: Link this article to others and improve Wikipedia’s web of information.”
Next steps
As we plan ahead, these tests helped us identify potential further improvements, including:
- Iterate on copy
- Re-think yellow tag for approach one
- Consider adding difficulty + time estimate tags to all approaches
Measurement and Results
Metrics grid from the Measurement Plan (T377096)
| Metric | Timeline | Metric type | Notes, questions, and documentation |
|---|---|---|---|
| Impressions a: views of articles where an “Add a link” task is surfaced. | Q2 Alpha test; Q3 experiment | Secondary | On how many pages are tasks surfaced? |
| Impressions b: views of structured tasks on an article page. | Q2 Alpha test; Q3 experiment | Secondary | How many tasks are being surfaced? |
| Task CTR: click/tap on the “YES” on the pop-up. | Q2 Alpha test; Q3 experiment | Primary | How many newcomers both SEE the initial highlighted suggestion and click through to the “Add a link” task? (# of yes clicks / # of task impressions b) * 100 |
| Constructive Activation on mobile | Q2 Alpha test (if time allows); Q3 experiment | Primary | A newcomer making at least one edit to an article in the main namespace on a mobile device within 24 hours of registration, through the module, with that edit not being reverted within 48 hours of publication. Details: T360829#9903872 |
| Newcomer Retention | Q3 experiment | Secondary | If we increase constructive activation but that does not flow into retained users, the impact of this work will be limited; ensure newcomer retention remains stable or improves. |
| Revert Rate | Q3 experiment | Guardrail | |
| Task status | Q3 experiment | Guardrail | |
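To make the Task CTR formula in the grid concrete, here is a minimal sketch; the counts are hypothetical and the production numbers will come from the experiment's instrumentation:

```python
def task_ctr(yes_clicks: int, task_impressions_b: int) -> float:
    """Task CTR as defined in the metrics grid:
    (# of yes clicks / # of task impressions b) * 100.
    """
    if task_impressions_b == 0:
        return 0.0
    return yes_clicks / task_impressions_b * 100

# Hypothetical example: 120 "YES" taps out of 3,000 task impressions -> 4.0
print(task_ctr(120, 3000))  # 4.0
```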