Account Creation Improvement Project
This page provides information about the technical needs of the Account Creation Improvement Project, a Wikimedia Foundation initiative from 2011, aimed at increasing the number of people who create a user account and actually start editing.
- Also see the Account creation UX editor engagement experiment.
Status
- 2011-04-22: Nimish is reworking some of the extension logic so the campaigns don't get set until the user gets to the account creation page. The current plan is to deploy to mediawiki.org on Monday, April 25, and to en.wikipedia.org on Wednesday, April 27. Exact deployment timing can be found on the Software deployments page on wikitech.
- (It looks like this did deploy in April 2011; as of July 2012 the extension was still live on the English Wikipedia, and it was disabled in August 2012 as a precursor to the Account creation user experience editor engagement experiment.)
The ACIP results showed that the "ACP1" set of changes increased the number of newcomers who make 1 edit by 15%, and slightly increased the number of newcomers who make 5 edits.
Task list
Current task list, in order:
- Validate whether Special:Userlogin hooks are sufficient to inject UI message replacements for any part of the Special:Userlogin flow
- If not, modify Special:Userlogin accordingly
- Begin building extension for Special:Userlogin campaigns (Extension:SignupCampaigns?)
- MediaWiki message scheme: something like MediaWiki:Campaigns-Signup/foocampaign/Welcomecreation, etc. (where "foocampaign" is the identifier for the campaign); a sketch of this scheme follows the list
- Begin building ClickTracking support for ACP campaigns
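As a rough illustration of that message scheme (the helper below is hypothetical, and the mapping between message keys and MediaWiki: pages is simplified), a campaign-specific message could be resolved with a fallback to the stock interface message:

/**
 * Hypothetical helper for the proposed Extension:SignupCampaigns.
 * Prefers a campaign override such as
 * MediaWiki:Campaigns-Signup/foocampaign/Welcomecreation and falls back
 * to the default message (e.g. MediaWiki:Welcomecreation) otherwise.
 */
function acipCampaignMessage( $campaign, $messageKey ) {
    if ( $campaign !== '' ) {
        $override = wfMessage( "campaigns-signup/$campaign/$messageKey" );
        if ( $override->exists() ) {
            return $override;
        }
    }
    return wfMessage( $messageKey );
}

For example, acipCampaignMessage( 'foocampaign', 'welcomecreation' ) would show the campaign's welcome text where an override has been created on the local wiki, and the default text otherwise.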
General overview
Objectives
Our goal is to increase the number of people who successfully follow through after clicking on the "create account" link (i.e. create a user account) and make at least one edit after account creation.
Metrics
- Whether we can increase the % of users who edit by showing various treatments upon successfully completing the Account Creation flow
- Whether we can increase the % of users who make it through the Account Creation flow
Current account creation flow
Here is the current flow on the English Wikipedia:
Current steps for account creation:
- 0: User clicks on the Log in/ create account link in the navigation
- 1: User is taken to the Login page, which asks the user to either log in or to create an account
- 2: User is taken to the Account Creation page, which asks the user to create an account by filling in a form
- 3: User is taken to a Confirmation page
A/B-testing
Overview of the elements that will be A/B tested
The following diagram shows the elements which will be A/B tested:
Each of the three main steps within the flow will have multiple versions. Examples of each of these versions are included.
Requirements for A/B
- Ability to implement different pages for each step in the flow
- Ability to set percentages for each of the pages (% of users that see a given page)
- Ability to track fallout of each flow
Analytics Requirements
- Impact on editing: The first priority is to understand whether different versions of Page 3 result in different editing behavior. We need to be able to track which version users see so that we can run editing histories against users who have seen different versions of the page.
- We do not need to analyze the effects of different combinations of login, account creation, and confirmation pages on editing. Simply measuring the effect of different versions of the Confirmation page is sufficient.
- Funnel analysis: In order to understand the effectiveness of each point in the flow, we would ideally have funnel analysis (% of users that see 1a, 2a, 3a, etc.).
- An approximation for this is to be able to report the click-through rates for each page.
Technical specifications
General
[edit]Test settings management
There needs to be a configurable way to manage various 'treatment plans' and the rates at which users receive them. At its simplest, this could be a global configuration array that looks like this:
$wgBuckets = array(
    array( 'name' => 'A', 'rate' => 0.5 ), // 50% of users get treatment A
    array( 'name' => 'B', 'rate' => 0.5 )  // 50% of users get treatment B
);
The structure of this configuration, and how it is used, will depend on whether we treat the steps as independent or not.
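As a rough illustration only (the function name is hypothetical and not part of the plan), a user's bucket could be drawn by weighted random selection over the configured rates:

/**
 * Hypothetical bucket picker for the $wgBuckets configuration above.
 * Draws a uniform random number and walks the cumulative rates, so a
 * rate of 0.5 means roughly half of new visitors land in that bucket.
 */
function acipPickBucket( array $buckets ) {
    $roll = mt_rand() / mt_getrandmax(); // uniform value in [0, 1]
    $cumulative = 0.0;
    foreach ( $buckets as $bucket ) {
        $cumulative += $bucket['rate'];
        if ( $roll <= $cumulative ) {
            return $bucket['name'];
        }
    }
    // The rates should sum to 1; fall back to the last bucket if they do not.
    $last = end( $buckets );
    return $last['name'];
}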
Edits
The software may need to send edit data to the data collection mechanism, depending on which mechanism is chosen.
A/B-testing in step 1 (Login)
On page load, the software needs to check if the user has a "bucket" cookie set. If not, the user will need to be given a bucket cookie that corresponds to the bucket they're in. Based on this, the user will be sent to the appropriate login page. This will need to send information to the data collection mechanism.
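A minimal sketch of that check, re-using the hypothetical acipPickBucket() helper above and an assumed cookie name of "acipBucket" (plain PHP cookie calls are shown for clarity; MediaWiki's request/response wrappers would normally be used):

function acipGetOrSetBucket( array $buckets ) {
    if ( isset( $_COOKIE['acipBucket'] ) ) {
        // Returning visitor: keep them in the bucket they were already assigned.
        return $_COOKIE['acipBucket'];
    }
    // New visitor: assign a bucket and remember it for 30 days.
    $bucket = acipPickBucket( $buckets );
    setcookie( 'acipBucket', $bucket, time() + 30 * 24 * 3600, '/' );
    // A log line or OWA event recording the assignment would be sent here.
    return $bucket;
}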
A/B-testing in step 2 (Account creation page)
On page load, the software needs to perform the same check as above; the user is then sent to the appropriate account creation page, and this event also needs to be sent to the data collection mechanism.
A/B-testing in step 3 (Confirmation)
Use the "UserLoginComplete" hook in SpecialUserlogin.php: the developer needs to deploy an extension that provides a switcher, driven by the configuration settings, in the "successfulCreation()" code path. This switcher will send people either to the existing page MediaWiki:Welcomecreation or to a new page, MediaWiki:WelcomecreationAlternative. MediaWiki:Welcomecreation is the existing page shown to all users who create an account, and every admin on a local Wikipedia is able to change it.
For the larger language versions of Wikipedia it will be sufficient if 20% of new users are directed to the new page MediaWiki:WelcomecreationAlternative that we use for our testing purposes; the other 80% will see the existing MediaWiki:Welcomecreation. This ratio needs to be adjustable, as smaller language versions will need a higher percentage of users directed to the testing page, so the percentage of users going to MediaWiki:WelcomecreationAlternative needs to be customizable on a project level.
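A rough sketch of what such a switcher might look like follows. The handler name, the $wgWelcomecreationAlternativeRate setting, and the assumption that the confirmation text can be swapped from this hook are all hypothetical, not a confirmed design:

$wgWelcomecreationAlternativeRate = 0.2; // 20% see the test page on large wikis

$wgHooks['UserLoginComplete'][] = 'acipOnUserLoginComplete';

/**
 * Hypothetical handler: after a successful account creation, pick which
 * welcome message the new user sees, based on the per-project rate.
 * The chosen group would also be reported to the data collection mechanism.
 */
function acipOnUserLoginComplete( &$user, &$injectHtml ) {
    global $wgWelcomecreationAlternativeRate;
    $roll = mt_rand() / mt_getrandmax();
    if ( $roll < $wgWelcomecreationAlternativeRate ) {
        // Test group: show MediaWiki:WelcomecreationAlternative.
        $injectHtml = wfMessage( 'welcomecreationalternative' )->parse();
    } else {
        // Control group: show the existing MediaWiki:Welcomecreation text.
        $injectHtml = wfMessage( 'welcomecreation', $user->getName() )->parse();
    }
    return true;
}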
Open Questions
[edit]Technical
- Should we use OWA for data collection, or specially formatted log lines sent to our log collector?
- Should the user treatment information be stored locally on the user machine or in the database?
General
- Will we treat the steps as independent and optimize on every step, or treat the whole flow as being cumulative and create a strong flow?
A/B testing examples
For content development, see Account Creation Improvement Project/Testing content.
Step 1 (Login)
- Different sizes and colors of the "Create new account" button
Step 2 (Account creation page)
- Less wordy, fewer warning signs
Step 3 (Confirmation)
- Different welcome videos
- Welcome text/video "Did you know you can edit?"
- Welcome text/video "This is how you edit"
- Welcome text/video "This is what you can do on Wikipedia"
- Welcome text/video "Places where we need help"
- Welcome text/video "These features are available to you now"
- "Learn how to edit" tutorial
- User page creation wizard