Reading/Readers contributions via Android
This consultation is closed and the page has been left up for archival purposes. Please do not edit the page. The consultation ran from December 2016 and slowed significantly in the weeks leading to its official close on February 27th. The outcome of the consultation can be read here.
About
What kinds of tasks could be done by readers or casual contributors that would support editors at the same time?
After the work on community of readers and the user interaction consultation, the Reading team would like to move forward with a conversation that helps frame a theme of products in which readers actively contribute to some of the required editing tasks. Below, we have created wireframes to demonstrate a subset of the ideas mentioned in our first consultation on this subject.
You can read the Reading team's thoughts on the topic here; below is a list of suggested tasks and their evaluation against some criteria. Please feel free to add more ideas, add to the criteria, or comment on the talk page.
Kindly note that all ideas need organization and planning around content moderation, which we would need to discuss seriously if any of these ideas are to move towards implementation.
The consultation has been open for over four weeks now, and there is a dedicated place to summarise the discussion to help us wrap up.
Recording audio titles
- Goal
Add an audio micro-contribution capability to the mobile apps, where there is an increasing readership, as a way to foster more usage of, and contributions to, audio content on mobile. New activities might offer the existing reader community new experiences and convert more mobile readers into editors.
- User Story
When I come across an article that has a hard-to-pronounce title that I am familiar with, I want to contribute an audio sample of my pronunciation so that other readers unfamiliar with IPA are able to hear the correct pronunciation.
The user contributes a recorded pronunciation of the article's title using their mobile device's microphone (e.g. en:Eyjafjallajokull, which can be linked in Wikidata, e.g. D:Q39651#P443).
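The pronunciation linkage mentioned above is an ordinary Wikidata statement (property P443, pronunciation audio). As a minimal sketch, assuming the standard Wikibase wbgetclaims module, a client could check whether the item already has a pronunciation file before prompting the reader to record one; uploading the recording itself to Commons and attaching it as a P443 value is not shown.

```python
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def pronunciation_audio(item_id):
    """Return the Commons file names stored as pronunciation audio (P443) on an item."""
    params = {
        "action": "wbgetclaims",
        "entity": item_id,
        "property": "P443",
        "format": "json",
    }
    claims = requests.get(WIKIDATA_API, params=params, timeout=10).json() \
        .get("claims", {}).get("P443", [])
    # For a Commons-media property, the datavalue is a plain file name string.
    return [c["mainsnak"]["datavalue"]["value"]
            for c in claims if c["mainsnak"].get("snaktype") == "value"]

if __name__ == "__main__":
    # Q39651 is Eyjafjallajökull, as referenced in the user story above.
    files = pronunciation_audio("Q39651")
    print(files or "No pronunciation audio yet - prompt the reader to record one.")
```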
- How do we know we’ve been successful?
- Increase in the number of quality (i.e., non-vandalism) audio contributions
- Increased usage of the listen feature
View the PDFs showing the upload and moderation workflow of this idea, or alternatively view an Interactive mockup (on Invision).
Image contributions
- Goal
Add an appropriate image via direct upload through the Android app. The entry point for the image could follow one of the two scenarios elaborated below:
Add image via Nearby
If a user allows our app to access their geo-location, the app could send them a notification that there is a Wikipedia entry near them that needs an image. This is a handy feature that you can opt into if you are sightseeing and have time. It could also work well for an organized photo walk or a hackathon-style activity.
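As a rough illustration of how the app might discover nearby articles that lack an image, here is a minimal sketch using the public MediaWiki geosearch generator together with the pageimages property; pages returned without a page image would be candidates for the notification described above. The coordinates and radius are placeholders.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def nearby_articles_missing_images(lat, lon, radius_m=2000, limit=20):
    """List nearby articles that have no page image set (candidates for a photo)."""
    params = {
        "action": "query",
        "format": "json",
        "generator": "geosearch",
        "ggscoord": f"{lat}|{lon}",
        "ggsradius": radius_m,
        "ggslimit": limit,
        "prop": "pageimages",
        "piprop": "name",
    }
    pages = requests.get(API, params=params, timeout=10).json() \
        .get("query", {}).get("pages", {})
    # Pages without a "pageimage" key have no lead image associated yet.
    return [p["title"] for p in pages.values() if "pageimage" not in p]

if __name__ == "__main__":
    # Placeholder coordinates; a real app would use the device location.
    for title in nearby_articles_missing_images(51.5074, -0.1278):
        print(title)
```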
View the PDF showing the image upload workflow, or alternatively view an Interactive mockup (on Invision).
Add/edit lead image to the article
The idea is that if someone is reading an article that is missing a lead image and there is a suitable image on their device, they can add it to the article.
This was requested in the 2016 community wishlist: https://meta.wikimedia.org/wiki/2016_Community_Wishlist_Survey/Categories/Mobile_and_apps#Uploading_from_a_mobile_phone
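Under the hood, adding an image from the device would go through the standard MediaWiki upload API on Commons. The following is a minimal sketch, assuming an already-authenticated requests session with upload rights; the file name, description, and local path are placeholders, and the subsequent step of associating the image with the article is not shown.

```python
import requests

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def upload_image(session, local_path, dest_name, description):
    """Upload a local file to Commons via action=upload (the session must be logged in)."""
    # 1. Fetch a CSRF token for the authenticated session.
    token = session.get(COMMONS_API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()["query"]["tokens"]["csrftoken"]

    # 2. POST the file along with the wikitext for its description page.
    with open(local_path, "rb") as f:
        return session.post(COMMONS_API, data={
            "action": "upload",
            "format": "json",
            "filename": dest_name,
            "comment": "Lead image contributed via the Android app",
            "text": description,
            "token": token,
        }, files={"file": f}).json()

# Example (all values are placeholders):
# upload_image(session, "photo.jpg", "Example landmark.jpg", "== Summary ==\nPhoto of the landmark.")
```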
View the PDF showing this upload workflow, or alternatively view an Interactive mockup (on Invision).
Moderation queue for image submissions
Part of supporting relatively high-visibility contribution of images via mobile devices is the ability to also moderate image contributions in the app, through the introduction of a dedicated moderation queue/dashboard.
Mockup of a proposed Moderation queue here
View the PDF showing this upload workflow, or alternatively view an Interactive mockup (on Invision).
Lead image editing
Goal: Improve the quality of the lead image feature, where images are sometimes poorly cropped.
This is a lower-impact image feature that improves the display of an article's lead image by allowing users to update the crop, or to choose a different image from those already in the article.
A. Update the crop of the lead image on an article
B. Change the lead image to another image within the article
View the PDFs showing the lead image edit and selection workflows, or alternatively view an Interactive mockup (on Invision).
Article Feedback (Rate/Thank)
This idea has two variations, and concentrates on a 'voting' aspect of article feedback rather than on feedback for improving article content (the latter has been separated into another proposal called 'Report an issue').
A. Rating
The idea is to provide a voice to readers on a platform that is used predominantly by readers (the mobile apps), and to increase reader contributions from mobile app users via micro-contributions. By introducing a low-effort feedback mechanism for readers, value is given back to Editors (providing recognition and an impetus for improving the quality of articles with the audience in mind); readers then benefit from seeing others' feedback, which may help inform their reading choices, and secondarily they become more aware of the editorial aspect of Wikipedia.
A specific version of this was proposed and discussed in the 2016 community wishlist: https://meta.wikimedia.org/wiki/2016_Community_Wishlist_Survey/Categories/Reading#Readers_comments_and_vote
View the PDF showing the rate an article workflow, or alternatively view an Interactive mockup (on Invision).
B. Readers thank Editors
A variation of this idea is allowing users to thank Editors at the end of an article. This has essentially the same benefits as rating an article, in providing appreciation to Editors and promoting the fact that Wikipedia is editable, whilst addressing some of the concerns raised in the past about users rating negatively based on the topic rather than the quality of the article content, by removing the thumbs-down voting option.
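For reference, MediaWiki already exposes a thank action (from the Thanks extension) that such a button could call. A minimal sketch, assuming an authenticated requests session and a known revision ID to thank; the revision ID and source tag are placeholders.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def thank_revision(session, rev_id):
    """Send a 'thanks' notification to the author of a revision (Thanks extension)."""
    # Writes need a CSRF token; the session must be logged in.
    token = session.get(API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()["query"]["tokens"]["csrftoken"]

    return session.post(API, data={
        "action": "thank",
        "rev": rev_id,
        "source": "app-article-footer",  # placeholder tag for where the thank originated
        "token": token,
        "format": "json",
    }).json()
```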
View the PDF showing the readers thank editors workflow, or alternatively view an Interactive mockup (on Invision).
Questions about articles
Make use of the quick-contribution format by asking short questions, suitable for mobile readers, that help improve Wikipedia content. We would do this by engaging the existing reader community with new experiences that allow them to contribute to improving Wikipedia.
This project overall asks humans to review and categorize article content. The resulting output could be used for two Wikimedia initiatives:
A. Wikilabels for mobile
Users 'label' or categorize articles, and the types of edits made to articles, to help 'train' Wikipedia's machine learning system (ORES).
Currently, there are three ‘campaigns’ asking for labeling in Wikilabels (English):
- Article topic – categorize an article as Academic and/or Pop culture by reading the article.
- Edit type – categorize the likely intention(s) of an edit, and whether it was an addition, modification and/or removal by looking at the diff of an article edit.
- Edit quality – categorize whether an edit was damaging and whether it was made in good faith by looking at the diff of an article edit.
For the purpose of categorization by readers on mobile devices, it seems apt to start with the first campaign, article topic, since the majority of readers would be unfamiliar with edit diffs, and reviewing diffs would be outside the normal workflow of a mobile reader.
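For context on how these labels are used, ORES exposes the scores of the models trained on them through a public API. Here is a minimal sketch querying the edit-quality models ('damaging' and 'goodfaith', matching the edit quality campaign above) for a single revision; the revision ID is a placeholder.

```python
import requests

ORES = "https://ores.wikimedia.org/v3/scores"

def edit_quality_scores(wiki, rev_id):
    """Fetch ORES 'damaging' and 'goodfaith' probabilities for a revision."""
    resp = requests.get(f"{ORES}/{wiki}/", params={
        "revids": rev_id,
        "models": "damaging|goodfaith",
    }, timeout=10).json()
    scores = resp[wiki]["scores"][str(rev_id)]
    return {model: data["score"]["probability"] for model, data in scores.items()}

if __name__ == "__main__":
    # Placeholder revision ID on English Wikipedia.
    print(edit_quality_scores("enwiki", 123456789))
```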
View the PDF showing the Wikilabels on mobile workflow, or alternatively view an Interactive mockup (on Invision).
B. WikiGrok
Users confirm metadata information about an article that will eventually be stored in Wikidata.
The variation in this mockup is that readers are asked questions, in a section at the end of the article, that they can optionally answer; the presumption is that they would need the knowledge gleaned from reading the article to be both more engaged and better able to answer metadata-type questions.
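A confirmed answer to such a question would end up as an ordinary Wikidata statement. The following is a minimal sketch, assuming an authenticated requests session on Wikidata and the standard wbcreateclaim module; the item, property, and target item IDs are placeholders for whatever the question asked about.

```python
import json
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def add_item_claim(session, item_id, prop_id, target_numeric_id):
    """Record a reader-confirmed statement of the form <item> <property> <target item>."""
    # Writes need a CSRF token; the session must be logged in.
    token = session.get(WIKIDATA_API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()["query"]["tokens"]["csrftoken"]

    return session.post(WIKIDATA_API, data={
        "action": "wbcreateclaim",
        "entity": item_id,
        "property": prop_id,
        "snaktype": "value",
        # For an item-valued property, the value is JSON naming the target entity.
        "value": json.dumps({"entity-type": "item", "numeric-id": target_numeric_id}),
        "token": token,
        "format": "json",
    }).json()

# Example (all IDs are placeholders): add_item_claim(session, "Q42", "P31", 5)
```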
View the PDF showing the WikiGrok on mobile workflow, or alternatively view an Interactive mockup (on Invision).
How do we measure the success of this idea?
A reduction in the Wikilabels queue for article categorizations, or an increase in the quality and content of article metadata in Wikidata.
Report an issue
Help empower mobile readers to make minor contributions that improve the quality of article content and display, which may ultimately engage some readers and convert them into more active editors.
Background - why do this?
- Low barrier to entry to encourage readers to start contributing
- Also useful as a quick flag for Editors who do not want to, or cannot, edit on mobile
- Contextually relevant, allowing users to report visual inconsistencies natively on their device, in situ
- Mobile app users are familiar with the model of submitting content feedback (e.g., Maps, Instagram, YouTube, Foursquare)
View the PDF showing the report article error workflows, or alternatively view an Interactive mockup (on Invision).