Reading/Multimedia/Research/Archive
This page is currently a draft.
This is an archive from a previous draft version of the Multimedia Research page, which includes many unprioritized tasks. A streamlined version is being created on this page.
Key metrics
To measure our progress with the multimedia program, we propose to track these two primary metrics on a monthly basis:
- files uploaded
- files used in pages
While we have been tracking the number of uploads for several years now, this metric only provides a partial measure of success, as a large share of uploaded files are never used in articles. Hence the need for a second metric tracking how many files are actually published in articles on our sites, in service of our ultimate goal: a richer multimedia experience for our users.
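As a rough illustration, both metrics can be pulled from a MediaWiki database replica. The sketch below is a minimal example, assuming the standard MediaWiki schema tables (image, imagelinks) and a hypothetical replica connection; it is for discussion, not a production query.

```python
# Minimal sketch of the two key metrics, assuming read access to a
# MediaWiki database replica. Table/column names follow the standard
# MediaWiki schema; host, user and password are placeholders.
import pymysql

def monthly_key_metrics(conn, month):
    """month is a 'YYYYMM' prefix, e.g. '201311'."""
    with conn.cursor() as cur:
        # Files uploaded: rows in `image` whose upload timestamp
        # (format YYYYMMDDHHMMSS) falls within the month.
        cur.execute(
            "SELECT COUNT(*) FROM image WHERE img_timestamp LIKE %s",
            (month + "%",),
        )
        files_uploaded = cur.fetchone()[0]
        # Files used in pages: distinct files referenced from at least
        # one page on this wiki (a cumulative snapshot, not a monthly
        # delta; cross-wiki usage would need the GlobalUsage tables).
        cur.execute("SELECT COUNT(DISTINCT il_to) FROM imagelinks")
        files_used = cur.fetchone()[0]
    return files_uploaded, files_used

# Placeholder connection details, for illustration only.
conn = pymysql.connect(host="replica.example.org", user="research",
                       password="...", database="commonswiki")
print(monthly_key_metrics(conn, "201311"))
```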
Besides these two metrics, we plan to collect other data to inform our next steps for the plan above. For example, we have already started to collect these file metrics, as well as these user metrics.
Going forward, here are some of our key research questions related to file feedback, curation, discovery and publication.
Product Research
For our first products, we would like to track basic usage patterns, so that research can inform our next steps. Here are some examples of the metrics we hope to collect.
Media Viewer
Here is a preliminary list of research questions we hope to answer about the Media Viewer feature, for discussion purposes.
- How many times did users click on a thumbnail to open the media viewer?
- How many of these clicks were from an article page thumbnail? A gallery thumbnail? A category thumbnail?
- How many times did users click on the full screen button? (total and percentage)
- How many times did users click on the Commons link? (total and percentage)
- How many times did users click on the close button? (total and percentage)
- How many unique users opened the media viewer? How many were logged in?
- How many users enabled this beta feature? (pull data from beta features research)
- How many users disabled this beta feature? (pull data from beta features research)
These metrics would be collected with EventLogging and related technologies, and visualized with Limn dashboards (see example). We would track this data on both a daily and a cumulative basis.
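To make the discussion concrete, here is a hedged sketch of how a few of these numbers could be read back out of an EventLogging table. The schema name (MediaViewer_1234567), the event_action field, and its values are illustrative assumptions, not the actual instrumentation.

```python
# Hypothetical analysis of Media Viewer EventLogging data. Legacy
# EventLogging stores each schema revision in its own SQL table; the
# table and field names below are placeholders for discussion.
import pymysql

ACTIONS = ("thumbnail-open", "fullscreen", "commons-link", "close")

def daily_media_viewer_counts(conn, day):
    """Count events per action for one day ('YYYYMMDD' prefix)."""
    counts = {}
    with conn.cursor() as cur:
        for action in ACTIONS:
            cur.execute(
                "SELECT COUNT(*) FROM MediaViewer_1234567"
                " WHERE event_action = %s AND timestamp LIKE %s",
                (action, day + "%"),
            )
            counts[action] = cur.fetchone()[0]
    # Report button clicks both as totals and as a percentage of the
    # times the viewer was opened, matching the questions above.
    opens = counts["thumbnail-open"]
    pct = {a: (100.0 * n / opens if opens else 0.0)
           for a, n in counts.items()}
    return counts, pct
```

The same table, extended with a unique user token field, would answer the unique-user questions; the enable/disable numbers would come from the Beta Features instrumentation discussed below.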
Beta Features
Here is a preliminary list of research questions we hope to answer about the Beta Features tool, for discussion purposes.
- How many times did users click on the 'Beta' link in the personal bar?
- How many times did users view the 'Beta features' preference page? (impressions)
- How many times did users enable (or disable) features after clicking on the 'Beta' link?
- How many times did users enable all beta features?
- How many times did users disable all beta features?
- How many unique users enabled all features after clicking on the 'Beta' link?
- How many unique users disabled all features after clicking on the 'Beta' link?
- How many times did users enable beta feature x?
- How many times did users disable beta feature x?
- How many unique users enabled feature x?
- How many unique users disabled feature x?
- How many times did users click on the project page icon for feature x?
- How many times did users click on the discussion page icon for feature x?
- What is the clickthrough rate for enabling feature x, based on total impressions of the Beta preferences page? (see the sketch below)
(repeat for each feature x, y, z)
As with Media Viewer, these metrics would be collected with EventLogging and related technologies, and visualized with Limn dashboards (see example). We would track this data on both a daily and a cumulative basis.
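As one concrete example, the clickthrough-rate question above could be computed from two hypothetical EventLogging tables: one logging impressions of the 'Beta features' preference page, one logging enable/disable toggles. All table and field names here are assumptions for discussion.

```python
# Hypothetical clickthrough rate: enables of feature x divided by
# total impressions of the Beta preferences page on a given day.
import pymysql

def enable_clickthrough_rate(conn, feature, day):
    """day is a 'YYYYMMDD' prefix; returns a rate in [0, 1]."""
    with conn.cursor() as cur:
        # Impressions of the 'Beta features' preference page.
        cur.execute(
            "SELECT COUNT(*) FROM BetaPrefsImpression_1111111"
            " WHERE timestamp LIKE %s",
            (day + "%",),
        )
        impressions = cur.fetchone()[0]
        # Enable events for this feature on the same day.
        cur.execute(
            "SELECT COUNT(*) FROM BetaPrefsToggle_2222222"
            " WHERE event_feature = %s AND event_enabled = 1"
            " AND timestamp LIKE %s",
            (feature, day + "%"),
        )
        enables = cur.fetchone()[0]
    return enables / impressions if impressions else 0.0
```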
File feedback / curation
- What are the many different workflows that people use to curate files on Commons today?
- Can we quantify which curation tools/workflows are used the most? Map out the most popular workflows?
- Primary workflow: How many files are uploaded, categorized, edited, overwritten, rated or featured on Commons? (see first stats)
- Orphan workflow: How many files are uncategorized and/or never edited or used on Commons? (see first stats, and the sketch below)
- Deletion workflow: How many files are nominated for deletion and/or deleted per month on Commons? (see first stats)
- Is there a current 'quality rating' tool that could be adapted so that any user could 'mark a file as useful' in the media viewer?
- Is there a current 'flag as inappropriate' tool that could be adapted so that any user could 'mark a file as inappropriate' in the media viewer?
(for the last two questions, we are prepared to start a separate data table if there is no good match, but wanted to leave no stone unturned)
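For the orphan workflow question above, a first approximation is possible with the standard MediaWiki schema alone: files whose File: page has no categories and no local usage. This is a sketch under those assumptions; a real count would also consult the GlobalUsage tables for cross-wiki usage.

```python
# Sketch: count uncategorized, locally unused files on a Commons
# replica. Uses standard MediaWiki tables (`page`, `categorylinks`,
# `imagelinks`); the connection itself is assumed to exist.
import pymysql

ORPHAN_SQL = """
SELECT COUNT(*)
FROM page
LEFT JOIN categorylinks ON cl_from = page_id
LEFT JOIN imagelinks ON il_to = page_title
WHERE page_namespace = 6   -- File: namespace
  AND cl_from IS NULL      -- no categories
  AND il_to IS NULL        -- not used on any local page
"""

def count_orphan_files(conn):
    with conn.cursor() as cur:
        cur.execute(ORPHAN_SQL)
        return cur.fetchone()[0]
```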
File discovery / publishing
- What are the different workflows that Wikipedia editors use to find and publish media files in articles?
- Publication workflow: How many files are used in articles on other wikis? (see GlobalUsage extension)
- Can we quantify which discovery/publishing tools/workflows are used the most? Map out the most popular workflows?
- How often do editors search for media files to add to their articles? Can we tell how many of these searches are successful?
- What data do we have now on how often the Visual Editor's new 'Insert Media' tool is being used to browse and/or add files to articles?
- How many files are used today in the article namespace of Wikipedia? How many images vs. sounds vs. videos? How many from Commons?
- How many articles on Wikipedia have multimedia files? How many files per article? How many articles have no media files? (see the sketch below)
- How many unique editors publish files on other wikis each month? How many for the English Wikipedia in particular?
- How often are files added to articles reverted/removed? (compared to text edits, is this more or less?)
(Note that 'file usage' or 'file user' metrics could be the most important measure of success of our multimedia program)
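The article-coverage questions above lend themselves to a similar sketch against the standard page and imagelinks tables, restricted to the article (main) namespace. As before, the replica connection and the exact counting rules are assumptions for discussion.

```python
# Sketch: how many articles use at least one media file, how many
# files per such article, and how many articles have none. Assumes a
# Wikipedia database replica with the standard MediaWiki schema.
import pymysql

COVERAGE_SQL = """
SELECT COUNT(DISTINCT il_from), COUNT(*)
FROM imagelinks
JOIN page ON page_id = il_from
WHERE page_namespace = 0   -- article (main) namespace
"""

def article_media_coverage(conn, total_articles):
    with conn.cursor() as cur:
        cur.execute(COVERAGE_SQL)
        articles_with_files, file_uses = cur.fetchone()
    return {
        "articles_with_files": articles_with_files,
        "avg_files_per_article": (file_uses / articles_with_files
                                  if articles_with_files else 0.0),
        "articles_without_files": total_articles - articles_with_files,
    }
```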
In the coming days, these questions will be prioritized and turned into a long-term metrics plan for multimedia, based on available resources.