Since the Growth team is an inactive project, shouldn't it be removed from the project list? The link to Bugzilla should also be changed to Phabricator.
Thanks for letting us know. A lot of things should be updated. Feel free to help out. :)
QA_and_testing#Features_testing is a large section, and there is no dedicated Features testing page the way there is a Browser testing page. Being the first of the two main areas of QA & testing, wouldn't it deserve its own page with all the details?
That way this landing page would be cleaner, offering a better high-level overview.
Having a Features testing page could make sense. I could picture it holding general information about "Exploratory Testing" and similar topics.
Let me try something tomorrow, and if you don't like it we can revert.
Created: QA/Features_testing. It has the content the section had. Now *everything* related to Features testing can be added there shamelessly. Let's keep only the highlights here at QA and testing. By the way, one precise call to action would be welcome: what about that page you created about Echo testing?
Browser testing feels half-baked and is missing many of the links offered at QA_and_testing#Browser_testing. It would make sense to reduce the size of that section here and build up Browser testing instead, so that it has everything someone interested in that topic needs.
What do you think?
That could be. Alternatively, we could eliminate the Browser testing page as too much clutter (see the Features testing discussion above).
Let me try something tomorrow, and if you don't like it we can revert.
I have left the highlights about Browser testing in the section here, and the rest has been moved to QA/Browser_testing#How_to_contribute. Note that I have made the page a subpage of QA. I believe all this clarifies a lot what QA and testing is all about here, and what to do to learn more and get involved.
This text comes from the old Testing portal:
Test suites
MediaWiki internal
- Parser tests
  - Currently automatically run on trunk and integrated into the CodeReview overview
  - Hooks for extensions to add tests (not yet automated)
- t/ unit tests
  - Not totally functional (bugzilla:20112)
  - Some of the more generic ones (EOL checks, BOM checks) moved to tools/code-utils in r54922
- tests/ unit tests
- checkSyntax
  - In maintenance/
  - Needs upload capability
This text comes from the old Testing portal:
Client-side test suites
- There is some testing going on with parts of the UsabilityInitiative extension; it is not yet automated.
- Some talk of doing Selenium-based browser-hosted testing but this has not yet been implemented.
- Some Selenium tests
  - Selenium is no longer the way forward. Sumanah 19:15, 1 July 2011 (UTC)
- Using an HTML rendering engine (Gecko? WebKit?) to run user interface tests
- some JS testing notes
- Once you get QUnit tests, have a look at TestSwarm, a distributed JS tester.
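For anyone new to QUnit, here is a minimal sketch of what such a test could look like, assuming QUnit is already loaded on the page; the module name and the capitalize() helper are made-up examples, not existing MediaWiki code. TestSwarm can then run a suite of tests like this across many browsers.

```typescript
// Minimal QUnit test sketch. The function under test and the module
// name are illustrative assumptions, not part of MediaWiki.
function capitalize(s: string): string {
  return s.length === 0 ? s : s[0].toUpperCase() + s.slice(1);
}

QUnit.module('example');

QUnit.test('capitalize() uppercases the first character', (assert) => {
  assert.strictEqual(capitalize('wiki'), 'Wiki', 'lower-case input');
  assert.strictEqual(capitalize(''), '', 'empty string is unchanged');
});
```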
This text comes from the old Testing portal:
Performance testing
It would likely be wise to set up some automated performance testing; e.g., checking how much time, how many DB queries, how much memory, etc. are used during various operations (maybe using the same tests run above).
While performance indicators can vary based on environment and non-deterministic factors, logging and graphing the results of multiple iterations from a consistent testing environment can help us with two important things:
- Identify performance regressions when we see an unexpected shift in the figures
- Confirm performance improvements when we make some optimization
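As a minimal sketch of the iterate-and-log idea above (not an existing MediaWiki tool; the timed operation and the summary format are assumptions), one could run an operation many times in a fixed environment and record a per-run summary that gets appended to a log and graphed over time:

```typescript
// Minimal iteration-timing sketch for Node.js. The operation being
// timed is a placeholder, and the min/median/max summary is just one
// possible set of figures to log per code revision.
import { performance } from 'perf_hooks';

function timeIterations(run: () => void, iterations: number): number[] {
  const samples: number[] = [];
  for (let i = 0; i < iterations; i++) {
    const start = performance.now();
    run();
    samples.push(performance.now() - start); // elapsed milliseconds
  }
  return samples;
}

// Example: time a dummy operation 50 times and report min/median/max,
// the kind of series one would log per revision and graph for trends.
const samples = timeIterations(
  () => JSON.parse(JSON.stringify({ title: 'Main Page', revisions: new Array(100).fill(0) })),
  50
).sort((a, b) => a - b);

console.log({
  min: samples[0].toFixed(3),
  median: samples[Math.floor(samples.length / 2)].toFixed(3),
  max: samples[samples.length - 1].toFixed(3),
});
```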
See for instance Mozilla's performance test graphs for Firefox:
(more FF stuff via https://developer.mozilla.org/en/Tinderbox )
Where's a good place to list new tools that we might want to try out? (Or, for that matter, an inventory of everything already in use?)
E.g., reading mediazilla:57811 made me think of https://moztrap.readthedocs.org/
(but I guess that tool is not actually a good fit for the bug linked; the bug is more about acceptance testing (is this a good UI?) than functional/regression testing)
Also, apparently the beta cluster is used more than people realized (or so I heard someone say during a recent hiccup). If we have so many volunteers testing with it, then maybe a tool like this would help organize their efforts; it could be especially useful at strategic points in the deploy train?
Ping user:Greg (WMF). Also, user:Qgil-WMF, why was that removed in special:diff/937805?
Hi Jeremy,
You bring up several issues here that I can address:
- Functional testing by volunteers
- Test case management for persistent, repeatable tests
- Use of beta cluster by multiple teams for multiple purposes
We have tried several experiments enlisting volunteers to test features under development, and determined through experience that such projects cost more than they benefit. All of the software development projects done by WMF are iterative to some extent, meaning that we always have a backlog of features under development. Naive testers, that is, people not closely following the day-to-day activities of these projects, invariably report bugs in three categories and only three categories: WONTFIX, DUPLICATE, and ENHANCEMENT (with ENHANCEMENT being known issues already managed in the product backlog).
Our need for persistent test case management is mostly answered by our suites of automated browser tests. However, I do know that our Language team wants to manage a suite of test cases for manual testing, which is the sort of thing that MozTrap does. I've discussed this with them, and we intend to use Phabricator to do this. In the very near future, Phabricator will be the issue-tracking system for all of WMF, so it makes sense to use it for test case management as well.
Finally, the beta cluster has become an important part of our work across virtually every WMF project. The beta labs shared test environment today cannot properly support all of the projects it is being used for. We've identified the need for at least one additional test environment similar to beta labs, and we are starting work to build such an environment.
A recent edit seems to have "disbanded" the QA team/merged it into a new "Release Engineering team". Should the team/person-related info be moved somewhere else or updated? Is there an announcement about the change?
The "Quality Assurance" page and subpages are more of a resource for doing QA at WMF/with Mediawiki and extensions and less about team management. As such, they're capable of standing on their own. We can do fancy things with the info boxes to show it's part of Release Engineering, though, if needed (suggestions welcome there).
Also, and FWIW: the QA and Release Engineering teams have been merged "virtually" for almost a year now (i.e. without any real org structure change; everyone was still reporting to Robla). This change just officially makes it a team in the org (Release Engineering). I'll have Robla forward the announcement.
How do I report performance problems with Visual Editor? Can I get access to Bugzilla? Robert McClenon (talk) 20:54, 4 July 2013 (UTC)
See How to report a bug.
How do I sign up to take part in Quality Assurance? (Maybe this should have been done in the past, based on the controversy about the premature release of Visual Editor in the English encyclopedia in article space.) Robert McClenon (talk) 20:53, 4 July 2013 (UTC)
Hi Robert and welcome! :)
The easiest way to get started is to join our QA mailing list and introduce yourself: https://lists.wikimedia.org/mailman/listinfo/qa
Željko