NattyGraphicsTestPlan
DRAFT
Introduction
Purpose
The purpose of this document is to explain the different testing efforts for Unity in the Natty cycle. It is designed to prescribe the scope, approach, resources, and schedule for all testing activities during this cycle.
This document will be modified and enhanced as required throughout the verification testing effort. Specifically, this document:
- Defines the overall test approach.
- Identifies software and system hardware to be tested.
- Identifies hardware, software, and tools to be used to support the testing efforts.
- Defines the types of tests to be performed.
Note: Should a specific effort for regression testing be defined?
Scope
The testing tasks for the components of Ubuntu include:
- Functional testing of the applications (automated and manual - Desktop Testing initiative)
- Integration with the desktop environment
- Unity
- 2D Desktop (gnome fallback when 3D support is not available)
- Server Testing
- ISO Testing
- Hardware Testing
- Additional Testing: i18n, a11y, printing
For each testing task we will also:
- Collect, provide and review Test Cases that uncover new bugs or cover existing known issues.
- Collect test data and review testing results.
- Report all test results.
- Analyse risks and communicate blocking issues.
Testing will be performed throughout the Natty development cycle. The main milestones of the test plan are aligned with the Release Schedule, and Natty Release Interlock Testing depends heavily on it. As a result, test planning is a continuing activity performed throughout the system development life cycle.
Note: Should hardware testing (Certification Team) be included in this document?
Test Items and Features
Application
Category of applications to be tested
The categories of applications to be tested are:
- Core components (Help, menus, ...)
- Audio/Video
- Games
- Graphics
- Internet
- Office
- Utilities
- System Preferences
- System Administration
Each application must be covered by at least one automated test case to ensure that it can be launched on the platform. Additionally, because manual testing can cover only a limited number of test cases and requires significant manpower, we will focus on test cases that uncover new bugs and cover existing known issues.
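The per-application launch check described above could be sketched as a small helper: launch the command and verify the process survives a short grace period. This is only an illustration of the idea; the helper name and the grace-period approach are assumptions, not part of Mago itself.

```python
import shlex
import subprocess
import time

def check_launches(command, grace_seconds=2):
    """Launch a command and report whether it is still running
    after a short grace period (a stand-in for 'the application
    launched successfully')."""
    proc = subprocess.Popen(shlex.split(command),
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL)
    time.sleep(grace_seconds)
    alive = proc.poll() is None  # None means the process is still running
    if alive:
        proc.terminate()
        proc.wait()
    return alive

# 'sleep 5' stands in for a desktop application here.
print(check_launches("sleep 5"))  # a long-lived process passes
print(check_launches("false"))    # a process that exits at once fails
```

In a real Mago test the launch would go through the toolkit's own helpers and the check would also assert that the application's main window appeared; this sketch only captures the minimal "it starts and stays up" criterion.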
ACTION
Work with bdmurray and pedro to mine data in Launchpad and identify known issues for the aforementioned applications
Automated Testing
We will use Mago (lp:mago) and its helper tool magomatic (lp:magomatic) to automate test cases.
Unity itself is not accessible yet, so the Unity interface cannot be driven directly with the tools we have right now. Luckily, Mago is tool agnostic and, although it is mostly based on LDTP, we can add other tools to help us test the interface itself (see the actions below for examples of such tools).
ACTIONS
- Investigate XPresser (Limited to image matching)
- xautomation (Lack of 'record' feature ?)
- xnee/cnee (Crashes desktop in Natty)
- Other Tools based on XTEST ?
Manual Testing
The Desktop Testing Program started last cycle. It provided a good way for the community to test the desktop applications at every milestone. As part of the modifications to this program in the Natty cycle, we will include Unity in the program.
For each of the applications, we will duplicate the test cases so that they are run both in Unity and in the 2D experience.
The resources available to the community are:
The Test Cases wiki, to be extended with the following additional test cases:
Desktop
Unity
Although the DX team is primarily responsible for testing the Unity interface itself, we need to collaborate with them so that our tests work better. The QA Team is responsible for testing the integration of applications with Unity (e.g. the Shotwell menu not being integrated into the global menu).
The DX team will provide us with plain D-Bus access only, not dbusmenu. The D-Bus debug service will provide a way to collect the state of the Unity interface, e.g. the Panel, Menu, Indicators, Files place, etc.
Based on that, we can send D-Bus commands to compiz/unity to query the state of the interface without ATK support.
If we want to test global menu bits, e.g. the global /appmenu/, we can do so with the gnome-panel version, which calls the 'hidden' menus in Unity. The tests should be easily transposable to the Unity version once that menu gets ATK support as well.
All the applications shipped by default with Natty must be tested in Unity. The applications on Unity can be tested using the same a11y tools that we already use for the 2D experience. One of the main caveats is that Unity does not run under Xvfb and, therefore, we cannot run it on a server without having a full desktop up and running.
The subsystems of Unity that need to be covered are:
- Applications
- Notifications
- Messaging Menu
- Sound Menu
- Files
- Dash
An example of an integration test case is:
- Create a document in gedit.
- Verify that it appears in the Files place.
- Rename the document.
- Verify that the renamed document appears in the Files place.
ACTIONS
- Research how to run the tests in an automated fashion using Hudson (or something else)
- Run the Mago tests on Unity at least twice per week on a machine with Unity running, as an interim solution
- Make the test results publicly available
Additional tools to extend Mago
Xpresser, which is based on image comparison, is not suitable for checking the state of Unity after a test has run, as the design will change a lot during the Natty cycle. It can, however, be useful for launching applications from the launcher, since the launcher is not accessible yet and its icons do not tend to change much. We need to be careful, though, as Xpresser is not actively maintained and there may be some bugs.
2D Experience
All the applications must be tested in the 2D (legacy) environment.
The 2D experience needs to be tested as well; since it runs well under Xvfb, it will be easier to run using Hudson.
The subsystems of the 2D experience that need to be covered are:
- Notifications
- Messaging Menu
- Application Menu
- Sound Menu
ACTIONS
- Run a Hudson server in the data center
- Run a cronjob to update the system on a daily basis
- Run the mago tests on a daily basis in an automated fashion
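The daily update-and-run cadence above could be wired up with an ordinary crontab. The times, paths, and the exact mago invocation below are illustrative assumptions, not a prescribed setup:

```
# Hypothetical crontab for the Hudson/Mago machine
# (all paths and invocations are illustrative assumptions)

# 04:00 - bring the Natty system up to date unattended
0 4 * * *  apt-get update && apt-get -y dist-upgrade

# 05:00 - run the Mago suite headlessly (the 2D session runs under Xvfb)
0 5 * * *  xvfb-run --auto-servernum mago > /var/log/mago-daily.log 2>&1
```

In practice the run step would likely be triggered by Hudson itself rather than cron, with Hudson archiving the results for publication.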
Support Features
The following additional components are part of the testing effort:
- Accessibility (see with charlie_tca which items and features need to be reviewed)
- Translation (see with dpm)
- Printing (tkampeter was expected to provide test cases)
Server
Although we now have expanded hardware support on Natty (AMD64, AMD64-mac, i386, PowerPC, PowerPC-ps3, ARMEL-dove, possibly others?) we will test only i386 and AMD64. This expanded support is mostly a consolidation of the different ports -- see Natty cdimage ports consolidation.
Server itself
Expand tool-chain testing with more packages (currently we daily-build libvirt, postgresql-8.4, mysql-dfsg-5.1, openldap, php5, python2.6, and coreutils).
Applications
Cloud
We will test both Eucalyptus and OpenStack; the amount of OpenStack coverage will depend on its availability.
- add euca2ools tests
- verify euca2ools compatibility with EC2, and other Cloud technologies
- automate as much as possible the UEC tests currently run (depends on updates on euca-* utilities)
- test image compatibility (Lucid, Maverick, Natty) with EUC-Natty
Hardware Testing
HW Team
- Any input from the HW Team ?
Laptop Testing Initiative
Laptop Testing BluePrint (From the Ubuntu-IT Team)
We are going to set up a new tracker to track the results (http://laptop.qa.ubuntu.com). The Italian Testing team will be primarily responsible for maintaining the milestones and providing reports.
ISO Testing
As part of our per-milestone testing, we need to be testing Unity on desktops and the 2D experience fallback. Many of us use VMs to test the images, and we need to find a VM technology capable of running Unity.
Also, we need to modify the testcases (or add new ones) to match the new possibilities.
Arch / Flavour ?
- AMD64
- i386
Other Reference
Schedule
ACTIONS
- Communication with the installer and design teams to modify the testcases
- Include the new testcases into the ISO tracker
Features not to be tested
Resources
Community Involvement
The community can actively participate in all the testing effort during that cycle:
- Writing manual test cases
- Writing automated test cases for Mago
- Application Testing
- Desktop Testing
- Hardware Testing (especially with the laptop initiative from the IT LoCo Team)
Engage the LoCo Teams
In addition to the Desktop Testing program and the Laptop Testing initiative, we will rely on the following efforts:
General Schedule
Week | Date (Thursday) | Work Item Iteration | Status | Testing Schedule | Notes

October 2010
1 | October 21st | A-2 | Toolchain Uploaded | |
2 | October 28th | A-2 | Developer Summit | |

November 2010
3 | November 4th | A-2 | | Test Plan |
4 | November 11th | A-2 | | Test Plan |
5 | November 18th | A-2 | | Mago Tests Writing |
6 | November 25th | A-2 | | Mago Tests Writing |

December 2010
7 | December 2nd | Iteration Planning | Alpha 1 | A1 ISO Testing |
8 | December 9th | A-2 | | |
9 | December 16th | A-2 | | |
10 | December 23rd | A-2 | | |
11 | December 30th | A-2 | | |

January 2011
12 | January 6th | Iteration Planning | | |
13 | January 13th | A-2 | | |
14 | January 20th | A-2 | | 10.04.2 SRU Cleaning |
15 | January 27th | A-2 | Ubuntu 10.04.2 | 10.04.2 ISO Testing |

February 2011
16 | February 3rd | Iteration Planning | Alpha 2 | A2 ISO Testing |
17 | February 10th | A-3 | | |
18 | February 17th | A-3 | | |
19 | February 24th | A-3 | | |

March 2011
20 | March 3rd | Iteration Planning | Alpha 3 | A3 ISO Testing |
21 | March 10th | Beta UI | | Desktop Testing |
22 | March 17th | Beta UI | | Desktop Testing |
23 | March 24th | Beta UI | | |
24 | March 31st | Quality | Beta | Beta ISO Testing |

April 2011
25 | April 7th | Quality | | Desktop Testing |
26 | April 14th | Quality | FinalFreeze, KernelFreeze, NonLanguagePackTranslationDeadline | Desktop Testing |
27 | April 21st | Quality | | RC ISO Testing |
28 | April 28th | Quality | FinalRelease Ubuntu 11.04 | Final ISO Testing |
QATeam/NattyGraphicsTestPlan (last edited 2011-03-17 17:01:13 by ACaen-151-1-93-135)