Proposal - please contribute by editing this page or posting your comments on the discussion area for this page.
Note that this is a template giving the frame for an actual test plan. The intent is to list the areas and things one needs to consider when planning the QA activities needed for a certain MeeGo release (e.g. 1.2, 1.3).
The purpose of the Test Plan is to introduce the testing strategy, scope, sub-plans and testing activities done or supported by the QA team for the MeeGo X.X release.
The intended readers of this document are people interested in the reasoning behind the actual activities performed by the QA team.
<please write more>
The Test Plan is intended to provide the vehicle for the QA team to confirm the functionality and completeness of the MeeGo X.X release. It is expected that satisfying the complete set of criteria contained in this plan will signify a successfully functioning integrated release.
<please list more objectives>
The goal is to deliver a product with no open bugs of critical severity and a minimal number of open bugs of major severity.
<please add other goals>
During the release phase leading up to the final release, test cases will be executed. The test cases are described in test documents kept in Gitorious at meego.gitorious.org/meego-quality-assurance. The actual test case code may (depending on the license type and test level) be part of the source package. Each test case also contains test-specific criteria that decide upon its success or failure.
Test cases shall fulfill the definitions given in the Test case template.
<add other sources and definitions used>
A verdict shall be assigned to a test case after execution, according to the test case verdict instructions.
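As an illustration only, a test harness could map an observed outcome to a verdict along these lines. The actual rules live in the test case verdict instructions; the names below (`Verdict`, `assign_verdict`) are hypothetical, not part of any MeeGo tool.

```python
# Hypothetical sketch of verdict assignment after test execution.
# The authoritative rules are the MeeGo test case verdict instructions.
from enum import Enum

class Verdict(Enum):
    PASS = "PASS"
    FAIL = "FAIL"
    NOT_RUN = "N/A"

def assign_verdict(executed, actual, expected):
    """A case that was never executed gets no pass/fail verdict;
    otherwise the actual outcome is compared against the expected result."""
    if not executed:
        return Verdict.NOT_RUN
    return Verdict.PASS if actual == expected else Verdict.FAIL
```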
For each feature listed in bugs.meego.com, in principle, one or more test cases should exist (see the detailed test plans). This information should be defined in the test case according to the instructions given in the Test case template.
Components can be tested by a combination of component tests (unit, module and component integration testing put together), integration tests and system tests.
For a system test the component is typically tested by launching the application in the test environment, executing the component's functionality through the User Interface, and verifying the outcome against an 'expected result'. In cases where manual verification is required, the expected outcome is specified in the test case.
Where possible all or parts of the test steps are automated, but in all cases there will be a manually executed 'Acceptance Test' for each GUI component in which the module will, at least, be verified against a standard checklist.
For an integration test the component is typically tested by launching the component itself (if it has a GUI), or other applications that make use of it, inside the test environment and exercising its functionality through the User Interface. Where applicable, parts of the system may be replaced by stubs to partially insulate the component from its environment.
Unit tests are so named because they each test one unit of code. This type of test is usually written by developers as they work on the code (white-box style), to ensure that a specific function works as expected. Whether a module of code has hundreds of unit tests or only five is irrelevant. One function might have multiple tests, to catch corner cases or other branches in the code. A suite of unit tests should never cross process boundaries in a program, let alone network connections. Doing so introduces delays that make tests run slowly and discourage developers from running the whole suite.
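A minimal example of what such a unit test looks like, using Python's standard `unittest` module. The function under test, `slugify`, is purely illustrative and not part of MeeGo; note the separate test for a corner case, as described above.

```python
# Minimal unit test example (white-box, written alongside the code).
# slugify() is an illustrative function, not a MeeGo component.
import unittest

def slugify(text):
    """Lower-case a string and replace spaces with hyphens."""
    return text.strip().lower().replace(" ", "-")

class TestSlugify(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_corner_case_whitespace(self):
        # A dedicated test for a corner case, as the text suggests.
        self.assertEqual(slugify("  Edge  "), "edge")
```

Such a suite runs entirely inside one process (e.g. via `python -m unittest`), so it stays fast enough that developers keep running it.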
A module test exercises a module of code in a similar way to unit tests; the difference is that the tests are written black-box style (without reference to the internal structure of the module).
Introducing dependencies on external modules or data also turns unit tests into component integration tests. If one module misbehaves in a chain of interrelated modules, it is not immediately clear where to look for the cause of a failure. Component integration tests therefore check a ready-made part of the component, using simulators (stubs, test drivers) in place of missing external dependencies.
Tests are written on a highest-priority-first basis, and secondly to achieve broad coverage.
<add other priorization rules used by vertical or project>
At this stage we are aiming towards feature and functionality coverage.
<put additional coverage targets here>
MeeGo X.X is tested in a number of configurations, both on reference hardware and in a QEMU environment. The public configurations used for this release are YY.
<update data according to the information given by product management>
All automated tests are executed in an automated environment (OTS), and test results are typically available for each build.
Manual tests are executed regularly, but certainly before each release.
<update your rules and environments here>
Reporting for individual test sessions is done in qa-reports. Verticals might have additional reporting practices defined.
Additional tools such as Valgrind may be used for deep analysis of test details and failures, and tools such as <please contribute> are used in the documentation process. These tools provide additional data used to analyse the quality of the product, but are not necessarily used in the Go/No-Go decision making process.
<add your own tool portfolio here>
Check out the Test areas and types.
The basic functionality checklist can be categorized as below:
Test areas and types for functionality has more details/ideas.
<list your goals for functional testing here>
The performance target is to produce a quality product with performance that is competitive in the market. This will be achieved by first looking at the different areas that affect the performance of the MeeGo X.X release, on actual reference hardware, and then measuring these values to find out where performance needs to be improved. The focus areas might be:
Test areas and types for performance has more details/ideas.
Our goal is to have a fast and responsive system, with the following targets:
These are our initial targets only. Once we have reached these targets we will continue to improve and create a new set of targets to aim for.
<define your approach in more details here>
Our goal is to have a reliable system, with the following targets:
Test areas and types for reliability has more details/ideas.
The usability check list can be categorized as below:
Test areas and types for usability has more details/ideas.
MeeGo can be tested on reference hardware as well as in a QEMU environment. Both cases have value and may provide data that would be hard or impossible to acquire 'on the other side'.
Once MeeGo is installed in the QEMU environment or onto reference hardware, no additional environment setup should be required beyond what MeeGo itself already requires. It is important, though, that the environment is identical for each test run, i.e. the same background processes are running every time, and only those background processes that are required.
Certain aspects of MeeGo can only be tested if associated hardware modules are available on the reference hardware/desktop machine on which the test is executed. For instance:
Please refer to the manuals that come with these items to set them up correctly.
<put reference to your test environment here>
To test MeeGo in a QEMU environment, MeeGo needs to be built for the desktop which is accomplished by <put reference/link to the latest instructions over here>
To test MeeGo on a reference hardware, MeeGo needs to be built for that specific reference hardware, which is accomplished by <put reference/link to the latest instructions over here>
<The structure taken here follows the components and verticals in bugs.meego.com. This is provided as an illustrative example of how subplans can be organized. Align this according to your project setup>
|Component||Core OS||Handheld UX||Netbook UX|
|Bluetooth||<please add reference here>||<please add reference here>||<please add reference here>|
|Contacts||<please add reference here>||<please add reference here>||<please add reference here>|
|Content framework||<please add reference here>||<please add reference here>||<please add reference here>|
|Graphics subsystem||<please add reference here>||<please add reference here>||<please add reference here>|
|Media subsystem||<please add reference here>||<please add reference here>||<please add reference here>|
|Photo viewer||<please add reference here>||<please add reference here>||<please add reference here>|
|Qt||<please add reference here>||<please add reference here>||<please add reference here>|
|Virtual keyboard||<please add reference here>||<please add reference here>||<please add reference here>|
|Web browser||<please add reference here>||<please add reference here>||<please add reference here>|