Proposal - please contribute by editing this page or posting your comments on the discussion area for this page.
Note that this is a template giving the frame of an actual test plan. The intent is to list the areas and things one needs to consider when planning the QA activities for a given MeeGo release (e.g. 1.2, 1.3).
The purpose of this Test Plan is to define the strategy, scope, and testing activities performed or supported by the SDK QA team for the MeeGo 1.3 release, covering the HW-accelerated MeeGo Emulator for Mac and Windows hosts, which is developed and maintained by Intel. This plan describes the QA approach for testing the VMM module.
The objective of the VMM QA test plan is to validate the functionality and stability of the VMM module running on different host OSes. The OSes in focus are Mac OS and Windows. The target is to ensure that:
The goal is to deliver a product with no open bugs of critical severity and a minimal number of open bugs of major severity. A further goal is to reach performance comparable to what we have on Linux with GL acceleration and VT support.
During the release phase leading up to the final release, test cases will be executed. The test cases to be executed form the "QEMU specific" test suite from the MeeGo SDK (2) test project and can be found in Test Link. The test cases should cover the main objectives described above. Each test case also contains test-specific criteria that determine whether the test case passes or fails.
A verdict shall be assigned to each test case after execution, according to the test case verdict instructions.
For each feature listed in bugs.meego.com, in principle, one or more test cases should exist. This information should be defined in the test case according to the instructions given in the Test Case Template.
The overall objective of VMM QA is to ensure the stability and performance of the VMM module that will be used on Windows and Mac OS hosts; for example, that the image running under QEMU is not affected by the VMM module, and that the host remains stable while the VMM service is running. Different test types will be done, including:
Tests will be run on QEMU to ensure that its components were not affected by the VMM. The tests performed will be the basic tests that are run on a QEMU weekly image; they should cover the overall behavior of QEMU. Once the basic tests pass, the VMM module will be tested and monitored by checking its stability under various conditions (e.g. the VMM does not crash or restart if the host is under stress) and by checking the amount of memory it uses when idle or when playing a movie.
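As an illustration only, the memory checks could be scripted along the lines of the minimal sketch below. It assumes the third-party psutil package and that the emulator/VMM process can be found by name; "qemu" is a placeholder, not a confirmed process name from the actual test setup.

import time
import psutil

def find_process(name_fragment="qemu"):
    """Return the first process whose name contains name_fragment, or None."""
    for proc in psutil.process_iter(attrs=["name"]):
        if name_fragment in (proc.info["name"] or "").lower():
            return proc
    return None

def sample_memory(duration_s=600, interval_s=5):
    """Print a timestamped resident-memory sample every interval_s seconds,
    so idle runs can later be compared against movie-playback runs."""
    proc = find_process()
    if proc is None:
        print("emulator process not found")
        return
    end = time.time() + duration_s
    while time.time() < end and proc.is_running():
        rss_mb = proc.memory_info().rss / (1024 * 1024)
        print(f"{time.strftime('%H:%M:%S')}  RSS: {rss_mb:.1f} MB")
        time.sleep(interval_s)

if __name__ == "__main__":
    sample_memory()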
The priorities are as follows:
The VMM module will be released first on Mac, so Mac is the first priority. When it becomes available for Windows, P1 will shift to Windows (the Windows 7 vs. XP priority will be decided in the near future).
In the first stage of testing we will cover basic functionality scenarios. Once we are confident that the VMM module is stable, we will shift the focus to performance testing and benchmarking. Long-hour, load, and stress testing will be executed after the performance testing. The negative scenarios will be the last ones executed.
Basic functionality testing will be conducted in parallel on Mac OS and Windows XP, with 60% of the time allocated to Mac OS and 40% to Windows XP. The allocation may change if a blocking issue appears on one of the operating systems. The same logic applies to performance, stress, and load testing: the Mac OS tests take precedence over the Windows ones.
The VMM will be tested in a number of standard host configurations, including Mac OS (64-bit), Windows XP (32-bit), and Windows 7 (64-bit). The main testing will be performed on these hosts:
HostName: SDKat Graphic Unit; CPU: Intel Core i5; GPU: GeForce GT 430, 1024 MB, 128-bit; RAM: 4 GB
HostName: HP Elite 8100 small form factor; CPU: Intel Core i5-650 @ 3.2 GHz; GPU: Intel integrated graphics; RAM: 6 GB
HostName: Lenovo ThinkCentre A85 Tower; CPU: Intel Core i5-650 @ 3.20 GHz; GPU: Intel integrated graphics; RAM: 4 GB
HostName: MacBook Pro; CPU: Intel Core i7 dual-core @ 2.75 GHz; GPU: Intel HD Graphics 3000 with 384 MB of DDR3 SDRAM shared with main memory; RAM: 8 GB
The tests will be executed manually until parts of the testing process are automated. The automated tests will vary according to the host OS on which they are run.
Reporting for individual test sessions is done in qa-reports. Verticals might have additional reporting practices defined.
QA will aim to automate tests as much as possible. For benchmarking, various tools will be used, including OpenArena, SunSpider, the Phoronix Test Suite, AppTimer, and Sysbench. These will provide data for comparing the performance of the emulator when the VMM is used, and then with and without GL acceleration. Tools such as Perfmon and the Windows Debugger will also be used to monitor the module and capture any errors.
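As a rough sketch of how a comparable CPU number could be collected on each host, the snippet below drives Sysbench from Python. It assumes the classic pre-1.0 sysbench command line and that the report contains a "total time: ...s" line; both are assumptions about the tool version, not part of the documented test setup.

import re
import subprocess

def run_cpu_benchmark(max_prime=20000):
    """Run sysbench's CPU test and return the reported total time in seconds,
    so the same workload can be compared across hosts and VMM configurations."""
    cmd = ["sysbench", "--test=cpu", f"--cpu-max-prime={max_prime}", "run"]
    output = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    # Parsing assumes the classic "total time: <seconds>s" report line.
    match = re.search(r"total time:\s*([\d.]+)s", output)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    print("sysbench CPU total time (s):", run_cpu_benchmark())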
Check out the Test areas and types.
The basic functionality checklist can be categorized as below:
The Test areas and types page has more details and ideas for functionality testing.
The performance target is to produce a quality product with a performance that is competitive in the market. This is going to be achieved by first looking at the different areas that affect the performance of the emulator and then trying to measure these values to find out where performance needs to be improved. The focus areas might be:
Our goal is to have a fast, responsive, and stable emulator using the VMM on Mac and Windows.
Our goal is to have a reliable (stable) VMM module meeting the following targets:
The VMM should be available on any machine that runs Windows or Mac OS.