Testing

  This page is monitored by the OLPC team.

Test Strategy

Many people and organizations are helping with the test effort for the XO, network communications, the server, performance, and system testing. This page summarizes all the test efforts and links to more details wherever possible.

The open source community provides a good test effort for much of the open source code. Our local test plans therefore need to concentrate on areas that are difficult or impossible for the community to cover because they require a physical XO, school server, or other OLPC-specific equipment.

Trac is used to track bugs found during testing. We currently don't have a test case management system, but we may want to implement one as well.


Release Criteria for Trial-2, early July:

The release criteria for Trial-2 software are based on meeting the expectations of as many as 2,000 children and teachers who will use OLPC products to evaluate whether this is the right product for their school or country.

With that in mind, we need to focus on a few features that demonstrate 'explore', 'express', 'communicate', and 'collaborate' with good quality. This translates into prioritizing the following activities:

  1. Writing (AbiWord)
  2. Drawing (Paint)
  3. Video create (Camera)
  4. Programming (eToys)
  5. Browser (Web)

Collaboration and communication should work across these activities.

See Release Criteria Trial-2 for the full criteria.


Test plans/pages:

  1. Quanta HW test plan - One-time test per build. This test is aimed at environmental conditions (thermal, humidity, altitude, shake).
  2. Quanta SW test plan - One-time test per SW release to Quanta. This test is aimed at basic XO functionality.
  3. OLPC Test Plan - One-time test per build. Written for B2, to be modified for B3 so as not to overlap Tinderbox or Quanta testing. Detailed manual testing for the camera, keyboard, USB, mouse pad, other devices, and power management.
  4. Tinderbox Testing - Regression testing after each new SW build. This test is aimed at boot-up, X drivers, low-level device testing, networking, CPU-level performance, and power management.
  5. User Stories Testing - One-time test per SW release. This is system-level testing based on real user stories.