Talk:Test cases 8.2.1

From OLPC

Notes from 2008-12-23

These are rough notes from a conversation with Ed McNierney (8.2.1 release manager) and Kim Quirk (VP of Testing & Support), with Frances Hopkins (OLPC QA for 8.2.0) and Michael Stone (8.2.0 release manager, 8.2.1 developer) joining in towards the end. They're probably incomprehensible to most people right now, but please ask if you have questions or want clarifications; I'm happy to edit. Mchua 16:45, 24 December 2008 (UTC)


Ticket testing
---------------
- Refers to "are the blockers for this release closed?"
- Important, and currently the only thing in the test plan
- There should be more kinds of testing, though (notably, smoke testing
  and also examining fixes to these tickets for areas of risk that we need
  to regression test against).

Connectivity testing
----------------------

- degrading connectivity: the quickest check is to look at the Neighborhood
  view on 2 XOs that should be visible to each other; if the two Neighborhoods
  look different, or strange things appear (Activities without anyone
  sharing them, XO-people gathered around a nonexistent Activity), it's a
  likely sign of connectivity degradation and further tests should be
  performed.
- when doing mesh testing, you want RF isolation; nothing else should show
  up on Neighborhood view.
- mesh testing is incredibly noisy (in the RF sense) because it's a multicast
  spew - for N laptops in a mesh, each laptop has to continuously ping N-1
  other laptops.
- there are two kinds of communications that create noise with XOs: "Are you
  there?" presence pings, and "We're sharing an activity" collaboration.
  (aka "what are you doing?" pings.)
- when laptops are in a simple mesh, presence noise and activity noise are
  both high, because each XO has to ping the other N-1 XOs for each kind
  of traffic.
- when laptops are on an AP, the presence noise goes down (each XO now has
  to keep track of 1 point - the AP - for presence) but collaboration
  (jabber) is still XO-to-XO so the noise is high.
- when laptops are on an XS, presence and collaboration are both done through
  the single point of the XS, so noise is low; this is why we can get
  many laptops collaborating on an XS.
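The three topologies above can be sketched as a back-of-the-envelope model. The per-topology link counts come straight from these notes (mesh: everyone pings everyone; AP: presence collapses to one link per XO but jabber stays XO-to-XO; XS: both go through the server); the function name and the idea of counting "links" are my own framing, not a measurement.

```python
def message_links(n, topology):
    """Rough count of presence + collaboration 'noise' links for n XOs.

    A crude model of the notes above, not real traffic numbers.
    """
    if topology == "mesh":
        presence = n * (n - 1)       # every XO pings every other XO
        collaboration = n * (n - 1)  # sharing is also XO-to-XO
    elif topology == "ap":
        presence = n                 # one link per XO, to the AP
        collaboration = n * (n - 1)  # jabber is still XO-to-XO
    elif topology == "xs":
        presence = n                 # both mediated by the school server
        collaboration = n
    else:
        raise ValueError(topology)
    return presence + collaboration

for n, topo in [(10, "mesh"), (20, "ap"), (50, "xs")]:
    print(topo, n, message_links(n, topo))
```

Under this crude model the 10-mesh / 20-AP / 50-XS pass criteria land in the same rough order of magnitude of noise, which is one way to see why an XS lets many more laptops collaborate.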

Connectivity tests pass if they work for these numbers:
10 laptops in a simple mesh
20 laptops on a single AP
50 laptops on a XS

- Noise notes: most non-XO laptops start trying to connect with 802.11a; XOs use 802.11b/g.
- XOs default to trying to come up on channel 1 when they're turned on. This means
  it is a bad idea to turn on 50 XOs at once (when trying to run the 50-XOs-on-an-XS
  test, for example), because they'll all come up multicasting on channel 1 and
  everyone around you attempting to connect to wifi will be rather upset by their
  newfound inability to do so. Bring XOs up in small groups (5 is a good number),
  switch each group to another channel, and connect it to an XS (stopping the
  multicast spew) before going on to the next group.
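The staggered bring-up above can be planned mechanically. The group size of 5 is from these notes; the specific alternate channels (6 and 11, the usual non-overlapping 2.4 GHz choices alongside 1) are my assumption, not an OLPC recommendation.

```python
def bringup_batches(xos, group_size=5, channels=(6, 11)):
    """Split a list of XO identifiers into small power-on batches.

    Per the notes: power on a handful of XOs at a time, move each
    batch off channel 1 (where XOs multicast by default) onto another
    channel, associate them with the XS, then start the next batch.
    """
    batches = []
    for i in range(0, len(xos), group_size):
        batch = xos[i:i + group_size]
        # Alternate channels between batches to spread the noise out.
        channel = channels[(i // group_size) % len(channels)]
        batches.append((channel, batch))
    return batches

# Hypothetical serials for a 50-laptop XS test:
plan = bringup_batches([f"XO-{n:02d}" for n in range(1, 51)])
print(len(plan))  # 50 XOs -> 10 batches of 5
```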

When running connectivity tests, it doesn't matter what AP (hardware) you ran them with,
so long as you document which AP you used.

Do make sure that WEP, WPA, WPA2, etc. work.

Connectivity testing is particularly intermittent. We need to create a standard for
how many times, and how often, connectivity tests have to pass before connectivity
is declared "good."
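Whatever standard gets decided on could be as simple as a pass-rate threshold over repeated runs. A hypothetical sketch; the 80% / 5-run numbers below are placeholders, since the actual thresholds are exactly what these notes say still needs deciding:

```python
def connectivity_verdict(results, min_pass_rate=0.8, min_runs=5):
    """Declare connectivity 'good' only if enough runs passed.

    `results` is one boolean per test run.  The thresholds are
    placeholders, not an agreed OLPC standard.
    """
    if len(results) < min_runs:
        return "inconclusive: need more runs"
    rate = sum(results) / len(results)
    return "good" if rate >= min_pass_rate else "bad"

print(connectivity_verdict([True, True, False, True, True]))  # 4/5 -> good
```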

Terminology
------------

Ed's definition of a beta release: it's done, but there are probably bugs.
In other words, nothing is known to be wrong, but there are probably problems
we don't yet know about.

firmware testing
-------------------
Must test all 4 combinations of:
signed vs. unsigned
secure vs. unsecure
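The four combinations above can be enumerated mechanically so none get skipped (the label strings are just for the checklist; "build"/"laptop" is my reading of which axis is which):

```python
from itertools import product

# Cross signed/unsigned firmware builds with secure/unsecure
# (developer-unlocked) laptops, as listed in the notes.
configs = list(product(["signed", "unsigned"], ["secure", "unsecure"]))
for build, laptop in configs:
    print(f"test {build} build on {laptop} laptop")
```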

Also look at upgrade paths for what is out there
(what are customers running? find out!)

improving the release testing process
-------------------------------------

- collect more test cases. 8.2.0 was the first time we did this at all.
- status reports: some easy "big green bar" indicator of how close we are
  to done, visible at all times.
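A "big green bar" could be as simple as a one-line text rendering of the fraction of release blockers closed. A hypothetical sketch of such an indicator (the numbers in the example are made up):

```python
def green_bar(closed, total, width=40):
    """Render a one-line 'how close are we to done' indicator.

    A sketch of the status display the notes ask for: the fraction
    of release-blocker tickets closed, drawn as a text bar.
    """
    done = 0 if total == 0 else closed / total
    filled = int(done * width)
    return f"[{'#' * filled}{'.' * (width - filled)}] {closed}/{total} ({done:.0%})"

print(green_bar(30, 40))
```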

community testers
--------------------

- enabling them is a lot of work (it's an investment).

Things they can do that 1cc can't:

- have many more man-hours than we do
- environmental permutations of both hardware and RF environment

Things 1cc can do that they can't easily (yet):

- access to large-scale testbeds (20, 50 XOs, an XS)
- access to deployments

acceptance testing
--------------------

Making deployment-specific test case stubs may be a good idea.
(In other words, a test that can only be run by a particular
deployment - say, Peru.)

We should try to get deployments to do this.
Highest priority: Uruguay.
Peru and Mongolia may be able to help. (Peru has a testbed,
Mongolia has an excellent and excited engineer.)
Oceania may be a good place to try grassroots acceptance testing.

Questions/Todo (for mchua)
------------------------------
- gather more todos from these notes
- how can you tell what channel an AP is on? (In general, I need to learn
  a lot more about how wireless works. --Mchua)
- set up an RF monitor. Reuben might have one.
- talk with Joe about testbed usage
- talk with Reuben about XS usage
- acceptance testing by deployments: have those conversations
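On the open question above about telling what channel an AP is on: Linux scan tools (e.g. `iwlist wlan0 scan` from wireless-tools; the interface name and tool availability on a given machine are assumptions) usually report the AP's frequency rather than its channel, but for 2.4 GHz 802.11b/g the mapping is fixed: channels 1-13 sit at 2412-2472 MHz in 5 MHz steps, with channel 14 as a special case at 2484 MHz.

```python
def channel_from_freq(freq_mhz):
    """Map a 2.4 GHz 802.11b/g center frequency (MHz) to its channel.

    Channels 1-13 are centered at 2407 + 5*N MHz; channel 14 is the
    odd one out at 2484 MHz.
    """
    if freq_mhz == 2484:
        return 14
    if 2412 <= freq_mhz <= 2472 and (freq_mhz - 2407) % 5 == 0:
        return (freq_mhz - 2407) // 5
    raise ValueError(f"not a 2.4 GHz channel frequency: {freq_mhz}")

print(channel_from_freq(2412))  # channel 1 (the XOs' default)
```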