Augmented reality overlays computer-generated information on your view of the real world.
Picture holding an XO up to the sky, and the XO displaying the stars you would see there (if you weren't indoors, if the Sun were down, and so on). Or pointing an XO down toward the ground, and seeing through the Earth to the continents on the other side.
The Inertial navigation peripheral hack provides a compass, accelerometers, and a gyro. The gyro makes it too expensive to build into every XO, but the other sensors are relatively inexpensive. Here are some ideas for software that might be written for it.
See the stars
Wherever you point the XO, it shows that section of sky, even if you point it down.
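The core of this idea is mapping the XO's pointing direction (from the compass and accelerometers) to a patch of sky. A minimal sketch of the standard altitude/azimuth to equatorial conversion, assuming the orientation has already been reduced to an altitude and an azimuth (measured from north through east), and taking the local sidereal time as an input rather than computing it from the clock:

```python
import math

def alt_az_to_ra_dec(alt_deg, az_deg, lat_deg, lst_hours):
    """Convert a pointing direction (altitude/azimuth) into equatorial
    coordinates: right ascension in hours, declination in degrees.
    lat_deg is the observer's latitude; lst_hours is the local
    sidereal time, here assumed to be supplied by the caller."""
    alt = math.radians(alt_deg)
    az = math.radians(az_deg)
    lat = math.radians(lat_deg)

    # Declination from the spherical triangle (pole, zenith, target).
    sin_dec = math.sin(alt) * math.sin(lat) + \
              math.cos(alt) * math.cos(lat) * math.cos(az)
    dec = math.asin(sin_dec)

    # Hour angle via atan2 for the correct quadrant; the common
    # 1/cos(dec) factor cancels inside atan2.
    sin_ha = -math.sin(az) * math.cos(alt)
    cos_ha = (math.sin(alt) - math.sin(dec) * math.sin(lat)) / math.cos(lat)
    ha_hours = math.degrees(math.atan2(sin_ha, cos_ha)) / 15.0

    ra_hours = (lst_hours - ha_hours) % 24.0
    return ra_hours, math.degrees(dec)
```

A star catalog lookup around the returned (RA, Dec) then gives the stars to draw; in a real program the sidereal time would be derived from the system clock and longitude.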
See the Earth
When you point the XO horizontally or downwards, you see through the Earth: crust, core, and the geography on the other side.
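For the straight-down case, the far side of the view is simply the antipode of the observer's position. A minimal sketch (pointing at an angle would instead require intersecting the view ray with the sphere):

```python
def antipode(lat_deg, lon_deg):
    """Point on the Earth's surface diametrically opposite the
    observer -- what you would see looking straight down 'through'
    the planet. Longitude is kept in the range (-180, 180]."""
    anti_lat = -lat_deg
    anti_lon = lon_deg - 180.0 if lon_deg > 0 else lon_deg + 180.0
    return anti_lat, anti_lon

# e.g. from Boston (42.36 N, 71.06 W) the antipode lies in the
# Indian Ocean, southwest of Australia.
```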
Respecting relative head position
Rather than assume the user's eyes are a fixed distance from the XO, and centered on the screen, it would be nice to:
- determine how far the user's eyes are from the screen, so that when the eyes move closer to the screen, the program displays a larger angular slice of the scene.
- determine the vertical/horizontal offset of the user, so the user can change their angle of view through the XO "porthole" simply by shifting their head. There will be field-of-view limitations due to the XO's off-center camera.
Both can be done using OpenCV face tracking, apparently with acceptable performance.
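Given a face rectangle from OpenCV (e.g. `CascadeClassifier.detectMultiScale`), both quantities fall out of a pinhole-camera model. A sketch of just that geometry, where `focal_px` and `face_width_cm` are assumed calibration constants, not values from the source:

```python
def head_position(face, frame_w=640, frame_h=480,
                  focal_px=500.0, face_width_cm=14.0):
    """Estimate viewer distance and lateral offset from a detected
    face rectangle (x, y, w, h) in pixels. Pinhole model:
    distance = focal_length_px * real_width / pixel_width.
    focal_px and face_width_cm are assumed calibration values."""
    x, y, w, h = face
    distance_cm = focal_px * face_width_cm / w

    # Offset of the face center from the frame center, converted to
    # centimeters at the estimated distance.
    cx = x + w / 2.0
    cy = y + h / 2.0
    dx_cm = (cx - frame_w / 2.0) * distance_cm / focal_px
    dy_cm = (cy - frame_h / 2.0) * distance_cm / focal_px
    return distance_cm, dx_cm, dy_cm
```

As the face rectangle grows (eyes move closer), the estimated distance shrinks and the program would widen its angular slice; the offsets drive the porthole's viewing angle.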
See also
- Where is China? (or Antarctica, or...) It is down over there... Old sketch of looking down through the Earth from Boston.