Programming the camera


This page explores how to interact with the laptop's built-in video camera.

Getting started

First, let's see the quickest way we can capture a still image from the camera--using a GStreamer command-line tool:

 gst-launch-0.10 v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png

You'll need to run the above command in either a terminal in the developer console (alt-=) or one of the virtual terminals (e.g. ctrl-alt-f1). Note that this means you don't need the Sugar GUI to be running in order to access the camera.

You can view the PNG image created as a result of the command in the Web activity.
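If you'd rather drive this from a script than type it by hand, you can launch the same pipeline from Python with the standard subprocess module (we'll meet the real Python GStreamer bindings later). This is just a sketch; it assumes gst-launch-0.10 is on your PATH, as it is on the laptop.

```python
import subprocess

# The same still-capture pipeline, written as an argument list.
# Assumes gst-launch-0.10 is on the PATH (as it is on the XO).
capture_cmd = [
    "gst-launch-0.10",
    "v4l2src", "!",
    "ffmpegcolorspace", "!",
    "pngenc", "!",
    "filesink", "location=foo.png",
]

def capture_still(cmd=capture_cmd):
    """Run the pipeline; returns True if gst-launch exited cleanly."""
    return subprocess.call(cmd) == 0

# On the laptop you would then call:
#   capture_still()
# and find foo.png in the current directory.
```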

Now, let's try and get some video on the screen:

 gst-launch-0.10 v4l2src ! ximagesink

Unlike the first command, this one will only work when executed in a terminal in the developer console. The resulting video will appear behind the developer console window, so you'll need to move the console window aside to see it.

Since you've now had your first hit of "ooo, shiny" moving pictures let's take a look at what's happening behind the scenes.

What just happened?

I'm going to assume you have a passing familiarity with GStreamer; if not, you could go read about it. Basically, it's a series of pipes you can throw multimedia data down to get something in a file or on screen at the end. The data starts at a source (src), ends up in a sink (sink), and can go through a number of intermediate manipulations along the way.

While we will get to using the camera from Python eventually, we've started out with using the GStreamer command line tool gst-launch. The gst-launch tool is a quick way to experiment with putting a pipeline together and seeing what it does.

Let's take a look at that first command line again:

 gst-launch-0.10 v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png

The camera in an XO laptop is a regular Video4Linux 2 device, which is accessed via GStreamer's v4l2src source--see, that wasn't just a pile of random characters my cat threw up. Since the camera is our source, it's the first item in our pipeline--notice that the individual parts of the pipeline are separated with ! characters. (You could say the data goes out with a bang, but it'd be a pretty bad joke.)
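To make the "separated with !" idea concrete, here's a tiny hypothetical Python helper (made up for this page, not part of GStreamer) that assembles a pipeline description from its elements, exactly the way you'd type it into gst-launch:

```python
def build_pipeline(*elements):
    """Join pipeline elements with GStreamer's ! separator."""
    return " ! ".join(elements)

# Rebuild the still-capture pipeline description from this page:
desc = build_pipeline("v4l2src", "ffmpegcolorspace", "pngenc",
                      "filesink location=foo.png")
# desc == "v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png"
```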

Next, let's skip ahead to the end of the pipeline--the sink end--where we find filesink, which simply outputs some data to a particular file. The name of the file (in our case foo.png) is provided by specifying location=foo.png—this is an example of how to supply arguments to the individual items in the pipeline. (Note: if you try to use name= it won't work! The name parameter is for referring to the element in the pipeline, not the name of the destination file.)
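The property=value convention is easy to generate too. As a sketch (again, a made-up helper for illustration, not GStreamer API), an element with its arguments is just the element name followed by space-separated key=value pairs:

```python
def element(name, **props):
    """Render a gst-launch element with its property=value arguments."""
    parts = [name] + ["%s=%s" % (k, v) for k, v in sorted(props.items())]
    return " ".join(parts)

sink = element("filesink", location="foo.png")
# sink == "filesink location=foo.png"
```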

As you can probably guess, the pngenc plugin is in the pipeline to convert the data from the video camera into the PNG file format before it is written to the file.

The only other item in the pipeline is the delightfully named ffmpegcolorspace plugin, which performs colorspace conversions. Essentially, the v4l2src and pngenc plugins can't talk to each other directly because they each describe images in different ways; it's the job of ffmpegcolorspace to enable them to communicate by translating between the two styles of image description.
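One way to picture what ffmpegcolorspace is doing: each element advertises the image formats it can handle, and two elements can only link directly when their formats overlap. A converter bridges the gap by speaking both sides' formats. The toy model below is pure illustration (the format names are simplified guesses, and real GStreamer caps negotiation is far richer):

```python
# Simplified, assumed format sets -- not the real caps strings.
V4L2SRC_FORMATS = {"video/x-raw-yuv"}   # roughly what the camera produces
PNGENC_FORMATS = {"video/x-raw-rgb"}    # roughly what the PNG encoder accepts

def can_link(src_formats, sink_formats):
    """Two elements can link directly only if their formats overlap."""
    return bool(src_formats & sink_formats)

# A direct link fails; a converter that handles both sides fixes it.
CONVERTER_FORMATS = V4L2SRC_FORMATS | PNGENC_FORMATS
assert not can_link(V4L2SRC_FORMATS, PNGENC_FORMATS)
assert can_link(V4L2SRC_FORMATS, CONVERTER_FORMATS)
assert can_link(CONVERTER_FORMATS, PNGENC_FORMATS)
```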

More to come...

Other resources

  • See sugar/shell/intro/ for sample code [1]

Note: The author is learning this as he goes along, so items described may not be the best or correct way to do anything. And he might get bored with writing this and stop halfway through.