GStreamer

GStreamer is our multimedia framework. It is a library that allows the construction of graphs of media-handling components, ranging from simple Ogg/Vorbis playback to complex audio (mixing) and video (non-linear editing) processing.

http://gstreamer.freedesktop.org/


Getting started programming

Getting started with GStreamer with Python: http://www.jonobacon.org/?p=750
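
The recipes on this page are written as gst-launch command lines, but the same pipeline strings can be driven from Python through the 0.10 bindings (pygst), which is how code such as glive.py uses them. The following is a minimal sketch, assuming GStreamer 0.10 and pygst are available (as on the XO): it parses a gst-launch style string, runs it in a GLib main loop, and stops on end-of-stream or error.

  #!/usr/bin/env python
  # Minimal sketch: run a gst-launch style pipeline from Python (pygst 0.10).
  import pygst
  pygst.require('0.10')
  import gst
  import gobject

  gobject.threads_init()
  loop = gobject.MainLoop()

  # Any pipeline string from this page can be dropped in here.
  pipeline = gst.parse_launch('videotestsrc num-buffers=100 ! ximagesink')

  bus = pipeline.get_bus()
  bus.add_signal_watch()
  bus.connect('message::eos', lambda bus, msg: loop.quit())
  bus.connect('message::error', lambda bus, msg: loop.quit())

  pipeline.set_state(gst.STATE_PLAYING)
  loop.run()
  pipeline.set_state(gst.STATE_NULL)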

Camera

  • You have to script v4l2 via GStreamer to capture a single frame from the camera. See sugar/shell/intro/glive.py for sample code (a minimal Python sketch also follows this list)
  • Try this from the command line: gst-launch-0.10 v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png
  • See also: Programming the camera
  • You can simulate this hardware via a file source in gstreamer
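
A sketch of the same single-frame capture from Python (this is not the glive.py code, just an illustration built on the gst-launch pipeline above; pngenc posts EOS after one frame because its snapshot property defaults to true, so the script simply waits for that on the bus):

  import pygst
  pygst.require('0.10')
  import gst

  def snapshot(filename='foo.png'):
      # Same pipeline as the gst-launch example above; pngenc's snapshot
      # property defaults to TRUE, so it posts EOS after a single frame.
      pipe = gst.parse_launch('v4l2src ! ffmpegcolorspace ! pngenc ! '
                              'filesink location=%s' % filename)
      pipe.set_state(gst.STATE_PLAYING)
      # Block until that frame has been written or an error occurs (-1 = no timeout).
      msg = pipe.get_bus().poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, -1)
      pipe.set_state(gst.STATE_NULL)
      return msg.type == gst.MESSAGE_EOS

  print 'captured foo.png:', snapshot()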

Camera-as-video-camera (v4l2)

  • The camera is a regular v4l2 device, available via gstreamer (gst module)
    • If you do not have the hardware, you can simulate the camera interactively with a regular v4l2 source (many cheap web-cams provide this type of source)
    • For test-driven development, you can use any gstreamer source (such as a regular file), hooking your code up to a file source instead of a v4l2src (see the sketch after this list)
  • Try this from a terminal in the developer console: gst-launch-0.10 v4l2src ! ximagesink (The image will appear behind the developer console window, so you'll need to move the window aside. It will also have strange colors, since you are displaying YCrCb data as though it were RGB. Use gst-launch-0.10 v4l2src ! ffmpegcolorspace ! ximagesink to get the right colors, at the cost of running more slowly.)
  • See also: Programming the camera
  • You can simulate this hardware via a file source in gstreamer
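
As a sketch of that source swap in Python (the use_camera flag and the preview pipeline are just illustrative choices, assuming pygst 0.10): the downstream part of the pipeline stays the same and only the source element changes.

  import pygst
  pygst.require('0.10')
  import gst

  def make_preview(use_camera):
      if use_camera:
          source = 'v4l2src'       # the real camera
      else:
          source = 'videotestsrc'  # synthetic test pattern, no hardware needed
      # A file source works too, e.g. 'filesrc location=test.ogg ! oggdemux ! theoradec'
      return gst.parse_launch(source + ' ! ffmpegcolorspace ! ximagesink')

  pipe = make_preview(use_camera=False)
  pipe.set_state(gst.STATE_PLAYING)
  raw_input('previewing, press Enter to stop ')
  pipe.set_state(gst.STATE_NULL)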


GStreamer 101

Examples

v4l2src can always be replaced by videotestsrc, and alsasrc by audiotestsrc.

Beginning of a normal video pipeline:

v4l2src ! queue ! videorate ! video/x-raw-yuv,framerate=15/1 ! videoscale ! video/x-raw-yuv,width=160,height=120 ! ...

Beginning of a normal audio pipeline:

alsasrc ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! ...

Video encoding:

... ! ffmpegcolorspace ! theoraenc ! ...

Audio encoding:

... ! audioconvert ! vorbisenc ! ...

Video output:

... ! ffmpegcolorspace ! videoscale ! ximagesink

Audio output:

... ! audioconvert ! alsasink sync=false

Stdout:

... ! fdsink fd=1
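
These fragments are meant to be snapped together. As an illustration, joining the "normal video pipeline" beginning to the video output fragment gives a small camera preview, here driven from Python (pygst 0.10 assumed; swap v4l2src for videotestsrc if no camera is attached):

  import pygst
  pygst.require('0.10')
  import gst

  VIDEO_IN = ('v4l2src ! queue ! videorate ! video/x-raw-yuv,framerate=15/1 ! '
              'videoscale ! video/x-raw-yuv,width=160,height=120')
  VIDEO_OUT = 'ffmpegcolorspace ! videoscale ! ximagesink'

  pipeline = gst.parse_launch(VIDEO_IN + ' ! ' + VIDEO_OUT)
  pipeline.set_state(gst.STATE_PLAYING)
  raw_input('previewing, press Enter to stop ')
  pipeline.set_state(gst.STATE_NULL)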

Encode video+audio as ogg:

v4l2src ! queue ! ffmpegcolorspace ! theoraenc ! queue ! \
  oggmux name=mux  alsasrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! ...
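
The trailing "..." above is whichever sink the muxed stream should go to. Here is a sketch that finishes it off with a filesink and drives it from Python; the record.ogg file name and the ten-second duration are just assumptions for the example. Sending an EOS event into the pipeline before shutting down is the usual way to let oggmux finalise the file (recent 0.10 releases support this on live sources).

  import pygst
  pygst.require('0.10')
  import gst
  import gobject

  gobject.threads_init()
  loop = gobject.MainLoop()

  pipeline = gst.parse_launch(
      'v4l2src ! queue ! ffmpegcolorspace ! theoraenc ! queue ! '
      'oggmux name=mux  '
      'alsasrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. '
      'mux. ! queue ! filesink location=record.ogg')

  bus = pipeline.get_bus()
  bus.add_signal_watch()
  bus.connect('message::eos', lambda bus, msg: loop.quit())
  bus.connect('message::error', lambda bus, msg: loop.quit())

  def stop_recording():
      # Ask the sources to stop; the EOS drains through the muxer so the
      # Ogg file gets a proper end, then the bus handler quits the loop.
      pipeline.send_event(gst.event_new_eos())
      return False                               # one-shot timeout

  gobject.timeout_add(10000, stop_recording)     # record for ~10 seconds
  pipeline.set_state(gst.STATE_PLAYING)
  loop.run()
  pipeline.set_state(gst.STATE_NULL)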

A long version of videotestsrc ! ximagesink:

videotestsrc ! theoraenc ! oggmux ! oggdemux ! theoradec ! ffmpegcolorspace ! videoscale ! ximagesink

A long version of videotestsrc ! ximagesink & audiotestsrc ! alsasink:

videotestsrc ! ffmpegcolorspace ! theoraenc ! queue ! \
  oggmux name=mux audiotestsrc ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! \
  oggdemux name=demux ! queue ! theoradec ! ffmpegcolorspace ! \
    ximagesink demux. ! queue ! vorbisdec ! audioconvert ! alsasink
This hangs on current fc6. MitchellNCharity 17:43, 12 June 2007 (EDT)

Live video streaming to an icecast server:

v4l2src ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=320,height=240 ! theoraenc quality=16 ! oggmux ! shout2send ip=192.168.1.100 port=8000 password=hackme mount=olpc.ogg
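
The same pipeline can be built from Python with the server details applied as element properties instead of being baked into the string; the values below are just the example ones from the command line above (pygst 0.10 assumed):

  import pygst
  pygst.require('0.10')
  import gst

  pipeline = gst.parse_launch(
      'v4l2src ! ffmpegcolorspace ! videoscale ! '
      'video/x-raw-yuv,width=320,height=240 ! '
      'theoraenc quality=16 ! oggmux ! shout2send name=stream')

  # Configure the icecast connection on the named shout2send element.
  stream = pipeline.get_by_name('stream')
  stream.set_property('ip', '192.168.1.100')
  stream.set_property('port', 8000)
  stream.set_property('password', 'hackme')
  stream.set_property('mount', 'olpc.ogg')

  pipeline.set_state(gst.STATE_PLAYING)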

Live streaming to an icecast server:

v4l2src ! queue ! ffmpegcolorspace ! theoraenc quality=16 ! queue ! oggmux name=mux  alsasrc ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! queue ! audioconvert ! vorbisenc ! queue ! mux. mux. ! \
  queue ! shout2send ip=... port=... password=... mount=/whatever.ogg

Notes

video size, framerate, and theoraenc quality

audio encoding

glive.py used wav rather than vorbis. Why?

Elements

Adapters: ffmpegcolorspace (video colorspace conversion), audioconvert (audio format conversion)

Video characteristics: videorate, videoscale, plus video/x-raw-yuv caps filters for framerate, width and height

Audio characteristics: audio/x-raw-int caps filters for rate, channels and depth

File I/O: filesrc, filesink, fdsink

Video sources: v4l2src (camera), videotestsrc (test pattern)

Audio sources: alsasrc (microphone), audiotestsrc (test tone)

Coding video: theoraenc, theoradec, pngenc (single-frame snapshots)

Coding audio: vorbisenc, vorbisdec

Wrapping: oggmux, oggdemux

Outputting video: ximagesink

Outputting audio: alsasink

Not sure about:

Sending ogg to an icecast streaming server:

... ! shout2send ip=... port=... password=... mount=/whatever.ogg
gst-launch ... ! fdsink | oggfwd host port password mount
Regrettably, it appears fc6, and thus olpc, does not include shout2send in gst-plugins-good-plugins. Nor does it have rpms for oggfwd, or other possible alternatives. You can grab a random binary of oggfwd from its site, or compile shout2send from source. :( MitchellNCharity 17:43, 12 June 2007 (EDT)
Correction, shout2send is on the olpc. oggfwd is not. --Damonkohler 23:14, 13 September 2007 (EDT)

Doing live streaming video from an XO

Section TODO

  • Fill in blank sections.
  • Include snapshot recipes.
  • Discuss python interfacing.

See also

Other resources