GStreamer

GStreamer is our multimedia framework library.

Introduction

GStreamer is a library that supports multimedia, ranging from simple audio/video playback of Ogg/Vorbis files to complex audio mixing and video processing.
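
From Python, the same kind of playback can be driven through the gst module. The following is a minimal sketch, assuming the pygst 0.10 bindings (the gst module referred to below) and a placeholder file path; playbin builds the whole decode pipeline itself.

  # Minimal Ogg/Vorbis playback sketch (assumes pygst 0.10; the path is a placeholder)
  import gobject
  import pygst
  pygst.require("0.10")
  import gst

  gobject.threads_init()

  def on_message(bus, message, loop):
      # stop when playback finishes or an error is posted on the bus
      if message.type in (gst.MESSAGE_EOS, gst.MESSAGE_ERROR):
          loop.quit()

  player = gst.element_factory_make("playbin", "player")
  player.set_property("uri", "file:///home/olpc/sample.ogg")

  loop = gobject.MainLoop()
  bus = player.get_bus()
  bus.add_signal_watch()
  bus.connect("message", on_message, loop)

  player.set_state(gst.STATE_PLAYING)
  loop.run()
  player.set_state(gst.STATE_NULL)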

Using gst at the command line

Expert users can run GStreamer commands similar to those below in a Terminal Activity to do things like

  • play audio files in the background
  • stream XO video to other servers

The rest of this page is oriented towards developers using GStreamer.

Home page: http://gstreamer.freedesktop.org/

Getting started programming

Getting started with GStreamer with Python: http://www.jonobacon.org/?p=750

Recipes

Camera

  • You have to script v4l2 via GStreamer to capture a single frame from the camera; see sugar/shell/intro/glive.py for sample code, and the Python sketch after this list
  • Try this from the command line: gst-launch-0.10 v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png
  • See also: Programming the camera
  • You can simulate this hardware via a file source in gstreamer
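
Here is a rough Python version of the single-frame recipe, assuming the pygst 0.10 bindings and a working v4l2 camera; it runs the same pipeline string shown above and waits for the end-of-stream message (pngenc normally stops after the first frame), writing foo.png.

  # Single-frame camera capture sketch (assumes pygst 0.10)
  import gobject
  import pygst
  pygst.require("0.10")
  import gst

  gobject.threads_init()
  pipeline = gst.parse_launch(
      "v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png")

  loop = gobject.MainLoop()
  bus = pipeline.get_bus()
  bus.add_signal_watch()

  def on_message(bus, message):
      # quit once the frame has been written (EOS) or something goes wrong
      if message.type in (gst.MESSAGE_EOS, gst.MESSAGE_ERROR):
          loop.quit()

  bus.connect("message", on_message)
  pipeline.set_state(gst.STATE_PLAYING)
  loop.run()
  pipeline.set_state(gst.STATE_NULL)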

Camera-as-video-camera (v4l2)

  • The camera is a regular v4l2 device, available via GStreamer (the gst module)
    • If you do not have hardware, you can simulate the camera interactively with a regular v4l2 source (many cheap web-cams provide this type of source)
    • For test-driven development, you can use any GStreamer source (such as a regular file), hooking your code up to use a file source instead of a v4l2src; see the Python sketch after this list
  • Try this from a terminal in the developer console: gst-launch-0.10 v4l2src ! ximagesink (The image will appear behind the developer console window so you'll need to move the window aside. In addition, it will have strange colors, since you are displaying the YCrCb colorspace as though it is RGB data. Use gst-launch-0.10 v4l2src ! ffmpegcolorspace ! ximagesink to run slowly and get the right colors.)
  • See also: Programming the camera
  • You can simulate this hardware via a file source in gstreamer
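
For the test-driven-development point above, a sketch like the following lets the same code run against the camera, a test source, or a file-based source. It assumes pygst 0.10; make_display_pipeline is just an illustrative name, and the element names come from the recipes on this page.

  # Swap the camera for a test source when no hardware is available (assumes pygst 0.10)
  import pygst
  pygst.require("0.10")
  import gst

  def make_display_pipeline(source="v4l2src"):
      # source can be "v4l2src", "videotestsrc", or a filesrc-based description
      return gst.parse_launch("%s ! ffmpegcolorspace ! ximagesink" % source)

  pipeline = make_display_pipeline("videotestsrc")   # use "v4l2src" on real hardware
  pipeline.set_state(gst.STATE_PLAYING)
  raw_input("Press Enter to stop...")
  pipeline.set_state(gst.STATE_NULL)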


GStreamer 101

Examples

v4l2src can always be replaced by videotestsrc, and alsasrc by audiotestsrc.

Beginning of a normal video pipeline:

v4l2src ! queue ! videorate ! video/x-raw-yuv,framerate=15/1 ! videoscale ! video/x-raw-yuv,width=160,height=120 ! ...

Beginning of a normal audio pipeline:

alsasrc ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! ...

Video encoding:

... ! ffmpegcolorspace ! theoraenc ! ...

Audio encoding:

... ! audioconvert ! vorbisenc ! ...

Video output:

... ! ffmpegcolorspace ! videoscale ! ximagesink

Audio output:

... ! audioconvert ! alsasink sync=false

Stdout:

... ! fdsink fd=1

Encode video+audio as ogg:

v4l2src ! queue ! ffmpegcolorspace ! theoraenc ! queue ! \
  oggmux name=mux  alsasrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! ...
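
The same recording pipeline can be run from Python. The sketch below assumes pygst 0.10 and fills in the trailing "..." with filesink location=out.ogg as one possible ending; it records for about ten seconds and then sends EOS so oggmux can write a complete file before the pipeline is shut down.

  # Record camera + microphone to out.ogg for ~10 seconds (assumes pygst 0.10)
  import time
  import pygst
  pygst.require("0.10")
  import gst

  pipeline = gst.parse_launch(
      "v4l2src ! queue ! ffmpegcolorspace ! theoraenc ! queue ! oggmux name=mux "
      "alsasrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. "
      "mux. ! queue ! filesink location=out.ogg")

  pipeline.set_state(gst.STATE_PLAYING)
  time.sleep(10)

  # ask the sources to finish, then wait for EOS so the ogg file is finalized
  pipeline.send_event(gst.event_new_eos())
  bus = pipeline.get_bus()
  bus.poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, gst.CLOCK_TIME_NONE)
  pipeline.set_state(gst.STATE_NULL)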

A long version of videotestsrc ! ximagesink:

videotestsrc ! theoraenc ! oggmux ! oggdemux ! theoradec ! ffmpegcolorspace ! videoscale ! ximagesink

A long version of videotestsrc ! ximagesink & audiotestsrc ! alsasink:

videotestsrc ! ffmpegcolorspace ! theoraenc ! queue ! \
  oggmux name=mux audiotestsrc ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! \
  oggdemux name=demux ! queue ! theoradec ! ffmpegcolorspace ! \
    ximagesink demux. ! queue ! vorbisdec ! audioconvert ! alsasink
Hangs on current fc6. MitchellNCharity 17:43, 12 June 2007 (EDT)

Live video streaming to an icecast server:

v4l2src ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=320,height=240 ! theoraenc quality=16 ! oggmux ! shout2send ip=192.168.1.100 port=8000 password=hackme mount=olpc.ogg

Live streaming to an icecast server:

v4l2src ! queue ! ffmpegcolorspace ! theoraenc quality=16 ! queue ! oggmux name=mux  alsasrc ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! queue ! audioconvert ! vorbisenc ! queue ! mux. mux. ! \
  queue ! shout2send ip=... port=... password=... mount=/whatever.ogg

Time lapse photography (1 frame every 30 seconds):

gst-launch v4l2src ! videorate ! video/x-raw-yuv,width=640,height=480,framerate=\(fraction\)1/30 ! ffmpegcolorspace ! jpegenc ! multipartmux ! filesink location=lapse1.mjpeg

Press Ctrl-C to stop.

Playback:

gst-launch filesrc location=lapse1.mjpeg ! multipartdemux ! jpegdec ! autovideosink

Notes

Colorspace

The raw video coming out of v4l2src is in the YUYV color format. That is, there are 4 bytes per 2 pixels. The first 8-bit Y is the luma (brightness) of the first pixel, the second 8-bit Y is the luma of the second, the 8-bit U is the blue chrominance of both pixels, and the 8-bit V is the red chrominance of both.
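
As a concrete illustration of that layout, the small Python 2 sketch below (the sample bytes are made up) splits a YUYV buffer into per-pixel (Y, U, V) triples: each 4-byte group Y0 U Y1 V describes two pixels that share one U and one V sample.

  # Unpack YUYV bytes into (Y, U, V) per pixel; 'data' is raw frame bytes from v4l2src
  def yuyv_pixels(data):
      pixels = []
      for i in range(0, len(data) - 3, 4):
          y0, u, y1, v = [ord(c) for c in data[i:i+4]]
          pixels.append((y0, u, v))   # first pixel: its own luma, shared chroma
          pixels.append((y1, u, v))   # second pixel
      return pixels

  # two pixels: Y0=16 (dark), Y1=235 (bright), neutral chroma U=V=128
  print yuyv_pixels("\x10\x80\xeb\x80")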

Video size, framerate, and theoraenc quality

Audio encoding

glive.py used wav rather than vorbis. Why?

Elements

The elements below are the ones used in the examples on this page, grouped by role.

Adapters: ffmpegcolorspace, audioconvert, videoscale, videorate

Video characteristics: caps such as video/x-raw-yuv,width=...,height=...,framerate=...

Audio characteristics: caps such as audio/x-raw-int,rate=...,channels=...,depth=...

File I/O: filesrc, filesink, fdsink

Video sources: v4l2src, videotestsrc

Audio sources: alsasrc, audiotestsrc

Coding video: theoraenc/theoradec, jpegenc/jpegdec, smokeenc/smokedec, pngenc

Coding audio: vorbisenc/vorbisdec, speexenc/speexdec

Wrapping: oggmux/oggdemux, multipartmux/multipartdemux

Outputting video: ximagesink, autovideosink

Outputting audio: alsasink

Not sure about:

Sending ogg to an icecast streaming server:

... ! shout2send ip=... port=... password=... mount=/whatever.ogg
gst-launch ... ! fdsink | oggfwd host port password mount
Regrettably, it appears fc6, and thus the OLPC build, does not include shout2send in gst-plugins-good-plugins. Nor does it have RPMs for oggfwd, or other possible alternatives. You can grab a random binary of oggfwd from its site, or compile shout2send from source. :( MitchellNCharity 17:43, 12 June 2007 (EDT)
Correction: shout2send is on the OLPC; oggfwd is not. --Damonkohler 23:14, 13 September 2007 (EDT)

Doing live video streaming from an XO

Method 1

XO sending video:

gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)5/1 ! ffmpegcolorspace ! jpegenc ! multipartmux ! tcpserversink host=192.168.2.1 port=5000

XO receiving video:

gst-launch tcpclientsrc host=192.168.2.2 port=5001 ! multipartdemux ! jpegdec ! autovideosink

Method 2

XO sending video:

gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)2/1 ! ffmpegcolorspace ! smokeenc keyframe=8 qmax=40 ! udpsink host=192.168.1.1 port=5000

XO receiving video:

gst-launch udpsrc port=5000 ! smokedec ! autovideosink

Method 3: Video+Audio

Sending XO:

gst-launch v4l2src ! queue ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)4/1 ! videorate ! videoscale ! ffmpegcolorspace ! queue ! smokeenc ! queue ! udpsink host=192.168.2.129 port=5000 alsasrc ! queue ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! audioconvert ! speexenc ! queue ! tcpserversink host=192.168.2.129 port=5001

Receiving XO:

gst-launch-0.10 udpsrc port=5000 ! queue ! smokedec ! queue ! autovideosink tcpclientsrc host=192.168.2.129 port=5001 ! queue ! speexdec ! queue ! alsasink sync=false

Section TODO

  • Fill in blank sections.
  • Include snapshot recipes.
  • Discuss python interfacing.


See also

Other resources

Newer package

Some new RPMs that may fix GStreamer codec and playback issues: http://people.collabora.co.uk/~daf/olpc-gst/

Needs testing! Sj talk: Which version, when? Please date comments like this.