GStreamer/Developers
The rest of this page is oriented towards developers using GStreamer.
Getting started programming
Getting started with GStreamer with Python
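From Python, the same gst-launch pipeline descriptions used in the recipes below can be driven through the 0.10 pygst bindings. A minimal sketch, assuming gst-python 0.10 and a working X display (the pipeline string is only an example):

#!/usr/bin/env python
# Minimal pygst (GStreamer 0.10) example: build a pipeline from a
# gst-launch style description and run it under a GLib main loop.
import gobject, pygst
pygst.require("0.10")
import gst

gobject.threads_init()
pipeline = gst.parse_launch("videotestsrc ! ffmpegcolorspace ! ximagesink")

def on_message(bus, message, loop):
    # Stop on end-of-stream or error.
    if message.type in (gst.MESSAGE_EOS, gst.MESSAGE_ERROR):
        loop.quit()

loop = gobject.MainLoop()
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", on_message, loop)

pipeline.set_state(gst.STATE_PLAYING)
try:
    loop.run()
finally:
    pipeline.set_state(gst.STATE_NULL)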
Recipes
Camera
- You have to script v4l2 via GStreamer to capture a single frame from the camera. See sugar/shell/intro/glive.py for sample code, and the sketch after this list
- Try this from the command line: gst-launch-0.10 v4l2src ! ffmpegcolorspace ! pngenc ! filesink location=foo.png
- See also: Programming the camera
- You can simulate this hardware via a file source in gstreamer
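A pygst sketch of the single-frame recipe above, assuming the GStreamer 0.10 Python bindings and a working v4l2 camera (num-buffers=1 is used here rather than relying on pngenc defaults):

# Grab one frame from the camera and write it to foo.png (pygst 0.10 sketch).
import pygst
pygst.require("0.10")
import gst

# num-buffers=1 makes v4l2src push a single frame and then send EOS.
pipeline = gst.parse_launch(
    "v4l2src num-buffers=1 ! ffmpegcolorspace ! pngenc ! "
    "filesink location=foo.png")

pipeline.set_state(gst.STATE_PLAYING)
# Block until the pipeline reports end-of-stream or an error.
pipeline.get_bus().poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, gst.CLOCK_TIME_NONE)
pipeline.set_state(gst.STATE_NULL)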
Camera-as-video-camera (v4l2)
- The camera is a regular v4l2 device, available via GStreamer (the gst module)
- If you do not have hardware, you can simulate the camera interactively with a regular v4l2 source (many cheap web-cams provide this type of source)
- For test-driven development, you can use any GStreamer source (such as a regular file), hooking up your code to use a file source instead of a v4l2src (see the sketch after this list)
- Try this from a terminal in the developer console: gst-launch-0.10 v4l2src ! ximagesink (The image may have strange colors, since you are displaying YCrCb data as though it were RGB. Use gst-launch-0.10 v4l2src ! ffmpegcolorspace ! ximagesink to get the right colors, at the cost of some speed.)
- See also: Programming the camera
- You can simulate this hardware via a file source in gstreamer
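A sketch of that source swap, so the same code can run against the camera or against a simulated source in tests (pygst 0.10; the helper name and the videotestsrc stand-in are illustrative):

# Build the same display pipeline against real hardware or a simulated
# source, so tests can run without a camera.
import pygst
pygst.require("0.10")
import gst

def make_pipeline(simulate=False):
    # videotestsrc stands in for v4l2src; a filesrc plus decoder works too.
    source = "videotestsrc" if simulate else "v4l2src"
    return gst.parse_launch(source + " ! ffmpegcolorspace ! ximagesink")

pipeline = make_pipeline(simulate=True)
pipeline.set_state(gst.STATE_PLAYING)
# Run until an error (or Ctrl-C); videotestsrc never ends on its own.
pipeline.get_bus().poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, gst.CLOCK_TIME_NONE)
pipeline.set_state(gst.STATE_NULL)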
GStreamer 101
Examples
v4l2src can always be replaced by videotestsrc, and alsasrc by audiotestsrc.
Beginning of a normal video pipeline:
v4l2src ! queue ! videorate ! video/x-raw-yuv,framerate=15/1 ! videoscale ! video/x-raw-yuv,width=160,height=120 ! ...
Beginning of a normal audio pipeline:
alsasrc ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! ...
Video encoding:
... ! ffmpegcolorspace ! theoraenc ! ...
Audio encoding:
... ! audioconvert ! vorbisenc ! ...
Video output:
... ! ffmpegcolorspace ! videoscale ! ximagesink
Audio output:
... ! audioconvert ! alsasink sync=false
Stdout:
... ! fdsink fd=1
Encode video+audio as ogg:
v4l2src ! queue ! ffmpegcolorspace ! theoraenc ! queue ! oggmux name=mux alsasrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! ...
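When a mux pipeline like this is driven from a script, send an end-of-stream before shutting down so oggmux can write its trailing pages. A pygst 0.10 sketch, assuming the pipeline is terminated with a filesink (the file name and the fixed ten-second recording are illustrative):

# Record camera and microphone to an Ogg file, then finalize it cleanly.
import time
import pygst
pygst.require("0.10")
import gst

pipeline = gst.parse_launch(
    "v4l2src ! queue ! ffmpegcolorspace ! theoraenc ! queue ! oggmux name=mux "
    "alsasrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. "
    "mux. ! queue ! filesink location=capture.ogg")

pipeline.set_state(gst.STATE_PLAYING)
time.sleep(10)  # record for ten seconds

# Push EOS through the pipeline so oggmux finishes the file, then wait
# for the EOS message to reach the bus before tearing down.
pipeline.send_event(gst.event_new_eos())
pipeline.get_bus().poll(gst.MESSAGE_EOS | gst.MESSAGE_ERROR, gst.CLOCK_TIME_NONE)
pipeline.set_state(gst.STATE_NULL)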
A long version of videotestsrc ! ximagesink:
videotestsrc ! theoraenc ! oggmux ! oggdemux ! theoradec ! ffmpegcolorspace ! videoscale ! ximagesink
A long version of videotestsrc ! ximagesink & audiotestsrc ! alsasink:
videotestsrc ! ffmpegcolorspace ! theoraenc ! queue ! oggmux name=mux audiotestsrc ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! oggdemux name=demux ! queue ! theoradec ! ffmpegcolorspace ! ximagesink demux. ! queue ! vorbisdec ! audioconvert ! alsasink
- Hangs on current fc6. MitchellNCharity 17:43, 12 June 2007 (EDT)
Live video streaming to an icecast server:
v4l2src ! ffmpegcolorspace ! videoscale ! video/x-raw-yuv,width=320,height=240 ! theoraenc quality=16 ! oggmux ! shout2send ip=192.168.1.100 port=8000 password=hackme mount=olpc.ogg
Live streaming to an icecast server:
v4l2src ! queue ! ffmpegcolorspace ! theoraenc quality=16 ! queue ! oggmux name=mux alsasrc ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! queue ! audioconvert ! vorbisenc ! queue ! mux. mux. ! queue ! shout2send ip=... port=... password=... mount=/whatever.ogg
Time lapse photography (1 frame every 30 seconds):
gst-launch v4l2src ! videorate ! video/x-raw-yuv,width=640,height=480,framerate=\(fraction\)1/30 ! ffmpegcolorspace ! jpegenc ! multipartmux ! filesink location=lapse1.mjpeg
Press Ctrl-C to stop.
Playback:
gst-launch filesrc location=lapse1.mjpeg ! multipartdemux ! jpegdec ! ffmpegcolorspace ! autovideosink
Notes
Colorspace
The video coming raw out of v4l2src is color formatted YUYV. That is, there are 4 bytes per 2 pixels. The first 8 bit Y represents the luma (brightness) of the first pixel, the second 8 bit Y is the brightness of the second, the 8 bit U is the blue chrominance of both pixels, and the 8 bit V is the red chrominance of both.
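A small Python sketch of that packing, with no GStreamer involved; it unpacks each 4-byte YUYV group into two (Y, U, V) pixels (the helper is illustrative and assumes a Python 2 byte string):

# Unpack a raw YUYV buffer: every 4 bytes describe 2 pixels as Y0 U Y1 V.
def unpack_yuyv(data, width, height):
    pixels = []
    for i in range(0, width * height * 2, 4):
        y0, u, y1, v = [ord(c) for c in data[i:i + 4]]
        pixels.append((y0, u, v))  # first pixel: its own luma, shared chroma
        pixels.append((y1, u, v))  # second pixel: its own luma, shared chroma
    return pixels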
Video size, framerate, and theoraenc quality
Audio encoding
glive.py used wav rather than vorbis. Why?
Elements
Adapters:
- tee Split a stream into multiple branches.
- queue Buffer data and decouple pipeline threads.
- videoscale Convert video size.
- videorate Convert video rate.
- ffmpegcolorspace Convert video colorspace.
- audioconvert Convert audio format.
Video characteristics:
Audio characteristics:
File I/O:
Video sources:
Audio sources:
Coding video:
Coding audio:
Wrapping:
- oggmux (broken, MitchellNCharity 11:44, 13 June 2007 (EDT))
- oggdemux
Outputting video:
Outputting audio:
Not sure about:
Sending ogg to an icecast streaming server:
... ! shout2send ip=... port=... password=... mount=/whatever.ogg
gst-launch ... ! fdsink | oggfwd host port password mount
- Regrettably, it appears fc6, and thus olpc, does not include shout2send in gst-plugins-good-plugins. Nor does it have rpms for oggfwd, or other possible alternatives. You can grab a random binary of oggfwd from its site, or compile shout2send from source. :( MitchellNCharity 17:43, 12 June 2007 (EDT)
- Correction, shout2send is on the olpc. oggfwd is not. --Damonkohler 23:14, 13 September 2007 (EDT)
Live video streaming from an XO
Method 1
XO sending video:
gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)5/1 ! ffmpegcolorspace ! jpegenc ! multipartmux ! tcpserversink host=192.168.2.1 port=5000
XO receiving video:
gst-launch tcpclientsrc host=192.168.2.1 port=5000 ! multipartdemux ! jpegdec ! autovideosink
Method 2
XO sending video:
gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)2/1 ! ffmpegcolorspace ! smokeenc keyframe=8 qmax=40 ! udpsink host=192.168.1.1 port=5000
XO receiving video:
gst-launch udpsrc port=5000 ! smokedec ! autovideosink
Method 3: Video + Audio
Sending XO:
gst-launch v4l2src ! queue ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)4/1 ! videorate ! videoscale ! ffmpegcolorspace ! queue ! smokeenc ! queue ! udpsink host=192.168.2.129 port=5000 alsasrc ! queue ! audio/x-raw-int,rate=8000,channels=1,depth=8 ! audioconvert ! speexenc ! queue ! tcpserversink host=192.168.2.129 port=5001
Receiving XO:
gst-launch-0.10 udpsrc port=5000 ! queue ! smokedec ! queue ! autovideosink tcpclientsrc host=192.168.2.129 port=5001 ! queue ! speexdec ! queue ! alsasink sync=false
Section TODO
- Fill in blank sections.
- Include snapshot recipes.
- Discuss python interfacing.
See also
- Programming the camera
- Video
- Sound
- Display
- Compiling_GStreamer_On_The_XO for cutting-edge GStreamer development
- http://gstreamer.freedesktop.org/
- http://gstreamer.freedesktop.org/documentation/
- Getting started with GStreamer with Python
- Software components ; Sugar Architecture/API#Third_Party_Packages
Other resources
- http://www.mediamods.com/public-svn/camera-activity/Record.activity/glive.py
- https://coderanger.net/svn/projects/olpc/games/olpcgames/camera.py (said to not really be working (trouble knowing when gs is done) MitchellNCharity 18:24, 12 June 2007 (EDT))
- Where is the camera code in sugar? MitchellNCharity 18:16, 12 June 2007 (EDT)
- gstreamer-plugins
- gstreamer-plugins-plugin-coreelements
- gst-plugins-base-plugins
- gst-plugins-good-plugins
- google search "site:gstreamer.freedesktop.org"
- http://xiph.org/ Ogg/Theora/Vorbis