|GStreamer Base Plugins 0.10 Plugins Reference Manual|
GObject
  +----GstObject
        +----GstElement
              +----GstBin
                    +----GstPipeline
                          +----GstPlayBaseBin
                                +----GstPlayBin
GstPlayBin implements GstChildProxy.
"audio-sink" GstElement* : Read / Write "frame" GstBuffer* : Read "subtitle-font-desc" gchar* : Write "video-sink" GstElement* : Read / Write "vis-plugin" GstElement* : Read / Write "volume" gdouble : Read / Write "connection-speed" guint : Read / Write
Playbin provides a stand-alone everything-in-one abstraction for an audio and/or video player. It can handle both audio and video files.
A playbin element can be created just like any other element using
gst_element_factory_make(). The file/URI to play should be set via the "uri"
property. This must be an absolute URI; relative file paths are not allowed.
Example URIs are file:///home/joe/movie.avi or http://www.joedoe.com/foo.ogg
Playbin is a GstPipeline. It will notify the application of everything that's happening (errors, end of stream, tags found, state changes, etc.) by posting messages on its GstBus. The application needs to watch the bus.
Playback can be initiated by setting the element to PLAYING state using
gst_element_set_state(). Note that the state change will take place in
the background in a separate thread; when the function returns, playback
is probably not happening yet and any errors might not have occurred yet.
Applications using playbin should ideally be written to deal with things
completely asynchronously.
When playback has finished (an EOS message has been received on the bus), or an error has occurred (an ERROR message has been received on the bus), or the user wants to play a different track, playbin should be set back to READY or NULL state, then the "uri" property should be set to the new location, and then playbin should be set to PLAYING state again.
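The playback cycle described above can be sketched as follows. This is a minimal sketch; the URI is illustrative and error handling is abbreviated:

```c
#include <gst/gst.h>

static gboolean
bus_cb (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  GMainLoop *loop = user_data;

  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR:{
      GError *err = NULL;

      gst_message_parse_error (msg, &err, NULL);
      g_printerr ("ERROR: %s\n", err->message);
      g_error_free (err);
      g_main_loop_quit (loop);
      break;
    }
    case GST_MESSAGE_EOS:
      g_main_loop_quit (loop);
      break;
    default:
      break;
  }
  return TRUE;                  /* keep the bus watch installed */
}

int
main (int argc, char *argv[])
{
  GstElement *playbin;
  GstBus *bus;
  GMainLoop *loop;

  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* create playbin and point it at an absolute URI */
  playbin = gst_element_factory_make ("playbin", "player");
  g_object_set (playbin, "uri", "file:///home/joe/movie.avi", NULL);

  /* watch the bus for ERROR and EOS messages */
  bus = gst_pipeline_get_bus (GST_PIPELINE (playbin));
  gst_bus_add_watch (bus, bus_cb, loop);
  gst_object_unref (bus);

  /* the state change happens asynchronously in the background */
  gst_element_set_state (playbin, GST_STATE_PLAYING);
  g_main_loop_run (loop);       /* runs until EOS or ERROR */

  /* to play another track, go back to READY/NULL first,
   * set a new "uri", then set PLAYING again */
  gst_element_set_state (playbin, GST_STATE_NULL);
  gst_object_unref (playbin);
  return 0;
}
```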
Seeking can be done using gst_element_seek()
on the playbin element. Again, the seek will not be executed
instantaneously, but will be done in a background thread. When the seek
call returns, the seek will most likely still be in process. An application
may wait for the seek to finish (or fail) using gst_element_get_state() with
-1 as the timeout, but this will block the user interface and is not
recommended at all.
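A sketch of a flushing seek to an absolute position (10 seconds here, purely illustrative):

```c
#include <gst/gst.h>

/* seek the given playbin to 10 seconds into the stream */
static void
seek_to_ten_seconds (GstElement * playbin)
{
  if (!gst_element_seek (playbin, 1.0, GST_FORMAT_TIME,
          GST_SEEK_FLAG_FLUSH,
          GST_SEEK_TYPE_SET, 10 * GST_SECOND,
          GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
    g_printerr ("seek failed\n");
  }
}
```

The GST_SEEK_FLAG_FLUSH flag makes the pipeline discard buffered data so the seek takes effect quickly; remember that when this function returns, the seek may still be in progress.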
Applications may query the current position and duration of the stream
using gst_element_query_position() and gst_element_query_duration(),
setting the format passed to GST_FORMAT_TIME. If the query was successful,
the duration or position will have been returned in units of nanoseconds.
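A sketch of such a query using the GStreamer 0.10 API (the function name is illustrative):

```c
#include <gst/gst.h>

/* print the current position and duration of the stream */
static void
print_position (GstElement * playbin)
{
  GstFormat fmt = GST_FORMAT_TIME;
  gint64 pos = -1, dur = -1;

  /* both values come back in nanoseconds when fmt is GST_FORMAT_TIME */
  if (gst_element_query_position (playbin, &fmt, &pos) &&
      gst_element_query_duration (playbin, &fmt, &dur)) {
    g_print ("position: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\n",
        GST_TIME_ARGS (pos), GST_TIME_ARGS (dur));
  }
}
```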
By default, if no audio sink or video sink has been specified via the "audio-sink" or "video-sink" property, playbin will use the autoaudiosink and autovideosink elements to find the first-best available output method. This should work in most cases, but is not always desirable. Often either the user or application might want to specify more explicitly what to use for audio and video output.
If the application wants more control over how audio or video should be
output, it may create the audio/video sink elements itself (for example
with gst_element_factory_make()) and provide them to playbin using the
"audio-sink" or "video-sink" property.
GNOME-based applications, for example, will usually want to create gconfaudiosink and gconfvideosink elements and make playbin use those, so that output happens to whatever the user has configured in the GNOME Multimedia System Selector configuration dialog.
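For example, a GNOME application might do something like the following (a sketch; whether the gconf sink elements are available depends on the installation):

```c
#include <gst/gst.h>

/* make playbin use the GNOME-configured audio/video outputs */
static void
setup_gnome_sinks (GstElement * playbin)
{
  GstElement *asink, *vsink;

  asink = gst_element_factory_make ("gconfaudiosink", NULL);
  vsink = gst_element_factory_make ("gconfvideosink", NULL);

  /* playbin takes ownership of the sinks passed to it */
  if (asink != NULL)
    g_object_set (playbin, "audio-sink", asink, NULL);
  if (vsink != NULL)
    g_object_set (playbin, "video-sink", vsink, NULL);
}
```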
The sink elements do not necessarily need to be ready-made sinks. It is possible to create container elements that look like a sink to playbin, but in reality contain a number of custom elements linked together. This can be achieved by creating a GstBin and putting elements in there and linking them, and then creating a sink GstGhostPad for the bin and pointing it to the sink pad of the first element within the bin. This can be used for a number of purposes, for example to force output to a particular format or to modify or observe the data before it is output.
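The sink-bin technique described above might look like this (a sketch; the elements inside the bin are illustrative):

```c
#include <gst/gst.h>

/* build a bin that scales video before handing it to a real sink;
 * playbin will see it as a single sink element */
static GstElement *
make_video_sink_bin (void)
{
  GstElement *bin, *scale, *sink;
  GstPad *pad;

  bin = gst_bin_new ("video-sink-bin");
  scale = gst_element_factory_make ("videoscale", NULL);
  sink = gst_element_factory_make ("autovideosink", NULL);

  gst_bin_add_many (GST_BIN (bin), scale, sink, NULL);
  gst_element_link (scale, sink);

  /* expose the first element's sink pad as the bin's own sink pad */
  pad = gst_element_get_static_pad (scale, "sink");
  gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
  gst_object_unref (pad);

  return bin;   /* set this as playbin's "video-sink" property */
}
```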
It is also possible to 'suppress' audio and/or video output by using 'fakesink' elements (or capture it from there using the fakesink element's "handoff" signal, which, nota bene, is fired from the streaming thread!).
Most of the common meta data (artist, title, etc.) can be retrieved by watching for TAG messages on the pipeline's bus (see above).
Other more specific meta information like width/height/framerate of video
streams or samplerate/number of channels of audio streams can be obtained
using the "stream-info" property, which will return a GList of
stream info objects, one for each stream. These are opaque objects that can
only be accessed via the standard GObject property interface, ie.
Each stream info object has the following properties:
Stream information from the "stream-info" property is best queried once playbin has changed into PAUSED or PLAYING state (which can be detected via a state-changed message on the GstBus where old_state=READY and new_state=PAUSED), since before that the list might not be complete yet or not contain all available information (like language-codes).
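A hedged sketch of reading the stream info list once the pipeline has reached PAUSED or PLAYING; the "type" property name and the ownership of the returned list are assumptions here:

```c
#include <gst/gst.h>

/* inspect playbin's streams via the "stream-info" property */
static void
dump_stream_info (GstElement * playbin)
{
  GList *streams = NULL, *l;

  g_object_get (playbin, "stream-info", &streams, NULL);

  for (l = streams; l != NULL; l = l->next) {
    GObject *info = G_OBJECT (l->data);
    gint type = -1;

    /* stream info objects are opaque; query them with g_object_get();
     * "type" is assumed to identify audio vs. video vs. subtitle streams */
    g_object_get (info, "type", &type, NULL);
    g_print ("stream type: %d\n", type);
  }
}
```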
gst-launch -v playbin uri=file:///path/to/somefile.avi
gst-launch -v playbin uri=cdda://4
gst-launch -v playbin uri=dvd://1
|Wim Taymans <firstname.lastname@example.org>|
"audio-sink" GstElement* : Read / Write
the audio output element to use (NULL = default sink).
"frame" GstBuffer* : Read
The last frame (NULL = no video available).
"subtitle-font-desc" gchar* : Write
Pango font description of font to be used for subtitle rendering.
Default value: NULL
"video-sink" GstElement* : Read / Write
the video output element to use (NULL = default sink).
"vis-plugin" GstElement* : Read / Write
the visualization element to use (NULL = none).
"volume" gdouble : Read / Write
Get or set the current audio stream volume. 1.0 means 100%, 0.0 means mute. This uses a linear volume scale.
Allowed values: [0,10]
Default value: 1
"connection-speed" guint : Read / Write
Network connection speed in kbps (0 = unknown).
Default value: 0