
GStreamer (Video library) bindings for Rust. Documentation can be found here.

These bindings provide a safe API that can be used to interface with GStreamer, e.g. for writing GStreamer-based applications and GStreamer plugins.

The bindings are mostly autogenerated with gir based on the GObject-Introspection API metadata provided by the GStreamer project.

Table of Contents

  1. Installation
    1. Linux/BSDs
    2. macOS
    3. Windows
  2. Getting Started
  3. License
  4. Contribution

Installation

To build the GStreamer bindings or anything depending on them, you need to have at least GStreamer 1.8 and gst-plugins-base 1.8 installed. In addition, some of the examples/tutorials require various GStreamer plugins to be available, which can be found in gst-plugins-base, gst-plugins-good, gst-plugins-bad, gst-plugins-ugly and/or gst-libav.

Linux/BSDs

You need to install the above-mentioned packages with your distribution's package manager, or build them from source.

On Debian/Ubuntu they can be installed with

$ apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev \
      gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
      gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \
      gstreamer1.0-libav libgstrtspserver-1.0-dev libges-1.0-dev

The minimum required version of the above libraries is >= 1.8. If you build the gstreamer-player sub-crate, or any of the examples that depend on gstreamer-player, you must ensure that in addition to the above packages, libgstreamer-plugins-bad1.0-dev is installed and that the version is >= 1.12. See the Cargo.toml files for the full details.

$ # Only if you wish to install gstreamer-player, make sure the version
$ # of this package is >= 1.12.
$ apt-get install libgstreamer-plugins-bad1.0-dev

Package names on other distributions should be similar. Please submit a pull request with instructions for yours.

macOS

You can install GStreamer and the plugins via Homebrew or by installing the binaries provided by the GStreamer project.

Homebrew

Homebrew only installs various plugins if explicitly enabled, so some extra --with-* flags may be required.

$ brew install gstreamer gst-plugins-base gst-plugins-good \
      gst-plugins-bad gst-plugins-ugly gst-libav gst-rtsp-server \
      gst-editing-services --with-orc --with-libogg --with-opus \
      --with-pango --with-theora --with-libvorbis --with-libvpx

If you wish to install the gstreamer-player sub-crate, make sure the version of these libraries is >= 1.12. Otherwise, a version >= 1.8 is sufficient.

GStreamer Binaries

You need to download the two .pkg files from the GStreamer website and install them, e.g. gstreamer-1.0-1.12.3-x86_64.pkg and gstreamer-1.0-devel-1.12.3-x86_64.pkg.

After installation, you also need to install pkg-config (e.g. via Homebrew) and set the PKG_CONFIG_PATH environment variable

$ export PKG_CONFIG_PATH="/Library/Frameworks/GStreamer.framework/Versions/Current/lib/pkgconfig${PKG_CONFIG_PATH:+:$PKG_CONFIG_PATH}"

Windows

You can install GStreamer and the plugins via MSYS2 with pacman or by installing the binaries provided by the GStreamer project.

MSYS2 / pacman
$ pacman -S glib2-devel pkg-config \
      mingw-w64-x86_64-gstreamer mingw-w64-x86_64-gst-plugins-base \
      mingw-w64-x86_64-gst-plugins-good mingw-w64-x86_64-gst-plugins-bad \
      mingw-w64-x86_64-gst-plugins-ugly mingw-w64-x86_64-gst-libav

If you wish to install the gstreamer-player sub-crate, make sure the version of these libraries is >= 1.12. Otherwise, a version >= 1.8 is sufficient.

Note that the version of pkg-config included in MSYS2 is known to have problems compiling GStreamer, so you may need to install another version. One option would be pkg-config-lite.

GStreamer Binaries

You need to download the two .msi files for your platform from the GStreamer website and install them, e.g. gstreamer-1.0-x86_64-1.12.3.msi and gstreamer-1.0-devel-x86_64-1.12.3.msi.

After installation, you also need to install pkg-config (e.g. via MSYS2 or from here) and set the PKG_CONFIG_PATH environment variable

$ export PKG_CONFIG_PATH="c:\\gstreamer\\1.0\\x86_64\\lib\\pkgconfig${PKG_CONFIG_PATH:+:$PKG_CONFIG_PATH}"

Getting Started

The API reference can be found here; however, it covers only the Rust API and does not explain any of the underlying GStreamer concepts.
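To use the bindings from your own project, the crates are declared as ordinary Cargo dependencies. A minimal manifest fragment might look like the following (the version numbers here are illustrative only; check crates.io for the current releases):

```toml
[dependencies]
# Illustrative versions; pick the latest matching releases from crates.io.
gstreamer = "0.16"
gstreamer-video = "0.16"
```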

For getting started with GStreamer development, the best approach is to follow the documentation on the GStreamer website, especially the Application Development Manual. While C-centric, it explains all the fundamental concepts of GStreamer, and its code examples are relatively easy to translate to Rust: the API is essentially the same, function and struct names match, and everything is (hopefully) more convenient and safer.

In addition, there are tutorials on the GStreamer website. Many of them have already been ported to Rust, and the code can be found in the tutorials directory.

Some further examples for various aspects of GStreamer and how to use it from Rust can be found in the examples directory.

Various GStreamer plugins written in Rust can be found in the gst-plugins-rs repository.

License

gstreamer-rs and all crates contained in here are licensed under either of

  • Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
  • MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)

at your option.

GStreamer itself is licensed under the Lesser General Public License version 2.1 or (at your option) any later version: https://www.gnu.org/licenses/lgpl-2.1.html

Contribution

Any kind of contribution is welcome as a pull request.

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in gstreamer-rs by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

Re-exports

pub use ffi;
pub use glib;
pub use gst;
pub use gst_base;
pub use crate::video_frame::VideoFrame;
pub use crate::video_frame::VideoFrameRef;
pub use crate::video_event::DownstreamForceKeyUnitEvent;
pub use crate::video_event::ForceKeyUnitEvent;
pub use crate::video_event::NavigationEvent;
pub use crate::video_event::StillFrameEvent;
pub use crate::video_event::UpstreamForceKeyUnitEvent;
pub use crate::video_message::NavigationEventMessage;
pub use crate::video_message::NavigationMessage;
pub use crate::video_overlay_composition::VideoOverlayComposition;
pub use crate::video_overlay_composition::VideoOverlayCompositionRef;
pub use crate::video_overlay_composition::VideoOverlayRectangle;
pub use crate::video_overlay_composition::VideoOverlayRectangleRef;
pub use crate::video_meta::VideoCaptionMeta;
pub use crate::video_meta::VideoAFDMeta;
pub use crate::video_meta::VideoBarMeta;
pub use crate::video_meta::VideoAffineTransformationMeta;
pub use crate::video_meta::VideoCropMeta;
pub use crate::video_meta::VideoMeta;
pub use crate::video_meta::VideoOverlayCompositionMeta;
pub use crate::video_meta::VideoRegionOfInterestMeta;
pub use crate::video_converter::VideoConverter;
pub use crate::video_converter::VideoConverterConfig;
pub use crate::video_codec_state::VideoCodecState;
pub use crate::video_codec_state::VideoCodecStateContext;




This interface is implemented by elements which can perform some color balance operation on the video frames they process, for example modifying the brightness, contrast, hue or saturation.

The ColorBalanceChannel object represents a parameter for modifying the color balance implemented by an element providing the ColorBalance interface. For example, Hue or Saturation.

The Navigation interface is used for creating and injecting navigation related events such as mouse button presses, cursor motion and key presses. The associated library also provides methods for parsing received events, and for sending and receiving navigation related bus events. One main usecase is DVD menu navigation.

VideoAggregator can accept AYUV, ARGB and BGRA video streams. For each of the requested sink pads it will compare the incoming geometry and framerate to define the output parameters: output video frames will have the geometry of the biggest incoming video stream and the framerate of the fastest incoming one.

An implementation of GstPad that can be used with VideoAggregator.

An implementation of GstPad that can be used with VideoAggregator.

Additional video buffer flags. These flags can potentially be used on any buffers carrying closed caption data, or video data - even encoded data.


Various Chroma sitings.

A VideoCodecFrame represents a video frame both in raw and encoded form.

Structure describing the color info.

This base class is for video decoders turning encoded data into raw video frames.

Flags to be used in combination with VideoDecoderExt::request_sync_point(). See the function documentation for more details.

This base class is for video encoders turning raw video into encoded video data.

Provides useful functions and a base class for video filters.

Extra video flags

The different video flags that a format info can have.

Information for a video format.

Extra video frame flags

Information describing image properties. This information can be filled in from GstCaps with from_caps(). The information is also used to store the specific video info when mapping a video frame with VideoFrame::from_buffer_readable().

GstVideoMultiviewFlags are used to indicate extra properties of a stereo/multiview stream beyond the frame layout and buffer mapping that is conveyed in the VideoMultiviewMode.

The interface allows unified access to control flipping and autocenter operation of video-sources or operators.

The VideoOverlay interface is used for two main purposes.

Overlay format flags.

The different flags that can be used when packing and unpacking.

Provides useful functions and a base class for video sinks.

field_count must be 0 for progressive video and 1 or 2 for interlaced.

Flags related to the time code information. For drop frame, only 30000/1001 and 60000/1001 frame rates are supported.

A representation of a difference between two VideoTimeCode instances. Will not necessarily correspond to a real timecode (e.g. 00:00:10;00)


An enumeration indicating whether an element implements color balancing operations in software or in dedicated hardware. In general, dedicated hardware implementations (such as those provided by xvimagesink) are preferred.

A set of commands that may be issued to an element providing the Navigation interface. The available commands can be queried via the gst_navigation_query_new_commands() query.

Enum values for the various events that an element implementing the GstNavigation interface might send up the pipeline. Touch events have been inspired by the libinput API, and have the same meaning here.

A set of notifications that may be received on the bus when navigation related status changes.

Types of navigation interface queries.

Enumeration of the different standards that may apply to AFD data.

Enumeration of the various values for Active Format Description (AFD)

Different alpha modes.

The various known types of Closed Caption (CC).

Different chroma downsampling and upsampling modes

The color matrix is used to convert between Y’PbPr and non-linear RGB (R’G’B’).

The color primaries define the how to transform linear RGB values to and from the CIE XYZ colorspace.

Possible color range values. These constants are defined for 8 bit color values and can be scaled for other bit depths.

Different dithering methods to use.

Field order of interlaced content. This is only valid for interlace-mode=interleaved and not interlace-mode=mixed. In the case of mixed or GST_VIDEO_FIELD_ORDER_UNKNOWN, the field order is signalled via buffer flags.

Enum value describing the most common video formats.

The possible values of the VideoInterlaceMode describing the interlace mode of the stream.

Different color matrix conversion modes

VideoMultiviewFramePacking represents the subset of VideoMultiviewMode values that can be applied to any video frame without needing extra metadata. It can be used by elements that provide a property to override the multiview interpretation of a video stream when the video doesn’t contain any markers.

All possible stereoscopic 3D and multiview representations. In conjunction with VideoMultiviewFlags, describes how multiview content is being transported in the stream.

The different video orientation methods.

Different primaries conversion modes

Different subsampling and upsampling methods

Enum value describing the available tiling modes.

The video transfer function defines the formula for converting between non-linear RGB (R’G’B’) and linear RGB.



A bufferpool option to enable extra padding. When a bufferpool supports this option, gst_buffer_pool_config_set_video_alignment() can be called.

An option that can be activated on a bufferpool to request gl texture upload meta on buffers from the pool.

An option that can be activated on bufferpool to request video metadata on buffers from the pool.

Name of the caps feature indicating that the stream is interlaced.

List of all video formats, for use in template caps strings.