Crate gstreamer_video
§gstreamer-rs
GStreamer (Video library) bindings for Rust. Documentation can be found here.
These bindings provide a safe API that can be used to interface with GStreamer, e.g. for writing GStreamer-based applications and GStreamer plugins.
The bindings are mostly autogenerated with gir based on the GObject-Introspection API metadata provided by the GStreamer project.
§Installation
To build the GStreamer bindings or anything depending on them, you need to have at least GStreamer 1.14 and gst-plugins-base 1.14 installed. In addition, some of the examples/tutorials require various GStreamer plugins to be available, which can be found in gst-plugins-base, gst-plugins-good, gst-plugins-bad, gst-plugins-ugly and/or gst-libav.
§Linux/BSDs
You need to install the above-mentioned packages with your distribution's package manager, or build them from source.
On Debian/Ubuntu they can be installed with:
$ apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev \
gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \
gstreamer1.0-libav libgstrtspserver-1.0-dev libges-1.0-dev
The minimum required version of the above libraries is >= 1.14. If you build the gstreamer-player sub-crate, or any of the examples that depend on gstreamer-player, you must ensure that in addition to the above packages, libgstreamer-plugins-bad1.0-dev is installed. See the Cargo.toml files for the full details:
$ apt-get install libgstreamer-plugins-bad1.0-dev
Package names on other distributions should be similar. Please submit a pull request with instructions for yours.
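Once the development packages are installed, a trivial Rust program can verify that GStreamer is found at build and run time. This is only a minimal sketch, not part of the official instructions; it assumes the gstreamer crate has been added as a dependency and is imported under its conventional gst alias:
// Minimal sanity check: initialize GStreamer and print the runtime version.
use gstreamer as gst;

fn main() {
    // Fails if the GStreamer core libraries cannot be found or loaded.
    gst::init().expect("failed to initialize GStreamer");
    // Should report at least 1.14 if the requirements above are met.
    println!("{}", gst::version_string());
}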
§macOS
You can install GStreamer and the plugins via Homebrew or by installing the binaries provided by the GStreamer project.
We recommend using the official GStreamer binaries over Homebrew, especially as GStreamer in Homebrew is currently broken.
§GStreamer Binaries
You need to download the two .pkg files from the GStreamer website and install them, e.g. gstreamer-1.0-1.20.4-universal.pkg and gstreamer-1.0-devel-1.20.4-universal.pkg.
After installation, you also need to set the PATH environment variable as follows:
$ export PATH="/Library/Frameworks/GStreamer.framework/Versions/1.0/bin${PATH:+:$PATH}"
Also note that the pkg-config from GStreamer should be the first one in the PATH, as other versions have all kinds of quirks that will cause problems.
§Homebrew
Homebrew only installs various plugins if explicitly enabled, so some extra --with-* flags may be required.
$ brew install gstreamer gst-plugins-base gst-plugins-good \
gst-plugins-bad gst-plugins-ugly gst-libav gst-rtsp-server \
gst-editing-services --with-orc --with-libogg --with-opus \
--with-pango --with-theora --with-libvorbis --with-libvpx \
--enable-gtk3
Make sure the version of these libraries is >= 1.14.
§Windows
You can install GStreamer and the plugins via MSYS2 with pacman, or by installing the binaries provided by the GStreamer project.
We recommend using the official GStreamer binaries over MSYS2.
§GStreamer Binaries
You need to download the two .msi files for your platform from the GStreamer website and install them, e.g. gstreamer-1.0-x86_64-1.20.4.msi and gstreamer-1.0-devel-x86_64-1.20.4.msi. Make sure to select the version that matches your Rust toolchain, i.e. MinGW or MSVC.
After installation, set the PATH environment variable as follows:
# For a UNIX-style shell:
$ export PATH="c:/gstreamer/1.0/msvc_x86_64/bin${PATH:+:$PATH}"
# For cmd.exe:
$ set PATH=C:\gstreamer\1.0\msvc_x86_64\bin;%PATH%
Make sure to update the path to where you have actually installed GStreamer and for the corresponding toolchain.
Also note that the pkg-config.exe from GStreamer should be the first one in the PATH, as other versions have all kinds of quirks that will cause problems.
§MSYS2 / pacman
$ pacman -S glib2-devel pkg-config \
mingw-w64-x86_64-gstreamer mingw-w64-x86_64-gst-plugins-base \
mingw-w64-x86_64-gst-plugins-good mingw-w64-x86_64-gst-plugins-bad \
mingw-w64-x86_64-gst-plugins-ugly mingw-w64-x86_64-gst-libav \
mingw-w64-x86_64-gst-rtsp-server
Make sure the version of these libraries is >= 1.14.
Note that the version of pkg-config included in MSYS2 is known to have problems compiling GStreamer, so you may need to install another version. One option would be pkg-config-lite.
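Whichever installation method you used, you can also check from Rust that the plugins you rely on were actually picked up by GStreamer. A hedged sketch; videotestsrc and x264enc are just example element names, substitute the ones your application needs:
use gstreamer as gst;

fn main() {
    gst::init().expect("failed to initialize GStreamer");
    // An element factory is only found if the plugin providing it
    // (gst-plugins-base, -good, -bad, -ugly, libav, ...) is installed.
    for name in ["videotestsrc", "x264enc"] {
        match gst::ElementFactory::find(name) {
            Some(_) => println!("{name}: available"),
            None => println!("{name}: missing, check your plugin packages"),
        }
    }
}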
§Getting Started
The API reference can be found here; however, it only covers the Rust API and does not explain any of the concepts.
For getting started with GStreamer development, the best starting point is the documentation on the GStreamer website, especially the Application Development Manual. While it is C-centric, it explains all the fundamental concepts of GStreamer, and the code examples should be relatively easy to translate to Rust. The API is basically the same: function/struct names are the same, and everything is only more convenient (hopefully) and safer.
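As a small illustration of how directly the C API maps to Rust, the gst_video_info_* routines from the C API correspond to VideoInfo in this crate. The following is a rough sketch, with the format, resolution and framerate chosen arbitrarily:
use gstreamer as gst;
use gstreamer_video as gst_video;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // Roughly the Rust counterpart of gst_video_info_set_format() followed by
    // gst_video_info_to_caps() in the C examples.
    let info = gst_video::VideoInfo::builder(gst_video::VideoFormat::Rgba, 320, 240)
        .fps(gst::Fraction::new(30, 1))
        .build()?;
    let caps = info.to_caps()?;

    println!("caps: {caps}");
    println!("one frame takes {} bytes", info.size());
    Ok(())
}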
In addition there are tutorials on the GStreamer website. Many of them have already been ported to Rust, and the code can be found in the tutorials directory.
Some further examples for various aspects of GStreamer and how to use it from Rust can be found in the examples directory.
Various GStreamer plugins written in Rust can be found in the gst-plugins-rs repository.
§LICENSE
gstreamer-rs and all crates contained in it are licensed under either of
- Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
GStreamer itself is licensed under the Lesser General Public License version 2.1 or (at your option) any later version: https://www.gnu.org/licenses/lgpl-2.1.html
§Contribution
Any kind of contribution is welcome as a pull request.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in gstreamer-rs by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
Re-exports§
pub use crate::video_frame::VideoFrame;
pub use crate::video_frame::VideoFrameExt;
pub use crate::video_frame::VideoFrameRef;
pub use crate::video_event::DownstreamForceKeyUnitEvent;
pub use crate::video_event::ForceKeyUnitEvent;
pub use crate::video_event::StillFrameEvent;
pub use crate::video_event::UpstreamForceKeyUnitEvent;
pub use crate::video_overlay_composition::VideoOverlayComposition;
pub use crate::video_overlay_composition::VideoOverlayCompositionRef;
pub use crate::video_overlay_composition::VideoOverlayRectangle;
pub use crate::video_overlay_composition::VideoOverlayRectangleRef;
pub use crate::video_meta::VideoCaptionMeta;
pub use crate::video_meta::VideoAFDMeta;
pub use crate::video_meta::VideoBarMeta;
pub use crate::video_meta::VideoAffineTransformationMeta;
pub use crate::video_meta::VideoCropMeta;
pub use crate::video_meta::VideoMeta;
pub use crate::video_meta::VideoOverlayCompositionMeta;
pub use crate::video_meta::VideoRegionOfInterestMeta;
pub use crate::video_converter::VideoConverter;
pub use crate::video_converter::VideoConverterConfig;
pub use crate::video_codec_state::VideoCodecState;
pub use crate::video_codec_state::VideoCodecStateContext;
pub use glib;
pub use gst;
pub use gst_base;
pub use gstreamer_video_sys as ffi;
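For example, the re-exported VideoFrame type can be combined with this crate's VideoInfo to map a raw buffer as a video frame and inspect its planes. This is only an illustrative sketch, not taken from the crate's examples; the GRAY8 format and 64x64 size are arbitrary:
use gstreamer as gst;
use gstreamer_video as gst_video;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // Describe the raw video layout, then allocate a buffer large enough for one frame.
    let info = gst_video::VideoInfo::builder(gst_video::VideoFormat::Gray8, 64, 64).build()?;
    let buffer = gst::Buffer::with_size(info.size())?;

    // Map the buffer as a readable video frame; on failure the buffer is handed back.
    let frame = gst_video::VideoFrame::from_buffer_readable(buffer, &info)
        .map_err(|_| "buffer could not be mapped as a video frame")?;

    // Plane 0 holds the 8-bit luma data for GRAY8.
    println!("plane 0 holds {} bytes", frame.plane_data(0)?.len());
    Ok(())
}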
Modules§
Macros§
Structs§
- This interface is implemented by elements which can perform some color balance operation on video frames they process. For example, modifying the brightness, contrast, hue or saturation.
- The ColorBalanceChannel object represents a parameter for modifying the color balance implemented by an element providing the ColorBalance interface. For example, Hue or Saturation.
- The Navigation interface is used for creating and injecting navigation related events such as mouse button presses, cursor motion and key presses. The associated library also provides methods for parsing received events, and for sending and receiving navigation related bus events. One main use case is DVD menu navigation.
- Flags to indicate the state of modifier keys and mouse buttons in events.
- GLib type: Inline allocated boxed type with stack copy semantics.
- VideoAggregator can accept AYUV, ARGB and BGRA video streams. For each of the requested sink pads it will compare the incoming geometry and framerate to define the output parameters. Indeed output video frames will have the geometry of the biggest incoming video stream and the framerate of the fastest incoming one.
- An implementation of GstPad that can be used with VideoAggregator.
- An implementation of GstPad that can be used with VideoAggregator.
- Video Ancillary data, according to SMPTE-291M specification.
- Additional video buffer flags. These flags can potentially be used on any buffers carrying closed caption data, or video data - even encoded data.
- Various Chroma sitings.
- A VideoCodecFrame represents a video frame both in raw and encoded form.
- Flags for VideoCodecFrame.
- Structure describing the color info.
- This base class is for video decoders turning encoded data into raw video frames.
- Flags to be used in combination with VideoDecoderExt::request_sync_point(). See the function documentation for more details.
- This base class is for video encoders turning raw video into encoded video data.
- Provides useful functions and a base class for video filters.
- Extra video flags
- The different video flags that a format info can have.
- Information for a video format.
- Extra video frame flags
- Information describing image properties. This information can be filled in from GstCaps with from_caps(). The information is also used to store the specific video info when mapping a video frame with VideoFrame::from_buffer_readable().
- Information describing DMABuf image properties. It wraps VideoInfo and adds DRM information such as drm-fourcc and drm-modifier, required for negotiation and mapping.
- GstVideoMultiviewFlags are used to indicate extra properties of a stereo/multiview stream beyond the frame layout and buffer mapping that is conveyed in the VideoMultiviewMode.
- The interface allows unified access to control flipping and autocenter operation of video-sources or operators.
- The VideoOverlay interface is used for 2 main purposes:
- Overlay format flags.
- The different flags that can be used when packing and unpacking.
- Provides useful functions and a base class for video sinks.
- field_count must be 0 for progressive video and 1 or 2 for interlaced.
- Flags related to the time code information. For drop frame, only 30000/1001 and 60000/1001 frame rates are supported.
- A representation of a difference between two VideoTimeCode
instances. Will not necessarily correspond to a real timecode (e.g. 00:00:10;00) - An encoder for writing ancillary data to the Vertical Blanking Interval lines of component signals.
- A parser for detecting and extracting VideoAncillary data from Vertical Blanking Interval lines of component signals.
Enums§
- Location of a GstAncillaryMeta.
- An enumeration indicating whether an element implements color balancing operations in software or in dedicated hardware. In general, dedicated hardware implementations (such as those provided by xvimagesink) are preferred.
- A set of commands that may be issued to an element providing the Navigation interface. The available commands can be queried via the gst_navigation_query_new_commands() query.
- Enum values for the various events that an element implementing the GstNavigation interface might send up the pipeline. Touch events have been inspired by the libinput API, and have the same meaning here.
- A set of notifications that may be received on the bus when navigation related status changes.
- Types of navigation interface queries.
- Enumeration of the different standards that may apply to AFD data:
- Enumeration of the various values for Active Format Description (AFD)
- Different alpha modes.
- Some known types of Ancillary Data identifiers.
- The various known types of Closed Caption (CC).
- Different chroma downsampling and upsampling modes
- The color matrix is used to convert between Y’PbPr and non-linear RGB (R’G’B’)
- The color primaries define how to transform linear RGB values to and from the CIE XYZ colorspace.
- Possible color range values. These constants are defined for 8 bit color values and can be scaled for other bit depths.
- Different dithering methods to use.
- Field order of interlaced content. This is only valid for interlace-mode=interleaved and not interlace-mode=mixed. In the case of mixed or GST_VIDEO_FIELD_ORDER_UNKNOWN, the field order is signalled via buffer flags.
- Enum value describing the most common video formats.
- The possible values of the VideoInterlaceMode describing the interlace mode of the stream.
- Different color matrix conversion modes
- VideoMultiviewFramePacking represents the subset of VideoMultiviewMode values that can be applied to any video frame without needing extra metadata. It can be used by elements that provide a property to override the multiview interpretation of a video stream when the video doesn’t contain any markers.
- All possible stereoscopic 3D and multiview representations. In conjunction with VideoMultiviewFlags, describes how multiview content is being transported in the stream.
- The different video orientation methods.
- Different primaries conversion modes
- Different subsampling and upsampling methods
- Enum value describing the available tiling modes.
- The video transfer function defines the formula for converting between non-linear RGB (R’G’B’) and linear RGB
- Video Vertical Blanking Interval related Errors.
Constants§
Statics§
- A bufferpool option to enable extra padding. When a bufferpool supports this option, gst_buffer_pool_config_set_video_alignment() can be called.
- An option that can be activated on a bufferpool to request gl texture upload meta on buffers from the pool.
- An option that can be activated on bufferpool to request video metadata on buffers from the pool.
- Name of the caps feature indicating that the stream is interlaced.
- List of all video formats, for use in template caps strings.
- This is similar to GST_VIDEO_FORMATS_ALL but includes formats like DMA_DRM that do not have a software converter. This should be used for passthrough template caps.