
GStreamer

Internals

RTP timestamp

  • The unit of rtptime is determined by the clock-rate from the SDP
  • 48000 in this case
  • So one tick is 1/48000 of a second
  • Meaning 20 milliseconds is 960 ticks
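
A quick sketch of that arithmetic, assuming the 48 kHz clock-rate above:

``` python
CLOCK_RATE = 48000  # ticks per second, i.e. the clock-rate from the SDP

def ms_to_rtp_ticks(ms: int) -> int:
    """Convert a duration in milliseconds into RTP timestamp ticks."""
    return ms * CLOCK_RATE // 1000

assert ms_to_rtp_ticks(20) == 960  # one 20 ms Opus frame advances rtptime by 960
```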

Discussion on threading internals

http://gstreamer-devel.966125.n4.nabble.com/gstreamer-threading-internal-td4428624.html

Threads and probes

Probes

Threads

Audio decoders

There is a max-errors property on audio decoder elements which can be helpful in debugging.
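
For example, a minimal sketch with PyGObject (opusdec here is just one GstAudioDecoder subclass; my understanding is that -1 means decode errors never become fatal):

``` python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# opusdec derives from GstAudioDecoder, so it exposes the max-errors property.
dec = Gst.ElementFactory.make("opusdec", None)
assert dec is not None, "opus plugin not installed"
print("default max-errors:", dec.get_property("max-errors"))

# Keep decoding through every error instead of erroring out after a few,
# which is handy when hunting for the exact buffer that breaks the decoder.
dec.set_property("max-errors", -1)
```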

Autoplugging

Why are queues required

This is a classical problem, and a good illustration of why you usually need queues after elements with N source pads, and before elements with N sink pads.

The demuxer pushes to all of its source pads from the same streaming thread, and if you have connected these source pads to sync=true sinks, their chain functions block until pre-rolling is done, that is, until each sync=true sink has received a buffer. Adding queues decouples the branches of the pipeline, as each queue starts its own streaming thread.
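
A sketch of what that looks like in practice, assuming the Sintel trailer WebM used elsewhere in these notes (VP8 video plus Vorbis audio), with one queue per demuxer branch:

``` python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Without the two queues, matroskademux would push both streams from a single
# streaming thread, and the first prerolling sync=true sink would block the
# other branch from ever receiving a buffer.
pipeline = Gst.parse_launch(
    "filesrc location=sintel_trailer-480p.webm ! matroskademux name=d "
    "d. ! queue ! vp8dec ! videoconvert ! autovideosink "
    "d. ! queue ! vorbisdec ! audioconvert ! autoaudiosink"
)
pipeline.set_state(Gst.State.PLAYING)
```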

See probes here.

Webkit debugging

Quark

Quarks are basically an internal "registry" of strings: a two-way association between a string and a unique integer identifier. Comparing strings repeatedly can be inefficient, so if there is a string comparison you perform often, or want to make faster, you convert the string to a GQuark once and then compare the integer instead.
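
A tiny sketch with PyGObject's GLib bindings:

``` python
from gi.repository import GLib

# Interning the same string twice yields the same integer id, so comparisons
# become integer comparisons instead of repeated strcmp() calls.
q1 = GLib.quark_from_string("video/x-raw")
q2 = GLib.quark_from_string("video/x-raw")
assert q1 == q2
assert GLib.quark_to_string(q1) == "video/x-raw"
```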

Flushing probe

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/274

Elements

GstDiscoverer

  • Internally uses uridecodebin. This element decodes data from a URI into raw media. It selects a source element that can handle the given URI scheme and connects it to a decodebin element. It acts like a demuxer, so it offers as many source pads as streams are found in the media.

    gst-launch-1.0 uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! videoconvert ! autovideosink
    
  • Primary function seems to be discoverer_collect, which gets called once the pipeline is pre-rolled

  • The next important function is parse_stream_topology. It can also be called recursively

    • Checks for the next field in the GstStructure passed as an argument to this function

    • If next is NULL, then it's the end of the stream, else one has a GstValue holding either a structure or a list

    • For the case of a GstStructure, collect_information is then called. It reads various fields in the caps to collect information like rate, channels and channel-mask. Tags and quarks also come into the picture here; quarks are a two-way association between a string and a unique identifier (see the Quark section above)

    • For the case of a list, it, among other things, walks over the list, calling parse_stream_topology for each item, where each item seems to be a sub-stream
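
For reference, a minimal synchronous sketch of the public GstDiscoverer API (same Sintel trailer URI as in the decodebin example below):

``` python
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstPbutils", "1.0")
from gi.repository import Gst, GstPbutils

Gst.init(None)

# The timeout is how long the internal pipeline gets to pre-roll before
# discoverer_collect / parse_stream_topology run on the result.
disc = GstPbutils.Discoverer.new(5 * Gst.SECOND)
info = disc.discover_uri(
    "https://www.freedesktop.org/software/gstreamer-sdk/data/media/"
    "sintel_trailer-480p.webm"
)
print("duration:", info.get_duration() / Gst.SECOND, "s")
for a in info.get_audio_streams():
    print("audio:", a.get_sample_rate(), "Hz,", a.get_channels(), "channels")
for v in info.get_video_streams():
    print("video:", v.get_width(), "x", v.get_height())
```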

Decodebin

This element automatically constructs a decoding pipeline using available decoders and demuxers via auto plugging until raw media is obtained. It's used internally by uridecodebin which is often more convenient to use, as it creates a suitable source element as well. It replaces the old decodebin element. It acts like a demuxer, so it offers as many source pads as streams are found in the media.

``` bash
gst-launch-1.0 souphttpsrc location=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm ! decodebin ! autovideosink
```

Thumbnails

Can GstDiscoverer be used for thumbnails? Searching around a bit leads to the issue "discoverer should generate video/image thumbnails", which references the Thumbnailer Spec and the Thumbnail Management DBus Specification. GstDiscoverer doesn't seem to have a thumbnailing API.

There is a totem video thumbnailer, which is the video thumbnailer for the GNOME desktop, but it doesn't seem to be available as a separate library. ffmpegthumbnailer, based on FFmpeg, is a lightweight video thumbnailer that can be used by file managers to create thumbnails for video files. However, packaging FFmpeg might be a concern here, and ffmpegthumbnailer has a GPL-2.0 license. FFmpeg itself is licensed under the LGPL; however, if a particular build of FFmpeg is linked against any GPL libraries (notably x264), then the entire binary is licensed under the GPL.

If this needs to be provided with GStreamer, then gstbasesink has a note on the last-sample property which mentions that the property can be used to generate thumbnails. Also see gst_base_sink_get_last_sample. A related option seems to be convertframe, which has two APIs for converting a raw video buffer into the specified output caps. See convertframe.
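
A rough sketch of the last-sample plus convertframe idea. Hedged: the GstVideo.video_convert_sample binding name for convertframe and the availability of a PNG encoder are assumptions on my part:

``` python
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstVideo", "1.0")
from gi.repository import Gst, GstVideo

Gst.init(None)

pipeline = Gst.parse_launch(
    "uridecodebin uri=https://www.freedesktop.org/software/gstreamer-sdk/"
    "data/media/sintel_trailer-480p.webm ! videoconvert ! fakesink name=sink"
)
pipeline.set_state(Gst.State.PAUSED)
pipeline.get_state(Gst.CLOCK_TIME_NONE)  # wait for pre-roll

# gstbasesink keeps the pre-rolled / last rendered buffer in last-sample.
sink = pipeline.get_by_name("sink")
sample = sink.get_property("last-sample")

# convertframe: convert the raw video sample into a PNG-encoded sample.
thumb = GstVideo.video_convert_sample(
    sample, Gst.Caps.from_string("image/png"), 5 * Gst.SECOND
)
buf = thumb.get_buffer()
with open("thumbnail.png", "wb") as f:
    f.write(buf.extract_dup(0, buf.get_size()))

pipeline.set_state(Gst.State.NULL)
```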

Some links which have examples.

Tracker miner

GstDiscoverer is used in Tracker Miner though not for everything. From Into the Pyramid.

  • Tracker has a generic image extraction rule that tries to find metadata for any image/* MIME type that isn't a .bmp, .jpg, .gif, or .png. This codepath uses the GstDiscoverer API, the same as for video and audio files, in the hope that a GStreamer plugin on the system can give useful info about the image.

  • The GstDiscoverer instance is created with a timeout of 5 seconds. (This seems high; the typefind utility that ships with GStreamer uses a timeout of 1 second.)

  • GStreamer's GstDiscoverer API feeds any file where the type is unknown into an MPEG decoder, which is an unwanted fuzz test and can trigger periods of high CPU and memory usage.

  • 5 seconds of processing non-MPEG data with an MPEG decoder is somehow enough to cause Linux's scheduler to lock up the entire system.

Rank

Modify the element's rank

Each GStreamer element factory has a detail called rank that defines the priority used by the autoplugger when it wants to connect a new element but has multiple options.
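
A hedged sketch of bumping a decoder's rank at runtime with PyGObject (avdec_h264 is just an example factory name); this changes which decoder decodebin picks when several can handle the same caps:

``` python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Element factories are plugin features, and their rank is what the
# autoplugger consults when several factories accept the same caps.
factory = Gst.ElementFactory.find("avdec_h264")
if factory is not None:
    print("current rank:", factory.get_rank())
    factory.set_rank(Gst.Rank.PRIMARY + 1)  # prefer this decoder from now on
```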

Pipeline examples

Opus RTP encoding decoding pipeline

Note that in place of rtpjitterbuffer, rtpbin also works.

``` bash
export AUDIO_CAPS="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS"
gst-launch-1.0 -v audiotestsrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=localhost port=50004
gst-launch-1.0 udpsrc caps=$AUDIO_CAPS address=localhost port=50004 ! rtpjitterbuffer latency=20 do-lost=true ! rtpopusdepay ! opusdec plc=true ! pulsesink
```

Tearing down a branch of tee
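
The idea, following the usual approach for dynamically reconfiguring pipelines, is to block the tee source pad with a probe and do all of the unlinking and state changes from inside the probe callback, so the teardown does not race with the streaming thread. In the sketch below, pipe, idx, RTMP_TEE, MUX_OUTPUT_QUEUE and RTMP_SINK come from the surrounding application.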

``` python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst


def rtmp_tee_srcpad_block_probe(
    pad: Gst.Pad, info: Gst.PadProbeInfo, rtmpUrl
) -> Gst.PadProbeReturn:
    # Runs once the tee source pad is blocked; the branch can now be torn
    # down without racing against the streaming thread.
    tee = pipe.get_by_name(RTMP_TEE)
    queue = pipe.get_by_name(MUX_OUTPUT_QUEUE + "-" + rtmpUrl)
    rtmpSink = pipe.get_by_name(RTMP_SINK + "-" + rtmpUrl)
    sinkPad = queue.get_static_pad("sink")
    pad.unlink(sinkPad)
    rtmpSink.set_state(Gst.State.NULL)
    queue.set_state(Gst.State.NULL)
    tee.remove_pad(pad)
    pipe.remove(queue)
    pipe.remove(rtmpSink)
    return Gst.PadProbeReturn.REMOVE


# Block the requested tee source pad; the probe callback does the teardown.
tee = pipe.get_by_name(RTMP_TEE)
srcPad = tee.get_static_pad("src_" + str(idx))
srcPad.add_probe(
    Gst.PadProbeType.BLOCK, rtmp_tee_srcpad_block_probe, rtmpUrl
)
```

Check camera capabilities

``` bash
gst-device-monitor-1.0 Video/Source
v4l2-ctl --list-formats-ext -d0
```

Annotations

Annotation Glossary

AAC test samples

Debugger

If SIGSEGV occurs, to debug the crashing gst-launch process with gdb, first run

``` bash
echo 0 > /proc/sys/kernel/yama/ptrace_scope
echo -1 > /proc/sys/kernel/perf_event_paranoid
```

Arch enables the Yama LSM by default, which provides a kernel.yama.ptrace_scope kernel parameter. This parameter is set to 1 (restricted) by default, which prevents tracers from performing a ptrace call on tracees outside of a restricted scope unless the tracer is privileged or has the CAP_SYS_PTRACE capability.

Terminology

Transmuxing

  • Also referred to as repackaging or packetizing; a process in which audio and video files are repackaged into different delivery formats without changing the files' contents.
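
A sketch of transmuxing with GStreamer, assuming a local H.264/AAC MP4 (placeholder file names) that gets repackaged into Matroska without touching the encoded streams:

``` python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# No decoders or encoders anywhere: the H.264 and AAC streams are only
# re-packaged from the MP4 container into a Matroska container.
pipeline = Gst.parse_launch(
    "filesrc location=input.mp4 ! qtdemux name=d "
    "matroskamux name=m ! filesink location=output.mkv "
    "d.video_0 ! h264parse ! queue ! m. "
    "d.audio_0 ! aacparse ! queue ! m."
)
pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(
    Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR
)
pipeline.set_state(Gst.State.NULL)
```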

Yocto

Enable RTMP plugin

RTP

Some FAQs about RTP

Basics of time and synchronization

Live pipeline

Live sources

Pipeline to check base time gets updated after PAUSE -> PLAY transition

``` bash
env GST_DEBUG=basesink:6 gst-play-1.0 /usr/share/sounds/alsa/Front_Left.wav --audiosink="fakesink sync=true" -q 2>&1 | grep base_time
```

Running test suite with gdb

``` bash
ninja -C build && env CK_FORK=no meson test --gdb -C build --suite gst-plugins-good elements_splitmuxsink
```

Re-timestamping frames

H264

Video mp4