Gstreamer: how to decode h264 from appsrc?

Hello, over the last few days I’ve been trying to find a way to decode h264 from an appsrc, fed with frames coming from the media module of the webrtc crate.

To test, I tried decoding a raw h264 file that was generated with ffmpeg using the following command:

ffmpeg -i video.mp4 -an -c:v libx264 -bsf:v h264_mp4toannexb -b:v 2M -max_delay 0 -bf 0 output.h264

Playing back that file with the following pipeline works:

gst-launch-1.0 filesrc location=output.h264 ! h264parse ! avdec_h264 ! videoconvert ! ximagesink

What I wrote is the following:

use gstreamer as gst;
use gstreamer_app as gst_app;
use gst::prelude::*;

use std::fs::File;
use std::io::BufReader;
use webrtc::media::io::h264_reader::H264Reader;

fn main() {
    gst::init().unwrap();

    // declaring pipeline
    let pipeline = gst::Pipeline::new(None);
    let src = gst::ElementFactory::make("appsrc", None).unwrap();
    let parse = gst::ElementFactory::make("h264parse", None).unwrap();
    let decode = gst::ElementFactory::make("avdec_h264", None).unwrap();
    let glup = gst::ElementFactory::make("videoconvert", None).unwrap();
    let sink = gst::ElementFactory::make("ximagesink", None).unwrap();

    // attaching pipeline elements
    pipeline.add_many(&[&src, &parse, &decode, &glup, &sink]).unwrap();
    gst::Element::link_many(&[&src, &parse, &decode, &glup, &sink]).unwrap();

    let appsrc = src
        .dynamic_cast::<gst_app::AppSrc>()
        .expect("Source element is expected to be an appsrc!");


    // Closure that feeds frames to pipeline
    let mut i = 0;
    let file = File::open("./output.h264").unwrap();
    let reader = BufReader::new(file);
    // A reader that groups by nal units
    let mut nalreader = H264Reader::new(reader);
    appsrc.set_callbacks(
        gst_app::AppSrcCallbacks::builder()
            .need_data(move |appsrc, _| {
                println!("Producing frame {}", i);
                match nalreader.next_nal() {
                    Ok(frame) => {
                        // give the buffer the data without the header
                        let buffer = gst::Buffer::from_slice(frame.data.freeze());

                        i += 1;

                        // appsrc already handles the error here
                        let _ = appsrc.push_buffer(buffer);
                    },
                    Err(err) => {
                        println!("All video frames parsed and sent: {}", err);
                    }
                }
            })
            .build(),
    );

    pipeline.set_state(gst::State::Playing).unwrap();

    let bus = pipeline
        .bus()
        .expect("Pipeline without bus. Shouldn't happen!");

    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        use gst::MessageView;

        match msg.view() {
            MessageView::Eos(_) => break,
            MessageView::Error(_) => break,
            _ => (),
        }
    }

    pipeline.set_state(gst::State::Null).unwrap();
}

I did also try to decode vp9 frames and it did work with the following code:

use gstreamer as gst;
use gstreamer_app as gst_app;
use gst::prelude::*;

use std::fs::File;
use std::io::BufReader;
use webrtc::media::io::ivf_reader::IVFReader;

fn main() {
    gst::init().unwrap();

    let pipeline = gst::Pipeline::new(None);
    let src = gst::ElementFactory::make("appsrc", None).unwrap();
    let parse = gst::ElementFactory::make("vp9parse", None).unwrap();
    let decode = gst::ElementFactory::make("vp9dec", None).unwrap();
    let glup = gst::ElementFactory::make("videoconvert", None).unwrap();
    let sink = gst::ElementFactory::make("ximagesink", None).unwrap();


    pipeline.add_many(&[&src, &parse, &decode, &glup, &sink]).unwrap();
    gst::Element::link_many(&[&src, &parse, &decode, &glup, &sink]).unwrap();

    let appsrc = src
        .dynamic_cast::<gst_app::AppSrc>()
        .expect("Source element is expected to be an appsrc!");


    let mut i = 0;
    let file = File::open("./output_vp9.ivf").unwrap();
    let reader = BufReader::new(file);
    let (mut ivf, _header) = IVFReader::new(reader).unwrap();
    appsrc.set_callbacks(
        gst_app::AppSrcCallbacks::builder()
            .need_data(move |appsrc, _| {
                println!("Producing frame {}", i);
                match ivf.parse_next_frame() {
                    Ok((frame, _)) => {
                        let buffer = gst::Buffer::from_slice(frame.freeze());

                        i += 1;

                        // appsrc already handles the error here
                        let _ = appsrc.push_buffer(buffer);
                    },
                    Err(err) => {
                        println!("All video frames parsed and sent: {}", err);
                    }
                }
            })
            .build(),
    );

    pipeline.set_state(gst::State::Playing).unwrap();

    let bus = pipeline
        .bus()
        .expect("Pipeline without bus. Shouldn't happen!");

    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        use gst::MessageView;

        match msg.view() {
            MessageView::Eos(_) => break,
            MessageView::Error(_) => break,
            _ => (),
        }
    }

    pipeline.set_state(gst::State::Null).unwrap();
}

This definitely should work, as I was doing more or less the same in a previous project. What problems are you facing exactly? I’m kind of missing what your problem is…

Hi, I’m never able to see anything when decoding h264; the ximagesink window never even shows up.

Do you get any error messages on the bus? Does pushing to the appsrc give any kind of Err? Are you actually pushing all buffers to the appsrc, i.e. is your callback actually called?
You also might have to set proper timestamps on your buffers, or otherwise it will just decode and show everything as fast as it can (unless there happens to be some timing information in your H264 bitstream), which can mean that you never see anything at all.

Also a GStreamer debug log (set GST_DEBUG=6 to get everything) might be useful to understand what exactly is happening in addition to the questions above.
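To illustrate the timestamping point: each pushed buffer should get a monotonically increasing PTS derived from the frame counter. A minimal sketch of that arithmetic, assuming a constant 30 fps (plain nanosecond math here; with gstreamer-rs you would wrap the result via `gst::ClockTime::from_nseconds` and call `set_pts` on the buffer):

```rust
// Derive a monotonically increasing PTS (in nanoseconds) for frame `i`,
// assuming a constant 30 fps stream. With gstreamer-rs this value would
// be set on the buffer with set_pts(gst::ClockTime::from_nseconds(pts)).
const FPS: u64 = 30;

fn pts_for_frame(i: u64) -> u64 {
    // multiply before dividing to avoid accumulating rounding error
    i * 1_000_000_000 / FPS
}

fn main() {
    assert_eq!(pts_for_frame(0), 0);
    assert_eq!(pts_for_frame(30), 1_000_000_000); // exactly one second in
    println!("frame 92 starts at {} ns", pts_for_frame(92));
}
```

The key point is that the PTS grows with the frame index; a constant PTS on every buffer will not advance the pipeline clock.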

No, there doesn’t seem to be any error from appsrc.push_buffer(buffer): it returns Ok for every frame. Nothing on the bus either.

As you said, it seems that even when there is data queued, the pipeline thinks the video time is still 00:00:00 and ends up doing nothing.
I’m now setting dummy timestamps like the following:

let mut buffer = gst::Buffer::from_slice(frame.data.freeze());
{
    let mbuf = buffer.get_mut().unwrap();
    mbuf.set_duration(Some(gst::ClockTime::from_mseconds(33)));
    mbuf.set_pts(100 * gst::ClockTime::MSECOND);
}

When using GST_DEBUG=6 I found something weird:

Producing frame 92
0:00:00.419839436 145803 0x5567fc391980 DEBUG             GST_MEMORY gstmemory.c:139:gst_memory_init: new memory 0x7f2c8000da10, maxsize:12090 offset:0 size:12090
0:00:00.419845433 145803 0x5567fc391980 DEBUG                 appsrc gstappsrc.c:2678:gst_app_src_push_internal:<appsrc0> queueing buffer 0x5567fc3235a0
0:00:00.419849536 145803 0x5567fc391980 DEBUG                 appsrc gstappsrc.c:1554:gst_app_src_update_queued_push:<appsrc0> Currently queued: 12090 bytes, 1 buffers, 0:00:00.000000000

Even though I’m now passing timestamps, the queued time is still 0:00:00, although this is the 92nd frame.

Were there error messages on the bus? And please provide the whole log :slight_smile:

Ah sorry, no, there aren’t any errors on the bus, just state changes.
Here is the log:

0:00:00.155470736 190488 0x563c2be93980 LOG                h264parse gsth264parse.c:1354:gst_h264_parse_handle_frame:<h264parse0> parsing new frame
0:00:00.155472601 190488 0x563c2be93980 DEBUG              h264parse gsth264parse.c:223:gst_h264_parse_reset_frame:<h264parse0> reset frame
0:00:00.155475094 190488 0x563c2be93980 DEBUG              h264parse gsth264parse.c:1379:gst_h264_parse_handle_frame:<h264parse0> last parse position 0
0:00:00.155484498 190488 0x563c2be93980 DEBUG      codecparsers_h264 gsth264parser.c:1449:gst_h264_parser_identify_nalu_unchecked: No start code prefix in this buffer
0:00:00.155486982 190488 0x563c2be93980 DEBUG              h264parse gsth264parse.c:1575:gst_h264_parse_handle_frame:<h264parse0> skipping 23
0:00:00.155488746 190488 0x563c2be93980 DEBUG              h264parse gsth264parse.c:223:gst_h264_parse_reset_frame:<h264parse0> reset frame
0:00:00.155491490 190488 0x563c2be93980 LOG                baseparse gstbaseparse.c:2257:gst_base_parse_handle_buffer:<h264parse0> handle_frame skipped 23, flushed 0
0:00:00.155493513 190488 0x563c2be93980 LOG                baseparse gstbaseparse.c:2268:gst_base_parse_handle_buffer:<h264parse0> finding sync, skipping 23 bytes
[...]

Your problem is that you’re not providing byte-stream H264 with start codes, but raw NALs instead. In that case you need to configure the caps as video/x-h264,stream-format=avc,alignment=au and a) provide a whole AU per buffer, and b) provide the avcC codec data via the caps.
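As an alternative to switching the caps to avc: since H264Reader hands back NALs with the Annex B start codes already stripped, you could also re-add a start code in front of each NAL before pushing, and set video/x-h264,stream-format=byte-stream caps on the appsrc so h264parse can find sync again. A sketch of the framing helper (pure Rust, no GStreamer types; the caps setup on the appsrc is assumed to happen elsewhere):

```rust
// Prepend the 4-byte Annex B start code (00 00 00 01) to a raw NAL unit
// so that h264parse can find sync when fed byte-stream caps.
fn to_annex_b(nal: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(nal.len() + 4);
    out.extend_from_slice(&[0, 0, 0, 1]);
    out.extend_from_slice(nal);
    out
}

fn main() {
    // a made-up NAL payload, standing in for what H264Reader returns
    let nal = [0x65, 0x88, 0x84, 0x00];
    let framed = to_annex_b(&nal);
    assert_eq!(framed[..4], [0, 0, 0, 1]); // start code prepended
    assert_eq!(framed[4..], nal);          // payload unchanged
    println!("{:02x?}", framed);
}
```

With that in place, the need_data callback would push `gst::Buffer::from_slice(to_annex_b(&frame.data))` instead of the raw NAL bytes, which matches what the working filesrc pipeline reads from the ffmpeg-generated Annex B file.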
