Continuously loop an MP4 file in a GStreamer pipeline with an RTSP server

I have a question about how to continuously repeat an input MP4 video while publishing it through RTSP. I have built an application based on the example test-appsrc2.c.

I have modified the source to use this launch syntax for the generator pipeline:

filesrc name=vin location=%s : qtdemux : h264parse : appsink name=vid max-buffers=3 drop=false

I do receive the EOS message in the bus_callback, but I am not able to rewind the pipeline, nor to simply switch to a different input. I guess there is some basic thing I am doing wrong.

case GST_MESSAGE_EOS:
    g_print("GST_MESSAGE_EOS: %s\n", gst_message_type_get_name (GST_MESSAGE_TYPE (msg)));
    gst_element_set_state(pipeline, GST_STATE_NULL);
    std::this_thread::sleep_for(100ms);
    vin = gst_bin_get_by_name (GST_BIN (pipeline), "vin");
    g_print ("vin: %p\n",vin);
    if(vin)
    {
      gst_util_set_object_arg (G_OBJECT (vin), "location", "video02.mp4");
      gst_object_unref (vin);
    }
    /*
    if (!gst_element_seek(pipeline,
                          1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
                          GST_SEEK_TYPE_SET, 0, // 1 seconds (in nanoseconds)
                          GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE))
    {
      g_print("Seek failed!\n");
    }
    */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

This is my need_data callback, maybe I messed something up with the timestamps…

/* called when we need to give data to an appsrc */
static void need_data (GstElement * appsrc, guint unused, MyContext * ctx)
{
  GstSample *sample;
  GstFlowReturn ret;
  sample = gst_app_sink_pull_sample (GST_APP_SINK (ctx->vid_appsink));
  if (sample) {
    GstBuffer *buffer = gst_sample_get_buffer (sample);
    GstSegment *seg = gst_sample_get_segment (sample);
    GstClockTime pts, dts;

    if (buffer) {
      /* Convert the PTS/DTS to running time so they start from 0 */
      pts = GST_BUFFER_PTS (buffer);
      if (GST_CLOCK_TIME_IS_VALID (pts))
        pts = gst_segment_to_running_time (seg, GST_FORMAT_TIME, pts);

      dts = GST_BUFFER_DTS (buffer);
      if (GST_CLOCK_TIME_IS_VALID (dts))
        dts = gst_segment_to_running_time (seg, GST_FORMAT_TIME, dts);

      /* Copy the buffer so we can adjust the timestamps */
      buffer = gst_buffer_copy (buffer);
      GST_BUFFER_PTS (buffer) = pts;
      GST_BUFFER_DTS (buffer) = dts;
      /* the "push-buffer" action signal does not take ownership, so drop our copy */
      g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
      gst_buffer_unref (buffer);
    }

    /* we don't need the appsink sample anymore */
    gst_sample_unref (sample);
  }
}

I am able to play my 5-second initial video; then the EOS signal is received, and no new data is published to the client, nor is the need_data function called anymore.

Please let me know if you need more source code or logs gathered with GST_DEBUG=X.

I am on Ubuntu 20.04 with GStreamer 1.16.3

Any idea what this message is about?

GLib-GObject-WARNING **: 18:32:08.347: ../../../gobject/gsignal.c:2736: instance '0x7f21d0048270' has no handler with id '5'

The pointer '0x7f21d0048270' is my qtdemux in the generating pipeline.

I checked the qtdemux signals with gst-inspect:

Element Signals:
  "pad-added" :  void user_function (GstElement* object,
                                     GstPad* arg0,
                                     gpointer user_data);
  "pad-removed" :  void user_function (GstElement* object,
                                       GstPad* arg0,
                                       gpointer user_data);
  "no-more-pads" :  void user_function (GstElement* object,
                                        gpointer user_data);

I have registered the callbacks:

 auto demux  = gst_bin_get_by_name (GST_BIN (ctx->generator_pipe), "demux");
  g_print ("demux: %p\n",demux);
  if(demux)
  {
    g_signal_connect (demux, "pad-added", G_CALLBACK (cb_new_pad), (gpointer)"demux");
    g_signal_connect (demux, "pad-removed", G_CALLBACK (cb_pad_removed), (gpointer) "demux");
    g_signal_connect (demux, "no-more-pads", G_CALLBACK (cb_no_more_pads), (gpointer) "demux");
  }

But I still get the warning.

I don’t know about anything else, but if you want to replay the same thing, you shouldn’t be setting the pipeline state to NULL on EOS or changing the location.

You are right @mazharhussain, I replaced the location out of pure desperation. Setting the pipeline to NULL was needed because of the new location; otherwise there was an error like “cannot set location while running” or something similar. I will investigate further and re-read what NULL actually means to the pipeline.
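For reference, filesrc only accepts a new location while it is in the NULL or READY state, so a full NULL transition of the whole pipeline should not be required just to swap files; READY ought to be enough. A sketch of what the EOS handler could do instead (a fragment, not standalone; it reuses the `pipeline` and `vin` variables from the handler above, and "video02.mp4" is just the example file):

```c
/* filesrc rejects a new location above READY, so step down to READY only */
gst_element_set_state (pipeline, GST_STATE_READY);
vin = gst_bin_get_by_name (GST_BIN (pipeline), "vin");
if (vin) {
  g_object_set (G_OBJECT (vin), "location", "video02.mp4", NULL);
  gst_object_unref (vin);
}
gst_element_set_state (pipeline, GST_STATE_PLAYING);
```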

I have checked the documentation for States. It says:

GST_STATE_NULL is the default state of an element. In this state, it has not allocated any runtime resources, it has not loaded any runtime libraries and it can obviously not handle data.

Based on this, it totally makes sense that a transition to NULL is needed to replace the location. I dug a little deeper and am currently wondering what the difference between ! and : in the pipeline description is:

When I run this pipeline:

gchar* pipeline_description = g_strdup_printf("filesrc name=vin location=%s : qtdemux name=demux : h264parse ! appsink name=vid max-buffers=3 drop=false",(char*)user_data);

My debug output looks like this:

Need_data: [done]
Need_data: GST_MESSAGE_EOS: eos
A pad video_0 was removed from us associated with demux
vin: 0x7fd8bc038210
got message: stream-status
got message: stream-status

(video-stream-exterior:7719): GLib-GObject-WARNING **: 07:06:04.357: ../../../gobject/gsignal.c:2736: instance '0x7fd8bc048270' has no handler with id '5'
A new pad video_0 was created associated with demux
No more pads, associated with demux
got message: stream-start
got message: duration-changed
got message: tag
got message: tag
got message: tag
got message: async-done

But when I change the : between qtdemux and h264parse to a !, I get this output:

Need_data: [done]
Need_data: GST_MESSAGE_EOS: eos
A pad video_0 was removed from us associated with demux
vin: 0x7fd710038210
got message: stream-status
got message: stream-status
A new pad video_0 was created associated with demux
No more pads, associated with demux
Error: Internal data stream error.

This is my need_data function:

/* called when we need to give data to an appsrc */
static void need_data (GstElement * appsrc, guint unused, MyContext * ctx)
{
  GstSample *sample;
  GstFlowReturn ret;
  g_print("Need_data: ");
  sample = gst_app_sink_pull_sample (GST_APP_SINK (ctx->vid_appsink));
  if (sample) {
    GstBuffer *buffer = gst_sample_get_buffer (sample);
    GstSegment *seg = gst_sample_get_segment (sample);
    GstClockTime pts, dts;

    if (buffer) {
      /* Convert the PTS/DTS to running time so they start from 0 */
      pts = GST_BUFFER_PTS (buffer);
      if (GST_CLOCK_TIME_IS_VALID (pts))
        pts = gst_segment_to_running_time (seg, GST_FORMAT_TIME, pts);

      dts = GST_BUFFER_DTS (buffer);
      if (GST_CLOCK_TIME_IS_VALID (dts))
        dts = gst_segment_to_running_time (seg, GST_FORMAT_TIME, dts);

      /* Copy the buffer so we can adjust the timestamps */
      buffer = gst_buffer_copy (buffer);
      GST_BUFFER_PTS (buffer) = pts;
      GST_BUFFER_DTS (buffer) = dts;
      /* the "push-buffer" action signal does not take ownership, so drop our copy */
      g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
      gst_buffer_unref (buffer);
    }
    g_print("[done]\n");
    /* we don't need the appsink sample anymore */
    gst_sample_unref (sample);
  }
}

From the missing [done] in the output:

Need_data: [done]
Need_data: GST_MESSAGE_EOS: eos

I assume the RTSP pipeline is still waiting for data. One thing that came to mind was that the pulling (RTSP) pipeline might be the issue. I also changed the duration of the appsrc to unknown by setting:

  gst_app_src_set_duration(GST_APP_SRC(ctx->vid_appsrc), GST_CLOCK_TIME_NONE);

I managed to loop the video once, but it does not rewind a second time. Any ideas?

/* GStreamer
 * Copyright (C) 2008 Wim Taymans <wim.taymans at gmail.com>
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#include <gst/gst.h>
#include <gst/app/app.h>

#include <gst/rtsp-server/rtsp-server.h>

#include <thread>

typedef struct
{
  GstElement *generator_pipe;
  GstElement *vid_appsink;
  GstElement *vid_appsrc;
  GstElement *aud_appsink;
  GstElement *aud_appsrc;
} MyContext;


gboolean bus_callback(GstBus *bus, GstMessage *msg, gpointer data)
{
  using namespace std::chrono_literals;
  GstElement *vin;
  GstElement *pipeline = GST_ELEMENT(data);
  switch (GST_MESSAGE_TYPE(msg))
  {
  case GST_MESSAGE_EOS:
  {
    g_print("GST_MESSAGE_EOS: %s\n", gst_message_type_get_name(GST_MESSAGE_TYPE(msg)));
    if (!gst_element_seek(pipeline,
                          1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
                          GST_SEEK_TYPE_SET, 0,
                          GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE))
    {
      g_print("Seek failed!\n");
    }
    g_print("Finished Seek!\n");
  }
  break;
  case GST_MESSAGE_STATE_CHANGED:
    break;
  case GST_MESSAGE_ERROR:
  {
    gchar *debug;
    GError *error;

    gst_message_parse_error(msg, &error, &debug);

    g_free(debug);
    g_printerr("Error: %s\n", error->message);
    g_error_free(error);
  }
  break;
  default:
    g_print("got message: %s\n", gst_message_type_get_name(GST_MESSAGE_TYPE(msg)));
    break;
  }
  return TRUE;
}

/* called when we need to give data to an appsrc */
static void need_data (GstElement * appsrc, guint unused, MyContext * ctx)
{
  GstSample *sample;
  GstFlowReturn ret;
  //g_print("Need_data: ");
  sample = gst_app_sink_pull_sample (GST_APP_SINK (ctx->vid_appsink));
  if (sample) {
    GstBuffer *buffer = gst_sample_get_buffer (sample);
    GstSegment *seg = gst_sample_get_segment (sample);
    GstClockTime pts, dts;

    if (buffer) {
      /* Convert the PTS/DTS to running time so they start from 0 */
      pts = GST_BUFFER_PTS (buffer);
      if (GST_CLOCK_TIME_IS_VALID (pts))
        pts = gst_segment_to_running_time (seg, GST_FORMAT_TIME, pts);

      dts = GST_BUFFER_DTS (buffer);
      if (GST_CLOCK_TIME_IS_VALID (dts))
        dts = gst_segment_to_running_time (seg, GST_FORMAT_TIME, dts);

      /* Copy the buffer so we can adjust the timestamps */
      buffer = gst_buffer_copy (buffer);
      GST_BUFFER_PTS (buffer) = pts;
      GST_BUFFER_DTS (buffer) = dts;
      /* the "push-buffer" action signal does not take ownership, so drop our copy */
      g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
      gst_buffer_unref (buffer);
      // g_print("[done]\n");
    } else {
      // g_print("[invalid]\n");
    }

    /* we don't need the appsink sample anymore */
    gst_sample_unref (sample);
  }
}

static void ctx_free (MyContext * ctx)
{
  g_print ("ctx_free\n");
  gst_element_set_state (ctx->generator_pipe, GST_STATE_NULL);

  gst_object_unref (ctx->generator_pipe);
  gst_object_unref (ctx->vid_appsrc);
  gst_object_unref (ctx->vid_appsink);
  g_free (ctx);
}

/* called when a new media pipeline is constructed. We can query the
 * pipeline and configure our appsrc */
static void media_configure (GstRTSPMediaFactory * factory, GstRTSPMedia * media, gpointer user_data)
{
  GstElement *element, *appsrc, *appsink;
  GstCaps *caps;
  MyContext *ctx;

  ctx = g_new0 (MyContext, 1);

  gchar* pipeline_description = g_strdup_printf("filesrc name=vin location=%s : qtdemux name=demux ! h264parse ! appsink name=vid",(char*)user_data);

  /* This pipeline generates H264 video. The appsinks are kept small so that if delivery is slow,
   * encoded buffers are dropped as needed.*/
  ctx->generator_pipe = gst_parse_launch(pipeline_description,NULL);

  /* make sure the data is freed when the media is gone */
  g_object_set_data_full (G_OBJECT (media), "rtsp-extra-data", ctx,(GDestroyNotify) ctx_free);

  /* get the element (bin) used for providing the streams of the media */
  element = gst_rtsp_media_get_element (media);

  /* Find the app source video, and configure it, connect to the
   * signals to request data */
  /* configure the caps of the video */

  // TODO identify the caps from the stream
  caps = gst_caps_new_simple ("video/x-h264",
      "stream-format", G_TYPE_STRING, "byte-stream",
      "alignment", G_TYPE_STRING, "au",
      "width", G_TYPE_INT, 2880, "height", G_TYPE_INT, 1860,
      "framerate", GST_TYPE_FRACTION, 30, 1, NULL);



  ctx->vid_appsrc = appsrc = gst_bin_get_by_name_recurse_up (GST_BIN (element), "videosrc");
  ctx->vid_appsink = appsink = gst_bin_get_by_name (GST_BIN (ctx->generator_pipe), "vid");

  /* Set the duration to unknown */
  gst_app_src_set_duration(GST_APP_SRC(ctx->vid_appsrc), GST_CLOCK_TIME_NONE);

  g_object_set (G_OBJECT (appsrc),
    "caps", caps,
    "stream-type", 0,
    "is-live", TRUE,
    "block", FALSE,
    "format", GST_FORMAT_TIME,
    "do-timestamp", TRUE,
    "min-latency", (gint64) 0,  /* min-latency is a gint64 property */
    NULL);

  /* note: appsink has no "stream-type" or "block" property */
  g_object_set (G_OBJECT (appsink),
    "caps", caps,
    "max-buffers", 3,
    "drop", FALSE,
    NULL);


  /* install the callback that will be called when a buffer is needed */
  g_signal_connect (appsrc, "need-data", (GCallback) need_data, ctx);
  gst_caps_unref (caps);


  GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (ctx->generator_pipe));
  gst_bus_add_watch (bus, bus_callback, ctx->generator_pipe);
  gst_object_unref (bus);

  gst_element_set_state (ctx->generator_pipe, GST_STATE_PLAYING);
  gst_object_unref (element);
}

int main (int argc, char *argv[])
{
  GMainLoop *loop;
  GstRTSPServer *server;
  GstRTSPMountPoints *mounts;
  GstRTSPMediaFactory *factory;

  if(argc < 2) {
    g_print("The video filename is missing\n");
    g_print("%s <filename>\n",argv[0]);
    return 1;
  }

  gst_init (&argc, &argv);

  loop = g_main_loop_new (NULL, FALSE);

  /* create a server instance */
  server = gst_rtsp_server_new ();

  /* get the mount points for this server, every server has a default object
   * that can be used to map uri mount points to media factories */
  mounts = gst_rtsp_server_get_mount_points (server);

  /* make a media factory for a test stream. The default media factory can use
   * gst-launch syntax to create pipelines.
   * any launch line works as long as it contains elements named pay%d. Each
   * element with pay%d names will be a stream */
  factory = gst_rtsp_media_factory_new ();
  gst_rtsp_media_factory_set_launch (factory, "( appsrc name=videosrc ! h264parse ! rtph264pay name=pay0 pt=96 )");

  gst_rtsp_media_factory_set_shared(factory, TRUE);
  /* notify when our media is ready, This is called whenever someone asks for
   * the media and a new pipeline with our appsrc is created */
  g_signal_connect (factory, "media-configure", (GCallback) media_configure, argv[1]);

  /* attach the test factory to the /test url */
  gst_rtsp_mount_points_add_factory (mounts, "/test", factory);

  /* don't need the ref to the mounts anymore */
  g_object_unref (mounts);

  /* attach the server to the default maincontext */
  gst_rtsp_server_attach (server, NULL);

  /* start serving */
  g_print ("stream ready at rtsp://127.0.0.1:8554/test\n");
  g_main_loop_run (loop);

  return 0;
}
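As a closing note, one way to sidestep the EOS round-trip entirely is segment seeking: play the generator pipeline with GST_SEEK_FLAG_SEGMENT armed, and the pipeline posts GST_MESSAGE_SEGMENT_DONE instead of EOS when the clip ends; a non-flushing segment seek then jumps back to the start without downstream elements ever seeing EOS. A sketch under that assumption (a fragment, not standalone; `pipeline` is the generator pipeline from the code above):

```c
/* Once the pipeline has prerolled, arm segment mode with a flushing seek: */
gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME,
    (GstSeekFlags) (GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_SEGMENT),
    GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);

/* Then, in the bus callback, loop on segment-done instead of EOS: */
case GST_MESSAGE_SEGMENT_DONE:
  gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME,
      GST_SEEK_FLAG_SEGMENT,  /* non-flushing, so no EOS reaches downstream */
      GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);
  break;
```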
