A Continued Discussion Of Gstreamer And Libretro

Historical context here.

I wrote everything from scratch as far as libretro goes, and I managed to get audio passed from gstreamer to retroarch! :slight_smile:

I am running into a segfault when I try to pass the video from gstreamer to libretro, though.

Another curiosity is the audio sample rate for libretro.
The AC3 audio I am using has a sample rate of 48 kHz, but it seems like libretro needs the sample rate set to half of that.
The issue is that the audio jitters/stutters if I set the rate to 24000 Hz. Looking at the CPU usage, the decoding appears to be using only one core, so I wonder whether GStreamer cannot keep up and causes underruns while running on a single core.

@aplazas If you have any insight into a solution that may be libretro specific, please let me know :slight_smile:

Without seeing your code and being able to reproduce the problem, this is hard to debug, but maybe you're just missing some buffering. Also, GStreamer can do the downsampling from 48 kHz to 24 kHz for you with the audioresample element.
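
As a rough sketch of the idea (the uridecodebin/appsink arrangement and the function name here are just an illustration, not your actual pipeline), you could let GStreamer hand you pre-resampled S16LE stereo:

use gstreamer as gst;

// Let audioconvert/audioresample produce exactly the rate and sample format the
// capsfilter asks for, so the data arriving at the appsink is already 24 kHz
// S16LE interleaved stereo.
fn build_audio_pipeline(uri: &str) -> Result<gst::Element, Box<dyn std::error::Error>> {
    gst::init()?;
    let pipeline = gst::parse_launch(&format!(
        "uridecodebin uri={} ! audioconvert ! audioresample ! \
         audio/x-raw,format=S16LE,rate=24000,channels=2 ! \
         appsink name=audio_sink",
        uri
    ))?;
    Ok(pipeline)
}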

As far as GStreamer goes, the code is the same as in my previous post (I can paste it here as well).

Buffering does sound like a solution. Is that something I can/need to implement in gstreamer?

I would actually prefer to keep it at 48kHz, but if I specify that sample rate in libretro, the audio is sped up like mad.

Do you have any thoughts on why passing the audio actually functions, but passing the video causes a segfault?

Yes. How to do it best depends on your code though :slight_smile:

That sounds like libretro wants a very specific sample rate, and if you give it something else it just plays it at the sample rate it wants anyway.

Not without more details.

Everything is available here.

The line that is causing the segmentation fault is here.

Running thread apply all bt in gdb yielded this:

Thread 11 (Thread 0x7fffd5ffb640 (LWP 348567)):
#0  0x00007ffff5e46a89 in glBindVertexArray () at /usr/lib/nvidia/libGL.so.1
#1  0x00005555558718db in  ()
#2  0x000055555564a957 in  ()
#3  0x00007fffea0c02a1 in lasertronics::gstreamer::output_audio_and_video_streams::{{closure}}::{{closure}}::{{closure}} (appsink=0x7fffd5ffa080) at ~/lasertronics/src/gstreamer.rs:225
#4  0x00007fffea0c78d6 in <alloc::boxed::Box<F,A> as core::ops::function::FnMut<Args>>::call_mut (self=0x7fffd160ca50, args=...) at ~/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/alloc/src/boxed.rs:1335
#5  0x00007fffea0c8e9c in gstreamer_app::app_sink::trampoline_new_sample::{{closure}} () at ~/.cargo/registry/src/github.com-1ecc6299db9ec823/gstreamer-app-0.16.5/src/app_sink.rs:231
#6  0x00007fffea0c5b68 in core::ops::function::FnOnce::call_once () at ~/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/core/src/ops/function.rs:227
#7  0x00007fffea0c8fa4 in <std::panic::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once (self=..., _args=()) at ~/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/panic.rs:322
#8  0x00007fffea0c9db8 in std::panicking::try::do_call (data=0x7fffd5ff9f70 "Đ \377\325\377\177") at ~/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/panicking.rs:379
#9  0x00007fffea0ca05d in __rust_try () at ~/lasertronics/target/debug/liblasertronics.so
#10 0x00007fffea0c9c8d in std::panicking::try (f=...) at ~/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/panicking.rs:343
#11 0x00007fffea0c91bb in std::panic::catch_unwind (f=...) at ~/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/library/std/src/panic.rs:396
#12 0x00007fffea0c8ce6 in gstreamer_app::app_sink::trampoline_new_sample (appsink=0x7fffd160b490, callbacks=0x7fffd160ca00) at ~/.cargo/registry/src/github.com-1ecc6299db9ec823/gstreamer-app-0.16.5/src/app_sink.rs:230
#13 0x00007fffe9d2f5ff in  () at /usr/lib/libgstapp-1.0.so.0
#14 0x00007fffe9fb3168 in  () at /usr/lib/libgstbase-1.0.so.0
#15 0x00007fffe9f829f1 in  () at /usr/lib/libgstbase-1.0.so.0
#16 0x00007fffe9e99a15 in  () at /usr/lib/libgstreamer-1.0.so.0
#17 0x00007fffe9e9d093 in  () at /usr/lib/libgstreamer-1.0.so.0
#18 0x00007fffe9e9d4be in gst_pad_push () at /usr/lib/libgstreamer-1.0.so.0
#19 0x00007fffe9f904af in  () at /usr/lib/libgstbase-1.0.so.0
#20 0x00007fffe9e99a15 in  () at /usr/lib/libgstreamer-1.0.so.0
#21 0x00007fffe9e9d093 in  () at /usr/lib/libgstreamer-1.0.so.0
#22 0x00007fffe9e9d4be in gst_pad_push () at /usr/lib/libgstreamer-1.0.so.0
#23 0x00007fffe9f904af in  () at /usr/lib/libgstbase-1.0.so.0
#24 0x00007fffe9e99a15 in  () at /usr/lib/libgstreamer-1.0.so.0
#25 0x00007fffe9e9d093 in  () at /usr/lib/libgstreamer-1.0.so.0
#26 0x00007fffe9e9d4be in gst_pad_push () at /usr/lib/libgstreamer-1.0.so.0
#27 0x00007fffe98c8d26 in  () at /usr/lib/gstreamer-1.0/libgstcoreelements.so
#28 0x00007fffe9ec3951 in  () at /usr/lib/libgstreamer-1.0.so.0
#29 0x00007ffff3044bd7 in  () at /usr/lib/libglib-2.0.so.0
#30 0x00007ffff3041d21 in  () at /usr/lib/libglib-2.0.so.0
#31 0x00007ffff63853e9 in start_thread () at /usr/lib/libpthread.so.0
#32 0x00007ffff3905293 in clone () at /usr/lib/libc.so.6

To test it, you will need retroarch installed.

Hopefully the readme is clear enough, but to run everything, do:

mkdir test_directory
echo "Dragon's Lair" > test_directory/game_name
cargo build
env VIDEO_PATH=/path/to/test/mkv/file retroarch -f -L ./target/debug/liblasertronics.so ./test_directory

There are a few things wrong with that code, at least:

  • You require RGBA in the caps but handle it later as RGB16 (e.g. RGB_565_BYTES_PER_PIXEL). That’s not going to work well.
  • You enforce GL memory but then just map it normally and upload it again via libretro. That’s double work. Either don’t enforce GL memory on the GStreamer side, or use the GL texture you get there (which will require sharing between the GStreamer GL context and the libretro one).

Also check that the size of the mapped buffer is the same as the size in bytes that libretro expects there. That’s likely the reason for the crash.

When writing unsafe code you need to make sure that the types match, that the sizes expected behind pointers match, and generally that you follow all the rules given by the unsafe API you’re using. The compiler is not going to rescue you there, and instead you’ll get all the joys of writing C code.
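
As a rough sketch of the size check (written against a recent gstreamer-rs, so some method names differ slightly from 0.16, and the callback type here just mirrors your VideoRefreshFunctionPointer), the new-sample handler could verify the mapped size before ever touching the video callback:

use gstreamer as gst;
use gstreamer_app as gst_app;
use gstreamer_video as gst_video;
use std::os::raw::{c_uint, c_void};

type VideoRefreshFn = unsafe extern "C" fn(*const c_void, c_uint, c_uint, usize);

// Pull one sample, map it read-only, and make sure the mapped size is exactly
// height * pitch before handing the pointer to libretro.
fn push_frame(
    appsink: &gst_app::AppSink,
    video_refresh: VideoRefreshFn,
) -> Result<gst::FlowSuccess, gst::FlowError> {
    let sample = appsink.pull_sample().map_err(|_| gst::FlowError::Eos)?;
    let caps = sample.caps().ok_or(gst::FlowError::Error)?;
    let info = gst_video::VideoInfo::from_caps(caps).map_err(|_| gst::FlowError::Error)?;

    let buffer = sample.buffer().ok_or(gst::FlowError::Error)?;
    let map = buffer.map_readable().map_err(|_| gst::FlowError::Error)?;

    // For a packed frame libretro expects pitch == width * bytes_per_pixel,
    // and it will read height * pitch bytes from the pointer you give it.
    let pitch = info.stride()[0] as usize;
    let expected = info.height() as usize * pitch;
    if map.size() != expected {
        eprintln!("frame is {} bytes, libretro will read {}", map.size(), expected);
        return Err(gst::FlowError::Error);
    }

    unsafe {
        video_refresh(map.as_ptr() as *const c_void, info.width(), info.height(), pitch);
    }
    Ok(gst::FlowSuccess::Ok)
}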

I found this documentation regarding using OpenGL.

From what I can tell, I actually need the XRGB8888 format.
Is there a corresponding pixel format in gstreamer?

Thank you for spotting the RGBA discrepancy :slight_smile:

I have done a bunch of reorganisation, and I have configured some environment callbacks in retro_set_environment().

As far as using GL memory, do you mean this line:

// .features(&[&gstreamer_gl::CAPS_FEATURE_MEMORY_GL_MEMORY])

?

If so, that line is commented out, so it is currently not applied.

Do you know what the best approach is between the two options you suggested?

With the crash, how would I tell what the size of the buffer is vs. what libretro expects?
I did not see anything helpful in libretro.h.

All of this truly does show how wonderful writing in pure Rust is :slight_smile:

That’s going to be xRGB or BGRx, depending on how libretro defines it.
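
If it ends up being BGRx (which is what a little-endian XRGB8888 layout usually corresponds to), restricting the appsink caps would look roughly like this (a sketch against a recent gstreamer-rs, not tied to your pipeline):

use gstreamer as gst;
use gstreamer_app as gst_app;

// Ask the appsink for plain system-memory BGRx frames; without the GL memory
// caps feature, map_readable() hands back ordinary bytes that can go straight
// to the libretro video callback (pitch = width * 4 for a packed frame).
fn configure_video_sink(appsink: &gst_app::AppSink) {
    let caps = gst::Caps::builder("video/x-raw")
        .field("format", "BGRx")
        .build();
    appsink.set_caps(Some(&caps));
}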

Get a GL texture directly from GStreamer and set up sharing between libretro’s and GStreamer’s GL context. That’s more difficult though.

I don’t know libretro. In your position I would start reading its code to figure out what exactly it’s trying to do there and how that differs from what you provide it.

Thinking about it, I would like to focus on resolving one of my issues.
How would I implement buffering for the audio? That way I can take something out of the queue.

Also, I did have a thought as to why the audio functions properly, but the video crashes.

There are two audio callbacks in libretro:

pub type AudioSampleFunctionPointer = unsafe extern "C" fn(left: i16, right: i16);
pub type AudioSampleBatchFunctionPointer = unsafe extern "C" fn(data: *const i16, frames: libc::size_t) -> libc::size_t;

However, there is only one video callback:

pub type VideoRefreshFunctionPointer = unsafe extern "C" fn(data: *const libc::c_void, width: libc::c_uint, height: libc::c_uint, pitch: libc::size_t);

The one I am using for audio is AudioSampleBatchFunctionPointer, whose documentation says:

/* Renders multiple audio frames in one go.
 *
 * One frame is defined as a sample of left and right channels, interleaved.
 * I.e. int16_t buf[4] = { l, r, l, r }; would be 2 frames.
 * Only one of the audio callbacks must ever be used.
 */
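
So, as I understand it, my batch call boils down to something like this sketch (the interleaved samples come from the mapped GStreamer buffer; the helper name is just for illustration):

// Same alias as above.
pub type AudioSampleBatchFunctionPointer =
    unsafe extern "C" fn(data: *const i16, frames: libc::size_t) -> libc::size_t;

// One libretro frame is one interleaved L+R pair, so the frame count is half
// the number of i16 samples mapped from the GStreamer buffer.
fn upload_audio(samples: &[i16], audio_sample_batch: AudioSampleBatchFunctionPointer) -> usize {
    unsafe { audio_sample_batch(samples.as_ptr(), samples.len() / 2) }
}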

The documentation for the video callback says this:

/* Render a frame. Pixel format is 15-bit 0RGB1555 native endian
 * unless changed (see RETRO_ENVIRONMENT_SET_PIXEL_FORMAT).
 *
 * Width and height specify dimensions of buffer.
 * Pitch specifices length in bytes between two lines in buffer.
 *
 * For performance reasons, it is highly recommended to have a frame
 * that is packed in memory, i.e. pitch == width * byte_per_pixel.
 * Certain graphic APIs, such as OpenGL ES, do not like textures
 * that are not packed in memory.
 */

My thought is that GStreamer is delivering multiple frames at once for both audio and video, which is fine for the batch upload of audio but fails on the video upload, because that callback only expects one frame.

For example, by simply keeping around a bit of data before you start passing it to libretro, and filling that buffer up at the beginning before starting everything.
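
Just to illustrate the idea (the names and the prefill size are made up, and in practice this would live behind a Mutex, since the appsink callback runs on a GStreamer streaming thread):

use std::collections::VecDeque;

// Roughly half a second of 48 kHz interleaved stereo.
const PREFILL_SAMPLES: usize = 48_000;

struct AudioQueue {
    samples: VecDeque<i16>,
    prefilled: bool,
}

impl AudioQueue {
    fn new() -> Self {
        Self { samples: VecDeque::new(), prefilled: false }
    }

    // Called from the appsink's new-sample callback with interleaved L/R data.
    fn push(&mut self, interleaved: &[i16]) {
        self.samples.extend(interleaved.iter().copied());
        if self.samples.len() >= PREFILL_SAMPLES {
            self.prefilled = true;
        }
    }

    // Called from retro_run(): take out at most `max_frames` frames for the
    // batch callback, but only once the initial fill has happened.
    fn pop(&mut self, max_frames: usize, out: &mut Vec<i16>) {
        out.clear();
        if !self.prefilled {
            return; // still buffering, hand libretro silence for now
        }
        let n = (max_frames * 2).min(self.samples.len());
        out.extend(self.samples.drain(..n));
    }
}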

One buffer contains one raw video frame.

Thank you :slight_smile:
I will look into storing part of the audio in a global vector and see how that goes.

How would I go about doing this?

That’s mostly a question for libretro: what options does it offer for GL context sharing?

You can find an example with GStreamer and glutin here, and there are also various examples on the internet with GStreamer sharing with GTK, SDL, Unity and various others.
