g_file_load_contents doesn't get the whole content on Windows

Hello everyone,
I’m using glib for a multiplatform project:

I have an issue with g_file_load_contents, which does not seem to work correctly with URIs on Windows. The following function works well on GNU/Linux, but on Windows I only get the last line, for example with this URL: http://cdsweb.u-strasbg.fr/cgi-bin/nph-sesame/-oI/A?M45

static gchar *fetch_url(const gchar *url) {
	GFile *file = g_file_new_for_uri(url);
	GError *error = NULL;
	gchar *content = NULL;

	if (!g_file_load_contents(file, NULL, &content, NULL, NULL, &error)) {
		// some outputs
		g_clear_error(&error);
	}
	g_object_unref(file);
	return content;
}

EDIT: my assumption now is that the URL could be the issue, but why?

Ok, I did some tests.

The URLs I use are dynamic.
Example: http://cdsweb.u-strasbg.fr/cgi-bin/nph-sesame/-oI/A?M45

If I run wget, I get:

wget http://cdsweb.u-strasbg.fr/cgi-bin/nph-sesame/-oI/A?M45
--2020-11-19 15:47:44--  http://cdsweb.u-strasbg.fr/cgi-bin/nph-sesame/-oI/A?M45
Resolving cdsweb.u-strasbg.fr (cdsweb.u-strasbg.fr)... 130.79.128.30
Connecting to cdsweb.u-strasbg.fr (cdsweb.u-strasbg.fr)|130.79.128.30|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/plain]
Saving to: 'A?M45.1'

But if I use this little piece of standalone code on Windows:

#include <glib.h>
#include <gio/gio.h>
#include <glib/gstdio.h>
#include <glib/gprintf.h>

// Compile with gcc main.c `pkg-config --cflags --libs gio-2.0` -o test

int main(void) {
  const gchar *url = "http://cdsweb.u-strasbg.fr/cgi-bin/nph-sesame/-oI/A?M45";
  GFile *file = g_file_new_for_uri(url);
  GError *error = NULL;
  gchar *content = NULL;

  if (!g_file_load_contents(file, NULL, &content, NULL, NULL, &error)) {
    /* g_error() would abort before the cleanup below runs,
     * so report the failure and bail out instead. */
    g_printerr("%s\n", error->message);
    g_clear_error(&error);
    g_object_unref(file);
    return 1;
  }
  g_object_unref(file);

  g_printf("%s\n", content);
  g_free(content);
  return 0;
}

The buffer returned is as if I had fetched the URL without the A?M45 part. But this happens only on Windows; on GNU/Linux it works fine.

Don’t use g_file_load_contents() with HTTP URLs. I recommend using libsoup to download data from an HTTP server.

Hello, thanks for the answer.
What is the problem with g_file_load_contents?

GFile is not really an abstraction for downloading data from random URIs: it’s heavily geared towards a file-system-like use, e.g. browsing volumes, enumerating files, querying metadata, etc. Think WebDAV, not a web browser or wget.

Additionally, GFile is implemented through extension points, mostly to avoid additional/circular dependencies in GLib. On Linux this role is deferred to GVfs, which uses libsoup under the hood anyway. On Windows there’s an HTTP file implementation, but there’s no guarantee that URI support is fully compliant; it’s likely only enough to query the metadata associated with a file accessible over an HTTP connection.

If you know you are going to retrieve a specific file from an HTTP server, using libsoup is more appropriate, as you cut out all the middle layers.


I think you did not finish your sentence :).
