Foundation Strategy: funding decentralised/local-first applications

Hello! Thanks for your consistently cool ideas and initiatives! They make me proud of GNOME, and this initiative really motivates me to help. I haven’t had a chance to get involved in GNOME development yet, but I’d really like to start here. Where could I help? I’m somewhat familiar with Rust and C development, and I could also help with testing. Since I’m new here, I’d appreciate some help getting involved in GNOME development. My main contacts are email (dev@shdwchn.io) and Matrix (@shdwchn10:matrix.org).

Now for my thoughts on this initiative:

In a nutshell, it would be nice if we could end up with some sort of Syncthing-like technology with automatic resolution of synchronization conflicts and tight integration with the GNOME desktop, GNOME Circle, and other community applications. With p2p syncing technology closely integrated into apps and the desktop, we could achieve a user-friendly and private ecosystem, even better than Apple’s. It could also contribute to the popularity of Phosh & GNOME on smartphones.

But I think as the #1 platform, GNOME could do even more:

We could build technology that eliminates the boundaries between devices and makes it almost irrelevant which device the user is currently using, and we could make that technology very modular. Then Bonsai would really shine. As a basis, we could take something similar to the OSI network model:

  1. At layer one, we should have zeroconf P2P communication technology with NAT traversal. That way the devices can form a virtual private network, and we can provide the foundation for all the cool stuff in the layers above. This could either be something written from scratch or, to save time, an already working libp2p/Yggdrasil, and it could be inspired by the work of Syncthing and Tailscale. The remaining work would be to make it easy, in GNOME Settings and Initial Setup, to add other devices via QR codes or code phrases so that sync can start working. Also, devices on the same local network could use auto-discovery, requiring minimal action from the user. I’d love to try to make a mockup of what this could look like in the near future, if the GNOME community is interested.

  2. On the second layer:
    a) We could already have the services for which the layer-one network is sufficient: ssh, RDP (now available as a simple GUI option in GNOME 42), file sharing and remote access, synchronization of selected folders, PipeWire audio and video streaming, etc. Games could also be streamed over the p2p network, similar to Steam Remote Play. We could also think about streaming individual windows and applications, for example via Waypipe+PipeWire. Seamless app launching from other devices could be useful.
    b) It would be cool to bring some traditionally corporate technology into the consumer world, even if partially and on a smaller scale. We could use logind, RDP, PipeWire, Wayland, sssd, LDAP and systemd-homed to have not only a collaborative remote desktop (which we already have), but also: Single Sign-On; a simple and convenient separate remote graphical login session; and a local session with settings, data and an account obtained from another device, as is possible in corporate environments with Windows or RHEL. We already have some of this in GNOME, at least as an “Enterprise Login” option, but we could go further. Imagine a family of three living in a private house. Suppose they each have a desktop/laptop and a phone. It would be handy if the son on the first floor could use his mother’s laptop to quickly open some of his documents through his account, instead of having to go upstairs to his computer. It would also be cool to be able to selectively sync data with other users, so the family, for example, could have a shared music library for everyone. To do this, we could create a system similar to XDG Portals: whereas classic portals implement a user-to-application permission system, portals for Bonsai could implement a user-to-user permission system (a rough sketch of what such a permission record could look like appears after this list).
    c) Network traffic optimization. For example, p2p distribution of OS updates if there is enough bandwidth. This is especially useful for local networks. For already signed or encrypted data, we would not have to limit ourselves to our own nodes, but could use all devices in the whole global network. Here it is important to have built-in upload speed limits and traffic caps, because the user’s connection may be metered. P2P transmission can also have a negative impact on the battery of mobile devices.
    d) Tools to integrate with GNOME technologies, as described in Christian’s blog post “Introducing Bonsai”. This would enable the features I describe under point 3.
    e) Other DEs and organizations could do the same thing as in point 2d. A similar approach is already used in Flatpak: GNOME Runtime, KDE Runtime, Freedesktop Runtime, RHEL Runtime, etc. libadwaita and Granite implement a similar idea.
    f) Same as the previous two points, but something basic and compatible with everything (maybe even IoT, Windows and Android!). Something like the Freedesktop Flatpak Runtime, but a toolset for the Bonsai ecosystem. Perfect for anything that doesn’t require DE-specific integration. We could also add a cross-platform API to communicate various simple facts to applications. Imagine I ordered food or medicine in a delivery app on my phone, and the app reports an approximate delivery time that changes periodically (traffic, cooking speed, courier speed, other orders, customer responsiveness). Looking at the phone to monitor the delivery time can be inconvenient if the user is currently working on a computer. Instead, the app could tell the mobile OS about the delivery time, and the mobile OS could send that information to the computer through Bonsai. Through the same API, devices from smartphones to smart watches could also tell each other their battery level. I think such an API would be very useful for all sorts of things (a rough sketch of it appears after this list).

  3. Layer 3 builds on the capabilities provided by points 2d, 2e, and 2f:
    a) Synchronization of settings, notifications, messages (SMS, IRC, Matrix), calendars, world clocks, alarms, contacts, to-do lists, MPRIS, ssh/gpg keys, the entire GNOME Keyring and more. If we can do PipeWire network streaming and integrate it, we can even route normal voice calls from the phone to the laptop! You can think of this point as the features already offered by {KDE,GS}Connect, EteSync, Apple, and other services/vendors, but with GNOME integration, simplicity and convenience, and the p2p benefits of privacy and independence from Big Tech servers. Some of these features would be better developed within a “GNOME Bonsai Runtime” (GNOME settings sync has no need to integrate with other desktops), and some, like notifications, calendars, contacts and other stuff, would work well in the more generic “Freedesktop Bonsai Runtime”.
    b) Collaboration. For example, Collabora Office, OnlyOffice, or any other applications could offer users collaboration capabilities without being tied to servers.
    c) GNOME applications could also benefit from Bonsai. Imagine Komikku learning to synchronize its library, read chapters, and server settings between desktop and phone. We could have synchronized movie or music playback services, and they would work well across different users if we had the selective user-to-user syncing mentioned earlier.
    d) The imaginary ecosystem we are all talking about here could give huge opportunities to third-party developers, limited only by their imagination: easy, user-friendly distribution of computations on CPUs, GPUs and other accelerators across different machines; a network RAID or Ceph-like file distribution system between user devices to keep data safe (this would have some overhead compared to existing solutions, but would be very simple and user-friendly); and a device with overflowing storage could automatically offload less-used data to other devices.
    e) Access to so much data could make tracker3 even smarter, and it could also open up the possibility of creating a voice assistant that knows really everything about the user while still respecting the user’s privacy. The heavy AI computations could be distributed across devices.
    f) My phone recently broke, and despite the backups, I would still have to take the time to set up a new phone (I didn’t have Google Services). If I had a smartphone with Phosh and good integration with Bonsai, all my data could have been backed up to my computer, where I have plenty of space for it. If everything were synced (the app list, the data inside the apps, and the phone’s own settings), setting up a new phone would take minimal effort.
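
To make the user-to-user permission idea from point 2b a bit more concrete, here is a minimal sketch in Rust (standard library only) of what such a sharing grant could look like. All the names here (`ShareGrant`, `UserPortal`, and so on) are hypothetical, not an existing GNOME or Bonsai API:

```rust
/// What one user allows another user to do with a shared resource.
/// Hypothetical sketch; not an existing GNOME/Bonsai API.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Capability {
    Read,      // e.g. listen to a shared music library
    ReadWrite, // e.g. also add tracks to it
}

#[derive(Debug, Clone)]
struct ShareGrant {
    from_user: String, // the user granting access
    to_user: String,   // the user receiving access
    resource: String,  // e.g. "folder:/home/mom/Music"
    capability: Capability,
}

/// A toy "user-to-user portal" that a Bonsai-like service could consult
/// before letting a remote session or sync request touch a resource.
#[derive(Default)]
struct UserPortal {
    grants: Vec<ShareGrant>,
}

impl UserPortal {
    fn grant(&mut self, grant: ShareGrant) {
        self.grants.push(grant);
    }

    fn is_allowed(&self, from: &str, to: &str, resource: &str, needed: Capability) -> bool {
        self.grants.iter().any(|g| {
            g.from_user == from
                && g.to_user == to
                && g.resource == resource
                && (g.capability == Capability::ReadWrite || g.capability == needed)
        })
    }
}

fn main() {
    let mut portal = UserPortal::default();
    // Mom shares the family music library with her son, read-only.
    portal.grant(ShareGrant {
        from_user: "mom".into(),
        to_user: "son".into(),
        resource: "folder:/home/mom/Music".into(),
        capability: Capability::Read,
    });

    assert!(portal.is_allowed("mom", "son", "folder:/home/mom/Music", Capability::Read));
    assert!(!portal.is_allowed("mom", "son", "folder:/home/mom/Music", Capability::ReadWrite));
    println!("permission checks passed");
}
```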
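
And here is a similarly hypothetical sketch of the simple cross-device “facts” API from point 2f: a small status update (delivery ETA, battery level, …) that one device publishes and the user’s other devices can display. Again, every name is made up for illustration:

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// A simple cross-device "fact": a small, self-describing status update
/// that any device can publish and any other device can display.
/// Hypothetical names; not an existing API.
#[derive(Debug, Clone)]
enum Fact {
    BatteryLevel { device: String, percent: u8 },
    DeliveryEta { app: String, minutes_remaining: u32 },
}

#[derive(Debug, Clone)]
struct FactEnvelope {
    fact: Fact,
    unix_time: u64, // when the fact was published
}

fn publish(fact: Fact) -> FactEnvelope {
    let unix_time = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("clock before 1970")
        .as_secs();
    // In a real system this envelope would be handed to the Bonsai/P2P
    // layer for broadcast to the user's other devices.
    FactEnvelope { fact, unix_time }
}

fn render(envelope: &FactEnvelope) -> String {
    match &envelope.fact {
        Fact::BatteryLevel { device, percent } => {
            format!("{device}: battery at {percent}%")
        }
        Fact::DeliveryEta { app, minutes_remaining } => {
            format!("{app}: delivery in about {minutes_remaining} min")
        }
    }
}

fn main() {
    let eta = publish(Fact::DeliveryEta { app: "FoodApp".into(), minutes_remaining: 25 });
    let battery = publish(Fact::BatteryLevel { device: "phone".into(), percent: 63 });
    // A desktop shell could show these as notifications or status items.
    println!("{}", render(&eta));
    println!("{}", render(&battery));
}
```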

We also can’t forget about the GNOME Foundation’s funding, and the opportunities that other large organizations and companies might be interested in:

  1. GNOME would benefit most from having an ecosystem with GNOME at its heart. There would be lots of services and features that the Foundation would not have to spend money on, because none of them would require any servers, except maybe rendezvous points for p2p (they go by many names: supernodes, discovery nodes, bootstrap nodes, etc.).

  2. Red Hat, Fedora Project, Flathub, LVFS and others could greatly reduce their own costs for network traffic and the servers that deliver it. After all, as stated in paragraph 2c, updates for Flathub applications, LVFS firmware updates, Fedora updates, as well as updates to RHEL and any other Red Hat products could be streamed via P2P. I’m also sure that the technologies described could find other uses within Red Hat’s walls. For example, some networking stuff in enterprise products could be migrated to our zeroconf p2p.

  3. Purism has made great progress in convergence and has done a lot for GNOME. Their current product line includes a desktop, a laptop, and a smartphone. What’s missing to be completely happy? An ecosystem. A really user-friendly ecosystem for which user privacy is not an empty word.

  4. Valve could save a lot of traffic for families with multiple gaming computers or Steam Decks, and if we were crazy enough to do network traffic optimization at scale, Valve would have a CDN made of user devices, without having to spend money on a real CDN. Imagine Steam game updates spreading within an entire city, like BitTorrent. On top of that, they could move Steam Remote Play and Remote Play Together to Bonsai, speed up Steam Dynamic Cloud Sync, and better integrate the Steam Deck with existing Linux gaming desktops. As I understand it, Valve won’t stop with the Steam Deck: they want to at least make a portable VR headset and maybe try to make a console again. Whatever devices Valve makes in the future, they should be interested in a shared ecosystem with synchronization.

  5. Various IoT vendors might be interested in point 2f if we provide ways to integrate IoT devices. A private zeroconf p2p network connecting a smart home with desktops and smartphones is something many people would like to have, instead of the often incompatible and privacy-violating devices on the market today.

  6. Blender and other programs that make heavy use of CPUs/GPUs/etc. and scale well horizontally could use p2p load sharing between machines, as suggested in point 3d. Of course, this only works as long as the network doesn’t become a bottleneck, i.e. it is not suitable for every type of computation.

  7. There could be paid cloud sync services that keep your data available even when only one of your devices is online. With data encryption (like Syncthing’s encrypted folders) and a minimized metadata footprint, we could achieve a good level of privacy and convenience for users. The GNOME Foundation could provide such a service as well; an additional source of funding is always a good thing.

  8. I don’t know how the GNOME community feels about cryptocurrencies, but we could consider this option as a very good source of funding. Both for the GNOME Foundation and others in the ecosystem.

The first generation of consumer operating systems was difficult to manage, often or even always required the use of a terminal, and was unreliable. Second-generation operating systems started to care about user experience and improved overall quality, security, and reliability. GNOME today could open and lead the era of operating systems integrated into a single ecosystem, where user devices actually take advantage of what networking technologies can provide, finally become really smart, and start communicating with each other. We’re on the doorstep of that right now. I think we should focus on usability and privacy first, but we also shouldn’t forget about functionality and capabilities. With the right groundwork, we can create a limitless ecosystem.

P.S. Sorry if this is too much of a fantasy :stuck_out_tongue:

2 Likes

This section seems to be a bit… “first world”? That probably isn’t the best way to put it, but there is no outreach to new users, or even to users in remote locations where internet access may not be universal.

Could you please elaborate on what you mean by “first world”? I didn’t get what you meant.

While I think it might be a good source of donations from the crypto nerds, it would definitely be very controversial. Remember the Mozilla controversy around that? I don’t want GNOME to end up like that, since we unfortunately already have haters.

1 Like

No: cryptocurrencies are just unregulated securities, which mostly enable pyramid/Ponzi schemes that benefit early investors (VCs that provide seed funding and people with enough capital to buy and keep escalating mining rigs), using FOMO and other manipulations to get ever more greater fools to enter the market and provide liquidity for those early backers to cash out. On top of this, cryptocurrencies using proof of work are actively destroying the world’s ecosystem by burning cheap electricity for mining rigs and creating tons of e-waste.

The idea of GNOME endorsing or even using cryptocurrencies goes against the very nature of the project, and it ought to be resisted by anybody in the community.

14 Likes

I have a gut feeling that disability, chronic illness, and health, in general, might fit into the list. I will just post some random thoughts instead of trying to come up with a coherent concept.

  • Tracking symptoms, medication, and other treatments can be very helpful with many conditions. I probably don’t have to explain why this is data you may not want to store in the cloud.
  • Health topics like menstruation can be taboo or at least shame-ridden. Having very easy access to information about such topics could be helpful.
  • Reaching out for help for certain conditions can be hard or even dangerous for some people. I’m mostly thinking about mental health conditions. I have very concrete ideas for something like an emotional emergency intervention app.
  • Having first-aid instructions available quickly and with as few barriers as possible might be something interesting?
  • Not related to local-first, but something that bothers me a lot, so I’m adding it here: many apps and services have a tendency to cater to healthy people who want help with self-optimization. Often that does not cover the reality and needs of disabled people. Also, the perspective of those apps can be dangerously one-sided: for example, health apps that focus on weight loss without asking whether the person using the app might actually have an eating disorder and wants to stay away from calorie tracking of any kind. I feel like every community that can help increase diversity in those areas should try to keep these things in mind.

Maybe those loose thoughts can spark some ideas :slight_smile:

4 Likes

Tracking symptoms, medication, and other treatments can be very helpful with many conditions. I probably don’t have to explain why this is data you may not want to store in the cloud.

Absolutely agree! This sort of data would be really bad in the hands of advertisers and hackers.

Thanks Sophie, that helps a ton! It’s a perspective I didn’t have, and I’m really grateful for bringing these issues to my attention, explaining why they are issues, and showing how local-first apps could help. That’s very useful!

Hi! The point of this project is to introduce a change at the platform level indeed, so app developers targeting the GNOME platform can take advantage of it.

Thank you for bringing up all these organisations and how they could benefit from this. That’s a valuable list to keep at hand for future collaboration in this field. With that said, I believe we should not get too carried away; we should start small and deliver results early, to help people early on and encourage donors to keep going with us :slight_smile:

How realistic is it to do something like this over p2p between GNOME hosts? I think this is a promising look at the future of GNOME and a step towards developing our own ecosystem. GNOME is becoming more convenient and popular every day, but the lack of an ecosystem can be a problem for many parts of society, because you have to invent your own solutions or use third-party services. Yes, GNOME doesn’t have a big commercial parent company at its core, but maybe that could lead us to more modern solutions? Perhaps p2p technologies will allow us to create a self-sufficient ecosystem, first to solve the problem of synchronizing settings and restoring the basic environment, and then for a wider range of connected tasks, up to interaction between devices in a distributed network through an API offered by the environment.

Of course, this should work simply and out of the box, and with a convenient permission system for third-party programs. If the system keeps my configuration up to date even when I move to a new device, that is already a big step towards a new experience. This could also be a new security foundation: accessing a lost remote device when it connects to a network, or doing something about sensitive data when a device is stolen. For example, we could store sensitive data encrypted even locally, and allow access to it only after the user has been verified in a decentralized network.

My point is that perhaps we should think about a secure ecosystem, at least for critical data and configuration data. This would allow GNOME to combine the best of both worlds: the ability to work both online and offline, while at the same time getting rid of centralized servers (at least partially). However, there are many things to consider in the architecture, such as security, reliability, simplicity, scalability, and modularity.

I’ve been thinking about how to get to a future where all your personal devices magically stay in sync with each other without any central service, no matter if that service is a cloud, a hosted server, a local server or just one device.

For example, the model in Bonsai still requires a central service, and it brings with it all the failure modes of a cloud, except that now you are responsible for your cloud.

Looking at this naively is a bit of a trap. There are already people in this thread who want a system that synchronizes files, but as you will hopefully see, that’s a horrible idea.

Assume that we have three devices A, B and C. We don’t want a central service, which means that each device can make changes independently of any other device, so we will end up with state Sa on device A, Sb on device B, and so on.

In a system with a central service this doesn’t happen. Every device is in the same state, but it requires the central service to be available for any device to make changes. If you can’t reach the Google servers, you can’t make changes to your Google Docs.

So how do we solve that issue without a central service? Can we just take Sa and send it to all other devices? Sure, that’s possible, but you lose the changes made to Sb and Sc. Typically, systems which synchronize files just ask the user which of the states they want to keep. You still lose changes, but at least you can choose what you want to lose. This is what’s called a conflict, and choosing one state is a conflict resolution strategy.

As users, what we really want is some magic which takes Sa, Sb and Sc and gives us a state Sabc which contains all the changes of all states. What exactly does this new state look like? Well, it depends on the application. For example, consider a text editor where you delete a word on one device and change the same word on another device: what’s the correct combined state? There isn’t an obviously correct answer to that question.

Either way, we know how to construct data types in such a way that if all nodes see all the changes made on all other nodes, the resulting state is the same on all nodes, and that no matter which changes any node sees, the state is a valid state. These data types are called CRDTs and they are amazing!
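
To make that a bit more concrete, here is a minimal sketch of one of the simplest CRDTs, a last-writer-wins register, in plain Rust (standard library only). It only illustrates the convergence property described above; a real application (say, a text editor) would need much richer CRDTs:

```rust
/// A last-writer-wins (LWW) register: one of the simplest CRDTs.
/// Each write is stamped with (logical_time, node_id); merging two
/// replicas keeps the write with the highest stamp, so every node
/// that has seen the same set of writes ends up in the same state.
#[derive(Debug, Clone, PartialEq)]
struct LwwRegister {
    value: String,
    logical_time: u64,
    node_id: u64, // tie-breaker so concurrent writes merge deterministically
}

impl LwwRegister {
    fn new(node_id: u64) -> Self {
        Self { value: String::new(), logical_time: 0, node_id }
    }

    fn set(&mut self, value: &str, logical_time: u64) {
        self.value = value.to_string();
        self.logical_time = logical_time;
    }

    /// Commutative, associative, idempotent merge: the heart of a CRDT.
    fn merge(&mut self, other: &LwwRegister) {
        if (other.logical_time, other.node_id) > (self.logical_time, self.node_id) {
            *self = other.clone();
        }
    }
}

fn main() {
    // Devices A and B write concurrently while offline...
    let mut a = LwwRegister::new(1);
    let mut b = LwwRegister::new(2);
    a.set("written on A", 5);
    b.set("written on B", 5); // same logical time: a genuine conflict

    // ...then exchange their states in any order.
    let mut a_merged = a.clone();
    a_merged.merge(&b);
    let mut b_merged = b.clone();
    b_merged.merge(&a);

    // Both replicas converge to the same value without a central service.
    assert_eq!(a_merged, b_merged);
    println!("converged to: {:?}", a_merged.value);
}
```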

I believe that CRDTs are the only sensible choice for any application synchronization which doesn’t want to require central services. CRDTs are unfortunately something that has to be implemented in each app specifically because the choice of CRDTs depends on the specific data model of the app.

CRDTs require a network layer which provides Reliable Causal Broadcast and to reach the goal of not relying on central services the system has to work with p2p connections.
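
As a rough illustration of the “causal” part, the classic building block is a vector clock: a message is only delivered once everything it causally depends on has been delivered. A minimal, standard-library-only sketch of just the delivery check (ignoring reliability, encryption, and the actual networking):

```rust
use std::collections::HashMap;

type NodeId = u64;
/// One counter per node: how many messages from that node we have delivered.
type VectorClock = HashMap<NodeId, u64>;

struct Message {
    sender: NodeId,
    seq: u64,          // sender's sequence number for this message
    deps: VectorClock, // sender's clock just before sending
}

/// Causal delivery rule: deliver `msg` only if it is the next message from
/// its sender and we have already delivered everything it depends on.
fn can_deliver(local: &VectorClock, msg: &Message) -> bool {
    let next_from_sender = local.get(&msg.sender).copied().unwrap_or(0) + 1;
    msg.seq == next_from_sender
        && msg.deps.iter().all(|(node, count)| {
            *node == msg.sender || local.get(node).copied().unwrap_or(0) >= *count
        })
}

fn deliver(local: &mut VectorClock, msg: &Message) {
    *local.entry(msg.sender).or_insert(0) = msg.seq;
}

fn main() {
    let mut local = VectorClock::new();

    // Node 2's message depends on having seen message 1 from node 1.
    let from_node1 = Message { sender: 1, seq: 1, deps: VectorClock::new() };
    let mut deps = VectorClock::new();
    deps.insert(1, 1);
    let from_node2 = Message { sender: 2, seq: 1, deps };

    // Arrived out of order: the dependent message has to wait.
    assert!(!can_deliver(&local, &from_node2));
    assert!(can_deliver(&local, &from_node1));
    deliver(&mut local, &from_node1);
    assert!(can_deliver(&local, &from_node2));
    deliver(&mut local, &from_node2);
    println!("delivered in causal order");
}
```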

The first issue with p2p connections is that they require some sort of bootstrapping so that nodes can find each other. In the beginning this can be a central service, but there are distributed ways to achieve the same thing, using DHTs or Tor.

The second problem with p2p connections is NATs. This is an unsolvable, best-effort problem. You can hole-punch through a lot of NATs using UDP, but there are NATs where that will not succeed, and then you either give up or use a proxy to still get a connection going.

Other issues then are encryption and making the broadcasting reliable and causal.

Realistically, it would be good to leverage projects which already deal with all this complexity, like libp2p.

So, enough of the technicalities of how to implement all of this. What would the future look like from a user’s perspective?

There should be a new settings panel for devices. You should be able to enroll a device into your personal devices group either via QR code or with some temporary key/code.

Applications should then synchronize their data to all the devices in the personal devices group. Browser tabs, settings, etc. should all be synchronized. A music player should sync the metadata, ratings, etc., but not the actual music. A markdown editor should synchronize whole documents.
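
As a purely hypothetical sketch of the two pieces described above, the personal devices group and the per-app decision about what to sync, in the same plain-Rust style (none of these names are an existing GNOME API):

```rust
use std::collections::HashSet;

/// Which parts of an app's data follow the user across devices.
/// Hypothetical; none of this is an existing GNOME API.
#[derive(Debug, Clone, Copy)]
enum SyncPolicy {
    Everything,   // e.g. a markdown editor syncing whole documents
    MetadataOnly, // e.g. a music player syncing ratings, not audio files
    Nothing,      // purely local data
}

/// The "personal devices group" a user enrolls devices into.
#[derive(Debug, Default)]
struct PersonalDeviceGroup {
    devices: HashSet<String>, // device identifiers added via QR / temporary code
}

impl PersonalDeviceGroup {
    /// Called after the user confirms the QR code or temporary pairing code.
    fn enroll(&mut self, device_id: &str) {
        self.devices.insert(device_id.to_string());
    }

    /// Which devices should receive this app's (policy-filtered) payload.
    fn replication_targets(&self, policy: SyncPolicy) -> Vec<&String> {
        match policy {
            SyncPolicy::Nothing => Vec::new(),
            // Everything and MetadataOnly go to all devices; the policy
            // decides how much data is in the payload, not where it goes.
            SyncPolicy::Everything | SyncPolicy::MetadataOnly => self.devices.iter().collect(),
        }
    }
}

fn main() {
    let mut group = PersonalDeviceGroup::default();
    group.enroll("laptop");
    group.enroll("phone");

    println!(
        "markdown editor syncs to {} devices, scratch pad to {}",
        group.replication_targets(SyncPolicy::Everything).len(),
        group.replication_targets(SyncPolicy::Nothing).len()
    );
}
```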

In the even more distant future you should also be able to add other people to the system, with whom you can then collaborate on, e.g., a document.

There is a lot more to all of this but it’s getting too long already.

tl;dr: Applications implement everything they want to sync with CRDTs (we can build a library for that). The system has to provide a network layer to send messages between devices.

All of this is doable today. It just needs someone doing it.

1 Like

Indeed, one of the ideas behind this whole project is to develop an ecosystem and a platform developers can target to create local-first applications that can stay in sync.

Thanks for bringing these technical considerations to the brainstorming! Those are indeed challenges some of us are familiar with and keep in mind. CRDTs are certainly one of the solutions we are considering… but we’re not there yet!

At the moment, we are building the strategy to target populations who have either no Internet access, limited access, or untrusted access (e.g. in countries where governments may be snooping on what people are doing). The next step is to conduct a user study to get a better idea of the actual problems and needs these people have.

Once we have a good vision of their functional needs, we can take a step back and think of a solution that solves their problems and is extensible for future milestones, get designers to come up with a user-friendly way of doing it, and finally get it implemented.

More to come on this initiative (hopefully) shortly :slight_smile:

  1. I really like the term “local first”.

  2. It’s important to focus on specific app use-cases, one at a time. Generic tools like the old Conduit are not a great end-user experience.

Also, quick nod to a free software app already doing this: Joplin Notes, which can sync notes via various providers (Dropbox, Nextcloud, S3, WebDAV, an NFS mount… or their own “Joplin Cloud” server), with end-to-end encryption.

1 Like

I really like the term “local first”.

As far as I know “local first” is an existing term and was not just made up for this proposal, but this was the first time I heard about it. To me it doesn’t seem to do a good job of communicating what it is about to people not already familiar with the term and not willing to read beyond headlines. When I first read “local first” my initial assumption was that this is about applications that keep data local to the machine they are running on and maybe add syncing as an afterthought or not at all. This is quite the opposite of what this is about.

I would prefer some term that lets people know that this is about a specific way of syncing, rather than making syncing seem like something that is not part of the target features.

TLDR: local-first is fundamental. Peer-to-peer synchronization is a last resort for situations without internet access. With full or degraded internet access, synchronization with decentralized services is important (preferably hosted by small human-scale organizations).

Having applications built local-first is the insurance that people can still access their data in a world without (at least permanent) internet access. I think indeed that this is a fundamental principle.

In the past years I have spoken with politically engaged people and activists, in a world with no lack of internet access (yet), but no one directly mentioned local-first applications.
The need they expressed to me was to have efficient collaboration tools that they could trust.

  • The key word here is collaboration, because this is mostly what activists do: they organize things, they communicate.
  • Some of them feared for their privacy; some of them are geeks, but most are people with no real interest in technology, or even with technophobia. Their tools need to be efficient, because nowadays a lot of political communication and collaboration is done through Facebook groups, WhatsApp, Google Drive or Google Calendar. A decade after Snowden, activists still don’t know better than the centralized, untrustworthy but user-friendly services of tech giants to organize themselves.
  • They need to trust their tools, because some of them told me that they feared using tech giants’ tools. In the end they sometimes work unplugged (in-person talks, handwritten communication, etc.); this is safer but far less efficient.
    Local-first is a fundamental principle, but synchronization must not be forgotten. The former has a lot less value without the latter, as synchronization brings collaboration, and thus usage.

Synchronization, but with whom?

In a world without internet, easy peer-to-peer data exchange from one person to another might be a good thing to have - I think of serverless synchronization through a local network or via Bluetooth - though I have never met someone who confirmed this need to me.

In a world with permanent or even intermittent internet access, the peer-to-peer synchronization approach sounds unrealistic to me.
Serverless synchronization from device to device is cumbersome when you want to communicate with a group of people. Local political groups involve dozens of people; it is not always realistic, nor safe, to gather everybody in the same room so you can exchange data with them.
Activists may not have the time or interest to learn self-hosting skills, as they are usually pretty busy doing activism.
Multiplying devices does not sound ecological or resource-efficient either (for example, if everyone hosts their own Raspberry Pi with YunoHost and a mail stack).

If centralization and distribution are not the best paradigms for cooperation, that leaves decentralization.
Small human-scale organizations hosting services for a community of a few hundred people seem like a good compromise IMHO.
I think of something like what the CHATONS initiative from Framasoft tries to encourage.
Organizations of this size would be large enough that people can pool the costs of the infrastructure and the maintenance, and not everybody needs to be an expert.
Yet they would be small enough that users can talk to, meet and trust the people who are hosting their data.

From a technical point of view, tackling the problem from this side (instead of serverless synchronization) has the benefit that there are already a lot of existing standards to use (IMAP, SMTP, WebDAV, CalDAV, CardDAV, OIDC, etc.).

I would love GNOME to encourage the development of human-scale hosting organizations by supporting and developing open synchronization standards.

2 Likes

I would love GNOME to encourage the development of human-scale hosting organizations by supporting and developing open synchronization standards.

This sentence reminded me of another existing player here: the Small Technology foundation, who have a concept of “Small Web”. The focus is on web rather than desktop, but the goals are very much overlapping:

1 Like

“Local first” is a term coined by web developers, where it makes a lot more sense than in the context of an operating system. Instead of storing app state on a server, local-first apps store it locally (e.g. in localStorage) and then replicate that state to other clients using CRDTs.

It’s not only about syncing though but also about collaboration. I agree that “local first” is not the best term either way.

I am so against this. Any kind of hosting requires new machines and resources to keep the machines operating, and it creates a central point of failure. People have tried this over and over without any success. Requiring extra hardware fails; requiring people to spend money on a service fails.

The last ~5 years people have developed the technology to make this device to device synchronization and collaboration without central services and extra hardware possible. Not using this would be a huge mistake.

2 Likes

Note that I am not against the peer-to-peer synchronization principle. As I said it is great to have when internet access is broken. However I do not understand how it would scale to match the usage people have today, with abundant internet access and numerous online social interactions.

I may not have understood the whole picture though, so I am interested to hear fictional user stories where a large group exchanges data without any kind of hosting.
Say one needs to share a document with a dozen people around the country. How do you find, on the internet, the peers you want to synchronize your data with, without registering them on any kind of hosted service? Should you meet them IRL instead?

Once you have a completely decentralized system it’s easy to add back some degree of centralization.

For example if you have two devices which will never be able to connect directly to each other, maybe because they are never turned on at the same time or because the firewalls between them prevent this kind of access, you can introduce a third node which acts as a proxy-buffer for messages.

If you have a group of hundreds of people, bandwidth becomes a problem if every change has to be sent to every other peer. With a suitable network and encryption layer, it’s possible to add nodes which are responsible for fan-out on behalf of clients, to reduce their bandwidth.

All of those extra nodes could be your own devices, your peers’ devices or a third-party service. Starting with the decentralized p2p approach makes you way more flexible and also allows you to give up some degree of decentralization. The other way around is either impossible or at least much harder.

That’s also a problem with hosted services: you need some kind of identifier for your peers and have to share it somehow. Granted, p2p systems often have identifiers which are not really usable by humans, but this is just another example where introducing some degree of centralization could improve usability.

Gosh, this has been quite heated. To cool things down, may I talk about something?

You know, honestly, there is an absence of local-first and open source apps on proprietary platforms like Windows or Android.

The ecosystems there are dominated by commercial apps that are usually heavily influenced by cloud-first thinking, require online accounts where it doesn’t really make sense to have them, have poor privacy practices and, well, all that good stuff.

On Android, it’s even worse. Try searching for “QR Code Scanner” or “WhatsApp Status Downloader”: most of the apps are proprietary and even sketchy!

I wonder if the GNOME project and the foundation would like to promote an open source and local-first ecosystem in Windows and Android. Our Circle or core apps are too focused on Linux at the moment, and GDK hasn’t been ported to Android yet.

I think this is really important since I’m pretty sure most journalists out there still use Windows for their stuff.

1 Like