It’s impressive how the discussion around bundled applications for the GNU/Linux Desktop has flared up over the last few months (and especially the last few weeks).
It’s especially interesting because:
- The problem is not new.
- The solutions that have attempted to tackle the problem in the past have been ignored (both by us developers and by distributions).
The TLDR
First, let me try to subjectively summarize the problem: historically, the resources we get on GNU/Linux come from the distributions. Everything: executables, libraries, icons, wallpapers, etc. There have been alternatives to all of those, but none has flourished as a globally adopted solution.
This guarantees that everyone using a distribution will have access to the resources the distribution can offer. The more powerful the distribution is, the more we get. There are limitations nevertheless, so some restrictions have to be put in place. The ensemble of limitations and technologies adopted effectively defines the user’s experience.
This works. It has worked for years and, given that the technology is in place, it could easily keep working. As with most engineering solutions, there are drawbacks, and addressing them properly can bring real benefits. It seems like now is the moment to review the situation. Let’s enumerate some of the problems we have nowadays:
- We have users running really old versions of our software, with issues we have already fixed in versions they can’t use.
- It’s really hard for us to get GNU/Linux users to test unstable versions of our software.
- We have users who want fresh versions of some software, but not for the whole system.
There have been many attempts to fix these; some easily come to mind: Arch Linux’s AUR (with yaourt), Ubuntu’s PPAs, big-tar application packages, openSUSE’s OBS, and possibly others.
Far from showing the maturity of the Linux desktop, what this depicts is the deep fragmentation we’re in: we have come up with different solutions that break the established distribution paradigm by lowering its restrictions and treating the resources offered as unsupported (often tainting the whole system).
What has appeared recently is sandboxing. It’s especially interesting because by letting users execute arbitrary binaries we increase the exposure of their systems; we’d be jumping from our distributions’ nest into the lion’s den. As always, sandboxing creates new challenges: applications (or frameworks) need to adapt, often introducing a user-interaction fence (e.g. a popup asking whether you let Kamoso access the webcam). For what it’s worth, that’s not new: Android does it, OS X does it, Windows does it (from the Store), Chrome OS does it, etc.
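To make that user-interaction fence concrete, here is a minimal sketch of how a sandboxed application can request webcam access through the xdg-desktop-portal D-Bus service, so that the permission popup is shown by the desktop rather than by the application. This assumes PyGObject and a running portal backend, and it is only an illustration of the mechanism, not how Kamoso actually does it:

```python
import gi
gi.require_version("Gio", "2.0")
from gi.repository import Gio, GLib

# The portals live on the session bus under a well-known name.
bus = Gio.bus_get_sync(Gio.BusType.SESSION, None)
camera = Gio.DBusProxy.new_sync(
    bus, Gio.DBusProxyFlags.NONE, None,
    "org.freedesktop.portal.Desktop",   # portal bus name
    "/org/freedesktop/portal/desktop",  # portal object path
    "org.freedesktop.portal.Camera",    # camera-access interface
    None)

# AccessCamera() asks the desktop to show the permission dialog;
# the sandboxed app never opens /dev/video* directly.
reply = camera.call_sync(
    "AccessCamera",
    GLib.Variant("(a{sv})", ({},)),     # no extra options
    Gio.DBusCallFlags.NONE, -1, None)
print("permission request handle:", reply.unpack()[0])
```

The important design point is that the decision lives outside the sandbox: the application only ever learns whether the request was granted.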
Now where are we?
We need to make decisions about GNU/Linux’s future. Or at least, we need to understand what Plasma users will have available. So far, most of the noise comes from the big players in the business trying to differentiate their products, which means incompatible formats.
Without an agreed unified solution, we’ll have to assume we’ll end up with snappies, flatpaks, and AppImages installed, as well as applications from the distribution. Then it’s just a matter of:
- Presenting it properly, so that the user knows the risks taken by executing an application (!)
- Making sure we don’t lose many features to sandboxing.
Still, one of the good things about this new approach is that it shouldn’t be necessary to have several people dedicated to building every single application and component. If the solution is to add three more systems that each need dedicated people, we’re not really moving forward.
Building
As soon as we’ve decided how we want to work, the interesting stuff can start to appear. If this is properly engineered, it can bring really interesting possibilities that today we hardly ever find:
- Newer versions of applications on administered systems (e.g. universities).
- Enabling stable distributions on professional environments.
- Beta channels.
- Binary application 3rd party extensions.
- Provision of debug symbols (some distros don’t offer them).
To finish off the post, a note for the dreamers:
How much easier would all that be in a microkernel architecture?
We need you!
Of course this will be a long journey, and we need your collaboration. This year in Randa we started working on all these problems from several different angles. It’s important for the KDE Community to have your support, so we can keep providing quality software. Consider donating; it doesn’t need to be a lot, everything counts.
I love KDE, but each time I update in the hope of problems getting fixed (I’m on openSUSE Tumbleweed), I not only get old stuff fixed, I also get new bugs, sometimes worse than the old ones. Like this latest update that breaks keyboard input whenever I disconnect my ThinkPad from the docking station and reconnect it. I think the solution is for KDE to improve its quality control, radically. I’d rather have a stable release every year than a broken release every month.
Hi Rsh,
I agree, help is very welcome!
I am not convinced that these bundled packages will really be an advantage across the board. Maybe this is because I already use Gentoo, where many of the above-mentioned problems are already “solved”, such as:
– beta channels
– running a stable system with a few fresh (bug-fixed) packages.
– debug symbols
– testing unstable or even git versions of software (however, you still have to take care not to mess up your configuration files, but I guess that is a problem with the bundles too).
– having slotted packages which can be installed in parallel
– sandboxing, which is also not limited to the new bundled package formats.
I see the advantage of this type of bundle especially for:
– proprietary vendors who want to keep their code out of the public repositories
– niche applications that are not packaged by your distribution.
– testing on binary distributions which do not have beta packages.
– not having to solve dependencies on the client side.
What was not discussed here are the problems that come with bundles, problems that are already solved today:
– bundles ship a lot of software that now needs to be maintained by the application developer. If library X has a security problem and needs an update, you rely on every single application to ship correct updates in order to keep your system safe. So as a system administrator I now have to rely on many different developers to do their thing right. Many places, and it’s easy to overlook one of them. Questions like “did package XY already include the updated library Z?” will be your daily bread. This will NOT make it easier to keep a secure system as an administrator, because bundled applications will not free you from understanding which components are involved. Ironically, it makes the situation worse, as it removes dependency transparency: nowadays I see that library Z is updated and I can be sure that every other piece of software will use it, without even having to know which packages actually depend on it. (A toy illustration of what the bundled world looks like follows this list.)
– package size (every app needs to deliver everything)
– memory usage (no more library sharing)
– one less filter mechanism to prevent malicious software (the distribution package maintainer). Even if the developer of a piece of software is not knowingly doing something insecure, he now needs to track the development of every single component he relies on! In the current situation, the package maintainers only need to follow the security issues of their specific package, not of all the packages it depends on.
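Till’s dependency-transparency point can be made concrete. On a classic distribution, one query to the package manager answers “who uses library Z?” (for example `dnf repoquery --whatrequires libpng` on Fedora); with bundles there is no central database, so the administrator has to crawl every bundle. A toy sketch of that crawl, where the soname and the bundle locations are purely illustrative assumptions:

```python
import fnmatch
import os

# Illustrative values: a hypothetically vulnerable library and typical
# places where bundled applications keep their private copies.
VULNERABLE_SONAME = "libpng16.so.16"
BUNDLE_ROOTS = [
    "/var/lib/flatpak/app",
    os.path.expanduser("~/Applications"),  # a common AppImage directory
]

# Without a package database there is nothing to query, so we walk
# every bundle's tree looking for private copies of the library.
for root in BUNDLE_ROOTS:
    for dirpath, _dirs, files in os.walk(root):
        for name in fnmatch.filter(files, VULNERABLE_SONAME + "*"):
            print("bundled copy:", os.path.join(dirpath, name))
```

And even when a copy is found, only the application’s developer can actually ship the fix.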
Overall, I see the advantages in some cases, but I do not think that this is the right way to go for general-purpose use. It weakens the position of free software in the distributions (as proprietary vendors will now be able to deliver software on a par with it) and introduces many other issues. Many problems are just moved onto the developers.
I completely agree with Till. I cannot see bundled applications ever being more than a last-resort supplement to software shipped through regular distribution channels.
I’m on openSUSE Tumbleweed and also a packager of a few minor things there. OBS is really easy to use and it supports a variety of OSs (including Arch itself, the Ubuntus, Debian, etc.), so I’d rather push for broader adoption of OBS as a solution instead. Of course, it would be nice if the other distributions also contributed hardware to OBS, since it can be quite overloaded at times.
Hi,
All that application bundles are is a subset of static linking. I see this as crazy; “DLL hell” will ensue, IMO.
It already happens: Steam bundles its own runtime (OpenGL & libc included). What happens when glibc gets updated on the host system? The bundled graphics drivers stop working (among other things):
https://wiki.archlinux.org/index.php/Steam/Troubleshooting#Steam_runtime_issues
I do see where the desire for app bundles comes from, but IMO the push should be to “reform” distributions to be more “rolling”.
“The solutions that have attempted to tackle the problem in the past have been ignored” because the whole concept of “bundled applications” is fatally flawed. It is a huge waste of space everywhere (download bandwidth, disk space, even RAM, because you lose the sharing of read-only sections of system-wide shared libraries) and a security nightmare (who will keep the bundled libraries up to date? And how long will the updates take to trickle down? Bandwidth requirements are also going to limit the respin rate for changed dependencies).
This horrible “idea” was discarded again and again for a reason! Sadly, it keeps getting floated over and over again by people who just don’t get it.
“How much easier would all that be in a microkernel architecture?”
I don’t know. How would it be? Please educate me here.
Application bundles certainly have their disadvantages, and they are certainly not the future solution for every software-deployment problem. They do, however, fill a useful niche: shipping up-to-date applications on not-so-up-to-date systems.
Some counter-arguments I just don’t understand, though.
– “waste of space”. When you open amazon.com, you cause 10 MB of traffic (try it!). Your 4k wallpaper is probably 30 MB in size. You probably have 500 MB of locales installed because your system just pulled them all in. You have 2 TB of movies stored on your external hard drive. But downloading and storing 100 MB to run a program you use all day is a problem? Don’t be ridiculous.
– “waste of RAM”. RAM usage by shared libraries has _never_ been a major factor compared to application runtime data, and with RAM as cheap as it is today, it just doesn’t matter. If the sharing works, cool, but if it doesn’t, it’s really not an argument that should tip any decision. (If you want numbers instead of opinions, see the sketch after this list.)
– “security nightmare”. How many of the applications you run are actually security-critical? Probably your web browser and email client, and maybe a chat application. But not your IDE, your photo-editing program, or your file manager. Before you start discussing what kinds of security flaws they might have: we don’t publish security updates for those _anyway_. So unless you are on a rolling-release distro, you are not getting any security updates for those applications. And I have yet to see the horde of users losing their data to a Dolphin security issue.
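Whichever side of the RAM argument you are on, the kernel makes it measurable. A small sketch (Linux-only; `smaps_rollup` needs kernel 4.14+) that splits a process’s resident memory into shared pages, which is where library sharing shows up, and private pages:

```python
import sys

def mem_breakdown(pid="self"):
    """Return (shared_kb, private_kb) resident memory of a process."""
    shared = private = 0
    # smaps_rollup sums the per-mapping counters for the whole process.
    with open(f"/proc/{pid}/smaps_rollup") as fh:
        for line in fh:
            key, _, rest = line.partition(":")
            if key in ("Shared_Clean", "Shared_Dirty"):
                shared += int(rest.split()[0])   # values are in kB
            elif key in ("Private_Clean", "Private_Dirty"):
                private += int(rest.split()[0])
    return shared, private

if __name__ == "__main__":
    pid = sys.argv[1] if len(sys.argv) > 1 else "self"
    shared, private = mem_breakdown(pid)
    print(f"pid {pid}: shared {shared} kB, private {private} kB")
```

Run it against the same application packaged both ways, and the difference in the shared column is exactly what this argument is about.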
So, let’s stay pragmatic and try to have a realistic view on the _actual_ pros and cons, and possible applications, for alternative distribution mechanisms.
Every time I read about Snappy or Flatpak, I remember Klik:
https://dot.kde.org/2005/09/16/dont-install-just-copy-klik
That blog entry is from 2005, just saying.