this post was submitted on 19 Oct 2023
47 points (100.0% liked)
Free and Open Source Software
If it's free and open source and it's also software, it can be discussed here. Subcommunity of Technology.
This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
I disagree with so much of this.
You might not care about the extra disk space, network bandwidth, and install time required by having each application package up duplicate copies of all the libraries they depend on, but I sure do. Memory use is also higher, because having separate copies of common libraries means that each copy needs to be loaded into memory separately, and that memory can't be shared across multiple processes. I also trust my distribution to be on top of security updates much more than I trust every random application developer shipping Flatpaks.
But tbh, even if you do want each application to bundle its own libraries, there has been a solution for that forever: static linking. I've never understood why we're now building systems that look like static linking but use dynamic linking to do it.
I think it's convenient for developers to be able to know or control what gets shipped to users, but I think the freedom of users to decide what they will run on their own system is much more important.
I think the idea that it's not practical for different software to share the same libraries is overblown. Most common libraries are generally very good about maintaining backwards compatibility within a major version, and different major versions can be installed side by side. I run Gentoo on my machines, and with the configurability the package manager exposes, I'd wager that no two Gentoo installations are alike, either in the versions of packages installed or in the options those packages are built with. And for a lot of software that tries to vendor its own copies of libraries, Gentoo packages often give the option of forcing them to use the system copy of the library instead. And you know what? It actually works almost all the time. If Gentoo can make it work across the massive variability of its installs, a distribution that offers less configurability should have virtually no problem.
You are right that some applications are a pain to package, and that the traditional distribution model does have some duplication of effort. But I don't think it's as bad as it's made out to be. Distributions push a lot of patches upstream, where other distributions will get that work for free. And even for things that aren't ready to go upstream, there's still a lot of sharing across distributions. My system runs musl for its C library, instead of the more common glibc. There aren't that many musl-based distributions out there, and there's some software that needs to be patched to work -- though much less than used to be the case, thanks to the work of the distributions. But it's pretty common for other musl-based distributions to look at what Alpine or Void have done when packaging software and use it as a starting point.
In fact, I'd say that the most important role distributions play is finding and fixing bugs and getting those fixes upstreamed. Different distributions will be on different versions of libraries at different times, and so will run into different bugs. You could make the argument that by using the software author's "blessed" version of each library, everybody gets a consistent experience with the software. I would argue that this means bugs will be found and fixed more slowly. For example, a rolling release distro that packages libraries on the bleeding edge can find and fix bugs that the Flatpak version would eventually hit, but far sooner.
The one thing I've heard about Flatpak/Snap/etc that sounds remotely interesting to me is the sandboxing.