Basically, the AUR works by downloading source code and running a build script that builds the app specifically for an Arch system, right? So why isn’t there an equivalent that works on most or all distros? Almost every time, compiling a Linux app is just running the commands the developer tells you to, and for the uncommon cases where there are distro-specific differences, they’re usually listed in the README. For many userspace apps, even building for a different processor architecture requires little to no change in the commands, as long as there’s a compiler for that platform.

Why isn’t there a cross-distro system that can download source code and run a build script while knowing which distro-specific commands to use, based on the developer’s instructions? Heck, compiling an app on the system you plan to run it on can even unlock a little more performance by taking advantage of compiler optimizations for that specific processor, and it doesn’t take that long on a reasonably modern computer anyway, so why isn’t there more of an incentive to do it more often?
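For reference, an AUR build script (PKGBUILD) really is just a short shell script with some metadata on top. A minimal sketch for a hypothetical autotools-based program (the package name, URL and dependencies here are made up for illustration):

```sh
# Hypothetical PKGBUILD: name, URL and checksum are placeholders.
pkgname=exampletool
pkgver=1.2.3
pkgrel=1
arch=('x86_64')
url="https://example.org/exampletool"
license=('MIT')
depends=('glibc')
makedepends=('gcc' 'make')
source=("https://example.org/exampletool-$pkgver.tar.gz")
sha256sums=('SKIP')  # a real package pins a checksum here

build() {
  cd "exampletool-$pkgver"
  ./configure --prefix=/usr   # the usual upstream build steps
  make
}

package() {
  cd "exampletool-$pkgver"
  make DESTDIR="$pkgdir" install  # install into the staging dir that makepkg packages up
}
```

makepkg then turns that staging directory into a regular pacman package, which is exactly why the same recipe can’t just be dropped onto an apt or dnf system as-is.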
I would say a tool like this sounds even better than snaps or flatpaks. Perhaps no one thought of this? Because it sounds to me like a great idea waiting to be pulled off. I really don’t wanna see any more curls or wgets in installation guides.
People have tried and failed at this for decades. It’s not a new idea, but it’s an incredibly large and complicated problem to solve, not only because package names and versions differ from distro to distro (and from distro version to distro version), but because the contents of the packages, the features they’re compiled with, and what they support differ too.
In reality it’s not possible to get this perfect, but with an absurd amount of effort a partial level of support could be achieved. Your program just wouldn’t be able to have the same guaranteed feature set across distributions.
I think it is too complex. Distributions don’t share the same directory structure, and they ship different versions of libraries, so you end up having to ship and maintain different versions of the same library. Nix and Guix are more or less a solution, I think.
I like how Guix does it. It’s nominally a source-based distro, but their build farms provide binaries, so in many cases you don’t even need to build anything from source. You can also supply transformation options such as “build from this git commit” or “build with this patch” if you need them.
You can even set up your own build farm if you so desire, or if you’re using it in a research or commercial setting.
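For a sense of what those transformation options look like on the command line (treat this as a sketch: the package name is just an example, and you’d want to check `guix build --help-transform` for the exact flags your version supports):

```sh
# Build a package from a specific upstream git commit instead of the pinned release.
guix build hello --with-commit=hello=abc1234

# Build a package with a local patch applied on top of the distro's recipe.
guix build hello --with-patch=hello=./my-fix.patch
```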
Pkgsrc does this. It’s just a huge pain.
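For anyone who hasn’t touched it: pkgsrc is NetBSD’s portable package tree, and using it on a Linux box goes roughly like this (the bootstrap flags and example package are illustrative, not a complete guide):

```sh
# Rough outline, assuming an unprivileged install under ~/pkg.
git clone https://github.com/NetBSD/pkgsrc.git
cd pkgsrc/bootstrap
./bootstrap --unprivileged        # builds bmake and the pkg tools into ~/pkg
cd ../editors/nano
~/pkg/bin/bmake install           # builds nano and all of its dependencies from source
```

The pain is that you’re now maintaining a second, parallel package hierarchy alongside your distro’s own package manager.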
I think the main problem is dependencies. If a tool is supposed to compile any code for any distro, it needs to support many different package managers. And I would say that Docker or other container software effectively solves this problem (though it still needs more work for desktop use).
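To make that concrete, the container approach sidesteps the whole package-manager question by pinning the entire build environment; something like this (image and packages chosen purely as an example):

```sh
# Build the source tree in the current directory inside a throwaway Debian container,
# so the host distro's package names and library versions don't matter.
docker run --rm -v "$PWD":/src -w /src debian:stable \
  bash -c 'apt-get update && apt-get install -y build-essential && make'
```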
The Nix package manager works on most distros and even on macOS, so most nixpkgs expressions work out of the box (except modules, which are intended to simplify system configuration and are tailored to NixOS). On the GNU side of things there is Guix, but it lacks non-libre software. (Same principles and also great.)
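As an illustration, once Nix is installed on a “foreign” distro, pulling something out of nixpkgs is roughly this (the package name is just an example, and this assumes the default nixpkgs channel with the classic CLI rather than flakes):

```sh
# Install a package from the nixpkgs channel into your user profile.
nix-env -iA nixpkgs.hello

# Everything lands under /nix/store and is symlinked into ~/.nix-profile,
# so it never touches the host distro's own package manager.
hello
```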
I have also wondered this sometimes.
Right. On top of package manager support, there would need to be more rigid standardization of package names. I have run into dependency libraries with differing names across different package managers even though they are the same code, so it’s not as simple as dropping in apt install / pacman -S / emerge <package>.
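For example, the OpenSSL development files alone go by a different name on practically every family of distro, so a cross-distro build tool ends up carrying a mapping table like this (a rough shell sketch; distro detection via /etc/os-release, package names to the best of my knowledge):

```sh
#!/bin/sh
# Install the OpenSSL development package under whatever name the local distro uses.
. /etc/os-release   # sets $ID (e.g. debian, ubuntu, fedora, arch, gentoo)

case "$ID" in
  debian|ubuntu) sudo apt install libssl-dev ;;
  fedora)        sudo dnf install openssl-devel ;;
  arch)          sudo pacman -S openssl ;;
  gentoo)        sudo emerge dev-libs/openssl ;;
  *)             echo "don't know the package name for $ID" >&2; exit 1 ;;
esac
```

And that’s one library on the current release of each distro; multiply it by every dependency and every distro version and it’s easy to see why most attempts give up and bundle everything instead.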