I’ve seen hot takes about this on Reddit where people bring up these points:
A bunch of volunteer maintainers who have neither the time, the motivation, nor the skill to security-audit these packages won’t do any good.
It’ll never work because devs want to move fast, and it’s not worth introducing any vetting that slows down the process.
To the first point: I want to look at the evidence, which clearly suggests malware and shit basically never makes it into any Linux distro. This probably has less to do with security audits and expertise and more to do with the packagers’ desire to only package useful and legit software. It acts like a general heuristic spam filter that throws out sketchy shit as part of the assessment, by culturally aware people, of whether a given piece of software is useful and trustworthy. These people can’t be tricked the way a shitty spam filter can.
To the second point: I think some bleeding-edgelords undervalue stability and ignore the amount of work chasing upstream actually causes every day. Updating too often creates more work in many cases, though updating very rarely clearly causes problems too. There’s probably a middle ground here.
Plus this whole argument is arguably kinda tangential to the actual point: there are rolling-release distros that are only days or weeks behind upstream, and they still don’t suffer that spam problem where random strangers are allowed to upload basically any crapware without human supervision.
Well, yeah, because they only provide an incredibly tiny subset of dependencies. Writing useful software with only those dependencies would probably take at least ten times as long, since you’d have to implement everything yourself.
And basically no user or customer actually cares about potential supply chain issues. They want feature-rich software for not so rich amounts of money. If you don’t implement that software for cheap, someone else will. And no one will ever hear about your oh-so-supply-chain-secured software.