• 0 Posts
  • 894 Comments
Joined 2 years ago
Cake day: June 17th, 2023

  • If the package is popular then it is very likely already packaged by your distro. You should always go there first if you care that much. If the package is not popular enough to be packaged by a distro, then how does another centralized approach help? Either it is fully curated like a distro package list, and so likely also won't contain some random small project, or it is open for anyone to upload scripts to, in which case it becomes vulnerable to malicious scripts. Worse yet, people would be able to upload scripts for projects they don't control, since the developers of said projects likely won't.

    Basically, if it is open it is not really any safer than separate dev-owned websites, and if it is curated it does not offer better package coverage than distro repos.

    Maybe the server was hacked and the script was changed?

    The same thing can happen to any system though. What happens if the servers for this service are hacked? Being a central point makes you a bigger target, and with more people able to change things (assuming you are not going to be the only one curating packages) you have a bigger attack surface. And once hacked, it can compromise far more downloads than a single package.

    Your solution does not improve security - it just shuffles it around a bit. It sounds nice on paper, but when you look at it in more detail there are a lot more things you need to consider to create a system that is actually more secure than what we currently have.



  • There is also no way to verify that the software being installed is not going to do anything bad. If you trust the software, then why not trust the installation scripts by the same authors? What would a third-party location bring to improve security?

    And generally, what you are describing is a software repo - you know, the one that comes with your distro.




  • Random programming certificates are generally worthless. The course to get one might teach you a lot and be worthwhile, but the certificate at the end is worthless. If it is free then it does not matter too much either way - it might be a good way to test yourself. But I would not rely on it to get you a job at all. For that you need other ways to prove you can do the job - typically the ability to talk about the subject and having written some real-world-like application, which a course might help you do too.



  • The only things not linked to cancer are the things that have not yet been studied. It seems like everything at some point has been linked to cancer.

    The data showed that people who ate as little as one hot dog a day when it comes to processed meats had an 11% greater risk of type 2 diabetes and a 7% increased risk of colorectal cancer than those who didn’t eat any. And drinking the equivalent of about a 12-ounce soda per day was associated with an 8% increase in type 2 diabetes risk and a 2% increased risk of ischemic heart disease.

    Sounds like a correlation… someone who eats one hot dog and drinks one soda per day is probably doing a lot of unhealthy things.

    It’s also important to note that the studies included in the analysis were observational, meaning that the data can only show an association between eating habits and disease –– not prove that what people ate caused the disease.

    Yup, that is what it is: a correlation. So overall, IMO, it is not really worth the effort involved. Cutting out processed meats entirely is not likely to make a big difference on its own - what matters is your overall diet and amount of exercise/lifestyle. I would highly suspect that even if you did eat one hot dog per day, but had an otherwise perfect diet for the rest of the day, got plenty of exercise, good sleep and all the other things we know are good for you, then these negative effects would likely become negligible. But who the hell is going to do that? That's the problem with these observational studies - you cannot really tease the effect of one thing out of a whole bad lifestyle.

    I hate headlines like this as they make it sound like you can just do this one simple thing and get massive beneficial effects. You cannot. You need to change a whole bunch of things to see the kinds of risk reduction they always talk about. Instead they always make it sound like if you have even one hot dog YOU ARE GOING TO DIE.
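
    To make the confounding point concrete, here is a toy Python sketch (every rate in it is a made-up assumption, not a number from the study): hot dogs have no effect at all in this model, yet a naive comparison still shows hot-dog eaters getting sick more often, because the same hidden lifestyle factor drives both.

    ```python
    import random

    # Toy model: a hidden "unhealthy lifestyle" factor drives both hot-dog eating
    # and disease. Hot dogs themselves have zero effect in this simulation.
    random.seed(0)

    def simulate(n=100_000):
        eaters_sick = eaters = others_sick = others = 0
        for _ in range(n):
            unhealthy = random.random() < 0.5                       # hidden confounder
            eats_hotdogs = random.random() < (0.6 if unhealthy else 0.2)
            sick = random.random() < (0.15 if unhealthy else 0.05)  # risk depends only on lifestyle
            if eats_hotdogs:
                eaters += 1
                eaters_sick += sick
            else:
                others += 1
                others_sick += sick
        return eaters_sick / eaters, others_sick / others

    risk_eaters, risk_others = simulate()
    print(f"risk among hot-dog eaters: {risk_eaters:.3f}")
    print(f"risk among non-eaters:     {risk_others:.3f}")  # lower, despite no causal link
    ```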





  • I never said it had to be a text file. There are many binary serialization formats that could be used. But in a lot of situations the overhead you save is not worth the debugging effort of working with binary data. For something like this, which is likely not going to be more than a GB or so (probably much less), it really does not matter that much whether you use a binary or text format. This is an export format that will likely just have a single batch-processing layer on top, and that type of thing is generally easiest for most people to work with in a plain text format. If you really need efficient querying of the data, it is trivial and quick to load it into a DB of your choice rather than being stuck with sqlite.
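
    As a rough sketch of the "load it into a DB" point: assuming the export were something like a line-delimited text file of JSON records (the events.jsonl name and its ts/name/value fields are made up for the example), pulling it into a database for ad-hoc querying is only a few lines of Python. SQLite is used here just because it ships with Python; any DB would work the same way.

    ```python
    import json
    import sqlite3

    # Load a hypothetical line-delimited text export into SQLite for querying.
    conn = sqlite3.connect("events.db")
    conn.execute("CREATE TABLE IF NOT EXISTS events (ts TEXT, name TEXT, value REAL)")

    with open("events.jsonl") as f:
        rows = (
            (rec.get("ts"), rec.get("name"), rec.get("value"))
            for rec in (json.loads(line) for line in f if line.strip())
        )
        conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

    # From here the querying is as efficient as you need it to be.
    for row in conn.execute("SELECT name, COUNT(*) FROM events GROUP BY name"):
        print(row)
    ```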


  • export tracking data to analyze later on

    That is essentially log data, or at least equivalent to it. Log data does not have to be human readable; it is just a series of events that happen over time. Most log data, even what you would think of as traditional messages from a program, is not parsed by humans manually but analyzed by code later on. It is really not that hard or slow to process log data line by line. I have done this with TBs of data before, which does require a lot more effort. A simple file like this would take seconds to process at most, even if you were not very efficient about it. I also never said it needed to be stored as text - just a simple file is enough, no need for a full database. That file could be binary if you really need it to be, but text serialization would also be good enough. Most of the web world is processed via text serialization.

    The biggest problem with YAML like in the OP is the need to decode the whole file at once, since it is a single list. Line-by-line processing would be a lot easier to work with. But even then, if it is only a few hundred MB, loading it all into memory once and analyzing it there would not take long at all - it just does not scale very well.
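
    For what it's worth, here is a minimal Python sketch of that line-by-line approach, assuming the export were written as one JSON object per line instead of a single YAML list (the "window" and "elapsed" field names are hypothetical):

    ```python
    import json

    # Stream the file one event at a time - memory use stays constant no matter
    # how large the export grows.
    def total_time_per_window(path):
        totals = {}
        with open(path) as f:
            for line in f:
                if not line.strip():
                    continue
                event = json.loads(line)
                totals[event["window"]] = totals.get(event["window"], 0) + event["elapsed"]
        return totals

    # A single YAML list, by contrast, has to be parsed in full (e.g. with
    # yaml.safe_load) before you can even look at the first element.
    print(total_time_per_window("tracking.jsonl"))
    ```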



  • There is in this case, which is why Linus did accept the patch in the end. Previous cases less so though, which is why Linus is so pissed at this one.

    The reason for this new feature is to help fix data loss on users' systems - which is a fine line between a bug and a new feature really. There is precedent for this type of thing in RC releases from other filesystems as well. So the issue in this instance is a lot less black and white.

    That doesn’t excuse previous behaviour though.



  • The attack is known as the evil maid attack. It requires repeated access to the device. Basically, if you can compromise the bootloader you can inject a keylogger to sniff out the encryption key the next time someone unlocks the device. This is what Secure Boot is meant to help protect against (though I believe that has been compromised as well).

    But realistically very few people need to worry about that type of attack. Encryption is good enough for most people. And if you don't have your system encrypted then it does not matter what bootloader you use, as anyone can boot a live USB and read your data.