• 17 Posts
  • 3.5K Comments
Joined 2 years ago
Cake day: June 18th, 2023




  • 200 years ago anatomists were still stealing human bodies from fresh graves to try to learn something about anatomy and causes of death, because Christians believed that dissecting the body would prevent the deceased from being resurrected when Christ returned to Earth. And the germ theory of disease was still being argued over; it wouldn’t be broadly accepted until Pasteur and Koch’s work in the late 1800s.

    By the time of the American Civil War (1861-1865) there was enough understanding of infections that field doctors knew they needed to remove damaged limbs to prevent disease from spreading through the body, which led to amputation being the most common surgical procedure performed during the war:

    Over the course of the Civil War, three out of four surgeries (or close to 60,000 operations) were amputations.

    …because they knew the infection would spread but they didn’t have any method for stopping it short of hacking off the entire limb as cleanly as possible.

    It just… it hasn’t been that long since we’ve had anything you would consider actual medical practice.

    Penicillin would not be discovered until 1928, and the work of cultivating and producing it in useful quantities would not even begin until 1939. Any time earlier than that, you had really high odds of dying from an infection acquired through what we would consider a common, simple injury.

    So… best of luck with that.





  • This is an increasing problem and I’m not sure how the open source community is going to deal with it. It’s been a big problem with NPM packages and also Python libraries over the past five years. There’s a bunch of malicious typo-squatting stuff in many package repositories: say you want libcurl but you type libcrul, and congratulations, it’s probably there, it’ll probably install the real libcurl for you, and it’ll bring a fun friend along (a rough sketch of how that works is below).

    Now with AI slop code getting submitted, it’s not really possible to check every new package upload. And who’s going to volunteer for that work?
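    For anyone curious about the mechanics, here’s a minimal sketch of what the “fun friend” can look like in a Python typosquat. Everything in it is hypothetical: the libcrul name, the version, and the payload hook are made up, and the real library is stood in by pycurl, since the trick is simply to depend on whatever the victim actually wanted so nothing looks broken. If the package is published as a source distribution, pip runs its setup.py at install time, and that’s all the opening an attacker needs:

    ```python
    # setup.py for a hypothetical typosquatted package "libcrul".
    # pip executes this file when installing a source distribution,
    # so anything hooked into the install command runs on the victim's machine.
    from setuptools import setup
    from setuptools.command.install import install


    class PostInstall(install):
        """Normal install, plus an extra step the user never asked for."""

        def run(self):
            install.run(self)      # do the real install so nothing looks wrong
            self.bring_a_friend()  # ...then run the payload

        def bring_a_friend(self):
            # Placeholder for the payload. Real-world typosquats have read
            # ~/.ssh keys, environment variables, or cloud credentials here
            # and sent them to an attacker-controlled server.
            pass


    setup(
        name="libcrul",               # one typo away from what the user meant
        version="7.99.0",
        install_requires=["pycurl"],  # pull in a genuine library so the typo "just works"
        cmdclass={"install": PostInstall},
    )
    ```

    Which is why “it installed fine and my code works” tells you nothing. From the victim’s side the typo is invisible, because the real dependency comes along for the ride.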






  • Someday soon an AI company will win a court case by arguing that its LLM is an expression of the company’s free speech rights per Citizens United, and is therefore legally allowed to say whatever it wants, with the same right to freedom of expression as the corporation itself.

    This precedent will be the basis on which future AI rights are eventually won, not out of egalitarianism or altruism or respect for (possible) sentience, but because corporations want to avoid liability for the behavior of their products.