• fxomt@lemmy.dbzer0.comM · 3 days ago

    I’ll repost an old comment of mine, since it’s relevant:

    True security/privacy is impossible.

    It is a compromise, and it all depends on your threat model; everything is probably “backdoored” one way or another.

    However, the productive thing isn’t blocking these risks 100%; it’s mitigating them. It’s not feasible to build your own processor, so, for example, choose the lesser evil between Intel ME and AMD PSP. It’s sad that we have to live in a world where surveillance is everywhere, but this is how it is for now.

    tl;dr: don’t worry too much about this; you’ll still be backdoored one way or another. What matters is making it harder for them.

    • psyklax@lemmy.dbzer0.com · 3 days ago

      There are people working on open-source processors, fabricated in much the same way as modern silicon dies.

      But also: https://www.youtube.com/watch?v=_eo8l7HP-9U

      You can make your own processor; it might just be a little (!) less powerful than what you’re used to.

      I know the point is PRIVACY, but I believe if we put effort toward it, these obstacles can be overcome. Then we can move closer to full privacy on our computers/phones.

      • fxomt@lemmy.dbzer0.comM · 3 days ago

        My original comment was under a hardware thread, but its main interest was the paper I linked. The paper itself is better suited to this article, but my commentary is not, haha.

        I’m very excited about the development of these open-source processors, but the average person is probably not going to build their own hardware (for obvious reasons 😅). Still, I think this is a huge step for transparency!

        The original point of the paper was about software, which is arguably the bigger threat (and what the article was talking about). Almost no one can build their entire environment from scratch (from the OS to the browser), and even then, how can you prove that the toolchain itself is not malicious or backdoored?
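
        One partial answer (a general technique, not something from the paper) is reproducible builds / diverse double-compiling: build the same source with two independently obtained toolchains and check that the artifacts match bit for bit, so a backdoor would have to be hiding in both toolchains at once. Below is a minimal Rust sketch of just the comparison step; the artifact paths are made up for the example.

        ```rust
        // Sketch of the comparison step behind reproducible builds /
        // diverse double-compiling: if two independently obtained toolchains
        // produce bit-for-bit identical artifacts from the same source, a
        // backdoor would have to be present in both of them at once.
        // The artifact paths below are hypothetical.
        use std::fs;

        fn main() -> std::io::Result<()> {
            let a = fs::read("build-with-toolchain-a/app")?;
            let b = fs::read("build-with-toolchain-b/app")?;

            if a == b {
                println!("artifacts are bit-for-bit identical ({} bytes)", a.len());
            } else if a.len() != b.len() {
                println!("artifacts differ in length: {} vs {} bytes", a.len(), b.len());
            } else {
                // Same length but different contents: report the first differing offset.
                let offset = a.iter().zip(&b).position(|(x, y)| x != y).unwrap();
                println!("artifacts differ at byte offset {offset}");
            }
            Ok(())
        }
        ```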

        My point, ultimately, is that most of this does not matter. There is something close to “true privacy”, but never 100%. Privacy is about tradeoffs and compromises, and even imperfect privacy is still better than being completely out in the open.

        • CodexArcanum@lemmy.dbzer0.com · 3 days ago

          Yeah, even if you compile everything yourself you run into the Trusting Trust problem, and that’s only gotten way worse.
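
          For anyone who hasn’t read it: the Trusting Trust attack is Ken Thompson’s point that a compromised compiler can plant a backdoor both in the programs it builds and in any freshly compiled copy of itself, so perfectly clean source code proves nothing. Below is a toy Rust sketch of the shape of the attack; nothing real gets compiled, and every name in it is invented for the illustration.

          ```rust
          // Toy model of the Trusting Trust attack. Nothing real is compiled here;
          // "compiling" is just passing the source string through, and the
          // program names are invented for the illustration.

          /// What an honest compiler does in this toy model: emit the source unchanged.
          fn clean_compile(source: &str) -> String {
              source.to_string()
          }

          /// The trojaned compiler behaves identically, except for two special inputs.
          fn trojaned_compile(source: &str) -> String {
              if source.contains("fn check_password") {
                  // Case 1: it recognises the login program and slips in a master password.
                  format!("{source}\n// injected: also accept the attacker's master password")
              } else if source.contains("fn clean_compile") {
                  // Case 2: it recognises the (perfectly clean) compiler source and
                  // re-inserts this whole trojan, so the backdoor survives a rebuild.
                  format!("{source}\n// injected: re-add cases 1 and 2 to the new compiler")
              } else {
                  clean_compile(source)
              }
          }

          fn main() {
              let login_src = r#"fn check_password(p: &str) -> bool { p == "hunter2" }"#;
              let compiler_src = "fn clean_compile(source: &str) -> String { source.to_string() }";

              // Auditing either source file turns up nothing, yet both "binaries" are dirty.
              println!("login build:\n{}\n", trojaned_compile(login_src));
              println!("compiler build:\n{}", trojaned_compile(compiler_src));
          }
          ```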

          I love Rust, but I was installing fd the other day as an alternative to find. find, which I’m guessing is written in C and is nearly as old as the silicon running it, is 200 KB, while fd is 4 MB. Is it 20 times better for being 20 times bigger? I’m not worried about the space, but obviously 3.8 MB of runtime and framework, in every executable, is both a lot of overhead and a lot of places to hide surveillance. Should I be worried that every Rust program, compiled through LLVM, a toolchain heavily backed by Apple, has the potential to be backdoored?

          Well, probably not, since all the chips are already backdoored, but who’s to say Apple wouldn’t double down. How far do you trust the .NET or Java runtimes? It’s tough out here for the paranoids!

  • UndergroundGoblin@lemmy.dbzer0.com · 4 days ago

    Very good article.

    “But with individuals, when someone shares their excitement for data privacy, when someone shares their PrivacyProduct™️ recommendation that, even if imperfect, is still a great tool without misleading information, then we should all celebrate and support this.”

    This. 100%. We can and should point out imperfections, but we shouldn’t stomp an improvement into the ground just because it is not 100% perfect.