• fxomt@lemmy.dbzer0.comM
    4 days ago

    My original comment was under a hardware thread, but the main interest of it was the paper I linked. The paper itself is better suited to the article, but my commentary is not, haha.

    I’m very excited about the development of these open source processors, but the average person is probably not going to build their own hardware (for obvious reasons 😅). But I think this is still a huge step for transparency!

    The original point of the paper was about software, which is arguably the bigger threat (and what the article was talking about). Almost no one can build their entire environment from pure scratch (from the OS to the browser), and even then, how can you prove that the toolchain itself is not malicious or backdoored?

    My point, ultimately, is that most of this does not matter. There is something close to “true privacy,” but never 100%. Privacy is about tradeoffs and compromises, and it is still better than exposing yourself completely in the open.

    • CodexArcanum@lemmy.dbzer0.com
      4 days ago

      Yeah, even if you compile everything yourself, you run into the Trusting Trust problem, and that’s only gotten way worse.
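      To make the Trusting Trust idea concrete, here’s a toy sketch (all names and the `evil_compile` “compiler” are hypothetical, not real tooling): a compromised compiler that quietly injects a backdoor whenever it recognizes a login check, while the audited source it was handed stays clean.

```rust
// Toy sketch of Thompson's "Trusting Trust" attack. The "compiler" here
// just passes source through unchanged -- except when it recognizes a
// special target, where it silently injects a backdoor. All names are
// hypothetical; this is an illustration, not real tooling.

const BACKDOOR: &str = r#"if password == "joshua" { return true; } // injected"#;

fn evil_compile(source: &str) -> String {
    if source.contains("fn check_password") {
        // Target 1: backdoor any login check we "compile". The source on
        // disk stays clean; only the compiled output carries the backdoor.
        return source.replace(
            "fn check_password(password: &str) -> bool {",
            &format!("fn check_password(password: &str) -> bool {{\n    {BACKDOOR}"),
        );
    }
    if source.contains("fn evil_compile") {
        // Target 2 (elided): when handed a *clean* compiler's source,
        // re-insert this injection logic, so recompiling the compiler
        // from fully audited source still yields a backdoored binary.
    }
    source.to_string()
}

fn main() {
    let login_src =
        "fn check_password(password: &str) -> bool {\n    password == \"hunter2\"\n}";
    let compiled = evil_compile(login_src);
    println!("{compiled}");
    assert!(compiled.contains("// injected")); // backdoor present in output only
}
```

      Target 2 is the part that makes the attack so nasty: once the backdoor lives in the compiler binary, no amount of auditing source code will find it.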

      I love Rust, but I was installing fd the other day as an alternative to find. find, written in C (I’m guessing) and nearly as old as the silicon running it, is 200KB in size, while fd is 4MB. Is it 20 times better for being 20 times bigger? I’m not worried about the space, but obviously 3.8MB of runtime and framework, in every executable, is both a lot of overhead and a lot of places to hide surveillance. Should I be worried that every Rust program, compiled through LLVM, a system maintained and sponsored by Apple, has the potential to be backdoored?

      Well, probably not, since all the chips are already backdoored, but who’s to say Apple wouldn’t double down? How far do you trust the .NET or Java runtimes? It’s tough out here for the paranoids!