• elmtonic@lemmy.world
    1 year ago

    The cool thing to note here is how badly Yud misunderstands what a normal person means when they say they have “100% certainty” in something. We’re not fucking infinitely precise Bayesian machines: 100% means exactly the same thing as 99.99%, which means exactly the same thing as “really really really sure.” A conversation between the two might go like this:

    Unwashed sheeple: Yeah, 53 is prime. 100% sure of that.

    Ellie Bayes-er: (grinning) Can you really claim to be 100% sure? Do not make the mistake of confusing the map with the territory, [5000 words redacted]

    Unwashed sheeple: Whatever you say, I’m 99% sure.

    Eddielazer remains seated, triumphant in his belief (epistemic status: 98.403% certainty) that he has added something useful to the conversation. The sheeple walks away, having changed exactly nothing about his opinion.
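    For what it’s worth, the sheeple’s claim is trivially checkable; a minimal trial-division sketch (purely illustrative, not anyone’s actual epistemology):

    ```python
    # Minimal trial-division primality check, just to settle the dialogue above.
    def is_prime(n: int) -> bool:
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    print(is_prime(53))  # True, with all the certainty arithmetic can offer
    ```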

    • maol@awful.systems
      1 year ago

      Mr Yudkowsky is supposedly able to understand advanced maths but doesn’t know what rounding is. I think we did rounding in 3rd or 4th grade…
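      For the record, the rounding in question fits in one line of Python; anything past a couple of nines of confidence rounds to the colloquial “100%” (numbers below are illustrative only):

      ```python
      # Colloquial certainty, rounded to the nearest whole percent.
      for p in (0.99, 0.999, 0.9999, 0.98403):
          print(f"{p:.5f} -> {round(p * 100)}%")
      # 0.99000 -> 99%
      # 0.99900 -> 100%
      # 0.99990 -> 100%
      # 0.98403 -> 98%
      ```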

      • bitofhope@awful.systems
        1 year ago

        You might be comfortable using a single significant digit for any probabilities you pull out of your ass, but Yud’s methods are free of experimental error. He practices Aristotelian science, deriving everything from pure reason, which brought you bangers like “men have more teeth than women”. In Yud’s case, most of his ideas are unfalsifiable to begin with, so why not have seventeen nines’ worth of certainty in them? Literally can’t be false! Not even AWS would promise these kinds of SLAs!
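        For scale, a back-of-the-envelope sketch of how much annual downtime a given number of nines allows (typical cloud SLAs sit around three or four nines; seventeen is, let’s say, ambitious):

        ```python
        # Allowed downtime per year for N nines of availability (illustrative only).
        SECONDS_PER_YEAR = 365.25 * 24 * 3600

        def downtime_per_year(nines: int) -> float:
            unavailability = 10.0 ** (-nines)  # e.g. 4 nines -> 1e-4
            return SECONDS_PER_YEAR * unavailability

        for n in (3, 4, 17):
            print(f"{n:2d} nines: {downtime_per_year(n):.3e} s/year")
        # 3 nines: ~8.8 hours; 4 nines: ~53 minutes; 17 nines: ~0.3 nanoseconds
        ```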