• diz@awful.systems
    21 hours ago

    It’s curious how, if ChatGPT were a person saying exactly the same words, he would’ve been charged with criminal conspiracy, or even shot, as his human co-conspirator in Florida was.

    And had it been a foreign human in the Middle East radicalizing random people, he would’ve gotten a drone strike.

    “AI” - and the companies building them - enjoy the kind of universal legal immunity that is never granted to humans. That needs to end.

      • diz@awful.systems
        17 hours ago

        In theory, at least, the purpose of criminal justice is the prevention of crime. And if arresting a person would serve that purpose, so would a court-ordered shutdown of a chatbot.

        There’s no 1st amendment right to enter into criminal conspiracies to kill people. Not even if “people” is Sam Altman.

        • atrielienz@lemmy.world
          16 hours ago

          In practice the justice system is actually reactive. Either the commission of a crime, or the recognition that a crime is possible, prompts the creation of laws prohibiting it and marking it as criminal; then law enforcement and the justice system as a whole investigate instances where that crime is suspected to have been committed, and litigation ensues.

          Prevention may be the intent, but in actuality we know this doesn’t prevent crime. Anywhere outside the jurisdiction of a justice system that puts such “safeguards” in place, people will abuse that lack of jurisdiction. And people inside it with enough money or status, or both, will continue to abuse the system for their personal gain. Which is pretty much what’s happening now, except that they’ve realized they can try to preempt litigation against them by buying the litigants, or part of the regulatory/judicial system.

          • diz@awful.systems
            15 hours ago

            If it were a basement dweller with a chatbot that could be mistaken for a criminal co-conspirator, he would’ve been arrested and his computer seized as evidence, and then it would be a crapshoot whether he could even convince a jury that it was an accident. Especially if he was getting paid for his chatbot. Now, I’m not saying that this is right, just stating how it is for normal human beings.

            It may not be explicitly illegal for a computer to do something, but you are liable for what your shit does. You can’t just make a robot lawnmower and run over a neighbor’s kid. If you are using random numbers to steer your lawnmower… yeah.

            But because it’s OpenAI, with its $300 billion “valuation”, absolutely nothing can happen whatsoever.