• Contentedness@lemmy.nz · 1 month ago

    ChatGPT didn’t nearly destroy her wedding; her lousy wedding planner did. Also, what’s she got against capital letters?

    • bitofhope@awful.systems · 1 month ago

      Yeah, yeah, guns don’t kill people, bullet impacts kill people. Dishonesty and incompetence are nothing new, but you may note that the wedding planner’s unfounded confidence in ChatGPT exacerbated the problem in a novel way. Why did the planner trust the bogus information about Vegas wedding officiants? Is someone maybe presenting these LLM bots as an appropriate tool for looking up such information?

      • Pandemanium@lemm.ee · 1 month ago

        I think we should require professionals to disclose whether or not they use AI.

        Imagine you’re an author and you pay an editor $3000 and all they do is run your manuscript through ChatGPT. One, they didn’t provide any value, because you could have done the same thing for free; and two, if they didn’t disclose the use of AI, you wouldn’t even know your novel had been fed into one and might be used for training.

        • bitofhope@awful.systems · 1 month ago

          I think we should require professionals not to use the thing currently termed AI.

          Or, if you think it’s unreasonable to ask them not to contribute to a frivolous and destructive fad, or you don’t think the environmental or social impacts are bad enough to warrant a ban like this, at least maybe we should require professionals not to use LLMs for technical information.

      • ugo@feddit.it · 1 month ago

        But the article author wasn’t interfacing with ChatGPT; she was interfacing with a human paid to help with the things she didn’t know. The wedding planner was the supposed expert in this interaction, but instead simply sent back regurgitated ChatGPT slop.

        Is this the fault of the wedding planner? Yes. Is it the fault of ChatGPT? Also yes.

        • conciselyverbose@sh.itjust.works · 1 month ago

          Scams are LLMs’ best use case.

          They’re not capable of actual intelligence or of producing anything that would remotely mislead a subject matter expert. You’re not going to convince a skilled software developer that your LLM slop is competent code.

          But they’re damn good at looking the part, good enough to convince people who don’t know the subject that they’re the real deal.

  • blakestacey@awful.systems · 1 month ago

    “Comment whose upvotes all come from programming dot justworks dot dev dot infosec dot works” sure has become a genre of comment.

  • o7___o7@awful.systems · 1 month ago (edited)

    As a fellow Interesting Wedding Haver, I have to give all the credit in the world to the author for handling this with grace instead of, say, becoming a terrorist. I would have been proud to own the “Tracy did nothing wrong” t-shirt.

  • DannyBoy@sh.itjust.works · 1 month ago

    I can safely assume before reading the article that ChatGPT didn’t ruin the wedding, but rather that somebody using ChatGPT ruined the wedding.

      • self@awful.systems · 1 month ago

        do you think they ever got round to reading the article, or were they spent after coming up with “hmmmm I bet chatgpt didn’t somehow prompt itself” as if that were a mystery that needed solving

    • ebu@awful.systems · 1 month ago (edited)

      “blame the person, not the tools” doesn’t work when the tool’s marketing team is explicitly touting said tool as a panacea for all problems. on the micro scale, sure, the wedding planner is at fault, but if you zoom out even a tiny bit it’s pretty obvious what enabled them to fuck up for as long as they did

        • self@awful.systems · 1 month ago

          almost all of your posts are exactly this worthless and exhausting and that’s fucking incredible

          • LainTrain@lemmy.dbzer0.com · 1 month ago (edited)

            I get the feeling you’re exactly the kind of person who shouldn’t have a proompt, much less a hammer

            • self@awful.systems · 1 month ago

              no absolutely, I shouldn’t ever “have a proompt”, whatever the fuck that means

              the promptfondlers really aren’t alright now that public opinion’s against the horseshit tech they love

              • froztbyte@awful.systems · 1 month ago

                istg these people seem to roll “b-b-b-but <saltman|musk|sundar|…> gifted this technology to me personally, how could I possibly look this gift horse in the mouth” on the inside of their heads