There are signs of three distinct interpretations in the result:
On topic, the concept of cleaning a wild bird you are trying to save
Preparing a store-bought turkey (removing a label)
Preparing a wild bird that is caught
lol, definitely missed some important context.
I guess it thought OOP meant “clean” as in how do you dress the bird before you cook it. (As in: “clean a fish” means to filet a fish and prep it for cooking.)
Nobody ask it how to dress a baby
First you snap a shoulder socket…
Swiftly
Vegetables, garlic, basted with pan drippings.
Side of potatoes.
Even then those are bad cleaning instructions…
But first it said they are usually clean. So that can’t be the context. If there was a context. But there is no context, because AI is fucking stupid, and all these C-suite assholes pushing it like their last bowel movement will be eating crow off of their golden parakeet about two years from now, when all this nonsense finally goes away and the next shiny thing comes along.
It’s actually a pretty good illustration of how AI assembles “information-shaped text”: how smooth it can look and yet how dumb it can be about it. Unfortunately, advocates will just say “I can’t get this specific thing wrong when I ask it or another LLM, so there’s no problem,” even as it gets other stuff wrong. It’s weird: you’d better be able to second-guess the result, meaning you can never be confident in an answer you didn’t already know. But when that’s the case, it’s not that great for factual stuff.
For “doesn’t matter” content, it may do fine (generated alternatives to stock photography, silly meme pictures, random prattle from background NPCs in a game), but for “stuff that matters”, Generative AI is frequently more of a headache than a help.
A: Just rescued a bird
B: Oh, can I see it?
A: Sorry, already ate it yesterday
It literally doesn’t matter. When the most-used search engine on the planet automatically suggests these specific actions without you even clicking on a specific site? We’re fucked. We had the chance to break up monopolies like Google, Microsoft and Facebook. We didn’t take it…
No, WE never had this option.
What the fuck are you talking about? Stop apologizing for AI, you clown.
Do you remove the “label” in step one of cleaning a fish? Please, tell us all where that is.
Wut?