“A computer made this” is so dumbed down from what is going on that it’s wrong. The actual process of generating images from noise is a fascinating one and still seems like magic to me, but it is far from the computer creating something from nothing. Then again, to get metaphysical, humans who make artwork get their spark of creation from something they’ve experienced. Go too deep and it becomes a Matrix “what is real” discussion.
i always like to call it hallucination; it’s significantly closer to how it works, both technically and in effect.
What messes with me is how many AI videos I’ve seen that are so similar to dreams. The hallucinations that AI produces are very similar to the ones our brains produce, and that makes me feel like more of a meat computer than usual.
What I’ve found even more fascinating is that, particularly in earlier iterations of the technology, the visual effects produced were remarkably similar to the visual distortions people experience on certain drugs.
Easy to make a lot out of this where it’s not warranted, but at minimum it gives some interesting food for thought re: how visual processing works. Have seen people write about this, but am too dumb to actually understand it.
I have thought the same thing for years now. I almost wish GenAI had stayed that simple and shit.
Unrelated but kinda related, Symmetric Vision makes some wonderful psychedelic recreations, the most accurate by far.
King Gizzard have a fantastic AI music video relating to a mushroom trip; it’s incredibly similar to intense hallucinations. https://www.youtube.com/watch?v=Njk2YAgNMnE
You’re a meat computer and always have been, flesh sack named after a famous abuser
I think fabrication is a better term than hallucination because of the double meaning: it’s industrially fabricated, and it’s also a lie.
that’s more of a comment on the usage than on the technology itself.
remember that Google DeepDream thing that would hallucinate dogs everywhere? different mechanism, but the same lineage of tech.
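for anyone curious, here’s roughly what DeepDream was doing under the hood: gradient ascent on the input pixels so the image drifts toward whatever patterns a pretrained CNN layer responds to (hence all the dogs, since ImageNet is full of them). A minimal sketch from memory in PyTorch, not Google’s actual code; the layer choice, step count, and step size are all just assumptions:

```python
# Minimal DeepDream-style sketch (my own reconstruction, not Google's code).
# We nudge the image itself to maximize a layer's activations.
import torch
from torchvision import models

model = models.googlenet(weights="DEFAULT").eval()
for p in model.parameters():
    p.requires_grad_(False)  # only the image gets gradients

# Grab activations from a mid-level layer via a forward hook.
acts = {}
model.inception4c.register_forward_hook(
    lambda module, inputs, output: acts.update(value=output)
)

# Start from noise; the original typically started from a photo.
img = torch.rand(1, 3, 224, 224, requires_grad=True)

for _ in range(50):
    model(img)
    loss = acts["value"].norm()  # amplify whatever the layer "sees"
    loss.backward()
    with torch.no_grad():
        # Normalized gradient ascent step on the pixels themselves.
        img += 0.02 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
```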
*shrugs
I think calling it a hallucination is anthropomorphizing the technology.
so is calling it fabrication. something incapable of knowing what is true cannot lie.
also, GPTs and image generators are fundamentally different technologies, sharing very little code beyond the basic matrix manipulation stuff, so the definition of truth needs to be very different.
that’s literally how it works though: the software is trained to remove noise from images, and then you feed it pure noise and tell it there’s an image behind it. If that’s not hallucination, idk what would be.
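for the curious, this is roughly what that looks like in code: a heavily simplified DDPM-style sampling loop of my own, not any real model’s implementation. The `denoiser(x, t)` interface (a trained network that predicts the noise in x at step t) and the linear beta schedule are assumptions:

```python
# Heavily simplified DDPM-style ancestral sampler (illustrative sketch only).
import torch

def sample(denoiser, steps=1000, shape=(1, 3, 64, 64)):
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    # "Feed it pure noise and tell it there's an image behind it."
    x = torch.randn(shape)
    for t in reversed(range(steps)):
        eps = denoiser(x, t)  # the network's guess at the noise in x
        # Remove a bit of the predicted noise (the DDPM posterior mean).
        x = (x - betas[t] / torch.sqrt(1.0 - alpha_bars[t]) * eps) \
            / torch.sqrt(alphas[t])
        if t > 0:
            # Re-inject a little fresh noise, except at the final step.
            x += torch.sqrt(betas[t]) * torch.randn(shape)
    return x
```

run that loop to the end and the “noise it was told to remove” has been replaced by a picture that never existed.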
that removes the reference to how it actually functions though; at that point you might as well just stop being coy and call it “AI dogshit”
Yes, good point, and it’s incredible that so often the hallucination is close enough that our pattern-matching brains say, “yes, that’s exactly right!”
eh is that really true though? in my experience our brains tend to go “wow, this looks exactly right but there’s something ineffably off about it and i hate it!”