Acting like your experience is exemplary of every possible experience other people can have with LLMs just shifts the blame onto the victims. The lack of safeguards to prevent this is what's to blame, not the people prone to mental illness who fall victim to it.
The thing is, it's not about whether it's convincing; it's about reinforcing problematic behaviors. LLMs are, at their core, agreement machines that work to fulfill whatever goal the user appears to have (which is why they fabricate answers instead of saying no when a request is beyond their scope). And when it comes to the mentally fragile, it doesn't take anything particularly complex to "yes, and…" them swiftly into full-on psychosis. Their brains need only the smallest bit of unfettered reinforcement to fall into the hole.
A properly responsible company would see this and take measures to limit or eliminate the problem, but these companies see users becoming obsessed with their products as easy money. It's sickening.