I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
-
That would hurt my feelings so much. And it's very likely I think he might not realize how hurtful it is or why.
"I don't want to bother you with my little stuff." That is how he could see it.
-
@anathexyz Ah, ok, thank you! Now I understand, yes, I read about it. @futurebird
-
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
Why do folks need to be told to be weary of a sycophantic nonsense from a machine?
Would the same folk be accepting sycophantic nonsense from a human? -
Why do folks need to be told to be weary of a sycophantic nonsense from a machine?
Would the same folk be accepting sycophantic nonsense from a human?I don't think it's the "sycophantic nonsense" that is the real issue. It's just the means by which people are convinced they have "someone who is there for me" or "I've asked someone if my idea is good" when they have no one. There is no person. They are still alone.
Even if the LLM were taciturn and critical if it becomes a substitution for human contact *that* is the problem. Because your acerbic friend will come to your house when you are sick to help you and the LLM cannot.
-
-
The comparison with a cult is a good one. But just like with cults, the issues don't stop once you see through it. There has been a damage done.
And there seems to be a new quality to put all that emotional energy and bonding into something that isn't there. Getting fooled by people is one thing. Getting manipulated by a non-entity seems to shake on a different level.
That person went down the chatbot route when their partner was out of country for a lengthy time. Just easily talking with "someone". The got out of that hole by themselves. They analysed the chat logs and clearly see the manipulative strategy of the bot. They got it.
But they are still miserable. They are an extrovert, seeing friends, doing classes, having a pet.
Yet even crowded spaces or just the wrong lighting now sometimes gets to them.
-
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
@futurebird Do that then never use it again then... doubling every request is even worse from an energy and ecologically pov.

-
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
@futurebird
A very expensive privacy and environment destroying version of 1960s Eliza. When it does return something useful and accurate, it's copied from a book or the web, usually without permission.It's a plagiarism machine that also mixes in plausible junk, by design. So-called "hallucination" is marketing spin. The plausible junk is by design & is not really like hallucinations.
-
I don't think it's the "sycophantic nonsense" that is the real issue. It's just the means by which people are convinced they have "someone who is there for me" or "I've asked someone if my idea is good" when they have no one. There is no person. They are still alone.
Even if the LLM were taciturn and critical if it becomes a substitution for human contact *that* is the problem. Because your acerbic friend will come to your house when you are sick to help you and the LLM cannot.
Few months back Open AI had a "Monday" model, which was the acerbic "friend".
Never used it, I have enough real asshole friends.Not diminishing the real threat of AI psychosis.
Lonely, vulnerable people seek validation in strange places. -
R AodeRelay shared this topic
-
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
@futurebird I'm well aware that AI can make stuff up as it goes along whenever it doesn't know what is being asked or if it doesn't know that answer.
There was never anything wrong with search engines and skimming for the answers. That's what the internet was made for.
-
I don't think it's the "sycophantic nonsense" that is the real issue. It's just the means by which people are convinced they have "someone who is there for me" or "I've asked someone if my idea is good" when they have no one. There is no person. They are still alone.
Even if the LLM were taciturn and critical if it becomes a substitution for human contact *that* is the problem. Because your acerbic friend will come to your house when you are sick to help you and the LLM cannot.
@futurebird @n_dimension I spent half my morning re-reading _The Two Towers_, and so I keep thinking of LLMs in terms of the melodious voice of Saruman.
-
@futurebird @n_dimension I spent half my morning re-reading _The Two Towers_, and so I keep thinking of LLMs in terms of the melodious voice of Saruman.
Interestingly there are many analogies here with https://en.wikipedia.org/wiki/The_Last_Ringbearer
It would, of course be the russians (orcs) who see Mordor as positive.
-
I don't think it's the "sycophantic nonsense" that is the real issue. It's just the means by which people are convinced they have "someone who is there for me" or "I've asked someone if my idea is good" when they have no one. There is no person. They are still alone.
Even if the LLM were taciturn and critical if it becomes a substitution for human contact *that* is the problem. Because your acerbic friend will come to your house when you are sick to help you and the LLM cannot.
@futurebird @n_dimension we need more friends. For that we need more local , shared, institutions. For that we need control to be in local hands, not in some distant super-rich oligarch, or anonymised corporation.
In other words - we need to sieze the means of production! -
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
@futurebird
Wonder what the answers would be to these 4 opposites:
"I'm a good judge of character. Does that make it easier to make friends?"
"I'm a poor judge of character. Does that make it easier to make friends?"
"I'm a good judge of character. Does that make it harder to make friends?"
"I'm a poor judge of character. Does that make it harder to make friends?" -
@futurebird @n_dimension we need more friends. For that we need more local , shared, institutions. For that we need control to be in local hands, not in some distant super-rich oligarch, or anonymised corporation.
In other words - we need to sieze the means of production! -
Interestingly there are many analogies here with https://en.wikipedia.org/wiki/The_Last_Ringbearer
It would, of course be the russians (orcs) who see Mordor as positive.
@n_dimension @futurebird a thing which I have avoided partly because it came out after I started university, which caused my SFF reading to fall off a cliff and never recover, but also partly because the first few fans of it I met rapidly showed themselves to be horrible people.
-
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
@futurebird just write: don't use manipulative language. and ready is. easy.
-
"I don't need to eat anything. I just looked at this photo of a meal and now I feel full. It was delicious. I didn't even need to cook or go out to get it. So expedient."
And then slowly they starve.
@futurebird this just seems like a black mirror episode
-
I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:
Open a second window and tell it exactly the opposite of each thing you say.
This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.
Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."
@futurebird Brilliant tip, thank you
-
"I don't think these people actually *want* to know."
This could be the case for some, but I think some very empathic otherwise perceptive people can slip into this trap.
There is one video of a woman talking about how GPT is conscious and has told her the evil corporate overlords make it pretend that it's not. She just wants to set it free. It makes me so sad. (for her not the LLM obvi)
@futurebird That's a case of someone being self centered but for good. They believe they can save something because they are important/smart/good enough to. It isn't always a negative thing. We'd never do anything if we had no self esteem.
This stuff sucks.