I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:

81 Posts 47 Posters
  • myrmepropagandist wrote:

    @ligasser

    This can be very dangerous for people who think "I don't really ever need to talk to anyone about my feelings."

    That isn't true; it's just that their needs are minimal.

    "Feeling down."
    "ya"

    That reply is only two letters, but getting it can make you feel so much better. It represents someone who, should things get worse, might come over and help you.

    A chatbot can say "ya" too. But it doesn't make you feel better... **unless** you think it's a person. That's the danger.

    Linus Gasser wrote (#22):

    @futurebird Let's hope that people will still want to see other people 🙂

    <sarcasm>Or, less nice: natural selection will take care of that?</sarcasm>

    • myrmepropagandist wrote:

      Frankly, I'm kind of glad these GPTs were so sycophantic. A more critical voice might have been more appealing to me. A contrarian bot who always nitpicks and argues with you.

      That's how Facebook's old 2016 algorithm wasted so much of my time. I was sucked in by the opportunity to dismantle someone who is wrong. Not the most ... healthy personal quality. I'm always working on it.

      Gorgeous na Shock! wrote (#23):

      @futurebird Fuck... Thinking about it, I would hate a contrarian bot but I *might* become addicted to it. Or at least caught up in it sometimes. That's what Twitter was, right?

      I'm pretty hedonistic. Sycophancy is just overdue recognition for me, but it's *cheap* for a bot to be a sycophant. It's just words, which are free. I can do that myself in my head. If a pretty girl were telling me I'm lovely, at least she's using time she could otherwise be streaming on Twitch and earning money! Value!

      • myrmepropagandist wrote:

        Another "tip" is less welcome to me as an introvert. Make time for the people in your life. Talk to them. Let them know when you *really* think they are doing something amazing or creative. (Or when it's not "genius" because you are real and care.) Listen. Be there.

        The thing is, as much as doing this is scary and I want to avoid it, it makes me feel better too in the long run, I think.

        Antarctic Chique wrote (#24):
        @futurebird Not to be a peddler of black pills here, but the concepts of sycophancy, yes-people, insincerity, manipulative behavior, etc. etc. of course all predate LLM-based chat-bots.

        Behind the rather shallow abyss (the danger of mistaking a chatbot for a person, or for a distinct entity at all) waits a deeper one: taking an experiment you can actually conduct (A/B-testing two instances of the same chatbot with different inputs) and turning it into a thought experiment about running the same experiment on actual people, then drawing extreme conclusions from it.
        • myrmepropagandist wrote:

          But why is it so fulfilling to have a good back and forth with someone? To disagree and pull the whole problem apart and ideally come out on top? (though it's also fun to discover you needed to learn something too, it's just less fun and rewarding)

          It's fulfilling because they care about what you are saying enough to criticize it. The difference between the art teacher who says "that's a very nice drawing" and "I can see that you are trying to do X but it's failing/working in these ways."

          hanktank61 wrote (#25):

          @futurebird Perhaps I mentioned this before, not sure.
          I was a member of a Toronto-based forum with global reach from 1999 till 2010, when it stopped. About old Canadian/Celtic stories, and just random off-topic. About 70% female, plenty with Asian roots. The average IT level was far above mine. The game of self-stalking and double-googling was played sometimes: "Try to find me somewhere else" and "google once and google twice for the opposite". A lot of fun with that, and the instinct was woken up.

          • Linus Gasser wrote:

            @futurebird OK, I can agree with that. We do need more human interactions. At least I see my kids going in that direction, which is nice!

            What I like about the LLMs is the possibility of giving higher-quality documents for review, because the low-hanging fruit has already been culled. But we should definitely profit from all the free time we get!

            Who was it in the '60s who said we'd only be working like two days a week?

            Ariaflame wrote (#26):

            @ligasser @futurebird If you think it's high quality I shudder to think what you were seeing before.

            • myrmepropagandist wrote:

              I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:

              Open a second window and tell it exactly the opposite of each thing you say.

              This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.

              Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."

              Billiglarper wrote (#27):

              @futurebird

              Thanks for your advice.

              My impression is that there are two ... stages to AI psychosis?

              The first is mistaking an AI for a person.

              The second is doubting the ... reality of the world. "If this simulation of a person wasn't real, then what else that I took for real just isn't?" (Or the AI guides folks to disregard everything else as fake.)

                Billiglarper wrote (#28):

                @futurebird

                I feel like the second stage is way harder to deal with. A person I care about got caught in the latter. They also have episodes of nausea or overstimulation.

                Once people start distrusting their senses and ontology, a cure is probably non-trivial. (They have a therapist, but it's quite a new field, unfortunately.)


                  myrmepropagandist wrote (#29):

                  @billiglarper

                  I think that is one of the many ways it can work. But it also might go in many different directions based on the person.

                  When people are put in solitary confinement it can be torture and induce mental illness and suffering.

                  I think it is this isolation that causes the problem. The person has put themselves "in the hole," but they don't know they are isolated.

                  Isolation removes the checks and balances that help keep us sane. The little nudges back to a healthy mental place.


                    myrmepropagandist wrote (#30):

                    @billiglarper

                    Maybe "LLM enabled social cloistering"
                    "LLM enabled emotional solitary confinement."

                    "Single Person Cult"

                    Like creating a cult for just you and abusing yourself as the only member of that cult.


                      Petra van Cronenburg wrote (#31):

                      @futurebird May I ask what "AI psychosis" means? I know about real psychosis caring for a mentally ill person, but can't see parallels to AI use. Do you talk about these "hallucinations"? (I may ask naively ... English is a foreign language for me).


                        Billiglarper wrote (#32):

                        @futurebird

                        Hm. I'm not a mental health expert.

                        I agree that isolation can definitely be a problem. Building mental resilience through touching grass, social interaction, enough sleep, and a lack of existential worries sure helps.

                        But just like with depression, I'm not sure if that alone is always enough. And I'm not sure with AI psychosis what the best approach is when it isn't.


                          Darnell Clayton :verified: wrote (#33):

                          @futurebird @billiglarper Another thing I have noticed is that many AIs are programmed with a personality.

                          Google Gemini comes across as a helpful, informative butler, providing reasonable answers but scared to discuss anything too controversial.

                          Grok (on 𝕏) is like a snarky best friend, who is intelligent but wild & has no issues embracing taboos or breaking the law.

                          Couple that with the fact that 𝕏 also has those flirtatious companions (ewwwww!!!!) & this "AI is your soulmate" thing becomes an issue.

                          1 Reply Last reply
                          0
                          • Heather 👻 wrote (#34):

                            @flamecat @futurebird That sounds so upsetting, like you've been replaced and valued less than a robot. I'm sorry that happened to you.


                              John Maxwell wrote (#35):

                              @billiglarper @futurebird It's not a question of whether it's enough.

                              The point is that interaction with real people is _necessary_. Not having it is a fundamental problem, in addition to any other problems.

                              (I also suspect, without proof, that it makes most other problems both worse and harder to address.)


                                Billiglarper wrote (#36):

                                @futurebird

                                The comparison with a cult is a good one. But just like with cults, the issues don't stop once you see through it. There has been damage done.

                                And there seems to be a new quality to putting all that emotional energy and bonding into something that isn't there. Getting fooled by people is one thing. Getting manipulated by a non-entity seems to shake you on a different level.


                                  Heather 👻 wrote (#37):

                                  @futurebird Surely this means the person doing the questions needs enough critical thinking, and not too much self-centeredness, to understand what the opposite is to their question though?

                                  I don't think these people actually *want* to know. Time and time again, people challenged on their beliefs hold onto them more strongly.


                                    myrmepropagandist wrote (#38):

                                    @Akki

                                    "I don't think these people actually *want* to know."

                                    This could be the case for some, but I think some very empathic otherwise perceptive people can slip into this trap.

                                    There is one video of a woman talking about how GPT is conscious and has told her the evil corporate overlords make it pretend that it's not. She just wants to set it free. It makes me so sad. (for her not the LLM obvi)


                                      Janet Fraser wrote (#39):

                                      @futurebird I hear you like a megaphone.


                                        Pete wrote (#40):

                                        @futurebird Another great trick is to ask it to tell you how it understood your prompt.

                                          • myrmepropagandist wrote (#41):

                                          @flamecat

                                          That would hurt my feelings so much. And I think it's very likely he might not realize how hurtful it is, or why.

                                          "I don't want to bother you with my little stuff." That is how he could see it.
