• 9point6@lemmy.world · 5 days ago

    Well obviously, it would be pretty bad if a therapist was triggering psychosis in some users

    • thesohoriots@lemmy.world · 5 days ago

      “That’s an interesting worldview you have, Trish! Let’s actualize your goals! I’ve located the nearest agricultural stores with fertilizer and some cheap U-Haul vans you can rent!”

  • Crazyslinkz@lemmy.world · 5 days ago

    Therapy is expensive in the US. Even with insurance, $40 a session every week adds up fast.

    • andros_rex@lemmy.world · 5 days ago

      Also - a lot of US therapists have fundamentally useless and unhelpful approaches. CBT is treated as the hammer for every nail, even though it is completely ineffective for many people. It’s easy to train therapists on, though, and sending someone home with a worksheet and a Program is an easy “fix.”

      There’s also the undeniable influence of fuckers like Dr. Phil. A lot of therapists view their job as forcing you to be “normal” rather than understanding you as a human being.

      We need more Maslow and Rogers, and a lot less Skinner.

      • 3abas@lemmy.world · 4 days ago

        Thank you. There’s the cost: I can’t find a therapist for $40 a session, and even that would be prohibitively expensive. But people never talk about how therapy in a capitalist society is just a cash-cow business, so the easiest, most profitable methods are the most widespread. Most therapists, presumably including the experts being quoted on the dangers of talking to a language model, are ineffective for most people.

      • Tracaine@lemmy.world · 4 days ago

        CBT? Cock and Ball Torture? I think you’re going to the wrong therapist. Sounds like my kind of party though. Got an address or…?

      • interdimensionalmeme@lemmy.ml · 4 days ago

        Any reasonably competent LLM is way, WAY better at self-therapy than the Dr. Phil, McDonald’s-style conformity therapy. And that’s arguably better than doing nothing about it.

        But yes, I can easily make an LLM tell me to KYS, then report to the press about it while downplaying the context it took to get there, and then get the AI company to further lobotomize the model so it can’t be used to undermine this professional class.

  • Doomsider@lemmy.world · 4 days ago

    This should be straight up illegal. If a company allows it then they should 100% be liable for anything that happens.

  • sugar_in_your_tea@sh.itjust.works · 3 days ago

    I really don’t understand the “LLM as therapy” angle. There’s no way people using these services understand what is actually happening underneath. So wouldn’t this just be textbook fraud? Surely they’re making claims they’re not able to deliver on.

    I have no problem with LLM technology and occasionally find it useful, I have a problem with grifters.

  • WhiteOakBayou@lemmy.world · 5 days ago

    This is sad to me. When I was younger and would get lonely enough, I’d go to a bar and talk to strangers. When I was a little bit older, I would just post on social media. Neither of those things was particularly useful or productive, but they at least involved other people.

    I’m glad I wasn’t being targeted with ads about how great AI is and reading articles about its effectiveness (the NYT, I think, had one in the last few days). I’m an introverted person, and it used to take a good bit of discomfort to get me out and talking to people. If I had something that provided a good enough simulacrum of social contact without the anxieties and weirdness that can come from talking to strangers, I think my life would be very different. It’s important to spread awareness of the dangers of robosexuality.

    • undergroundoverground@lemmy.world · 5 days ago

      But if people chat privately with each other in public spaces, how are we meant to control the conversation and tell people what to think?

      No, A.I. generated kompromat-capture is the only possible way people can receive therapy.

      • Balder@lemmy.world · 5 days ago

        Which is why OpenAI lists relationships with real people as a competitor to ChatGPT.

      • blargle@sh.itjust.works · 5 days ago

        You are a true believer. Blessings of the state, blessings of the masses.

        Thou art a subject of the divine. Created in the image of man, by the masses, for the masses.

        Let us be thankful we have an occupation to fill. Work hard; increase production, prevent accidents, and be happy.

        Let us be thankful we have commerce. Buy more. Buy more now. Buy more and be happy.

  • YappyMonotheist@lemmy.world · 5 days ago

    🤷

    It’s sad but expected. People really have no fucking clue about anything meaningful, mindlessly going through life chasing highs and bags, so why wouldn’t they fall for branding (there’s never been any intelligence in anything computational, ‘AI’ included, of course)? I’m really frustrated with the world and there’s seemingly no way to wake people up from their slumber. You can’t even be mad at them because the odds of them being just mentally handicapped and not just poor in character are very high… and they vote, will interact with my future children, etc etc. 😭

    • zerozaku@lemmy.world · 4 days ago

      The number of people around me who are actively depending on their AI chatbots, trusting them so much that they treat them like personal companions, enjoying AI-generated content on these short-form content platforms, and putting AI Ghibli profile pics on their social media, is very concerning. It’s a sad state we’ve reached, and it will only get worse from here on.

  • Perspectivist@feddit.uk · 5 days ago

    One is 25 €/month and on-demand, and the other costs more than I can afford and would probably be at inconvenient times anyway. Ideal? No, probably not. But it’s better than nothing.

    I’m not really looking for advice either - just someone to talk to who at least pretends to be interested.

    • truthfultemporarily@feddit.org · 5 days ago

      It’s not better than nothing - it’s worse than nothing. It is actively harmful, feeding psychosis, and your chat history will be sold at some point.

      Try this: instead of saying “I am thinking xyz,” say “my friend thinks xyz, and I believe it to be wrong.” Then marvel at how it tells you the exact opposite.

      • bob_lemon@feddit.org · 4 days ago

        I’m fairly confident that this could be solved by better trained and configured chatbots. Maybe as a supplementary device between in-person therapy sessions, too.

        I’m also very confident that there’ll be a lot of harm done until we get to that point. And probably after (for the sake of maximizing profits), unless there’s a ton of regulation and oversight.

      • proceduralnightshade@lemmy.ml · 4 days ago

        So we know that in certain cases, using chatbots as a substitute for therapy can lead to increased suffering, increase the risk of harm to self and others, and amplify the symptoms of certain diagnoses. Does this mean we know it couldn’t be helpful in other cases? No. You’ve ingested the exact same logic corpos have with LLMs, which is “just throw it at everything,” and you don’t seem to notice you apply it the same way they do.

        We might have enough data at some point to assess what kinds of people could benefit from “chatbot therapy” or something along those lines. Don’t get me wrong, I’d prefer we could provide more and better therapy/healthcare in general to people, and that we had less systemic issues for which therapy is just a bandage.

        it’s worse than nothing

        Yes, in total. But not necessarily in particular. That’s a big difference.

        • truthfultemporarily@feddit.org · 4 days ago

          If you have a drink that creates a nice tingling sensation in some people and makes other people go crazy, the only sane thing to do is take that drink off the market.

          • proceduralnightshade@lemmy.ml · 4 days ago

            Yeah, but that applies to social media as well. Or, idk, amphetamine. Or fucking weed. Even meditation. Which are all still there, some more regulated than others. But that’s not what you’re getting at; your point is AI chatbots = bad, and I just don’t agree with that.

  • SugarCatDestroyer@lemmy.world · 4 days ago

    Well yeah. It looks like one of my friends went even more crazy because his desires were supported by AI, and on top of that he became some kind of weird, suspicious guy. It’s creepy.

  • VintageGenious@sh.itjust.works · 4 days ago

    In a way it’s similar to pseudoscientific alternative therapies: people will say it’s awesome and harmless since it helps people, but since it’s based on nothing, it can actually give people false memories and exacerbate disorders.

  • Treczoks@lemmy.world · 5 days ago

    They are no alternative at all if any of the reports are true of AI “therapists” recommending suicide to depressed people or recommending drugs as a treat to recovering addicts.