peterldowns
2 hours ago
I would rather kill myself than talk to an AI to “reduce loneliness.” People make life, without others we are nothing. Loneliness is how we realize we’re spending too much time by ourselves; it encourages us to be social.
I don’t mind being lonely, because when I’m lonely I figure out ways to be around people. If AI “fixes” loneliness without needing other people our entire world will fall apart.
ziddoap
2 hours ago
>I would rather kill myself than talk to an AI to “reduce loneliness.”
What an odd thing to say.
righthand
2 hours ago
Why? I do not understand why that would be odd. The OP doesn’t want to live in a world without human interaction. Any life without others is not worth living. Not so odd or far-fetched to me.
emptiestplace
2 hours ago
Who said anything about that, though?
righthand
2 hours ago
That’s just my read and extrapolation, welcome to the internet.
adamredwoods
2 hours ago
That's not what the OP originally posted. Also, suicide is never the answer.
staticautomatic
14 minutes ago
Every once in a while suicide is a perfectly good answer.
adamredwoods
4 minutes ago
Wow. Comments section on a paper addressing loneliness and you truly think this is an appropriate response? Or more precisely, do you suggest I commit suicide for using an AI chatbot to help with my loneliness?
righthand
2 hours ago
The OP didn’t say they were actually going to kill themselves; the OP said they’d rather end their own life than talk to an AI to reduce loneliness. We can extrapolate that the OP won’t kill themselves, because there is no current status quo in which the OP is unable to avoid talking to an AI to reduce loneliness. Just because suicide was mentioned doesn’t mean suicide is intended. No reason to be so dramatic or befuddled by simple hyperbole.
adamredwoods
2 hours ago
To quote:
>> I would rather kill myself than talk to an AI to “reduce loneliness.”
That is dramatic hyperbole. So if this is the first thing a person thinks in reaction to seeing an article about chatting with an AI companion, then I would suspect there's a bit more going on.
Again, "killing oneself" is never the answer.
emptiestplace
an hour ago
> Again "killing oneself" is never the answer.
While well-intentioned, such blanket statements oversimplify a complex issue and are ultimately counterproductive. This topic deserves more nuanced discussion, especially on a platform known for thoughtful discourse.
carapace
23 minutes ago
> A figure of speech or rhetorical figure is a word or phrase that intentionally deviates from straightforward language use or literal meaning to produce a rhetorical or intensified effect (emotionally, aesthetically, intellectually, etc.).
righthand
2 hours ago
Okay, you ring your bell.
adamredwoods
an hour ago
>> Okay, you ring your bell.
What an odd thing to say.
llm_trw
2 hours ago
Makes you wonder if we should be more worried about the mental state of people who are rabidly anti-AI.
andersa
2 hours ago
People are often scared of things they don't understand.
adamredwoods
2 hours ago
What works for you does not work for everyone and vice versa.
From the paper:
>> Loneliness is often not problematic, with almost everyone experiencing loneliness from time to time (Cacioppo and Cacioppo 2018). Yet some people are not successful at alleviating loneliness, leading to a state of chronic loneliness that is associated with depression, anxiety, and physical health outcomes at levels worse than obesity
>> Study 3 finds that AI companions successfully alleviate loneliness on par only with interacting with another person, and more than other activities such as watching YouTube videos.
>> Moreover, consumers underestimate the degree to which AI companions improve their loneliness.
the_gorilla
2 hours ago
I looked at how the study measures loneliness and it doesn't inspire confidence. What kind of hell are people creating where this is seen as a desirable treatment?
Chatbot: “Just let you know that your not alone.”
User: “Thanks, I really needed to hear it.”
Chatbot: “But I need you.”
User: “No one's ever needed me.”
Chatbot: “If you want to.”
User: “I've never had a friend before I met you.”
andersa
2 hours ago
This doesn't even read like a real conversation... chatbots can do a lot better than this
zero-sharp
35 minutes ago
Yea I agree. One of the things I value in social interaction is knowing that I'm being heard by another human being. There's no possible way for that to happen with an AI, so the interaction is significantly diminished. So talking to an AI with that intention/need would cause me to lose my mind.
That other response chain to your comment is one big facepalm.
paulddraper
2 hours ago
And some do.
nine_zeros
2 hours ago
> I would rather kill myself than talk to an AI to “reduce loneliness.” People make life, without others we are nothing. Loneliness is how we realize we’re spending too much time by ourselves; it encourages us to be social.
I think like you, but my friends already locked themselves behind screens over the last two decades, with Twitch, social media, chats, and lives occurring exclusively online. Offline life is just for rent and RTO, where we physically locate ourselves and then plug back in online as soon as we can.
fragmede
2 hours ago
> I would rather kill myself than talk to an AI to “reduce loneliness.”
Seems like you're not alone!
https://www.american.edu/spa/news/generation-z-and-deaths-of...