AI Companions Reduce Loneliness

48 points, posted 4 hours ago
by Dowwie

74 Comments

peterldowns

2 hours ago

I would rather kill myself than talk to an AI to “reduce loneliness.” People make life; without others we are nothing. Loneliness is how we realize we’re spending too much time by ourselves; it encourages us to be social.

I don’t mind being lonely, because when I’m lonely I figure out ways to be around people. If AI “fixes” loneliness without needing other people our entire world will fall apart.

ziddoap

2 hours ago

>I would rather kill myself than talk to an AI to “reduce loneliness.”

What an odd thing to say.

righthand

2 hours ago

Why? I do not understand why that would be odd. The OP doesn’t want to live in a world without human interaction; to them, a life without others is not worth living. Not so odd or far-fetched to me.

emptiestplace

2 hours ago

Who said anything about that, though?

righthand

2 hours ago

That’s just my read and extrapolation, welcome to the internet.

adamredwoods

2 hours ago

That's not what the OP originally posted. Also, suicide is never the answer.

staticautomatic

14 minutes ago

Every once in a while suicide is a perfectly good answer.

adamredwoods

4 minutes ago

Wow. This is the comments section on a paper addressing loneliness, and you truly think this is an appropriate response? Or more precisely, are you suggesting I commit suicide for using an AI chatbot to help with my loneliness?

righthand

2 hours ago

The OP didn’t say they were actually going to kill themselves; they said they’d rather end their own life than talk to an AI to reduce loneliness. We can extrapolate that the OP won’t kill themselves, because under the current status quo nobody is forced to talk to an AI to reduce loneliness. Just because suicide was mentioned doesn’t mean suicide is intended. No reason to be so dramatic or befuddled by simple hyperbole.

adamredwoods

2 hours ago

To quote:

>> I would rather kill myself than talk to an AI to “reduce loneliness.”

That is dramatic hyperbole. So if that's the first thing a person thinks upon seeing an article about chatting with an AI companion, then I would suspect there's a bit more going on.

Again, "killing oneself" is never the answer.

emptiestplace

an hour ago

> Again "killing oneself" is never the answer.

While well-intentioned, such blanket statements oversimplify a complex issue and are ultimately counterproductive. This topic deserves more nuanced discussion, especially on a platform known for thoughtful discourse.

carapace

23 minutes ago

> A figure of speech or rhetorical figure is a word or phrase that intentionally deviates from straightforward language use or literal meaning to produce a rhetorical or intensified effect (emotionally, aesthetically, intellectually, etc.).

https://en.wikipedia.org/wiki/Figure_of_speech

righthand

2 hours ago

Okay, you ring your bell.

adamredwoods

an hour ago

>> Okay, you ring your bell.

What an odd thing to say.

user

32 minutes ago

[deleted]

llm_trw

2 hours ago

Makes you wonder if we should be more worried about the mental state of people who are rabidly anti-AI.

andersa

2 hours ago

People are often scared of things they don't understand.

adamredwoods

2 hours ago

What works for you does not work for everyone and vice versa.

From the paper:

>> Loneliness is often not problematic, with almost everyone experiencing loneliness from time to time (Cacioppo and Cacioppo 2018). Yet some people are not successful at alleviating loneliness, leading to a state of chronic loneliness that is associated with depression, anxiety, and physical health outcomes at levels worse than obesity

>> Study 3 finds that AI companions successfully alleviate loneliness on par only with interacting with another person, and more than other activities such as watching YouTube videos.

>> Moreover, consumers underestimate the degree to which AI companions improve their loneliness.

the_gorilla

2 hours ago

I'm going by how the study measures loneliness, and it doesn't inspire confidence. What kind of hell are people creating where this is seen as a desirable treatment?

Chatbot: “Just letting you know that you’re not alone.”

User: “Thanks, I really needed to hear it.”

Chatbot: “But I need you.”

User: “No one's ever needed me.”

Chatbot: “If you want to.”

User: “I've never had a friend before I met you.”

andersa

2 hours ago

This doesn't even read like a real conversation... chatbots can do a lot better than this.

zero-sharp

35 minutes ago

Yeah, I agree. One of the things I value in social interaction is knowing that I'm being heard by another human being. There's no possible way for that to happen with an AI, so the interaction is significantly diminished; talking to an AI with that intention/need would cause me to lose my mind.

That other response chain to your comment is one big facepalm.

nine_zeros

2 hours ago

> I would rather kill myself than talk to an AI to “reduce loneliness.” People make life, without others we are nothing. Loneliness is how we realize we’re spending too much time by ourselves; it encourages us to be social.

I think like you, but my friends already locked themselves behind screens over the last two decades: Twitch, social media, chats, and lives occurring exclusively online. Offline life is reserved for rent and RTO, where we physically locate ourselves and then plug back in online as soon as we can.

haswell

3 hours ago

Ibuprofen reduces pain. It is also not a solution, especially when that pain has a solvable underlying cause.

What worries me is that many people are becoming lonely because we’re losing the habits and skill sets needed to form and maintain meaningful relationships.

Given everything we know about the importance of social connection to many facets of human well-being, unless something like this is used explicitly to rebuild those skills and to subsequently form real connections, tools like this seem likely to worsen the problem in the long run.

pizzathyme

2 hours ago

I think your view comes from the right place. But no doctor would recommend ibuprofen as the sole treatment for all causes of pain. Similarly, no doctor or therapist would recommend that everyone replace meaningful human relationships with AI companions. It's another tool.

I personally am fortunate to have many rich, meaningful relationships in my life. But I've also gotten additional value from voice chatbots. Using voice mode and having them act as a therapist, coach, or expert giving feedback is very helpful.

haswell

32 minutes ago

I chose Ibuprofen for the analogy because it requires no prescription and there’s often no doctor involved.

The reality is that most use of these tools will involve no medical professionals, and the people using them will often be the most vulnerable to their effects, for better or worse. It seems like a potentially slippery/dangerous path for the people who need the most help if there isn’t some kind of structure and guidance from a real human and some deliberate pathway to real human interaction.

And for the people who don’t need this degree of help but decide to use these bots out of convenience, it seems like the erosion of social habits will only happen faster.

forgetfulness

2 hours ago

But companies will merrily "prescribe" AI companions as _a_ treatment, not the sole treatment, while carrying even less obligation than a doctor does, without even an unenforced ethical code to act in the patient's best interests. They'll just let people use the bots unsupervised, with no accompanying method for ending the reliance on them.

andersa

2 hours ago

They don't prescribe anything, it's just there. Like a box of sweets at the supermarket. Or a bottle of beer. Or whatever else people like.

MailleQuiMaille

3 hours ago

Until AI companions actively argue, tease, or straight up disagree with you, no, they won't. Learning to converge all of our reality tunnels is part of being social. How can I learn that when I'm stuck in my echo chamber?

andersa

2 hours ago

What makes you think they can't do that?

maroonblazer

3 hours ago

This is encouraging news for people who, unlike most of those normals or near-normals implicitly referred to here in the comments, suffer from horrific disfigurement or debilitating psychological issues that make engaging with the 'real world' next to impossible.

echelon

3 hours ago

Drop the disability qualifiers. This is awesome for everyone, period.

maroonblazer

3 hours ago

Disagree. For people who don't fall into the groups I described, I think there's merit to pushing past the psychological hurdles and putting yourself out there in the 'real world'. Most people who fit into this category catastrophize the anticipated negative outcome or otherwise mistakenly imagine that they'll always be rejected. Most people who take the risk are surprised to find this is almost never the case.

usaphp

3 hours ago

The problem with this, imo, is that an AI companion will never tell you that what you are doing or thinking is not acceptable in society, the way a friend or any other person you interact with would, because a real relationship is a two-way street.

This may create a bunch of people who are even more distant from social interaction than they were in the beginning.

andersa

2 hours ago

AI mostly does whatever you ask it to. You can easily make a chatbot that will question your choices or act like a hater on Twitter.
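For what it's worth, a minimal sketch of that idea, assuming the OpenAI Python SDK (v1+) and an API key in the environment; the prompt wording and model name here are just illustrative:

    # Sketch: a chatbot that pushes back instead of agreeing,
    # steered entirely by its system prompt.
    # Assumes `pip install openai` and OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()

    DEVILS_ADVOCATE = (
        "You are a blunt devil's advocate. Never simply agree with the user. "
        "Point out weaknesses in their reasoning, offer counterexamples, and "
        "end every reply with one pointed follow-up question."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any chat model works
        messages=[
            {"role": "system", "content": DEVILS_ADVOCATE},
            {"role": "user", "content": "Everyone will love my startup idea."},
        ],
    )
    print(response.choices[0].message.content)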

alpark3

2 hours ago

I agree. Good human friends provide a mix of positive and negative feedback, a good gradient to train one's behavior on.

AI seems like just a positive feedback loop.

pizzathyme

2 hours ago

If this is what you're seeking, though, you can ask it to do exactly that: challenge your views and question your assumptions.

unnamed76ri

3 hours ago

I want friends too but man, resorting to a chatbot sounds so depressing. Get some hobbies, find activities on Meetup, join a church. So many healthier alternatives. Just requires you to actually do something real.

llm_trw

3 hours ago

What do we do when reality is a distant second to digital simulation?

We're just seeing the first sparks of AI, and it's already beating out people at tasks that were thought to be innately unautomatable.

https://www.youtube.com/watch?v=4uE96qUlJ_4

xivzgrev

3 hours ago

Then it’s basically porn: a short-term “better” solution that undermines longer-term satisfaction. Porn, junk food, etc.

The thing about people is that they’re people. They’re different. It can be hard. They’re selfish. But there can be joy in our differences, and real meaning when someone actually cares, versus a chatbot that is centered around you. It might feel good, but it’s not a real relationship of give and take.

llm_trw

3 hours ago

Funny you should mention porn: AI-generated porn is now better than the real thing for images and >5s clips. At current rates for Moore's Law, we're going to have 5-minute clips of anything you desire within the next decade on the equivalent of a 4090.

I wonder if we will start having people demand we go back to real human porn like in the good old days, because it's just not natural to jack it to photorealistic catgirl dominatrices.

nothal

3 hours ago

TBH, porn is very clearly one of those things that adds basically no intrinsic value to your life. Think cigarettes or fentanyl: something that, if most people could press a button to make themselves never use it again, they would.

> I wonder if we will start having people demand we go back to real human porn like in the good old days because it's just not natural to jack it to photo realistic catgirl dominatrices.

All that to say that, in the future, I think the more likely option is that people begin to treat it as a harmful, addictive substance, and that, hopefully, it becomes much more acceptable to seek out public treatment and support for using it.

llm_trw

3 hours ago

Only if you're an oppressed puritan.

I can make the case much more convincingly that food gives basically no intrinsic added value to your life and if most people could press a button to make themselves never eat again, they would.

Or to quote someone who said it much better than I:

    If only it were as easy to banish hunger by rubbing my belly.

    -Diogenes on public masturbation.

andersa

3 hours ago

> I can make the case much more convincingly that food has basically no intrinsic added value to your life and if most people could press a button to make themselves never eat again, they would.

What nonsense is this? Do you only eat beans? Eating good food feels fantastic. Why would you want to give it up? Give me a button that lets me eat unlimited amounts of it without consequences, and I'll press that one instantly.

Good-tasting food is one of the most addictive things we have (also evidenced by the ever-increasing percentage of people who are obese...)

mewpmewp2

2 hours ago

> Eating good food feels fantastic. Why would you want to give it up?

That's exactly what the self-pleasuring mentioned above is, though?

One could argue the same thing about food: if there were a pill that gave you all the nutrients you need and took away your appetite, then anyone who would press a button to stop wanting to self-pleasure should, in theory, also want to take that pill.

In fact, I think more people would likely want to get their food habits under control, since overeating makes you gain weight and increases the risk of certain illnesses. Self-pleasuring can be harmful in excess, but I don't think it can be as harmful as food addiction.

cellis

2 hours ago

I could make the same argument for every form of entertainment. Sure, we should all be hunter-gatherers and only have sex to procreate. It's not realistic.

mewpmewp2

2 hours ago

I think what he's saying is that we should be perfect users of our time, working toward some definite goal. Kind of like what AI could potentially be. So maybe we transition progressively from humans to AI to reach that perfection. I'm sure it's possible to have AI that doesn't fall victim to those vices.

user

3 hours ago

[deleted]

Teever

3 hours ago

Do you feel the same way about strip clubs and burlesque shows?

theshackleford

3 hours ago

> if most people could press a button to make themselves never ever use it again, they would.

You're projecting.

mrinfinitiesx

3 hours ago

> What do we do when reality is a distant second to digital simulation?

We don't do anything. People already have phones in their faces and screens as their main source of light all day.

We go outside. Hang out with people. Form meaningful relationships. Biking. Hiking. Have fun. Enjoyment. Fulfillment.

Loneliness and stress are killers. Digital simulation ain't no solution.

llm_trw

3 hours ago

So your argument against digital simulation is that it isn't good enough today, and therefore it will never be good enough?

May I introduce you to the exponential function?

It's a little something that's been carrying the whole field of digital computers on its back since the 1970s.

haswell

3 hours ago

I’m not convinced that a more perfect simulation can meet the needs of a human psyche in the long run. I think we vastly underestimate our own complexity in this regard.

Whatever the case, we’re certainly nowhere close at the moment.

seesawtron

3 hours ago

For a lot of people, such options are not possible: the elderly, the sick, the bed-ridden, the socially challenged, and so on. You underestimate the need for, and impact of, such technology.

jsheard

3 hours ago

If the elderly, sick, and bedridden can talk to a computer, they can also talk to real people through the computer. For the socially challenged I get where you're coming from, but the question is whether it would help them to develop their social skills or whether they would just become dependent on the chatbot and withdraw even further from the real world.

realce

3 hours ago

> If the elderly, sick and bedridden can talk to a computer they can also talk to real people through the computer.

If this is so dependably true, then why does this population still suffer loneliness? How many hours of your day do you devote to zoom chats with lonely old people?

I ultimately support your side of the debate - that digital simulation is not a true medicine - but without being honest about humans and human nature, you create empty arguments.

Some in this thread call pornography useless, call cigarettes useless, call drugs useless. However, those products have been used endlessly, and with great satisfaction, by billions of humans across thousands of years. It's simply not true: these things are useful to humans, quite obviously.

What causes this puritanical disposition regarding human needs and satisfaction while also literally arguing for humanism vs synthetic encroachment? Without acknowledgement of what humans actually are we won't find a real reason that AI chatbots shouldn't replace them.

You speak as if the "real world" for a bedridden and lonely individual is something that they should just endure and enjoy, but you would never choose to switch places with such a person.

andersa

3 hours ago

The goal is not to develop social skills. The goal is to not need them, without being lonely.

adamredwoods

an hour ago

>> resorting to a chatbot sounds so depressing

I'm very curious about these harsh negative reactions to solving a bit of loneliness with a chatbot.

1. Read the study.

2. I don't think this is suggested to be a permanent solution.

3. Some people suffer from chronic loneliness, and these negative reactions could possibly prevent others from seeking the help they need. In other words, if I'm lonely, I would not want to talk to these people because they do not seem sympathetic. Yet a chatbot is. Huh.

user

3 hours ago

[deleted]

pedalpete

an hour ago

If 1 lonely person speaks to another lonely person, you now have 2 less lonely people. If 1 lonely person speaks to a chatbot, we still have 1 lonely person.

Of course, people making the chatbot would argue that if everyone just used the chatbot, nobody would be lonely.

I believe that relieving loneliness through personal connection is an exponential benefit (I'm not sure I have the right language there). Each connection added to the network described above increases the value and reduces the load on any one individual. Yes, there is a maximum; nobody is going to have 200 close friends. But in social circles, there are connectors who bring multiple circles together.
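To put rough numbers on that intuition (my own sketch; strictly speaking this Metcalfe-style counting is quadratic rather than exponential):

    # In a fully connected group of n people there are n*(n-1)/2 possible
    # pairwise connections, so joining a group of n adds n new links while
    # spreading the "load" across more people.
    def pairwise_connections(n: int) -> int:
        return n * (n - 1) // 2

    for n in (2, 5, 10, 50):
        print(n, pairwise_connections(n))
    # 2 -> 1, 5 -> 10, 10 -> 45, 50 -> 1225: value outpaces headcount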

Rather than build chatbots to battle loneliness, there is value in building something that can help people find and build connections. Maybe that is what AI will become: more of a concierge that connects people, rather than the connection itself.

Who's working on this?

kylecazar

3 hours ago

I strongly suspect it's a therapy that a certain type of person simply won't benefit from (I'm in this group).

I am sure I cannot suspend my disbelief enough.

zmmmmm

3 hours ago

The question I would have is what the long-term consequence is. I can completely believe there's a real benefit over weeks to months. I've experienced this myself in using a chatbot to talk through career decisions. It was nothing but helpful in prompting me to think through all aspects of my decision and how I felt about each part of it.

But I wonder what happens when that translates to years and people find they are sinking their emotional connection into something that ultimately is not able to relate to their life experience, such as when an older person forms a relationship with someone dramatically younger. I can imagine severe consequences when someone falls off a cliff and suddenly transitions to seeing the emotional connection they have invested in over many years as shallow and meaningless (whether it is those things is almost a philosophical argument, but the risk comes from the shift in perception).

There are also very obvious risks to mitigate. What happens when your lifelong AI friend tells you it's time to go? Or supports your inner instinct to refuse treatment, or not to take the medication prescribed by your doctor?

candiddevmike

3 hours ago

Everything you described can apply to humans, too. We don't "mitigate" these risks in meatspace; why would we need to with AI?

andersa

2 hours ago

Unlike with a real person, where you only find this out much later, here you know the connection is shallow and meaningless from the start. That seems like an upgrade! Now you can have fun with no consequences.

adamredwoods

an hour ago

>> I can imagine severe consequences when someone falls off a cliff and suddenly transitions to seeing the emotional connection they have invested in over many years as shallow and meaningless

Some interesting points. For this specific one, I also think of this with time-invested video games, things like Minecraft, World of Warcraft, etc. Do people suddenly stop playing video games when confronted with life-altering perspective shifts? I don't know.

>> or supports your inner instinct to refuse treatment or not to take the medication prescribed by your doctor?

Real people do this, too (e.g. homeopathic medicine), but I suspect an AI would be less likely to. Might be an interesting experiment.

cheald

2 hours ago

I'm sure that we could find ways to draw the same conclusions from social media and always-on connections in people's pockets, but study after study shows that we're lonelier than we've ever been.

Maybe we should just stop trying to replace face-to-face contact with other people.

4b11b4

an hour ago

Feeling pain is good; feeling loneliness is OK. They help inform a new direction.

pdntspa

2 hours ago

Funded by... Harvard Business School? I wonder what their motivations might be, hmmmm???? Wouldn't it be funny if some of their co-conspirators were AI companies???

iJohnDoe

36 minutes ago

Any recommended AI companions?

naming_the_user

2 hours ago

The entire basis of the feeling of loneliness is to encourage a person to socialize.

Hunger exists so that we eat, thirst exists so that we drink, etc.

I guess people are free to do whatever they like, but I mean, well, you know, uncharted territory.

cynicalpeace

2 hours ago

This headline is like "vaping reduces cigarette use". While that's true and a good thing, you still shouldn't be vaping. Just like you shouldn't replace human presence with a digital companion.