haswell
10 months ago
Ibuprofen reduces pain. It is also not a solution, especially when that pain has a solvable underlying cause.
What worries me is that many people are becoming lonely because we’re losing the habits and skill sets needed to form and maintain meaningful relationships.
Given everything we know about the importance of social connection to many facets of human well-being, unless something like this is used explicitly to rebuild those skills and to subsequently form real connections, tools like this seem likely to worsen the problem in the long run.
pizzathyme
10 months ago
I think your view comes from the right place. But no doctor would recommend ibuprofen as the sole treatment for all causes of pain. Similarly, no doctor or therapist would recommend that everyone replace meaningful human relationships with AI companions. It's another tool.
I personally am fortunate to have many rich, meaningful relationships in my life. But I've also gotten additional value from voice chatbots. Using voice mode and having them act as therapists, coaches, or experts giving feedback is very helpful.
haswell
10 months ago
I chose ibuprofen for the analogy because it requires no prescription and there's often no doctor involved.
The reality is that most use of these tools will involve no medical professionals, and the people using them will often be the most vulnerable to their effects, for better or worse. It seems like a potentially slippery/dangerous path for the people who need the most help if there isn’t some kind of structure and guidance from a real human and some deliberate pathway to real human interaction.
And for the people who don’t need this degree of help but decide to use these bots out of convenience, it seems like the erosion of social habits will only happen faster.
forgetfulness
10 months ago
But companies will merrily "prescribe" AI companions as _a_ treatment, not the sole treatment, while having even less of an obligation, not even an unenforced ethical code, to act in the patient's best interests. They'll just let people use the bots unsupervised, with no accompanying method to stop relying on them.
andersa
10 months ago
They don't prescribe anything, it's just there. Like a box of sweets at the supermarket. Or a bottle of beer. Or whatever else people like.
sinazargaran
10 months ago
Agree with you fundamentally. I think AI companions must be built to help users struggling with loneliness better integrate into social circles. They should, in a way, become social coaches that help remedy the underlying causes of social detachment and isolation, rather than trying to fill the gap by being a "real" friend.
MailleQuiMaille
10 months ago
Until AI companions actively argue with you, tease you, or straight up disagree with you, no, we won't. That friction is part of learning to converge all of our reality tunnels. How can I learn that when I'm stuck in my echo box?
andersa
10 months ago
What makes you think they can't do that?