gklitz
a year ago
I’ve sort of lost interest in AGI. It was always an interest because of all the cool things I imagined it would be able to do, but here we are now without AGI and yet with so many of the cool things I was imagining. So I care less whether it's AGI or not. It’s extremely useful nonetheless. And as models get better and better and as tools improve, we’ll see more and more value added to society even if it isn’t AGI.
nytesky
a year ago
I have developed ethical qualms about AGI. We end up either enslaving an intelligence, breeding it into submission, or facing a powerful adversary.
az09mugen
a year ago
This. AGI enthusiasts are disconnected from reality: they want something without grasping the consequences of having it. Exactly like kids who want a dog because they only see that they could play with it, but don't see that the dog needs to eat, go out for walks, sometimes go to the vet, and can sometimes bite. AGI enthusiasts are not ready to face an AGI, and none of them is ready to take responsibility for one, assuming we see one in the next 50 years.
polishdude20
a year ago
Exactly. We are trying to develop something human-like, or more powerful than us, so that it does our bidding. We're calling it "technology" and a "tool," but at some point it becomes more than that.
_xerces_
a year ago
I'm still waiting to see any (net) value to society from our current AI. So far I believe it is negative despite being a paid ChatGPT user myself.