TheAceOfHearts
3 hours ago
> At the core of most definitions you’ll find the idea of a machine that can match humans on a wide range of cognitive tasks.
I expect this definition will eventually be proven incorrect. It would be better described as "human-level AGI" rather than AGI in general. AGI is a system that matches a core set of properties, but it's not necessarily tied to capabilities. Theoretically one could create a very small, resource-limited AGI. The amount of computational resources available to the AGI will probably be one of the factors that determines whether it's e.g. cat level vs human level.
Dylan16807
12 minutes ago
That definition gives me a headache. If it's not up to the level of a human then it's not "general". If you cut down the resources so much that it drops to cat level, then it's a cut down model related to an AGI model and no more.
dr_dshiv
2 hours ago
That’s like Peter Norvig’s definition of AGI [1] which is defined with respect to general purpose digital computers. The general intelligence refers to the foundation model that can be repurposed to many different contexts. I like that definition because it is clear.
Currently, AGI is defined in a way where it is truly indistinguishable from superintelligence. I don’t find that helpful.
[1] https://www.noemamag.com/artificial-general-intelligence-is-...
turtletontine
2 hours ago
What does this even mean? How can we say a definition of "AGI" is "correct" or "incorrect" when the only thing people can agree on is that we don't have AGI yet?