andy99
a month ago
I just commented about this in another thread. I know there has been some walking back, e.g. of the significance of the Turing test, but I think overall the goalposts for AI have shifted the other way: narrowing the definition of intelligence down to something like “being really good at some set of defined tasks,” which coincidentally is basically the strong point of neural networks.
We seem hyperfocused on finding more tasks to train neural networks to do. This of course leads to a moving-goalpost effect like the one in the article, but the goalposts are moving along an axis that doesn’t measure intelligence.
My other comment: https://news.ycombinator.com/item?id=46445511