KaiserPro
15 hours ago
One of the sad things about tech is that nobody really looks at history.
The same kinds of essays were written about trains, planes and nuclear power.
Before Lindbergh went off the deep end, he was convinced that "airmen" were gentlemen and could sort out the world's ills.
The essay contains a lot of coulds, but doesn't touch on the base problem: human nature.
AI will be used to make things cheaper. That is, lots of job losses. Most of us are up for the chop if/when competent AI agents become possible.
Loads of service jobs too, along with a load of manual jobs once suitable large models are successfully applied to robotics (see ECCV for some idea of the progress in machine perception).
But those profits will not be shared. Human productivity has exploded in the last 120 years, yet we are working longer hours for less pay.
Well, AI is going to make that worse. It'll cause huge unrest (see the Luddite riots, Peterloo, the birth of unionism in the USA, plus many more).
This brings us to the next thing that AI will be applied to: Murdering people.
Anduril is already marrying basic machine perception with cheap drones and explosives. It's not going to take long to get to personalised explosive drones.
AI isn't the problem, we are.
The sooner we realise that it's not a technical problem to be solved but a human one, the better our chances.
But looking at the emotionally stunted, empathy vacuums that control either policy or purse strings, I think it'll take a catastrophe to change course.
kranke155
15 hours ago
We are entering a dystopia and people are still writing these wonderful essays about how AI will help us.
Microtargeted psychometrics (Cambridge Analytica, AggregateIQ) have already made politics in the West an unending barrage of information warfare. Now we'll have millions of autonomous agents. At some point soon in the future, our entire feed will be AI content or upvoted by AI or AI manipulating the algorithm.
It's like you said - this essay reads like peak AI. We will never have as much hope and optimism about the next 20 years as we seem to have now.
Reminds me of some graffiti I saw in London, while the city's cost of living was exploding and making the place unaffordable to all but a few:
"We live in a Utopia. It's just not ours."
xpe
11 hours ago
Is such certainty warranted? I don’t think so; it strains credibility.
I’m very concerned about many future scenarios. But I admit the necessity of probabilistic assessments.
musicale
5 hours ago
As long as there is money to be made or power to be gained, any technology will be used to get it, especially if the negative effects are externalities that the technology developers and users are not liable for.
xpe
11 hours ago
The author makes it very clear that AI can cut both ways. See the intro; you may have overlooked it.
MichaelZuo
13 hours ago
You're looking at it from a narrow perspective.
There are millions of middle class households living pretty comfortable lives in Africa, India, China, ASEAN, and Central Asia that were living hand-to-mouth 20 years ago.
And I don’t mean middle class by developing country standards, I mean middle class by London, UK, standards.
So it pretty much is a ‘utopia’ for them, assuming they can keep it.
Of course that’s cold comfort for households in London regressing to the global average, but that’s the inherent nature of rising above and falling towards averages.
sitkack
12 hours ago
But now you are missing the GP's point.
MichaelZuo
12 hours ago
How so?
ManuelKiessling
13 hours ago
I do not agree with the following:
> But those profits will not be shared. Human productivity has exploded in the last 120 years, yet we are working longer hours for less pay.
I am, however, criticizing this in isolation — that is, my goal is not to invalidate (nor validate, for that matter) the rest of your text; only this specific point.
So, I do not agree. We are clearly working far fewer hours than 120 or even 60 years ago, and we are getting a lot more back for it.
The problem I have with this is that the framing is often wrong — whether some number on a paycheck goes up or down is completely irrelevant at the end of the day.
The only relevant question boils down to this: how many hours of hardship do I have to put in, in order to get X?
And X can be many different things. Like, say, a steak, or a refill at the gas station, or a bread.
Now, I do not have very good data at hand right here and right now, but if my memory and my gut feeling serves me right, the difference is significant, often even dramatic.
For example, for one kilogram of beef, the average German worker needs to toil about 36 minutes nowadays.
In 1970, it took twice as much working time to afford the same amount of beef.
In the seventies, Germans needed to work 145 hours to be able to afford a washing machine.
Today, it’s less than 20 hours!
And that’s not even taking into account the amount of „more progress“ we can afford today, with less toil.
While one can imagine that in 1970, I could theoretically have something resembling a smartphone or a lane- and distance-keeping car getting produced for me (by NASA, probably), I can’t even begin to imagine how many hours, if not millennia, I would have needed to work in order to receive a paycheck that would have paid for it.
We get SO much more for our monthly paycheck today, and so many more people do (billions actually), it’s not even funny.
musicale
6 hours ago
Wage increases have been lagging productivity increases for about 60 years.[1]
For the past 35 years or so, automation seems to have driven wage inequality, and AI is headed down the same path.[2,3]
[1] https://www.epi.org/productivity-pay-gap/
[2] https://news.mit.edu/2020/study-inks-automation-inequality-0...
[3] https://www.technologyreview.com/2022/04/19/1049378/ai-inequ...
CaptainFever
4 hours ago
Note that the famous EPI graph in your first source appears to be controversial with mainstream economists: https://www.reddit.com/r/AskEconomics/comments/12kk79k/what_...
(However, there seem to be other sources that say the same thing, despite criticism of EPI's graph. So I'm not sure. It seems to be the consensus that wages have been stagnating or not keeping up with productivity for low-skilled workers, though? But I can't find much comparing wages or compensation to productivity for high-skilled workers [e.g. programmers, AI engineers].)
Furthermore, your comment focused on wages, whereas the parent comment specifically claimed that wages matter less than "how much work it takes to buy something similar, like steak, compared to the past". It misses the point.
Finally, I admit I haven't seen the latter two sources, but in my layman's opinion, inequality isn't necessarily bad if all needs are met.
musicale
5 hours ago
> We get SO much more for our monthly paycheck today
Many goods are cheaper, and electronics are vastly better, but housing, health care, and higher education costs have outpaced inflation for decades.
thrance
3 hours ago
You are ignoring the poorest half of the planet: how long does a Bangladeshi woman have to work to buy a washing machine? How many of them will ever own one? And who made the washing machine you own? Was it made in your country, or in China?
jaredhallen
7 hours ago
We get more material goods, to be sure. But do we get more satisfaction out of our lives? More happiness? More joy?
chairmansteve
7 hours ago
You are right, but the average person is less happy than in 1970.
j7ake
2 hours ago
Average meaning not gay, not ethnic minority, and not living in China or Soviet Union?
CaptainFever
4 hours ago
Source?
jimkleiber
15 hours ago
> The essay contains a lot of coulds, but doesn't touch on the base problem: human nature.
> AI isn't the problem, we are.
I think when we frame it as human _nature_, then yes, _we_ look like the problem.
But what if we frame it as human _culture_? Then _we_ aren't the problem, but rather our _behaviors/beliefs/knowledge/etc_ are.
If we focus on the former, we might just be essentially screwed. If we focus on the latter, we might be able to change things that seem like nature but might be more nurture.
Maybe that's a better framing: the base problem is human nurture?
laurex
14 hours ago
I think this is an important distinction. Yes, humans have some inbuilt weaknesses and proclivities, but humans are not required to live in or develop systems in which those weaknesses and proclivities are constantly exploited for the benefit/power of a few others. Throughout human history, there have been practices of contemplation, recognition of interdependence, and ways of increasing our capacity for compassion and thoughtful response. We are currently in a runaway state of extraction, but it's not the only way humans have of behaving.
exe34
14 hours ago
> Throughout human history, there have been practices of contemplation, recognition of interdependence, and ways of increasing our capacity for compassion and thoughtful response.
has this ever been widespread in society? I think such people have always been few and far between?
keyringlight
14 hours ago
The example that comes to mind is post-WW2 Germany, but even that was apparently a hard slog to change the minds of the German people. I really doubt any organization could do something similar while presenting a viewpoint opposed to the companies (and their resources) behind and using AI.
throwaway14356
13 hours ago
You are living in it.
The default state is to have extremely poor hard-working people and extremely rich non-working ones.
No one would have dared to dream of the luxury working people enjoy today. It took some doing! We used to sell people not too long ago. Kids in coal mines. The work week was 6-7 days, 12-14 hours. One coin per day, etc.
The fight isn't over, the owner class won the last few rounds but there remains much to take for either side.
exe34
8 hours ago
what does "practices of contemplation, recognition of interdependence, and ways of increasing our capacity for compassion and thoughful response." have to do with luxury? it sounds like you're arguing for the opposite? scrolling Instagram isn't the contemplation I have in mind.
achrono
14 hours ago
Sure. But why do you think changing human nurture is any easier than changing human nature? I suspect that as your set of humans in consideration tends to include the set of all humans, the gap between changeability of human nature vs changeability of human nurture reduces to zero.
Perhaps you are implying that we sign up for a global (truly global, not global by the standards of Western journalists) campaign of complete and irrevocable reform in our behavior, beliefs and knowledge. At the very least, this implies simply killing off a huge number of human beings who for whatever reason stand in the way. This is not (just) a hypothesis -- some versions of this have been tried and tested.
wrs
13 hours ago
Arguably, human nature hasn't changed much in thousands of years. But there has been plenty of change in human culture/nurture on a much smaller timescale. E.g., look at a graph of world literacy rates since 1800. A lot of human culture is an attempt to productively subvert or attenuate the worse parts of human nature.
Now, maybe the changes in this case would need to happen even quicker than that, and as you point out there's a history of bad attempts to change cultures abruptly. But it's nowhere near correct to say that the difficulty is equal.
beepbooptheory
13 hours ago
In general I think concepts like politics, art, community, etc. try to capture certain discrete ways we are all nurtured. I am not even sure what your point is here; there is nothing more totalitarian than reducing people to their "nature". That such a thing is even possible is arguably totalitarianism's precise conceit, if it has one. And the fact that totalitarianism is constantly accompanied by force and violence seems to be the biggest critique you can make of all sorts of "human nature" reductions.
And what is even the alternative here? What's your freedom of belief worth when you're essentially just a behaviorist anyway?
tbrownaw
14 hours ago
> I think when we frame it as human _nature_, then yes, _we_ look like the problem.
But what if we frame it as human _culture_? Then _we_ aren't the problem, but rather our _behaviors/beliefs/knowledge/etc_ are.
If we focus on the former, we might just be essentially screwed. If we focus on the latter, we might be able to change things that seem like nature but might be more nurture.
Maybe that's a better framing: the base problem is human nurture?
This is about the same as saying that leaders can get better outcomes by surrounding themselves with yes-men.
Just because asserting a different set of facts makes the predicted outcomes more desirable, doesn't mean that those alternate facts are better for making predictions with. What matters is how congruent they are to reality.
xpe
11 hours ago
> AI isn't the problem, we are.
I see major problems with the statement above. First, it is a false dichotomy. That’s a fatal flaw.
Second, it is not specific enough to guide action. Pretend I agree with the claim. How would it inform better/worse choices? I don’t see how you operationalize it!
Third, I don’t even think it is useful as a rough conceptual guide; it doesn’t “carve reality at the joints” so to speak.
roenxi
13 hours ago
> AI will be used to make things cheaper. That is, lots of job losses. Most of us are up for the chop if/when competent AI agents become possible.
> But those profits will not be shared. Human productivity has exploded in the last 120 years, yet we are working longer hours for less pay.
Don't you have to pick one? It seems a bit disjointed to simultaneously complain that we are all losing our jobs and that we are working too many hours. What type of future are we looking for here?
If machines get so productive that we don't need to work, everyone losing their jobs isn't a long-term problem and may not even be a particularly damaging short-term one. It isn't like we have less stuff or more people who need it. There are lots of good equilibriums to find. If AI becomes a jobs wrecking ball I'd like to see the tax system adjusted so employers are incentivised to employ large numbers of people for small numbers of hours instead of small numbers of people for large numbers of hours - but that seems like a relatively minor change and probably not an especially controversial one.
KaiserPro
4 hours ago
I did not argue that cogently.
At each "node" of the industrial revolution, lots of workers were displaced. (weavers, scriveners, printers, car mechanics). Those workers were highly paid because they were highly skilled.
Productivity made things cheaper to make, because it removed the expensive labour required to make it.
That means fewer well-paid jobs with controlled hours, which leads to poorly paid jobs with high competition and poor hours.
Yes new highly paid jobs were created, but not for the people who were dispossessed by the previous "node".
> If machines get so productive that we don't need to work, everyone losing their jobs isn't a long-term problem
Who is going to pay for them? Not the people who are making the profits.
roenxi
4 hours ago
> Who is going to pay for them? Not the people who are making the profits.
They kinda have to. If I'm so productive at producing food that I grow enough for 2,000 people, I have to find 2,000 people to feed. Or the land sits fallow and I get nothing at all. There is a lot of wiggle room at the margins, but overall it is hard to get away with producing more stuff and warehousing it.
We'd expect to see a bunch of silly jobs where people are peeling grapes and cooling the wealthy with fronds; but the capitalists can't hoard the wealth because they can't really do anything with it.
jacobedawson
3 hours ago
"The works of the roots of the vines, of the trees, must be destroyed to keep up the price, and this is the saddest, bitterest thing of all. Carloads of oranges dumped on the ground. The people came for miles to take the fruit, but this could not be. How would they buy oranges at twenty cents a dozen if they could drive out and pick them up? And men with hoses squirt kerosene on the oranges, and they are angry at the crime, angry at the people who have come to take the fruit.
A million people hungry, needing the fruit- and kerosene sprayed over the golden mountains. And the smell of rot fills the country. Burn coffee for fuel in the ships. Burn corn to keep warm, it makes a hot fire. Dump potatoes in the rivers and place guards along the banks to keep the hungry people from fishing them out. Slaughter the pigs and bury them, and let the putrescence drip down into the earth."
John Steinbeck, The Grapes of Wrath
N8works
13 hours ago
Yes. I used to share your viewpoint.
However, recently I've come to understand that AI is about the inherently unreal, and that authentic human connection is really going to be where it's at.
We build because we need it after all, no?
Don't give up. We have already won.
Vecr
13 hours ago
I think KaiserPro is saying authentic human connection doesn't "pay the bills", so to speak. If AI is "about the unreal" as you say, what if it makes everything you care about unreal?
swatcoder
15 hours ago
> One of the sad things about tech is that nobody really looks at history.
First, while I often write much of the same sentiment about techno-optimism and history, you should remember that you're literally in the den of Silicon Valley startup hackers. It's not going to be an easily heard message here, because the site specifically appeals to people who dream of inspiring exactly these essays.
> The sooner we realise that its not a technical problem to be solved, but a human one, we might stand a chance.
Second... you're falling victim to the same trap, but simply preferring some kind of social or political technology instead of a mechanical or digital one.
What history mostly affirms is that prosperity and ruin come and go, and that nothing we engineer lasts for all that long, let alone forever. There's no point in dreading it, whatever kind of technology you favor or fear.
The bigger concern is that some of the achievements of modernity have made the human future far more brittle than it has been in what may be hundreds of thousands of years. Global homogenization around elaborate technologies -- whether mechanical, digital, social, political or otherwise -- sets us up in a very "all or nothing" existential space, where ruin, when it eventually arrives, is just as global. Meanwhile, the purge of diverse, locally practiced, traditional wisdom about how to get by in un-modern environments steals the species of its essential fallback strategy.
germinalphrase
13 hours ago
“Meanwhile, the purge of diverse, locally practiced, traditional wisdom about how to get by in un-modern environments steals the species of its essential fallback strategy“
While potentially true, that same wisdom was developed in a world that itself no longer exists. Review accounts of natural wildlife and ecological bounty from even 100 years ago, and it’s clear how degraded our natural world has become in such a very short time.
tbrownaw
10 hours ago
> Global homogenization around elaborate technologies -- whether mechanical, digital, social, political or otherwise -- sets us up in a very "all or nothing" existential space, where ruin, when it eventually arrives, is just as global.
What is the minimum population size needed in order to have, say, computer chips? Or even a ball-point pen? I'd imagine those are a bit higher than what's needed to have pencils, which I've heard is enough that someone wrote a book about it.
> Meanwhile, the purge of diverse, locally practiced, traditional wisdom about how to get by in un-modern environments steals the species of its essential fallback strategy.
Is it really a "purge" if individuals are just not choosing to waste time learning things they have no use for?
amelius
13 hours ago
History tells us that humans will not tolerate any "creature" to exist that is smarter than them, so that is where the story will end.
tbrownaw
11 hours ago
How exactly does this show up in history? When did we meet something smarter than us, and what did we do that was different than we were doing to less-smart things at the time?
amelius
3 hours ago
We couldn't live side by side with Neanderthals. Hell, we can't even live peacefully with a different race.
realce
15 hours ago
3 links above this one is https://www.washingtonpost.com/nation/2024/10/08/exoskeleton...
mythrwy
14 hours ago
But will AI be eventually used to change human nature itself?
startupsfail
12 hours ago
The responsibility the airmen take when they take passengers off the ground (holding their lives in their hands) is a serious one.
The types like Trump are unlikely to get a license or accumulate enough Pilot in Command hours without an accident, and the experience itself changes a person.
If I have a choice of whom to trust, between an airman and a non-airman, I'd likely choose the airman.
And I’m not sure what you are referring to about Lindbergh, but among other things he was a Pulitzer Prize-winning author and environmentalist, and following Pearl Harbor he fought against the aggressors.
KaiserPro
4 hours ago
You neatly sum up his world view. He thought that Airmen were the pinnacle of civility. He didn't want a fight using planes, because that would be unsporting.
> And I’m not sure what you are referring to about Lindbergh
Well, he thought that the Jews were pushing for WWII.
https://alba-valb.org/wp-content/uploads/2017/04/Lindbergh_1... He also wanted to maintain friendly relations with Goering.
alexashka
11 hours ago
For a non-trivial number of people, having power/status over others is what they like.
For a non-trivial number of people, they don't care what happens to others, as long as their tribe benefits.
As long as these two issues are not addressed, very little meaningful progress is possible.
> Looking at the emotionally stunted, empathy vacuums that control either policy or purse strings, I think it'll take a catastrophe to change course.
A catastrophe won't solve anything because you'll get the same people who love power over others in power and people who don't mind fucking over others right below them, which is where humanity has always been.