danpalmer
8 hours ago
If you write policy about AI, you're doing it wrong. AI is an implementation; policies must be written for outcomes.
Discrimination by law enforcement, exclusion from loan approval, bad moderation on social networks, cheating on exams, creating fake news or media about people, swallowing up user data... all of the negative social impacts of AI can be achieved without it, and much of this is already illegal anyway.
Legislation that is predicated on AI will fail in the long run. Legislation that focuses on the actual negative outcomes will stand the test of time far better.
lm28469
7 hours ago
> all the negative social impact of AI can be achieved without it,
With the big differences being massive automation, a huge reduction in cost, and no one to blame when things go wrong... It's like saying a nuke and a knife are the same because they both kill.
ndriscoll
an hour ago
Someone is to blame for approving the use (or unapproved use) of a tool/process that breaks the law, same as today. When I worked in a regulated industry, we kept records of all inputs and decisions made and an auditor would do random checks that the results matched our documented methodology (afaik that documentation was submitted to/approved by either an auditor or the regulator).
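Not from the comment itself, but a minimal sketch of what that kind of record-keeping might look like, assuming a hypothetical loan-style decision rule, a JSONL log, and auditor spot checks (all names and thresholds here are illustrative, not from any real regulation):

```python
# Hypothetical illustration: log every automated decision with its inputs and
# the methodology version used, so an auditor can replay a random sample and
# confirm the recorded outcome matches the documented rules.
import json
import random
from datetime import datetime, timezone

AUDIT_LOG = "decisions.jsonl"

def documented_methodology(inputs: dict) -> str:
    """Stand-in for the approved decision rules (illustrative only)."""
    return "approve" if inputs["score"] >= 650 and inputs["dti"] < 0.4 else "deny"

def record_decision(inputs: dict) -> str:
    """Apply the documented rules and append a full audit record."""
    decision = documented_methodology(inputs)
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "methodology_version": "2024-03",
        "inputs": inputs,
        "decision": decision,
    }
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return decision

def spot_check(sample_size: int = 10) -> bool:
    """Auditor's random check: replay logged inputs, compare to recorded outcomes."""
    with open(AUDIT_LOG) as f:
        entries = [json.loads(line) for line in f]
    sample = random.sample(entries, min(sample_size, len(entries)))
    return all(documented_methodology(e["inputs"]) == e["decision"] for e in sample)
```

The point being that accountability attaches to whoever approved the tool and the methodology, regardless of whether the decision step is a rules engine or a model.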
andy99
3 hours ago
Agreed, I think it’s more a lens (one of many) that helps show what’s possible with technology and what may require legal protections.
For example, things like privacy and surveillance laws obviously need updating in the face of advances in networking, data collection at scale, etc. Same with copyright in the face of plentiful copying.
But good laws will, as you say, address what is now possible or dangerous, as opposed to any specific implementation or general-purpose technology involved. The tech just sets the context for what protections are needed.
khafra
7 hours ago
One outcome which is not unique to AI, but fairly exclusive to it: the value of human cognitive labor eventually drops below subsistence income. This isn't here yet, but it's a hard problem, so we should be devoting substantial resources to solutions before it hits.
ludicrousdispla
6 hours ago
"No one in this world, so far as I know — and I have researched the records for years, and employed agents to help me — has ever lost money by underestimating the intelligence of the great masses of the plain people. Nor has anyone ever lost public office thereby."
Cheer2171
8 hours ago
Oh, so we can't address any specific problems with any technology, because we should actually be fixing all of society at the root of all those problems. So while you wait for our broken political system to solve those root causes, enjoy feeling smug about not having implemented any imperfect, temporary bandaids to stop some bleeding.
Are you working on fixing those root problems? Or after dismissing short-term policy bandaids, are you going to go back to working in an industry where you will probably make more money in the short run if governments don't do any tech regulation in the short run?
Your commitment to the long run will lead to paralysis and accomplish nothing in the long run.
danpalmer
8 hours ago
If there are problems that are specific to AI, then sure, we should legislate for them. For example, defining what "fair use" means for AI training is clearly a new area.
But most of the pushback I've seen to AI in policy is so over-fit to current AI that it would be trivial to work around. You can argue that we'd be letting perfect be the enemy of good, but I think we'd be making policies that will be out of date by the time they even make it into law, and that we'd never make any real progress at all.
That said, I'm all for being proven wrong. The US tends to write highly specific legislation so I'm sure it'll try a few of these. The EU tends to write much more vague legislation specifically for this reason. We'll see how they end up working.
lm28469
6 hours ago
> So while you wait for our broken political system
Yeah, we'd better leave these important topics in the hands of very stable people like Musk or Thiel; they for sure know what the people want.
> make more money in the short run if governments don't do any tech regulation in the short run?
"Money money money money", homo sapiens decerebration under capitalism is quite something to witness. Maybe just maybe there is more to life than raw productivity and money... The root causes you're talking about are greed and an unbound quest for "progress", piling more in top will certainly not help