throwup238
a year ago
I’m confused by this news story and the response here. No one seems to understand OpenAI’s corporate structure or non profits at all.
My understanding: OpenAI follows the same model Mozilla does. Since 2019, the nonprofit has owned a for-profit corporation called OpenAI Global, LLC, which pays taxes on any revenue that isn’t directly in service of their mission (in a very narrow sense based on judicial precedent) [1]. In Mozilla’s case that’s the revenue they make from making Google the default search engine, and in OpenAI’s case that’s all their ChatGPT and API revenue. The vast majority (all?) of engineers work for the for-profit and always have. The vast majority (all?) of revenue goes through the for-profit, which pays taxes on that revenue minus the usual business deductions. The only money that goes to the nonprofit tax-free is donations. Everything else is taxed at least once at the for-profit corporation. Almost every nonprofit that raises revenue outside of donations has to be structured more or less this way to pay taxes. They don’t get to just take any taxable revenue stream and declare it tax free.
All OpenAI is doing here is decoupling ownership of the for-profit entity from the nonprofit. They’re allowing the for-profit to create more shares and distribute them to entities other than the non-profit. Or am I completely misinformed?
[1] https://en.wikipedia.org/wiki/OpenAI#2019:_Transition_from_n...
throwaway314155
a year ago
It's about the narrative they tried to create. The spin. It doesn't matter much if they were technically behaving as a for-profit entity previously. What matters is that they wanted the public (and likely, their talent) to _think_ that they weren't even interested in making a profit as this would be a philosophical threat to the notion of any sort of impartial or even hopefully benevolent originator of AGI (a goal which is laid plainly in their mission statement).
As you've realized, this should have been (and was) obvious for a long time. But that doesn't make it any less hypocritical or headline worthy.
cdchn
a year ago
>What matters is that they wanted the public (and likely, their talent) to _think_ that they weren't even interested in making a profit as this would be a philosophical threat to the notion of any sort of impartial or even hopefully benevolent originator of AGI (a goal which is laid plainly in their mission statement)
And now they want to cast off any pretense of that former altruistic yolk now that they have a new, better raison d'etre to attract talent: making absolutely unparalleled stacks of cash.
nejkbnek
a year ago
Just FYI it's "yoke" when it's a burden, "yolk" when it's an egg
salad-tycoon
a year ago
Maybe, they do seem to have egg on their face.
kombookcha
a year ago
It sort of remains to be seen whether they can actually make that cash - they're no longer the only game in town, and while they have an obvious adoption and brand name recognition advantage in their industry, they've also been running real hot on investor funding on the assumption that more processing power is what it's gonna take for them to stay competitive and continue improving. But they're gonna have to fight Google and Microsoft on that front. If there is a general plateau coming up in all of these models, or they fail to keep pace with their extremely well-supplied competition, it might not be so easy to convert their position into money.
hackernewds
a year ago
Note that Elon made a significant "donation" to OpenAI early on, given their non-profit designation and intentions, receiving zero equity in return. The donation was also received tax-free.
stonogo
a year ago
He received a seat on the board, which he surrendered when Tesla started investing in AI.
pj_mukh
a year ago
Occam's razor: I think Sam's personal narrative is the correct one. He built a non-profit that took off in a way he didn't expect, and now a for-profit is the best way to run the lightning they've caught.
In terms of profit, AFAICT, Sam doesn't have designs on building extra-large yachts and his own space agency; what he wants is to be the one at the helm of building what he considers world-changing tech. One could rationally call this power-hungry, but one could also rationally call it just helicopter parenting of a tech you've helped build. And for that, a for-profit that is allowed to maximize profits to re-invest in the tech is the optimal setup (esp. if all the competitors are doing the same).
Is this a different org than when it started? Yes. Was this a dupe from the beginning? I don't think so.
"But why can't he have a more worldly-aligned board looking over his shoulder?"
Because we live in California and have a distaste for governance by committee or, worse, governance by constant non-representative democracy (see: Housing).
If this now completely comes off the rails, I still think Congressional action can be a stopgap, but at least for now, this restructure makes sense to me.
feoren
a year ago
Sam: "I'm not in it for the money. I have principles."
World: "But what if it was like, a lot of money?"
Sam: "Oh alright you convinced me. Fuck my principles."
pj_mukh
a year ago
What do you do with a lot of money past a point? A corporate-controlled AGI being just a stop on the way to building another private space agency seems like a... letdown.
talldayo
a year ago
To be honest, I would take a private space agency 7 days out of the week with that kind of capital. We have no fundamental proof that LLMs will scale to the intelligence levels that we imagine in our heads. The industry application for LLMs is even weaker than computer vision, and the public sentiment is almost completely against it. Sam's product is hype; eventually people are going to realize that Q* and Strawberry were marketing moves intended to extend OpenAI's news cycle relevancy and not serious steps towards superintelligence. We were promised tools, and they're shipping toys.
I could tell you in very plain terms how a competitor to Boeing and SpaceX would benefit the American economy. I have not even the faintest fucking clue what "AGI" even is, or how it's profitable if it resembles the LLMs that OpenAI is selling today.
pj_mukh
a year ago
I would agree with you that a space agency is also useful (maybe more useful some days of the week). Sam disagrees and thinks he can do better without a non-profit board now. I'm glad we live in a world where he gets to try and we get to tax him and his employees to do other things we consider useful.
cdchn
a year ago
Private space agency and LLMs both seem like big industries going nowhere driven by sci-fi hopes and dreams.
blendergeek
a year ago
It's interesting how first impressions can be so deceiving. The world's largest private space agency (SpaceX) has completely changed the game in rural internet connectivity. Once upon a time, large chunks of the US had no reliable high-speed internet. SpaceX has brought high-speed, low-latency internet to every corner of the globe, even the middle of the ocean and Antarctica. This isn't going nowhere even if it seems that way.
latexr
a year ago
> SpaceX has brought high-speed low-latency internet to every corner of the globe
Which sounds all well and good until you realise it’s at the complete whims of one highly misinformed and reactionary individual.
He’s one made-up article away from turning sides and fucking everything up.
https://en.wikipedia.org/wiki/Starlink_in_the_Russo-Ukrainia...
https://en.wikipedia.org/wiki/Views_of_Elon_Musk#Russian_inv...
inemesitaffia
a year ago
It's not going nowhere though.
sfblah
a year ago
Not sure I agree with you here. I use LLMs all the time for work. I've never once used a space agency for anything.
macintux
a year ago
GPS, weather forecasting, tv broadcasting…I’ve been using a space agency for as long as I’ve been alive.
blendergeek
a year ago
My Dad uses SpaceX to work from home every day.
starspangled
a year ago
SpaceX is not a private space agency though, it is a private space launch and satellite communications company, which has revolutionized access to space and access to communication, providing enormous social benefit.
People use SpaceX every day even if they never connected to a starlink -- the lower costs that governments pay for space launches means more money for other things, not to mention no longer paying Russia for launches or engines.
cdchn
a year ago
I think they're both overhyped by sci-fi optimism, but I would agree (even being mostly an AI minimalist) that the impact of LLMs (and their improvement velocity) is a lot more meaningful to me right now. I mean, satellites are cool and all.
hackernewds
a year ago
This comment reeks of Steve Ballmer's opinion of Apple and the Internet in the early days. If you work at any decent technology company, you see AI applications everywhere, and the pending mass layoffs, or nimbler startups replicating their work more efficiently.
BrianHenryIE
a year ago
Let me recommend my favourite TikTok/YouTube channel of late, The Forest Jar
what annoys you?: https://www.youtube.com/watch?v=k9Le1ibX2zY
> if you went to a group of investors and pitched a board game where the winners get space ships and the losers die, they'd call you crazy. But if you suggested to those same investors that perhaps we shouldn't organize our entire society that way, they'd call you crazy.
pj_mukh
a year ago
"where the winners get space ships and the losers die,"
The Social Security budget is $1.4 Trillion, just the federal welfare budget is >$1Trillion (not including state budgets), and then there's medicare. Meanwhile, the NASA budget is <$25B (with SpaceX's operating budget and profits being a fraction of that)
I wish we lived in that simple of a world. But we don't.
permo-w
a year ago
this is a complete non-sequitur. the US social security budget does not go to one person or a small group of oligarchs
pj_mukh
a year ago
“the winners get space ships and the losers die, they'd call you crazy”
Was the original quote. I just showed that you could get rid of all rockets ever made and it would be a rounding error for any funds used to save the “losers” from dying.
How is that a non-sequitur?
latexr
a year ago
> What do you do with a a lot of money past a point?
Feed the hungry. House the homeless. Give away money unconditionally to those in need. Build hospitals in poor countries. Fight disinformation on crucial topics (such as climate change). Provide disaster relief. Not build more power hungry technology that exacerbates our current problems.
Do literally anything positive for another person, that does not harm others.
The list is pretty big when one isn’t selfish; there’s no law forcing anyone to build space agencies.
A lack of imagination is not an excuse.
pj_mukh
a year ago
"Feed the hungry. House the homeless"
Funnel $10B in housing to Los Angeles and you'll build less than 100 units of housing, because the inflationary push of that money would balloon the cost of per unit housing. I don't want to imagine the effect of that on middle class housing.
Funnel $10B of food to xyz famine region and you've undercut local farmers for generations. Happens all the time [1]. And that's assuming you can get the aid past local corruption.
These problems aren't as simple as people assume, and I'm low-key happy young naive billionaires are avoiding these issues instead of trying to throw their weight around.
FWIW: Sam's already funneled a bunch of money into green energy production[2].
[1]: https://haitisolidarity.net/in-the-news/how-the-united-state...
[2]: https://www.cnbc.com/2024/09/25/sam-altman-backed-nuclear-st...
exceptione
a year ago
> Funnel $10B in housing to Los Angeles and you'll build less than 100 units of housing, because the inflationary push of that money would balloon the cost of per unit housing. I don't want to imagine the effect of that on middle class housing.
Doesn't make sense to me. An uptick in construction work will not be an inflation balloon. More disposable income doesn't mean 1:1 more spending.
If you build a lot of (social) housing, at worst you put a roof over a lot of people's heads.
Families having less financial stress might lower crime rate and improve children school scores. They might save to start businesses or find their other talents.
For some, this might be a downside though. It makes workers more educated, healthier, more stable, less desperate and less dependent on bosses; plus they might be less angry, so politically less exploitable too.
pj_mukh
a year ago
"An uptick in construction work will not be an inflation balloon. "
There's a massive shortage in construction workers [1], so yes there will be? The few construction workers we do have can demand higher wages (yay!) but then will they be outbidding other mid-income folks for housing with those increased wages? Sounds like an inflation spiral to me.
My statement wasn't against social housing, I love social housing. We just haven't cracked the code in scaling housing (and subsequent maintenance) yet. And the problem is about 80% political will, billionaire cash is useless here.
[1]: https://www.abc.org/News-Media/News-Releases/abc-2024-constr...
exceptione
a year ago
On a macro scale, that has hardly any impact, and I think it would be even immeasurable.
It is rather the other way around. Higher rents / house prices will make sure only people with higher wages can afford to live there. That means your bagel or coffee will be more expensive there too.
pj_mukh
a year ago
I didn’t say macro scale, I said Los Angeles; that’s the problem.
Pretty much everything required to build housing (wood, labor, pre-approved land) is in a massive shortage that we can’t spend money to fix.
So more money to simply pump demand for all those things will have a massive inflationary impact.
exceptione
a year ago
Nope, LA is too much a part of the macro economy for that to make such an impact. Wood and labor don't have to come from LA, and even if those doubled (they won't) there would be around zero impact on inflation in LA. The land to be built on was going to be sold anyway; you just get one bidder more, or several bidders fewer if the council makes requirements like x% social housing.
Please, forget anything you are worrying about here; it does not apply.
pj_mukh
a year ago
Literally every problem I mentioned is at its worst state possible. People with millions and billions are simply waiting to buy materials or get land approvals. It’s a well-known intractable problem [1] and really the crux of the issue.
If just these problems could be solved, the state has more than enough funds to house everyone. What billionaires do would be wholly irrelevant (like it is now).
[1] https://www.constructiondive.com/news/construction-materials...
latexr
a year ago
I didn’t say “do these things inefficiently”. If we know better, we can do better. It’s like if I said “use the money to fix the potholes in this road” and you replied “but if you shove all that asphalt in the same hole, it will create a mound that stops cars from going through”. Yeah, don’t dump everything in the same place without thinking.
Start by collaborating with organisations which are entrenched in studying these issues and the impact of the solutions. If you have the money you can pay them to help and guide the effort, don’t act like if you know everything.
pj_mukh
a year ago
Yes, this has basically been the modus operandi of the Gates Foundation, and it took them 10 years to make a dent in malaria. They still have no clue how to “efficiently” reduce famines.
They won’t touch American housing problems with a 10ft pole. That should tell you something.
Go to Berkeley, tell them a Billionaire wants to build housing for the homeless in their neighborhood. See what happens.
It’s a hard pill to swallow, but the best thing billionaires can do is let us tax them, and then butt out and go fly rockets. The political problems are up to the rest of us.
exceptione
a year ago
The housing problem in the USA is mostly a NIMBY problem. It is difficult to get projects off the ground.
latexr
a year ago
> They won’t touch American housing problems with a 10ft pole.
Why do you keep insisting on the USA? It’s not the only country in the world.
> Go to Berkeley
I will not. I’m not American.
> It’s a hard pill to swallow but the best thing billionaires can do is let us tax them
Maybe it’s a hard pill to swallow for the billionaire, but I personally agree and think you’re right. However, this conversation started with someone asking “what do you do with a lot of money past a point” and offering only a private space agency as an alternative to working on AGI. My point was there are many other problems worth pursuing.
pj_mukh
a year ago
My point was that every other problem would be made worse by a billionaire pushing his or her money in there. Everyone is a couple of billion dollars of funneled money away from becoming the next George Soros.
If you don’t think NIMBYism and degrowth are a problem in your country yet, just give it a couple of years. It just hit England; you’re next. No billionaire can save you.
KSteffensen
a year ago
Cure cancer? Solve this climate change thing?
vasco
a year ago
Kid Rock did it first, but a golden toilet would be my answer.
downrightmike
a year ago
And that is why SkyNet decided immediately to destroy everyone.
grahamj
a year ago
Sam: That was child's play for me
ninepoints
a year ago
Anyone who had any respect for Sam "Give me your eyeball data" Altman was always delusional.
huevosabio
a year ago
I don't think the narrative makes sense. It was clear from way back in 2016 that training would take a ton of resources. Researchers were already being sucked into FAANG labs because they had the data, the compute, and the money. There was never a viable way for a true non-profit to make world-changing, deep learning-based AI models.
When seen through the rearview mirror, the whole narrative screams of self-importance and duplicity. GPT-2 was too dangerous, and only they were trustworthy enough to possess it. They were trustworthy because this was a non-profit, so their "interests aligned with humanity". This charade continued until barely a few months ago.
grey-area
a year ago
He didn’t build it.
eberfreitas
a year ago
Can we please move this comment to the top?
namaria
a year ago
Occam's razor has never meant "let's take discourse at face value".
That's not least complexity. That's least effort.
halJordan
a year ago
It isnt a tax thing or a money thing, its a control and governance thing.
The board of the non-profit fired Altman and then Altman (& MS) rebelled, retook control, & gutted the non-profit board. Then, they stacked the new non-profit board with Altman/MS loyalists and now they're discharging the non-profit.
It's entirely about control. The board has a legally enforceable duty to its charter. That charter is the problem Altman is solving.
burnte
a year ago
The problem is that OpenAI calls itself OpenAI when it's completely sealed off, and calls itself a non-profit when, as you say, almost everything about it is for-profit. Basically they're whitewashing their image as an organization with noble goals when it's simply yet another profit-motivated company. It's fine if that's what they are and want to be, but the lies are bothersome.
rilien
a year ago
It's like North Korea calling themselves Democratic People's Republic of Korea. The name doesn't mean anything
joe_the_user
a year ago
There's a now-quintessential HN post format: "Posters criticizing X don't seem to understand [spray of random details about X that don't refute the criticism, just cast the posts as ignorant]".
In this case, Mozilla as a non-profit owning a for-profit manages to more or less fulfill the non-profit's mission (maintaining an open, alternative browser). OpenAI has been in a hurry to abandon its non-profit mission for a while, and the complex details of its structure don't change this.
seizethecheese
a year ago
“Decoupling” is such a strange euphemism for removing an asset worth north of $100b from a nonprofit.
throwup238
a year ago
OpenAI Global LLC is the $100b asset. It’s not being removed, the nonprofit will still own all the shares it owns now until it decides to sell.
sangnoir
a year ago
The shares will be diluted - the LLC used to be 100% owned by the non-profit, and now there's no bottom.
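A toy example of the dilution mechanism being described (the share counts are invented for illustration, not OpenAI's actual cap table):

```python
# If the nonprofit starts with every share and the for-profit issues new
# shares to outside investors, the nonprofit's stake shrinks even though
# it never sells anything.
nonprofit_shares = 1_000_000   # hypothetical: nonprofit initially holds 100%
new_shares = 2_000_000         # hypothetical: newly issued to other investors

stake_before = nonprofit_shares / nonprofit_shares
stake_after = nonprofit_shares / (nonprofit_shares + new_shares)
print(f"before: {stake_before:.0%}, after: {stake_after:.1%}")
# before: 100%, after: 33.3%
```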
Aeolun
a year ago
Normally shareholders aren’t ok with that.
b800h
a year ago
I was under the impression that in UK law at least, (and obviously not in this case) the trustees of a non-profit would be bound to work in the best interests of that non-profit. And so allowing an asset like this to somehow slip out of their control would be the sort of negligence that would land you in very hot water. I'd be interested to know how this isn't the case here.
upwardbound
a year ago
I think it is the case here, and I hope Elon Musk persists in his lawsuits about this. As a large donor to the nonprofit in its early days he’s one of the people with the strongest standing to sue / strongest claim for damages.
Obviously Elon is mostly doing this suit as a way to benefit Grok AI, but honestly I don’t mind that; competitors are supposed to keep each other in check, and this is a good and proper way for companies to provide checks & balances on each other’s power. The absence of competitor-enforced accountability is one reason why monopolies are bad.
Lawsuit: https://www.reuters.com/technology/elon-musk-revives-lawsuit-against-sam-altman-openai-nyt-reports-2024-08-05/
stale2002
a year ago
> somehow slip out of their control would be the sort of negligence that would land you in very hot water.
> how this isn't the case here.
It's not the case because they are doing the opposite of what you are suggesting. They are increasing the value of the asset that they own.
Sure, the nonprofit's stake is being diluted, but the individual shares that it owns are more valuable.
It is perfectly reasonable for a non-profit to prefer to own 30% of a 100 billion dollar asset, let's say, compared to 100% of a 10 billion dollar asset.
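The trade-off in that comparison is simple arithmetic (both valuations are the hypotheticals from the comment, not real figures):

```python
# 30% of a $100B asset vs. 100% of a $10B asset
minority_stake = 0.30 * 100e9   # diluted stake: about $30B
full_ownership = 1.00 * 10e9    # undiluted stake: $10B
print(minority_stake > full_ownership)  # True: the diluted stake is worth ~3x more
```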
mlsu
a year ago
Isn't the goal of a non-profit by its very definition... not profit?
If the goal of the OpenAI non-profit is something something control the development of AI for the good of all humanity, then it seems that they explicitly shouldn't care about making $20 billion, and explicitly should care about maintaining control of OpenAI.
If you listen to their rhetoric, $20 billion is peanuts compared to the lightcone and the kardashev scale and whatever else.
stale2002
a year ago
> Isn't the goal of a non-profit by its very definition... not profit?
Yes, and if you have a bunch more money then you can do more non profit activities that help the world.
Getting as much money as possible, so that the money can be used for your great cause, is the best way to effectively run a non profit.
> then it seems that they explicitly shouldn't care about making $20 billion
Of course they should, because that 20 billion dollars can be used for its goal more effectively than having control over a lower value asset.
> compared to the lightcone and the kardashev scale and whatever else.
You are presupposing that OpenAI's model itself is some magic, infinitely valuable asset already.
It's not. If it were, then it would already be worth 10 trillion dollars. But it's not worth that.
Therefore the money is worth more than the asset. There are lots of other AI groups around here. OpenAI is just one of them, and they are not infinitely valuable.
achierius
a year ago
While I'm sure this argument makes sense in some utilitarian world-model or another, it is definitively _not_ one that has been accepted by the courts, largely because both federal and state governments have explicitly legislated against nonprofits doing "general moneymaking" as part of their mission. We already have legal vehicles for that, they're called for-profit companies, they pay tax, and donations to them are not tax deductible.
> Getting as much money as possible, so that the money can be used for your great cause, is the best way to effectively run a non profit.
Anywhere but Silicon Valley, that is a great way to violate Unrelated Business Income limits and get your charitable status revoked. It is not sufficient that a non-profit's "goals" be charitable; its day-to-day activities must be as well, and it's not acceptable to put off those activities until some future date when you'll "make up" for all the regular for-profit work.
stale2002
a year ago
> doing "general moneymaking"
Good thing this wouldn't be that. Instead, it would be about promoting the cause.
And yes, non profits are allowed to own assets and maximize the value of those assets.
Of course their mission also matters and they should push towards that. But throwing away billions and billions of dollars for nothing isn't the way to do that.
> their day-to-day activities must be as well
Yes.... and they should also do that.
That has absolutely nothing to do with refusing to sabotage your non profit by throwing away a bunch of money for no reason though.
Of course the non profit should work towards their goal in their day to day activities.
> until some future date
Who said anything about waiting for a future date? Of course their current actions should push towards their goal.
That still has nothing to do with refusing to set money on fire for no reason though.
If anything, I think that the people who were attempting to set their valuable assets on fire and sabotage the non-profit are the ones who should be prosecuted by the legal system to the fullest extent legally allowed, for going against the mission and intentionally engaging in charity fraud.
At one point, some of those board members said something about how they were seriously considering shutting the whole thing down. I would absolutely consider that to be extremely illegal charity fraud, deserving of jail time if they did it.
nfw2
a year ago
> "All OpenAI is doing here is decoupling ownership of the for-profit entity from the nonprofit."
Yes, but going from being controlled by a nonprofit to being controlled by a typical board of shareholders seems like a pretty big change to me.
mr_toad
a year ago
> All OpenAI is doing here is decoupling ownership of the for-profit
All? As far as I know this is unprecedented.
A1kmm
a year ago
Maybe at this scale.
But unfortunately, charities and not-for-profits putting their core business into a company, and then eventually selling it off, is not unprecedented. For example, The Raspberry Pi Foundation was a not-for-profit organisation around the Raspberry Pi. They formed an LLC for their commercial operations, then gradually sold it off before eventually announcing an IPO: https://www.raspberrypi.org/blog/what-would-an-ipo-mean-for-....
I think it is terrible that not-for-profits are just being used as incubators for companies that eventually take the core mission and stop primarily serving the public interest.
There are of course other examples of charities or not-for-profits that put part of their core operations in a company and don't sell out, instead retaining 100% ownership - for example, Mozilla. However, I think there should be some better way for impactful not-for-profits to have some revenue-generating aspects in line with their mission (with any temporary surplus allowed to cover future expenses, or offset by other expenses).
throwup238
a year ago
> Maybe at this scale.
I don't think it's unprecedented, even at this scale.
Novo Nordisk, the pharmaceutical company behind Semaglutide (aka Ozempic) with a market cap >$600 billion, was founded by the Novo Nordisk Foundation before going public. The latter now has an endowment of over $150 billion and owns a significant fraction of the public company.
mossTechnician
a year ago
This might be unpopular, but I don't think Mozilla's behavior is inherently good [1]. And even if it were, I don't think it should be used as a litmus test for other corporate behavior. Every company should stand on its own against scrutiny.
[1] "The Mozilla Foundation has no members" https://hacktivis.me/articles/mozilla-foundation-has-no-memb...
hackernewds
a year ago
How is it possible to make tax-free "donations" for profit-making applications? You seem to imply there is nothing nefarious about the setup. Except the non-profit doesn't actually perform any social services; instead it stands as a business structure to skirt taxation. Change my mind.
throwup238
a year ago
> How is it possible to make tax free "donations" for profit making applications?
The nonprofit invests the tax-free donations into the for-profit. It gets to keep its equity just like any other investor, and as long as that equity is held by the non-profit, there's no taxable event. If the nonprofit sells its shares - since it was closely involved in the creation and management of the for-profit as an active investment - it becomes a taxable event under the unrelated business income tax rules. Until that time, the only "profit making applications" like ChatGPT and the API are run by the for-profit, which - I repeat - pays its taxes.
I genuinely don't understand why people think they've skirted taxation, except out of sheer ignorance of how non-profits actually work. A 501(c)(3) is not some magic Monopoly "get-out-of-tax-free" card, and the IRS isn't stupid; there are a ton of rules for tax exemption. They'd have a much easier time with tax avoidance if they were an actual for-profit corporation with billions of dollars, because GAAP rules are a lot more forgiving than non-profit regulations.
bbor
a year ago
Good questions!
Right now, OpenAI, Inc. (California non-profit, let's say the charity) is the sole controlling shareholder of OpenAI Global LLC (Delaware for-profit, let's say the company). So, just to start off with the big picture: the whole enterprise was ultimately under the sole control of the non-profit board, which in turn was obligated to operate in furtherance of "charitable public benefit". This is what the linked article means by "significant governance changes happening behind the scenes," which should hopefully convince you that I'm not making this part up.
To get really specific, this change would mean that they'd no longer be obligated to comply with these CA laws:
https://leginfo.legislature.ca.gov/faces/codes_displayText.x...
https://oag.ca.gov/system/files/media/registration-reporting...
And, a little less importantly, comply with the guidelines for "Public Charities" covered by federal code 501(c)(3) (https://www.law.cornell.edu/uscode/text/26/501) covered by this set of articles: https://www.irs.gov/charities-non-profits/charitable-organiz... . The important bits are:
The term charitable is used in its generally accepted legal sense and includes relief of the poor, the distressed, or the underprivileged; advancement of religion; advancement of education or science; erecting or maintaining public buildings, monuments, or works; lessening the burdens of government; lessening neighborhood tensions; eliminating prejudice and discrimination; defending human and civil rights secured by law; and combating community deterioration and juvenile delinquency.
... The organization must not be organized or operated for the benefit of private interests, and no part of a section 501(c)(3) organization's net earnings may inure to the benefit of any private shareholder or individual.
I'm personally dubious about the specific claims you made about revenue, but that's hard to find info on, and not the core issue. The core issue was that they were obligated (not just, like, promising) to direct all of their actions towards the public good, and they're abandoning that to instead profit a few shareholders, taking the fruit of their financial and social status with them. They've been making some money for some investors (or losses...), but the non-profit was, legally speaking, only allowed to permit that as a means to an end. Naturally, this makes it very hard to explain how the nonprofit could give up basically all of its control without breaking its obligations.
All the above covers "why does it feel unfair for a non-profit entity to gift its assets to a for-profit", but I'll briefly cover the more specific issue of "why does it feel unfair for OpenAI in particular to abandon their founding mission". The answer is simple: they explicitly warned us that for-profit pursuit of AGI is dangerous, potentially leading to catastrophic tragedies involving unrelated members of the global public. We're talking "mass casualty event"-level stuff here, and it's really troubling to see the exact same organization change their mind now that they're in a dominant position. Here's the relevant quotes from their founding documents:
OpenAI is a non-profit artificial intelligence research company. Our goal is to advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return. Since our research is free from financial obligations, we can better focus on a positive human impact...
It’s hard to fathom how much human-level AI could benefit society, and it’s equally hard to imagine how much it could damage society if built or used incorrectly. Because of AI’s surprising history, it’s hard to predict when human-level AI might come within reach. When it does, it’ll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest.
From their 2015 founding post: https://openai.com/index/introducing-openai/
We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity or unduly concentrate power. Our primary fiduciary duty is to humanity...
We are concerned about late-stage AGI development becoming a competitive race without time for adequate safety precautions. Therefore, if a value-aligned, safety-conscious project comes close to building AGI before we do, we commit to stop competing with and start assisting this project. We will work out specifics in case-by-case agreements, but a typical triggering condition might be “a better-than-even chance of success in the next two years.”
From their 2018 charter: https://web.archive.org/web/20230714043611/https://openai.co...
Sorry for the long reply, and I appreciate the polite + well-researched question! As you can probably guess, this move makes me a little offended and very anxious. For more, look at the posts from the leaders who quit in protest yesterday, namely their CTO.
throwup238
a year ago
> I'm personally dubious about the specific claims you made about revenue, but that's hard to find info on, and not the core issue. The core issue was that they were obligated (not just, like, promising) to direct all of their actions towards the public good, and they're abandoning that to instead profit a few shareholders, taking the fruit of their financial and social status with them. They've been making some money for some investors (or losses...), but the non-profit was, legally speaking, only allowed to permit that as a means to an end.
Look at your OpenAI invoices. They're paid to OpenAI LLC, not OpenAI Inc. I can't find confirmation on openai.com of the exact relationship between OpenAI Global LLC and OpenAI LLC, but the former is on their "Our Structure" page and the latter is in their data processing addendum, so it's probably the subsidiary in charge of operating the services while Global does training and licenses it downstream. OpenAI Global was the one that made that big $10 billion deal with Microsoft.
That obligation is why they had to spin off a for-profit corporation. Courts are very strict in their interpretation of what "unrelated business income" is and the for-profit LLC protects the non-profit's tax exempt status.
> "why does it feel unfair for a non-profit entity to gift its assets to a for-profit"
What assets were gifted, exactly? They created the for-profit shortly after GPT2 (in 2019) and as far as I can tell that's the organization that has developed the IP that's actually making money now.
I honestly don't understand how this isn't in the interest of the nonprofit's mission. It's currently a useless appendage and will never have any real power or resources until either OpenAI is in the black and sending profit up to it, or they can sell OpenAI shares. I don't think the charity has any more claim over GPT-4 than Google does for having invented transformers.
If this next round of funding goes through at $100-150 billion valuation, OpenAI Inc will probably be (on paper at least) the second wealthiest charity on the planet after the Novo Nordisk Foundation. This restructuring opens the way for the nonprofit to sell its shares and it's going to be a hell of a lot of money to dedicate towards their mission - instead of watching its subsidiary burn billions of dollars with no end in sight.
bbor
a year ago
Thanks for another polite response! I think this is the fundamental misunderstanding:
> What assets were gifted, exactly? ...It's currently a useless appendage and will never have any real power or resources until either OpenAI is in the black and sending profit up to it, or they can sell OpenAI shares.
The charity will gift its control of a $100B company for some undisclosed "minority stake" (via a complex share dilution scheme), in exchange for nothing other than "the people we're gifting it to have promised to do good with it". It's really that simple. The charity never was intended to draw profit from the for-profit, and the "never have any real power" contention is completely inaccurate -- they have direct, sole control over the whole enterprise.
> This restructuring opens the way for the nonprofit to sell its shares and it's going to be a hell of a lot of money to dedicate towards their mission
Even putting aside the core issue above (they won't have many shares to sell), the second part of my comment comes back here: what would they buy with all that money? Anthropic? Their explicit mission is to beat for-profit firms in the race to AGI so convincingly that an arms race is avoided. How could they possibly accomplish this after gifting/selling away control of the most capable AI system on the planet?
Finally, one tiny side point:
> Courts are very strict in their interpretation of what "unrelated business income" is and the for-profit LLC protects the non-profit's tax exempt status.
I'm guessing you're drawing on much more direct experience than I am and I don't question that, but this seems like a deceptive framing. Normal charities have no issues with unrelated business income, because they don't run "trades or businesses", they just spend their money. I know that selling ChatGPT subscriptions is a large income source, but it's far from the only way to pursue their mission -- and is wildly insufficient, anyway. They were in no way forced to do it to meet their obligations.
Again, I'm a noob, so I'll cite the IRS-for-dummies page on the topic for onlookers: https://www.irs.gov/charities-non-profits/unrelated-business...
throwup238
a year ago
> The charity will gift its control of a $100B company for some undisclosed "minority stake" (via a complex share dilution scheme), in exchange for nothing other than "the people we're gifting it to have promised to do good with it". It's really that simple. The charity never was intended to draw profit from the for-profit.
That's not my interpretation. It sounds like the restructuring is part of the deal that values OpenAI at $100-150 billion [1]. They're not "gifting" away anything any more than a Series A investor gifts something to a Series B investor. They're restructuring so that their stake will be worth more afterwards than it is now, regardless of the percentages. That's what every company owner goes through when the company raises a VC round, goes public, or even just offers employees stock options. That doesn't change because the owner is a nonprofit, and it sounds like until they restructure, the for-profit is worth nowhere near that crazy 12-figure number.
"Minority stake" just means that they won't have enough to control the corp outright with 50%+1, which is probably what everyone wants to justify the investment. Reuters TFA says "The plan is still being hashed out with lawyers and shareholders and the timeline for completing the restructuring remains uncertain" so we don't really know what the post-money valuation numbers look like or who is getting what. We also don't know how the voting vs non-voting shares will split. Losing majority control after multiple multi-billion-dollar rounds is the norm, so if Microsoft's previous $10 bil investment converts and 10-20 pts go to the employee pool, it's a perfectly fair deal.
> Even putting aside the core issue above (they won't have many shares to sell), the second part of my comment comes back here: what would they buy with all that money? Anthropic? Their explicit mission is to beat for-profit firms in the race to AGI so convincingly that an arms race is avoided. How could they possibly accomplish this after selling control of the most capable AI system on the planet?
As far as I can tell, without this deal OpenAI LLC goes bankrupt under the rumored $5b/yr losses and the charity loses all relevance when the bankruptcy court fire-sells the IP. With this deal, it can create a secondary market and use the funds to focus on its actual mission. The only way that this deal doesn't make sense (in my mind) is if you believe that GPT4/o1/whatever are the keys to fulfilling OpenAI's mission and it becomes impossible if it loses control. Personally I find that very hard to believe, which might be the actual disconnect we're having here.
What would they buy? They'd hire people to do research under their umbrella instead of a for-profit one, fund compute infrastructure for researchers who don't have billions for H100s, give grants to organizations, or spin off more startups. Even if it's a 20% stake of $100 billion, that's enough for an Ivy League-sized endowment that can fund AI research for generations. If it's a 49% stake of $150 billion, that makes it the second wealthiest charity after the Novo Nordisk Foundation - which continues to do tons of biomedical research even though it doesn't have total control over the public Novo Nordisk or the semaglutide IP.
OpenAI would become one of the largest grant-giving organizations in the world overnight using just the interest from the endowment. Imagine the equivalent of 50-100% of the NSF's annual grant budget going just to AI research!
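To make that concrete, here's a rough back-of-the-envelope sketch of the endowment math. All inputs are assumptions for illustration: the stake sizes and valuations are the hypotheticals from this thread, the 4-5% payout rate is the conventional endowment spending range, and the NSF comparison is only approximate:

```python
# Back-of-the-envelope endowment math (all inputs are assumptions).
def annual_payout(valuation_b: float, stake: float, payout_rate: float) -> float:
    """Yearly spendable income in $B from an endowment-style payout."""
    return valuation_b * stake * payout_rate

# Hypothetical scenarios from the discussion above.
low = annual_payout(100, 0.20, 0.04)   # 20% stake of $100B at a 4% payout
high = annual_payout(150, 0.49, 0.05)  # 49% stake of $150B at a 5% payout

print(f"~${low:.1f}B to ~${high:.1f}B per year in grantable funds")
```

Even the low end of that range is a meaningful fraction of what a national science agency disburses annually, which is the scale being gestured at here.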
> Normal charities have no issues with unrelated business income, because they don't run "trades or businesses", they just spend their money.
See my other reply [2] for a sample of some nonprofits that have for-profit arms. A significant fraction of them do, especially those that offer some sort of product or service (I don't have hard stats but I'd venture it's the majority of the major charities that do the latter). "Normal" charities that just disburse grant money and spend donations are only one type among many.
[1] https://www.reuters.com/technology/artificial-intelligence/o...
bbor
a year ago
Hmm, fair points, especially on there not being a strict norm for charities. Thanks for taking more time to clarify.
I’d say we disagree about the following somewhat indeterminate points:
1. Whether OpenAI has/could-have-had staying power without raising immense amounts of venture capital. I will readily admit that they’ve gone so far down this road that they are now somewhat trapped by their massive investments and contracts, not to mention losing almost all their top researchers.
2. Whether the people in charge of this deal can be trusted to propose a fair outcome for the charity other than “their mission ineffably lives on in us” (and as a corollary, whether a fair outcome is likely).
3. Whether the private technical assets of OpenAI (GPT, DALLE, and Sora) are meaningfully unique in their potential for impact, knowing what the public knows in the current moment — which I will admit is far from the complete competitive picture.
4. Whether OpenAI’s mission could be meaningfully achieved by passing out grants to a diverse body of scientists.
I’d be happy to “debate” (lol) any of those particulars if you want, but I think it’s otherwise best to leave it at “we assess the known facts differently”. If we let some time pass, that’ll at least settle question 2…
throwup238
a year ago
> 2. Whether the people in charge of this deal can be trusted to propose a fair outcome for the charity other than “their mission ineffably lives on in us” (and as a corollary, whether a fair outcome is likely).
Since Sam Altman is on the board of OpenAI Inc, I expect this deal will be under extreme scrutiny for self-dealing. He has flown under the radar so far by not taking any equity, but this changes the second he does (IANAL). California, Delaware, and the feds will be looking closely at the deal.
I don't think the danger for OpenAI the charity is as great as people make it out to be. They'll be able to do a lot more than hand out grants with an 11-figure endowment.
simantel
a year ago
> Almost every nonprofit that raises revenue outside of donations has to be structured more or less this way to pay taxes.
I don't think that's true? A non-profit can sell products or services, it just can't pay out dividends.
throwup238
a year ago
If those products and services are unrelated business income, they have to pay taxes on it: https://www.irs.gov/charities-non-profits/unrelated-business...
What counts as “related” to the charity’s mission is fuzzy, but in practice the courts have been rather strict. They don’t have to form for-profit subsidiaries to pay those taxes, but it helps de-risk the parent because potential penalties include loss of nonprofit status.
For example, the nonprofit Metropolitan Museum of Art has a for-profit subsidiary that operates the gift shop. The National Geographic Society has National Geographic Partners, which actually owns the TV channel and publishes the magazine. Harvard and Stanford have the Harvard Management Company and Stanford Management Company, respectively, to manage their endowments. The Smithsonian Institution has Smithsonian Enterprises. Mayo Clinic => Mayo Clinic Ventures. Even the state-owned University of California regents have a bunch of for-profit subsidiaries.
wubrr
a year ago
What leverage does Sam Altman have to get equity now? Does he personally have control over that decision?