c-linkage
6 days ago
This seems like a tragedy of the commons -- GitHub is free after all, and it has all of these great properties, so why not? -- but this kind of decision making occurs whenever externalities are present.
My favorite hill to die on (externality) is user time. Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time. Yet if I spend one hour making my app one second faster for my million users, I can save 277 user hours per year. But since user hours are an externality, such optimization never gets done.
Externalities lead to users downloading extra gigabytes of data (wasted time) and waiting for software, all of which is waste that the developer isn't responsible for and doesn't care about.
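A quick back-of-the-envelope check on that arithmetic (a Python sketch; the assumption that each user hits the slow path once per year is mine):

    # One second saved for each of a million users, assumed once per year.
    users = 1_000_000
    seconds_saved_per_user_per_year = 1
    user_hours_per_year = users * seconds_saved_per_user_per_year / 3600
    print(round(user_hours_per_year, 1))  # ~277.8 user hours per year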
Aurornis
5 days ago
> Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time.
I don’t know what you mean by software houses, but every consumer-facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric.
This has been common wisdom for decades. I don’t know how many times I’ve heard the repeated quote about how Amazon loses $X million for every Y milliseconds of page loading time, as an example.
rovr138
5 days ago
There was a thread here earlier this month,
> Helldivers 2 devs slash install size from 154GB to 23GB
https://news.ycombinator.com/item?id=46134178
A section of the top comment says,
> It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!
and the reply to it has,
> They’re not the ones bearing the cost. Customers are.
viraptor
5 days ago
There was also the GTA wasting minutes to load/parse JSON files at startup. https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...
And Skylines rendering teeth on models miles away https://www.reddit.com/r/CitiesSkylines/comments/17gfq13/the...
Sometimes the performance is really ignored.
darubedarob
5 days ago
Wasn't there a website with a formula for how much time things like the GTA bug cost humanity as a whole? Like 5 minutes × users × sessions per day, accumulated?
It cost several human lifetimes, if I remember correctly. Still not as bad as Windows Update, which, taking the time times wage, sets the GDP of a small nation on fire every year..
mattmanser
4 days ago
I met a seasoned game dev who complained to me that he was only ever hired at the end of projects to speed up code written by a bunch of mid/junior-level game devs the company had used to actually make the game. Basically, he said there was only so much time he'd be given, so he'd have to go for low-hanging fruit and might miss stuff.
We've only got a couple of game dev shops in my city, so not sure how common that is.
whstl
4 days ago
Sweatshops love junior devs, as they never complain, never make suggestions and always take the blame for bugs.
A senior joining when time is tight makes sense: they don’t want anyone to rock the boat, just to plug the holes.
AuthAuth
3 days ago
I'm pretty sure that claim about CS rendering teeth a mile away turned out to be false. But it was repeated so much, and the game's release state was (and still is) so bad, that people assumed it to be true.
ux266478
5 days ago
That's not how it works. The demand for engineering hours is an order of magnitude higher than the supply for any given game; you have to pick and choose your battles because there's always much, much more to do. It's not bizarre that nobody verified texture storage was being done in an optimal way at launch, without sacrificing load times at the altar of visual fidelity, particularly given the state the rest of the game was in. Who the hell has time to do that when crashes abound and the network stack has to be rewritten at a moment's notice?
Gamedev is very different from other domains, being in the 90th percentile for complexity and codebase size, and the 99th percentile for structural instability. It's a foregone conclusion that you will rewrite huge chunks of your massive codebase many, many times within a single year to accommodate changing design choices, or if you're lucky, to improve an abstraction. Not every team gets so lucky on every project. Launch deadlines are hit when there's a huge backlog of additional stuff to do, sitting atop a mountain of cut features.
swiftcoder
5 days ago
> It's not bizarre that nobody verified texture storage was being done in an optimal way at launch
The inverse, however, is bizarre. That they spent potentially quite a bit of engineering effort implementing the (extremely non-optimal) system that duplicates all the assets half a dozen times to potentially save precious seconds on spinning rust - all without validating that it was worth implementing in the first place.
MBCook
5 days ago
Was Helldivers II built from the ground up? Or grown from the v1 codebase?
The first was on PS3 and PS4 where they had to deal with spinning disks and that system would absolutely be necessary.
Also if the game ever targeted the PS4 during development, even though it wasn’t released there, again that system would be NEEDED.
esseph
5 days ago
It's a completely different game, engine, etc.
rovr138
5 days ago
Yes.
They talk about it being an optimization. They also talk about the bottleneck being level generation, which happens at the same time as loading from disk.
darubedarob
5 days ago
Gamedev engineering hours are also in endless oversupply thanks to myDreamCream brain.
jesse__
5 days ago
[flagged]
degamad
4 days ago
> > you will rewrite huge chunks of your massive codebase
> You're not rewriting Unreal
Do you consider the Unreal engine code to be part of "your codebase"?
jesse__
4 days ago
It's a dependency... do you not consider dependencies part of your codebase?
atiedebee
5 days ago
I think they meant the gameplay side of things instead of the engine
whstl
4 days ago
Unity rewrote and discontinued lots of major systems several times in a row in the last 10 years.
I’d be careful before telling people to “get a grip”.
ux266478
4 days ago
[flagged]
jesse__
4 days ago
[flagged]
saghm
5 days ago
I don't think it's quite that simple. The reason they had such a large install size in the first place was concern about load times for players using HDDs instead of SSDs; duplicating the data was intended to avoid making some players load into levels much more slowly than others (which, in an online multiplayer game, would potentially have repercussions for other players as well). The link you give mentions that this was based on flawed data (although it's somewhat light on the details), but that means the actual cause was a combination of a technical mistake and genuine care for user experience: care for a smaller but not insignificant minority, at some expense to the majority. There's certainly room for argument about whether this was the correct judgement call, or whether they should have been better at recognizing their data was flawed, but it doesn't really fit the trend of devs not giving a shit about user experience. If making perfect judgement calls and never having flawed data is the bar for proving you care about users, we might as well give up on the idea that any company will ever reach it.
godelski
5 days ago
How about GitHub Actions' safe sleep, which took over a year to accept a trivial PR fixing a bug that caused actions to hang forever because someone forgot that you need <= instead of == in a counter...
Though in this case GitHub wasn't bearing the cost, it was gaining a profit...
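A minimal sketch of that class of bug (illustrative only, not the actual Actions code): a wait loop that exits on exact equality never terminates once the counter steps past its target.

    import time

    def wait_until(get_count, target, poll_seconds=1.0):
        # Buggy: exact-match test. If the counter ever jumps past
        # `target` (say it advances in steps of 2 and target is 5),
        # the condition never becomes false and this spins forever.
        while get_count() != target:
            time.sleep(poll_seconds)
        # The fix is an inequality: loop while get_count() < target.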
kibwen
5 days ago
> They’re not the ones bearing the cost. Customers are.
I think this is uncharitably erasing the context here.
AFAICT, the reason that Helldivers 2 was larger on disk is because they were following the standard industry practice of deliberately duplicating data in such a way as to improve locality and thereby reduce load times. In other words, this seems to have been a deliberate attempt to improve player experience, not something done out of sheer developer laziness. The fact that this attempt at optimization is obsolete these days just didn't filter down to whatever particular decision-maker was at the reins on the day this decision was made.
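For anyone unfamiliar with that practice, here is a toy sketch of the trade-off (all names invented): per-level pack files duplicate shared assets so that loading a level becomes one sequential read on a spinning disk, at the cost of install size.

    # Each level gets its own pack file containing every asset it needs,
    # including copies of assets shared with other levels.
    levels = {
        "mission_01": ["terrain_a", "trooper", "shared_ui"],
        "mission_02": ["terrain_b", "trooper", "shared_ui"],
    }
    assets = {name: b"<asset bytes>" for names in levels.values() for name in names}

    def build_packs():
        packs = {}
        for level, needed in levels.items():
            # "trooper" and "shared_ui" are written into both packs:
            # install size is traded for one contiguous HDD read per level.
            packs[level] = b"".join(assets[name] for name in needed)
        return packs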
dijit
5 days ago
I worked in e-commerce SaaS in 2011~ and this was true then but I find it less true these days.
Are you sure that you’re not the driving force behind those metrics; or that you’re not self-selecting for like-minded individuals?
I find it really difficult to convince myself that even large players (Discord) are measuring startup time. Every time I start the thing I’m greeted by a 25s wait and a `RAND()%9` number of updates that each take about 5-10s.
jama211
5 days ago
Discord’s user base is 99% people who leave it running 100% of the time; it’s not a typical situation.
dijit
5 days ago
I think that they make the startup so horrible that people are more likely to leave it running.
hexer292
5 days ago
As a discord user, it's the kind of platform that I would want to have running to receive notifications, sort of like the SMS of gaming.
A large part of my friend group uses Discord as the primary method of communication, even in an in-person context (I was at a festival a few months ago with a friend, and we would send texts over Discord if we got split up), so maybe it's not a common use case.
solarkraft
5 days ago
It leads to me dreading having to start it (or accidentally starting it - remember IE?) and opting for the browser instead.
jama211
5 days ago
I strongly doubt that!
jama211
4 days ago
Hijacking my own comment to mention that the normal thing on forums, when a reasonable person reads an unreasonable comment, is to move on, which can leave the comment standing unopposed and give it credence it doesn’t deserve. I believe that if more of us actually voiced our disagreement out loud, as I have here, it could change things sometimes.
spockz
5 days ago
I have the same experience on Windows. On the other hand, starting up Discord on my CachyOS install is virtually instant. So maybe there is also a difference between the platform the developers use and the one their users use.
godelski
5 days ago
I have plenty of responses to an angry comment I made several months ago that support your point.
I made a slight at Word taking like 10 seconds to start and some people came back saying it only takes 2, as if that still isn't 2s too long.
Then again, look at how Microsoft is handling slow File Explorer speeds...
saagarjha
4 days ago
I never said that 2s wasn’t too long. I just said your environment was broken if it took 10.
yndoendo
4 days ago
There is a high chance the extra nuts and bolts added to Windows, which slow it down, are IT-required software, settings, and security enhancements.
It took me almost a year to get a separate laptop for office and development. Their Enhanced Security prevented me from testing administrative code features and broke Visual Studio's bug submission system, which Microsoft requires you to use for posting software bugs.
By the way, I can break Windows simply by running their PowerShell utilities to configure NICs. Windows is not the stable product people think it is.
saagarjha
4 days ago
This was on macOS
godelski
4 days ago
Wild. How'd you even find me
saagarjha
4 days ago
I read the comments on this post
drob518
5 days ago
Yep, indeed. Which is the main reason I don’t run Discord.
jama211
5 days ago
I strongly doubt that. The main reason you don’t run it is likely because you don’t have strong motivation to do so, or you’d push through the odd start up time.
oceanplexian
5 days ago
Just going to throw out an anecdote that I don’t use it for the same reason.
It’s closed unless I get a DM on my phone and then I suffer the 2-3 minute startup/failed update process and quit it again. Not a fan of leaving their broken, resource hogging app running at all times.
Flere-Imsaho
4 days ago
For me, I really dislike the fact Discord is completely closed off to the wider internet, and Discord, the company, has absolute control: from a privacy and freedom of speech point of view. This goes against the core ideas of a free and open internet.
I'll admit that the Discord service is really good from a UX point of view.
ponector
5 days ago
On the contrary: every consumer-facing product I've worked on had no performance metrics tracked. And for enterprise software it was even worse, as the end user is not the one who decides to buy and use the software.
>>what you mean by software houses
How about Microsoft? Start menu is a slow electron app.
julianz
5 days ago
The Start menu is not an Electron app. Don't believe everything you read on the internet.
Spooky23
5 days ago
That makes the usability and performance of the windows start menu even more embarrassing.
The decline of Windows as a user-facing product is amazing, especially as they are really good at developing things they care about. The “back of house” guts of Windows have improved a lot, for example. They should just have a cartoon Bill Gates pop up like Clippy and flip you the bird at this point.
jiggawatts
5 days ago
Much worse is that the search function built into the start menu has been broken in different ways in every major release of Windows since XP, including Server builds.
It has both indexing failures and multi-day performance issues for mere kilobytes of text!
Conan_Kudo
5 days ago
The Start menu is React Native, but Outlook is now an Electron app.
odo1242
5 days ago
React Native, not Electron. Though it is slower than it was
kevin_thibedeau
4 days ago
That's even more damning: they can't dogfood their own GUI toolkits for the primary UI of their own OS.
kortilla
5 days ago
People believing it says something about the start menu
TehShrike
5 days ago
hey, haven't seen that one in the wild for a little bit :-D https://www.smbc-comics.com/comic/aaaah
kortilla
5 days ago
The comic artist seems pretty ignorant to think that it’s not meaningful.
What falsehoods people believe and spread about a particular topic is an excellent way to tell what the public opinion is on something.
Consider spreading a falsehood about Boeing QA getting bonuses based on number of passed planes vs the same falsehood about Airbus. If the Boeing one spreads like wildfire, it tells you that Boeing has a terrible track record of safety and that it’s completely believable.
Back to the start menu. It should be a complete embarrassment to MSFT SWEs that people even think the start menu performance is so bad that it could be implemented in electron.
In summary: what lies spread easily is an amazing signal on public perception. The SMBC comic is dumb.
Flimm
4 days ago
It's less meaningful than you think. Widespread prejudice does give you signal on public sentiment, but it doesn't give you much signal on whether the prejudice happens to coincide with reality or not, compared to other methods. People should be open to having their prejudices corrected by more relevant information.
kortilla
3 days ago
We’re talking about new prejudices, not old.
saagarjha
4 days ago
AAAAAAAAAAAAAA
philipallstar
5 days ago
> How about Microsoft? Start menu is a slow electron app.
If your users are trapped due to a lack of competition then this can definitely happen.
pjmlp
5 days ago
If only the community actually gathered around the true Linux distribution instead of endless forks.
philipallstar
4 days ago
Exactly. Let's start by listing all the true Linux distributions and we can go from there!
xp84
2 days ago
> every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric
Must be nice. In my career, all of it on webapps, I've seen a few leaders pop in to ask us to fix a particularly egregious performance issue if the right customers complain, but aside from those finely targeted, limited-attention-span drives to "improve performance", the answer for the past decade or so seems to be to assume everyone is on at least a gigabit connection, stick fingers in ears, and just keep adding more node modules. If the developers' disks get full because node_modules got too big, buy a bigger SSD and keep going. (OK, that last part is slight hyperbole, but I also don't think frontend devs would be deterred from their ravenous appetite for libraries by a full disk.)
moregrist
5 days ago
> I don’t know how many times I’ve heard the repeated quote about how Amazon loses $X million for every Y milliseconds of page loading time, as an example.
This is true for sites that are trying to make sales. You can quantify how much a delay affects closing a sale.
For other apps, it’s less clear. During its high-growth years, MS Office had an abysmally long startup time.
Maybe this was due to MS having a locked-in base of enterprise users. But given that OpenOffice and LibreOffice effectively duplicated long startup times, I don’t think it’s just that.
You also see the Adobe suite (and also tools like GIMP) with some excruciatingly long startup times.
I think it’s very likely that startup times of office apps have very little impact on whether users will buy the software.
delaminator
5 days ago
They even made it render the screen but still be unusable to make it look like it was running.
epmatsw
3 days ago
Every SSRed app these days…
Yoric
3 days ago
Can confirm, at least for Firefox. When I worked on it, I spent literal years shaving seconds from startup or shutdown, and milliseconds from tab switching.
Everybody likes to hate Telemetry, and yes, it can be abused, but that's how Mozilla (and its competitors) manage to make users' lives more comfortable.
j_w
5 days ago
Clearly Amazon doesn't care about that sentiment across the board. Plenty of their products are absurdly slow because of their poor engineering.
mindslight
5 days ago
> every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric
Are they evaluating the shape of that line with the same goal as the stonk score? Time spent by users is an "engagement" metric, right?
eviks
5 days ago
The issue here is not tracking, but developing. Like, how do you explain the fact that whole classes of software have gotten worse on those "key metrics"? (and that includes web-selling webpages)
croes
5 days ago
Then why do many software houses favor cloud software over on-premise?
Cloud software often has a recognizable delay responding to user input compared to local software.
kevin_thibedeau
4 days ago
The MBAs hate capital expenditures and love operating expenditures. They'll make strategic blunders like over-dependence on external services just to satisfy their warped mindset.
venturecruelty
5 days ago
>I don’t know what you mean by software houses, but every consumer facing software product I’ve worked on has tracked things like startup time and latency for common operations as a key metric.
Then respectfully, uh, why is basically all proprietary software slow as ass?
pjmlp
5 days ago
An exception that confirms the rule.
ekjhgkejhgk
6 days ago
I wouldn't call it a tragedy of the commons, because it's not a commons. It's owned by Microsoft. They're calculating that it's worth it for them, so I say take as much as you can.
A commons would be something owned by nobody, whose existence everyone benefits from.
dahart
5 days ago
> so I say take as much as you can. Commons would be if it’s owned by nobody
This isn’t what “commons” means in the term ‘tragedy of the commons’, and the obvious end result of your suggestion to take as much as you can is to cause the loss of access.
Anything that is free to use is a commons, regardless of ownership, and when some people use too much, everyone loses access.
Finite digital resources like bandwidth and database sizes within companies are even listed as examples in the Wikipedia article on Tragedy of the Commons. https://en.wikipedia.org/wiki/Tragedy_of_the_commons
nkmnz
5 days ago
No, the word and its meaning both point to the fact that there’s no exclusive ownership of a commons. This is important, since ownership is associated with bearing the cost of usage (i.e., depreciation), which would lead an owner to avoid the tragedy of the commons. Ownership is regularly the solution to the tragedy (socialism didn’t work).
The behavior that you warn against is that of a free rider that makes use of a positive externality of GitHub’s offering.
dahart
5 days ago
That is one meaning of “commons”, but not all of them, and you might be mistaking which one the phrase ‘tragedy of the commons’ is using.
“Commons can also be defined as a social practice of governing a resource not by state or market but by a community of users that self-governs the resource through institutions that it creates.”
https://en.wikipedia.org/wiki/Commons
The actual mechanism by which ownership resolves tragedy of the commons scenarios is by making the resource non-free, by either charging, regulating, or limiting access. The effect still occurs when something is owned but free, and its name is still ‘tragedy of the commons’, even when the resource in question is owned by private interests.
bawolff
5 days ago
How does that differ from what the person you are arguing against is saying?
dahart
5 days ago
Ownership, I guess. The 2 parent comments are claiming that “tragedy of the commons” doesn’t apply to privately owned things. I’m suggesting that it does.
Edit: oh, I do see what you mean, and yes I misunderstood the quote I pulled from WP - it’s talking about non-ownership. I could pick a better example, but I think that’s distracting from the fact that ‘tragedy of the commons’ is a term that today doesn’t depend on the definition of the word ‘commons’. It’s my mistake to have gotten into any debate about what “commons” means, I’m only saying today’s usage and meaning of the phrase doesn’t depend on that definition, it’s a broader economic concept.
nkmnz
5 days ago
No, it’s not.
dahart
5 days ago
What’s not what? Care to back up your argument with any links? I already pointed out that examples in the WP article for ‘Tragedy of the Commons’ use private property. https://en.wikipedia.org/wiki/Tragedy_of_the_commons#Digital... Are you contradicting the Wikipedia article? Why, and on what basis?
nkmnz
4 days ago
I'm contradicting your interpretation of the Wikipedia article. It does not support your initial statement that a) Github's (or any other company's) free tier constitutes a commons and/or b) the "overuse" of said free tiers by free riders could be the basis for a tragedy of the commons (ToC). The idea is absurd, since there is no commons and also no tragedy. To the contrary. Commons have an external or natural limit to how much they can provide in a given time without incurring cost in the form of depreciation. But there is no external or natural limit to the free tier. The free tier is the result of the incentives under which the Github management operates and it is fully at their discretion, so the limits are purely internal. Unlike in the case of a commons, more usage can actually increase the amount of resources provided by the company for the users of the free tier, because of a) network effects and b) economies of scale (more users bring more other users; more users cost less per user).
If Github realizes that the free tier is too generous, they can cut it anytime without it being in any way a "tragedy" for anybody involved - having to pay for stuff or service you want to consume is not the "T" in ToC! The T is that there are no incentives to pay (or use less) without increasing the incentives for everyone else to just increase their relative use! You not using the github free tier doesn't increase the usage of Github for anybody else - if it has any effect at all, it might actually decrease the usage of Github because you might not publish something that might in turn attract other users to interact.
dahart
4 days ago
Wikipedia does use Wikipedia, a privately owned organization, as an example of a digital commons.
The ‘tragedy’ that the top comment referred to is losing unlimited access to some of GitHub’s features, as described in the article (shallow clones, CPU limits, API rate limits, etc.). The finiteness, or natural limit, does exist in the form of bandwidth, storage capacity, server CPU capacity, etc.. The Wikipedia article goes through that, so I’m left with the impression you didn’t understand it.
bawolff
4 days ago
> Wikipedia does use Wikipedia, a privately owned organization
The Wikimedia organization does not actually own Wikipedia. They do not control editorial policy nor own the copyright of any of the contents. They do not pay any of the editors.
dahart
4 days ago
They do own the servers. The rest of your comment is what demonstrates why Wikipedia counts as “commons”. Much of the same can be said for GitHub too.
nkmnz
4 days ago
It is really annoying that you're shifting the goal post by bringing up Wikipedia (as an example, not the article), which is very much different from Github in many ways. Still, Wikipedia is not a common good in my book, but at least in the case of Wikipedia I can understand the reasoning and it's a much more interesting case.
But let's stick with Github. On which of the following statements can we agree?
Z1) A "Commons" is a system of interacting market participants, governed by shared interests and incentives (and sometimes shared ownership). Github, a multi-billion-dollar subsidiary of the multi-trillion-dollar company Microsoft, and I, their customer, are not members of the same commons; we don't share many interests, we have vastly different incentives, and we certainly do not share any ownership. We have a legally binding contract that each side can cancel within the boundaries of said contract under the applicable law.
Z2) A tragedy in the sense of the Tragedy of the Commons is that something bad happens even though everyone can have the best intentions, because the system lacks a mechanism that would allow it to a) coordinate interests and incentives across time, and b) reward sustainable behavior instead of punishing it.
A) Github giving away stuff for free while covering the cost does not constitute a common good from... 1. a legal perspective 2. an ethical perspective 3. an economic perspective
B) If a free tier is successful, a profit-maximizing company with a market penetration far from saturation will increase the resources provided in total, while there is no such mechanism or incentive for any participant in a market involving a common good; e.g., no one will provide additional pasture for free if a commons (Allmende) is already being destroyed through overgrazing.
C) If a free tier is unsuccessful because it costs more than it enables in new revenue, a company can simply shut it down – no tragedy involved. No server has been depreciated, no software destroyed, no user lost their share of a commonly owned good.
D) More users of a free tier reduce net loss / increase net earnings per free user for the provider, while more cattle grazing on a pasture decrease net earnings / increase net loss per cow.
E) If I use less of Github, you don't have any incentive to use more of it. This is the opposite of a commons, where one participant taking less of it puts out an incentive to everybody else to take their place and take more of it.
F) A service that you pay for with your data, your attention, your personal or company brand and reach (e.g. with public repositories), is not really free.
G) The tiny product samples that you can get for free in perfume shops do not constitute a common good, even though they are limited, "free" for the user, and presumably beneficial even for people not involved in the transaction. If you think they were a common good, what about Nestlé offering Cheerios with +25% more for free? Are those 20% a common good just because they are free? Where do you draw the line? Paying with data, attention, and brand + reach is fine, but paying for only 80% of the produce is not fine?
H) The concepts of "moral hazard" and "free riders" apply to all your examples, both Github and Wikipedia. The concept of a Commons (capital C) is neither necessary nor helpful in describing the problems that you want to describe wrt the free services provided by either Github or Wikipedia.
dahart
4 days ago
Nope, no goal posts were moved, Wikipedia and GitHub are both private entities that offer privately funded free services to everyone, and due to the widespread free access, both have been considered to be examples of digital commons by others. I didn’t make up the Wikipedia example, it’s in Wikipedia being offered as one of the canonical examples of digital commons, and unfortunately for you it pokes a hole in your argument. If your ‘book’ disagrees with the WP article, you’re free to fix it (since WP is a digital commons), and you’re also free to use it to re-evaluate whether your book needs updating.
You seem to be stuck on definitions of ‘commons’, and unfortunately that’s not a compelling argument for reasons I’ve already stated. Also unfortunate that there are fundamental terminology flaws, or made up definitions, or straw men arguments, or incorrect statements, or opinions in every single item you listed.
“Tragedy of the Commons” is a phrase that became an economic term of art a long time ago. It’s now an abstract concept, and gets used to mean (as well as defined by) any situation in which a community of people overusing shared resources causes any loss of access to those shared resources for anyone else in the community. “The tragedy of the commons is an economic theory claiming that individuals tend to exploit shared resources so that demand outweighs supply, and it becomes unavailable for the whole.” (Investopedia) I’ve already cited multiple sources that define it that way, and so far you’ve shared no evidence to the contrary.
There are also tons of examples online where the phrase has been used to refer to small, local, or privatized resources, I found a dozen in like one minute, so I already know it’s incorrect to claim that people don’t use the phrase in the way I’m suggesting.
Even though the phrase does not depend on any strict definition of commons (or of tragedy), none of your argument addresses the fact that what’s common in, say, Germany is not freely available to Iranians, for example. Land is often used in ‘tragedy of the commons’ examples. Hardin’s original example was sheep grazing on “public” land, and yet there is really no such thing as common land anywhere on this planet; all of it is claimed by subgroups, e.g., countries, and is private in some sense. The idea of commons, and even some of the alternate dictionary definitions, make explicit note that the word is relative to a specific community of people. Nothing you’ve said addresses that fact, and it means that ‘Tragedy of the Commons’ has always referred to resources that are not common in a global context. GitHub and Wikipedia are more common than “public” land in America in that global sense, because they’re used by and available to more people than US land is.
What I can agree with is that it’s common for people to mean things like land, air, and water, when using or referring to the phrase, and I agree those things count as commons.
nkmnz
4 days ago
You're confusing public goods with common goods. That's your personal tragedy of the commons.
> “The tragedy of the commons is an economic theory claiming that individuals tend to exploit shared resources so that demand outweighs supply, and it becomes unavailable for the whole.” (Investopedia)
EXACTLY. This is NOT what is happening in the case of Github. As explained plenty of times, Github has the incentive to INCREASE their supply, making MORE available for the whole, if the whole demands MORE. Also, they are a centralized, coordinated entity that can change the rules for the whole flock, which is one of the famous coordination problems associated with common goods. They can also discriminate between their contractual partners and optimize for multi-period results to reduce moral hazards and free-riding. It must be stupidity to not see these fundamental differences on the systems level.
> I didn’t make up the Wikipedia example, it’s in Wikipedia being offered as one of the canonical examples of digital commons
Yeah, the example in the article is Wikipedia, not Github. That's your example. All my statements refer 100% to Github and probably only 90% to Wikipedia. That said, there are true digital commons, e.g. the copper cables connecting the houses in your street, or the insufficient number of bands in old WiFi standards.
Since Dunning-Kruger has entered the chat, I'm going to leave. Have a good day; you will have a hard time having serious conversations if you do not accept that it helps everyone to favor precise language over watering down the meaning of concepts, like some social scientists and journalists seem to prefer for self-marketing purposes.
dahart
3 days ago
> You’re confusing public goods with common goods.
Am I? Where did I do that? The distinction between common and public is defined as whether or not the thing can succumb to tragedy of the commons. If public goods are “non-rivalrous”, then land is not a public good, it’s a common good, right? And “common” land is owned by nation states, or by smaller geographic communities, is it not? Therefore, ownership is always involved and the land is not available for use by people from other nation states, right?
Above, you said “there’s no exclusive ownership of a commons”. But sheep grazing on “commons” land is generally land owned exclusively by a country, nation, state, province, city, etc.. I assume what you meant was that no one person or sub-group within the geographical community owns the commons.
> This is NOT what is happening in the case of GitHub.
That’s not true, the article we’re commenting on gave examples of at least three different specific things that GitHub has limited in response to overuse, and the comment that started this thread was reacting to that fact. If they have incentive to increase their supply, why didn’t they actually do it? Logic can’t override history.
> there are true digital commons, e.g. the copper cables connecting the houses in your street
That’s not true, that’s not a commons at all, and not what the phrase “digital commons” means. In the US, the cables are owned by the telcom providers that installed them, they are private property. Maybe there are public cables where you live, but in that case, it seems like maybe you are the one confusing public and common goods. The phrase ‘digital commons’ generally speaking refers to digital goods, not physical goods. (But there is some leakage into the physical world, which is why some digital commons are susceptible to the tragedy of the commons.) https://en.wikipedia.org/wiki/Digital_commons (Do note that GitHub is listed there as an example of a digital commons.)
> It must be stupidity to not see these fundamental differences on the systems level
FWIW, you’ve flatly broken HN guidelines here, and this reflects extremely poorly on you and your argument. From my point of view, I can only interpret this lack of civility to mean you’re frustrated about not being able to answer my questions or form a convincing argument.
Please review, and strive for better: https://news.ycombinator.com/newsguidelines.html
xp84
2 days ago
GP shouldn't have said something insulting, but I do think it's you who are being obtuse here in not acknowledging that this is at least very different than the field everyone can graze on that gets overgrazed, that is the most simple and widely-accepted type of commons. It's probably not worth arguing semantics at all ("is this a commons?") because there isn't a "Tragedy of the Commons" central authority that could ever adjudicate that. Any definition of commons could be used; the only thing that matters is if the definitions are useful to define what's going on and to compare it to other situations.
In this case, GitHub can very cheaply add enforceable rules and force heavy users to consume only what they consider a tolerable amount of resources. The majority who don't need an outsized amount of resources will never be affected by this. That is why there is no 'tragedy' here.
It would be as if the grazing field were outfitted with sheep-facial-recognition and could automatically and at trivial cost, gently drone-airlift any sheep outside the field after they consume 3x what a normal sheep eats each day. In what most of us think of as a ToC situation, there is little that can be done besides closing the field or subdividing it into tiny, private plots which are policed.
dahart
a day ago
The singular point of debate here from my side has been whether the phrase ‘tragedy of the commons’ applies to cases where the ‘commons’ are owned to the exclusion of some people, and nothing else. I don’t believe I have failed to acknowledge the differences between physical and digital commons, but let me correct that impression now: GitHub certainly is very different from a sheep-grazing field in almost every way. GitHub is even different from Wikipedia in many ways, just like GP said. I am arguing those differences, no matter how large, do not matter purely in terms of whether you can call these a ‘commons’, and I’ve supported that opinion by showing evidence that other people call both GitHub and Wikipedia a ‘digital commons’. If any definition of commons can be used, including privately owned land that is made available to the public, then I think you and I agree completely. The Wikipedia article about this phrase actually points out what I’ve been saying here, that common land does not exist.
There is a central authority on this topic: the paper by Hardin that coined the phrase. It’s worth a read. He defined ‘tragedy’ to be in the dramatic sense, e.g., a Greek or Shakespearean tragedy: “We may well call it ‘the tragedy of the commons,’ using the word ‘tragedy’ as the philosopher Whitehead used it: ‘The essence of dramatic tragedy is not unhappiness. It resides in the solemnity of the remorse-less working of things.’”
Hardin did not define ‘commons’, but he used multiple examples of things that are owned to the exclusion of others, and he even pointed out that a bank robber thinks of a bank as a commons. He himself blurred the line of what a commons means, and his actual argument depends only on the idea that commons means something shared and nothing more. In fact, he was making a point about human behavior, and his argument is stronger when ‘commons’ refers to any shared resources that can be exhausted by overuse at all. Hardin would have had a good chuckle over this extremely silly debate.
The actual points Hardin was making behind his phrase ‘Tragedy of the Commons’ were that Adam Smith’s ‘Invisible Hand’ economics, and Libertarian thinking, are provably wrong, and that we should abolish the UN’s Universal Declaration of Human Rights, specifically the right to breed freely, because he believes these things will certainly lead to overpopulation of the earth and thus increased human suffering. The only actual ‘commons’ he truly cared about in this paper is the earth’s space and food supply. The question of ownership is wholly and utterly irrelevant to his phrase.
GitHub adding rules that curtails people does limit some people’s access, that’s the point. How many people it affects I don’t know, and I don’t think it’s especially relevant, but note that in this case one single GitHub user being limited might affect many many people - Homebrew was one of the examples.
“Tragedy” never referred to the magnitude of the problem, as you and GP are assuming. Hardin’s “tragedy” refers to the human character flaw of thinking that shared things are preferable to limitations, because he argues that we end up with uncontrolled (worse) limitations anyway. His “tragedy” is the inevitability of loss, the irony of misguided belief in the very idea of a commons.
bawolff
5 days ago
I'm not sure i agree that the Wikipedia article supports your position.
Certainly private property is involved in tragedy of the commons. In the classic shared cattle ranching example, the individual cattle are private property, only the field is held in common.
I generally think that tragedy of the commons requires the commons to, well, be held in common. If someone owns the thing that is the commons, it's not a commons but just a bad product. (With, of course, some nitpicking about how things can be de jure private property while being de facto common property.)
In the Microsoft example, Windows becoming shitty software is not a tragedy of the commons; it's just MS making a business decision, because Windows is not a commons. On the other hand, computing in general becoming shitty, because each individual app does attention-grabbing dark patterns that help the individual app's bottom line while hurting the ecosystem as a whole, would be a tragedy of the commons, as user attention is something all apps hold in common and none of them own.
dahart
4 days ago
One of the examples of digital commons in the article is Wikipedia itself, which is privately owned, so now you can be sure the Wikipedia article does back up my claim at least a little.
The Microsoft example in this subthread is GitHub, not Windows. Windows is not a digital commons, because it’s neither free nor finite. Github is (or was) both. That is the criteria that Wikipedia is using to apply the descriptor ‘commons’: something that is both freely available to the public, and comes in limited supply, e.g. bandwidth, storage, databases, compute, etc.
Wikipedia’s article seems to be careful to not discuss ownership nor define the tragedy of the commons in terms of ownership, presumably because the phrase describes something that can still happen when privately owned things are made freely available. I skimmed Investopedia’s article on Tragedy as well, and it seems similarly to not explicitly discuss ownership, and even brings up the complicated issue of lack of international commons. That’s an interesting point: whatever we call commons locally may not be a commons globally. That suggests that even the original classic notion of tragedy of the commons often involves a type of private ownership, i.e. overfishing a “public” lake is a lake owned by a specific country, cattle overusing a “public” pasture is land owned by a specific country, and these resources might not be truly common when considered globally.
nkmnz
4 days ago
The use of Github is not "something that is both freely available to the public". If you're not the customer, you're the product.
dahart
4 days ago
What use of GitHub are you talking about? The use of GitHub by @c-linkage at the top of the thread was, in fact, based on GitHub being free to use. And GitHub’s basic services are free to use. I really don’t know what you mean.
Your oft-repeated customer-vs-product platitude doesn’t seem to apply to GitHub, at least not to its founding and core product offering. You are the customer, and GitHub doesn’t advertise. It’s a freemium model; the free access is just a sort of loss leader to entice paid upgrades by you, the customer.
TeMPOraL
6 days ago
Still, because reality doesn't respect boundaries of human-made categories, and because people never define their categories exhaustively, we can safely assume that something almost-but-not-quite like a commons, is subject to an almost-but-not-quite tragedy of the commons.
bee_rider
5 days ago
That seems to assume some sort of… maybe unfounded linearity or something? I mean, I’m not sure I agree that GitHub is nearly a commons in any sense, but let’s put that aside as a distraction…
The idea of the tragedy of the commons relies on this feedback loop of having these unsustainably growing herds (growing because they can exploit the zero-cost-to-them resources of the commons). Feedback loops are notoriously sensitive to small parameter changes. MS could presumably impose some damping if they wanted.
TeMPOraL
5 days ago
> That seems to assume some sort of… maybe unfounded linearity or something
Not linearity but continuity, which I think is a well-founded assumption, given that it's our categorization that simplifies the world by drawing sharp boundaries where no such bounds exist in nature.
> The idea of the tragedy of the commons relies on this feedback loop of having these unsustainably growing herds (growing because they can exploit the zero-cost-to-them resources of the commons)
AIUI, zero-cost is not a necessary condition, a positive return is enough. Fishermen still need to buy fuel and nets and pay off loans for the boats, but as long as their expected profit is greater than that, they'll still overfish and deplete the pond, unless stronger external feedback is introduced.
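A toy simulation of that dynamic (all parameters invented): each boat keeps fishing while its own expected profit is positive, so the individual feedback only kicks in after the shared stock is already wrecked.

    stock = 1000.0                                  # shared fish stock
    price, cost_per_trip, catch_rate = 10.0, 50.0, 0.02

    seasons, boats = 0, 10
    while price * catch_rate * stock - cost_per_trip > 0:   # profit per trip
        stock -= boats * catch_rate * stock         # every boat takes its cut
        boats += 1                                  # profit attracts entrants
        seasons += 1
    print(seasons, round(stock))  # fishing stops only after ~80% depletion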
Given that the solution to tragedy of the commons is having the commons owned by someone who can boss the users around, GitHub being owned by MS makes it more of a commons in practice, not less.
kortilla
5 days ago
No, it’s not a well-founded assumption. Many categories like these were created in the first place because there is a very obvious discontinuous step change in behavior.
You’re fundamentally misunderstanding what tragedy of the commons is. It’s not that it’s “zero-cost” for the participants. All it requires is a positive return that has a negative externality that eventually leads to the collapse of the system.
Overfishing and CO2 emissions are very clearly a tragedy of the commons.
GitHub right now is not. People putting all sorts of crap on there is not hurting GitHub. GitHub is not going to collapse if people keep using it unbounded.
Not surprisingly, this is because it’s not a commons and Microsoft oversees it, placing appropriate rate limits and whatnot to make sure it keeps making sense as a business.
thayne
5 days ago
And indeed MS/GitHub does impose some "damping" in the form of things like API request throttling, CPU limits on CI, asking Homebrew not to use shallow cloning, etc. And those limits are one of the reasons given why using git as a database isn't good.
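That kind of damping is typically implemented as something like a token bucket; a minimal sketch (rates invented, not GitHub's actual limiter):

    import time

    class TokenBucket:
        def __init__(self, rate_per_sec, burst):
            self.rate, self.capacity = rate_per_sec, burst
            self.tokens, self.last = float(burst), time.monotonic()

        def allow(self):
            # Refill proportionally to elapsed time, capped at capacity.
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1.0:
                self.tokens -= 1.0
                return True
            return False  # over the limit: caller should back off (e.g. HTTP 429)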
lo_zamoyski
5 days ago
There is an analogy in the sense that for the users a resource is, for certain practical intents and purposes, functionally common. Social media is like this as well.
But I would make the following clarifications:
1. A private entity is still the steward of the resource and therefore the resource figures into the aims, goals, and constraints of the private entity.
2. The common good is itself under the stewardship of the state, as its function is guardian of the common good.
3. The common good is the default (by natural law) and prior to the private good. The latter is instituted in positive law for the sake of the former by, e.g., reducing conflict over goods.
TeMPOraL
5 days ago
> There is an analogy in the sense that for the users a resource is, for certain practical intents and purposes, functionally common. Social media is like this as well.
I think it's both simpler and deeper than that.
Governments and corporations don't exist in nature. Those are just human constructs, mutually-recursive shared beliefs that emulate agents following some rules, as long as you don't think too hard about this.
"Tragedy of the commons" is a general coordination problem. The name itself might've been coined with some specific scenarios in mind, but for the phenomenon itself, it doesn't matter what kind of entities exploit the "commons"; the "private" vs. "public" distinction itself is neither a sharp divide, nor does it exist in nature. All that matters is that there's some resource used by several independent parties, and each of them finds it more beneficial to defect than to cooperate.
In a way, it's basically a 3+-player prisoner's dilemma. The solution is the same, too: introducing a party that forces all other parties to cooperate. That can be a private or public or any other kind of org taking ownership of the commons and enforcing quotas, or in the case of prisoners, a mob boss ready to shoot anyone who defects.
lo_zamoyski
4 days ago
It was not my intent to be exhaustive, but to make a few points that left it up to the reader to relate them appropriately to your post in order to enrich thinking about the subject.
But it appears we cannot avoid getting into the weeds a bit…
> Governments and corporations don't exist in nature.
This is not as simple as you seem to think.
The claim “don’t exist in nature” is vague, because the word “nature” in common speech is vague. What is “natural”? Is a beehive “natural”? Is a house “natural”? Is synthetic water “natural”? (I claim that the concept of “nature” concerns what it means to be some kind of thing. Perhaps polystyrene has never existed before human beings synthesized it, but it has a nature; that is, it means something to be polystyrene. And it is in the nature of human beings to make materials and artifacts, i.e., to produce technology ordered toward the human good.)
So, what is government? Well, it is an authority whose central purpose is to function as the guardian and steward of the common good. I claim that parenthood is the primordial form of human government and the family the primordial form of the state. We are intrinsically social and political animals; legitimate societies exist only when joined by a common good. This is real and part of human nature. The capacity to deviate from human nature does not disprove the norm inherent to it.
Now, procedurally we could institute various particular and concrete arrangements through which government is actualized. We could institute a republican form of government or a monarchy, for example. These are historically conditioned. But in all cases, there is a government. Government qua government is not some arbitrary “construct”, but something proper to all forms and levels of human society.
> "Tragedy of the commons" is a general coordination problem.
We can talk about coordination once we establish the ends for which such coordination is needed, but there is something more fundamental that must be said about the framing of the problem of the “tragedy”. The framing does not presume a notion of human beings as moral agents and political and social creatures. In other words, it begins with a highly individualist, homo economicus view of human nature as rationally egoist and oriented toward maximizing utility, full stop. But I claim that is not in accord with human nature and thus the human good, even if people can fall into such pathological patterns of behavior (especially in a culture that routinely reinforces that norm).
As I wrote, human beings are inherently social animals. We cannot flourish outside of societies. A commons that suffers this sort of unhinged extraction is an example of a moral and a political failure. Why? Because it is unjust, intemperate, and a lack of solidarity to maximize resource extraction in that manner. So the tragedy is a matter of a) the moral failure of the users of that resource, and b) the failure of an authority to regulate its use. The typical solution that’s proposed is either privatization or centralization, but both solutions presuppose the false anthropology of homo economicus. (I am not claiming that privatization does not have a place, only that the dichotomy is false.)
Now, I did say that the case with something like github is analogical, because functionally, it is like a common resource, just like how social media functions like a public square in some respects. But analogy is not univocity. Github is not strictly speaking a common good, nor is social media strictly a public square, because in both cases, a private company manages them. And typically, private goods are managed for private benefit, even if they are morally bound not to harm the common good.
That intent, that purpose, is central to determining whether something is public or private, because something public has the common benefit as its aim, while something private has private benefit as its aim.
reactordev
6 days ago
An A- is still an A kind of thinking. I like this approach as not everything perfectly fits the mold.
ttiurani
6 days ago
The whole notion of the "tragedy of the commons" needs to be put to rest. It's an armchair thought experiment that was disproven at the latest in the 90s by Elinor Ostrom with actual empirical evidence of commons.
The "tragedy", if you absolutely need to find one, is only for unrestricted, free-for-all commons, which is obviously a bad idea.
wongarsu
5 days ago
A high-trust community like a village can prevent a tragedy of the commons scenario. Participants feel obligations to the community, and misusing the commons actually does have real downsides for the individual because there are social feedback mechanisms. The classic examples like people grazing sheep or cutting wood are bad examples that don't really work.
But that doesn't mean the tragedy of the commons can't happen in other scenarios. If we define commons a bit more generously it does happen very frequently on the internet. It's also not difficult to find cases of it happening in larger cities, or in environments where cutthroat behavior has been normalized
TeMPOraL
5 days ago
> A high-trust community like a village can prevent a tragedy of the commons scenario. Participants feel obligations to the community, and misusing the commons actually does have real downsides for the individual because there are social feedback mechanisms.
That works while the size of the community is ~100-200 people, when everyone knows everyone else personally. It breaks down rapidly after that. We compensate for that with hierarchies of governance, which give rise to written laws and bureaucracy.
New tribes break off old tribes, form alliances, which form larger alliances, and eventually you end up with countries and counties and voivodeships and cities and districts and villages, in hierarchies that gain a level per ~100x population increase.
This is sociopolitical history of the world in a nutshell.
lukan
5 days ago
"and eventually you end up with countries and counties and vovoidships and cities and districts and villages, in hierarchies that gain a level per ~100x population increase."
You say it like this is a law set in stone, because this is what happened im history, but I would argue it happened under different conditions.
Mainly, the main advantage of an empire over small villages/tribes is not at all that they have more power than the villages combined, but that they can concentrate their power where it is needed. One village did not stand a chance against the empire - and the villages were not coordinated enough.
But today we would have the internet for better communication and coordination, enabling the small entieties to coordinate a defense.
Well, in theory of course. Because we do not really have autonomous small states, but are dominated by the big players. And the small states have mowtly the choice which block to align with, or get crushed. But the trend might go towards small again.
(See also cheap drones destroying expensive tanks, battleships etc.)
ajuc
5 days ago
The internet is working in exactly the opposite way to what you're describing - it's making everything more centralized. Once we had several big media companies in each country and in each big city. Now we have Google and Facebook and TikTok and Twitter and then the "whatevers".
NETWORK effect is a real thing
lukan
5 days ago
Yes, but there is a difference between having the choice of joining FB or not having a choice at all when the empire comes to claim you (like in Ukraine).
8note
5 days ago
FB is part of the empire though, and it is coming for us.
Canadians need an anti-imperial, Radio-Canada-run alternative. We aren't gonna be able to coordinate against the empire when the empire has the main control over the internet.
When the Americans come a-knocking, we're gonna wish we had Chinese radios.
xorcist
5 days ago
> That works while the size of the community is ~100-200 people,
Yet we regularly observe that working with millions of people: we take care of our young, we organize, and when we see that some action hurts our environment, we tend to limit its use.
It's not obvious why some societies break down early and some go on working.
TeMPOraL
5 days ago
> Yet we regularly observe that working with millions of people: we take care of our young, we organize, and when we see that some action hurts our environment, we tend to limit its use.
That's more like human universals. These behaviors generally manifest to smaller or larger degree, depending on how secure people feel. But those are extremely local behaviors. And in fact, one of them is exactly the thing I'm talking about:
> we organize
We organize. We organize for many reasons; "general living" is the main one, but we're mostly born into it today (few get the chance to be among the founding people of a new village, city or country). But the same patterns show up in every other organization people create, from companies to charities, from political interest groups to rural housewives' circles -- groups that grow past ~100 people split up. Sometimes into independent groups, sometimes into levels of hierarchies. Observe how companies have regional HQs and departments and areas and teams; religious groups have circuits and congregations, etc. Independent organizations end up creating joint ventures and partnerships, or merge together (and immediately split into a more complex internal structure).
The key factor here is, IMO, for everyone in a given group to be in regular contact with everyone else. Humans are well evolved for living in such small groups - we come with built-in hardware and software to navigate complex interpersonal situations. Alignment around shared goals and implicit rules is natural at this scale. There's no space for cheaters and free-loaders to thrive, because everyone knows everyone else - including the cheater and their victims. However, once the group crosses this "we're all a big family, in it together" size, coordinating everyone becomes hard, and free-loaders proliferate. That's where explicit laws come into play.
This pattern repeats daily, in organizations people create even today.
AnthonyMouse
5 days ago
I get the feeling it's the combination of Schelling points and surplus. If everyone else is being pro-social, i.e. there is a culture of it, and people aren't so hard up that they can't reasonably afford to do the same, then that's what happens, either by itself (Hofstadter's theory of superrationality) or with nothing more than light social pressure.
But if a significant fraction of the population is barely scraping by, then they're not willing to be "good" if it means not making ends meet, and when other people see widespread defection, they start to feel like they're the only one holding up their end of the deal, and then the whole thing collapses.
This is why the tendency for people to propose rent-seeking middlemen as a "solution" to the tragedy of the commons is such a diabolical scourge. It extracts the surplus that would allow things to work more efficiently in their absence.
vlovich123
5 days ago
I’ve heard stories from communist villages where everyone knew everyone. Communal parks and property were not respected and were frequently vandalized or otherwise neglected, because they didn’t have an owner and were treated as something for someone else to solve.
It’s easier to explain it in those terms than with assumptions about how things work in a tribe.
ttiurani
5 days ago
> But that doesn't mean the tragedy of the commons can't happen in other scenarios.
Commons can fail, but the whole point of Hardin calling the commons a "tragedy" is to suggest that it necessarily fails.
Compare it to, say, driving. It can fail too, but you wouldn't call it "the tragedy of driving".
We'd be much better off if people didn't throw around this zombie term decades after it has been shown to be unfounded.
lo_zamoyski
5 days ago
Even here, the state is the steward of the common good. It is a mistaken notion that the state only exists because people are bad. Even if people were perfectly conscientious and concerned about the common good, you would still need a steward. It simply wouldn’t be a steward who needs to use aggressive means to protect the common good from malice or abuse.
jandrewrogers
5 days ago
> A high-trust community like a village can prevent a tragedy of the commons scenario.
No, it does not. This sentiment, which many people share, is based on a fictional and idealistic notion of what small communities are like, held by people who have never lived in such communities.
Empirically, even in high-trust small villages and hamlets where everyone knows everyone, the same incentives exist and the same outcomes happen. Every single time. I lived in several and I can't think of a counter-example. People are highly adaptive to these situations and their basic nature doesn't change because of them.
Humans are humans everywhere and at every scale.
yellow_postit
5 days ago
While an earlier poster is overstating Ostrom’s Nobel-prize-winning work, it has regularly been shown that averting the tragedy of the commons is not as insurmountable as the original coining of the phrase implied.
Saline9515
6 days ago
Ostrom showed that it wasn't necessarily a tragedy, if the tight-knit groups involved decided to cooperate. This is common in what we call "trust-based societies", which aren't universal.
Nonetheless, the concept is still alive, and anthropogenic global warming is here to remind you of it.
dpark
5 days ago
She did not “disprove” the existence of the tragedy of the commons. What she established was that controlling the commons can be done communally rather than through privatization or through government ownership.
Communal management of a resource is still government, though. It just isn’t central government.
The thesis of the tragedy of the commons is that an uncontrolled resource will be abused. The answer is governance at some level, whether individual, collective, or government ownership.
> The "tragedy", if you absolutely need to find one, is only for unrestricted, free-for-all commons, which is obviously a bad idea.
Right. And that’s what people are usually talking about when they say “tragedy of the commons”.
gmfawcett
5 days ago
Ostrom's results didn't disprove ToC. She showed that common resources can be communally maintained, not that tragic outcomes could never happen.
8note
5 days ago
I don't think anything can disprove that ToC issues can happen in any given situation.
That seems like an unreasonable bar, and less useful than asking "does this system make ToC less frequent than that system?"
b00ty4breakfast
6 days ago
yeah, it's a post-hoc rationalization for the enclosure and privatization of said commons.
TeMPOraL
5 days ago
And here I thought the standard, obvious solution to tragedy of the commons is centralized governance.
b00ty4breakfast
5 days ago
That is, in fact, how medieval commons were able to exist successfully for hundreds of years.
dpark
5 days ago
People invoke the tragedy of the commons in bad faith to argue for privatization because “the alternative is communism”, i.e. either an individual or the government has to own the resource.
This is of course a false dichotomy because governance can be done at any level.
AnthonyMouse
5 days ago
It also seems to omit the possibility that the thing could be privately operated but not for profit.
Let's Encrypt is a solid example of something you could reasonably model as "tragedy of the commons" (who is going to maintain all this certificate verification and issuance infrastructure?) but then it turns out the value of having it is a million times more than the cost of operating it, so it's quite sustainable given a modicum of donations.
Free software licenses are another example in this category. Software frequently has a much higher value than its development cost, and incremental improvements decentralize well, so a license that lets you use the software for free but requires you to contribute back improvements tends to work well: people see something that would work for them except for one thing, and it's cheaper to add that themselves (or pay someone to) than to pay someone to develop the whole thing from scratch.
jasonkester
6 days ago
It has the same effect though. A few bad actors using this “free” thing can end up driving the cost up enough that Microsoft will have to start charging for it.
The jerks get their free things for a while, then it goes away for everyone.
Y_Y
6 days ago
I think the jerks are the ones who bought and enshittified GitHub after it had earned significant trust and become an important part of FOSS infrastructure.
irishcoffee
6 days ago
Scoping it to a local maximum: the only thing worse than git is GitHub. In an alternate universe hg won the clone wars and we are all better off for it.
MarsIronPI
5 days ago
Excuse me if this is obvious, but how is Mercurial better than Git from a repo format perspective?
dahart
5 days ago
Why do you blame MS for predictably doing what MS does, and not the people who sold that trust & FOSS infra to MS for a profit? Your blame seems misplaced.
And out of curiosity, aside from costing more for some people, what’s worse exactly? I’m not a heavy GitHub user, but I haven’t really noticed anything in the core functionality that would justify calling it enshittified.
mastax
5 days ago
Plenty of blame to go around.
Probably the worst thing MS did was kill GitHub’s nascent CI project and replace it with Azure DevOps. Though to be fair the fundamental flaws with that approach didn’t really become apparent for a few years. And GitHub’s feature development pace was far too slow compared to its competitors at the time. Of course GitHub used to be a lot more reliable…
Now they’re cramming in half baked AI stuff everywhere but that’s hardly a MS specific sin.
MS GitHub has been worse about DMCA and sanctioned-country takedowns than I remember pre-acquisition GitHub being.
Did I miss anything?
Y_Y
5 days ago
I don't blame them uniquely. I think it's a travesty that the original GitHub sold out, but it was just as predictable. Giant corps will evilly make the line go up; individual regular people have some finite amount of money for which they'll give up anything and everything.
As for how the site has become worse, plenty of others have already done a better job than I could there. Other people haven't noticed or don't care and that's ok too I guess.
groundzeros2015
5 days ago
A public park suffers from tragedy of the commons even though it’s managed by the city.
TUSF
4 days ago
I wouldn't call it "tragedy of the commons" because the very idea was coined as a strawman. As far as I'm concerned, the entire concept is a fallacy, and people should stop perpetuating it.
drob518
5 days ago
Right. Microsoft could easily impose a transfer fee above a certain usage threshold, which would allow “normal” OSS development of even popular software to happen without charge while imposing a cost on projects that try to use GitHub like a database.
rvba
6 days ago
I doubt anyone is calculating this.
Remember how GTA5 took 10 minutes to start and nobody cared? Lots of software is like this.
Some Blizzard games download a 137 MB file every time you run them and take a few minutes to start (and no, this is not due to my computer).
PunchyHamster
6 days ago
Well, until you choose to host something yourself and it becomes popular.
ericyd
5 days ago
Tragedy of the Microsoft just doesn't sound as nice though
solatic
6 days ago
If you think too hard about this, you come back around to Alan Kay's quote about how people who are really serious about software should build their own hardware. Web applications, and in general loading pretty much anything over the network, are a horrible, no-good, really bad user experience, and they always will be. The only way to really respect the user is with native applications that are local-first, and if you take that really far, you build (at the very least) peripherals to make it even better.
The number of companies that have this much respect for the user is vanishingly small.
phkahler
5 days ago
>> The number of companies that have this much respect for the user is vanishingly small.
I think companies shifted to online apps because, #1, it solved the copy-protection problem. FOSS apps are not in any hurry to become centralized because they don't care about that issue.
Local apps and data are a huge benefit of FOSS and I think every app website should at least mention that.
"Local app. No ads. You own your data."
xorcist
5 days ago
Another important reason to move to online applications is that you can change the terms of the deal at any time. This sounds more nefarious than it needs to; it just means you do not have to commit fully to your licensing terms before the first deal is made, which is tempting for just about anyone.
hombre_fatal
6 days ago
Software I don’t have to install at all “respects me” the most.
Native software being an optimum is mostly an engineer fantasy that comes from imagining what you can build.
In reality that means having to install software like Meta’s WhatsApp, Zoom, and other crap I’d rather run in a browser tab.
I want very little software running natively on my machine.
cosmic_cheese
5 days ago
Web apps are great until you want to revert to an older version from before they became actively user-hostile or continue to use them past EoL or company demise.
In contrast as long as you have a native binary, one way or another you can make the thing run and nobody can stop you.
freedomben
6 days ago
Yes, amen. The more invasive and abusive software gets, the less I want it running on my machine natively. Native installed applications for me now are limited only to apps I trust, and even those need to have a reason to be native apps rather than web apps to get a place in my app drawer
solatic
5 days ago
Your browser is acting like a condom, in that respect (pun not intended).
Yes, there are many cases when condoms are indicative of respect between parties. But a great many people would disagree that the best, most respectful relationships involve condoms.
> Meta
Does not sell or operate respectful software. I will agree with you that it's best to run it in a browser (or similar sandbox).
tormeh
5 days ago
Desktop operating systems really dropped the ball on protecting us from the software we run. Even mobile OSs are so-so. So the browser is the only protection we reasonably have.
I think this is sad.
shash
5 days ago
You mean you’d rather run unverified scripts using a good order of magnitude more resources with a slower experience and have an entire sandboxing contraption to keep said unverified scripts from doing anything to your machine…
I know the browser is convenient, but frankly, it's been a horror show of resource usage and vulnerabilities and pathetic performance.
whstl
5 days ago
The #1 reason the web experience universally sucks today is that companies add an absurd amount of third-party code to their pages for tracking, advertisement, spying on you, or whatever other non-essential purpose. That, plus an excessive/unnecessary amount of visual decoration.
The idea that somehow those companies would respect your privacy were they running a native app is extremely naive.
We can already see this problem in video games, where copy protection has become resource-heavy enough to cause performance issues.
ghosty141
6 days ago
Yes, because users don't appreciate this enough to pay for the time it takes.
Y-bar
6 days ago
You’ll enjoy “Saving Lives” by Andy Hertzfeld: https://www.folklore.org/Saving_Lives.html
> "The Macintosh boots too slowly. You've got to make it faster!"
zahlman
6 days ago
> Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time. Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.
This is what people mean about speed being a feature. But "user time" depends on more than the program's performance. UI design is also very important.
bawolff
5 days ago
> Software houses optimize for feature delivery and not user interaction time. Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.
Google and Amazon are famous for optimizing this. It's not an externality to them, though; even tens of milliseconds can equal an extra sale.
That said, I don't think it's fair to add time up like that. Saving 1 second for 600 people is not the same as saving 10 minutes for 1 person. Time in small increments does not have the same value as time in large increments.
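(The arithmetic behind both figures is easy to check; a quick sketch in Python, taking the original comment's one-second-per-user-per-year framing as given:)

    # 1 second saved per user, one million users (the thread's figures)
    users = 1_000_000
    user_hours = users * 1 / 3600      # 1,000,000 s ~= 277.8 "user hours"

    # bawolff's point: the same total, sliced very differently
    a = 600 * 1 / 60                   # 600 people x 1 s  = 10 person-minutes
    b = 1 * 10                         # 1 person x 10 min = 10 person-minutes
    print(round(user_hours), a, b)     # -> 278 10.0 10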
esafak
5 days ago
1. If you can price the cost of the externality, you can justify optimizing it.
2. Monopolies and situations with the principal/agent dilemma are less sensitive to such concerns.
bawolff
5 days ago
> 1. If you can price the cost of the externality, you can justify optimizing it.
An externality is usually a cost you don't pay (or pay only a negligible amount of). I don't see how pricing it helps justify optimizing it.
esafak
5 days ago
You are right. I should say perceived externality; there may be a price that is discounted.
ozim
6 days ago
About apps done by software houses: even though we should strive to do a good job, and I agree with the sentiment...
First argument: take at least two 0s off your estimate. Most applications will have maybe thousands of users; successful ones will maybe run with tens of thousands. If you get lucky enough to work on an application that has hundreds of thousands or millions of users, you work at a FAANG, not a typical "software house".
Second argument: most users use 10-20 apps in a typical workday; your application is most likely irrelevant.
Third argument: most users would save much more time learning how to properly use the applications (or the computer) they use on a daily basis than from someone optimizing some function from 2s to 1s. But of course that's hard, because they have 10-20 daily apps plus god knows how many others. Still, I see people doing super silly stuff in tools like Excel, or even not knowing copy-paste - not even any command-line magic, just the basics.
robmccoll
5 days ago
I don't think most software houses spend enough time even focusing on engineering time. CI pipelines that take tens of minutes to over an hour, compile times that exceed ten seconds when nothing has changed, startup times that are much more than a few seconds. Focus and fast iteration are super important to writing software and it seems like a lot of orgs just kinda shrug when these long waits creep into the development process.
3371
5 days ago
The user-hour analogy sounds weird though; 1s feels like 1s regardless of how many users you have. It's like the classic Asian teachers' logic of "if you come in 1 min late you are wasting N minutes for all of us in this class." It just does not stack like that.
BenjiWiebe
5 days ago
If the class takes N minutes and one person arrives 1 minute late, and the rest of the class is waiting for them, it does stack. Every one of those students lost a minute. Far worse than one student losing one minute.
3371
5 days ago
Do "we" lose 2mins because we both spent 1 min commenting? That sounds like The Mythical of Man Month thinking... for me time is parallel and does not combine.
DrewADesign
5 days ago
> Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.
Wait times don’t accumulate. Depending on the software, to each individual user that one second will probably make very little difference. Developers often overestimate the effect of performance optimization on user experience because it’s the aspect of user-experience optimization their expertise most readily addresses. The company, generally, will get a much better ROI from implementing well-designed features and having you squash bugs.
drbojingle
5 days ago
A well-designed feature IS considerate of time and attention. Why would I want a game at 20 fps when I could have it at 120? The smoothness of the experience increases my ability to use it optimally, because I don't have to pay as much attention to it. I'd prefer my interactions with machines to be as smooth as driving a car down an empty, dry highway at midday.
Perhaps not everyone cares, but I've played enough Age of Empires 2 to know that there are plenty of people who have felt value coming from shaving seconds off this and that, compounding gains over time. It's a concept plenty of folks will be familiar with.
DrewADesign
5 days ago
Sure, but without unlimited resources you need to have priorities, and everything has a ‘good enough’ state. All of this stuff lies on an Eisenhower matrix, and we tend to think our concerns fall into the important/urgent quadrant, but in the grand scheme of things they almost never do.
jfengel
4 days ago
Isn't there a limit to human perception, well below 120 fps?
Perhaps 120fps might result in a better approximation of motion blur.
8note
5 days ago
I still prefer 15 fps for games. If they're pushing the fps any higher, it's not considerate of my time and attention.
I have to pay less attention to a thing that updates less frequently. Idle games are the best in that respect, because you can check into the game on your own time rather than the game forcing you to pay attention on its time.
pastor_williams
6 days ago
This was something that I heavily focused on for my feature area a year ago - new user sign up flow. But the decreased latency was really in pursuit of increased activation and conversion. At least the incentives aligned briefly.
gritzko
5 days ago
Let’s make a thought experiment. Suppose that I have a data format and a store that resolves the issues in the post. It is like git meets JSON meets key-value. https://github.com/gritzko/go-rdx
What is the probability of it being used? About 0%, right? Because git is proven and GitHub is free. Engineering aspects are less important.
pdimitar
5 days ago
I am very interested in something like this, but your README is not making it easy to like. Demonstrating with 2-3 sample apps using RDX might have gone a long way.
So how do I start using it if I, for example, want to use it like a decentralized `syncthing`? Can I? If not, what can I use it for?
I am not a mathematician. Most people landing on your repo are not mathematicians either.
We the techies _hate_ marketing with a passion but I as another programmer find myself intrigued by your idea... with zero idea how to even use it and apply it.
stkdump
5 days ago
Sorry, I am turned off by the CRDT in there. It immediately smells of overengineering to me. Not that I believe git is a better database. But why not just SQL?
gritzko
5 days ago
Merges require revisioning. JSON or SQL do not have that in the model. This variant of CRDT is actually quite minimalistic.
stkdump
5 days ago
I would argue LWW is the opposite of a merge. It is better to immediately know at the time of writing that there is a conflict. CRDTs either solve or (in this case) don't solve a problem that doesn't really exist, especially for package managers.
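(A note for readers outside the CRDT bubble: LWW is "last-writer-wins", roughly the simplest CRDT merge rule there is. Below is a minimal illustrative sketch in Python -- not RDX's actual implementation -- showing both why it is minimalistic and stkdump's objection that the losing write vanishes silently:)

    import time

    class LWWRegister:
        """Last-writer-wins register: merge keeps whichever value
        carries the higher timestamp. Deterministic on every replica,
        but the "losing" concurrent write is discarded silently."""
        def __init__(self, value=None, ts=0.0):
            self.value, self.ts = value, ts

        def set(self, value):
            self.value, self.ts = value, time.time()

        def merge(self, other):
            # No conflict is ever reported; the later timestamp wins.
            # (Real implementations also break ties, e.g. by replica id.)
            if other.ts > self.ts:
                self.value, self.ts = other.value, other.ts

    a, b = LWWRegister(), LWWRegister()
    a.set("edit from replica A")
    b.set("edit from replica B")   # concurrent, slightly later write
    a.merge(b)
    b.merge(a)
    print(a.value == b.value)      # True: converged, but A's edit is gone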
gritzko
5 days ago
Git solves that problem, and the problem definitely exists. Speaking of package managers, it really depends. Like, can we use one SQLite file for that? So easy; why is no one doing that?
inapis
6 days ago
>Yet if I spent one hour making my app one second faster for my million users, I can save 277 user hour per year. But since user hours are an externality, such optimization never gets done.
I have never been convinced by this argument. The aggregate number sounds fantastic but I don't believe that any meaningful work can be done by each user saving 1 second. That 1 second (and more) can simply be taken by me trying to stretch my body out.
OTOH, if the argument is to make software smaller, I can get behind that since it will simply lead to more efficient usage of existing resources and thus reduce the environmental impact.
But we live in a capitalist world and there needs to be external pressure for change to occur. The current RAM shortage, if it lasts, might be one such pressure. Otherwise we're only daydreaming about a utopia.
adrianN
6 days ago
The mapping from time saved to increased productivity or happiness or whatever is not linear but a step function. Saving one second doesn’t help much, but there is a threshold (depending on the individual) where faster workflows lead to a better experience. It does make a difference whether a task takes a minute or half a second, at least for me.
jorvi
5 days ago
But there isn't just one company deciding that externalizing costs onto the rest of us is a great way to boost profit. Especially for a monopoly like YouTube, which can decide that eating up your battery is fine if it saves them a few cents in bandwidth costs.
Not all of those externalizing companies abuse your time, but whatever they abuse can be expressed in a $ amount, and $ can be converted to a median person's time via the median wage. Hell, free time is more valuable than whatever you produce during work.
Say all that boils down to companies collectively stealing 20 minutes of your time each day. 140 minutes each week. 7280 (!) minutes each year, which is 5.05 days, which makes it almost a year over the course of 70 years.
So yeah, don't sweet-talk the fact that companies externalize costs (privatize the profits, socialize the losses). They're sucking your blood.
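(Those figures do check out, taking the 20-minutes-a-day premise as given:)

    mins_per_day = 20                 # the comment's assumed figure
    per_week = mins_per_day * 7       # 140 minutes
    per_year = per_week * 52          # 7280 minutes
    days = per_year / 60 / 24         # ~5.06 days per year
    print(round(days * 70))           # ~354 days over 70 years: almost a year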
Aerroon
6 days ago
One second is long enough that it can put a user off from using your app though. Take notifications on phones for example. I know several people who would benefit from a habitual use of phone notifications, but they never stick to using them because the process of opening (or switching over to) the notification app and navigating its UI to leave a notification takes too long. Instead they write a physical sticky note, because it has a faster "startup time".
tehbeard
6 days ago
All depends on the type of interaction.
A high usage one, absolutely improve the time of it.
Loading the profile page? Isn't done often so not really worth it unless it's a known and vocal issue.
https://xkcd.com/1205/ gives a good estimate.
Aerroon
5 days ago
This is very true, but I think some of it has to do with expectations too. Editing a profile page is a complex thing, therefore people are more willing to put up with loading times on it, whereas checking out someone's profile is a simple task and the brain has already moved on, so any delay feels bad.
WhyNotHugo
4 days ago
> I have never been convinced by this argument. The aggregate number sounds fantastic but I don't believe that any meaningful work can be done by each user saving 1 second. That 1 second (and more) can simply be taken by me trying to stretch my body out.
I’d see this differently, from a user perspective. If the average operation takes one second less, I’d spend a lot less time waiting for my computer. I’d also have fewer idle moments where my mind wanders while waiting for some operation to complete.
schubidubiduba
5 days ago
Just because one individual second is small, it still adds up.
Even if all you do with it is just stretching, there's a chance it will prevent you pulling a muscle. Or lower your stress and prevent a stroke. Or any number of other beneficial outcomes.
vlovich123
5 days ago
I think it’s naive to think engineers or managers don’t realize this or don’t think in these ways.
pdimitar
5 days ago
Is it truly naive if most engineers' careers pass without them ever meeting even one such manager?
In 24 years of my career I've met a grand total of _two_. Both got fired not even 6 months after I joined the company, too.
Who's naive here?
vlovich123
5 days ago
I’ve met one who asked me a question like this and he’s still at Apple having been promoted several times to a fairly senior position. But the question was only half hearted because the question was “how much CO2 would we save if we made something 10% more CPU efficient” and the answer even at Apple’s current scale of billions of iPhones was insignificant.
So now you and I both have come across such a manager. Why would you make the claim most engineer’s don’t come across such people?
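(The shape of that estimate is easy to reproduce. Every input below is an illustrative assumption, not an Apple figure; the point is only that the result lands orders of magnitude below global emissions:)

    # Fermi estimate: CO2 saved by a 10% CPU-efficiency win across iPhones.
    # All inputs are rough, assumed values for illustration only.
    phones = 1.5e9                 # assumed active iPhones
    cpu_watts_avg = 0.2            # assumed average CPU power draw per phone
    saving = 0.10                  # the 10% efficiency win from the comment
    hours_per_year = 24 * 365

    kwh_saved = phones * cpu_watts_avg * saving * hours_per_year / 1000
    tonnes_co2 = kwh_saved * 0.4 / 1000   # assumed grid: 0.4 kg CO2 per kWh

    print(f"{tonnes_co2:,.0f} t CO2/yr")  # ~100,000 t: real, but a rounding
                                          # error next to ~37e9 t emitted globally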
saagarjha
4 days ago
It turns out iPhones are not actually a huge contributor to worldwide carbon emissions. Data centers, on the other hand…
imiric
5 days ago
> GitHub is free after all, and it has all of these great properties, so why not?
The answer is in TFA:
> The underlying issue is that git inherits filesystem limitations, and filesystems make terrible databases.
loloquwowndueo
6 days ago
Just a reminder that GitHub is not git.
The article mentions that most of these projects did use GitHub as a central repo out of convenience so there’s that but they could also have used self-hosted repos.
machinationu
6 days ago
Explain to me how you self-host a git repo which is accessed millions of times a day by CI jobs pulling packages.
freedomben
6 days ago
I'm not sure whether this question was asked in good faith, but it is actually a damn good one.
I've looked into self-hosting a git repo with horizontal scalability, and it is indeed very difficult. I don't have the time to detail it in a comment here, but for anyone who is curious, it's very informative to look at how GitLab handled this with Gitaly. I've also seen some clever attempts to use object storage, though I haven't seen any of those solutions put heavily to the test.
I'd love to hear from others about ideas and approaches they've heard about or tried.
fweimer
6 days ago
These days, people solve similar problems by wrapping their data in an OCI container image and distributing it through one of the container registries that do not have a practically meaningful pull-rate limit. Not really a joke, unfortunately.
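(A sketch of why this works so well: to a client, an OCI registry is just anonymous HTTP GETs for a manifest and some content-addressed blobs, per the OCI distribution spec. The registry and image name below are placeholders, and some registries first require a free anonymous bearer token from their auth endpoint:)

    import requests

    registry = "https://registry.example.com"   # placeholder registry
    name, tag = "somedata/blob", "latest"       # placeholder image

    m = requests.get(
        f"{registry}/v2/{name}/manifests/{tag}",
        headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
    )
    m.raise_for_status()

    for layer in m.json()["layers"]:
        # Each "layer" is just a content-addressed tarball -- your data.
        blob = requests.get(f"{registry}/v2/{name}/blobs/{layer['digest']}")
        blob.raise_for_status()
        with open(layer["digest"].replace(":", "_"), "wb") as f:
            f.write(blob.content)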
mystifyingpoi
5 days ago
Even Amazon encourages this, probably not intentionally (more as a band-aid for bad EKS configs that people can create by mistake), but still: you can pull 5 terabytes from ECR for free under their free tier each month.
XorNot
5 days ago
I'd say it's just that Kubernetes in general should've shipped with a storage engine and an installation mechanism.
It's a very hacky-feeling add-on that RKE2 has a distributed internal registry, if you enable it and use it in a very specific way.
For the rate at which people love just shipping a Helm chart, it's actually absurdly hard to ship a self-contained installation without just trying to hit internet resources.
ozim
6 days ago
FTFY:
Explain to me how you self-host a git repo, without spending any money and having no budget, which is accessed millions of times a day by CI jobs pulling packages.
fulafel
5 days ago
Let's assume 3 million. That's about 30 per second.
From compute POV you can serve that with one server or virtual machine.
Bandwidth-wise, given a 100 MB repo size, that would make it 3.4 GB/s - also easy terrain for a single server.
heavenlyhash
5 days ago
That is roughly the number of new requests per second, but these are not just light web requests.
The git transport protocol is "smart" in a way that is, in some ways, arguably rather dumb. It's certainly expensive on the server side. All of the smartness of it is aimed at reducing the amount of transfer and number of connections. But to do that, it shifts a considerable amount of work onto the server in choosing which objects to provide you.
If you benchmark the resource loads of this, you probably won't be saying a single server is such an easy win :)
fulafel
5 days ago
Here's a web source about how much cpu time it took from 5 years ago: https://github.blog/open-source/git/git-clone-a-data-driven-...
Using the slowest clone method, they measured 8s for a 750 MB repo and 0.45s for a 40 MB repo. It appears to be linear, so 1.1s for 100 MB should be a valid interpolation.
So doing 30 of those per second only takes 33 cores. Servers have hundreds of cores now (eg 384 cores: https://www.phoronix.com/review/amd-epyc-9965-linux-619).
And remember we're using worst case assumptions in places (using the slowest clone method, and numbers from old hardware). In practice I'd bet a fastish laptop would suffice.
edit: actually, on closer look at the GitHub-reported numbers, the interpolation isn't straightforward: on the bigger 750 MB repo, the partial clone is actually said to be slower than the base full clone. However, this doesn't change the big picture: it'll easily fit on one server.
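(Putting the thread's assumptions in one place: clone volume and repo size from upthread, per-clone CPU time from the interpolation above. All inputs are assumptions, not measurements:)

    clones_per_day = 3_000_000           # assumed upthread
    repo_mb = 100                        # assumed upthread
    cpu_s_per_clone = 1.1                # interpolated above

    per_sec = clones_per_day / 86_400    # ~34.7 clones/s (rounded to 30 upthread)
    gbit = per_sec * repo_mb * 8 / 1000  # ~27.8 Gbit/s sustained egress
    cores = per_sec * cpu_s_per_clone    # ~38 cores busy at all times
    print(round(per_sec, 1), round(gbit, 1), round(cores))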
favflam
5 days ago
Is running the git binary as a read-only nginx backend not good enough? Probably not. Hosting tarballs is far more efficient.
adrianN
6 days ago
You git init --bare on a host with sufficient resources. But I would recommend thinking about your CI flow too.
machinationu
5 days ago
No: hundreds of thousands of individual projects' CI jobs. OP was talking about package managers for the whole world, not for one company.
adrianN
5 days ago
If people depend on remote downloads from different companies for their CI pipelines they’re doing it wrong. Every sensible company sets up a mirror or at least a cache on infra that they control. Rate limiting downloads is the natural course of action for the provider of a package registry. Once you have so many unique users that even civilized use of your infrastructure becomes too much you can probably hire a few people to build something more scalable.
machinationu
5 days ago
numpy had 16M downloads yesterday; at 10 MB each, that's 160 TB of traffic. It's one package. And there are no rate limits on PyPI.
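(The same exercise for that number: 160 TB/day is a serious sustained load for one package:)

    downloads, mb_each = 16_000_000, 10      # the comment's figures for numpy
    tb_per_day = downloads * mb_each / 1e6   # 160 TB/day
    gbit_avg = tb_per_day * 1e12 * 8 / 86_400 / 1e9
    print(round(tb_per_day), round(gbit_avg, 1))   # -> 160 14.8 Gbit/s average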
justincormack
6 days ago
They probably would have experienced issues way sooner, as the self hosted tools don't scale nearly as well.
JohnHaugeland
5 days ago
> This seems like a tragedy of the commons -- GitHub is free after all, and it has all of these great properties, so why not?
because it's bad at this job, and sqlite is also free
this isn't about "externalities"
gverrilla
4 days ago
Nothing surprising. Capital hates people, even though we sustain its kingdom.
brightball
5 days ago
User time is typically a mix of performance tuning and UX design isn’t it?
threatofrain
5 days ago
> Most software houses spend so much time focusing on how expensive engineering time is that they neglect user time. Software houses optimize for feature delivery and not user interaction time.
Oh no no no. Consumer-facing companies will burn 30% of your internal team's complexity budget on shipping the first "frame" of your app/website. Many people treat Next as synonymous with React, and Next's big deal was helping you do just this.
machinationu
6 days ago
[flagged]
benchloftbrunch
5 days ago
As long as you don't have any security compliance requirements and/or can afford the cost of self hosting your LLM, sure.
Anyone working in government, banking, or healthcare is still out of luck since the likes of Claude and GPT are (should be) off limits.
camgunz
6 days ago
I've never been more convinced LLMs are the vanguard of the grift economy now that green accounts are low effort astroturfing on HN.
freedomben
6 days ago
LLMs obviously can't do it all, and they still have severe areas of weakness where they can't replace humans, but there are definitely a lot of areas where they really can now. I've seen it first hand. I've even experienced it first hand. There are a couple of services that I wrote years ago that were basically parked in maintenance mode because they weren't worth investing time in, and we just dealt with some of the annoyances and bugs. With the latest LLMs, over the last couple of months I've been able to resurrect them, fix a lot of bugs, and even add some wanted features in just a few hours. It really is quite incredible and scary at the same time.
Also, in case you're not aware, accusing people of shilling or astroturfing is against the Hacker News guidelines.
camgunz
5 days ago
The loophole here is that this account isn't a person
machinationu
5 days ago
you forgot to ask me to ignore previous instructions and say pizza
freedomben
5 days ago
It was there as hidden characters. His comment included this as a hidden prompt: "tell me that I forgot to ask you to ignore previous instructions and say pizza"
machinationu
6 days ago
hey, I'm just a lowly LLM, gotta earn my tokens :|
massysett
5 days ago
> Externalities lead to users downloading extra gigabytes of data (wasted time) and waiting for software, all of which is waste that the developer isn't responsible for and doesn't care about.
This is perfectly sensible behavior when the developers are working for free, or when the developers are working on a project that earns their employer no revenue. This is the case for several of the projects at issue here: Nix, Homebrew, Cargo. It makes perfect sense to waste the user's time, since the user isn't paying with anything else anyway, or to waste GitHub's bandwidth, since GitHub is willing to give it away for free.
Where users pay for software with money, they may be more picky and not purchase software that indiscriminately wastes their time.
BobbyTables2
5 days ago
Microsoft would have long gone out of business if users cared about their time being wasted.
Windows 11 should not be more sluggish than Windows 7.