ksec
7 hours ago
All these improvements in a single year, 2025. While this may seem obvious to those who follow the AI / LLM news, it may be worth pointing out again that ChatGPT was introduced to us in November 2022.
I still don't believe AGI, ASI, or whatever AI will overtake humans in a short period of time, say 10 - 20 years. But it is hard to argue against the value of current AI, which is what many of the vocal critics on HN seem intent on doing. People are willing to pay $200 per month, and it is getting a $1B runway already.
Being more of a hardware person, the most interesting part to me is the funding of all this latest hardware development. I know this is another topic HN hates because of the DRAM and NAND pricing issue, but it is exciting to see it from a long-term view, where the current prices are short-term pain. Right now the industry is asking: we collectively have over a trillion dollars to spend on capex over the next few years, and will borrow more if we need to, so when can you ship us 16A / 14A / 10A and 8A or 5A, LPDDR6, higher-capacity DRAM at lower power, better packaging, higher-speed PCIe, or a jump to optical interconnect? Every single part of the hardware stack is being infused with money and demand. The last time we had this was the Post-PC / smartphone era, which drove the hardware industry forward for 10 - 15 years. The current AI wave can push hardware for at least another 5 - 6 years while pulling forward tech that was initially 8 - 10 years away.
I so wish I had bought some Nvidia stock. Then again, I guess no one knew AI would be as big as it is today, and it has only just started.
wpietri
an hour ago
This is not a great argument:
> But it is hard to argue against the value of current AI [...] it is getting a $1B runway already.
The psychic services industry makes over $2 billion a year in the US [1], with about a quarter of the population being actual believers [2].
[1] https://www.ibisworld.com/united-states/industry/psychic-ser...
[2] https://news.gallup.com/poll/692738/paranormal-phenomena-met...
jillesvangurp
5 hours ago
2025 was the year of AI agents that use development tools. I think attention will now shift to AI agents that use non-development tools. Most business users are still stuck using ChatGPT as some kind of grand oracle that will write their email or PowerPoint slides. There are bits and pieces of mostly technology-demo-level solutions, but nothing that is as widely used as AI coding tools are so far. I don't think this is bottlenecked on model quality.
I don't need an AGI. I do need a secretary-type agent that deals with all the simple yet laborious non-technical tasks that keep infringing on my quality engineering time. I'm CTO of a small startup, and the amount of non-technical bullshit that I need to deal with is enormous. Some examples of the random crap I deal with: figuring out contracts, their meaning/implications for a given situation, and deciding on a course of action; customer offers, price calculations, scraping invoices from emails and online SaaS accounts, formulating detailed replies to customer requests, HR legal work, corporate bureaucracy, financial planning, etc.
A lot of this can be AI-assisted (and we get a lot of value out of AI tools for it), but context engineering takes up a non-trivial amount of my time. Also, most tools are completely useless at modifying structured documents. Refactoring a big code base: no problem. Adding structured text to an existing structured document: hardest thing ever. The state of the art here is an ff-ing sidebar that suggests Markdown-formatted text for you to copy/paste. Tool quality is very primitive. And then you find yourself just stripping all the formatting and reformatting it manually, because the tools really suck at this.
utopiah
6 hours ago
> All these improvements in a single year
> hard to argue against the value of current AI
> People are willing to pay $200 per month, and it is getting a $1B runway already.
Those are three different things. There can be a LOT of fast and significant improvement while still remaining extremely far from the actual goal; so far it looks like little actual progress.
People pay for a lot of things, including snake oil, so convincing a lot of people to pay a bit is not in itself proof of value, especially when some people are basically coerced into it: see how many companies changed their "strategy" to mandating AI usage internally, or how it gets integrated for a captive audience, e.g. Copilot.
Finally, yes, $1B is a LOT of money for you and me... but for the largest corporations it's actually not that much. For reference, Google earned that in revenue... per day in 2023. Anyway, that's still a big number, BUT it has to be compared with how much OpenAI burns. I don't have any public number on that, but I believe the consensus is that it's a lot. So until we know that number we can't talk about an actual runway.
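For a rough sense of scale, here is a back-of-envelope sketch of that comparison; the only input is Alphabet's reported 2023 revenue of roughly $307B, and the $1B is simply the round number from the parent comment:

```python
# Back-of-envelope: how big is $1B next to Alphabet's 2023 revenue?
alphabet_2023_revenue = 307e9          # Alphabet's reported 2023 revenue, ~$307B
per_day = alphabet_2023_revenue / 365  # revenue per calendar day

print(f"Alphabet 2023 revenue per day: ~${per_day / 1e9:.2f}B")  # ~$0.84B
print(f"$1B is ~{1e9 / per_day:.1f} days of that revenue")       # ~1.2 days
```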
pjc50
5 hours ago
Investing a trillion dollars for a revenue of a billion dollars doesn't sound great yet.
steveBK123
15 minutes ago
Indeed, it's the old Uber playbook at nearly two extra orders of magnitude.
It is a large enough number that it could simply run out of private capital to consume before it turns cash-flow positive.
Lots of things sell well if sold at such a loss. I’d take a new Ferrari for $2500 if it was on offer.
ACCount37
4 hours ago
Is the AI progress in 2025 an outstanding breakthrough? Not really. It's impressive but incremental.
Still, the gap between the capabilities of a cutting-edge LLM and those of a human is only so wide. There are only so many increments it takes to cross it.
chias
6 hours ago
Not all of these are improvements. Among those listed:
* The year of YOLO and the Normalization of Deviance
* The year that Llama lost its way
* The year of alarmingly AI-enabled browsers
* The year of the lethal trifecta
* The year of slop
* The year that data centers got extremely unpopular
steveBK123
10 minutes ago
> * The year that data centers got extremely unpopular
I was discussing the political angle with a friend recently. I think the Big Tech Bro / VC complex has done itself a big disservice by aligning so tightly with MAGA, to the point that AI will be a political issue in 2026 & 2028.
Think about the message they’ve inadvertently created for themselves: AI is going to replace jobs, it’s pushing electricity prices up, and we need the government to bail us out AND give us a regulatory light touch.
Super easy campaign for Dems: big tech Trumpers are taking your money and your jobs, causing inflation, and now they want bailouts!
coffeebeqn
6 hours ago
Seems like Nvidia will be focusing on the super beefy GPUs and leaving the consumer market to a smaller player.
Flow
5 hours ago
I don't get why Nvidia can't do both. Is it because of the limited production capacity of the factories?
ACCount37
4 hours ago
Yes. If you're bottlenecked on silicon and secondary components like memory, why would you put more of those resources into lower-margin consumer products when you could use those very resources to make and sell more high-margin AI accelerators instead?
From a business standpoint it makes some sense to throttle the gaming supply somewhat. Not to the point of surrendering the market to someone else, probably, but to a measurable degree.
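A toy illustration of that opportunity cost; every number below is invented for the sake of the arithmetic and is not an Nvidia or TSMC figure:

```python
# Toy opportunity-cost calc: revenue from one hypothetical wafer
# cut into consumer GPUs vs. datacenter accelerators.
# All numbers are invented for illustration only.

dies_per_wafer_consumer = 120     # hypothetical: smaller consumer dies
dies_per_wafer_dc       = 60      # hypothetical: larger datacenter dies
asp_consumer            = 600     # hypothetical consumer GPU price, USD
asp_dc                  = 25_000  # hypothetical accelerator price, USD

revenue_consumer = dies_per_wafer_consumer * asp_consumer  # $72,000 per wafer
revenue_dc       = dies_per_wafer_dc * asp_dc              # $1,500,000 per wafer

print(f"Consumer GPUs:   ${revenue_consumer:,} per wafer")
print(f"AI accelerators: ${revenue_dc:,} per wafer")
print(f"Ratio: ~{revenue_dc / revenue_consumer:.0f}x")     # ~21x with these made-up numbers
```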
ksec
2 hours ago
We will have to wait and see, but my bet is that Nvidia will move to the leading-edge N2 node earlier now that they have the margin to work with; both Hopper and Blackwell came too late in the design cycle for that. The AI hype means customers will continue to buy the latest and greatest, leaving gaming at a mainstream node.
Nvidia using a mainstream node has always been the norm, considering most fab capacity goes to mobile SoCs first. But I expect the internet / gamers will be angry anyway because Nvidia is not providing them with the latest and greatest.
In reality, the extra R&D cost of designing for the leading edge will be amortised across all the AI orders, which gives Nvidia a competitive advantage at the consumer level when they compete. That is assuming there is competition, because the most recent data shows Nvidia owning 90%+ of the discrete GPU market, 9% for AMD, and 1% for Intel.
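A minimal sketch of that amortisation argument; the design cost and unit volumes are hypothetical, not Nvidia figures:

```python
# Toy amortisation: a fixed leading-edge design cost spread over different unit volumes.
# All figures are hypothetical, for illustration only.

design_cost = 2_000_000_000  # hypothetical leading-edge chip design cost, USD

for units, label in [(5_000_000, "consumer volume alone"),
                     (25_000_000, "consumer + AI volume")]:
    print(f"{label}: ${design_cost / units:,.0f} of design cost per chip")

# With these made-up numbers the per-chip R&D burden drops from $400 to $80
# once the AI orders share the same design investment.
```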
_s
6 hours ago
AMD already owns a lot of the consumer market: handhelds, consoles, desktop rigs, and mobile... they are not a small player.
utopiah
6 hours ago
They said "smaller" not small.
tstrimple
4 hours ago
Literally the only thing I've encountered regarding LLMs and AGI is morons stating that LLMs will never become AGI. I literally have no idea where the AGI arguments are coming from. No one I've ever worked with who uses LLMs is talking about AGI. It's just a fucking distraction from the actually usable tools we have right now. Is there anything except a strawman for LLM AGI?
cherryteastain
4 hours ago
Sam Altman [1] certainly seems to talk about AGI quite a bit.
ACCount37
4 hours ago
Honestly, I wouldn't be surprised if a system that's an LLM at its core could attain AGI with nothing but incremental advances in architecture, scaffolding, training, and raw scale.
Mostly the training. I put less and less weight on "LLMs are fundamentally flawed" and more and more on "you're training them wrong". Too many "fundamental limitations" of LLMs are ones you can move the needle on with better training alone.
The LLM foundation is flexible and capable, and the list of "capabilities that are exclusive to the human mind" is ever shrinking.