Blammar
9 months ago
I'm one of the designers of the GeForce 256.
The hardware transform and lighting was an enormous step forward, and no other manufacturer had that functionality in a single chip. Yes, it took a while before game developers learned to use the hardware well. We supplied the cart; it was up to them to get the horse attached and working...
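For anyone who never touched the fixed-function era, here is a minimal sketch of what "using the hardware" meant (illustrative only, not our demo code; it assumes GLUT for the window and context): you declared matrices and lights through the fixed-function OpenGL pipeline, and the T&L unit did the per-vertex transform and lighting math that the CPU otherwise had to do for every vertex.

    /* Illustrative fixed-function T&L sketch (hypothetical, not NVIDIA code).
       Build: cc tnl.c -lGL -lGLU -lglut */
    #include <GL/glut.h>

    static void display(void) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        gluLookAt(0, 0, 5,  0, 0, 0,  0, 1, 0);  /* view transform: runs in HW */
        glRotatef(30.0f, 0.0f, 1.0f, 0.0f);      /* model transform: runs in HW */

        glutSolidTeapot(1.0);                    /* per-vertex lighting: runs in HW */
        glutSwapBuffers();
    }

    int main(int argc, char **argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutCreateWindow("fixed-function T&L");

        glEnable(GL_DEPTH_TEST);
        glEnable(GL_LIGHTING);                   /* fixed-function lighting on */
        glEnable(GL_LIGHT0);
        {
            GLfloat dir[] = { 1.0f, 1.0f, 1.0f, 0.0f };  /* w=0: directional light */
            glLightfv(GL_LIGHT0, GL_POSITION, dir);
        }

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        gluPerspective(60.0, 1.0, 0.1, 100.0);

        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }

Note what's absent: there is no per-vertex loop on the CPU. The matrices and light state are the entire interface; the chip does the rest.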
I'm not going to argue the meaning of "GPU" with the other posters. Suffice it to say our intent was to implement the entire graphics pipeline in hardware, allowing the graphics work to be almost completely offloaded from the CPU.
We demonstrated the GeForce 256 to SGI engineers and showed that we could run their OpenGL demos at roughly the same speed they ran on their Onyx systems, which cost about 100 times as much.
The linked Nvidia article, to be honest, is marketing fluff. It took several years before we figured out how to turn a GPU into a usable parallel computation engine; in the meantime we had enough effective programmability that people hacked up D3D and OpenGL programs to do some interesting work.
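To give a flavor of those hacks, here is a hypothetical sketch (nothing we shipped) of the classic pre-compute-API trick: encode data as colors, let additive framebuffer blending do the arithmetic, and read the result back with glReadPixels. The 8-bit-per-channel precision is part of why these stunts stayed a curiosity until real compute APIs arrived.

    /* Hypothetical GPGPU-by-blending sketch: sum an array on the GPU by
       drawing every value onto the same pixel with additive blending.
       Build: cc blendsum.c -lGL -lglut */
    #include <GL/glut.h>
    #include <stdio.h>

    #define N 16
    static const float data[N] = {   /* values chosen so the sum stays < 1 */
        0.01f, 0.02f, 0.03f, 0.01f, 0.02f, 0.03f, 0.01f, 0.02f,
        0.03f, 0.01f, 0.02f, 0.03f, 0.01f, 0.02f, 0.03f, 0.01f };

    static void display(void) {
        unsigned char px[3];
        int i;

        glClear(GL_COLOR_BUFFER_BIT);
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE);         /* additive: framebuffer += fragment */

        glBegin(GL_POINTS);
        for (i = 0; i < N; i++) {
            glColor3f(data[i], 0.0f, 0.0f);  /* encode the value in red */
            glVertex2f(0.01f, 0.01f);        /* every point hits one pixel,
                                                nudged off the pixel boundary */
        }
        glEnd();
        glFinish();

        /* the accumulated pixel sits at the center of the 64x64 window */
        glReadPixels(32, 32, 1, 1, GL_RGB, GL_UNSIGNED_BYTE, px);
        printf("GPU sum ~ %.4f (true sum 0.31, 8-bit quantized)\n", px[0] / 255.0);
    }

    int main(int argc, char **argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
        glutInitWindowSize(64, 64);
        glutCreateWindow("blend-sum hack");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }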
dagw
9 months ago
I was working at a small animation studio back when the GeForce 256 was released. I distinctly remember one of our animators buying one, popping it into a 'random' Wintel machine, installing Maya, and having it run many of our scenes at speeds comparable to our very expensive SGI and Intergraph workstations. Everybody instantly realised that this was the future. Two years later virtually the entire studio was running on commodity hardware costing less than a quarter of what we used to pay for workstations.