Have we crossed the threshold where more "Graphics Processing Units" are sold for ML than for graphics processing?
I remember thinking it was funny that gaming ultimately subsidized a lot of the major advances in ML for the last decade. We might be flipping to a point where ML subsidizes gaming.
The 'death' of PC computing has been rather exaggerated. Each year hundreds of millions of PCs are still sold, and that's counting only prepackaged machines. On top of that there's an increasingly large chunk of people who simply upgrade a frankenputer as needed. As for gaming, Steam has hundreds of millions of users and continues to grow steadily. And while Steam certainly covers most PC gamers, I'm sure there are some oddballs out there who game but don't use it.
So GPUs sold primarily for ML probably still make up a tiny share of the overall market, but I expect they make up a huge chunk of the market for cards like the A100. Gaming hasn't been bleeding edge (in terms of requirements) for a very long time, and card prices drop quickly, so there's just no point in spending that sort of money on a card for gaming.
Especially funny to me is how everyone on console-oriented channels is talking about the rise of PC gaming; it never went anywhere.
Using computers, not game consoles, for gaming was commonplace during the 8- and 16-bit home computing days, and the large majority eventually moved to Windows PCs as the other platforms died. That is why Valve has to translate Windows/DirectX if it wants to have any games at all on the Steam Deck.
Consoles have been a niche market, at least here in Europe, mainly used by kids until they are old enough not to destroy the family computer while playing games, given that many families only have one per household. And nowadays that role has probably been taken over by tablets.
To the point that PlayStation/Xbox are now becoming publishing brands, as the exponential growth in console sales has plateaued.
These are very different stats. He was referring to unit sales of GPUs, not dollar sales. The A100 is an $8000+ card, so cards like it are going to dominate in revenue even if their unit numbers are relatively low. By contrast, the most popular card per the Steam hardware survey is (inexplicably, probably because of prebuilt machines) the RTX 4060, a $300 card.
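To make the units-vs-revenue distinction concrete, here's a minimal sketch. The unit counts below are made-up illustrative numbers (only the $300 and $8000 prices come from the comment above); the point is just how a high-priced datacenter card can dominate revenue while a cheap gaming card dominates units.

```python
# Hypothetical unit counts -- illustrative only, not real sales figures.
cards = {
    # name: (unit_price_usd, units_sold)
    "RTX 4060 (gaming)": (300, 10_000_000),
    "A100 (datacenter)": (8000, 1_000_000),
}

total_units = sum(units for _, units in cards.values())
total_revenue = sum(price * units for price, units in cards.values())

for name, (price, units) in cards.items():
    print(f"{name}: {units / total_units:.0%} of units, "
          f"{price * units / total_revenue:.0%} of revenue")
# With these assumed numbers, the gaming card is ~91% of units
# but only ~27% of revenue; the datacenter card is the reverse.
```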
In 2024, 256 million PCs were sold, but only 40 million of those were desktops. And setting aside the fact that some share of those (hard to put a number on it, but I'd be surprised if it weren't over 40%) are office PCs with crappy GPUs, most laptops also have a weak integrated GPU.
There's a chance that this year or the next one more GPUs will be sold for AI than for graphics.
Laptops are also desktops for all practical purposes, other than being able to swap components.
There are plenty of games to play, all the way back to the Atari 2600; not everyone is playing the latest version of Call of Duty, Cyberpunk, or whatever tops the AAA charts.
In fact, those integrated GPUs are good enough for WebGL 2.0, and I still haven't seen much that tops the graphics of the last 10 years of mobile games (done with OpenGL ES), other than demoscene reels from shader competitions.
I'm fairly sure OP was more concerned about modern GPUs being used as TPUs or whatever they're called, than about what graphics circuits the Atari 2600 was using.
Even mid-range GPUs are proportionally much more expensive. I built a decent gaming PC with a GTX 760 10 years ago for about $900. These days you'd have to pay double for the same relative performance.
This seems commonly believed but isn't really accurate at all. For instance here [1] is a build to play Cyberpunk 2077 for less than $450 (low, 1080p, 50+ FPS). And that game is clunky, and that build is valuing price > performance. Add a couple hundred more and it'll run that game (or any other) like a beast.
Also, keep in mind how significant inflation has been in recent years. That $900 PC from 10 years ago would be $1200+ in today's dollars.
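A quick sanity check on that inflation figure. The ~2.9%/year rate below is an assumption roughly approximating average US CPI inflation over the decade, not an official number:

```python
# Compound the assumed annual inflation rate over 10 years.
price_2014 = 900
annual_inflation = 0.029  # assumed average rate, not an official figure
years = 10

price_today = price_2014 * (1 + annual_inflation) ** years
print(f"${price_today:.0f}")  # lands in the neighborhood of $1200
```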
Again, there are plenty of games to choose from besides last-generation AAA titles.
I guess some folks might suffer from FOMO, but that doesn't change the fact that, outside of last-generation AAA, there are more games to play than most folks could ever finish in a lifetime.