We can use the GPUs for research (64-bit scientific compute), 3D graphics, and a few other things. We programmers will reconfigure them into something useful. (Rough sketch of the 64-bit angle below.)
At least, the GPUs that are currently plugged in. A lot of this bullshit bubble crap is because most of those GPUs (and the RAM) are sitting unplugged in a warehouse, because we don't even have enough power to turn all of them on.
So if your question is how to use a GPU... I got plenty of useful non-AI related ideas. But only if we can plug them in.
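For what it's worth, the 64-bit angle isn't hypothetical: the FP64 units on a datacenter card don't stop being useful just because nobody wants it for inference anymore. A minimal sketch, assuming CuPy is installed and a CUDA device is visible (sizes and names here are purely illustrative):

```python
# Minimal sketch (mine, not from the thread): plain double-precision
# linear algebra on whatever CUDA device CuPy can see. Sizes are arbitrary.
import cupy as cp

n = 4096
a = cp.random.random((n, n))           # defaults to float64
b = cp.random.random((n, n))

c = a @ b                              # FP64 matrix multiply runs on the GPU
cp.cuda.Device().synchronize()         # wait for the kernel to finish
print("||C||_F =", float(cp.linalg.norm(c)))
```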
I wouldn't be surprised if many of those GPUs are just e-waste, never to turn on due to lack of power.
> I wouldn't be surprised if many of those GPUs are just e-waste, never to turn on due to lack of power.
That's my fear.
The problem is these GPUs are specifically made for datacenters, so it's not like your average consumer is going to grab one to put into their gaming PC.
I also worry about what the pop ends up doing to consumer electronics. We'll have manufacturers with a bunch of capacity they can no longer use to create products people actually want to buy, plus a huge backlog of second-hand goods that these liquidated AI companies will want to unload. That will put chip manufacturers in a position where they'll need to get their money primarily from consumers if they want to stay in business, and that's not the business model they've operated on up until this point.
We are looking at a situation where we have a bunch of oil derricks ready to pump, but shut off because it's too expensive to run the equipment; the output just isn't worth the energy.
> As it turns out, Nvidia's H100, a card that costs over $30,000, performs worse than integrated GPUs in such benchmarks as 3DMark and Red Dead Redemption 2
I predict there's going to be a niche opening up for companies to recycle the expensive parts of all this compute hardware that AI companies are currently buying, which will probably be obsolete/depreciated/replaced in the next 2-5 years. The easiest example is RAM chips. There will be people desoldering those ICs and putting them on DDR5 sticks to resell to the general consumer market.
A technological arms race just occurred in front of your eyes for the past 5 years and you think they're going to let the stockpile fall into civilian hands?
In 2 years the next generation of chips will be released and these chips will be obsolete.
That's truly e-waste. Now in practice, we programmers find uses for 10+ year old hardware as cheap web hosts, compiler/build boxes, Bamboo, unit tests, fuzzers, and whatever. So as long as we can turn them on, we programmers can and will find a use.
But because we are power constrained, when the more efficient 1.8nm or 1.5nm chips get released (and when those chips use 30% or less power), no one will give a shit about the obsolete stockpile.
In what sense? Not competitive for chat bot providers to use? Is that a metric that matters?
> when the more efficient 1.8nm or 1.5nm chips get released
What if they don't get released? You don't have a broad and competitive set of players providing products in this realm. How hard would it be to stop this?
> no one will give a shit about the obsolete stockpile.
You have lived your life with ready access to cutting edge resources. You ever wonder how long that trend could _possibly_ last?
As in: the 1.5nm or 1.8nm GPUs will use less power and therefore can actually be plugged in.
We are power constrained. The GPUs of this generation can't even be plugged in yet because of these power constraints.
When power is a problem, getting lower power GPUs in is a priority. The 1.8nm and 1.5nm next generation is already in production, and will likely launch before these massive GPU stockpiles are used.
And then what? Why plug in last generation's crap when the next generation is shipping?
--------
Today's GPUs have to actually launch and be deployed while they are useful. Otherwise they could become fully obsolete and lose significant value.
I assume even really out-of-date cards and racks will readily find some use when the present-day alternative costs ~$100k for a single card. You just have to run them at a low enough duty cycle that power use isn't a significant portion of the total cost of ownership.
It’ll be interesting to see what people come up with to get conventional scientific computing workloads to work on 16-bit or smaller data types. I think there’s some hope, but it will require work.
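One of the standard tricks here is mixed-precision iterative refinement: do the expensive solve in low precision and accumulate residuals and corrections in FP64. A rough NumPy-only sketch of the idea (float32 stands in for the low-precision path, since NumPy's LAPACK has no half-precision solver; on tensor-core hardware you'd push that inner solve down to FP16/BF16):

```python
# Rough sketch (not from the thread) of mixed-precision iterative refinement:
# solve in low precision, correct in float64.
import numpy as np

def solve_mixed_precision(A, b, iters=20, low=np.float32, tol=1e-12):
    A_low = A.astype(low)
    b64 = b.astype(np.float64)
    # Cheap low-precision solve gives a starting guess.
    x = np.linalg.solve(A_low, b.astype(low)).astype(np.float64)
    for _ in range(iters):
        r = b64 - A @ x                                # residual in float64
        if np.linalg.norm(r) <= tol * np.linalg.norm(b64):
            break
        dx = np.linalg.solve(A_low, r.astype(low))     # correction, low precision
        x += dx.astype(np.float64)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    A = rng.standard_normal((n, n)) + n * np.eye(n)    # well-conditioned test case
    b = rng.standard_normal(n)
    x = solve_mixed_precision(A, b)
    print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

As I understand it, this low-precision-solve / high-precision-residual pattern is roughly what the mixed-precision solvers in libraries like MAGMA and cuSOLVER already do to exploit tensor cores.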