Performance. Intel still manages to squeak out some wins against AMD in single-threaded CPU tasks, but in every other metric they are chasing AMD.
Not only in performance, but also in profits, as the Intel vs. AMD financial results show. That's especially true in the server CPU segment, where Intel has a shrinking market share and billions in losses while AMD's market share and profits keep growing.
The new Zen 5 has much better single-thread performance than any Intel CPU (e.g. the slower 5.5 GHz Zen 5 launched this week matches a 6.0 GHz Raptor Lake), so until Intel launches Arrow Lake S in a couple of months, AMD will hold a clear single-thread lead. After that, Intel and AMD should again be at rough parity, with negligible differences in single-thread speed.
Power too. I think we're still in the happy place where buying EPYC is cheaper than being given Xeons for free once you look at the electricity bill over the life of the machine.
Is this actually true? At 200 W power draw, 24/365, you're talking roughly 1,750 kWh a year; at $0.20/kWh that's about $350.
These server processors seem to sell for multiple thousands of dollars. Is the efficiency difference in servers actually as large as claimed? Surely a sufficient discount on the capital cost of a processor can more than make up for extra power usage.
I guess it all depends on the comparative price and power consumption; it just feels to me like the difference would have to be rather large (rough numbers in the sketch below).
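A minimal Python sketch of that arithmetic; the 200 W and $0.20/kWh are just the figures assumed above, not measured numbers:

    HOURS_PER_YEAR = 24 * 365  # 8,760 hours, i.e. running 24/365

    def yearly_energy_cost(watts: float, price_per_kwh: float) -> float:
        """Dollar cost of a constant electrical load over one year."""
        kwh = watts * HOURS_PER_YEAR / 1000
        return kwh * price_per_kwh

    print(yearly_energy_cost(200, 0.20))  # 350.4 -> ~$350/year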
You're not considering the core count difference. AMD has 128 cores / 256 threads at 2.2 GHz and 360 W TDP, while Intel has 144c/144t at 2.2 GHz and 330 W TDP. Cloud providers care about density and power usage.
More cores per server = less power per unit of compute = more servers per rack (racks are typically power-limited, not space-limited) = more capacity = more opportunities to sell products.
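For what it's worth, here's the watts-per-core math on those numbers (TDP is only a loose proxy for real draw, so this is illustrative at best):

    specs = {
        "AMD (128c/256t, 360 W)": (128, 256, 360),
        "Intel (144c/144t, 330 W)": (144, 144, 330),
    }
    for name, (cores, threads, tdp_w) in specs.items():
        print(f"{name}: {tdp_w / cores:.2f} W/core, {tdp_w / threads:.2f} W/thread")

Counted per core, Intel's TDP looks better; counted per thread, AMD's SMT flips it, which is part of why these density comparisons get contentious.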
I'm not really in the space; I was just curious. I think people tend to overstate the importance of power consumption relative to the price of the products they buy and the value of their time (e.g. for a workstation part, higher performance is worth a significant power tradeoff if it gets jobs like compilation done 10% faster, given the employee time that saves).
For servers, I'm always curious, because even though they run 24/365 (so power consumption is very important), the capital cost of new server chips is incredibly high. E.g. those 144c chips I presume you're referring to cost $10k+, so even over a 5-year service life the power bill is probably only ~20% of the chip's cost, and relative to the AMD chip the additional inefficiency could easily be compensated by an appropriate discount.
Obviously all of this is why Intel still exists in the DC; they just can't charge the same prices AMD can, is all.
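To make the discount argument concrete, a toy 5-year cost-of-ownership comparison; every number here (chip prices, the discount, the average wattages) is invented for illustration:

    PRICE_PER_KWH = 0.20
    HOURS_5Y = 24 * 365 * 5

    def five_year_cost(chip_price: float, avg_watts: float) -> float:
        """Chip price plus five years of electricity at a constant draw."""
        return chip_price + avg_watts * HOURS_5Y / 1000 * PRICE_PER_KWH

    print(five_year_cost(11_000, 300))  # hypothetical AMD part: $13,628
    print(five_year_cost(9_000, 400))   # hypothetical discounted, hungrier Intel part: $12,504

At these made-up figures, a ~$2k discount more than covers drawing an extra 100 W for five years, which is exactly the pricing point above.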
With great power comes great heat output. Lower power = lower heat output = lower cooling bill. Or the same bill at more capacity = more margin = more profit for the cloud providers :)
And the other thing to consider: less power usage at a global scale = less CO2 output.
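Data centers usually capture that cooling overhead with PUE (total facility power divided by IT power); a small sketch, where the PUE values are assumptions (hyperscalers report numbers near 1.1, a middling facility is more like 1.4):

    def wall_power(it_watts: float, pue: float) -> float:
        """Total facility draw implied by a given IT load at a given PUE."""
        return it_watts * pue

    for pue in (1.1, 1.4):
        print(f"PUE {pue}: 330 W of CPU -> {wall_power(330, pue):.0f} W at the meter")

So every watt saved at the socket saves more than a watt on the bill, which is the margin point above.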
It looks like the gap has narrowed, though TDP might not mean what it once did. The comparison I remember is a 64-core Rome chip against two 28-core Xeons, where the Rome chip was significantly faster at something like 1/3 the power consumption of the dual-socket setup. I've got one of those 64C chips and haven't followed the market as closely since.
At least for consumer desktop CPUs, AMD is significantly ahead for gaming (with X3D), while for MT/productivity workloads Intel and AMD seem to be pretty even (if we ignore power usage and the whole melting-CPU thing..). Which makes sense, since Intel generally offers more cores per dollar these days.