This is misleading. For nearly $2000, you better be getting 4K with ray tracing (max settings), AKA, a top of the line device. 1440p/144 is midrange now.
Which that $1865 one does not provide. Even before ray tracing, most games struggle to hit the 144 FPS you're aiming for, and with ray tracing that drops to around 40 (Cyberpunk 2077, for example, on that 6900 XT card). You have to enable workarounds like DLSS/FSR to make those games playable.
The only way you're getting a good framerate at 4k is without ray tracing, but you're paying $2000 and still have to worry about disabling settings? Ridiculous!
So yes, they are overpriced. For $2000 you should not be worried about having to enable FSR.
The usual excuse when this is brought up is "well, just don't play those games, they seem unoptimized," to which again the question is: why are you spending two thousand dollars to avoid playing certain games? How absurd.
>This is misleading. For nearly $2000, you better be getting 4K with ray tracing (max settings), AKA, a top of the line device. 1440p/144 is midrange now.
You're both just setting an arbitrary baseline of performance at $2k. Arbitrary comparisons cannot be misleading.
Yes, I don't know where this person is getting the idea that you shouldn't have to worry about settings at $2k. Fifteen years ago you could spend $10k and still have to worry about settings!
Well the website is called logical increments, it's really hard to call ray tracing a logical increment. You are sacrificing too much to gain so little, at least for now.
This is how I see it now too. The manufacturers are capitalizing on people needing to have the best by making even more expensive top tier components. You don’t need 4K or ray tracing to enjoy a triple-A game. Last year, I got a 3070-equipped system, force feedback driving wheel, and a 4k TV as a monitor for under $2k. The games are still quite gorgeous, and I have no idea what I’m missing by not having something more expensive.
Ray tracing IS the logical increment the industry has decided on, though. So again, you're spending $2000 to miss out on what the industry has decided is the next logical step, only to have to spend another ~$500 on whatever upgrade to actually play those games.
>Ray tracing IS the logical increment the industry has decided on though.
I disagree. Nvidia decided it didn't have enough exclusivity. If the industry had decided it was a logical increment, it would've had to come as a request from game and game-engine developers toward hardware companies. It would've been a more natural progression; we wouldn't have it yet, and it wouldn't feel like it's in a perpetual beta phase.
Sounds like you would like to use a console for gaming.
When I built my last desktop, in 2015, you would've blown way more than $2k just on the over-the-top monitor that you're mad can't be cheaply driven with high-speed, high-complexity output in 2023. Now you can get a decent-quality 4k monitor plus a desktop that will drive it for years of normal tasks, all for less than $1k.
2015 is eight years ago. In PC terms, that's forever.
I got my 4k 60Hz monitor in 2017 for $400 or so. You can get them for $250 these days, so I agree that is pretty mid-range by now - or even the lower part of it. That leaves $750 for the computer.
According to the logicalincrements list linked above, that gets you an RX 6600 - which they rate for 40 FPS @ 4k in The Witcher 3 - a game which came out in 2015.
So no, you cannot get a decent 4k monitor and a PC able to drive it for less than $1k. That's the entire problem: there is no midrange anymore.
The Steam Hardware Survey says you're wrong. Unless something changed recently, most people are on 1080p with midrange cards (GTX *60 & *70). I was running a 970 at 1080p until 2021. I play mainstream stuff, FotM, and competitive.
The way the market is structured just doesn't offer a whole lot of reward. Forknife, LoL, CSGO, CoD, etc. are all built to accommodate toasters. The big titles are built for consoles and poorly optimized; there isn't a whole lot to be had aside from higher framerates - and many times those are locked.
I upgraded for compute power.
As for resolution I'm not convinced it's worth the upgrade. Maybe in a few years.
But am generally in agreement, I did a big upgrade a while ago and while it's great for a few things, the extra hardware and resolution don't mean much while playing Stellaris.
8 years is no longer forever in PC terms. It used to be that you could only go a couple of years before your computer could no longer run anything, but that's no longer the case.
For example, I usually build a PC once the old one is horribly outdated. So that was 1999, 2002, 2006, 2011, and ... well, that one has turned into the PC of Theseus. I upgraded to an SSD and a $300 video card in 2018, and in 2022 I bought a new CPU/mobo/RAM. It still plays games at 1440p at acceptable frame rates.
If I wanted to game at 4k 240Hz, yeah, I'd probably have to spend $3500 (and it still wouldn't run that great), which just tells me that 4k gaming isn't ready for prime time.
Generally I agree. However, I saw negligible performance improvement from a ten-year-newer mobo/CPU/RAM, so the previous trend didn't apply. I only upgraded because of some system stability issues (probably RAM-caused, but buying new DDR3 RAM in an attempt to fix it seemed a waste). From a performance point of view, the new GPU and SSD in 2018 were the new-computer experience.
> 2015 is eight years ago. In PC terms, that's forever.
I think the point is that this isn't true anymore.
In the 80s or 90s eight years was huge in terms of computer hardware advances. In the last eight years though? Meh. Aside from one of my laptops, all of my computers are older than 2015 and they are all perfectly fine for current use. Hardware doesn't advance very quickly anymore, which is great, but bad for sales.
1440p should be midrange, but the GPU makers refuse to make that a reality. How is it that decent-framerate 1080p was entry level SEVEN years ago, and cards that double the speed of the RX 480 are still going for $300-500?
If you spent over $2k on an over-the-top monitor in 2015, then you were being silly, because even top-of-the-line GPUs weren't hitting 1440p at a stable framerate.
We're not talking about normal tasks; for normal tasks, even a Chromebook would be fine. We're talking about intensive tasks, like gaming, which push boundaries and justify the high cost... except for components like GPUs, which are still dramatically overpriced. These components make it misleading when people claim you can get a top-of-the-line system for under $2k - which is not possible (a top-of-the-line GPU alone costs more than $2k).
You can make tradeoffs, but when paying that much who wants to deal with tradeoffs?
I use a 4k 28-inch as my primary monitor for code at home (the laptop screen has email and Slack). I run i3wm and typically code with 4 terminal windows, splitting the monitor into 4 equal 1080p quadrants, each running vim. It's very nice, and whenever I have to visit the office and am stuck with their 1080p monitors, I find myself severely limited because I'm so used to my home setup.
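For anyone wanting to reproduce that kind of grid automatically, i3 has a layout save/restore mechanism. A minimal sketch, assuming xterm as the terminal and a made-up file path and workspace number (not the commenter's actual config):

```shell
# ~/.config/i3/config (sketch) — restore a saved 2x2 terminal grid on startup.
# grid.json would first be captured from a live 2x2 layout with:
#   i3-save-tree --workspace 1 > ~/.config/i3/grid.json
# (i3-save-tree's swallow criteria normally need uncommenting by hand.)
exec --no-startup-id i3-msg 'workspace 1; append_layout ~/.config/i3/grid.json'

# Launch four terminals; i3 "swallows" them into the placeholder slots,
# giving four equal ~1080p quadrants on a 4k display.
exec --no-startup-id xterm -e vim
exec --no-startup-id xterm -e vim
exec --no-startup-id xterm -e vim
exec --no-startup-id xterm -e vim
```

Of course you can also just build the splits by hand each session; the saved layout only saves a few keystrokes.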
I've been "coding" on dual 27" 4K@60 displays for years. Can't go back to lo-dpi displays after getting used to having truly legible/beautiful text rendering.
I love my pair of 32" 4k monitors for coding. Screen real-estate lets me have everything I want open, open. I don't have to go digging for windows. I don't have to worry about arranging windows carefully. I get to see a lot of code, and a lot of terminal history.
Gaming and TV on the other hand, I'm perfectly happy with a smaller 1080p monitor. There just isn't much to be gained by a few more pixels.