Hardware Thread - General.

Glad I'm not the only one who thinks the race for higher screen resolutions is utterly pointless.

How about we make every game run at a beautiful 144FPS at 1080p instead?

People don't realize there are trade-offs. Do you want 8K visuals? Fine. Just be aware that it's going to come at the cost of innovation in other areas, because there's absolutely no reason for devs or hardware makers to push the envelope when only the tiniest percentage of users will be able to take advantage of it.

Oh well. Them's the breaks.
 
How about we make every game run at a beautiful 144FPS at 1080p instead?

Because you'd need at least a 1070 to push that sort of framerate on a 1080p screen, or 60FPS @ 1440p, whereas 60FPS @ 1080p is doable with a GPU under $200.

Someone once said that there's absolutely no reason for devs or hardware makers to push the envelope when only the tiniest percentage of users will be able to take advantage of it, and I agree. While there are quite a few gamers who could do 75-90 FPS at 1080p or 60-75 FPS at 1440p if they don't run all the settings on max, there is also a point of diminishing returns where it's really not worth spending more time and effort to get smaller gains. I'm not willing to trade half the content quality for a 2% FPS boost or a slightly cooler-looking visual effect that I won't even notice in the middle of a gunfight.
 
Glad I'm not the only one who thinks the race for higher screen resolutions is utterly pointless.

How about we make every game run at a beautiful 144FPS at 1080p instead?

People don't realize there are trade-offs. Do you want 8K visuals? Fine. Just be aware that it's going to come at the cost of innovation in other areas, because there's absolutely no reason for devs or hardware makers to push the envelope when only the tiniest percentage of users will be able to take advantage of it.

Oh well. Them's the breaks.

I'd down res a game's settings if it could get me better visuals.

4K = 8,294,400 pixels
1080p = 2,073,600 pixels

That's a hell of a lot of GPU power to push the extra 6 million+ pixels. Heck, give us ray tracing and all that jazz at 420p.
Pushing top-end CPU & GPU power to give us the best realism at low res would surely best stickmen models playing Pong at 8K.
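
If anyone wants to sanity-check those numbers, it's just width × height. A quick throwaway Python snippet (the 420p width is my own rough 16:9 guess, not a real standard):

# Rough pixel-count comparison between 16:9 resolutions
resolutions = {
    "420p (rough guess)": (746, 420),   # hypothetical low res, roughly 16:9
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

4K works out to exactly 4x the pixels of 1080p, which is where that "extra 6 million+" comes from.
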
Post automatically merged:

Because you'd need at least a 1070 to push that sort of framerate on a 1080p screen, or 60FPS @ 1440p, whereas 60FPS @ 1080p is doable with a GPU under $200.

Someone once said that there's absolutely no reason for devs or hardware makers to push the envelope when only the tiniest percentage of users will be able to take advantage of it, and I agree. While there are quite a few gamers who could do 75-90 FPS at 1080p or 60-75 FPS at 1440p if they don't run all the settings on max, there is also a point of diminishing returns where it's really not worth spending more time and effort to get smaller gains. I'm not willing to trade half the content quality for a 2% FPS boost or a slightly cooler-looking visual effect that I won't even notice in the middle of a gunfight.

Jervi, I and others are willing to pay through the nose.
To clarify, I'm talking about diverting the use of GPU power away from pixels to visual quality (grainy low res but lifelike vs basic models & lighting at high res), not to price/performance ratio.

Given software limitations, what can an 8700K & 2080 Ti achieve at 420p? What steps are required to best a 144p video of real life?
 
Triff - You say that as though everyone on the planet has a huge entertainment budget. Remember, the world is full of folks that aren't just like you; folks who spend too much on rent, utilities, and healthcare to run around with i9/RTX2080 rigs. Those who can pay through the nose are a minority, making those who will a sub-set of a minority.
 
Because you'd need at least a 1070 to push that sort of framerate on a 1080p screen, or 60FPS @ 1440p, whereas 60FPS @ 1080p is doable with a GPU under $200.

Someone once said that there's absolutely no reason for devs or hardware makers to push the envelope when only the tiniest percentage of users will be able to take advantage of it, and I agree. While there are quite a few gamers who could do 75-90 FPS at 1080p or 60-75 FPS at 1440p if they don't run all the settings on max, there is also a point of diminishing returns where it's really not worth spending more time and effort to get smaller gains. I'm not willing to trade half the content quality for a 2% FPS boost or a slightly cooler-looking visual effect that I won't even notice in the middle of a gunfight.
I'm confused. A 1070 is a mid-range card, a lot of people have them. It's only when you start getting to 1080 and 1080 Ti territory that the price becomes a major barrier.

Where is the point of diminishing returns for you? 144FPS is a hugely noticeable upgrade over 60. I have two 24" monitors, each 144Hz and each 1080p. The pixel density is identical to when I had a 1440p monitor at 27".

Here's the thing... At 1080p, you have two choices - great visual quality at 60FPS, or lowered visual quality at over 100FPS. This is doable even on mid-range, heck even budget cards. I know because I did it with a 970 for a very long time, even when the 10-series cards were well into their lifecycle.

Those are great choices, both of them, and they're only doable because of 1080p. A 2% FPS boost is simply false, and so is the idea that the gain in visual quality would only be slight, but it's fine if you just don't want it. We will have to agree to disagree, because I think this focus on "PIXELS PIXELS PIXELS" is completely stupid and misguided.

Also, you are seriously throwing me off - we had another guy with a car avatar but his opinions were completely different than yours, and I thought you were him for a second so I was really confused.

If you're out there, I forgot your name, post and remind me. :D
 
I'd suspect ... yes my opinion, no proof ... that 4K is more important for consoles than PCs, because usually console players are on the far side of a room from their screen and PC users are at arm's length.
 
Didn't the 1070 card get a major price drop recently?

Nope, still £400. There might be one coming, but right now it's still at its release retail price.

Triff - You say that as though everyone on the planet has a huge entertainment budget. Remember, the world is full of folks that aren't just like you; folks who spend too much on rent, utilities, and healthcare to run around with i9/RTX2080 rigs. Those who can pay through the nose are a minority, making those who will a sub-set of a minority.

Yeah, even the 1070 is a relatively high-end card. I think the stats linked in here before showed that, of Steam users, some 25% or so are on 1060s, and that's not likely to change that much in the next couple of years.
 
As I've said elsewhere, I'm currently running a GTX 970 and don't play action-based games (shooters, platformers, etc.), so everything I have runs at max settings with an FPS of 50+ (usually 70+). Unless there's some reason I need to, I'll wait till the 1100 cards are released before I worry about upgrading.

Your "need" for a top end graphic card is VERY much based on what sorts of games you play. So ignore all the hype about which card is "best" and look at "what do I need it to do" instead.
 
Exactly. The question each of us needs to ask is 'am I OK with playing at this level of fidelity?' If the game runs fine and looks good enough on your current card, then don't waste your money on something you don't actually need. If it doesn't, find a card with a good price/performance that fits within your budget. Don't buy the top-of-the-line cards, as the next model down is often 80% of the performance at half the cost.

And if you're buying a card to impress others, don't. It doesn't matter; most of them don't ultimately care. Self-worth isn't determined by stuff.

I'd suspect ... yes my opinion, no proof ... that 4K is more important for consoles than PCs, because usually console players are on the far side of a room from their screen and PC users are at arm's length.
The other way around, tho'. The closer you are to the screen, the easier it is to see the pixels.
 
I'm confused. A 1070 is a mid-range card, a lot of people have them. It's only when you start getting to 1080 and 1080 Ti territory that the price becomes a major barrier.

Maybe in absolute terms but not as a percentage of the population;

[Attached image: GPU share.jpg]

I think that this is a bit more representative of what gamers are running in their rigs. The 1060 (~$270) is the most popular, followed by the 1050Ti (<$200) and 1050 (<$150); the three least expensive cards have about 25% of the market share. The 1070 (~$400) isn't even in the top 5. I'm guessing that there are more people who see an extra $100-700 as a price barrier than you thought. Now, if the price difference between a 1070 and a 1060 were the same as the one between a 1050 and a 1050Ti, then yes, I'd agree 100%, but it isn't, so I can't. I will concede that the price gap between the 1070 and the 1080 is small (less than the 1050Ti/1060 or 1060/1070 gaps), but the way you trivialize the >$200 gap between a 1050Ti and a 1070 makes me think we have irreconcilably different views of money.

Where is the point of diminishing returns for you? 144FPS is a hugely noticeable upgrade over 60.

I see it as the point where the increase in quality is no longer proportional to the increase in cost. And while I'm not denying that there is a noticeable difference between 60FPS and 120FPS, I don't think the difference is worth spending over double on my card (a 1050Ti is less than half the price of a 1070) and then having to spend a few hundred more on top of that to find a high-refresh 32" display. (I won't go back to tiny screens, but once you get above 27", display prices rise rather quickly, so factor that cost in too.) Back when I was young and immortal, and I had Uncle Sam giving me free room and board, I would've agreed with you that a few hundred dollars isn't much. But now that I can't use even one-quarter of my paycheck as play money, I don't see breaking 60 FPS as worth the sacrifices I'd need to make in order to do so. (That, and I'd rather not have to explain to my wife why she can't have Starbucks for 4-6 months because that money went into my PCIe slot!)

A 2% FPS boost is simply false, and so is the idea that the gain in visual quality would only be slight, but it's fine if you just don't want it. We will have to agree to disagree, because I think this focus on "PIXELS PIXELS PIXELS" is completely stupid and misguided.

Okay, I was being slightly facetious when referring to what the devs would need to do to improve things from where they are, but I think we do agree that pixels for the sake of pixels is silly. I simply go a step further and question sacrificing gameplay for graphics in general given that dev-hours are a finite resource. That's not to say that I don't want those who spent more on their GPUs and monitor than I spent on most of the cars I've owned to get their money's worth, but I've seen enough games over the decades that try hiding their flaws behind glitzy graphics that I've developed a visceral distrust of games with awesome graphics.
 
I'm confused. A 1070 is a mid-range card, a lot of people have them. It's only when you start getting to 1080 and 1080 Ti territory that the price becomes a major barrier.

Where is the point of diminishing returns for you? 144FPS is a hugely noticeable upgrade over 60. I have two 24" monitors, each 144Hz and each 1080p. The pixel density is identical to when I had a 1440p monitor at 27".

Here's the thing... At 1080p, you have two choices - great visual quality at 60FPS, or lowered visual quality at over 100FPS. This is doable even on mid-range, heck even budget cards. I know because I did it with a 970 for a very long time, even when the 10-series cards were well into their lifecycle.

Those are great choices, both of them, and they're only doable because of 1080p. A 2% FPS boost is simply false, and so is the idea that the gain in visual quality would only be slight, but it's fine if you just don't want it. We will have to agree to disagree, because I think this focus on "PIXELS PIXELS PIXELS" is completely stupid and misguided.

100Hz+ is great, but I'd take increased visual quality (realism) over pixel count and frame rate.
For me, 60fps is the cutoff. Below that is way too jerky.

I'd like to see processing power & software design diverted to realism until we hit that magical indistinguishable look milestone, then increase the framerate & pixel count as time goes by.

I'm happy with the release of the RTX 2080 series. A huge hunk of expensive die focused on nothing but visual effect.
A step in the right direction.

I'd suspect ... yes my opinion, no proof ... that 4K is more important for consoles than PCs, because usually console players are on the far side of a room from their screen and PC users are at arm's length.

As V.Dog says, it's the other way round. Liken it to a football stadium screen: the closer you are, the more you see the individual pixels.

For PC monitors it's around 90 DPI. A 16:9 24" 1080p panel works out to 91.79 DPI.
To keep the same DPI at 16:9 32" you need 1440p (it happens to be exactly the same: 91.79).

There's a reason 91.79 is strongly recommended by quite a few people: no scaling issues, because the myriad of software out there is built around that de facto standard. With 4K you need to up the scaling to read text, and when you do that and the software you want to run isn't specifically written with high DPI in mind, you encounter problems. Even Windows & Steam struggled with it for years (you'd think they'd be the ones to get it right, let alone that piece of accounting software you like).

With 27" monitors it's a choice between 1080p and 1440p. At 1080p (81.59 DPI) you see individual pixels, and at 1440p (108.79) most people have to up the scaling to read text adequately. Then come the scaling issues.

To those on 24" looking for a larger monitor: skip 27" and go to 16:9 32" 1440p.
It's not too large, doesn't require the extreme GPU power 4K demands, and comes with zero scaling issues. Also, it gives a larger screen area than 34" 21:9 and no black bars, since most content is 16:9. Compatibility king.
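
Those DPI figures are just the diagonal pixel count divided by the screen's diagonal in inches. A quick Python check, if anyone wants to verify (the sizes and resolutions are the ones discussed above):

from math import hypot

# DPI (strictly PPI) = diagonal resolution in pixels / diagonal size in inches
def dpi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

configs = [
    ('24" 1080p', 1920, 1080, 24),
    ('27" 1080p', 1920, 1080, 27),
    ('27" 1440p', 2560, 1440, 27),
    ('32" 1440p', 2560, 1440, 32),
]

for name, w, h, d in configs:
    print(f"{name}: {dpi(w, h, d):.2f} DPI")

# Prints 91.79, 81.59, 108.79 and 91.79 respectively.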

 
Good points all around. I don't agree with most of them, but I appreciate that you guys took the time to articulate your thoughts better.

I think, ultimately, we can all agree on this: gameplay > graphics, always. You won't catch me saying anything other than that.

However, I would like to point out that frame rates are actually not mutually exclusive with gameplay, because they are easily boosted (or lowered) based on player decisions. For instance, if I care more about 144FPS than great graphics, I can drop the settings to low-medium and get around that on even a low-end card (or a last-gen mid-range card) in most well-optimized games.

Now, graphical fidelity... That's another story and definitely is a matter of development resources.
 
Read my sig. Optimisation happens near the end of development. You don't smooth things out until all of the pieces are in place.

As for the specs required, we don't know yet. Once the optimisation process is complete, they'll tell us what we need.
 
So we saw the specs on the demo: a 1080 Ti, and they may have mentioned it was running on high rather than ultra/max settings? My question is, this game was announced around 2011/2012, and I assume the tech to meet their standards didn't really exist back then. Since it does now, how does the optimisation play out over the next two years? Obviously new cards, parts, etc. will only get better, so will there be more options for improving the graphics, or will they keep it the same as it is now? If you buy a better card, then congrats, it'll just play more smoothly? How have games done that in the past?

Just seems this game is going to be extremely demanding, even for a 1080 Ti.

Thanks folks

Excited for the game
 
