GTX 970 surfaced

Actually, the reduction in power consumption is nowhere near 100W. Under maximum load it even draws a little more power than the 780/780 Ti.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-13.html

Other reports and nVidia's own word on the subject disagree: a difference of 80-100W under full load.

If you're only an enthusiast or don't pay for your electricity, maybe it doesn't matter. But if you're a big enough customer that your nVidia representative is going to listen to your requirements, and pass them up the chain, it's a huge difference.
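For a sense of scale, here is a rough back-of-the-envelope sketch of what an ~80-100W gap costs a large buyer. The fleet size, load hours, and electricity rate below are hypothetical assumptions for illustration, not figures from this thread:

```python
# Rough annual electricity cost of a ~90 W per-card difference at scale.
# All inputs are hypothetical assumptions, not figures from this thread.
watts_saved_per_card = 90    # midpoint of the claimed 80-100 W gap
cards = 1000                 # hypothetical large deployment
hours_per_day = 12           # hypothetical average hours at full load
rate_per_kwh = 0.12          # hypothetical USD per kWh

kwh_per_year = watts_saved_per_card / 1000 * hours_per_day * 365 * cards
cost_per_year = kwh_per_year * rate_per_kwh
print(f"~{kwh_per_year:,.0f} kWh/year, ~${cost_per_year:,.0f}/year")
# -> ~394,200 kWh/year, ~$47,304/year
```

For a single gamer that is pocket change, but across a large deployment (and before counting cooling overhead) it adds up quickly, which is the point being made above.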
 
And opinions are like assholes. We all have one.
Blurt much? :facepalm:


Justifying the manufacturer's actions and calling it progress, buying a new graphics card for $500-700 every 10-12 months to gain 1-10 fps. I wish there were as many AAA exclusive games coming out on PC as there are hardware parts. It would be a true master race... until then we can all dream.
 
Blurt much? :facepalm:


Justifying the manufacturer's actions and calling it progress, buying a new graphics card for $500-700 every 10-12 months to gain 1-10 fps. I wish there were as many AAA exclusive games coming out on PC as there are hardware parts. It would be a true master race... until then we can all dream.
If you have the money, why not? Especially if it means you get more performance at a lower TDP. Besides, you can still sell your year-old cards and fetch a good price for them.

EDIT: And btw, I don't buy GPUs every year. Just making a point.
 
Money aside, I don't see the point of buying a new GPU of the same architecture that is known to be a refresh of the last one (e.g. GTX 600 --> 700). So at the very least, one should wait two years before changing.
 
So at the very least, one should wait two years before changing.

Even that is not necessary unless there are some important gains. Normally, if your card is still good enough, why upgrade? If not, that's another matter.

That doesn't mean manufacturers shouldn't update hardware frequently; there is no point in letting it stagnate. They'll have enough customers who really need every bit of performance and are ready to pay for upgrades. Gaming is not such a case.
 
It's not "necessary" at all, I'd give a new GPU 4-7 years until I change it, squeeze every last drop and when I'm done I'll use it for an HTPC :p just saying if you HAVE to get a new one 2 year cycles(respective of architectural changes) is the only one that seems to make SOME sense.
 
Money aside, I don't see the point of buying a new GPU of the same architecture that is known to be a refresh of the last one (e.g. GTX 600 --> 700). So at the very least, one should wait two years before changing.
But if you're a billionaire oil sheik why not :D

I'd get a new Lambo every year. ^^
 
Do they release new series every year, or will they stick with the 900 series in 2015?
Every year, but not a new architecture every year, just a refresh of the same cards. So "new technology" cards every two years, in layman's terms.
 
It's not "necessary" at all, I'd give a new GPU 4-7 years until I change it, squeeze every last drop and when I'm done I'll use it for an HTPC :p just saying if you HAVE to get a new one 2 year cycles(respective of architectural changes) is the only one that seems to make SOME sense.

I have to agree. If I get the best GPU this year, then upgrading next year is totally unjustifiable in my country, especially considering the European pricing bullshit. I can barely get the new stuff I posted earlier. If my 6950 hadn't fried, I would still have used it until TW3, because it could run DA:I quite well on medium-high.

Hell, I still have my i5 2500K at stock and it is still a beastly CPU; when I really need to, I'll OC it.
 
High-end cards tend to be grey-market here, which means not much of a warranty. So the upgrade is either "when it stops working" or "when it isn't good enough any more".
 