GTX 970 surfaced

Agreed, plus the 970 and 980 are very power efficient, so there is a fair chance of actually using less power than you are with that 770 (which is still a good card btw).
Yeah, in fact I believe (thinking about the price/performance factor) there's no need to upgrade every year. If you own a good card (770, 780, 780 Ti; I prefer Nvidia but the same applies to the AMD counterparts), upgrading every two years keeps you at a very high level with very few compromises. Even SLI is unnecessary in my opinion; it often brings even more problems with drivers, games and whatever else.. but in the end it always depends on what you want, bearing in mind that chasing the highest performance achievable = chasing the market = killing your wallet. Everyone needs to find that personal "sweet spot" which, if exceeded, leads to excessive expense for what you get. Mine is usually a little behind the top, because top = trend = more money for nothing :)

Zotac cards are often overpriced for me; I don't know if they are actually Palit (I may have read that somewhere, but I'm not sure).. most people think the best manufacturers are still EVGA, Asus, Gigabyte and MSI.. I own a Gainward 560 Ti with the Phantom cooler and it hasn't given me any problems in the last 3 years.. Yeah, it has served me well for the money, but now that I want some DX11 features it's beggin' for a replacement ;) I think I'll wait till early-to-mid 2016 for an entirely new rig
 
I agree about the "not every year"; I try to get two years out of a card. My current 670 is a Palit, and it's been very good (and also less prone to getting clogged up with cat-hair than ANY card I've had in the past).

From what I've read, I think Zotac is connected to Sapphire. And yes, apparently more common in Asia, which I guess is why it was the first to have pricing info here. Still unofficial, though.
 
Two years? Talk about not getting the proper 'mileage' from your equipment... a good high-end GPU can easily last almost (stress on ALMOST) an entire console generation, with the later years shifting toward high-to-mid compromise settings.

And why would you get a new GPU every year? Taking Nvidia as an example: Year 1 they release a new architecture, Year 2 the same architecture gets a refresh with minor changes, Year 3 a new architecture.
 
Zotac and Palit are different companies. Zotac is a medium-size (6,000 employees) manufacturer of a full line of computer components, based in Macau with a number of factories in mainland China. @Dragonbird Yes, they also own Sapphire.

Palit, located in Hong Kong, is much smaller and a specialist in enthusiast graphics cards. They also own Gainward.

Considering the graphics card business alone, though, Palit/Gainward is the largest supplier, and Zotac/Sapphire is second.
 
All of you talking about upgrading every 2 years is making me feel bad. My GTX 260 really needs replacing.
 
Because higher power consumption means you pay for that power; it shows up in your electricity bill. Lower power consumption also means lower cooling requirements and less noise. All of those matter. I do care about performance, but I'm not interested in having a jet turbine in my room.

It doesn't impact bills in any meaningful way unless you're a pro gamer who plays 10h a day or you're mining bitcoins. I play ~20h a week, the power my GPU uses while gaming costs me a little over $1 per month, and I own a card with a very high TDP (and I can barely hear it). So what can I save? 20 cents per month on energy?

Previous GeForce cards already had good power consumption and noise levels, and I doubt anyone will say "I need to upgrade - I'll save a few cents per month and my card will be 1 dB quieter."
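For what it's worth, a rough sketch of the arithmetic behind a figure like that (the average draw and electricity rate below are assumptions, not numbers from this thread):

```python
# Back-of-the-envelope: monthly cost of GPU power while gaming.
# Assumed: ~20 h/week of play, ~150 W average draw while gaming
# (cards rarely sit at full TDP), electricity at ~$0.10/kWh.
hours_per_month = 20 * 52 / 12      # ~86.7 h
avg_draw_kw = 0.150                 # 150 W expressed in kW (assumption)
price_per_kwh = 0.10                # USD (assumption)

monthly_cost = hours_per_month * avg_draw_kw * price_per_kwh
print(f"~${monthly_cost:.2f} per month")   # ~$1.30
```

Scale the draw and rate to your own card and tariff; at typical desktop gaming hours the result stays in the low single digits of dollars per month.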
 
It doesn't impact bills in any meaningful way unless you're a pro gamer who plays 10h a day or you're mining bitcoins. I play ~20h a week, the power my GPU uses while gaming costs me a little over $1 per month, and I own a card with a very high TDP (and I can barely hear it). So what can I save? 20 cents per month on energy?

Previous GeForce cards already had good power consumption and noise levels, and I doubt anyone will say "I need to upgrade - I'll save a few cents per month and my card will be 1 dB quieter."

It is nice not to need a kilowatt PSU though, especially since those are expensive.
 
Previous GeForce cards already had good power consumption and noise levels, and I doubt anyone will say "I need to upgrade - I'll save a few cents per month and my card will be 1 dB quieter."
Firstly, TDP does not equate to just electricity consumption; it's the amount of heat the card will produce.

And what's there to complain about? The one benefit of the accidental two extra years on the same node is that engineers have had time to squeeze more efficiency out of it, which will only pay off in the future. They didn't exactly have other options on a node that has overstayed its welcome, and now they have an efficient design. Going from 28nm to 20nm should bring a 40%+ TDP reduction by itself, which gives more room for performance; add an already efficient architecture that is being built upon and there is even more room for performance.
 
@eskimoe
I would personally wait for the 980 Ti. Looking at the Nvidia/AMD competition in the past, my prediction is that AMD is going to release a single card that is slightly faster than the 980, then Nvidia is going to release a 980 Ti which is slightly faster than AMD's card, then AMD is going to tape two of their cards together and say it's the fastest, then Nvidia tapes two 980s together, and finally AMD reduces the prices of their cards. The name may not be 980 Ti, but I'm sure a 980 that is properly pushed to its limits is coming before long, maybe in 3-6 months.

The problem with those cards is that they are extremely expensive and offer horrible bang for buck. You will end up with the fastest single GPU, but it will also be very cost-ineffective, and you pay heavily for having the fastest out there. The comparison between a 970 and a 980 shows that ~15% of extra performance costs you an extra $220 or so, and I doubt a Ti variant will be any more rewarding than that.
What 501105 said. A GTX 970 SLI combo just offers you shitloads more performance vs a single GTX 980 and only costs slightly more. If the 980 isn't getting a price cut (which it won't, knowing Nvidia) I will more than likely opt for the 970 SLI setup.
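Rough numbers behind that comparison, assuming US launch prices of about $329 for the 970 and $549 for the 980 (the performance ratios are ballpark assumptions, not benchmarks from this thread):

```python
# Ballpark price/performance at launch MSRPs.
price_970, price_980 = 329, 549          # USD launch prices

perf_980_vs_970 = 1.15                   # ~15% faster, as claimed above
perf_970_sli_vs_970 = 1.70               # assumed SLI scaling in games that scale well

print(f"980 vs 970: +{(price_980 / price_970 - 1) * 100:.0f}% price "
      f"for +{(perf_980_vs_970 - 1) * 100:.0f}% performance")
print(f"970 SLI vs one 980: +{(2 * price_970 / price_980 - 1) * 100:.0f}% price "
      f"for roughly +{(perf_970_sli_vs_970 / perf_980_vs_970 - 1) * 100:.0f}% performance")
```

That is the whole argument in two lines: the 980 charges a large premium per frame, while two 970s cost about 20% more than one 980 and, in titles where SLI scales, deliver far more.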
 
And what's there to complain about? The one benefit of the accidental two extra years on the same node is that engineers have had time to squeeze more efficiency out of it, which will only pay off in the future. They didn't exactly have other options on a node that has overstayed its welcome, and now they have an efficient design. Going from 28nm to 20nm should bring a 40%+ TDP reduction by itself, which gives more room for performance; add an already efficient architecture that is being built upon and there is even more room for performance.

I'm not complaining about lower TDP; it's obviously an advantage. It's just quite disappointing to see little to no improvement in performance. I still believe they could have made a GPU with a worse TDP but better performance. It's a pity they didn't, because I'd hoped to see the next step in terms of GPU power.

Firstly, TDP does not equate to just electricity consumption; it's the amount of heat the card will produce.

Yes, but usually by lowering TDP you also improve power efficiency, because less energy is wasted as heat, and that's true in this case - the 980 has lower power consumption than the 780 and 780 Ti.
 
As expected, the performance leap is quite minimal considering it's a new generation. The low power consumption is very nice, but overclocked to their full potential, the 980 and 780 Ti are pretty much neck and neck in frame-pushing power (look at Linus' video review). However, that aggressive pricing on the 970 is very pleasing to see. I'll get one of those for my new rig (or wait for some 780 Tis to drop in price) and wait for Nvidia to release their 20nm Maxwell before really splashing my money on a GPU. Based on how these new Maxwells are performing, I'm expecting the 980 Ti or 990 (or whatever it'll be called) to be a real beast.
 
I'm not complaining about lower TDP; it's obviously an advantage. It's just quite disappointing to see little to no improvement in performance. I still believe they could have made a GPU with a worse TDP but better performance. It's a pity they didn't, because I'd hoped to see the next step in terms of GPU power.
I didn't mean YOU were complaining, it was a general statement. Yes, sadly, like I said, we aren't going to see any more worthwhile performance improvements on this node; we need 20nm and/or 16nm FinFET fast. I think this is the first time GPUs are actually behind CPUs in process technology, although GPUs are still really fast.


Yes, but usually by lowering TDP you also improve power efficiency, because less energy is wasted as heat, and that's true in this case - the 980 has lower power consumption than the 780 and 780 Ti.
Yeah, just saying it's not JUST electricity being saved but also less heat produced, which in turn gives room for more performance and overclockability.
 
I'm getting a GTX 970. The 980 is far too expensive for what it offers... Can't spot a DirectCU II version though...
 
Ok, this is the card I'm getting.

http://www.pcgarage.ro/placi-video/asus/geforce-gtx-970-strix-oc-4gb-ddr5-256-bit/

Would this PSU be good enough for it? http://www.pcgarage.ro/surse/corsair/builder-series-cx600m-cp-9020060/ I am also planning to OC my CPU and GPU in the future, when it becomes necessary. Is it enough?

Also, forgive my incredibly noobish question, but can overclocking a component just... fry it on the spot, even if I do everything by the book? I am a bit apprehensive about OC-ing my components because one day my non-overclocked 6950 just fried, but I can only upgrade every 3-4 years, so I really need to get the best out of them. Is it really as dangerous and risky as I think it is?
 
I am not sure what to make of these strange statements about TDP, electricity consumption, and heat.

All of TDP is electricity that is consumed and converted to heat when the GPU is running at full power.

Higher TDP means more electricity drawn from the power supply and more heat that must be dissipated. And now I have to shout: HEAT IS FUCKING EXPENSIVE! If you are in the position of having to pay for the electricity that is converted to that heat, and for the electricity you spend getting rid of it, a 100 W power saving pays for the whole card over its useful life.

A fractional performance improvement, $100-200 reduction in purchase price, and 100 watts less power in operation is not just a bargain but a damn good one.
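To make that concrete, here is the kind of heavy-use scenario the claim implies (the hours, rate and cooling overhead below are assumptions, not figures from this thread):

```python
# Lifetime cost of 100 W of extra draw under a heavy-use scenario.
# Assumed: card busy ~12 h/day, electricity at $0.15/kWh, and air
# conditioning adding ~40% on top to pump the heat back out of the room.
extra_draw_kw = 0.100
hours_per_year = 12 * 365            # ~4380 h (assumption)
price_per_kwh = 0.15                 # assumption
cooling_overhead = 0.40              # extra A/C energy, assumption
years = 3

kwh = extra_draw_kw * hours_per_year * years
cost = kwh * price_per_kwh * (1 + cooling_overhead)
print(f"~${cost:.0f} over {years} years")   # ~$276
```

Under the ~20 hours a week mentioned earlier, the same 100 W saving comes to roughly a dollar a month, which is why the two posts above reach opposite conclusions: the figure is driven entirely by how many hours the card spends under load and whether you also pay to cool the room.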

@Cormacolindor Yes, it is possible to "let the magic smoke out" even in conservative overclocking. If you are careful, it is a small risk, but it is not zero.

There was a whole lot of trouble three years ago with overclocked nVidia cards, even factory overclocked models. In order to improve stability in DirectX 11, nVidia released drivers that boosted the supply voltage on the cards. This was done without taking into account the overvoltage that was already programmed into factory and end-user overclocked cards. The driver update actually did destroy some cards.

The Corsair CX500 and CX600 are a different platform from the popular CX430. I like the CX430, but it's just enough for a basic computer with a 150 watt or so GPU, not a foundation for overclocking. I don't know the CX500 and CX600.
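For a rough sanity check on the CX600M question, here is a simple load estimate (the CPU and "rest of system" figures and the overclocking margins are assumptions; the 145 W number is the GTX 970's reference TDP):

```python
# Very rough system power budget for a GTX 970 build with OC headroom.
gpu_w = 145            # GTX 970 reference TDP
gpu_oc_margin = 40     # extra draw from a typical GPU overclock (assumption)
cpu_w = 90             # mid-range quad-core under load (assumption)
cpu_oc_margin = 40     # extra draw from a CPU overclock (assumption)
rest_w = 60            # board, RAM, drives, fans (assumption)

load = gpu_w + gpu_oc_margin + cpu_w + cpu_oc_margin + rest_w
print(f"Estimated load: ~{load} W")                       # ~375 W
print(f"Utilisation of a 600 W unit: ~{load / 600:.0%}")  # ~62%
```

On paper a decent 600 W unit has plenty of headroom for that build; the caveat above is about the quality of the specific platform, not the wattage.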
 