Forums - CD PROJEKT RED
Building a gaming PC

Page 26 of 154
DuranA.111

Rookie
#501
Jan 27, 2015
> The 900 series is not defective; that's FUD. The 970 is the only card subject to the problem. The 980 performs all the way up to the full 4GB VRAM. Nothing wrong with the 980.
It's not just that, though; there's also Nvidia's promise that Maxwell would be 20nm. So much for that.

> Even though EVGA has long been nVidia's flagship manufacturer, and they still have the best warranties and customer support going, other manufacturers have better cooling implementations that run quieter and cooler. Gigabyte G1 and MSI 4G models are the best of these. ASUS, Palit, and Zotac are not far behind, though I really dislike Zotac support. With any high-end card, you should make sure it will fit your case; the Gigabyte is 312mm long.
Which would you go for between Gigabyte and MSI?
 
GuyNwah

Ex-moderator
#502
Jan 27, 2015
Between Gigabyte and MSI, I'd go for Gigabyte, unless space or quiet operation were a concern. Gigabyte's 3-fan cooler is a marvel, but it is also very long and far from silent. MSI is not far behind, though, and ASUS is not far behind MSI. EVGA got left at the gate on the 900 series.

nVidia got too optimistic on 20nm and got screwed when Apple and Qualcomm got all the 20nm capacity. We may not see 20nm consumer GPUs from them for years, or ever. They're going to 16nm with Pascal, but Pascal is for their monster render appliances.
 
DuranA.111

Rookie
#503
Jan 27, 2015
> Between Gigabyte and MSI, I'd go for Gigabyte, unless space or quiet operation were a concern.
I'd take that to mean MSI is quieter while Gigabyte's got better performance? What about thermal performance?

> nVidia got too optimistic on 20nm and got screwed when Apple and Qualcomm got all the 20nm capacity.
There's that, and then there's them delaying the GTX 980 Ti till 2016.
 
GuyNwah

Ex-moderator
#504
Jan 27, 2015
DuranA said:
> I'd take that to mean MSI is quieter while Gigabyte's got better performance? What about thermal performance?
>
> There's that, and then there's them delaying the GTX 980 Ti till 2016.
Thermals are better on the Gigabyte G1. Overclocking performance always depends on the individual GPU; you're as likely to get a good overclocker with one as the other.
 
eskiMoe

Mentor
#505
Jan 27, 2015
http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation



> The error, as NVIDIA explains it, is that in creating the GTX 970 reviewer’s guide, the technical marketing team was unaware of Maxwell’s aforementioned and new “partial disable” capabilities when they filled out the GTX 970 specification table. They were aware that the GTX 970 would have the full 256-bit memory bus, and unaware of the ability to independently disable ROPs they assumed that all 64 ROPs and the full 2MB of L2 cache was similarly available and wrote the specification table accordingly
So much bullshit it hurts my head.

But this will come to bite their asses at some point, most likely very soon.

 
Last edited: Jan 27, 2015
tahirahmed

Rookie
#506
Jan 27, 2015
Another explanation

http://videocardz.com/54774/nvidia-updates-geforce-gtx-970-specifications-to-56-rops-after-reports-of-3-5gb-issue

The key bits:

1) The 0.5 GB memory pool operates at 1/7th the speed of the main 3.5 GB pool.
2) The 3.5 GB pool operates on a 224-bit interface.

This could have been Nvidia's most successful GPU launch ever, but they messed it up.
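As a quick sanity check on that 1/7th figure (assuming the GTX 970's stock 7 Gbps effective GDDR5 data rate, which the posts above don't state; the pool bus widths are from the linked articles):

```python
# Peak-bandwidth arithmetic for the GTX 970's split memory pools.
# Assumption: stock 7 Gbps effective GDDR5 data rate per pin.

DATA_RATE_GBPS = 7  # effective GDDR5 data rate per pin

def pool_bandwidth_gbs(bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a memory pool of the given bus width."""
    return bus_width_bits * DATA_RATE_GBPS / 8  # bits -> bytes

fast = pool_bandwidth_gbs(224)  # 3.5 GB pool
slow = pool_bandwidth_gbs(32)   # 0.5 GB pool

print(fast)         # 196.0 (GB/s)
print(slow)         # 28.0 (GB/s)
print(fast / slow)  # 7.0 -- the slow pool runs at 1/7th the fast pool's speed
```

Note the slow pool's 32-bit width is just what's left over: 256 minus 224.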
 
GuyNwah

Ex-moderator
#507
Jan 27, 2015
^ That is a blunder on the order of Intel claiming, long ago, that the FDIV bug didn't matter. No, it's not a lie; nobody meant to deceive anybody. Somebody got or guessed at the wrong number, and it didn't get caught by proofreaders. It may even have been an engineering specification that driver writers worked against. It's a breakdown in communication within the company that will not be a career-limiting mistake, given the out-of-control demand for engineers in Silicon Valley.

56 ROPs, not 64. And because in nVidia's architecture the memory bus is tied to the ROPs, it perfectly matches the observed inability to address one-eighth of the memory.
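Those corrected numbers are self-consistent, as a quick check shows (the 64 ROPs and 4 GB figures are from the original spec sheet, the 56 ROPs from the correction):

```python
# The fraction of disabled ROPs matches the fraction of VRAM that the
# GTX 970 cannot address at full speed: one-eighth in both cases.
from fractions import Fraction

ROPS_TOTAL, ROPS_ACTIVE = 64, 56        # original spec vs. corrected spec
VRAM_TOTAL_GB, VRAM_FAST_GB = 4.0, 3.5  # advertised vs. full-speed VRAM

disabled_rops = Fraction(ROPS_TOTAL - ROPS_ACTIVE, ROPS_TOTAL)
slow_vram = Fraction(VRAM_TOTAL_GB - VRAM_FAST_GB) / Fraction(VRAM_TOTAL_GB)

print(disabled_rops)               # 1/8
print(disabled_rops == slow_vram)  # True
```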
 
atc.710

Rookie
#508
Jan 27, 2015
All this GTX 970 castration mustard race is making my balls shrivel...

let's lighten the mood, kay?



vehemently hilarious and topical :D
 
tahirahmed

Rookie
#509
Jan 27, 2015
^ lmao :D
 
tahirahmed

Rookie
#510
Jan 28, 2015
Btw there is a respectable rep named Peter on the Nvidia forum who is now genuinely trying to help customers get refunds etc. on the 970. If you're no longer satisfied with this GPU and your return request is being denied by the individual seller, you can PM him the details and he'll talk to them on your behalf. I think Nvidia is now in full damage control mode, just like AnandTech said.

https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090

@Guy N'wah

Hey man, can you advise me on how sensible it would be to get into a dual-card setup now? I'm asking about a CrossFire setup. I have no experience with dual cards before, so I'm not sure what to expect from it overall. I have a Sapphire R9 290 Tri-X, and since they're real cheap these days I was thinking about getting another one before TW3.

Asking online has been a mixed bag so far: some say it's great, while others say it's not worth it and sticking with a single card is the best solution. I've also heard that the new XDMA CrossFire is much better than before, greatly reducing frame pacing issues and giving better scaling overall.

What's your advice on all this: should I wait for the R9 300 series to appear, or just get another R9 290 for a CrossFire setup?
 
eskiMoe

Mentor
#511
Jan 28, 2015
tahirahmed said:
> Btw there is a respectable rep named Peter on the Nvidia forum who is now genuinely trying to help customers get refunds etc. on the 970. If you're no longer satisfied with this GPU and your return request is being denied by the individual seller, you can PM him the details and he'll talk to them on your behalf. I think Nvidia is now in full damage control mode, just like AnandTech said.
>
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
I'm still waiting to hear back from the retailer I bought the cards from, but if that fails, I'll contact him.

Also, regarding crossfire:

https://www.youtube.com/watch?v=pGN1na3F5do

Generally speaking, I've heard that SLI works better than crossfire in most cases. That wouldn't surprise me, since Nvidia did pioneer that technology.
 
yayodeanno.831

Forum veteran
#512
Jan 28, 2015
This is probably redundant, but I think it's the original article everybody else is quoting:
Nvidia: the GeForce GTX 970 works exactly as intended
 
tahirahmed

Rookie
#513
Jan 28, 2015
eskimoe said:
> I'm still waiting to hear back from the retailer I bought the cards from, but if that fails, I'll contact him.
>
> Also, regarding crossfire:
>
> https://www.youtube.com/watch?v=pGN1na3F5do
>
> Generally speaking, I've heard that SLI works better than crossfire in most cases. That wouldn't surprise me, since Nvidia did pioneer that technology.
Thanks for the reply. Yes, I saw that video before, but so far everything I've heard about XDMA CrossFire is good; some say it's the right step to put CrossFire on par with SLI. I know SLI has wider support in games and generally more consistent performance, but since I have an MSI Z97 G45 mobo with 3 PCI-E slots supporting either SLI or CF, I feel I'm wasting that mobo with only one GPU.

For now I game only at 1080p, and for that my single R9 290 gives a consistent 60 fps most of the time, though I see 45-50 fps in demanding games like DAI, especially with AA. I'm thinking of going to 1440p, and at that resolution I want 60 fps at the highest possible settings in games, with some room for experimenting with AA.
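For rough context on that 1440p jump, a back-of-the-envelope sketch (naively assuming GPU-bound frame rates scale inversely with pixel count, which in practice is a bit pessimistic):

```python
# Pixel-count ratio of 2560x1440 vs. 1920x1080, and a naive projection of
# an observed 60 fps at 1080p if performance scaled purely with pixels.
PIXELS_1080P = 1920 * 1080
PIXELS_1440P = 2560 * 1440

ratio = PIXELS_1440P / PIXELS_1080P
print(round(ratio, 2))    # 1.78 -- about 78% more pixels to render

print(round(60 / ratio))  # 34 -- naive worst-case fps from a 60 fps baseline
```

Real scaling is usually better than this, since vertex and CPU work don't grow with resolution; still, it shows why a second card (or a faster one) is the usual prescription for 1440p at 60 fps.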
 
Last edited: Jan 28, 2015
GuyNwah

Ex-moderator
#514
Jan 28, 2015
tahirahmed said:
> Btw there is a respectable rep named Peter on the Nvidia forum who is now genuinely trying to help customers get refunds etc. on the 970. If you're no longer satisfied with this GPU and your return request is being denied by the individual seller, you can PM him the details and he'll talk to them on your behalf. I think Nvidia is now in full damage control mode, just like AnandTech said.
>
> https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/post/4438090/#4438090
>
> @Guy N'wah
>
> Hey man, can you advise me on how sensible it would be to get into a dual-card setup now? I'm asking about a CrossFire setup. I have no experience with dual cards before, so I'm not sure what to expect from it overall. I have a Sapphire R9 290 Tri-X, and since they're real cheap these days I was thinking about getting another one before TW3.
>
> Asking online has been a mixed bag so far: some say it's great, while others say it's not worth it and sticking with a single card is the best solution. I've also heard that the new XDMA CrossFire is much better than before, greatly reducing frame pacing issues and giving better scaling overall.
>
> What's your advice on all this: should I wait for the R9 300 series to appear, or just get another R9 290 for a CrossFire setup?
If you had asked me a year ago, I would have said wait for a sufficiently powerful R9 300. But 290s in Crossfire run quite well. The big question is whether you intend to play DX9 titles, because AMD's frame pacing fix does not work in DX9. If all the demanding games you play are in DX10 or DX11, you're fine.
 
tahirahmed

Rookie
#515
Jan 28, 2015
Guy N'wah said:
> If you had asked me a year ago, I would have said wait for a sufficiently powerful R9 300. But 290s in Crossfire run quite well. The big question is whether you intend to play DX9 titles, because AMD's frame pacing fix does not work in DX9. If all the demanding games you play are in DX10 or DX11, you're fine.
Thanks man. Well, I have very few DX9 games left to play, and from here on all the games we get will be DX11 or newer, so I don't think I'll go back to DX9. Even if I do, a single R9 290 will be enough for them (in case I have to turn off CF).

Btw, what's your take on DX12? I heard that for the full DX12 feature set we'll need new GPUs, while AMD said DX12 will be supported on their GCN GPUs. So is it still okay to go for an R9 290 even if DX12 is coming soon? The performance improvements will still apply to recent GPUs, right?
 
sidspyker

Ex-moderator
#516
Jan 28, 2015
Feature levels. DirectX 12 has feature levels, and the spec is still not finished, so whatever GPU claims to be DX12 only has partial support.

That said, they've mentioned that this isn't an issue for the main highlight of DX12, which is better performance through proper CPU utilization, more draw calls, and so on.
 
Toddster.255

Rookie
#517
Jan 29, 2015
I literally just purchased a GTX 970 G1 a week before this memory issue was discovered. I think I'll keep it anyway; I can't justify an extra $250 for the GTX 980, and I might get a Pascal card later next year anyway. So far I have been very happy with the performance, and while I don't think NVIDIA should have lied about the specs, I'll still buy their products in the future.
 
sidspyker

Ex-moderator
#518
Jan 29, 2015
Yeah, it's still the same card when it comes to performance, benchmarks, TDP, etc. I still would've bought it if it were 3.5GB only, and I can't think of anything else in that price range offering the same performance and features. On the (somewhat) plus side, they said they're working on a driver that will help with how the 512MB allocation is handled. Here's hoping.

That said
http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

> Concluding
>
> Our product reviews in the past few months and its conclusion are not any different opposed to everything that has happened in the past few days, the product still performs similar to what we have shown you as hey .. it is in fact the same product. The clusterfuck that Nvidia dropped here is simple, they have not informed the media or their customers about the memory partitioning and the challenges they face. Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB mostly are poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble free, however we have a hard time detecting and replicating the stuttering issues some people have mentioned.
>
> The Bottom line
>
> Utilizing graphics memory after 3.5 GB can result into performance issues as the card needs to manage some really weird stuff in memory, it's nearly load-balancing. But fact remains it seems to be handling that well, it’s hard to detect and replicate oddities. If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980. However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it. Until further notice our recommendation on the GeForce GTX 970 stands as it was, for the money it is an excellent performer. But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer.
 
SkycladGuardian

Forum veteran
#519
Jan 29, 2015
Since a Google search gave me no results, I'm hoping somebody here knows whether Nvidia's Gameworks features like Hairworks etc. will run on a dedicated GPU, as is possible with PhysX.
 
GuyNwah

Ex-moderator
#520
Jan 29, 2015
SkycladGuardian said:
> Since a Google search gave me no results, I'm hoping somebody here knows whether Nvidia's Gameworks features like Hairworks etc. will run on a dedicated GPU, as is possible with PhysX.
AFAIK, only PhysX-based features can be offloaded to a dedicated GPU. Hairworks is DirectCompute, and there is no provision for a dedicated GPU in DirectCompute.
 