Forums - CD PROJEKT RED

GTX 970 VRAM Segmentation Handled Properly in RED Engine?

Page 3 of 5
GuyNwah

Ex-moderator
#41
Jan 31, 2015
mouacyk said:
Would we miss out on any RED Engine features that leverage only NVidia hardware/software?
Can you elaborate on the screen size limiting TW2?
Click to expand...
In TW2, performance at any given feature level tracks the number of pixels on the screen very closely. The correlation between screen size (in pixels), frame rate, ROP count, and core clock is close enough that you can estimate a particular GPU's performance to within about 10%.
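As a back-of-the-envelope illustration of the kind of estimate described above (all numbers are hypothetical, and this assumes a purely fill-rate-bound workload, which is the correlation the post describes for TW2):

```python
def estimated_fps(rops, core_clock_hz, width, height, ops_per_pixel=100):
    """Crude fill-rate model: fps scales with ROPs * clock and
    inversely with the number of pixels on screen.
    ops_per_pixel is a made-up per-game workload constant."""
    fill_rate = rops * core_clock_hz          # theoretical pixels/second
    return fill_rate / (width * height * ops_per_pixel)

# A hypothetical 32-ROP GPU at 1 GHz, rendering at 1080p:
print(round(estimated_fps(32, 1.0e9, 1920, 1080), 1))  # -> 154.3
# Doubling the ROPs doubles the estimate, as the correlation suggests:
print(round(estimated_fps(64, 1.0e9, 1920, 1080), 1))  # -> 308.6
```

With real cards you would plug in measured ROP counts and clocks and fit `ops_per_pixel` per game, which is roughly how the ~10% estimate above would be obtained.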

Any PhysX effects (which don't include HairWorks) have to run on the CPU if your GPU is AMD, and they would have to be dialed back for CPU performance. Otherwise I don't expect them to be much different. They would have to tell us which effects use PhysX, and they haven't, so my guess is it will be mostly particle effects.
 
Scholdarr.452

Banned
#42
Jan 31, 2015
This discussion is so pointless. The 970 is basically the same card as before. All the initial benchmarks are still valid. Nothing has really changed. Ok, I can "only" really use 3.5GB of VRAM but that was already the case when the initial benchmarks were done. I guess most people (like myself) bought a 970 because it offered a great price/value ratio - and it still does no matter what.

My 970 offered me some great graphics in Shadow of Mordor, Dragon Age Inquisition and AC Unity, basically all on ultra/maximum settings. I don't see why TW3 should be any different here, given the fact that the nvidia drivers already addressed the proper allocation of resources. And of course they will continue to improve on that...
 
mouacyk

Senior user
#43
Jan 31, 2015
Scholdarr said:
This discussion is so pointless. The 970 is basically the same card as before. All the initial benchmarks are still valid. Nothing has really changed. Ok, I can "only" really use 3.5GB of VRAM but that was already the case when the initial benchmarks were done. I guess most people (like myself) bought a 970 because it offered a great price/value ratio - and it still does no matter what.

My 970 offered me some great graphics in Shadow of Mordor, Dragon Age Inquisition and AC Unity, basically all on ultra/maximum settings. I don't see why TW3 should be any different here, given the fact that the nvidia drivers already addressed the proper allocation of resources. And of course they will continue to improve on that...
Click to expand...
Thanks for sharing your experience with the 970 and that your needs are still met. I have no qualms with vanilla TW3 on a 970. The requirements are already out, and 2GB is recommended. Having said that, there are graphical enhancements that can be made outside of the vanilla game (in the form of mods) which will raise graphics requirements, especially VRAM.

On the driver's allocation of the memory and future improvements... stuttering and framerate drops using >3.5GB are odd effects of proper resource allocation. NVidia has tweeted and posted they will not be making a specific driver update for this card's memory management.
 
arkhenon

Rookie
#44
Jan 31, 2015
Actually nVidia devs stated on their forums that they "are" going to provide a driver for it. Of course it won't magically make the .5 GB portion faster. But they said that at least they can use that cache to load the driver overhead, so that the assets of the game will be stored in the main VRAM. Of course I have no idea as to how efficient this solution will be, but at least it's something.
 
GuyNwah

Ex-moderator
#45
Jan 31, 2015
arkhenon said:
Actually nVidia devs stated on their forums that they "are" going to provide a driver for it. Of course it won't magically make the .5 GB portion faster. But they said that at least they can use that cache to load the driver overhead, so that the assets of the game will be stored in the main VRAM. Of course I have no idea as to how efficient this solution will be, but at least it's something.
Click to expand...
Unfortunately, they later retracted that statement and said no, they would not. Widely reported, for example: http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-we-will-not-boost-geforce-gtx-970-performance-with-drivers/
 
Scholdarr.452

Banned
#46
Jan 31, 2015
mouacyk said:
Thanks for sharing your experience with the 970 and that your needs are still met. I have no qualms with vanilla TW3 on a 970. The requirements are already out, and 2GB is recommended. Having said that, there are graphical enhancements that can be made outside of the vanilla game (in the form of mods) which will raise graphics requirements, especially VRAM.
Click to expand...
In all honesty: 500MB of VRAM isn't a big deal-breaker, no matter what. I can almost guarantee that in a properly developed game you won't notice any difference at all between the game using 3.5GB or 4GB for graphics.

Take Shadow of Mordor for example: the difference between high and ultra textures is really a minor thing. Some people don't even notice any real difference at all. Nevertheless ultra textures need a lot more VRAM.


On the driver's allocation of the memory and future improvements... stuttering and framerate drops using >3.5GB are odd effects of proper resource allocation
Click to expand...
DAI and SoM don't have any framerate drops or stuttering at all. AC Unity has both in some situations, but in my experience there is no real connection to the VRAM usage. The drops occur "randomly" without any real rise in VRAM usage (above 3.5GB) on my PC...

NVidia has tweeted and posted they will not be making a specific driver update for this card's memory management.
Click to expand...
Actually nvidia has posted that they won't be making a specific driver update for the GTX 970 because it's already part of the existing drivers. nvidia said that the current driver is already programmed to limit GTX 970 cards to 3.5GB in games.

---------- Updated at 11:03 PM ----------

Guy N'wah said:
Unfortunately, they later retracted that statement and said no, they would not. Widely reported, for example: http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-we-will-not-boost-geforce-gtx-970-performance-with-drivers/
Click to expand...
German PC Games Hardware asked nvidia Germany directly about their strategy here and that was their answer:
On inquiry with Nvidia Germany we learnt that they will of course further optimise their drivers - "as it always was". The memory management was an optimisation like "anything else", but there won't be any special driver only for the GTX 970. A possible exception could be special cases found by "massive internal tests" or by the press which would require optimization.
Click to expand...
http://www.pcgameshardware.de/Geforce-GTX-970-Grafikkarte-259503/News/Treiber-Update-VRAM-3-5-GB-1149190/
 
arkhenon

Rookie
#47
Jan 31, 2015
Guy N'wah said:
Unfortunately, they later retracted that statement and said no, they would not. Widely reported, for example: http://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-we-will-not-boost-geforce-gtx-970-performance-with-drivers/
Click to expand...
Ah, I see. Thanks for the information :)
 
eskiMoe

Mentor
#48
Jan 31, 2015
arkhenon said:
But people who own one 970 should not worry about the VRAM in my opinion, as they are unlikely to be able to run the game on anything more than 1080p anyway (myself included). Of course, if we are talking about 970 SLI, now that's another - and completely valid - issue.
Click to expand...
Since a card with a max of 2GB of GDDR5 VRAM is enough for 1080p and high settings, I doubt VRAM will be an issue at higher resolutions. I suspect we need to wait until 2016 for 14/16nm cards to arrive before we can go crazy with the game's graphics settings and resolutions. The current generation of cards just doesn't seem to have enough juice. Unless, of course, you put a bunch of them in SLI.
 
Last edited: Jan 31, 2015
randyrhoads

Rookie
#49
Jan 31, 2015
So NV won't even mend their fuck-up?
 
Scholdarr.452

Banned
#50
Jan 31, 2015
randyrhoads said:
So NV won't even mend their fuck-up?
Click to expand...
There is nothing to "mend". It's simply the architecture of the GPU. It's hardware restricted.
 
mouacyk

Senior user
#51
Jan 31, 2015
randyrhoads said:
So NV won't even mend their fuck-up?
Click to expand...
I understand the frustration, but I don't want this thread to go off in a different direction. At this point, NVidia is kind of quiet and you can follow-up on the GeForce forums.

Scholdarr said:
There is nothing to "mend". It's simply the architecture of the GPU. It's hardware restricted.
Click to expand...
Well, through the close collaboration of engine developers and the NVidia driver developers (through better API or documentation), the user experience could still be salvaged. That is all I am hoping for, and that engine developers know and acknowledge the hardware restriction you mentioned. Otherwise, they could be banging their heads, delaying stuff, not understanding why unit-testing texturing causes unexpected stutter.
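As a sketch of how one might spot that kind of unexpected stutter in practice, here is a minimal frame-time analysis. The log values are made up, and `find_stutters` is a hypothetical helper, not anything from RED Engine's tooling or NVidia's drivers:

```python
from statistics import median

def find_stutters(frame_times_ms, window=30, spike_factor=3.0):
    """Flag frames whose time exceeds spike_factor x the local median
    of the preceding `window` frames."""
    spikes = []
    for i, t in enumerate(frame_times_ms):
        lo = max(0, i - window)
        local = frame_times_ms[lo:i] or [t]  # first frame compares to itself
        if t > spike_factor * median(local):
            spikes.append(i)
    return spikes

# A smooth 60 fps run (16.7 ms frames) with one 80 ms hitch at frame 50,
# the kind of spike reported when the 970 crosses the 3.5GB boundary:
times = [16.7] * 100
times[50] = 80.0
print(find_stutters(times))  # -> [50]
```

Average frame rate would barely move in this trace, which is why median-relative spike detection (or 99th-percentile frame times) catches the problem that a plain fps counter hides.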
 
red36

Forum regular
#52
Jan 31, 2015
Well, I hope I can get the game running on my 770 at 60 frames. In general these companies like to make a big difference between graphics cards, but in my experience it's mainly up to how well the game is optimized. When I went from a 280 to a 770, I didn't see a revolutionary difference in performance. Sure, there was a bump, but not like I expected. For instance, I'd still hit frame-lag pockets in Witcher 2 on the 770, whereas if I turned it down to medium settings on the 280 I'd more or less get pretty good performance.
 
GuyNwah

Ex-moderator
#53
Jan 31, 2015
Scholdarr said:
There is nothing to "mend". It's simply the architecture of the GPU. It's hardware restricted.
Click to expand...
It's not unlike the fuck-up that damaged the 660Ti's reputation, but this one is worse. (The 660Ti has a 192-bit bus driven by 6 memory controllers. 1.5GB is addressable through the 192-bit bus. But the last 512MB is visible to just 2 of the controllers, so it can't be interleaved and in effect has only a 64-bit bus.)
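The "64-bit in effect" arithmetic above can be checked directly. Peak GDDR5 bandwidth is just bus width times the effective data rate; stock clocks are assumed here (~6 Gbps for the 660 Ti, ~7 Gbps for the 970, whose last 512 MB is reached through a single 32-bit path):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak GDDR5 bandwidth: bus width in bytes * effective data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

# GTX 660 Ti: 192-bit bus at ~6 Gbps effective
print(bandwidth_gb_s(192, 6.0))  # 144.0 GB/s over the first 1.5 GB
print(bandwidth_gb_s(64, 6.0))   # 48.0 GB/s over the last 512 MB

# GTX 970: 256-bit bus at ~7 Gbps effective
print(bandwidth_gb_s(256, 7.0))  # 224.0 GB/s on paper
print(bandwidth_gb_s(32, 7.0))   # 28.0 GB/s for the slow 512 MB segment
```

So the 660 Ti's tail ran at a third of the card's headline bandwidth, while the 970's slow segment runs at an eighth of it, which is why this one is worse.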

nVidia seems to think this is an architecture that is acceptable or even desirable. The rest of the world isn't so sure.

---------- Updated at 03:20 PM ----------

red36 said:
Well, I hope I can get the game running on my 770 at 60 frames. In general these companies like to make a big difference between graphics cards, but in my experience it's mainly up to how well the game is optimized. When I went from a 280 to a 770, I didn't see a revolutionary difference in performance. Sure, there was a bump, but not like I expected. For instance, I'd still hit frame-lag pockets in Witcher 2 on the 770, whereas if I turned it down to medium settings on the 280 I'd more or less get pretty good performance.
Click to expand...
Witcher 2 is a DX9 game. The GTX 2xx cards can spool up to 100% and run it very well indeed. Later GPUs don't have an advantage running this older technology.
 
red36

Forum regular
#54
Jan 31, 2015
Yeah, just as an addendum: I meant that Nvidia or tech websites, not the game makers themselves, make a big deal out of the x70-to-y70 jumps or the ATI equivalent.
 
jediknight16

Senior user
#55
Jan 31, 2015
The game uses Nvidia tech and will surely be optimized for Nvidia and AMD cards. It should run great on a 970 even if you may not be able to put everything on Ultra settings. The 900 series is only here because Nvidia couldn't do 16nm GPUs; it doesn't bring a lot of major improvements compared to the previous family.
I will just wait until they release new graphics cards on a 16nm architecture.
 
mouacyk

Senior user
#56
Feb 1, 2015
Guy N'wah said:
It's not unlike the fuck-up that damaged the 660Ti's reputation, but this one is worse. (The 660Ti has a 192-bit bus driven by 6 memory controllers. 1.5GB is addressable through the 192-bit bus. But the last 512MB is visible to just 2 of the controllers, so it can't be interleaved and in effect has only a 64-bit bus.)

nVidia seems to think this is an architecture that is acceptable or even desirable. The rest of the world isn't so sure.
Click to expand...
As a consumer who has been burned once with the GTX 660, I surely do not accept or desire this architecture when there are alternatives. I just don't want to be burned again.
 
onionshavelayers

Rookie
#57
Feb 1, 2015
I'll wait for 16nm nvidia cards to come out, then I'll ask for my refund hehehehe. I hope Nvidia will allow me to do that :)
 
Scholdarr.452

Banned
#58
Feb 1, 2015
Guy N'wah said:
nVidia seems to think this is an architecture that is acceptable or even desirable. The rest of the world isn't so sure.
Click to expand...
Well, it depends on how you look at the issue. Was it a good thing to not be completely honest and transparent about it at release? Hell, of course not! Is it a desirable or "well done" architecture from a purely technical perspective? Hell, of course not!

But one question remains: is this architecture acceptable if you can have great performance for a relatively small price (especially compared to nvidia's usual pricing policies)? Is it acceptable if you can get a GTX 970 that offers at least 80% of the performance of the "full" GTX 980 for less than 60% of the price? I'd say yes, and that's the reason why I'll probably stick with my GTX 970. Even with "only" 3.5GB of fully usable VRAM, the GTX 970 is still an awesome deal.
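That value argument is simple to put into numbers. The 80%/60% figures come from the post; the euro prices below are illustrative placeholders, not actual quotes:

```python
def perf_per_euro(relative_perf, price_eur):
    """Performance per euro, with the GTX 980 as the 1.0 baseline."""
    return relative_perf / price_eur

v980 = perf_per_euro(1.00, 600)   # hypothetical GTX 980 price
v970 = perf_per_euro(0.80, 340)   # ~80% of the perf for <60% of the price
print(round(v970 / v980, 2))  # -> 1.41: ~41% more performance per euro
```

Under those assumptions the 970 beats the 980 on perf-per-euro by a wide margin, which is the trade-off being defended here despite the segmented VRAM.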
 
MkTama

Rookie
#59
Feb 1, 2015
caruga said:
Are they pricey? If I compare them to what Titans cost, or even the 780 Ti until recently, then they don't look pricey at all. They're top of the market, but in the context of what we were previously getting they still look like good bang for the buck to me...
Click to expand...
Probably my English isn't good enough to express myself clearly, but I'll try nonetheless. As I see it, they are not that much less pricey; if Nvidia priced them lower than before, it is not because they're doing us a favor but because their relative value has decreased. Leaving aside the fact that 600€ (980s cost no less here) is a hefty amount of money per se, this is a critical GPU generation because there are no great gains over the last one and because it's a transition: the consoles are only now being pushed more and more, and like it or not, they lead the market.

In addition to that, we're waiting for some great changes: Windows 10 approaching, DirectX 12, Skylake CPUs approaching, new Pascal GPUs which are probably a greater innovation than Skylake itself, and SSDs clearly becoming mainstream with a big drop in price. All these things make me think a little revolution is coming, and therefore these Maxwell GPUs will become old soon. 600€ for a GPU which, IMHO, will become old no later than its sister the 970 (because, leaving aside the memory non-issue, their performance isn't so distant), well, that is a bad deal for me. If you have unlimited money there's no problem upgrading every year, but if you want a good deal, I think this is not the right moment, that's all. When it was released, the Titan was a better deal than this 980, even with its much higher price which not everyone was willing to pay. A Titan still has something to say, and it wasn't a gaming-only GPU; it was much more versatile, even if the great majority bought it just for playing.

In addition to that, the 980 is the top GPU, and if you buy the top, you're always overspending on exclusivity. The best bang for the buck today is the 970; it may just become old quickly, but the recent VRAM drama has nothing to do with that. What's coming does.
 
Last edited: Feb 1, 2015
caruga

Rookie
#60
Feb 1, 2015
MkTama said:
Probably my English isn't good enough to express myself clearly, but I'll try nonetheless. As I see it, they are not that much less pricey; if Nvidia priced them lower than before, it is not because they're doing us a favor but because their relative value has decreased. Leaving aside the fact that 600€ (980s cost no less here) is a hefty amount of money per se, this is a critical GPU generation because there are no great gains over the last one and because it's a transition: the consoles are only now being pushed more and more, and like it or not, they lead the market.

In addition to that, we're waiting for some great changes: Windows 10 approaching, DirectX 12, Skylake CPUs approaching, new Pascal GPUs which are probably a greater innovation than Skylake itself, and SSDs clearly becoming mainstream with a big drop in price. All these things make me think a little revolution is coming, and therefore these Maxwell GPUs will become old soon. 600€ for a GPU which, IMHO, will become old no later than its sister the 970 (because, leaving aside the memory non-issue, their performance isn't so distant), well, that is a bad deal for me. If you have unlimited money there's no problem upgrading every year, but if you want a good deal, I think this is not the right moment, that's all. When it was released, the Titan was a better deal than this 980, even with its much higher price which not everyone was willing to pay. A Titan still has something to say, and it wasn't a gaming-only GPU; it was much more versatile, even if the great majority bought it just for playing.

In addition to that, the 980 is the top GPU, and if you buy the top, you're always overspending on exclusivity. The best bang for the buck today is the 970; it may just become old quickly, but the recent VRAM drama has nothing to do with that. What's coming does.
Click to expand...
Ok, so in the context of them becoming obsolete they aren't good value for money. But is it ever untrue that there's something better coming on the horizon? By your logic, shouldn't Titans have been a bad deal in 2013 because Maxwell was coming the next year?
*EDIT: Ok, I see what you're saying: you think there will be a revolution in performance, not just a small increase. Time will tell, I suppose.
 
Last edited: Feb 1, 2015