Forums - CD PROJEKT RED
Predicted Witcher 3 system specs? Can I run it?

Status: Not open for further replies.

Page 97 of 134
GuyNwah

Ex-moderator
#1,921
Sep 27, 2014
theLaughingStorm said:
@Guy N'wah, do you think it is currently a good idea to SLI 780 Tis if you already have one, or no?
With a good CPU and motherboard and a tolerance for the extra hassles of running SLI, yes, and at that point you're close to the limit of what consumer motherboards can do. ("Good" = Core i5/i7 Sandy Bridge, AMD FX-8xxx, and up.)
 
samplerico

Rookie
#1,922
Sep 28, 2014
Guy N'wah said:
With a good CPU and motherboard and a tolerance for the extra hassles of running SLI, yes, and at that point you're close to the limit of what consumer motherboards can do. ("Good" = Core i5/i7 Sandy Bridge, AMD FX-8xxx, and up.)
Click to expand...
And the 3 GB of VRAM? Will it be a problem in the future or not? Will the horsepower of two 780s in SLI outstrip the 3 GB of VRAM?

Sorry for my English...
 
theLaughingStorm.108

Rookie
#1,923
Sep 28, 2014
Guy N'wah said:
With a good CPU and motherboard and a tolerance for the extra hassles of running SLI, yes, and at that point you're close to the limit of what consumer motherboards can do. ("Good" = Core i5/i7 Sandy Bridge, AMD FX-8xxx, and up.)
My CPU is an Intel Core i5-4670K and my motherboard is an MSI Z87-GD65 Gaming. My avatar is actually a pic of my rig, though it's probably hard to see any of it.

I am strongly considering SLI at this point, though I am somewhat nervous about it, especially considering that if I do this I won't be touching my PC (upgrade-wise, of course) for a very long time.
 
samplerico

Rookie
#1,924
Sep 28, 2014
theLaughingStorm said:
My CPU is an Intel Core i5-4670K and my motherboard is an MSI Z87-GD65 Gaming. My avatar is actually a pic of my rig, though it's probably hard to see any of it.

I am strongly considering SLI at this point, though I am somewhat nervous about it, especially considering that if I do this I won't be touching my PC (upgrade-wise, of course) for a very long time.
Then think about the VRAM..
 
tahirahmed

Rookie
#1,925
Sep 28, 2014
theLaughingStorm said:
My CPU is an Intel Core i5-4670K and my motherboard is an MSI Z87-GD65 Gaming. My avatar is actually a pic of my rig, though it's probably hard to see any of it.

I am strongly considering SLI at this point, though I am somewhat nervous about it, especially considering that if I do this I won't be touching my PC (upgrade-wise, of course) for a very long time.
It's good to wait a while and then get another 780 for SLI; it will be really cheap after some time. My processor is an i5-4690K and I ran two R9 290s (one mine and one a friend's); they ran pretty well. I'll also think about getting another R9 290, but closer to the release of TW3.
 
SeasonedWitcher

Senior user
#1,926
Sep 28, 2014
Guy N'wah said:
Well, no, because each card has to have a complete set of textures, and the textures take up most of the VRAM. The frame buffer itself is only 33MB at 3840x2160; cutting that in half by doing checkerboard or alternate line rendering makes only an insignificant difference.
Hmm... The different AA techniques available and post-processing effects also use huge amounts of memory; Hitman: Absolution is a well-known example of this. As far as I'm aware, the memory used for these features is halved across each card in SLI and Crossfire, which is likely the reason I could set some ridiculously high AA settings when mine was still working. It also turns out that the 6 GB VRAM recommendation for ultra settings at 1080p in Shadow of Mordor most likely assumes maximum super-sampling is applied (which kills frame rates anyway, regardless of the amount of VRAM available), meaning those who run it with FXAA should see a lot less VRAM required. Time will tell...

As for the entire texture maps having to be duplicated on both cards, it's a ridiculous limitation that should have been addressed long ago. Nvidia was supposed to be doing something about it with Maxwell's unified memory, which was basically just making system memory more available, and there are various other plans in the memory-utilization area, but all of that has been pushed back to Pascal now.
 
Last edited: Sep 28, 2014
OliverDK

Rookie
#1,927
Sep 28, 2014
I will be buying a new GFX card for this game alone. Right now I have an AMD 7870 XT, and while it performs well in most games, I doubt it will run TW3 in a manner that is acceptable to me. I think I will go for the new GTX 970 by Nvidia, which will hopefully drop a bit in price when February arrives.

My only question is whether I should buy a new CPU too, and maybe a motherboard? My current CPU is an Intel i5-3570K at 3.4 GHz, on an MSI Z77 motherboard.

I hope the change in GFX will be enough to run the game on high settings. Otherwise I'll probably have to sell a kidney for the rest too!
 
AleRx8

Senior user
#1,928
Sep 28, 2014
Hey guys,

Witcher 3 is the only game I'm waiting for and the only one I want to pre-order. I truly love the whole universe, and I also consider the first game the best game ever made. My problem is my rig; I'm not really sure it can handle TW3. What I'm looking for is 1080p and at least 40 fps. I don't really care about the amount of shadows, quality of AA, or details; those can easily be turned down. I just need that 1920x1080. As a student, I can't afford any upgrades right now, so I hope my PC will deliver all the needed performance with pride :D

And what's inside the box?

GTX 660 2GB
6 GB RAM
and what is, I'm afraid, my real weakness... The stock AMD II X4 940 BE CPU...

In recent days my financial situation has changed such that I can MAYBE (about a 15% chance) replace my CPU with a Core i5 platform, but for now the situation looks like this... So guys, what do you think? Will it work? And when (ETA) will the official HW specs be released?
To fully describe my situation: I was really confident that my processor wouldn't be a problem for another 2 years... But as I look at the newest games with their absurdly high requirements (that Ring thing and also that scary thing), I'm really getting a bad feeling about this :D :X

Thanks for any positive response :) And I'll truly hate you for a negative response, so think about it and be positive! :D
 
HellKnightX88

Forum veteran
#1,929
Sep 28, 2014
We can't really tell you anything for sure, and there's no telling when they'll release the specs. I'd guess it'll be a few more months until they publish the system requirements, because they're still optimizing stuff. I think TW2 was more GPU-intensive than CPU-intensive, and I'm expecting TW3 to be the same (I have a Phenom II X4 955 myself @ 3.2 GHz).

Welcome to the forums, btw!
 
tommy5761

Mentor
#1,930
Sep 28, 2014
My guess is somewhere around late December for the system spec reveal, which is roughly 2 months before release.
 
GuyNwah

Ex-moderator
#1,931
Sep 28, 2014
seasonedwitcher said:
Hmm... The different AA techniques available and post-processing effects also use huge amounts of memory; Hitman: Absolution is a well-known example of this. As far as I'm aware, the memory used for these features is halved across each card in SLI and Crossfire, which is likely the reason I could set some ridiculously high AA settings when mine was still working. It also turns out that the 6 GB VRAM recommendation for ultra settings at 1080p in Shadow of Mordor most likely assumes maximum super-sampling is applied (which kills frame rates anyway, regardless of the amount of VRAM available), meaning those who run it with FXAA should see a lot less VRAM required. Time will tell...

As for the entire texture maps having to be duplicated on both cards, it's a ridiculous limitation that should have been addressed long ago. Nvidia was supposed to be doing something about it with Maxwell's unified memory, which was basically just making system memory more available, and there are various other plans in the memory-utilization area, but all of that has been pushed back to Pascal now.
There is no way to render a scene in which the camera and actors are moving from frame to frame with only a partial set of textures, absent real memory sharing, which no high-performance card currently has, or could even have, as long as they are connected by PCI Express. It is not a "ridiculous" limitation at all.

Whether current programming styles are actually wasteful of VRAM, or are using it to achieve unprecedented performance, remains to be seen. My bet is on the former: sloppy work is cheaper and gets games to market faster, where they make more money.
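The frame-buffer figure quoted earlier in the thread ("only 33MB at 3840x2160") checks out with simple arithmetic. Here is a quick back-of-the-envelope sketch, assuming the usual 32-bit (4 bytes per pixel) colour buffer; the "double buffering plus depth" total is an illustrative estimate, not a measurement of any particular game:

```python
def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one colour buffer in decimal megabytes."""
    return width * height * bytes_per_pixel / 1e6

# One 4K colour buffer: 3840 * 2160 * 4 bytes = ~33.2 MB,
# matching the 33 MB figure quoted above.
fb = framebuffer_mb(3840, 2160)
print(f"4K colour buffer: {fb:.1f} MB")  # ~33.2 MB

# Even double-buffered with a 32-bit depth/stencil buffer on top
# (three full-screen surfaces), the total is only ~100 MB -- a
# rounding error next to a 3 GB card, which is why halving it with
# split-frame or checkerboard rendering saves almost nothing.
total = 3 * framebuffer_mb(3840, 2160)
print(f"Double-buffered + depth: {total:.1f} MB")
```

This is the arithmetic behind the argument that textures, not the frame buffer, dominate VRAM use in SLI.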
 
SeasonedWitcher

Senior user
#1,932
Sep 28, 2014
Guy N'wah said:
There is no way to render a scene in which the camera and actors are moving from frame to frame with only a partial set of textures, absent real memory sharing, which no high-performance card currently has, or could even have, as long as they are connected by PCI Express. It is not a "ridiculous" limitation at all.

Whether current programming styles are actually wasteful of VRAM, or are using it to achieve unprecedented performance, remains to be seen. My bet is on the former: sloppy work is cheaper and gets games to market faster, where they make more money.
Where there's a will, there's a way. This limitation has been there since SLI's and Crossfire's inception; not to mention, textures are far from the only thing eating up memory, as I already said. The only reason they haven't bothered is that nobody has outrun the memory included, because consoles have had limited memory. That's not the case any more, so hopefully the likes of NVLink will solve this problem of memory sharing. But again, that's Pascal, a long way away.

Oh, without a doubt much of it is wasteful, but at least if the rumours are true, simply turning down the AA settings should allow users of single 4 GB cards to max out Shadow of Mordor's textures, and SLI/Crossfire users will probably be able to dial it up a bit.
 
MkTama

Rookie
#1,933
Sep 28, 2014
2 GB for filters alone seems just as absurd... Even if that were the case, 4 GB would still be too much IMHO. I think the real point is that IF there is no good reason for this kind of programming (and I personally don't see one), more and more people should simply stop giving them money. I think PC enthusiasts look too much at the numbers while neglecting what's behind them: good efficiency and programming, which are essential. Having a beastly rig does not solve problems, because more often than not it could be done BETTER on the same rig :)
 
SeasonedWitcher

Senior user
#1,934
Sep 28, 2014
MkTama90 said:
2 GB for filters alone seems just as absurd... Even if that were the case, 4 GB would still be too much IMHO. I think the real point is that IF there is no good reason for this kind of programming (and I personally don't see one), more and more people should simply stop giving them money. I think PC enthusiasts look too much at the numbers while neglecting what's behind them: good efficiency and programming, which are essential. Having a beastly rig does not solve problems, because more often than not it could be done BETTER on the same rig :)
That's just a guess on my part. The rumours going around are that 6 GB is required if super-sampling is maxed out. That won't make much of a difference anyway, as unless you're running a powerful SLI/Crossfire setup, max super-sampling will kill your frame rate regardless of how much VRAM you have; just look at the difference cranking it up makes in Crysis 3 benchmarks, and that's at relatively low resolutions too.

Yes, they could no doubt be more efficient, but have you seen SOM? The game is quite a looker; it rivals what I've seen of Witcher 3 in the looks department. Let's just hope CDPR don't kill us with VRAM requirements as well.
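The VRAM appetite of super-sampling follows directly from the arithmetic: rendering internally at a higher resolution scales every full-screen render target with the pixel count. A rough sketch below; the five-target deferred pipeline is an illustrative assumption, not Shadow of Mordor's actual renderer:

```python
def ssaa_target_mb(width, height, factor, bytes_per_pixel=4):
    """VRAM for one full-screen render target at `factor`x per-axis
    super-sampling, in decimal MB. factor=2 means 2x2 = 4x the pixels."""
    return (width * factor) * (height * factor) * bytes_per_pixel / 1e6

base = ssaa_target_mb(1920, 1080, 1)  # ~8.3 MB per target at native 1080p
ss4x = ssaa_target_mb(1920, 1080, 2)  # ~33.2 MB per target: 4x the pixels
print(f"native: {base:.1f} MB, 4x SSAA: {ss4x:.1f} MB per target")

# A hypothetical deferred renderer with 5 full-screen targets
# (colour, depth, G-buffer layers) goes from ~41 MB to ~166 MB of
# targets alone under 4x SSAA -- and shading cost scales the same
# way, which is why frame rates tank long before VRAM runs out.
# Post-process AA like FXAA works at native resolution, so it adds
# almost nothing by comparison.
print(f"5 targets: {5 * base:.0f} MB -> {5 * ss4x:.0f} MB")
```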
 
Last edited: Sep 28, 2014
sidspyker

Ex-moderator
#1,935
Sep 30, 2014
If SSAA is what pushes it to a 6 GB VRAM requirement, then that makes perfect sense; downsampling is known to eat VRAM for breakfast.

seasonedwitcher said:
The game is quite a looker, it rivals what I've seen of Witcher 3 in the looks department.
:look::huh:
 
SeasonedWitcher

Senior user
#1,936
Sep 30, 2014
No, it looks a lot better than the guy in his chicken suit... Look at the PC screens, not the console ones.
 
sidspyker

Ex-moderator
#1,937
Sep 30, 2014
I have, and it looks pretty nice, but The Witcher and that are not in the same league.
 
SeasonedWitcher

Senior user
#1,938
Sep 30, 2014
sidspyker said:
I have and it looks pretty nice but Witcher and that are not in the same league.
Witcher 3 has brighter colours, which can catch the eye; but the other is Mordor, after all, so the art direction has to be more dreary.
 
HellKnightX88

Forum veteran
#1,939
Sep 30, 2014
Yeah, but looking at some of the textures on the Uruk leaders, and even the protagonist himself, you can objectively say that their resolution isn't as high as you'd expect, or at least it doesn't seem as detailed. I'm not saying it looks bad, mind you; it looks great, and at the end of the day I care more about gameplay/framerate than ultra-HD graphics (as long as the game looks decent, the story and characters are good, and the gameplay is fun, then for me that's a good game for the most part).

But I wouldn't put it much higher than TW3 in the eye-candy factor... I dunno if I'd even put it higher than TW3.
 
sidspyker

Ex-moderator
#1,940
Sep 30, 2014
A new thread in the Community section can be used for this; this is a system requirements thread, not a place for talking about graphics in other games or making comparisons. I can move the posts to a new thread if we're going to discuss this further, if no one objects.
 