Predicted Witcher 3 system specs? Can I run it?

I need to ask whether my computer will run the game.


Processor: 1 x Intel® Core™ i7-4770K (4x 3.50 GHz / 8 MB L3 cache)
Memory: 1 x 16 GB [8 GB x2] DDR3-1600
Video Card: 1 x NVIDIA GeForce GTX 770, 4 GB, single card

Forgot what else to put.

Thanks
 

It's a nice computer. But nobody can give you an answer you can take to the bank. We just don't know enough to say this or that high-end feature will work well on this or that high-end GPU.

The game would have to run well on systems like yours, because the developers are not all going to have exotic high-end rigs like SLI'd 780s, and they have to run the game and the development tools well enough to get work done. But exactly what is needed to get all the eye candy out of the game, we can't say yet.
 
I have a little theoretical question. If one were to run TW2 on highest settings at a comfortable 28 fps, does it follow that one should be able to run TW3, configured to look just as good as TW2 on highest settings (whatever settings that means for TW3), at the same 28 fps on the same computer? I understand the phrase "looking just as good as TW2 on highest settings" is a little vague, but one should get my point.

Or does the fact that RedEngine 3 is a newer engine make it inherently more demanding, meaning it would run at lower fps on the same PC even if one configured it to look "just as good" as TW2 on highest settings?

Maybe it works the other way: RedEngine 3 is a newer, more optimized engine, meaning that if one configured TW3's graphics options to look as good as TW2 on highest settings, TW3 would actually run at a higher fps than TW2 at equivalent visual quality on the same PC.
 
It's hard to say, because there are at least three different facets that we know little about.

One is the additional eye candy and DirectX 11 support in TW3. Both of these are expensive, especially if DirectX 11 is not used conservatively. This could mean more demanding graphics if the game implements all of the promised features; it could just as well be less demanding if their use of DirectX 11 is very careful.

Another is the optimization in RedEngine 3. They surely did not start from scratch and write a whole new engine. It will have large pieces of the Witcher 2 engine, and these may be optimized to varying degrees.

The third, and to me the most important, is we do not know what the limiting resources are. Optimization is nothing but a waste of a lot of time and money if you optimize the wrong code or the wrong feature. You have to find the limiting resources and optimize for those. In the Witcher 2, the output processors (ROPs) were very noticeably limiting -- to the point that you can make predictions of performance within ten percent or so from knowing the ROP complement and the core clock rate alone. In Red Engine 3, well, we just don't know yet. They may have put more burden on the shaders and less on pixel-level processing at the output stage.
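
Here's what that back-of-the-envelope method looks like as a sketch. The ROP counts and base clocks below are spec-sheet figures, and the whole thing leans on the assumption that the game stays ROP-bound, so treat it as an illustration, not a benchmark:

Code:
# Back-of-the-envelope Witcher 2 performance comparison, assuming the game
# is ROP-bound as described above: relative throughput ~ ROPs * core clock.
# ROP counts and base clocks are spec-sheet figures, used for illustration.
cards = {
    "GTX 760": (32, 980),   # (ROPs, base core clock in MHz)
    "GTX 770": (32, 1046),
    "GTX 780": (48, 863),
}

base = cards["GTX 760"][0] * cards["GTX 760"][1]
for name, (rops, mhz) in cards.items():
    print(f"{name}: {rops * mhz / base:.2f}x relative ROP throughput")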

So any prediction of whether the Witcher 3 will perform better or less well on the same hardware is really a wild-arsed guess.
 
There seems to be a trend of games requiring more CPU power lately, and the system requirements of many upcoming games such as Watch Dogs confirm this. But the graphics card requirements have not increased that much. If this is also the case for the Witcher 3, your performance may suffer if your CPU is not up to the task.
 

This is increasingly likely. A lot of CPU goes to waste in older high-performance games. Features like multiple render queues allow more CPU resources to be put to use. I would not be surprised to see dual cores perform poorly, and the playable minimum on older systems be something like a Phenom II X4 or Core 2 Quad Q9550.
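
To make "multiple render queues" concrete, here is a hypothetical sketch of the pattern: several CPU threads record command lists in parallel, and one thread submits them in order. Every name in it is made up for illustration; real engines do this through things like Direct3D 11 deferred contexts:

Code:
# Hypothetical sketch of "multiple render queues": worker threads record
# command lists in parallel, the main thread submits them in order. All
# names here are made up; real engines use APIs like D3D11 deferred
# contexts for this.
from concurrent.futures import ThreadPoolExecutor

NUM_THREADS = 4
draw_calls = [f"mesh_{i}" for i in range(10_000)]

def record_command_list(chunk):
    # Stand-in for per-draw CPU work: state sorting, constant updates, etc.
    return [("draw", mesh) for mesh in chunk]

# Contiguous slices keep the final submission order intact.
step = len(draw_calls) // NUM_THREADS
chunks = [draw_calls[i * step:(i + 1) * step] for i in range(NUM_THREADS)]

with ThreadPoolExecutor(max_workers=NUM_THREADS) as pool:
    command_lists = list(pool.map(record_command_list, chunks))

for commands in command_lists:   # the "immediate context" submits in order
    pass                         # gpu.execute(commands) on real hardware

The point is only the shape: the recording work scales out across cores, which is exactly where a dual core runs out of headroom.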
 
They probably will; the only question is when they will reach the point of having solid requirements.
 
@Guy N'wah
What do you think of an i5-4670K and a GTX 760 for gaming? Thanks

I'm not made of money, so I always consider price along with performance, and they're an excellent choice. In gaming, the Core i7s are not 100 dollars or so better than the Core i5s. The 760 is the best card in performance for price in nVidia's line. There's a lot of opinion favoring the 770, but they're at least 80 dollars more here, and the difference isn't worth it.

If you have a motherboard and power supply capable of SLI, pick a popular model of 760, and if one isn't enough, you can get another and run them in SLI. Two 760s in SLI can outperform a 780 and cost a lot less.

But if you are cost-sensitive and not tied to nVidia, the comparable AMD cards are better deals now that the bitcoiners aren't snarfing up the entire supply. Compare an R9 270X to that 760.
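
The arithmetic behind that advice is just price per frame. The prices and frame rates below are placeholders, not quotes from any shop or benchmark, but they show the shape of the comparison:

Code:
# Toy price-per-performance table. Prices and average fps are placeholder
# numbers for illustration, not real quotes or benchmark results.
cards = {
    "GTX 760": {"price": 250, "fps": 60},
    "GTX 770": {"price": 330, "fps": 68},
    "R9 270X": {"price": 200, "fps": 55},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['fps']:.2f} per average frame")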
 
The question is whether this game is going to benefit from the i7's power, because modern games don't, and the i5 is actually a better deal.
 
I just hope my PC won't melt. By the way, if I remember correctly there's an "Auto-detect Best settings" button on the launcher of TW2; I hope it's included in TW3.

Word of advice: do not rely 100% on those "Auto-detect Best settings" buttons. The majority of games that implement them are all over the place and, in my experience, very sporadic and unreliable. The same goes for GeForce Experience, if you are an Nvidia owner.

I would never put all my trust into what those programs suggest. It is always better to tweak & test until you find what performs and looks best for your particular PC setup.

ETA: The W2 launcher had/has a bug where certain settings would revert to lower values, IIRC. I had to go into the config .ini file in order to fix the issue. Some values were even missing completely, like Maxfoilagedistance (that affected LOD, the level of detail).
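
If anyone wants to script the same fix rather than edit by hand, something like this would do it. The file name, section name, and default value here are assumptions for illustration; only the key name comes from my post above:

Code:
# Sketch: restore a missing config value. The file name, section name, and
# default are assumptions for illustration; the key is the one mentioned
# in the post above.
import configparser

CONFIG_PATH = "user.ini"                       # hypothetical file name
SECTION, KEY, DEFAULT = "Rendering", "Maxfoilagedistance", "100"

cfg = configparser.ConfigParser()
cfg.read(CONFIG_PATH)

if not cfg.has_section(SECTION):
    cfg.add_section(SECTION)
if not cfg.has_option(SECTION, KEY):
    cfg.set(SECTION, KEY, DEFAULT)             # put the missing value back
    with open(CONFIG_PATH, "w") as f:
        cfg.write(f)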

I'd rather not use FXAA, because it really makes textures blurry. If I have a choice between FXAA and no AA, I'll go with no AA to save the textures.

While I do agree with you that FXAA tends to 'soften' textures, in certain games I thought it positively affected the picture quality overall. I've recently been playing Alan Wake and noticed that FXAA on high, combined with 4x MSAA, definitely improved the image quality.

It is one of those love-it-or-hate-it things, I suppose, depending on the game, how well it is implemented, and the other forms of AA it is combined with.
 
The question is whether this game is going to benefit from the i7's power, because modern games don't, and the i5 is actually a better deal.
I don't think so, not in the near term -- and you buy for the near term. Buying for the future in a universe where Moore's Law is marching orders is wasteful and expensive. There has to be a shift in how engines work, to where there are more CPU-bound threads. Some of that is happening, but not enough. If most of what the CPU is doing is running Lua scripts, then no, more bogocores won't do much.
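
The arithmetic behind that is Amdahl's law: if only a fraction p of the frame's CPU work can be spread across cores, n cores give a speedup of 1 / ((1 - p) + p/n). A quick sketch, with an assumed parallel fraction:

Code:
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction
# of frame CPU work that can run in parallel. If most of the time is a
# single-threaded script interpreter, p is small and extra cores buy little.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.30  # assumed parallel fraction, purely for illustration
for cores in (2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(p, cores):.2f}x speedup")

With only 30 percent of the work parallel, going from 4 cores to 8 buys about five percent. That's the sense in which more bogocores won't do much.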
 
My GPU (a 6870) died recently, forcing me to upgrade my PC entirely, and sooner than expected.
I'm on my way to buy a 770 SLI setup. I hope it will be enough to run W3 smoothly.

Maybe Witcher 4, CoD: Very Advanced Space Warfare, and FIFA 2024 too.
 
Would it be beneficial to have an i7 processor over an i5 if I want to run a couple of other programs plus a game?

What the Core i7 (as well as lesser HT CPUs like Core i3 and laptop models) is really good at is running random loads efficiently. So the answer I'd give is yes, sort of.

If you have a lot actually running in the background, like a busy torrent node or a compiler, the HT on the Core i7 will help.

If you have programs that are just open but idle while you are playing, more memory so they don't get paged out is more important.
 
I got a new rig:
i5-3450
R9 270
8 GB (4 GB x2) 1600 MHz RAM
I play games at 720p, so I think I should be able to run TW3 at medium/high.
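
Rough pixel math backing that up:

Code:
# Pixel-count arithmetic behind the 720p headroom argument.
pixels_720p = 1280 * 720     #   921,600 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
print(f"720p is {pixels_720p / pixels_1080p:.0%} of the 1080p pixel load")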
 
Three days ago I read a YouTube comment (don't judge me!) stating the OG GTX 780 won't be able to run W3 at 1080p with more than 30 fps.

Personally I think of that as *cough*BULLSHIT*cough* and an obvious trolling comment, but now I'm not so sure.

Around winter 2014 I'll buy/assemble a new gaming rig, and I have plans for a single GTX 770 or single GTX 780 from MSI, an i5 K-series CPU, 16 GB RAM @ 2400 MHz, and an MSI mobo.

Will a GTX 780 or GTX 780 Ti be enough to run W3 on high/ultra settings at 1080p and 60 fps? That is my concern.
 
In my opinion, it will be worth waiting for the 800 series from Nvidia, because they are announced to come out at the same time as TW3. Ubisoft announced that Watch Dogs needs a 780 to run at ultra, and the graphics are not outstanding, so...
Personally, I would buy a cheap GPU at the beginning and buy a high-end card when TW3 releases, so I will be sure to max it out.

For the amount of RAM and the CPU, you are going to be OK, depending on which model you choose for your CPU.

I hope they will talk about this at E3, so fans can prepare their wallets.
As I said, it's only my opinion.
 

Is that an official announcement from nVidia, or just a rumor? I do not believe nVidia has set a date for release of any second-generation Maxwell product. So far, all we have are the 28nm 128-bit first-generation products (750/750Ti discrete; 850m/860m mobile).

I'd like to believe they will have 20nm 256-bit Maxwell cards in production before the game is released, but their record of not actually getting Maxwell out the door has yet to give me any foundation for that belief.

But still, unless you must have a new graphics card now for a different reason, waiting and seeing what is available is the best way to avoid spending too much money on inadequate hardware.
 