Forums - CD PROJEKT RED
Predicted Witcher 3 system specs? Can I run it?

Status: Not open for further replies.

Page 114 of 134
fenreck

Rookie
#2,261
Nov 13, 2014
I hope so, and I trust them, including excellent SLI profile support. Whatever graphics card you have, it will be awesome...

Here's hoping they listen to us =D
 
Last edited: Nov 13, 2014
GosuPL

Forum veteran
#2,262
Nov 14, 2014
shawn_kh said:
I apologize. I got into too much detail and politics for the purposes of this thread, which consequently led to my main point concerning this thread to not be clear.
My advice regarding the Witcher 3 is do not buy/build PCs based on the ridiculous requirements of some games that suffer from poor optimization, because it is a waste of money and it does not matter at the end. A poorly optimized game is still a poorly optimized game, no matter what kind of PC you run it on. From what we can gather from the interviews, CDPR team seem to have emphasized a lot on optimization of the Witcher 3 across all platforms. If they deliver and the game is properly optimized, then we would not need two 780s to get a stable performance on ultra with 40+ FPS at 1080p. Of course we cannot be sure unless we see the requirements, and even then I'll hold off until I see the game in action.
AC Unity is a good example of how lazy devs can be... but CD Projekt is a different story :)
 
moonknightgog

Forum veteran
#2,263
Nov 14, 2014
shawn_kh said:
I apologize. I got into too much detail and politics for the purposes of this thread, which consequently led to my main point concerning this thread to not be clear.
My advice regarding the Witcher 3 is do not buy/build PCs based on the ridiculous requirements of some games that suffer from poor optimization, because it is a waste of money and it does not matter at the end. A poorly optimized game is still a poorly optimized game, no matter what kind of PC you run it on. From what we can gather from the interviews, CDPR team seem to have emphasized a lot on optimization of the Witcher 3 across all platforms. If they deliver and the game is properly optimized, then we would not need two 780s to get a stable performance on ultra with 40+ FPS at 1080p. Of course we cannot be sure unless we see the requirements, and even then I'll hold off until I see the game in action.
Ultra settings don't mean anything on their own. You can't use them as a criterion for comparing performance. I mean, if you enable SSAA, MSAA x4, or TXAA, you don't get to complain about performance.
CD Projekt RED shouldn't be afraid to include some uber settings for the future.
 
GosuPL

Forum veteran
#2,264
Nov 14, 2014
Moonknightsg said:
Ultra settings don't mean anything on their own. You can't use them as a criterion for comparing performance. I mean, if you enable SSAA, MSAA x4, or TXAA, you don't get to complain about performance.
CD Projekt RED shouldn't be afraid to include some uber settings for the future.

But AC Unity is a poorly optimized piece of shit anyway.
Lots of bugs, and even the console framerate is horrible, below 20 fps :)
 
GuyNwah

Ex-moderator
#2,265
Nov 15, 2014
Moonknightsg said:
Ultra settings don't mean anything on their own. You can't use them as a criterion for comparing performance. I mean, if you enable SSAA, MSAA x4, or TXAA, you don't get to complain about performance.
CD Projekt RED shouldn't be afraid to include some uber settings for the future.
^ This.

Optimization does NOT mean reducing the game until it can run the best and highest of the remaining features on ordinary equipment.

Optimization does NOT mean pretending that the game will be able to run its best and highest features on ordinary equipment, without reducing and compromising these features.

The goal of optimization is to deliver the best possible performance across a wide range of platforms, including sub-minimum ones, and including a level of performance that is only possible on Big Iron and that delivers an experience that makes having the Big Iron worth it to you.
 
Tuchi

Forum veteran
#2,266
Nov 15, 2014
Guy N'wah said:
Please don't repeat their CRRRAP. It makes it really hard for those of us who take giving technical advice seriously to have to tell people to ignore it.

Short answer: We don't know how many GPU hamsters it will take to spin this game at any specific frame rate, resolution, and quality yet. The best guesses are indeed that it will take a high-end GPU to get 1920x1080 at something close to 60 fps, a pair of them in SLI or Crossfire to handle 1440p, and not much chance of 4k, which is 4x the pixels of 1920x1080.

If I were determined to have 4K at 60fps at any cost, I would get a Xeon on an LGA 2011 motherboard, so I would have enough PCI-Express support for 3x SLI. If I were not so determined and willing to settle for lesser resolution or frame rate, I would get a Z97-chipset motherboard that had a good layout for two damned big GPUs in SLI, and only buy one GPU. That way, I can determine whether the second GPU is really needed, before spending another several hundred dollars.
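The resolution arithmetic in the quote checks out; here is a quick sketch, assuming the standard 16:9 pixel dimensions for each of the resolutions the thread keeps mentioning:

```python
# Pixel counts for the resolutions discussed in the thread, relative to 1080p.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(name):
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

base = pixels("1080p")
for name in RESOLUTIONS:
    print(f"{name}: {pixels(name):,} pixels ({pixels(name) / base:.2f}x 1080p)")
```

So 1440p is about 1.78x the pixel load of 1080p, and 4K is exactly 4x, which is why SLI keeps coming up for anything past 1080p.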


Here's what I'm planning to put together to run at 1440p at ultra, possibly 60 fps. Let me know if you think it's enough.


Case: Cooler Master Storm Stryker Tower, white

Power supply: Cooler Master Silent Pro, 700 W

Motherboard: Asus Maximus VII Hero

Processor: i7-4790K @ 4.40 GHz

16 GB system RAM

4 hard disks: primary 500 GB SSD, secondary 1 TB HDD, a third 500 GB HDD, and lastly another 500 GB HDD

GPU: Asus GTX 980 4 GB

I will do this upgrade ANYWAY (most of the parts listed I already have, including the GPU). I'm only asking whether you think this is enough for what I'm after, because I have doubts about putting another 980 in SLI with this configuration; that's the only variable here.
Thank you.

If it matters, I'm planning to use an Asus PA279Q as the monitor, at 1440p obviously.
 
Last edited: Nov 15, 2014
spacehamsterZH

Rookie
#2,267
Nov 15, 2014
Okay, so help an idiot out here - if I have an i7-2600k 3.4 GHz with 16GB DDR3-1600 RAM, what's about the biggest GPU that makes sense? I'm willing to splurge for this game, but I also don't want to buy a prohibitively expensive GPU that I can't take full advantage of.
 
GuyNwah

Ex-moderator
#2,268
Nov 15, 2014
spacehamsterzh said:
Okay, so help an idiot out here - if I have an i7-2600k 3.4 GHz with 16GB DDR3-1600 RAM, what's about the biggest GPU that makes sense? I'm willing to splurge for this game, but I also don't want to buy a prohibitively expensive GPU that I can't take full advantage of.
There isn't any single GPU that a Sandy Bridge would bottleneck. I would be reluctant to put two high-end GPUs in SLI, because it has only PCI-e 2.0. But it has more than enough bandwidth for any single card.
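For context on the bandwidth claim, per-direction PCIe throughput can be estimated from the transfer rate and the line-code overhead (Gen 2 uses 8b/10b encoding, Gen 3 uses 128b/130b). This is a simplified sketch that ignores packet and protocol overhead:

```python
# Theoretical per-direction PCIe bandwidth in GB/s for an x16 link.
ENCODING_EFFICIENCY = {"8b/10b": 8 / 10, "128b/130b": 128 / 130}

def pcie_gbps(transfer_rate_gt, encoding, lanes=16):
    """GT/s per lane * payload fraction * lanes, converted bits -> bytes."""
    return transfer_rate_gt * ENCODING_EFFICIENCY[encoding] * lanes / 8

gen2 = pcie_gbps(5.0, "8b/10b")      # PCIe 2.0 x16
gen3 = pcie_gbps(8.0, "128b/130b")   # PCIe 3.0 x16
print(f"PCIe 2.0 x16: {gen2:.2f} GB/s, PCIe 3.0 x16: {gen3:.2f} GB/s")
```

That works out to roughly 8 GB/s versus about 15.75 GB/s per direction; a single 2014-era GPU rarely saturates even the Gen 2 figure, which is the point being made above.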
 
spacehamsterZH

Rookie
#2,269
Nov 15, 2014
Guy N'wah said:
There isn't any single GPU that a Sandy Bridge would bottleneck. I would be reluctant to put two high-end GPUs in SLI, because it has only PCI-e 2.0. But it has more than enough bandwidth for any single card.
Well, crap. Now I'm going to have to rely on restraint and rationality, I was hoping my hardware limitations would make the choice for me. Haha. But thanks, man.
 
samplerico

Rookie
#2,270
Nov 15, 2014
shawn_kh said:
I think Ubisoft and Nvidia outdid themselves this time around. Not only does Assassin's Creed Unity average 30 FPS on a GTX 780 and an i7 4770k, it also dips down to 17 FPS on the PS4. It is a game that is not optimized on any system, and the funny thing is that it's one of the biggest releases this year.
It's even funnier how we PC gamers spend more and more money on more powerful PCs, and at the end of the day it doesn't matter, because a poorly optimized game doesn't properly take advantage of the power available to it.
No one should buy/build PCs based on the ridiculous requirements of poorly optimized games like Unity, and I'd say buying games like this before patches and price drops just gives greedy and lazy companies like Ubisoft more money to make more unoptimized games.
And lastly, the trailers and interviews saying that Nvidia is collaborating closely to optimize the game and make it look its best on PC have turned into pure comedy after Watch Dogs and Unity. It gives me the chuckles every time I see one of those trailers, and I try to avoid such games, because it means the game is unoptimized, say, 99% of the time. These trailers and the game performances suggest two possibilities: Nvidia does this on purpose to force customers to buy overpriced cards, or Nvidia is simply incompetent.
Completely agree with you, man. We need to stop buying hardware and software until they show us they're working hard on their products. How is it possible that a one-year-old $500 card cannot maintain 50 FPS in a new game at full 1080p? Simple: it's a fucking SCAM (sorry). How is it possible that a ~5000 GFLOPS card only runs 15-20 FPS more in ACU than an ~1800 GFLOPS one in a closed system? Is there an engineer anywhere who can seriously explain that? Something is starting to smell rotten in the video game industry. I'm personally sick of upgrading and paying and paying and paying more to get less and less. What's the mess here, exactly?

Glorious, those days of the GTX 8000 series + Crysis. That was technology really worth paying for. Now you pay gold and get a wooden wheel...
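The numbers in the post above amount to a complaint that frame rate doesn't scale with theoretical throughput. A quick comparison, using the poster's own rough, illustrative figures rather than measured benchmarks:

```python
# Theoretical compute ratio vs. observed frame-rate ratio, from the rough
# GFLOPS and FPS figures quoted in this thread (not measured benchmarks).
pc_gflops, console_gflops = 5000, 1800   # figures quoted in the post
pc_fps, console_fps = 30, 17             # ACU figures mentioned earlier in the thread

compute_ratio = pc_gflops / console_gflops   # theoretical throughput advantage
fps_ratio = pc_fps / console_fps             # observed frame-rate advantage
print(f"theoretical: {compute_ratio:.2f}x, observed: {fps_ratio:.2f}x")
```

FPS rarely scales linearly with FLOPS: CPU draw-call overhead, memory bandwidth, and engine bottlenecks all cap the gain, and that gap between ~2.8x on paper and ~1.8x on screen is exactly what the poster is reacting to.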
 
Last edited: Nov 15, 2014
GuyNwah

Ex-moderator
#2,271
Nov 15, 2014
There is no scam and no call for flinging foul language around. Do not make me put on my moderator hat; you will not like the result.

There is no comparison whatsoever between the artistic and production standards of games that were current seven years ago and first-class games of today.

As an engineer, I will answer you. You can only put so much into the game before it exceeds the capacity of current hardware. Blaming the engineers for not being able to run the highest quality production on anything short of the highest quality and most modern hardware is a gross and offensive insult.

And I have to repeat something I said earlier. OPTIMIZATION IS NOT REDUCING THE FEATURES OF A GAME UNTIL IT WILL RUN ON YOUR HARDWARE.
 
samplerico

Rookie
#2,272
Nov 15, 2014
Guy N'wah said:
There is no scam and no call for flinging foul language around. Do not make me put on my moderator hat; you will not like the result.

There is no comparison whatsoever between the artistic and production standards of games that were current seven years ago and first-class games of today.

As an engineer, I will answer you. You can only put so much into the game before it exceeds the capacity of current hardware. Blaming the engineers for not being able to run the highest quality production on anything short of the highest quality and most modern hardware is a gross and offensive insult.

And I have to repeat something I said earlier. OPTIMIZATION IS NOT REDUCING THE FEATURES OF A GAME UNTIL IT WILL RUN ON YOUR HARDWARE.
Wow, take it easy, man. If I cannot explain my point of view, you don't need to ban me; I'll leave on my own.

I'm just saying what is obvious to everyone. If you're an engineer, please explain to me how you get 40 FPS with crashes, stuttering, and frame drops on a 5000 GFLOPS card. Can you explain exactly how developers optimize a game? If you could explain that, you probably wouldn't be here moderating. Don't take it personally, but optimization these days is a fucking joke. I'm a consumer: I pay for games, I pay for hardware, peripherals, and all the paraphernalia. It's not fair that you buy a $500 graphics card, a $200-300 CPU, and a $60 game, and then, ta-da! They've got you: poor optimization, bugs, crashes, disconnections, etc., etc. And all that crap with the big NVIDIA logo, "the way it's meant to be played"!! Sure, meant to be played at the museum. Can you explain that too? I'm a consumer; if I can't say these words and I'm just here to buy and be bought, then we're all full of shit.
 
GuyNwah

Ex-moderator
#2,273
Nov 15, 2014
samplerico said:
Wow, take it easy, man. If I cannot explain my point of view, you don't need to ban me; I'll leave on my own.

I'm just saying what is obvious to everyone. If you're an engineer, please explain to me how you get 40 FPS with crashes, stuttering, and frame drops on a 5000 GFLOPS card. Can you explain exactly how developers optimize a game? If you could explain that, you probably wouldn't be here moderating. Don't take it personally, but optimization these days is a fucking joke. I'm a consumer: I pay for games, I pay for hardware, peripherals, and all the paraphernalia. It's not fair that you buy a $500 graphics card, a $200-300 CPU, and a $60 game, and then, ta-da! They've got you: poor optimization, bugs, crashes, disconnections, etc., etc. And all that crap with the big NVIDIA logo, "the way it's meant to be played"!! Sure, meant to be played at the museum. Can you explain that too? I'm a consumer; if I can't say these words and I'm just here to buy and be bought, then we're all full of shit.
I've been an engineer longer than most of the members of this forum have been walking, so I've seen enough cases of bad development to ascribe it to bad development.

Not availing yourself of the assistance provided by the graphics card maker that is in a position to provide that assistance, in the name of some fictitious notion of fairness, is the best possible way to compound all the mistakes you made into an even worse excuse for a product.

And as a moderator, I have no objection to your stating your opinion, but it is my duty to tell you to do so decently and in order or keep silence instead.
 
ApuLunas.211

Rookie
#2,274
Nov 15, 2014
Which is better, 4K TVs or gaming monitors? I just bought a new TV, and now I don't like my 27" the way I used to.
 
GosuPL

Forum veteran
#2,275
Nov 16, 2014
ATTENTION!
I'm warning you before you test cards in 3DMark Fire Strike (latest version).

This thing killed my excellent reference 980 (1510/7810, at stock).

What a friend told me is being confirmed: the Combined test in that "benchmark" kills GPUs after a strong OC.
It killed my 780 Ti Lightning, and now my reference 980 :/

In some tournaments, people have lost their GPUs and even PSUs on that Combined test :/
Shame on you, Futuremark. Please be cautious with your GPUs, everyone.
 
Last edited: Nov 16, 2014
GuyNwah

Ex-moderator
#2,276
Nov 16, 2014
GosuPl said:
ATTENTION!
I'm warning you before you test cards in 3DMark Fire Strike (latest version).

This thing killed my excellent reference 980 (1510/7810, at stock).

What a friend told me is being confirmed: the Combined test in that "benchmark" kills GPUs after a strong OC.
It killed my 780 Ti Lightning, and now my reference 980 :/

In some tournaments, people have lost their GPUs and even PSUs on that Combined test :/
Shame on you, Futuremark. Please be cautious with your GPUs, everyone.
Quoted for truth.

Once you overclock, you are working at your own risk. It is entirely possible to destroy hardware by running it at full continuous load while overclocked, especially if it is also overvolted.

But blaming the benchmark programs for causing the failure is really not accurate. The failure was caused by the decision to overclock and then run at full load without being ready to back down really fast at the first sign of trouble.
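The risk described here has a first-order physical basis: dynamic power in CMOS scales roughly as P ∝ C·f·V², so a clock bump compounds with a voltage bump. A minimal sketch with illustrative, hypothetical numbers:

```python
def relative_dynamic_power(freq_scale, volt_scale):
    """Dynamic power relative to stock, from the P ~ C*f*V^2 approximation."""
    return freq_scale * volt_scale ** 2

# e.g. a +15% core clock combined with a +10% voltage bump (hypothetical figures):
bump = relative_dynamic_power(1.15, 1.10)
print(f"~{bump:.2f}x stock dynamic power")  # roughly 1.39x
```

A sustained ~40% power increase under a full continuous load, which is what a stress test like the Combined test produces, is exactly the scenario where marginal VRMs or PSUs give out.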
 
GosuPL

Forum veteran
#2,277
Nov 16, 2014
Guy N'wah said:
Quoted for truth.

Once you overclock, you are working at your own risk. It is entirely possible that you can destroy hardware by running it at full continuous load, overclocked and especially if it is also overvoltaged.

But blaming the benchmark programs for causing the failure is really not accurate. The failure was caused by the decision to overclock and then run at full load without being ready to back down really fast at the first sign of trouble.
But ONLY the 3DMark Fire Strike Extreme Combined test does that.
In the past two months I've had 9 different GTX 780 Tis and 4 GTX 980s, and ONLY the 780 Ti Lightning and one of the reference 980s died, during this test.
The 780 Ti Lightning was at stock voltage...

3DMark Fire Strike (newest version) seems to have some kind of voltage-spike bug during the Combined test.
 
randyrhoads

Rookie
#2,278
Nov 16, 2014
GosuPl said:
ATTENTION!
I'm warning you before you test cards in 3DMark Fire Strike (latest version).
Isn't that thing just for showing off?
 
Elegast7

Senior user
#2,279
Nov 16, 2014
Mobo: Asus Maximus VII Ranger
CPU: Intel Core i5-4690K
RAM: Corsair Vengeance Pro 2133-8GB
Fan: Cooler Master Hyper 212 EVO
GPU: Asus Strix GeForce GTX970
PSU: Corsair RM 850W
(Case: Corsair Graphite 780T)
≈ €1200

Opinions would be appreciated. :D

Edit: I want to go SLI in 1-2 years, hence the high-wattage PSU. And I already have 2.5 TB of HDD space.
 
Last edited: Nov 16, 2014
GosuPL

Forum veteran
#2,280
Nov 16, 2014
randyrhoads said:
Isn't that thing just for showing off?
It's for testing stability with an overclocked GPU.
But never again with that thing...
Only 3DMark 11 and Unigine Heaven 4.0 from now on.
 
© 2018 CD PROJEKT S.A. ALL RIGHTS RESERVED

The Witcher® is a trademark of CD PROJEKT S. A. The Witcher game © CD PROJEKT S. A. All rights reserved. The Witcher game is based on the prose of Andrzej Sapkowski. All other copyrights and trademarks are the property of their respective owners.

Forum software by XenForo® © 2010-2020 XenForo Ltd.