Hardware/Software Technical Discussion Thread

That was not the part that made me bristle. What got to me was acting like NVMe/Optane is ubiquitous and every PC owner who even thinks about gaming has at least $300 sitting in their PCIe slot.


I didn't act like anything was the norm. Ask first. I hope that more game engines start building file systems designed by default to take advantage of solid-state storage. Here's hoping REDengine 4 fits that bill. I hate seeing future games dragged down by the anchor of legacy technology.
 
"NVIDIA GTX 1180 Already On Sale For $1500.00 - WTF" on YouTube


It's coming very soon. Not for $1,500 USD, mind you. Last time Nvidia sent out invites like this, we got the Pascal unveiling and the 1080 and 1070. Yummy.
 

I believe CDPR has not done any optimization yet, and that's why they had to run it on a rocket for the "superexclusive, fancy, VIP-only" demo. We've seen this with betas before: terrible frames in the beta, great frames post-release.

Somehow I'm confident they can deliver impressive graphics even on current tech. I mean, I was able to run W3 on a 670Ti at medium settings, which were still quite pretty actually. I know the present is no guarantee of the future, but it's still some sort of indication.
 
Heh, I'm quite happy TW3 is hitting 60-80 fps for me at 1920x1200 with a Vega 56 using Wine+dxvk. But I'll hold off on buying a higher resolution / higher refresh rate monitor until some next-generation chips come out, maybe Navi or even the one after that.

Why are you waiting? Get a large 1440p screen. It doesn't need to be high refresh (high refresh isn't mature yet - LOADS of potential problems, and you play the panel lottery).

32" 16:9 60hz 1440p samsung for e.g. S32D850 - $388 from amazon.
You can prob find it cheaper. Awesome for gaming, work and video content (16:9 is more practical). No dodgy acer or asus monitor quality.

I have basically no budget limit and this is my current choice & recommendation (unless you're a hardcore CS:GO player and need 100,000 fps).

Regarding 4K: GPU fluency isn't there, plus software scaling at high DPI STILL has a long way to go.
1440p @ 32" 16:9 is the exact same DPI as 24" @ 1080p (the de facto software size/scaling standard), so all software works perfectly at 100% (default) scaling.
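
For the curious, the DPI claim checks out with simple Pythagoras. A quick Python sketch, using the monitor figures from this post:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 32" 1440p and 24" 1080p both land at ~91.8 PPI,
# which is why 100% scaling looks identical on both.
print(round(ppi(2560, 1440, 32), 1))  # 91.8
print(round(ppi(1920, 1080, 24), 1))  # 91.8
```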
 
Ah, yep, you're right. I believe the panel I have right now is TN. Hopefully we see some good, affordable IPS panels with G-Sync/FreeSync @ 144Hz sooner rather than later. I prefer IPS too; my side monitor (no G-Sync) is IPS, so I'm constantly seeing the difference.

I'm a VA convert from IPS. VA has come a long way and doesn't suffer from IPS glow. The blacks are MUCH better, and unless you're at extreme angles there's no colour/contrast shift.
What I'd like to see is just higher refresh rate monitors from quality manufacturers. It doesn't need G-Sync or FreeSync. Just set the refresh and lock in vsync. (Of course, you want a GPU that won't drop below that rate.)

(To clarify on quality manufacturers: I only recommend Asus motherboards, yet I wouldn't touch their monitors outside of a basic office setup. Same with Acer. I saw RMAs firsthand, and there are plenty of stories out there of people having to go through four Predator or Asus-equivalent monitors before finding one that was fault-free. Mind you, these are $1k+ monitors.)
 
That was not the part that made me bristle. What got to me was acting like NVMe/Optane is ubiquitous and every PC owner who even thinks about gaming has at least $300 sitting in their PCIe slot.

Not NVMe, but regular SSDs should be considered the user norm for devs of high-end games. We want those huge textures & models loaded fast.
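
(For a sense of scale, a rough back-of-the-envelope sketch in Python. The sequential speeds are just typical ballpark figures for each drive class, not measurements:)

```python
# Ballpark time to stream 5 GB of assets at typical sequential read speeds.
# Figures are rough drive-class estimates; real game loads are rarely
# purely sequential, so treat these as best-case orders of magnitude.
ASSET_GB = 5
for drive, mb_per_s in [("7200rpm HDD", 150), ("SATA SSD", 550), ("NVMe SSD", 3000)]:
    print(f"{drive:12} ~{ASSET_GB * 1024 / mb_per_s:5.1f} s")
# 7200rpm HDD ~34.1 s, SATA SSD ~9.3 s, NVMe SSD ~1.7 s
```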
 
Ahh, I see.

I've had the same 500GB Samsung 850 EVO SSD for years, still in great condition, and it's my sole drive for gaming and general storage. Forces me to be conservative with what games I have installed at any given time. Considered using an additional HDD, but just too lazy to get one. Plus, I can't stand the slower game loading times.

Given that games are getting MASSIVE in disk space usage, it wouldn't be a bad idea to get a 256GB drive (for the OS and programs) and use the 500GB for gaming only, or get a 1TB (and use the 500GB for the OS).

Fallout 4 (DLC + high-res texture pack, no mods) was 90+ GB.
GTA 5 was 60 GB and required around 80 or 90 GB (if memory serves me right) of free space to unpack during install.

Spinning rust should only be used for mass media storage - no apps/programs. Music, photos, video - the latter chews up space like no tomorrow.
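
(On the GTA 5 point above: a pre-install free-space check is trivial to script. A minimal Python sketch; the drive path and the ~90 GB figure are just the example from this post:)

```python
import shutil

NEEDED_GB = 90  # assumed unpack headroom, per the GTA 5 example above
# Drive letter is illustrative; use "/" on Linux/macOS.
free_gb = shutil.disk_usage("C:\\").free / 1024**3
if free_gb < NEEDED_GB:
    print(f"Only {free_gb:.0f} GB free - clear some space before installing.")
else:
    print(f"{free_gb:.0f} GB free - good to go.")
```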
 
Not NVMe, but regular SSDs should be considered the user norm for devs of high-end games.

They will be as soon as they become the standard in consoles. :) Otherwise, load times need to be bearable on the XB1 and PS4, and the streaming needs to be able to keep up with the game (which can apparently be problematic even in TW3).
 
They will be as soon as they become the standard in consoles. :) Otherwise, load times need to be bearable on the XB1 and PS4, and the streaming needs to be able to keep up with the game (which can apparently be problematic even in TW3).

Oh man, console man is keepin' us down.
Can't they do a boxed SSD swap-out option? (Extrematize your game!) The N64 had that RAM upgrade pack.
 
Why are you waiting? Get a large 1440p screen. It doesn't need to be high refresh (high refresh isn't mature yet - LOADS of potential problems, and you play the panel lottery).

32" 16:9 60Hz 1440p Samsung, e.g. the S32D850 - $388 from Amazon.
You can probably find it cheaper. Awesome for gaming, work, and video content (16:9 is more practical). None of the dodgy Acer or Asus monitor quality.

Maybe, but I personally prefer Dells. I'm OK with 1920x1200 for now, and a higher resolution would require a better GPU for me to keep the framerate good. Also, I like 16:10 more than 16:9, and all high-resolution monitors like that are either very expensive or have bad response times.

32" is also a bit too big, requiring you to put the monitor in a distance, otherwise it's not ergonomic to use. Something like 27" is optimal I think.
 
I didn't act like anything was the norm. Ask first. I hope that more game engines start building file systems designed by default to take advantage of solid-state storage. Here's hoping REDengine 4 fits that bill. I hate seeing future games dragged down by the anchor of legacy technology.

And I'm hoping that CP2077 will be able to run on hardware that is more than a few months old or under $1k - the sort of systems that regular people own. I understand where you're coming from, and I see no reason for a 32-bit version of CP2077 that can run on a 16MB 486, but I also don't think the latest-and-greatest should be the baseline unless it's near-ubiquitous. Since most systems have been 8GB quad-cores for a while now, I see that as a reasonable baseline.

Also, with the issues DX12 had, and pretty much every DX12 game that came out running better under DX11 anyway, I think the baseline should be something that actually works instead of simply whatever is newest.

Not NVMe, but regular SSDs should be considered the user norm for devs of high-end games. We want those huge textures & models loaded fast.

I think that modest SSDs are common enough among people who have a system even remotely suitable for gaming that that isn't unreasonable. Sure, the fancy stuff should also be supported, but I don't think it should be considered the baseline.

Maybe, but I personally prefer Dells. I'm OK with 1920x1200 for now, and a higher resolution would require a better GPU for me to keep the framerate good. Also, I like 16:10 more than 16:9, and all high-resolution monitors like that are either very expensive or have bad response times.

32" is also a bit too big, requiring you to put the monitor at a distance, otherwise it's not ergonomic to use. Something like 27" is optimal, I think.

I run 32" 1920x1080 and sit a bit closer than people half my age ever would. You kids with your 20/20 eyesight....
 
Why are you waiting? Get a large 1440p screen. It doesn't need to be high refresh (high refresh isn't mature yet - LOADS of potential problems, and you play the panel lottery).

32" 16:9 60Hz 1440p Samsung, e.g. the S32D850 - $388 from Amazon.
You can probably find it cheaper. Awesome for gaming, work, and video content (16:9 is more practical). None of the dodgy Acer or Asus monitor quality.

I have basically no budget limit and this is my current choice & recommendation (unless you're a hardcore CS:GO player and need 100,000 fps).

Regarding 4K: GPU fluency isn't there, plus software scaling at high DPI STILL has a long way to go.
1440p @ 32" 16:9 is the exact same DPI as 24" @ 1080p (the de facto software size/scaling standard), so all software works perfectly at 100% (default) scaling.

144 FPS is beneficial for far more than hardcore FPS gameplay. I've seen that assumption a few times: if you have a high refresh rate display, you must be an FPS player, because there's absolutely zero reason for one otherwise. :rolleyes:

It makes games way smoother in general. It's fantastic for RPGs, strategy games, and basically every game on the market. If you've never played at 144 FPS you won't know the difference, but it's a big one. I had a 1440p display in the past; it was alright, but not worth the trade-off for me.
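
(To put some numbers behind "smoother": the frame-time budget is just 1000 ms divided by the refresh rate. A quick illustrative sketch:)

```python
# Frame-time budget per refresh rate: the gap between 16.7 ms and 6.9 ms
# per frame is what you feel as smoothness, regardless of genre.
for hz in (60, 100, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 100 Hz -> 10.00 ms, 144 Hz -> 6.94 ms
```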

And I've never had (nor have I ever heard of) issues with "panel lottery," so not sure what you're talking about there. Monitor issues are common among all displays, certainly not just 144Hz.

24", 1080p, 144Hz will be my go-to for the foreseeable future. Until hardware is powerful enough to drive high FPS at 1440p, 4K, etc., that is, then I'll gladly jump on board.

Bottom line? Everybody has their preferences. Some people prefer higher resolutions so they can have their massive 34" curved displays, others are happy with 24" (dual monitors in my case, for easy multitasking) for high FPS.
 
And I'm hoping that CP2077 will be able to run on hardware that is more than a few months old or under $1k - the sort of systems that regular people own. I understand where you're coming from, and I see no reason for a 32-bit version of CP2077 that can run on a 16MB 486, but I also don't think the latest-and-greatest should be the baseline unless it's near-ubiquitous. Since most systems have been 8GB quad-cores for a while now, I see that as a reasonable baseline.

Also, with the issues DX12 had, and pretty much every DX12 game that came out running better under DX11 anyway, I think the baseline should be something that actually works instead of simply whatever is newest.



I think that modest SSDs are common enough among people who have a system even remotely suitable for gaming that that isn't unreasonable. Sure, the fancy stuff should also be supported, but I don't think it should be considered the baseline.



I run 32" 1920x1080 and sit a bit closer than people half my age ever would. You kids with your 20/20 eyesight....


Agree pretty much 100%. I am 50+ and legally blind, btw. Horrid eyesight on this side of the screen. Cataracts and glaucoma in both eyes, and the retinal wall in my left eye is about a third gone - or rather, it's floating around inside my eye in thousands of tiny pieces.
 
When this game comes out on next-gen consoles, I'll buy it on one of those, because I want to get the best experience possible.
 
A lot of the demo impressions raved about how crazily dense and populated the city was in the Cyberpunk demo. Yeah, the demo was running on a PC, but how in the world will Night City run at 30 fps on current-gen consoles? From what I understand, CDPR had to reduce the number of NPCs in Novigrad in The Witcher 3 just to keep the FPS stable, and even then it struggled to hold 30 fps.

Novigrad is a medieval fantasy town; it's relatively simple. Night City is a Ghost in the Shell / Blade Runner futuristic cyberpunk setting, with flying cars, neon billboards everywhere, and a ton of vehicles. And like the demo impressions claim, the world will be super packed with people.

What is CDPR gonna do - cut the number of NPCs in half and drop to 900p to make it run at 30 fps on current-gen consoles? That would probably hurt the game, since most media will knock it; they're probably not even aware that consoles aren't as strong as high-end PCs and that it won't look like the demo they saw.

I hope CDPR didn't make a mistake even with this closed-door demo. I hope they have an idea how well the game will run on current-gen consoles. Like, it obviously needs to at least be locked to 30 fps, so I hope they have an idea of how they will accomplish that.
 
I hope CDPR didn't make a mistake even with this closed-door demo. I hope they have an idea how well the game will run on current-gen consoles. Like, it obviously needs to at least be locked to 30 fps, so I hope they have an idea of how they will accomplish that.

Whether people want to admit it or not, both consoles played a significant role in The Witcher III's success, so it is very important to show good performance on both machines. Now, HOW they will achieve that with a game of such magnitude - one can only guess.

We must be rational in our expectations. The game will always be better on PC, that's for sure. But on current-gen consoles it shouldn't be as bad as Fallout: New Vegas was on the PS3. As a fellow console owner, I share your concerns completely, though. Let's get ready for inevitable, maybe huge, but certainly (and sadly) necessary downgrades.

When this game comes out on next-gen consoles, I'll buy it on one of those, because I want to get the best experience possible.

You'll get the best one on a high-end PC though.
 
You'll get the best one on a high-end PC though.

Well, it depends. Consoles have a lot of advantages: comfort, screen size, the ease of sharing the experience with family and friends.

If you don't care about mods and that extra 10% graphics boost from PC (or however you define the ratio of superior PC graphics to next-gen consoles), then a console is arguably the superior experience.

It's a tradeoff and each has advantages.
 
Oh, that's absolutely true, they all have their pros.

Just curious - how big a hit, percentage-wise, do you think it will take on current-gen ones? Regarding visuals, draw distance, and so on. On the base Xbox One S and PS4 consoles.
 