Building a gaming PC

Ok... I'm going to be that guy and totally go against some of what's been said here.
  1. You DO NOT need an i7. Actually, you don't even need an Intel processor to make a truly awesome rig. Yes, you need a decent i5 or equivalent, but to say you need an i7 is simply not accurate.
  2. You DO NOT need a stupid amount of RAM. You need around 8GB on Win10 or 8.1 to be comfortable, 16GB is better, and anything more is a waste. Make sure you get the fastest you can for your board, though.
  3. You DO need as good a graphics card as you can afford. My rig cost me $650 (excluding monitors) 2 years ago, and the graphics card was around $300 of that. It currently runs new VR titles at full capacity, and I get 1440p on pretty much everything except the most punishing games; on those I'm getting very solid performance at 1080p. If anything, I might beef up my graphics card next.
  4. You do not need to spend stupid money to make a good gaming rig, even for VR. You simply need to be intelligent about how you choose to build it.
Here's a site I strongly suggest you look at before taking any action: https://techguided.com/best-cheap-gaming-pcs/
My rig is based on an older build with an upgraded video card and, as I said above, is over 2 years old. I run current games - Hellblade, Vampyr, etc. - as well as older games like The Witcher 3, Hitman, etc. I'm seriously not having any issues at all. I strongly advise that you read up a bit and perhaps take a look at that link before deciding your next steps.

Can you buy something big and gorgeous? Sure... if you want. If money isn't an issue, I can recommend a ludicrous build for around $1400-$2k that will rock your socks. But you don't need it.

Source: I've worked in IT for over 25 years. I built my own rig for VR with a target of $700 excluding the headset, and beat it (barely :)).

I mostly agree, except for a couple things I feel were omitted or overlooked.

A) The latest generation of i3s are actually pretty impressive. The i3-8xxx series is comparable to the i5-7xxx series

B) 1080p gaming has notably lower requirements than 1440/4k/VR so you can get by with going a step down

C) You don't really need more than 1080p unless you have the eyesight to read a book from across the room or are running a ginormous screen. It's less about pixel count than pixel density, and the Inverse Square Law plays a role insofar as distance from the display affects effective pixel density. At a certain point, it's less about looking better and more about bragging rights. VR does need a bit more resolution, since the display is a lot closer than it is on most desktop rigs, and larger screens need a higher pixel count to keep from looking like a tile mosaic, but for anything under about 27", you can get by with less than you think you can, unless you sit close enough to your monitor to get nose prints on the screen.
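To put rough numbers on that pixel-density point, here's a quick back-of-the-envelope sketch. The ~60 pixels-per-degree acuity threshold and the example monitor sizes are my own assumptions for illustration, not anything from the posts above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display."""
    return math.hypot(width_px, height_px) / diagonal_in

def min_distance_in(ppi_val, px_per_degree=60):
    """Viewing distance (inches) beyond which individual pixels blend
    together, assuming ~60 px/degree for normal 20/20 acuity."""
    # One pixel spans 1/ppi inches; at the threshold distance d it
    # subtends 1/px_per_degree degrees: tan(1/60 deg) = (1/ppi) / d
    return (1 / ppi_val) / math.tan(math.radians(1 / px_per_degree))

for name, w, h, diag in [("27in 1080p", 1920, 1080, 27),
                         ("27in 1440p", 2560, 1440, 27),
                         ("27in 4K",    3840, 2160, 27)]:
    p = ppi(w, h, diag)
    print(f"{name}: {p:.0f} PPI, pixels vanish beyond ~{min_distance_in(p):.0f} in")
```

For a 27" 1080p panel (~82 PPI) this works out to roughly 42 inches, i.e. a bit beyond typical desk distance - which lines up with the argument that viewing distance, not raw pixel count, is what actually matters.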
 

Guest 4310777
I mostly agree, except for a couple things I feel were omitted or overlooked.

A) The latest generation of i3s are actually pretty impressive. The i3-8xxx series is comparable to the i5-7xxx series

B) 1080p gaming has notably lower requirements than 1440/4k/VR so you can get by with going a step down

C) You don't really need more than 1080p unless you have the eyesight to read a book from across the room or are running a ginormous screen. It's less about pixel count than pixel density, and the Inverse Square Law plays a role insofar as distance from the display affects effective pixel density. At a certain point, it's less about looking better and more about bragging rights. VR does need a bit more resolution, since the display is a lot closer than it is on most desktop rigs, and larger screens need a higher pixel count to keep from looking like a tile mosaic, but for anything under about 27", you can get by with less than you think you can, unless you sit close enough to your monitor to get nose prints on the screen.

Resolution and frame rate are a bit subjective; I think 1440p is the sweet spot for performance vs pixel density. Of course, sitting back fixes the problem, but that's because the resolution of your own eye can no longer resolve the individual pixels of the screen, so they effectively "vanish". You still have the limitation of the resolution - you've just synchronised the limitations of your eye with the limitations of the display, if that makes sense. It's kind of like how textures in games look amazing as long as each texel stays at the sub-pixel level, then you get too close and it turns to shit. So yes, you could just keep your distance so you can't perceive the lack of detail, but by doing so you haven't increased the detail; it's just not there. Anyway, rant over.

1440p is my subjective preference; however, I do look forward to 4K becoming the standard on high-refresh-rate monitors, because it essentially nukes aliasing completely and looks incredible. That being said, 1080p is absolutely fine running natively, without the annoying interpolation from upscaling. I am spoiled: I have a Zowie 1080p 144Hz, an Asus 1440p 165Hz, and a Dell 4K 60Hz. The Asus is my favourite. The 1080p is for the family comp, running a 1050Ti; it's a good match, and with a bit of effort you can tune games to look and run great (just).
 
If you INSIST on maxed-out settings for every game, then you really have no choice except to pony up for a 1080Ti, and you'll probably still complain until the 1180Ti.

If you don't mind turning down details a notch - so that everything looks practically the same while you're moving, but sitting still and taking a magnifying glass to your display reveals that the reflection of the ripples in that puddle over there is a little off - then a 1070 will more than do it, and a 1060 might even do the trick.

If High is good enough for you though, even a 1050Ti will do 60FPS@1080p in most games.
I'll go with 1070, I guess.
 
I'll go with 1070, I guess.

Check out the AMD Vega 56 as well; it's competitive with the 1070. I switched from Nvidia to AMD a while ago, and I'm quite happy with it. Pricing was the issue until recently, but it's getting better now.

I've also heard that Nvidia pays game developers to make their games perform better on Nvidia hardware than on AMD's, and that Intel does the same.

It's mostly caused by developers using some Nvidia lock-in garbage that doesn't work on AMD with hardware acceleration. When developers use proper tools, games perform well on all GPUs.
 
Resolution and frame rate are a bit subjective; I think 1440p is the sweet spot for performance vs pixel density. Of course, sitting back fixes the problem, but that's because the resolution of your own eye can no longer resolve the individual pixels of the screen, so they effectively "vanish". You still have the limitation of the resolution - you've just synchronised the limitations of your eye with the limitations of the display, if that makes sense. It's kind of like how textures in games look amazing as long as each texel stays at the sub-pixel level, then you get too close and it turns to shit. So yes, you could just keep your distance so you can't perceive the lack of detail, but by doing so you haven't increased the detail; it's just not there. Anyway, rant over.

1440p is my subjective preference; however, I do look forward to 4K becoming the standard on high-refresh-rate monitors, because it essentially nukes aliasing completely and looks incredible. That being said, 1080p is absolutely fine running natively, without the annoying interpolation from upscaling. I am spoiled: I have a Zowie 1080p 144Hz, an Asus 1440p 165Hz, and a Dell 4K 60Hz. The Asus is my favourite. The 1080p is for the family comp, running a 1050Ti; it's a good match, and with a bit of effort you can tune games to look and run great (just).

I agree that they are a bit subjective. Half a lifetime ago, when I could count the pixels on my monitor from halfway across the room, I was considerably fussier. But nowadays I have no reason to spend $600 on a monitor and another $400 on a 1070 when I can get a case, 600W PSU, motherboard, mid-range i5, 8GB RAM, 512GB SSD, 2TB spinner, 1050Ti, and a 32" 1080p screen for under a grand without missing anything.

As for 4K, I never noticed enough difference at normal viewing distances to warrant the cost, except on screens too large to fit in my apartment. Yeah, it nukes aliasing, but so does aging. You may think it won't happen to you, but it will: there will come a time when your cornea becomes too stiff to focus on near objects no matter how much you strain and squint. And the money I save by gaming at 1080p instead of 4K is enough to get nice glasses with all the options instead of some ugly, basic pair. It's all about priorities ;)

Check out the AMD Vega 56 as well; it's competitive with the 1070. I switched from Nvidia to AMD a while ago, and I'm quite happy with it. Pricing was the issue until recently, but it's getting better now.

There is still room for improvement, and AMD has to close the gap if they want their GPUs to be anywhere near as competitive a value as their Nvidia rivals. When I checked a few minutes ago, they were still charging 1070Ti prices for 1070 performance. Getting better, but still not where it needs to be yet.

It's mostly caused by developers using some Nvidia lock-in garbage that doesn't work on AMD with hardware acceleration. When developers use proper tools, games perform well on all GPUs.

Sadly, we players have no control over what tools devs use, so we are stuck choosing the cards that work best with the games we are given.
 
When I checked a few minutes ago, they were still charging 1070Ti prices for 1070 performance. Getting better, but still not where it needs to be yet.

They actually charge normal prices; retailers, however, inflated them because of the cryptocurrency boom. AMD hardware is better for GPGPU, so it's more popular among miners. Now that the Ethereum boom is over, prices are finally normalizing. You can now get a Vega 56 for around the same price as a GTX 1070.

Sadly, we players have no control over what tools devs use, so we are stuck choosing the cards that work best with the games we are given.

I don't really care, personally. If developers picked some Nvidia-only garbage feature, I just disable it. I consider it in very poor taste on the developers' part, however.
 
I just checked: Sapphire Pulse Vega 56 - $480 on Newegg, comparable to some GTX 1070 models. It was at a crazy $750 a few months ago. It can probably still go down a bit more.
 

As for 4K, I never noticed enough difference at normal viewing distances to warrant the cost, except on screens too large to fit in my apartment. Yeah, it nukes aliasing, but so does aging. You may think it won't happen to you, but it will: there will come a time when your cornea becomes too stiff to focus on near objects no matter how much you strain and squint. And the money I save by gaming at 1080p instead of 4K is enough to get nice glasses with all the options instead of some ugly, basic pair. It's all about priorities ;)

Something to look forward to, huh :p I'm turning 30 soon. I'm just old enough to have experienced the Commodore Amiga, the Intel 286/386/486, the Pentium 1/2/3, the AMD Athlon 64, the Intel Core 2 Duo, etc., and all the different storage, graphics, and display technologies that evolved along the way. I think I'm lucky to have experienced computers from a young age and seen the rapid evolution of games and hardware, in a way that just isn't happening anymore - at least not in a way that translates to dramatic changes in the end-user experience. There's definitely a curve of diminishing returns that we are pushing against on so many fronts now. I look forward to seeing improvements in artificial intelligence and physics in video games; the graphics are really too far ahead of the other elements now, I think. I would definitely like to see consoles locking down 1080p @ 60fps instead of moving to 4K @ 30fps like I think we are going to see. I was watching a recent lecture by John Carmack; he was talking about how not only did old console games on the SNES run at 60fps, the actual input-to-output latency was much lower than on modern consoles and TVs.
 
I just checked: Sapphire Pulse Vega 56 - $480 on Newegg, comparable to some GTX 1070 models. It was at a crazy $750 a few months ago. It can probably still go down a bit more.

When I looked, the 56 was closer in price to a 1070Ti. Then again, Amazon still lists the 1050Ti at ~25% more than it was when I ordered mine.

Something to look forward to, huh :p I'm turning 30 soon. I'm just old enough to have experienced the Commodore Amiga, the Intel 286/386/486, the Pentium 1/2/3, the AMD Athlon 64, the Intel Core 2 Duo, etc., and all the different storage, graphics, and display technologies that evolved along the way. I think I'm lucky to have experienced computers from a young age and seen the rapid evolution of games and hardware, in a way that just isn't happening anymore - at least not in a way that translates to dramatic changes in the end-user experience. There's definitely a curve of diminishing returns that we are pushing against on so many fronts now. I look forward to seeing improvements in artificial intelligence and physics in video games; the graphics are really too far ahead of the other elements now, I think. I would definitely like to see consoles locking down 1080p @ 60fps instead of moving to 4K @ 30fps like I think we are going to see. I was watching a recent lecture by John Carmack; he was talking about how not only did old console games on the SNES run at 60fps, the actual input-to-output latency was much lower than on modern consoles and TVs.

Yeah, I remember back in middle school being impressed that some computers even had color displays. Evolution has slowed, and in some cases reversed. I'd like to see a return to the days when game design (story, mechanics, and world layout) meant more than graphical detail. I mean, finding a secret passage in an old 640x480 game was a lot more thrilling to me than a game where the walls are rendered in WTFHD sub-pixel detail but it's just a wall, since they spent too much T&E on graphics to bother putting actual detail (like secret passages) in there. But enough about my hopes for CP2077.

Diminishing returns is why I think 1080p is good enough for most folks. Figure, what is the average display size among gamers? The surveys I've seen show that most people have screens under 25". Likewise, most folks have more modest computers than us gamers; even my i5-4460/1050Ti is a beast compared to what many folks have. But you won't hear many non-gamers threatening suicide if their latency ever exceeds 40ms, they drop to 59FPS for a split second, or they have to turn one single graphics setting down from UltraSuperMaximalOver9000!!!!!111111 either.

So if 1080p is good enough for average-sized displays, and a 1050Ti is good enough for 60FPS@1080p, why spend more than you really need to? Now, that's not to say I wouldn't go for a 1070Ti pushing a 40" 1440p if I had the money to spare, but I could get more joy for the same money by, say, eating out once a week for a year, so I'll go with the better ROI.
 
So what are your thoughts on the upcoming RTX 2070/2080? I'm thinking about upgrading my aging GTX 970.

It will depend on real-world benchmarks. RTX is super cool and will be in 2 games I'll be playing, so I will probably upgrade just for that, but it depends on the benchmarks.

Oh, and don't get the Founder's Edition - it's super overpriced. Wait.
 

Hello everyone, for Christmas I'm planning to buy myself a gaming computer, using this website https://www.pcspecialist.co.uk/pcs/ and I was hoping that there would be a few people here willing to give me good advice on what parts would be best to buy. I'm not completely clueless when it comes to computers but I've never bought a PC designed specially for gaming, so any advice on what works and what doesn't would be greatly appreciated.

Ideally I don't want to spend more than £1500.
I'd like to be able to play games (like The Witcher 3) on high settings if possible, to give you an idea of what kind of performance I'm looking for.

At the moment, the computer specs I've chosen are as follows:

CPU: Intel Core i7 six core processor i7-4930K (3.4GHz) 12MB cache
Motherboard: ASUS P9X79 LE: INTEL SOCKET LGA2011
Memory (RAM): 16GB Kingston Hyper-X Fury Dual-DDR3 1600MHz (2 x 8GB)
Graphics Card: 4GB NVIDIA GEFORCE GTX 980 - 1 DVI, 1 mHDMI, 3 mDP - 3D Vision Ready
1st Hard Disk (and my only hard disk): 500GB 3.5" SATA-III 6Gb/s HDD 7200RPM 16MB CACHE
DVD/BLU-RAY Drive: 8x BLU-RAY ROM DRIVE, 16x DVD ±R/±RW
Memory Card Reader: INTERNAL 52 IN 1 CARD READER (XD, MS, CF, SD, etc) + 1 x USB 2.0 PORT
Power Supply: CORSAIR 650W VS SERIES™ VS-650 POWER SUPPLY
Sound Card: Asus Xonar DG 5.1 SoundCard & Headphone AMP (Award Winner)
OS: Genuine Windows 8.1 64 Bit - inc DVD & Licence
Keyboard & Mouse: CM Storm Devastator Keyboard and Mouse


The cost of this is (with VAT incl) £1619, which is a little over my budget. I was wondering if anyone has any suggestions for cheaper alternatives or if I've chosen something unnecessarily powerful and so on.

Also, I have no idea whether it's better to have 2 okayish graphics cards or one amazing graphics card... so I picked one that is quite expensive.

Any feedback would be appreciated!!

Also, I'm sorry if I posted this in the wrong section.
One brand: Alienware.
 
So what are your thoughts on the upcoming RTX 2070/2080? I'm thinking about upgrading my aging GTX 970.

Also, hi @Gilrond-i-Virdan !
Nvidia has gone completely off their rocker with the pricing on those. Performance leaks (and their own between-the-lines announcements) suggest something like 35-40% improvement for each 10x0 -> 20x0 successor, at prices 70-100% higher.
Also, DICE is already backtracking on the raytracing implementation in BF V, simply because the performance is too poor.
I get the feeling that the Turing cards are a filler generation, previewing raytracing technology that will only be useful on future, even more powerful 7nm chips.
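Just to make the value argument concrete, here's a trivial sketch using the leaked figures quoted above (~35-40% more performance at ~70-100% higher price); these are rough leaks, not confirmed benchmarks, so treat the numbers as illustrative:

```python
def perf_per_dollar(relative_perf, relative_price):
    """Performance per dollar, normalized to the previous generation."""
    return relative_perf / relative_price

baseline = perf_per_dollar(1.00, 1.00)    # 10-series card as the reference
best_case = perf_per_dollar(1.40, 1.70)   # +40% perf at +70% price
worst_case = perf_per_dollar(1.35, 2.00)  # +35% perf at +100% price

print(f"10-series baseline: {baseline:.2f} perf/$")
print(f"20-series estimate: {worst_case:.2f} to {best_case:.2f} perf/$")
```

Even in the best case that's roughly 18% less performance per dollar than the outgoing generation, and in the worst case about a third less - which is why waiting for real benchmarks (and real street prices) seems sensible.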


In other news, I'm very close to upgrading to a Pinnacle Ridge (Ryzen 2000) system now. Sandy Bridge can finally rest in peace.
 
Thanks everyone. Yeah, the current performance doesn't look too impressive considering the ridiculous price tag. My current GTX 970 is still alive and kicking, and in some games all I have to do is turn down a couple of settings, but it has been 4 years and I kind of want a new GPU.

What about getting a previous-gen GTX 1080 Ti if/when the prices go down? I don't love the idea of getting an older card, but oh well...

Also, what a disappointment that they wasted Turing's name on a transitional/experimental device. It should have been saved for the groundbreaking stuff.

Hi @M4xw0lf ! I'm still in Germany and everything's going well!
 
Thanks everyone. Yeah, the current performance doesn't look too impressive considering the ridiculous price tag. My current GTX 970 is still alive and kicking, and in some games all I have to do is turn down a couple of settings, but it has been 4 years and I kind of want a new GPU.

What about getting a previous-gen GTX 1080 Ti if/when the prices go down? I don't love the idea of getting an older card, but oh well...

Also, what a disappointment that they wasted Turing's name on a transitional/experimental device. It should have been saved for the groundbreaking stuff.

Hi @M4xw0lf ! I'm still in Germany and everything's going well!
Getting a 1080(Ti) seems the smarter choice atm, but let's wait and see. Less than two weeks until launch.
And well, the Turing GPU is kind of groundbreaking in that it brings new technology to the market - it just probably won't be powerful enough to pull off playable realtime raytracing at scale. Still, it's very good and much more powerful than previous GPUs for non-realtime raytraced rendering.

Nice to hear life's good for you here :)
 
My rule of thumb when building a new system is to always get the best hardware from the prior generation. Even if I have money to burn, buying state-of-the-art hardware usually means buying really expensive, proprietary technology that is completely untested and unsupported on the current market. So I'm basically spending ~$1,000 to beta-test the hardware. It will be 1-3 years before the first titles appear that are written to take advantage of it and most of the kinks are worked out. By that time, not only will prices have dropped significantly, but the newer models will be faster and more stable.

The system I built in 2015 is still screaming today: ASUS Z97 mobo, i7-4790K, GTX 980 Ti, 16 GB RAM, Samsung EVO SSD. No overclocking, no extra cooling. I can still run all my games at Maximum/Ultra graphics at 1080p smoothly (50-60 FPS), including recent performance-killers like SCUM. (Heck, I got Batman: Arkham Knight for free at release, and even that ran without any major issues.)
 