Forums - CD PROJEKT RED

Building a gaming PC

Page 107 of 154
Gilrond-i-Virdan
Forum veteran
#2,121
Nov 18, 2018
The problem is that games can't always match the monitor's refresh rate. I'm not sure how vsync would help if your game's framerate is simply lower than the refresh rate; without adaptive sync you'll still have some tearing.

It's easy if your refresh rate is 60 Hz and your GPU always produces more than 60 fps: with vsync you'll get a stable 60 fps and it will match nicely. In other cases it won't.
 
EngryEngineer

Forum regular
#2,122
Nov 18, 2018
Huh. Lots of tech blogs and G-Sync engineers must be wrong, then, when they say it's a derivative of the adaptive sync standard.

The best of luck with the “future proofing” and the like.
 
Gilrond-i-Virdan

Forum veteran
#2,123
Nov 19, 2018
EngryEngineer said:
The best of luck with the “future proofing” and the like.
Worry not: Nvidia lock-in will lose this one, but I don't expect it to happen very soon.
 
Hoplite_22

Senior user
#2,124
Nov 19, 2018
Gilrond-i-Virdan said:
Worry not: Nvidia lock-in will lose this one, but I don't expect it to happen very soon.
I mean, given that DP 2.whatever onwards is going to have adaptive sync as part of the standard, it feels like Nvidia's G-Sync is going to have to fall in line, or display makers will let it fall by the wayside so they can keep selling the latest thing.
 
SigilFey

Moderator
#2,125
Nov 19, 2018
Gilrond-i-Virdan said:
The problem is that games can't always match the monitor's refresh rate. I'm not sure how vsync would help if your game's framerate is simply lower than the refresh rate; without adaptive sync you'll still have some tearing.

It's easy if your refresh rate is 60 Hz and your GPU always produces more than 60 fps: with vsync you'll get a stable 60 fps and it will match nicely. In other cases it won't.
Whether above or below the refresh rate, there will be tearing. The big issue with straight vsync (at least in times past) is that it forces anything that can't maintain a 60 FPS draw rate to drop to 30 (or, if the refresh is 120, to 60, etc.). That means games can suddenly feel like they're "chugging". Modern hardware is more than capable of handling it, and the drop from 120 to 60 isn't so bad as to interrupt gameplay. Presently, it's fairly expensive hardware for such a minuscule return; most players could take it or leave it.

Besides, running at unlimited FPS is a great way to wear out hardware for no reason. The end result is that a monitor can never display more FPS than its max refresh rate. Lots of players look at their "render" rate, then claim they're seeing 100+ FPS in XYZ title. They're not; they're seeing 60, because their monitor is running at 60 Hz.
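The way straight vsync quantizes the frame rate can be sketched numerically. This is a toy model under one assumption: classic double-buffered vsync, where every finished frame waits for the next vertical blank, so each frame occupies a whole number of refresh intervals.

```python
import math

def effective_fps(render_fps: float, refresh_hz: float) -> float:
    """Displayed frame rate under double-buffered vsync:
    each frame is held until the next vertical blank."""
    intervals = math.ceil(refresh_hz / render_fps)  # vblanks spanned per frame
    return refresh_hz / intervals

# GPU faster than the display: output is capped at the refresh rate.
print(effective_fps(150, 60))   # 60.0
# GPU just barely too slow for 60: output falls all the way to 30.
print(effective_fps(55, 60))    # 30.0
# The same 55 fps on a 120 Hz panel lands on 40 instead (120 / 3).
print(effective_fps(55, 120))   # 40.0
```

This is also why adaptive sync matters most in the 40-59 fps band on a 60 Hz panel: plain vsync rounds everything in that band down to 30, while a variable-refresh display simply shows frames as they finish.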

G-Sync / FreeSync / adaptive sync mostly comes into play when the system can't handle the load. It's a fantastic solution for that, but it's not normally necessary, and it's quite expensive for the few titles that might require it. I'd say it's the equivalent of adding a little extra dressing to your salad: nice to have, but it hardly makes or breaks the meal.


Gilrond-i-Virdan said:
Worry not: Nvidia lock-in will lose this one, but I don't expect it to happen very soon.
Hoplite_22 said:
I mean, given that DP 2.whatever onwards is going to have adaptive sync as part of the standard, it feels like Nvidia's G-Sync is going to have to fall in line, or display makers will let it fall by the wayside so they can keep selling the latest thing.
Having separate processors built into monitors is a valid solution for the issue as things stand now. I expect they'll eventually make it work through software, or build something directly onto the video cards themselves, making Nvidia's method come across as over-engineered. I think it will really start coming into play once 4K becomes more standard, as it'll allow for much more complex geometry without slowdown, and adaptive sync will win out and become a standard technique.

Alternatively, a separate "sync" processor will become so necessary for advanced techniques that manufacturers will simply be required to incorporate them, meaning costs will drop sharply.

Either way, I seriously doubt Nvidia will be cornering this feature as proprietary tech.
 
volsung

Forum veteran
#2,126
Nov 19, 2018
I'm very conflicted. I can get the relatively nice G-Sync monitor and use variable refresh rate immediately with my current GPU, as well as after an upgrade, OR I can shell out a large amount of money for a similarly specced FreeSync monitor AND a Vega 64.

I have been reading a bit about what's been happening behind the scenes, regarding Linux development, and I can see why people dislike Nvidia so much. From my perspective as a Linux user, who dual boots for gaming, I can deal with a fair amount of issues compiling and installing proprietary drivers. Nvidia Linux drivers are still much more efficient than AMD Linux drivers, and even GSync (from my understanding) works transparently in Linux. But they have been really shitty in other ways, from not adopting industry standards to full-on anticompetitive crap like the now defunct Geforce Partner Program.

It's just so hard to switch to AMD at this very moment, when their most competitive card is such a power-hungry beast. Electricity is expensive here, and given my current options I really should buy Nvidia, but it makes me feel a bit dirty.

I wonder what will happen to G-Sync monitors if Nvidia adopts FreeSync. I guess they'd just stop being manufactured, but the drivers would still support them? And I suppose they wouldn't work with non-Nvidia GPUs? That'd be quite a lesson if an RTX killer is released sometime and G-Sync monitors become just... monitors. Then again, that might not happen anytime soon, at least not within the life expectancy of a GPU (or a monitor, for that matter).
 
Last edited: Nov 19, 2018
Gilrond-i-Virdan

Forum veteran
#2,127
Nov 19, 2018
If you want to avoid Nvidia but not use Vega, you can just hold off on switching to 4K. It's hard to say exactly how long it will take AMD to release a new high-end card, but they'll surely do it at some point.

I'm interested in higher resolutions too, but will probably wait for exactly this reason. I'm not going to use Nvidia anymore. Once you switch to open drivers, going back to blobs is really not something you'd want to do.
 
highhand

Senior user
#2,128
Nov 19, 2018
My opinion only: sit down and really think for a bit. You have enough budget to put together a system for the next, say, five years, so the next five years are what you need to plan for. As prices and offerings tumble, is VR in the cards for you? And what about the huge decrease in the cost of monitors: 4K and beyond is already here and discounted. Even now you can pick up some huge savings that would improve your viewing pleasure and speed, as well as your system's content capability.

Remember also that newer games are memory hogs; just think of something like Skyrim with all its mods, or Fallout. Immersive game playing is well worth buying good products for, and as most know, VR is around the corner and will only get better. In this light, look for specials. Right now, through the holidays, they abound: 650 W power supplies under $90, as is a good case; Intel i5 CPUs are inexpensive, some coming with a free CPU cooler; and 2x8 GB RAM can be found under $100. There are many motherboards out there, and $150 will get you a good one, with onboard sound good enough that you don't need a sound card. Read the reviews.

The big hit right now, of course, is the video card. You get what you pay for, and for the future most recommend a 1080 Ti or better. I see the new 2070s can be had for just a few twenties more. These cards are expensive, yes, so for now get a lower-cost one that you can use while you save for the video card upgrade and for the specials sure to come (as cryptocurrency seems to be on the wane, for now). Your system will then be ready for a better monitor and video card, and most upcoming games will fall right into place without a problem. Again, maybe a stretch, but I don't think you will regret it. Think ahead to the system you really want.
 
M4xw0lf

Forum veteran
#2,129
Nov 24, 2018
The laptop stays dead. Good thing it's Black Friday time, so I got a great deal on a Huawei Matebook D14 with an AMD Ryzen 5 processor, a device I'd had my eye on for some time without any real reason or plan to buy it. I ordered one for just 500€ from Amazon Italy, including shipping. :D
 
M4xw0lf

Forum veteran
#2,130
Nov 28, 2018
Got my shiny new toy / piece of work equipment. (It arrived yesterday, but went to the post office since nobody was home.)
I like it a lot so far. I could switch to the German OS language and keyboard layout without any problems during first startup (ordered from Italy, remember), and I've successfully migrated my Firefox settings, so a lot of the important stuff is already done ;)
Pleasantly little bloatware, and the PC Manager software actually seems useful: a lightweight center to keep your drivers up to date (I already used it to update the graphics drivers; super easy to use, and it updated without problems).
The display is adequate; contrast and brightness are good. Only colors in the red spectrum are a bit off, but that can be somewhat alleviated with the nice color control options in the Radeon settings.
500€ well spent, I guess, especially since this laptop is back up to 700-800€.

Oh, and I now have more threads on my laptop than on my gaming PC... the i5 2400 has to go! Bring on the Ryzen 3000 7nm goodness.
 
Last edited: Nov 28, 2018
Gilrond-i-Virdan

Forum veteran
#2,131
Nov 30, 2018
Intel will continue their focus on Linux gaming with their upcoming discrete GPUs: https://hothardware.com/reviews/intel-answers-gpu-questions

Intel’s software suite will obviously play a big role in its upcoming discrete GPU efforts as well. Part of our discussion centered around a plan to use telemetry and machine learning, on a per-system and per-user basis, to suggest the best settings possible for a particular game title. A lot of work has to be done for Intel to realize its vision, but Ari seemed confident and pointed out that Intel has already made significant strides in overall compatibility and with Day 0 driver support for some of the latest games. We should also mention that Ari underscored that Linux gaming will be a focus for Intel as well.
Actual GPUs are expected in 2020. It would be interesting to see how it works out.
 
eskiMoe

Mentor
#2,132
Dec 2, 2018
I think I'm gonna build a new PC (or mostly new) around Zen 2 next year. The 2700X is already impressive, and if they can push boost clocks closer to 5 GHz on the 3700X, I see very little reason to opt for the much more expensive Intel chips.
 
SigilFey

Moderator
#2,133
Dec 3, 2018
Gilrond-i-Virdan said:
Intel will continue their focus on Linux gaming in their upcoming discrete GPU: https://hothardware.com/reviews/intel-answers-gpu-questions
Man, AMD threw them into a tizzy. I love watching that happen over and over again. Every time Intel thinks they've cornered it, AMD is like: BAM! K series. BAM! Athlon. Not enough!? BAM! Athlon 64. Watch out! X2. Okay... okay... we're done... we're done... BAM! Ryzen.
 
M4xw0lf

Forum veteran
#2,134
Dec 3, 2018
SigilFey said:
Man, AMD threw them into a tizzy. I love watching that happen over and over again. Every time Intel thinks they've cornered it, AMD is like: BAM! K series. BAM! Athlon. Not enough!? BAM! Athlon 64. Watch out! X2. Okay... okay... we're done... we're done... BAM! Ryzen.


FUTURAMD
 
Pruny

Rookie
#2,135
Dec 3, 2018
My first CPU was an AMD Athlon 3700+, $200 in March 2006. I still have it ;D
 
Gilrond-i-Virdan

Forum veteran
#2,136
Dec 6, 2018

These are some beast CPUs in the making, if the charts are correct:


Post automatically merged: Dec 6, 2018

More info on upcoming GPUs here. Take it all with a grain of salt, since it's some kind of leak.
 
Last edited: Dec 6, 2018
volsung

Forum veteran
#2,137
Dec 6, 2018
Even if it's only a rumor, I hope it knocks some sense into Nvidia and makes them reevaluate the pricing of their RTX line. I've been doing some reading, and I honestly think the RTX cards are not bad; they're just stupidly priced. The RTX 2070 is basically a slightly faster and more energy-efficient GTX 1080, which is still a high-end card. The problem, at least in Germany, is that 1080s are nowhere to be found anymore. If the RTX 2070 were priced at 450 € or so, I don't think anyone could complain.

That said, $250 for 2070/1080 performance is insane, and I want it to be true. I also hope this line is a lot more energy efficient, which is the thing bothering me about Vega. One thing I can tell you with almost 100% certainty is that my next CPU will be AMD.

In other news, I'm a little ashamed to tell you this, but I got my hands on an HP Omen X 35, a 35" VA monitor with G-Sync. It was on an unbelievably good sale by German standards, cheaper than anything else including FreeSync and "no-sync" monitors. I'm not 100% convinced, because the panel is so high contrast that there is some gradient banding in low-quality video (e.g. YouTube, game intro cinematics), but I'm messing around with the settings to see if it's fixable. Brighter, high-definition content looks amazing, however, and G-Sync appears to be automatically recognized in Linux.
 
Last edited: Dec 6, 2018
SigilFey

Moderator
#2,138
Dec 6, 2018
Something to consider as well is the number of cores versus clock frequency. Games benefit less from many cores at low frequencies than from fewer cores at high frequencies. More cores at lower frequencies are better for productivity (rendering, compiling code, etc.); more cores do not directly translate into better gaming performance.

The way I've always chosen is to look at the highest clock frequency available for the fewest cores. So, say there are 4-core chips hitting 5.2 GHz. If I want to future-proof, I'll want to find an 8-core chip in that same ballpark, say: 8 Cores at 4.8 GHz.
  • A 12-core running at 3.5 GHz is likely to be "good". Alright. It'll probably show its limitations within a year or so.
  • Vice versa, a 12-core at 4.3 GHz is overkill, expensive, and will likely offer very little return for gaming. (It's like buying a semi-truck with a Ferrari engine and using it for grocery shopping.) By the time games are made that will truly benefit from that power, the chips will be much more advanced and far cheaper.
  • I'd want no part of a 16-core at 2.9 GHz (this would be a pure workstation CPU, likely to offer surprisingly poor gaming performance before too long).
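The trade-offs in the list above can be made concrete with a quick Amdahl's-law estimate. This is only a sketch: the 70% serial fraction is an assumed, illustrative number, not a measurement of any real game, and `game_score` is a made-up helper.

```python
def game_score(cores: int, ghz: float, serial_fraction: float = 0.7) -> float:
    """Relative throughput for a mostly-serial workload (Amdahl's law):
    the serial part runs on one core at `ghz`, the parallel remainder
    spreads across all cores."""
    parallel = 1.0 - serial_fraction
    frame_time = serial_fraction / ghz + parallel / (ghz * cores)
    return 1.0 / frame_time

# The configurations discussed above, ranked under this toy model.
configs = [(4, 5.2), (8, 4.8), (12, 3.5), (12, 4.3), (16, 2.9)]
for cores, ghz in sorted(configs, key=lambda c: -game_score(*c)):
    print(f"{cores:2d} cores @ {ghz} GHz -> relative score {game_score(cores, ghz):.2f}")
```

With these assumptions the ranking comes out exactly as argued: 4 cores @ 5.2 GHz leads, 8 @ 4.8 is close behind, and 16 @ 2.9 trails despite having the most total compute. Raise the parallel fraction and the ordering changes, which is precisely the "games of the future" caveat.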
For gaming, I want to ensure that I maintain competitive GHz while the game is running on a single core. That should ensure relatively smooth performance, and a GPU at the same level will handle the load very nicely. Games can't be relied upon to utilize that much threading effectively. (Not yet.) Backwards compatibility is also a factor if I intend to hang onto the games of yore and enjoy them for years to come.

As always, I'll never buy anything that was "just released". Give it a generation or two. It's always a bit of a gamble, but I'd say a very strong 8-core in the ~5.0 GHz range is the best bang for the buck right now.

For AMD chips, I'm happy to subtract 0.5 GHz. AMD chips have traditionally offered equivalent performance to Intel chips at significantly lower GHz and much lower costs. Don't rely too much on "benchmarks", either. For whatever reason, AMD chips tend to get rather spotty scores in bench tests, but their actual, in-game performance can often whomp Intel chips. I've owned three AMD systems over time, and these are my considerations:

- AMD chips have not been adopted by developers as readily as Intel chips. There are occasionally odd issues that arise for certain titles / drivers / Windows updates. These issues are actually pretty darn uncommon, to be fair -- AMD chips work flawlessly with most titles. However, the issues that do arise are not always formally addressed.

+ The cost offset makes AMD chips a very real consideration. I'd much rather buy an AMD chip at the specs I want than shift down to a lower-grade Intel chip for the same price. (The primary reason I purchase Intel is for that extra bit of reliability, and because I can usually get them almost at cost. I normally have to wait for months, though.)
 
Gilrond-i-Virdan

Forum veteran
#2,139
Dec 6, 2018
SigilFey said:
Something to consider as well is the number of cores versus clock frequency. Games benefit less from many cores at low frequencies than from fewer cores at high frequencies.
I'd add: older, badly designed games that don't scale and don't utilize the available parallelism. Properly designed games today should not be CPU-bound when many cores are available; they can use Vulkan to saturate the GPU from all cores.

So you're definitely going to benefit from more cores in modern games.
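The Vulkan approach described here (recording command buffers from many threads, then submitting them together) can be sketched conceptually. This is a pure-Python stand-in to show the structure only; `record_commands` and `record_frame` are hypothetical names, and no real graphics API is involved.

```python
from concurrent.futures import ThreadPoolExecutor

def record_commands(draws):
    # Stand-in for filling one secondary command buffer on a worker thread.
    return [f"draw({d})" for d in draws]

def record_frame(all_draws, workers=4):
    # Split the frame's draw calls into one slice per worker.
    chunk = len(all_draws) // workers
    slices = [all_draws[i * chunk:(i + 1) * chunk] for i in range(workers - 1)]
    slices.append(all_draws[(workers - 1) * chunk:])  # last slice takes the remainder
    with ThreadPoolExecutor(max_workers=workers) as pool:
        buffers = list(pool.map(record_commands, slices))
    # "Submit": splice the per-thread buffers back together in order.
    return [cmd for buf in buffers for cmd in buf]

frame = record_frame(list(range(1000)), workers=4)
print(len(frame))  # 1000 draw commands, recorded across 4 threads
```

In real Vulkan the analogous structure is one command pool per thread with secondary command buffers executed from a primary one. The point is that the CPU-side recording cost shrinks as core count grows, which is exactly where the extra cores pay off.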
 
Last edited: Dec 6, 2018
SigilFey

Moderator
#2,140
Dec 6, 2018
Gilrond-i-Virdan said:
I'd add: older, badly designed games that don't scale and don't utilize the available parallelism. Properly designed games today should not be CPU-bound when many cores are available; they can use Vulkan to saturate the GPU from all cores.

So you're definitely going to benefit from more cores in modern games.
Right, to an extent. I'd point to backward compatibility / legacy gaming as well. Newer games certainly incorporate multi-threading, but not in a way that benefits from 12-16 cores. Not yet. Granted, I haven't sat there running diagnostics, but I've got very few titles that utilize more than 4 cores. (Personally, I've only seen a handful. One of the Battlefields [4... I think] would go up to 6. I think SCUM uses a bunch, but it also tends to have super-stutter-fits at times, so it's not very well optimized yet. There are some mods / .EXE tricks that will force Oldrim to use more cores and memory, but stability suffers.) Most of my library uses only 1 core and may occasionally poke at a second.

Will games be created that can use 12+ cores in the near future? Most likely! Will they greatly benefit from it? Doubtful. So, my opinion is that 8 cores at high GHz is the best balance right now.
 
© 2018 CD PROJEKT S.A. ALL RIGHTS RESERVED