Getting PC ready for Phantom Liberty DLC

Hi, I'm trying to get my PC ready for the DLC. I did notice that the requirements went up.

Would it be best to just wait until it is out, and then look at benchmarks? Probably, I think.

Right now, I have a 12700K and an RTX 3070. I have a 1080p screen and a 4K screen, but I have played CP on 1080p so far, RT Ultra. Playing in 4K would be nice.

I'm thinking it would be good to get an RTX 4080, no? But will it be enough, or do I need a better CPU as well? Possibly wait for the 14700K? Or does AMD have good CPUs as well right now?

 
Hopefully they explain the updated reqs soon. That being said, I'd definitely wait for independent benchmarks to come out next month first.

Many are scratching their heads over the updated reqs, particularly the CPUs listed. A 7800X3D for 1080p High, but a 12900 for 4K Overdrive? Hard to accept that without proof.

If it's accurate it'll be very interesting to learn why...
 
What if the graphical presets are named after how much you will miss the sun after playing the expansion? :smart:
"Ray tracing: Overdrive" sounds suspicious.

FPS = Fairly Plane Skin
OS = Other Side (and the value is Windows)
RAM = Risk At Midday
ST O RAGE = Stay Tight Or Rage (and the value is GB NVME = Great Burn NeVerMind End)

Then it would make sense that the specs increased :O

I will walk myself out :) o_O
 
Many are scratching their heads over the updated reqs, particularly the CPUs listed. A 7800X3D for 1080p High, but a 12900 for 4K Overdrive? Hard to accept that without proof.
Scratching my head too. The new CPU requirements/recommendations seem very high. An i9 12900 even without ray tracing? As for the GPU, I don't think I'll be able to play even 1080p on RT Ultra with the 3070's mere 8 GB of VRAM. I don't need high fps in single-player, but 60 fps would be nice.
 
Many are scratching their heads over the updated reqs, particularly the CPUs listed. A 7800X3D for 1080p High, but a 12900 for 4K Overdrive? Hard to accept that without proof.
It's not really required, it's recommended. I'm guessing it will run on lower specs at those settings, but we won't know until release unless CDPR hands out codes for people to test before then. CDPR, I'll be willing to test it out! Gimme a call ;)
 
It's not really required, it's recommended.
Yes, but who wants to play at 30 fps? Besides, especially in this game, if you enter a busy market, it'll drop some more. If you play at 30 fps you might end up with 20-25 in certain locations, likewise with 60 fps, it will drop to 40-50 fps in some places.
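The point above is that dips scale with the baseline: the same roughly 17-33% drop that turns 30 fps into 20-25 also turns 60 fps into 40-50. A quick sketch of that arithmetic (the dip percentages are inferred from the numbers in the post, not measured):

```python
# Frame-rate dips in heavy scenes scale proportionally with the baseline fps.
# The 17-33% dip range below is back-solved from the "30 fps -> 20-25 fps"
# example; it is an illustration, not a benchmark.
def dip_range(baseline_fps, dip_low=0.17, dip_high=0.33):
    """Return the (worst, best) fps you might see in a busy area."""
    return (round(baseline_fps * (1 - dip_high)),
            round(baseline_fps * (1 - dip_low)))

print(dip_range(30))  # roughly (20, 25)
print(dip_range(60))  # roughly (40, 50)
```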
 
Yes, but who wants to play at 30 fps? Besides, especially in this game, if you enter a busy market, it'll drop some more. If you play at 30 fps you might end up with 20-25 in certain locations, likewise with 60 fps, it will drop to 40-50 fps in some places.
I mean, clearly 13th-gen Intel or 7000-series AMD will be ahead of it, but we're not sure exactly why these recommendations are what they are because, well, they're nonsense.

For instance, if the game is going to use the extra L3 cache of an X3D chip effectively, then a 12700 would be way behind; if it's not using that at all, why not the 7700, which is roughly equivalent to the 13700 in pretty much everything else? Why a 7900 if it doesn't need the extra cores? Is it that limited by clock speed, and does it do that much better with the clock boost the 7900 gets? Would that not mean the 12900 should also be beaten by the 7700? It is all very, very unclear.
 

So, a 3090 Ti doesn't cut it anymore if you want 4K and path tracing, according to Nvidia, but the RTX 20 and 30 series will get DLSS 3.5 too (without frame generation). This just reaffirms for me that a 4080 or 4090 is required to see it in 4K in its best version.
Tbh I have trouble seeing 4K PT on a 4080 due to the VRAM on it. Sure, on Performance mode DLSS it might be enough, but on Quality it's already close to 15 GB of VRAM usage now. DLSS 3.5 might help though, not sure really. Anyway, it's nice to see some numbers, but at the same time 1080p upscaled to 4K isn't ideal -.-
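For reference, the "1080p upscaled to 4k" remark follows from DLSS's per-axis render scales. A quick sketch of the internal resolutions per mode (the scale factors are the commonly cited per-axis values for DLSS presets, not numbers from this thread):

```python
# Internal render resolution per DLSS mode at a given output resolution.
# Per-axis scale factors are the commonly cited preset values (assumption,
# not taken from this thread).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the GPU actually renders before DLSS upscales it."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders at {w}x{h}")
```

Performance mode at a 4K output renders internally at 1920x1080, which is exactly the "1080p upscaled to 4K" case.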
 
Tbh I have trouble seeing 4K PT on a 4080 due to the VRAM on it. Sure, on Performance mode DLSS it might be enough, but on Quality it's already close to 15 GB of VRAM usage now. DLSS 3.5 might help though, not sure really. Anyway, it's nice to see some numbers, but at the same time 1080p upscaled to 4K isn't ideal -.-

Yes, this is an important point: the Performance profile, not Quality. As for 16 GB not being enough, hmm. I don't have the budget for a 4090; I would rather wait for a 4080 refresh/Super.

edit: I wonder how many GPU generations and years it will take until the game is playable in 4K with ray tracing, without DLSS.
 
Yes, this is an important point: the Performance profile, not Quality. As for 16 GB not being enough, hmm. I don't have the budget for a 4090; I would rather wait for a 4080 refresh/Super.

edit: I wonder how many GPU generations and years it will take until the game is playable in 4K with ray tracing, without DLSS.

The pricing is out of whack as well for the 4080.

So much so that I am thinking of upgrading from a 2080S/3700X to a 4070/5800X3D instead, which comes out cheaper than just a 4080 where I live, by almost $300 CAD.

The 4070 will not run 4K Ultra RT, but it will run 4K at mixed settings with some RT above 60 fps.

I've always bought xx80 cards, but I do not like the price/performance ratio on this one. The 4070 makes a bit more sense in that area, IMO. The 4090 is the only card that makes a good value proposition, but I can't justify, to myself or to my business, spending that much on a GPU.

I guess 4070/5800x3D will have to be my rig for the next three years, for better or worse.
 