I really want a 4090 to play Cyberpunk on. Are there reports of lines forming at Micro Center yet? I remember that for the 3090, some people camped out for days and days.
> 2077 was the real driver for 3000-series sales. Is there anything special coming out for PC to drive 4000-series sales?

True. I personally think the price sounds about right, maybe a tad high for the general market, but not unreasonable. It surely cost a pretty penny to produce, given the circumstances. I'm just guessing that it's unlikely people are camping out, for various reasons. If any do, I would guess it's to avoid potential price spikes... A price drop may be in order, but the ones Newegg accidentally listed all sold out swiftly.

$1600 sounds pretty cheap; over here it's probably going to be at least 2500 -.- No prices are confirmed in my country yet, so we shall see. I'll wait until next year at least before I decide. Even AMD's new CPUs did not sell out, so the situation is quite different this time around.
> I think it may be easier to get one of these suckers than it was to get anything when the 30-series came out. I couldn't get a 3070 for a good price until just a couple of months ago. Not that my 980 Ti before it was bad. If history repeats itself, though...

I wish I had that chance. I had to bite the bullet last year and get a 3080 Ti at markup when my PC took a dump. I work in VFX; having the CUDA cores is crucial.
> Is there anything special coming out for PC to drive 4000-series sales?

Not sure; my 3090 was pretty much for Cyberpunk. Never making that mistake again.
> Sounds like a great card from what I've read about it so far. However, I'm not sure that CP2077 can make full use of it. I run the game with everything maxed out at 4K on a 3090 Ti, and almost never drop below about 80 fps. I'm looking forward to seeing what sorts of real-world numbers it puts up.

Is that on DLSS Ultra Performance? That seems a bit high for 4K maxed out otherwise, since I barely hold 60 with my heavily OCed 3090 (basically the same clock speeds as a 3090 Ti) on DLSS Performance.
> Sounds like a great card from what I've read about it so far. However, I'm not sure that CP2077 can make full use of it.

CDPR will be updating the game with graphics settings specifically applicable to the 4000 series.
> Is that on DLSS Ultra Performance? That seems a bit high for 4K maxed out otherwise.

That's a good question. I just checked, and it's set to Auto. I guess that means it's dynamically adjusting. I thought it was set to Ultra Performance, but apparently not. I'm not sure whether it was set that way the last time I checked the fps, but I assume not. So my fps probably aren't quite as good as I was thinking.
> I just checked, and it's set to Auto. I guess that means it's dynamically adjusting.

It sounded high, but on Auto it can be hard to tell. It might be that it drops down in the heavy parts, at least.
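For context on why the mode matters so much: DLSS renders internally at a fraction of the output resolution and upscales the rest, so "4K" fps numbers aren't comparable across modes. Here's a minimal sketch of the arithmetic, assuming the commonly cited scale factors (exact values can vary per title, and Auto just picks a mode for you based on output resolution):

```python
# Rough sketch of how DLSS internal render resolution relates to the
# quality mode. The scale factors below are the commonly cited ones
# (an assumption; individual games may deviate slightly).

DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate resolution the GPU actually renders at
    before DLSS upscales to the output resolution."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    for mode in DLSS_SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode:>17}: renders at ~{w}x{h}")
    # Performance at 4K renders at ~1920x1080 and Ultra Performance at
    # ~1280x720, which is why fps jumps so much between the two modes.
```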
> I really want a 4090 to play Cyberpunk on.

Here's a spoiler. You will not notice any difference, and your wallet will be thinner.
> I surely hope the new AMD cards won't be that huge.

It's just ridiculous.
I wonder if their switch to chiplets will help them avoid that kind of size creep.
> Maybe chiplets allow them to run different parts at different frequencies, possibly reducing power draw due to more flexible control over them. I saw something about it giving more room for better efficiency. In CPUs this has already become the common approach, but I guess AMD is using it first in GPUs.

I'm fairly sure they will win on performance per watt and lower power draw. They probably won't win on RT and raw power, though. I'm guessing they haven't caught up on RT cores yet, and it's going to be hard to keep up if Nvidia keeps going like this. Nvidia has almost doubled rasterization performance, and the 4090 is more power efficient than the 3090 (especially when undervolted, it seems). Chiplets will probably bring their own problems and need maturing, just like they did on CPUs. I'm quite curious to see how the cooling will work, whether there will be an IHS or direct-die cooling, and a lot of other things.
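The undervolting and per-chiplet-clocking points both come down to the usual dynamic power relation: switching power scales roughly as C·V²·f, and voltage enters squared, so a small voltage drop costs far less performance than it saves in watts. A toy sketch of that math (all numbers are made up for illustration, not measured 4090 figures):

```python
# Toy model of why undervolting improves efficiency: dynamic switching
# power scales roughly as P = C * V^2 * f. Voltage enters squared, so a
# small voltage drop at nearly the same clock cuts power much more than
# it cuts performance. All numbers below are illustrative, not measured.

def dynamic_power(c_eff: float, volts: float, freq_ghz: float) -> float:
    """Approximate dynamic switching power in watts."""
    return c_eff * volts**2 * freq_ghz

C_EFF = 120.0  # made-up effective capacitance constant for the example

stock = dynamic_power(C_EFF, volts=1.05, freq_ghz=2.75)
undervolted = dynamic_power(C_EFF, volts=0.95, freq_ghz=2.60)

print(f"stock:       {stock:6.1f} W at 2.75 GHz")
print(f"undervolted: {undervolted:6.1f} W at 2.60 GHz")
print(f"power saved: {1 - undervolted / stock:.0%}, "
      f"clocks lost: {1 - 2.60 / 2.75:.0%}")
# About 23% less power for roughly 5% lower clocks in this toy example.
# The same squared-voltage term is why independently clocked chiplets can
# help: each domain can sit at its own efficient voltage/frequency point
# instead of the whole die running at one worst-case setting.
```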