Are there lines yet for the 4090? Less than a week away

I really want a 4090 to play Cyberpunk on. Are there reports of lines forming at Micro Center yet? I remember for the 3090 some people camped out for days and days.
 
I wouldn't be too surprised, but things are a bit different. Prices hadn't reached a fever pitch when the 3K series launched, miners got wrecked, we're in the third quarter of a recession, and general inflation will surely influence people with less discretionary spending this time around. The current ask is $1,600 for the 4090. It doesn't take a wild imagination to expect they'll swiftly jack the price up to see who's willing to fork over the cash. You can try camping out at your computer with the F5 key ready.

If people are camping out, it's likely a good chunk of them are trying to get ahead of the looming price curve.
 
1600 sounds pretty cheap; over here it's probably going to be at least 2500 -.- No prices are confirmed in my country yet, so we shall see. I'll wait until next year at least before I decide. Even AMD's new CPUs did not sell out, so the situation is quite different this time around.
 
I think it may be easier getting one of these suckers than anything when the 30s came out. I couldn't get a 3070 for a good price until just a couple of months ago. Not that the 980 Ti I had before it was bad. If history repeats itself though.....
 
1600 sounds pretty cheap; over here it's probably going to be at least 2500 -.- No prices are confirmed in my country yet, so we shall see. I'll wait until next year at least before I decide. Even AMD's new CPUs did not sell out, so the situation is quite different this time around.
True. I personally think the price sounds about right, maybe a tad high for the general market, but not unreasonable. Surely it cost a pretty penny to produce given the circumstances. I'm just guessing that it's unlikely people are camping out, for various reasons. If any do, I would guess it's to avoid potential price spikes... A price drop may be in order, but the ones Newegg accidentally listed all sold out swiftly. 2077 was the real driver for 3K sales. Is there anything special coming out for PC to drive 4K sales?

I think it may be easier getting one of these suckers than anything when the 30s came out. I couldn't get a 3070 for a good price until just a couple of months ago. Not that the 980 Ti I had before it was bad. If history repeats itself though.....
I wish I had that chance. I had to bite the bullet last year and get a 3080 Ti at markup when my PC took a dump. I work in VFX; having the CUDA cores is crucial.
 
Is there anything special coming out for PC to drive 4K sales?
Not sure, my 3090 was for Cyberpunk pretty much. Never making that mistake again :D Luckily it works for other things too. But I think it's a pretty poor gaming winter incoming -.- Haven't been excited about a game in ages.
 
Sounds like a great card from what I've read about it so far. However, I'm not sure that CP2077 can make full use of it. I run the game with everything maxed out at 4K on a 3090 Ti, and almost never drop below about 80 fps. I'm looking forward to seeing what sorts of real-world numbers it puts up.
 
I'm keen to see what RTX "OverDrive" really does in 2077. I'm hoping it means, at minimum, that the game's faux GI system is disabled. Doing so now gets rid of all the terribly placed occlusion boxes, broken shading, and other artifacts, but it comes at a performance cost since those things assist their current RTXGI solution, and it even breaks the lighting in some situations. Turning it off without RTX is a no-no unless you... ahem... use GITS Visuals.

The idea, and the performance, of their solution is grand. People get a fast global illumination system without RTX, which also uplifts RTXGI performance compared to not having it, but the artifacts can be distracting at times. The Mafia remake had its own solution which looks fairly decent, but it can be slow and even quite noisy at times. First world problems.
 
Sounds like a great card from what I've read about it so far. However, I'm not sure that CP2077 can make full use of it. I run the game with everything maxed out at 4K on a 3090 Ti, and almost never drop below about 80 fps. I'm looking forward to seeing what sorts of real-world numbers it puts up.
Is that on DLSS Ultra Performance? Seems a bit high for 4K max otherwise, since I barely keep 60 with my very OCed 3090 (basically the same clock speeds as a 3090 Ti) on DLSS Performance.
 
Sounds like a great card from what I've read about it so far. However, I'm not sure that CP2077 can make full use of it. I run the game with everything maxed out at 4K on a 3090 Ti, and almost never drop below about 80 fps. I'm looking forward to seeing what sorts of real-world numbers it puts up.
CDPR will be updating the game with graphics settings specifically applicable to the 4000 series.
 
Is that on DLSS Ultra Performance? Seems a bit high for 4K max otherwise, since I barely keep 60 with my very OCed 3090 (basically the same clock speeds as a 3090 Ti) on DLSS Performance.
That's a good question. I just checked, and it's set to auto. I guess that means it's dynamically adjusting. I thought it was set to ultra, but apparently not. I'm not sure whether it was set to ultra the last time I checked the fps, but I assume not. So, my fps probably aren't quite as good as I was thinking.
 
That's a good question. I just checked, and it's set to auto. I guess that means it's dynamically adjusting. I thought it was set to ultra, but apparently not. I'm not sure whether it was set to ultra the last time I checked the fps, but I assume not. So, my fps probably aren't quite as good as I was thinking.
It sounded high, but on auto it can be hard to tell. It might be that it drops down on the heavy parts at least.
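For what it's worth, here's a rough sketch of the internal render resolutions each DLSS preset implies at a 4K output. The per-axis scale factors are just the commonly cited defaults, so treat them as assumptions on my part, not something confirmed in this thread:

```python
# Internal render resolution implied by each DLSS preset at a 4K output.
# Per-axis scale factors below are the commonly cited defaults (assumption,
# not confirmed by anyone in this thread).

OUTPUT_W, OUTPUT_H = 3840, 2160

DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

for mode, scale in DLSS_SCALE.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{mode:<17} -> {w} x {h} internal")
```

If auto lands around Performance at 4K on the heavy parts, that alone would cover a big chunk of the difference between our numbers.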
 
It's just ridiculous :) I surely hope the new AMD cards won't be that huge.

I wonder if their switch to chiplets will help them avoid that kind of size creep.
 
It's just ridiculous :) I surely hope the new AMD cards won't be that huge.

I wonder if their switch to chiplets will help them avoid that kind of size creep.

Probably not.... The die isn't that much larger than the 3090's, but it's packed with a lot more transistors, and even though it is considerably more power efficient in terms of watts per frame, it packs a lot of power and thus needs a large cooling solution. The actual PCB is less than half the size of the heatsink. If AMD wants to compete, it's going to have the same problem, because 450W is 450W whether it's one large die or several chiplets. The advantage of the chiplet design is that it's cheaper to manufacture: you get a much higher yield per wafer, because instead of a defect ruining an entire die for a 100% loss, if you have 4 chiplets using the same space on a wafer as the single large die and one has a defect, your loss is only 25%. This should allow AMD to make an equally powerful device but offer it at a lower price point.
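To put rough numbers on that yield argument, here's a toy sketch using the simple Poisson yield model. The defect density and die area are made-up illustrative figures, not actual foundry or AMD numbers:

```python
# Toy yield comparison: one big monolithic die vs. four chiplets covering the
# same total silicon area, using the simple Poisson yield model:
#   yield = exp(-defect_density * die_area)
import math

DEFECT_DENSITY = 0.1   # defects per cm^2 (assumed, illustrative only)
TOTAL_AREA_CM2 = 6.0   # total GPU silicon area in cm^2 (assumed, illustrative only)

def poisson_yield(area_cm2: float, d0: float = DEFECT_DENSITY) -> float:
    """Probability a die of this area comes out defect-free."""
    return math.exp(-d0 * area_cm2)

# Monolithic: one defect anywhere scraps the whole die.
mono = poisson_yield(TOTAL_AREA_CM2)

# Four chiplets covering the same total area: each succeeds or fails on its own,
# so the expected fraction of usable silicon equals the per-chiplet yield.
chiplet = poisson_yield(TOTAL_AREA_CM2 / 4)

print(f"Monolithic good-die rate : {mono:.1%}")
print(f"Per-chiplet good-die rate: {chiplet:.1%}")
```

With those made-up numbers the monolithic die comes out defect-free a bit over half the time, while each quarter-size chiplet is good roughly 86% of the time, which is the whole point: a defect costs you one chiplet, not the entire GPU.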
 
Maybe chiplets allow them to run different parts at different frequencies, possibly reducing power draw thanks to more flexible control over them. I saw something about it giving more room for better efficiency. In CPUs this has already become the common approach, but I guess in GPUs AMD is using it first.
 
Maybe chiplets allow them to run different parts at different frequencies, possibly reducing power draw thanks to more flexible control over them. I saw something about it giving more room for better efficiency. In CPUs this has already become the common approach, but I guess in GPUs AMD is using it first.
I'm fairly sure they will win on performance per watt and lower power draw. They will probably not win RT and raw power, though. I'm guessing they have not caught up on RT cores yet; it's going to be hard to keep up if Nvidia keeps going like this. They have almost doubled the performance in rasterization, and the 4090 is more power efficient than the 3090 (especially if undervolted, it seems). Chiplets will probably bring their own problems too and need to be developed, just like they did on the CPUs. I'm quite curious to see how the cooling will work, whether it will have an IHS or direct die cooling, and a lot of other things.
 
I think AMD will win both efficiency and raw compute power (but not RT, as you suggest). They already basically won with the previous generation (i.e. AMD's 6000 series beats Nvidia's 3000 series in raw compute power for comparable models), so I'd expect them to stay ahead.

But I guess we'll see soon enough when benchmarks show up in November. I'm personally interested in the Linux gaming use case.
 