Übersampling in The Witcher 3 petition

Is Ubersampling gonna be pointless?

with the new drivers from AMD and Nvidia you can now downsample a game


for example if you play at 1080p you can go up to 4K on the same 1080p monitor


so why should the game get an Ubersampling option if we can do the same with the drivers?
 
with the new drivers from AMD and Nvidia you can now downsample a game


for example if you play at 1080p you can go up to 4K on the same 1080p monitor


so why should the game get an Ubersampling option if we can do the same with the drivers?

Downsampling is not necessarily better. It's the antialiasing method of last resort in most situations. It does have the advantage that you can always do it in postprocessing, but that is its only advantage.

It takes a lot of VRAM to downsample. A 4K source image is about 8 megapixels, which is around 128 MB in unpacked format just for RGB and alpha, for the original image alone.

It always imposes an N² computational burden: downsampling 4K to 1080p requires 4x the shading workload even before the additional computation to do the downsample itself.
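To put rough numbers on that (a quick sketch; the 16 bytes per pixel assumes an unpacked FP32 RGBA buffer, which is what the ~128 MB figure above implies):

```python
# Rough buffer-size and workload arithmetic for 4K -> 1080p downsampling.
# Assumes an uncompressed RGBA buffer at 4 bytes per channel (FP32).

def buffer_bytes(width, height, channels=4, bytes_per_channel=4):
    """Size of one unpacked frame buffer in bytes."""
    return width * height * channels * bytes_per_channel

pixels_4k = 3840 * 2160      # ~8.3 megapixels
pixels_1080p = 1920 * 1080   # ~2.1 megapixels

print(buffer_bytes(3840, 2160) / 2**20)  # ~126.6 MiB for one 4K frame
print(pixels_4k / pixels_1080p)          # 4.0 -- 4x the shading workload
```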

Downsampling doesn't get MSAA's savings: you have to do the full shading computation on every pixel of the oversized frame.

Downsampling imposes a specific and inflexible sampling pattern. It doesn't allow for oriented or sparse grids. It doesn't do edge detection or temporal AA.

Basically, downsampling is a cheap crappy solution to a problem that should be solved by the developer shelling out for a better AA library for the damn game in the first place.
 
@GuyNwah

I use the "GeDoSaTo" tool on older games a lot. It may not be an official solution (more like a driver injector?), but in my experience it works for most DirectX 9 and some DirectX 8 titles.

My understanding of things here is quite basic, but from what I can tell the tool "simply" intercepts the driver-game communication and inserts higher available resolutions, and then handles resizing the frames (you can change the scaling method). Why would such a method not allow for the original in-game AA? Shouldn't rendering a frame in this situation follow the exact same rules?

Unless you're speaking strictly of DSR/VSR? I have no idea how those work, but in theory at least I think they should be simpler, as there is less "pretending" to be done.

In my case, the GPU I have (AMD HD 7870) can downsample most DirectX 9 games effortlessly, producing a really nice image. But since AMD couldn't be assed to introduce VSR support for my card, I have to use GeDoSaTo or driver modifications. Perhaps more surprisingly, I have also had a lot of luck downsampling on my HTPC, which uses a low-profile HD 4550 with 256MB(!) of DDR3 VRAM. Granted, this is because it's currently connected to an old CRT monitor, but I can run some PS2-era games at "crazy" resolutions like 2560x2048 (downsampled to 1280x1024@85Hz).

Out of curiosity I also used GeDoSaTo with The Witcher 2, and in my experience a 4K->1080p downsample has very similar performance to 1080p + Ubersampling. (On my GPU both are borderline unplayable :p but the framerate really is comparable.) I know downsampling isn't perfect, but I don't think it's as bad as you make it sound. Of course I agree it's still very important that the developer includes proper AA support.
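That matching performance is what the arithmetic predicts: a 4K render downsampled to 1080p shades exactly as many pixels as 2x2 ubersampling of a 1080p frame. A quick sanity check (nothing Witcher-specific, just pixel counts):

```python
# Pixels shaded: 4K -> 1080p downsample vs 2x2 ubersampling at 1080p.
downsample_4k = 3840 * 2160                 # render at 4K, scale down to 1080p
ubersample_1080p = (1920 * 2) * (1080 * 2)  # render 1080p supersampled 2x2

print(downsample_4k, ubersample_1080p)  # 8294400 8294400 -- identical workload
```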
 
Funny you mention that... @Sardukhar.479 and I were talking about this the other day, and in his testing 2x2 downsampling has a higher framerate hit than Ubersampling.
 
GeDoSaTo's good, and it can do more things than downsampling. Two reasons it doesn't result in such a nasty hit as DSR are that its downsampling can be set to be less ambitious, and that it hooks DirectX 9, so it takes effect before a postprocessor like DSR. Unfortunately, it's also DirectX 9 specific, and DirectX 9's days are numbered.
 
Funny you mention that... @Sardukhar.479 and I were talking about this the other day, and in his testing 2x2 downsampling has a higher framerate hit than Ubersampling.

Well, it's possible that a more powerful GPU would show the differences better. I don't think my CPU (Core i5 3570K @ 4.2GHz) or RAM are bottlenecks, and the VRAM is 2GB, so that should be OK too, I think. But I was getting around 10-20 fps regardless of the option, so maybe the difference was simply too small to matter. I know it's not a very scientific comparison :p

Also, as an aside, I clearly remember reading an interview with Iwinski in the "Interviews and Articles" thread where he admits a choice was made to delay Ubersampling until after the game's launch, to avoid people calling the game unoptimized, but it hasn't been posted to this thread yet. I'm looking for it now. I remember thinking he gives more in-depth answers when speaking in his native language, so I'm pretty sure the interview was in Polish.

Edit: Damn, can't find it... does anyone else remember this, or did I make it up?
 
Funny you mention that... @Sardukhar.479 and I were talking about this the other day, and in his testing 2x2 downsampling has a higher framerate hit than Ubersampling.

I know you just mentioned me for my cursed number addition, but anyway. My experience with Ubersampling on Windows 8.1, a 4.2GHz OC'd i5 2500K, 8GB of RAM, and twin water-cooled 290Xs, playing Witcher 2:

Quote Originally Posted by sidspyker
"Yes and Ubersampling is 2x2 supersampling with a few added effects(high quality AF) making it 12800x3200 render and then downsampling it to 6400x1600"

So, with this in mind, I trotted off to test whether it was pretty much an exact ratio. I only hard-crashed my system once, probably because I tried to force 2560x1600 while Eyefinity was set up. Don't do that. I am -so- just getting a big monitor next time... anyway.

I was running most settings at max or ultra, with motion blur and cutscene depth of field off, because ugh. Vsync was off.

2560 x 1600, no Ubersampling: 85-135 FPS, depending on if I was looking at a fire or not. (4 million pixels)

2560 x 1600, Ubersampling on: 50-70 FPS (supposed to be 5120 x 3200, 16+ million pixels)

6400 x 1600, no Ubersampling: 55-60 FPS (10+ million pixels)

6400 x 1600, Ubersampling on: 25-30 FPS (supposed to be 12800 x 3200, 40.9 million pixels)

I learned a few things.

1) That downsampling must be really effective, because rendering 41 million pixels at between 1/3 and 1/4 of the frame rate of rendering 4 million pixels is pretty good.

2) Fire bad. Rendering the flames in my sample area was quite pricey. I guess I already knew this, but a 20 FPS hit at 2560 with US was ow.

3) 6400 x 1600 with or without US is really close in framerate min-max range. Almost like Vsync was on, but it wasn't. Weird.

4) Ubersampling at 6400 IS playable! I already knew that, but still nice. Depending on your definition of playable, of course. Not a crazy busy scene, but not bad: NPCs, the view across the valley, a fire going. Cursed flames. In a totally open area with lots of NPCs, this is why my GPU hits 97°C even with watercooling.

So, framerate-wise, it's not as bad as if the cost were actually quadrupled (a real 12800 x 3200 render), but it's still pretty tough.
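Putting those runs side by side makes the point clearer (the resolutions and FPS ranges are from the test above; using the midpoint of each FPS range is my simplification):

```python
# (label, rendered width, rendered height, min fps, max fps) from the test above
runs = [
    ("2560x1600",       2560,  1600, 85, 135),
    ("2560x1600 + US",  5120,  3200, 50,  70),
    ("6400x1600",       6400,  1600, 55,  60),
    ("6400x1600 + US", 12800,  3200, 25,  30),
]

base_pixels = runs[0][1] * runs[0][2]     # ~4.1 million pixels
base_fps = (runs[0][3] + runs[0][4]) / 2  # 110 fps midpoint

for label, w, h, lo, hi in runs:
    pixels = w * h
    slowdown = base_fps / ((lo + hi) / 2)
    print(f"{label}: {pixels / 1e6:.1f} Mpixels, "
          f"{pixels / base_pixels:.1f}x pixels, {slowdown:.1f}x slower")
```

The last row works out to 10x the pixels for only about 4x the slowdown, which is the "not as bad as quadrupled" observation in numbers.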
 
Sard - what game was that?
(Getting in before people start thinking you have a preview version of TW3 and decide to kill you for it. Because I'm feeling unusually kind today)
 
Edit: Damn, can't find it... anyone else remembers this or did i make it up?
That's correct; what was said is that Ubersampling won't be there as an option in the game at launch. One can obviously just do it manually, but it'll help prevent clueless people from enabling Ubersampling and then complaining about performance while not knowing they're rendering at 4x the resolution :p
 
That's correct; what was said is that Ubersampling won't be there as an option in the game at launch. One can obviously just do it manually, but it'll help prevent clueless people from enabling Ubersampling and then complaining about performance while not knowing they're rendering at 4x the resolution :p

Well, sort of. If it's anything like the Witcher 2 version, the performance hit isn't that bad.

Frankly, I'm not in a hurry to run Ubersampling. US on Witcher 2 nearly melted my GPUs; pushed 100°C, ow. Took a while, but that's what happened.
 
Am I the only one here who thinks these techniques are totally useless? A devastating performance drop for the tiniest graphical improvement. As I disliked it in TW2, I will indeed dislike it in TW3 :p
 
Then enjoy your jagged edges everywhere
:hai:

No, but seriously, they're a nice IQ boost and will make the game look a bit better after a few years, but they're definitely not "necessary" or even required.
 
with the new drivers from AMD and Nvidia you can now downsample a game


for example if you play at 1080p you can go up to 4K on the same 1080p monitor


so why should the game get an Ubersampling option if we can do the same with the drivers?

I don't know what the max downsampling resolutions are in the green corner, but I'm in the red corner, and their max res on these cards is a S%&·.

I own a 290X crossfire setup, and the only resolutions I can get for a 1080p monitor at 60Hz are 3200x1800 and 2560x1440.

I would need to DOWNGRADE my cards to a single 285 to be able to add 4K as a resolution to downsample from!

Nonsense.

What kind of weed are they smoking at AMD?

:facepalm:

GeDoSaTo is a fantastic tool, but it can only help with DX9, and it also does more than downsampling.

There is no limit to the downsampling resolution; I used to downsample and play the Dead Space games at 8K.
 
Ubersampling? Thanks, but no thanks...

I remember that with UberS on in Witcher 2 I saw almost zero difference compared to having it off, but at the same time I got tons more FPS. So no, I don't need it.
 
Yeah... how about no, for now. If they can implement it as DLC or a huge update later on, sure. I doubt anyone will be able to run the game on ultra with Ubersampling enabled.

I'm just sitting here hoping the game will run on medium/high (tweaking some settings here and there).
I don't think I will be buying new PC parts for a while; first let Nvidia and AMD figure out 16nm (and Intel their new gen of chips). That way the jump forward will at least be huge for me. (I've got dual 680s.)
 
Ubersampling? Thanks, but no thanks...

I remember that with UberS on in Witcher 2 I saw almost zero difference compared to having it off, but at the same time I got tons more FPS. So no, I don't need it.

This is simply impossible. Ubersampling in Witcher 2 also sets anisotropic filtering to 16x, so the difference between on and off is night and day.
 
I don't see why people say don't do it. If you don't want it, just don't activate it. But I think they should do it, because I think it makes a huge difference.

But how old is that interview? Because they had confirmed during the hands-on video commentary on Twitch that Ubersampling will be back. Has it changed since?
 
I didn't follow the thread, but I think no one would be against having the option to activate it; in fact, the more options, the merrier. But please don't speak about night/day or a huge difference. It's simply untrue. I'd guess 95% of players wouldn't even notice. The change is extremely subtle even when comparing screenshots; when you actually play instead of scrutinizing graphical details, it becomes completely unnoticeable.
 
Ubersampling is a wasteful form of antialiasing, as are all other brute-force methods, and excessive antialiasing creates artifacts that are no less objectionable than the jaggies.

What they really need to do is make the game play nice with a variety of AA technologies, not just Ubersampling, MSAA, or TXAA, but injectable technologies like SMAA.

+1

Guy N'wah is right. Ubersampling is not a good form of antialiasing. Witcher 3 needs a shader-based form of AA for people whose hardware can't get adequate performance with MSAA, which is known to be very intensive, especially in large open-world games such as The Witcher 3.

I would prefer SMAA or even a good FXAA implementation over Ubersampling any day. Also, the article seems to confirm that hardware PhysX is still in play! Apparently it will be scalable, with the software version being the standard.

*Edit* Actually, after rereading the article, the CDPR dev is fairly ambiguous on GPU PhysX and doesn't explicitly state that they will be using it.
 