I have to agree with other people's comments about "lower than low" settings still looking quite decent. After playing on the desktop all week I find myself back home on the laptop (i7 3635QM + Radeon 8770M, 12GB + 1GB). Mid-20s frame rates (teens quite often) just aren't that great an experience after a week of being spoiled by the HTPC rig.

I lowered the triangle budgets, particles, and grass density, and scaled back shadows severely. Yes, objects at medium/far distances come and go, and complex tree shadows fade in and out sometimes. There's noticeable (though rare) odd-looking terrain drawing at the water's edge (mainly in Novigrad for some reason). The engine seems well suited to scaling back, since it properly favors allocating resources to close-in "stuff". It's still The Witcher 3. Gameplay is more responsive when the GPU isn't pegged, so I locked it to 30Hz and the FPS pretty much sits there. Awesome.

My point is: top-shelf graphics enhance a good game, but an excellent game isn't necessarily any less enjoyable with more modest rendering. With the tweaks I applied it looks like Fallout 3 (with HD textures). Mind you, Fallout 3 was and still is an excellent game. Scaled-back rendering for lesser (but still capable) hardware doesn't make The Witcher 3 any less engaging either. On a 15" screen what I see looks quite adequate; the trick seems to be balancing resources to keep objects and characters reasonably detailed.

Gaming laptop hardware costs a bundle but only keeps pace with new titles for a year (as opposed to desktop hardware, which is good for 2-3). When playing on the laptop I don't expect it to look as good as the desktop rig. This is a note to developers in general (not just CDPR): if the game is actually worth playing, you can make the graphics scale down to something between PS3 and PS4 quality and it still looks reasonably convincing. As time goes by the masses aren't going to be content with landlocked desktop or console hardware; I'm there already.
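For anyone who wants to try the same thing, the tweaks live in user.settings. The snippet below is only illustrative of the kind of edits I mean: the key names and values here are examples, not guaranteed to match your file exactly, so check what's actually in your own user.settings (and back it up) before changing anything.

```ini
; Illustrative user.settings-style tweaks for lesser hardware.
; Key names/values are examples only - verify against your own file.
[Rendering]
GrassDensity=800          ; far below stock density; big win in open fields
MeshDistanceScale=0.6     ; lower triangle budget for distant objects
CascadeShadowDistanceScale1=0.5  ; scale back shadow draw distance severely
MaxCubeShadowCount=1      ; fewer dynamic shadow sources
```

The general pattern is the same regardless of exact names: cut density and draw-distance values first, since those free the most GPU time while leaving nearby detail mostly intact.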
Portable hardware will never be able to match what can be done on a landlocked machine, but it is ultimately the future of gaming. Scalability is the key, and at some point gamers are largely going to forget about developers who refuse to allocate significant R&D to this area. If the next-gen consoles (after PS4/XBO) are still designed as landlocked + broadband, they're likely doomed. The performance level of the 14-month-old "capable" $2k laptop I have now will equate to the average $400 APU machine in a couple more years, if not sooner. Developers would be foolish to ignore those machines in pursuit of "CGI or bust"...

Devs should be thinking about game engines that internally test/track GPU resources (bandwidth, fill rates, draw capabilities, etc.) and tweak things on the fly (like what can be fiddled with in user.settings). Settings would be replaced by preferences: "target framerate", "detailed environment vs. smooth lines", etc. On a top-end rig these preferences would have little to no effect, since the engine would sense that resources aren't tapped and just stay pegged at "ultra". Food for thought.
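The auto-tuning idea doesn't even need to be fancy: a feedback loop over measured frame times gets most of the way there. Here's a minimal sketch of what I mean, in Python for readability. Every name and threshold in it is hypothetical, not taken from any real engine:

```python
# Sketch of an engine-side "quality governor": watch a window of recent
# frame times and nudge a global quality level toward the player's
# target framerate. All names and thresholds are hypothetical.
from collections import deque

QUALITY_LEVELS = ["low", "medium", "high", "ultra"]

class QualityGovernor:
    def __init__(self, target_fps=30, window=60):
        self.target_ms = 1000.0 / target_fps
        self.frame_times = deque(maxlen=window)
        self.level = len(QUALITY_LEVELS) - 1  # optimistically start at "ultra"

    def record_frame(self, frame_ms):
        self.frame_times.append(frame_ms)
        if len(self.frame_times) < self.frame_times.maxlen:
            return  # wait for a full window before reacting
        avg = sum(self.frame_times) / len(self.frame_times)
        if avg > self.target_ms * 1.1 and self.level > 0:
            self.level -= 1           # missing the target: scale back
            self.frame_times.clear()  # re-measure at the new level
        elif avg < self.target_ms * 0.6 and self.level < len(QUALITY_LEVELS) - 1:
            self.level += 1           # lots of headroom: scale up
            self.frame_times.clear()

    @property
    def quality(self):
        return QUALITY_LEVELS[self.level]

# A struggling GPU steps down; a top-end rig never leaves "ultra".
gov = QualityGovernor(target_fps=30)
for _ in range(60):
    gov.record_frame(50.0)  # 50 ms frames = 20 fps, below the 30 fps target
print(gov.quality)  # steps down from "ultra" to "high"
```

On a rig with headroom the average frame time never exceeds the target, so the governor just sits at "ultra", which is exactly the "little to no effect on top-end hardware" behavior described above.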