Can this LAPTOP run TW2?

So, my laptop's specs are as follows:

ASUS K53SV-B1:

Windows 7 64-bit
Intel Core i7 2630QM
Nvidia GeForce GT540M
6GB RAM
640GB 5,400 RPM HDD
15.6" 1366x768 Resolution Screen
Uh… what else?

Anyway, the question is: can that laptop run The Witcher 2 fluidly on low/medium settings (with V-Sync turned on)? What I'd hate most is poor FPS; I couldn't care less about the graphics. I've looked for games with close combat as fluid as this one's, only to find TW2 come up over and over again… and the gameplay videos show it's even better than fluid and smooth; it looks perfect for a game.

So, can the laptop stand up to TW2 on low/medium, with V-Sync (if the game has it) turned off?

If not, would anyone mind comparing TW1 to TW2? I'd at least like the first one if I can't play the second… xD.
 
You should be able to play both, as Witcher 1 is not as demanding as Witcher 2. Witcher 2 would be on low, with V-Sync and AA off. Just guessing here, but I would say 30-40 FPS, which is plenty playable.
 
And… would you consider TW2 on low to be acceptable? Let's say with these settings:

Everything on lowest, NOT off (except for Ubersampling…)

1366x768 resolution.


Acceptable or awful?
 
And is 30 frames per second *really* playable?

I am used to games running at around 40-120 FPS (some being dumb low-resource games, some being things like JC2 or others, at default resolution; I refuse to go totally low… except for TW2).

I mean, 30 FPS during intense action situations, right? 'Cause I've seen videos and read reviews describing a wonderful experience when entering the game (being in "a tent", supposedly), and then as soon as they step outside, they're struck by godly lag. I just don't want to be running around, facing over 8 soldiers, or having too much action on-screen if it means lag :p
 
Berxerker said:
And… would you consider TW2 on low to be acceptable? Let's say with these settings:

Everything on lowest, NOT off (except for Ubersampling…)

1366x768 resolution.


Acceptable or awful?

Acceptable, yes. Full HD, no, but 1366x768 or 1280x720 should be fine. I'll include a pic of the configurator so that you can see what it looks like; except for the top 5 selections, it's either on or off. No high, medium, or low.


Berxerker said:
And is 30 frames per second *really* playable?

I am used to games running at around 40-120 FPS (some being dumb low-resource games, some being things like JC2 or others, at default resolution; I refuse to go totally low… except for TW2).

I mean, 30 FPS during intense action situations, right? 'Cause I've seen videos and read reviews describing a wonderful experience when entering the game (being in "a tent", supposedly), and then as soon as they step outside, they're struck by godly lag. I just don't want to be running around, facing over 8 soldiers, or having too much action on-screen if it means lag :p

Like I said, I am guessing on the framerate, but I'm quite sure there are threads here that state what FPS people get. I'll look and see if I can locate one or two and link them here. It would certainly help if another poster or two with a laptop similar to yours would chime in.

Looked at your specs again; I see that it's a GT 540M, so maybe 25-30 FPS. I'm sorry, I was thinking GTX 540... big difference.
 
Many thanks.

The problem is, I've seen lots of gameplay videos and forum posts where:

- Some videos show TW2 being played on High settings, but they show NO action at all, just moving the camera around.
- The posts are from people saying that it SHOULD run on this GPU, yet I doubt it.

Then there's the majority of people saying that the GT540M is too weak for TW2…


I put the info into game-debate.com, and according to their "Can I play this game?" thingy, I can play it at low-medium… yet, for a totally different game, it said I could play it at medium-high, when I honestly KNOW that game's limit is high, with V-Sync turned on, at native res.

It's just that squeezing $50.00 out of my wallet for a game that won't play properly isn't something I want… though I am DEFINITELY getting a TW game…
 
Berxerker said:
It's a GT540M, I'm not sure if GTX540M's exist though :p

No, it doesn't... brain fart on my part.

Witcher 1: definitely playable. Witcher 2: probably not. But like you, I wouldn't want to spend 50 bucks, or 30 on GOG, for a game I couldn't play.
 
I agree; the 540M is right around the minimum capability needed to assure you of a well-playable game. Another member recently got the game playable on a GT 440, which is the same chip but clocked faster.

540M = desktop GT 440 and GT 530 (GF108 chip). These are all handicapped by having only 4 output processors (ROPs), which affects the GPU's ability to drive finished pixels to the screen.

With laptops, watch for heating problems. This game runs the GPU at 100% for extended periods.
 
Luckily my computer has never overheated, no matter what I play. One day I even decided to test my computer's endurance (stupidest thing ever) and launched the following games, all at the same time:

Garry's Mod
Just Cause 2
Minecraft
Assassin's Creed (which refuses to work at the moment…)
The Elder Scrolls IV: Oblivion



My computer did NOT heat up… though I admit it took a few seconds to load 'em all… and RAM was almost at its limit xD.

By the way, is it normal for my laptop to use 1.6-2.0GB of RAM from boot-up through normal use? I do have Steam and a few other programs set to run at startup…
 
OK… I have definitely decided that if I have to struggle to play a game properly, I just won't buy it… so I'll just get ready for the first The Witcher.

Will I be able to run TW1 at near-maximum settings? :D (Hopefully its requirements are low.)

And, any DLC available? :)
 
Those games don't hammer your GPU as hard as TW2 does, but if you can game regularly without heat problems, that's a good sign.

The first Witcher shouldn't be a problem. But it may not run at maximum settings. Those low-end nVidia GPUs have a big bottleneck at the end of their pipeline.

You have to have a background in Windows internals to understand just what it is using memory for. 2GB occupied before you run anything seems like a lot, but a lot of that is speculative loading (SuperFetch) and can be discarded without penalty if another program needs the memory. It's not a big concern.
 
Yeah, I use Task Manager and Resource Monitor A LOT, every time I sense something out of the ordinary (like performance decreasing 0.0000001%).


:(… not max? Ah, which settings would you expect?

I knew this laptop wasn't meant for gaming, yet the world of games pulled me in… lol. I never expected to get into games, but buying them for consoles costs around $74.99 over here PER GAME, which compared to Steam sales is a LOT. Then good games like this come up, and I say "Eff that PS3, no good games available and no easy game downloading"…

Lol. What is SuperFetch anyway? (Excuse my poor knowledge… :D)
 
Berxerker said:
Yeah, I use Task Manager and Resource Monitor A LOT, every time I sense something out of the ordinary (like performance decreasing 0.0000001%).


:(… not max? Ah, which settings would you expect?

I knew this laptop wasn't meant for gaming, yet the world of games pulled me in… lol. I never expected to get into games, but buying them for consoles costs around $74.99 over here PER GAME, which compared to Steam sales is a LOT. Then good games like this come up, and I say "Eff that PS3, no good games available and no easy game downloading"…

Lol. What is SuperFetch anyway? (Excuse my poor knowledge… :D)

The reason I suggest it may not run at max is the way nVidia redesigned the output processors on newer cards. Older cards have a lot of output processors. Going by the shader:texture:output count, the old 9600 GT was 64:32:16. Those 16 output processors could deliver 9.6 Gpixel/sec at the 600 MHz clock speed. That's enough to run TW2 comfortably, let alone TW1. Newer cards have cut back on output processors. The 540M is 96:16:4, and those 4 output processors can deliver just 2.7 Gpixel/sec. That's not enough to run TW1 at full graphics.
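
To put numbers on that comparison, here's a minimal sketch in Python of the peak-fill-rate arithmetic above. The ROP counts and the 9600 GT figures come straight from this post; the GT 540M core clock of about 672 MHz is my assumption (it's roughly what gives the quoted 2.7 Gpixel/sec), and these are theoretical peaks, not real-world throughput.

Code:
# Rough peak pixel fill rate = ROPs * core clock (theoretical upper bound only).
# ROP counts are from the post above; the GT 540M clock (~672 MHz) is assumed.
cards = {
    "GeForce 9600 GT": {"rops": 16, "clock_mhz": 600},
    "GeForce GT 540M": {"rops": 4,  "clock_mhz": 672},  # assumed clock
}

for name, spec in cards.items():
    gpixels = spec["rops"] * spec["clock_mhz"] / 1000  # Gpixel/sec
    print(f"{name}: {gpixels:.1f} Gpixel/sec peak fill rate")

# The older 9600 GT ends up with roughly 3-4x the pixel-output headroom.
ratio = (16 * 600) / (4 * 672)
print(f"9600 GT vs GT 540M fill-rate ratio: ~{ratio:.1f}x")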

SuperFetch is Microsoft's way of making idle memory useful. SuperFetch watches what applications you launch and what code (executables, DLLs) they load. Then it takes the applications you use most often and preloads their code into unused memory. Because the memory that SuperFetch uses is read-only, if the memory is needed for another purpose, SuperFetch can abandon the memory instantly, without having to write to the page file.

A Microsoft engineer was once asked by an indignant customer why Vista was using so much memory. He opened up his computer, took a stick of RAM out, laid it in front of the customer, and said "You want free memory? Now it's free." Seriously, SuperFetch is Windows' way of making use of memory that would otherwise be sitting there wasting electrons, to speed up loading of what it guesses you are likely to run next.
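
If you're curious how that looks on your own machine, here's a minimal sketch (Python, Windows only) that asks the OS for its physical memory figures through the GlobalMemoryStatusEx API. The key point is that "available" memory already includes the standby cache where SuperFetch parks its preloaded pages, so a game that asks for memory gets it immediately.

Code:
# Minimal sketch: query Windows for physical memory figures (Windows only).
# "Available" counts free memory plus the standby cache that SuperFetch fills,
# which is why that cached memory costs you nothing when a game wants it.
import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

gib = 2 ** 30
print(f"Total physical RAM:       {status.ullTotalPhys / gib:.1f} GiB")
print(f"Available (free + cache): {status.ullAvailPhys / gib:.1f} GiB")
print(f"Memory load:              {status.dwMemoryLoad}%")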
 
Wow… the Nvidia "redesigning", I am somewhat shocked D:

So, a 9600GT is better than a GT540M? Lol. Sorry if I didn't understand your answer, I'm just probably astonished. And I really wonder why would they do that…

And well, instead of getting TW2 (which I probably wouldn't be able to play), I'm gonna get TW1 and any other RPG-ish series I find. I shall forever miss TW2… lol :D
 
Berxerker said:
Wow… the Nvidia "redesigning", I am somewhat shocked D:

So, a 9600GT is better than a GT540M? Lol. Sorry if I didn't understand your answer, I'm just probably astonished. And I really wonder why would they do that…

And well, instead of getting TW2 (which I probably wouldn't be able to play), I'm gonna get TW1 and any other RPG-ish series I find. I shall forever miss TW2… lol :D

The reason they did it is that there are a lot of games that are texture-heavy but not output-heavy. It all depends on what resources the game uses and where in the rendering pipeline it does these things. To oversimplify a lot, the output processors take the 2-dimensional projection of the scene and convert it to pixels. Some games do a lot of work in this step. Others already produce a very clean 2-D projection, and the conversion to pixels is very fast.

Older nVidia cards, and their modern high-end cards, are well supplied with output processors, typically 32, 40, or 48. This makes them very fast at reducing a 2-D projection to pixels, even when the processing is complex.

Newer low-end nVidia cards don't do that. They have as few as 4 output processors, and they just don't perform well on games that demand a lot of processing at this stage.
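
As a back-of-the-envelope illustration of why that stage can become the bottleneck, here's a short Python sketch estimating how many full-screen passes per frame each card could afford at 1366x768 and a 30 FPS target, using the peak fill rates quoted earlier in the thread. The pass counts are hypothetical upper bounds; real games burn that budget very unevenly on blending, overdraw, and post-processing, so the relative gap matters more than the absolute numbers.

Code:
# Back-of-the-envelope: full-screen-pass budget per frame, given a peak fill rate.
# Fill rates are the theoretical peaks quoted earlier; real throughput is lower.
WIDTH, HEIGHT, TARGET_FPS = 1366, 768, 30
pixels_per_pass = WIDTH * HEIGHT  # about 1.05 million pixels per full-screen pass

for name, fill_rate_gpix in [("GeForce 9600 GT", 9.6), ("GeForce GT 540M", 2.7)]:
    pixel_budget = fill_rate_gpix * 1e9 / TARGET_FPS   # pixels available per frame
    passes = pixel_budget / pixels_per_pass            # full-screen passes per frame
    print(f"{name}: ~{passes:.0f} full-screen passes per frame at {TARGET_FPS} FPS")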
 
@steelbom:

It seems odd though: my friend (who has a GT 520M, a little weaker than the 540M) plays Skyrim on high settings with no stuttering… all of our other specs are the same. And yet the review says that on "High" it gets 17-21 FPS…

That's the reason I wondered if I could take a chance at TW2.
 
I never understand why people make a new topic for a query like this....

I've played through the entirety of W2 on a 2.66 GHz C2D, a GF 9600M, and 8GB of RAM. You'll be fine, and you'd know that by simply looking at the required specs.
 