I believe I come out cheaper than if I were buying a console these days, primarily because of sales and the quick decline of prices.

Every time I tell myself "maybe I should spend a modest amount on my new computer," I enter this thread and that resolution just crumbles.
This passion costs a lot.
Since you are the expert, please correct me if I am wrong on this, but... Given the rather short life until obsolescence of modern CPUs, and the absence of well-founded arguments one way or the other, the claim that overclocking will shorten the useful life of your CPU holds water no better than a leaky fitting.
I like eskiMoe's point that putting water inside your case is inviting trouble. But any cooling method that works for you and keeps the CPU operating temperature within designed limits is satisfactory.
> Got a new badass rig recently that will hopefully manage to run W3 as smoothly as possible when Ubersampling is out.

Not likely... Prepare yourself for a slideshow.
> Since you are the expert please correct me if I am wrong on this but...

Agreed, CPUs can continue to be useful for more than just one or two tick-tock cycles. But that does not lead to the conclusion that overclocking a CPU is likely to reduce its service life to less than its time to obsolescence.
Obsolescence for gaming? Because that's what we're talking about here, after all. As far as I know, an i7 3770K Ivy Bridge will run games with very similar performance to the latest Intel CPU; hell, even a Sandy Bridge one could do it. So what's the point in upgrading the CPU when it involves annoyances like changing your motherboard just to make full use of the new hardware?
I've had my Ivy Bridge for over 2 years, getting close to 3, and given how things look for gaming I see no real reason to change it for at least 2-3 more. When I do replace it I want to sell it, so overclocking for the mild performance increase is just not worth the heat, the noise, the risk, and the cost of cooling. (Though frankly, I've tested it: I can crank my fans up to max, push it above 4.3 GHz, and maintain good temperatures.)
It's far more worthwhile to upgrade my GPU, or even go SLI, if I want real, meaningful performance increases. With my current GTX 780 I can achieve 1080p/60+ FPS in most games on their highest settings, and even in those where I can't, I just tone down the AA and maybe a few other things and I'll get there. Once I get a GTX 980 I'll play virtually every game at 1080p/60 FPS on the highest possible settings unless that game is optimized like shit, and anything above that in resolution/FPS is just a luxury few can afford or care for. Though if I wanted that, I'd go yolo and get a Titan X.
> What's the difference between REDpoints and thanks?

REDpoints are the same as "likes" on other forums. The forum keeps a count of the REDpoints you have received.
Welcome, enjoy this RED site, and nice to meet you.
---------- Updated at 10:44 AM ----------
And I am Iranian.
> The CPU performance metric (IPC) is measured by instructions per cycle multiplied by clock speed, so clock speed is obviously an important factor in CPU performance.

Performance is measured in operations or products per unit of time, not clock frequency, which says next to nothing about it.
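To make the distinction being argued here concrete, here's a toy sketch with made-up numbers (not real benchmark data): throughput in instructions per second is IPC times clock, so a chip with a lower clock but higher IPC can still win.

```python
# Toy comparison, hypothetical numbers only (not real CPU specs):
# throughput (instructions/s) = IPC * clock (Hz)

def throughput(ipc, clock_ghz):
    """Instructions per second from IPC and clock speed in GHz."""
    return ipc * clock_ghz * 1e9

cpu_a = throughput(ipc=2.0, clock_ghz=4.0)  # higher clock, lower IPC
cpu_b = throughput(ipc=3.0, clock_ghz=3.0)  # lower clock, higher IPC

# Clock alone doesn't decide it: CPU B does more work per second
# despite running a full GHz slower.
print(cpu_a)  # 8e9 instructions/s
print(cpu_b)  # 9e9 instructions/s
```

Which is exactly why comparing chips by GHz across architectures is meaningless on its own.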
> We talked about bottlenecks before. Different CPUs behave differently and if one gives you better performance with a given GPU, it doesn't necessarily mean the other is "bottlenecking" it. A bottleneck is a visible limitation of maximum performance.

A CPU bottleneck exists when the CPU isn't fast enough to feed the GPU(s) draw calls and the GPU ends up waiting. We want the GPU to be functioning at maximum performance, and it cannot do so if the CPU isn't fast enough to keep up with it. That said, lots of factors can influence bottlenecks, not just hardware. Settings have a big influence on whether a game is CPU- or GPU-bottlenecked as well.
> Exactly how much faster do games get after a CPU overclock? On average, because some games are completely unaffected. Dramatic is a heavy word, and while I understand this is a gaming (not a technical) community, I think we should be more educated posters.

This question is too nebulous to answer properly without some context. For someone like me with SLI, CPU overclocking can increase performance substantially in games that use a single thread for draw call commands. In multithreaded games the performance increase would be smaller, as multiple threads are uploading information to the primary thread to send to the GPU, but there would still be a performance gain.
The CPU performance metric (IPC) is measured by instructions per cycle multiplied by clock speed, so clock speed is obviously an important factor in CPU performance.
Source
> ...performance gain of 200mhz..

Instructions per cycle is not an actual metric of performance. But you should read the rest of that Wikipedia page:
> A CPU bottleneck exists when the CPU isn't fast enough to give the GPU(s) draw calls and the GPU ends up waiting. We want the GPU to be functioning at maximum performance, and it cannot do so if the CPU isn't fast enough to keep up with it. That said, lots of factors can influence bottlenecks and not just hardware. Settings has a big influence on determining whether a game is CPU or GPU bottlenecked as well.

What I meant to say before is that "bottleneck" is a buzzword used way too often. Sure, you may improve performance by overclocking the CPU, but unless you prove a given CPU is unable to increase game performance with varying, more powerful video cards, you can't really say it's bottlenecking the system, i.e. a fixed maximum throughput.
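The CPU/GPU pipeline both posters are describing can be modeled in a few lines (illustrative numbers only, not measurements): the delivered frame rate is capped by whichever stage is slower, which is what "bottleneck" actually means here.

```python
# Minimal model of the draw-call pipeline described above.
# Numbers are hypothetical, purely to illustrate the concept.

def frame_rate(cpu_fps, gpu_fps):
    """Delivered FPS when the CPU submits draw calls and the GPU renders:
    the slower stage caps the whole pipeline."""
    return min(cpu_fps, gpu_fps)

# CPU can prepare 90 frames/s of draw calls; GPU could render 140.
print(frame_rate(cpu_fps=90, gpu_fps=140))  # 90 -> CPU-bound

# Swapping in a faster GPU changes nothing while the CPU is the limit:
print(frame_rate(cpu_fps=90, gpu_fps=200))  # still 90
```

This also captures the second poster's test: if progressively faster video cards stop improving FPS, the fixed ceiling you've hit is the CPU.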
> This question is too nebulous to answer properly without some context. For someone like me with SLI, CPU overclocking can increase performance substantially in games that use a single thread for draw call commands. In multithreaded games, the performance increase would be smaller as multiple threads are uploading information to the primary thread to send to the GPU, but there would still be a performance gain.

Your link is interesting and somewhat useful, but it's measuring SLI scaling (not overclocking directly). We can use their minimal testing scenarios to draw some conclusions, though:
Here is a review which explores the impact of CPU overclocking on performance in games with single GPU and SLI.
> Instructions per cycle is not an actual metric of performance. But you should read the rest of that wikipedia page:

You should have pasted the entire sentence, not just part of it. It said:

"instructions per clock is not a particularly useful indication of [the] performance".

It is a factor in determining measurable performance, but not a metric. If you want to measure and compare computer performance, use FLOPS, or in the case of games (a very applied scenario), FPS under controlled conditions.
> What I meant to say before is "bottleneck" is a buzzword used way too often. Sure you may improve performance by overclocking it, but unless you prove a given CPU is unable to increase game performance with varying, more powerful video cards, you can't really say it's bottlenecking the system, i.e. fixed maximum throughput.

I never used the word "bottleneck" ambiguously, if you recall. I specifically said CPU bottleneck. Lots of things can cause bottlenecks in a system, I agree, but when I say CPU bottleneck, that's exactly what I mean.
> So after a 40% bump, the best we see is a 28% increase in actual game performance. Judging by that review you suggested, it's not such a great thing really. In parallel computing we measure efficiency, a unit of resource utilization, as the ratio of speedup to processors (or resources). If the resources increase faster than the speedup, it is evident that efficiency decreases and tends to zero as the amount of resources increases. This is actually the case here. So while a rare maximum of 28% might be worth it gamer-wise, it's somewhat inefficient. At least with their test setup.

I think it's more useful to disregard the SLI benchmarks and focus on the single-card results, because SLI presents more difficulties due to scaling issues, drivers, and whatnot. BF4 performance, for example, has increased significantly since that article was published thanks to patches, SLI profile and driver updates.
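The efficiency argument quoted above can be put in numbers. A rough sketch, using the 40% clock bump and 28% frame-rate gain cited from the review:

```python
# Efficiency of an overclock: what fraction of the added clock speed
# actually shows up as game performance. Figures are the ones quoted
# in the thread, not independent measurements.

def oc_efficiency(clock_gain, perf_gain):
    """Ratio of performance speedup to resource (clock) increase."""
    return perf_gain / clock_gain

# 40% more clock -> 28% more FPS in the best case:
print(oc_efficiency(clock_gain=0.40, perf_gain=0.28))  # ~0.7, i.e. ~70%
```

So even in the review's best case, roughly 30% of the extra clock speed is lost to other limits in the system, and in games that are GPU-bound the efficiency drops toward zero, which is the poster's point.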
> I wonder: is water cooling realistically worth it? Overclocking-wise I prefer longevity over the few extra bits of FPS I can safely get out of a CPU, and noise-wise... well, I've got a Noctua CPU cooler that I run at a very low RPM, so I can't hear it over my GPU fans. In fact, if I ever were to water cool anything, it would be the GPU.

I don't know about CPU life or GPU life, but I can tell you I live in Australia, where it gets to 45 degrees, and my liquid cooling system is fucking ace: the computer never gets higher than 50 degrees, usually around 35 on a normal day. If you don't touch the unit, it isn't going to leak; it's designed not to. You should never have to open the unit for any reason whatsoever, and if you do, whoever sold it to you needs to be hung out to dry.
http://www.legitreviews.com/images/reviews/1196/noctua_u12p_se2_022.jpg
That's how it looks.