- Jan 29, 2005
Hey,
So yeah, before you hit the wall of text too hard...
TL;DR @ BOTTOM
Good? Ok now I'm starting...
Alright, so I wasn't sure which forum category to post this under: Video Cards, or the CPUs and Overclocking one. But since it relates to OC'ing CPUs more than anything, I opted to post it here (ultimately I feel the answers will be about OC'ing CPUs rather than about the GPUs being reviewed). So basically I'm asking, simply out of curiosity: why is it that in most graphics card reviews I've seen lately, review sites (and reviewers themselves) opt to test the GPU(s) in question on a heavily overclocked system? By "heavily" (quite a subjective adjective, of course) I mean, for example, reviewing... say... a GTX 680 on a system running a Core i7 that went from a non-OC'ed "base" clock of around 3.4GHz to anywhere between 4.0 and as high as 4.7GHz.
If anything, why not run the benchmark tests under both conditions? Indeed, why not benchmark under potential bottleneck conditions (which we common gamers can't always be sure of, since we're almost never shown these benchmarks running on stock-frequency systems) and also under overclocked conditions? Are there any known review sites doing that? I'm asking because I bought a new GPU literally this morning (a GTX 670, upgrading from my previous GTX 285, which I bought nearly four years ago, back in early Feb. 2009). Before buying it I looked at... well... countless reviews, from well-known to lesser-known sites. To be honest, I don't remember every single one of them by heart, but I do clearly remember that most of them (my own estimate is something like 8 out of 10) reviewed their cards under overclocked conditions (various cards of course, one of which was the GTX 670 I'd been eyeing for a long time).
The only thing I remember reading on this subject, about a year ago, relates to what I mentioned above: system bottlenecks. I'll be frank here: I'm mostly a gamer and I don't know much about hardware, just the basics, and what I do know is limited to what's needed for gaming in general. All I know is that a bottleneck is when one specific PC component, usually the CPU, is not fast enough to feed another component (usually the GPU, though I guess it can go the other way as well) as much work as it asks for. The end result (however well or badly I've explained it in layman's terms) is that you lose frame rates simply because one of the two components can't keep up with the other, which is left "suffocating" and can't deliver its full potential to you, the consumer... so it is therefore "bottlenecked".
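That mental model can actually be sketched as a toy calculation (my own illustration, with made-up frame times, not numbers from any real review): each frame needs some CPU work and some GPU work, and whichever takes longer sets the pace.

```python
# Toy bottleneck model (back-of-envelope, hypothetical numbers).
# The slower of the two components gates each frame.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower component sets the frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Say the GPU needs 10 ms per frame, and the CPU needs 14 ms at stock:
print(fps(14.0, 10.0))   # CPU-bound: ~71 FPS, the GPU sits waiting
# Overclock the CPU enough that it only needs 10.5 ms per frame:
print(fps(10.5, 10.0))   # now ~95 FPS, and the system is nearly GPU-bound
```

Past the crossover point, further CPU speed does nothing: `fps(8.0, 10.0)` is the same 100 FPS as `fps(10.0, 10.0)`, which is exactly the "GPU at full potential" situation reviewers seem to be chasing.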
Which... ultimately, explains (and excuses, as well as justifies) why overclocking a system (the CPU, or perhaps the memory and GPU as well) is done: simply to show you, the potential buyer of the reviewed components, what they can indeed deliver under "real", fully-exploited, non-bottlenecked conditions (basically what the hardware can physically do without any issues whatsoever, and what it is designed to provide on paper, in pure raw-number terms). Well, if that's the case, then I agree... but I also disagree, to some extent. It's more that I don't understand why the review sites doing that don't also take the time to test under "normal", non-overclocked (albeit potentially bottlenecked) conditions, and then simply show us both results on a graph (or on multiple graphs, anyway).
Would it be... possible? I mean... ok, yeah, sure it's possible on paper. It's "easy to do", so... why not do it? By the way, if there are such review sites, please let me know; I'd like to see some of their work (I'm genuinely asking, I just can't recall ever seeing a review site content with testing anything under non-overclocked conditions). The thing is, if you're someone who likes to overclock not only out of simple pleasure and curiosity but also because you can no longer be content with a stock-frequency system, and can only game on an OC'ed system, then review sites doing their testing under OC'ed conditions seem completely normal to you. But what if... imagine for just a moment... what if there are gamers out there who don't "want to", or don't "care to", overclock? Say I buy a Core i7 2600K like I did and just... you know... install it, set up the basic options in the BIOS, and call it a night for the next five or six years?
My "problem"... I guess I can call it that... is that a review site telling me that (purely an example here, keep that in mind) in one very specific game, at one very specific screen resolution, I am "supposed" to get around one exact frame rate with that exact test system, that exact GPU, and that exact CPU overclocked to exactly 4.2GHz isn't exactly "accurately representing" the product for ME, me being a person who happens not to overclock (and hasn't overclocked anything in almost three years now, even though I did overclock a bit in the past).
So, in other words, let's say the review lists the following system specs for the benchmarks: the CPU (whatever it is) overclocked @ 4.5GHz; the memory (whatever it is, usually super fast and expensive, rarely the lower-speed variants) at stock frequencies (though sometimes OC'ed as well, albeit not as much as CPUs and GPUs are); and the GPUs being tested at stock too (though sometimes they get OC'ed themselves). Now let's presume only the CPU was OC'ed, and extensively so: not just by 400MHz or so, more like 1GHz+ of so-called "stable OC". Alright, say the benchmark then shows that at exactly 1920 x 1080 with 4xAA, this (any) specific GPU spits out a minimum of 76FPS and a maximum of 88FPS in... let's say... Battlefield 3 (a popular benchmarking game ever since it was released).
Alright... that's... cool to know... that under those conditions it delivers that performance. Ok, but... what about the minimum and maximum frames with the CPU clocked at its default 3.2, 3.3 or 3.4GHz? Can I know how many frames I'd get too (fewer, surely, but I still... want to know; that's the whole reason we look at benchmarks, yeah)? Say the minimum and maximum would change to something like 68FPS and maybe 77FPS respectively? I mean... anything that would more accurately represent my system's conditions when playing this or that game, with this or that GPU. Now sure, you might say "but lolz dude CPU at 3.2Ghz is bottleneck! lolz you mad bro? OC your cpu or don't buy PCs get a 360 lulz!". Ok, but the thing is, regardless of the arguments "for" or "against", one fact remains: there are gamers out there who (as I said earlier) simply don't want to and don't care to overclock; they're content with buying components, installing them, installing their games, and playing them.
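For what it's worth, you can make a rough guess at the stock-clock number yourself from the review's overclocked result, if you assume the worst case where the minimum-FPS moments are fully CPU-bound (so FPS scales with CPU clock). All numbers here are the hypothetical ones from above, and the real drop would be smaller whenever the GPU is actually the limit:

```python
# Back-of-envelope stock-clock estimate from an overclocked benchmark result.
# Worst-case assumption: the minimum-FPS scenes are entirely CPU-bound,
# so frame rate scales linearly with CPU clock. Purely illustrative.

def stock_fps_estimate(oc_fps, oc_ghz, stock_ghz):
    return oc_fps * stock_ghz / oc_ghz

# The review's 76 FPS minimum at 4.5GHz, scaled back to a stock 3.4GHz:
print(round(stock_fps_estimate(76, 4.5, 3.4), 1))  # roughly 57.4 FPS
```

That's the pessimistic floor; the optimistic ceiling is the full 76 FPS (if those scenes were GPU-bound all along). The honest answer lies somewhere in between, which is exactly why I'd rather the review just measured it.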
Let's say that under non-OC'ed conditions the GPU(s) being reviewed would then be bottlenecked (to whatever extent). The thing is, even if that's so, I won't know the difference between it (the bottlenecked performance I'd get) and the alternative. In other words... if I never overclock my system, I will never experience that supposed 75+FPS minimum shown in the benchmark, because to get that minimum I'd need at the very least a CPU of that architecture, probably the exact same model to start with, and certainly running that specific overclocked frequency to "remove the bottleneck" that would otherwise shave 5 or 10+ frames off my overall in-game performance.
The bottom line remains that ignorance is bliss. If I hadn't known in the first place (from seeing benchmarks run that way) what the performance would have been like under OC'ed conditions, then I also wouldn't have known that the performance I do get under "normal" conditions is in fact bottlenecked performance, even if that means my GPU isn't "working at its full potential"... the thing is... again... I wouldn't be aware of it, hence unable to care about it, and I'd even think the performance I do get is the performance I'm supposed to get. And I would only notice that "extra" performance appearing "out of nowhere" the next time I happened to upgrade just the CPU to one that's simply faster than the previous one by default, which would "unlock" that famous performance the GPU had been so eagerly waiting to deliver all that time.
So, what are your opinions on this? Are you bothered by the seemingly standardized overclocked conditions in most benchmarks out there? Would you like it if we got both sets of conditions just as commonly as we get OC'ed reviews? Have you ever felt "misinformed" by benchmarks that never come close to the "system you have" or the "system you'd like to get"? And do you feel you have to "interpret" the performance you'd get by yourself, for lack of actual tangible data on normal system conditions?
Please do share, I'm just curious basically.
----------
TL;DR
º Overclocked components in reviews seem to be standardized; does that bother you?
º Would not being aware of potential bottlenecks justify "normalized", non-overclocked conditions for reviews?
º Are potential bottlenecks the only "useful reason" why GPUs are more often than not reviewed on OC'ed systems (usually OC'ed CPUs)?
º What are consumers "supposed to buy" if they don't overclock, yet can seemingly only justify their GPU purchases by buying CPUs at frequencies that don't exist yet, unless they overclock them just as the review sites do?
º If review sites didn't OC in their reviews, ignorance would be bliss: we wouldn't be aware of actual bottleneck effects at all until upgrading our CPUs later on.
º Please share your thoughts about this if you want!
----------
Thanks for reading (if you did!).