But developers don't solely develop for the lowest common denominator. If that were the case, it simply would not be possible to enable AA in Crysis, or increase texture settings from lowest to medium, or increase the resolution from 1024x768. We have options because they are optional extras for people whose graphics cards can handle them. Why can't the same be done for CPUs? More powerful CPUs get better physics approximations and better AI, etc.

But most people don't have quad-cores. SC2 is not even in the same market as those who play Crysis, for example. They want to push the envelope of a dual-core system, but the game needs to be mass-market enough that more than just cutting-edge systems can play it. Quad- and hexa-core systems may be more prevalent now, but remember that a majority of users still have dual-core systems (roughly 2006 to now), and many still use single-core A64s or P4s.
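The "better CPU gets better physics and AI" idea could work just like graphics presets. As a minimal sketch (not how any shipping engine does it; the tier thresholds, preset names, and numbers are all invented for illustration), a game could map detected core count to a simulation-quality preset:

```python
import os

# Hypothetical simulation-quality presets: more capable CPUs get more
# physics substeps and deeper AI search, the same way graphics presets
# scale shader detail. All values are made up for this example.
PRESETS = {
    "low":    {"physics_substeps": 1, "ai_search_depth": 2},
    "medium": {"physics_substeps": 2, "ai_search_depth": 4},
    "high":   {"physics_substeps": 4, "ai_search_depth": 8},
}

def pick_cpu_preset(cores=None):
    """Map logical core count to a simulation-quality preset."""
    cores = cores if cores is not None else (os.cpu_count() or 1)
    if cores >= 4:
        return "high"    # quad-core and up
    if cores >= 2:
        return "medium"  # the dual-core mainstream
    return "low"         # single-core A64s and P4s

settings = PRESETS[pick_cpu_preset()]
```

A player could still override the auto-detected preset in a menu, just as someone with a weak GPU can force AA off today.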
You'll blow money on overpriced mice, but not on a good SSD or better processor?
But developers don't solely develop for the lowest common denominator. If that were the case, it simply would not be possible to enable AA in Crysis, or increase texture settings from lowest to medium, or increase the resolution from 1024x768. We have options because they are optional extras for people whose graphics cards can handle them. Why can't the same be done for CPUs? More powerful CPUs get better physics approximations and better AI, etc.
So...SC2 has absolutely no optional graphical settings? Everybody runs the game at one resolution, with one texture resolution, one level of shader detail, one level of anti-aliasing and filtering?

SC2 IS a mass-market game. It is not a tech showpiece title like Crysis. It would be foolish to design a game that most people cannot install and play (especially when they cannot get it for their consoles). This is exactly what kills PC gaming. Advancing tech is awesome, but SC2 isn't about the tech; it's about the gameplay.
So...SC2 has absolutely no optional graphical settings? Everybody runs the game at one resolution, with one texture resolution, one level of shader detail, one level of anti-aliasing and filtering?
Actually, everyone but those with 4 GHz Core i7s and top-of-the-line graphics cards does run it on the same settings (the lowest), so that they can get better performance.
That still requires the presence of an option.
True. Not very relevant, but true.

Usually when you have to turn options down, the game ends up looking worse than older games that ran fine on your hardware.
It will increase over time because the gap between the latest tech and what is actually required to run an OS plus productivity apps will increase. Apart from a select few apps, my computing experience would be about as satisfactory with a Core i3 540 as with a Core i7 980X. As we get more threads and more cores in CPUs, that baseline (the i3 540, if you will, just as a hypothetical example) isn't going to change as much, because an OS has a much higher standard of compatibility than any game. So while the baseline increases, the performance of the top-of-the-range and affordable mid-high-range hardware segments will increase faster.

Probably because it's not worth the development cost to implement that. While the gap between the fastest i7 and a Pentium dual-core might be only 2x, the gap between the slowest and fastest graphics cards easily reaches 25x. Lowering the requirements for the latter with variable settings would reach far more users than doing the same for the former.
Not only performance: a lot of the time, lower eye candy actually improves visibility, which is very useful for multiplayer. Fond memories of r_picmip 5.
Umm... there are fast quads, and not so fast quads...
http://www.anandtech.com/bench/Product/157?vs=52
Yes, it's an extreme example... but I am making a point about your 'no speed difference from one quad to another'. That's total BS.
So...SC2 has absolutely no optional graphical settings? Everybody runs the game at one resolution, with one texture resolution, one level of shader detail, one level of anti-aliasing and filtering?
So. For the sake of clarity, let's summarize.

Exarkun was 100% correct. I don't know what part you didn't understand here, or what your point is, really. Obviously the game has all those things.
Undoubtedly. I never said performance did not scale with general CPU performance. In fact, that is the whole point of my argument. I'm sure you would consider it absurd if Blizzard coded SC2 graphically so that a GTX580 could not run the game at higher quality settings than an 8800GT could. But you don't consider it silly when Black Ops gets 16/33 min/avg fps on a stock Athlon II X2 260 while an OC'd i7 920 can get 91/123 fps, yet there are no settings allowing you to adjust the quality of the AI or the quality of the physics simulations used? Black Ops is even a best-case scenario, since it takes advantage of all the cores on a quad-core CPU, whereas SC2's coding means that adding 2 (or 4) more cores essentially makes no difference. I don't see how this is any different, apart from a degree of difficulty in coding, from allowing the GPU to use only 150 of its 320 or 512 shaders.

SC2 is no different. Having a better system (or CPU, anyway) still makes *quite* a bit of difference with this game; go back to page one and look at the difference between procs. Could they have coded it better? Yes. Are there still nicer graphics to be obtained by running up the sliders that people with a trash comp won't see? Absolutely.
So. For the sake of clarity, let's summarize.
Starcraft is a game that has options that allow a user to vary the graphical quality, and therefore the performance, of the game on his/her GPU.
This is a good idea, because of the varying performance difference between the high and low ends of the GPU market.
Many games have similar, similarly extensive options.
Very few such games have similar, similarly extensive options for controlling the performance of a game on a CPU.
However, there also exists in the CPU market a performance difference between the high and low ends.
I don't see what is so very confusing about this argument.
Undoubtedly. I never said performance did not scale with general CPU performance. In fact, that is the whole point of my argument. I'm sure you would consider it absurd if Blizzard coded SC2 graphically so that a GTX580 could not run the game at higher quality settings than an 8800GT could. But you don't consider it silly when Black Ops gets 16/33 min/avg fps on a stock Athlon II X2 260 while an OC'd i7 920 can get 91/123 fps, yet there are no settings allowing you to adjust the quality of the AI or the quality of the physics simulations used? Black Ops is even a best-case scenario, since it takes advantage of all the cores on a quad-core CPU, whereas SC2's coding means that adding 2 (or 4) more cores essentially makes no difference. I don't see how this is any different, apart from a degree of difficulty in coding, from allowing the GPU to use only 150 of its 320 or 512 shaders.
You can overclock an i7 too...!
Doesn't lowering graphic settings also lower the stress on the CPU?

Yes, in general many game settings also have an impact on the CPU.
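The point that graphics settings carry CPU cost can be made concrete: options like draw distance or unit detail change how many objects the CPU must cull, animate, and submit each frame, independent of the GPU. A toy sketch (the function and all numbers are invented for illustration):

```python
def objects_to_process(total_objects, draw_distance_fraction):
    """Objects the CPU must update and submit per frame, assuming
    objects are spread uniformly over a 2D map and only those inside
    the draw distance are processed (area scales with distance squared)."""
    return int(total_objects * draw_distance_fraction ** 2)

# Halving the draw distance cuts CPU-side object work to a quarter.
full = objects_to_process(1000, 1.0)  # 1000 objects per frame
half = objects_to_process(1000, 0.5)  # 250 objects per frame
```

That quadratic falloff is why a "view distance" slider, nominally a graphics option, often helps CPU-bound systems as much as GPU-bound ones.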