Supreme Commander Uses *all* 4 Cores


Woofmeister

Golden Member
Jul 18, 2004
Originally posted by: apoppin
Originally posted by: aigomorla
Originally posted by: apoppin
Originally posted by: Woofmeister
Here's an interesting question. Assuming this thread is correct and even as of August there will be no 1333MHz FSB quad cores, would you still be better off buying an E6850 for gaming even with games designed to take advantage of multiple cores?

Intel has made this all so complicated. Remember when you just bought the fastest/best overclocking processor you could afford?

i dunno ... that is for each person to answer

i am on a *need to* upgrade basis ...

when i need to upgrade i will decide then

knowing what i know now about Quad core complicates things ... a bit ...
but i believe *my* upgrade path will become totally clear --after Barcelona is released ;)

i'd rather know that the gaming trend is toward multi-core than stumble about upgrading blindly in *hopes* of playing next gen games very well. :p

in fact ... STALKER is supposedly optimized for Quad core

http://www.firingsquad.com/hardware/stalker_mainstream_3d_performance/page9.asp
So that does it for Part 1 of our S.T.A.L.K.E.R. performance eval. In Part 2 we'll be taking a look at high-end cards, and in Part 3 we'll examine CPU performance. Supposedly S.T.A.L.K.E.R. takes advantage of quad-core CPUs. We'll be putting this to the test shortly!


OMG I swear if I see another QX Stalker combo, ima scream.

I had absolute nightmares OCing that combo; I couldn't break 400 FSB. And then the stupid board nerfed on me, and would reset the BIOS each time I shut down.

I think they addressed the quad-core issue; however, EVGA implemented it with their A1 revision. I don't hear of ASUS doing a stalker revision. :X


Anyhow, my current system, thanks to my fubared AR EVGA (I'm starting to hate 680i, but I have the A1 ready to go), runs at 3.2GHz -> 3.6GHz with dual 7900GTs @ 700MHz in SLI. I don't lag at all in Supreme Commander with max settings at 1600x1200.

So what was that guy saying he calls shins? This C2D is seriously that much faster than my Opty 175 @ 3.080GHz, so I'd upgrade to an E6300. That would seriously clear your stutter issues. Also your video card, maybe... I'm still waiting for next-gen ATI stuff to come out before I step on the 8800GTX / 2900 X-fire bandwagon.

OK ... i read your post twice ... three times and i have absolutely *no* idea what you are talking about
:confused:

STALKER is a video game ... :p
... huh?

Me too. I thought it was just because I'm so hung over! :confused: I think he's confusing S.T.A.L.K.E.R. (the computer game) with the ASUS Striker motherboard.
 

Sphexi

Diamond Member
Feb 22, 2005
Originally posted by: apoppin
Originally posted by: Sphexi
Umm... I've played it pretty extensively on an E6300 with no OC'ing and a 7950GT; I run it on max settings at 1280x1024 and it runs great. To be honest, I prefer oldschool TA though; in SC the gameplay itself seems to kind of drag on, and I can't figure out how to get the screen to scroll slower :(

we are not talking about LOW resolutions :p

Since when is 1280x1024 considered low? It's the max my little 17" LCD will handle, no cash to upgrade to some giant Dell something or other. Plus, I'd think that higher resolutions would be more dependent on the video card, not the processor.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: Sphexi
Originally posted by: apoppin
Originally posted by: Sphexi
Umm... I've played it pretty extensively on an E6300 with no OC'ing and a 7950GT; I run it on max settings at 1280x1024 and it runs great. To be honest, I prefer oldschool TA though; in SC the gameplay itself seems to kind of drag on, and I can't figure out how to get the screen to scroll slower :(

we are not talking about LOW resolutions :p

Since when is 1280x1024 considered low? It's the max my little 17" LCD will handle, no cash to upgrade to some giant Dell something or other. Plus, I'd think that higher resolutions would be more dependent on the video card, not the processor.

since about 2 years ago :p

that's *why* your E6300 runs it so smoothly ... try the same thing with 16x10

*resolutions* are dependent mostly on your CPU

*details* are your GPU's department
 

deadseasquirrel

Golden Member
Nov 20, 2001
Originally posted by: apoppin
*resolutions* are dependent mostly on your CPU
*details* are your GPU's department

Most of the reviews out there point to the opposite, though. Just using this one as a quick example: raising the resolution to what one might consider high (16x12 and 19x12) shows that the FPS between all processors with the same video card is about equal.

Granted, that was just Far Cry, but after flipping through the whole article, I can even see, for example, a CPU-hungry game like Company of Heroes showing that a C2D X6800 with an XTX couldn't surpass a much weaker C2D E4300 with a 320MB GTS at a res where the 320 is known to choke with AA/AF (19x12).

There are exceptions here and there, sure. But, overall, from the reviews I've seen, as resolutions go up, stronger video cards begin to distinguish themselves, and the processors they are matched with begin to blend together. That's why we see even a C2D at 1.8GHz matched with a GTX perform the same as a C2D at 3.4GHz with the GTX in many games (link).

I'm far from an expert, but everything I have seen indicates that as resolutions rise, the power of the CPU matters less and less. That's the main reason why, when reviewers run gaming benchmarks for CPU testing, they often run them at resolutions such as 800x600, taking the GPU out of the equation.
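A toy frame-time model (entirely made-up numbers) illustrates the point: per-frame CPU work is roughly resolution-independent, per-frame GPU work grows with pixel count, and whichever is slower caps the frame rate. At 800x600 the CPU difference shows up in full; by 19x12 both CPUs are pinned at the same GPU-limited figure.

```python
# Hypothetical per-frame costs -- the slower stage sets the frame rate.
def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    gpu_ms = gpu_ms_per_mpix * (width * height / 1e6)  # GPU cost scales with pixels
    return 1000.0 / max(cpu_ms, gpu_ms)                # bottleneck stage wins

for cpu_ms, label in ((8.0, "fast CPU"), (16.0, "slow CPU")):
    for w, h in ((800, 600), (1600, 1200), (1920, 1200)):
        print(f"{label:8s} @ {w}x{h}: {fps(cpu_ms, 12.0, w, h):5.1f} fps")
# fast CPU ~125 fps vs. slow CPU ~62 fps at 800x600 (CPU-bound),
# but both land at ~43 fps at 1600x1200 and ~36 fps at 1920x1200 (GPU-bound).
```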
 

apoppin

Lifer
Mar 9, 2000
Far Cry is considered an *exception* ... the GPU is doing most of the work ... that's why it doesn't run really well at super-high resolutions except with some serious video card HW

look at it this way ... if you can run comfortable frame rates with your CPU in SC at 10x7 ... you will reach a point - e.g. 12x10 - where raising the resolution makes the FPS suffer, and eventually you will get a slideshow

it doesn't matter if you are running an x1950xt or an 8800GTX ... the max frames are [largely] determined by the CPU ...
in the case of being *unable* to run at 12x10, you are said to be "bottlenecking" your video card ... it needs a faster CPU

now running with the 8800GTX you will be able to get all the *details* you cannot get with the x1950xt [for example] ... you will be able to run the HiQ Max 8xAA/16xAF at the maximum rate your CPU can run

if you want higher resolution [basically], upgrade your CPU

if you want *max* details and AA/AF at your current resolution, Upgrade your videocard

look at their conclusion:
With so much graphics horsepower however, you run the danger of your CPU bottlenecking the graphics card. This happens most frequently with older games and at lower screen resolutions, particularly if you have a slower CPU. We saw this just recently in our GeForce 8800 GTX/GTS Performance with Athlon 64 article, where the X2 3800+ wasn't able to keep up with the 8800 GTX. As a result, the GeForce 8800 GTX/X2 3800+ system was outperformed in some cases by some configs with slower graphics cards.

Those of you who may have been worried that we'd see a repeat of the same situation with the GeForce 8800 GTX when paired with the Core 2 Duo E4300 and E6400 should be encouraged by our results today. At its stock speed of 1.8GHz, the E4300 was largely able to keep up with the GeForce 8800 GTX; there were only two cases (Quake 4 and, more urgent, HL2: Lost Coast) where the 8800 GTX really could stand to benefit from a faster CPU, or a little bit of E4300 overclocking.

The E6400 was also CPU-bound under Lost Coast, with the GeForce 8800 GTS 640MB delivering the exact same performance with the E6400 as the 8800 GTX. Fortunately by 1600x1200 this problem largely went away for the E6400.

.... Therefore if you do intend on pairing a high-end card like the GeForce 8800 GTX with a less expensive CPU, it becomes more important that you OC the processor so the graphics card and CPU are well balanced. You don't want to get stuck in a situation where the graphics card is held up by the processor if you can avoid it.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
Originally posted by: apoppin
Originally posted by: aigomorla
Originally posted by: apoppin
Originally posted by: Woofmeister
Here's an interesting question. Assuming this thread is correct and even as of August there will be no 1333MHz FSB quad cores, would you still be better off buying an E6850 for gaming even with games designed to take advantage of multiple cores?

Intel has made this all so complicated. Remember when you just bought the fastest/best overclocking processor you could afford?

i dunno ... that is for each person to answer

i am on a *need to* upgrade basis ...

when i need to upgrade i will decide then

knowing what i know now about Quad core complicates things ... a bit ...
but i believe *my* upgrade path will become totally clear --after Barcelona is released ;)

i'd rather know that the gaming trend is toward multi-core than stumble about upgrading blindly in *hopes* of playing next gen games very well. :p

in fact ... STALKER is supposedly optimized for Quad core

http://www.firingsquad.com/hardware/stalker_mainstream_3d_performance/page9.asp
So that does it for Part 1 of our S.T.A.L.K.E.R. performance eval. In Part 2 we'll be taking a look at high-end cards, and in Part 3 we'll examine CPU performance. Supposedly S.T.A.L.K.E.R. takes advantage of quad-core CPUs. We'll be putting this to the test shortly!


OMG I swear if I see another QX Stalker combo, ima scream.

I had absolute nightmares OCing that combo; I couldn't break 400 FSB. And then the stupid board nerfed on me, and would reset the BIOS each time I shut down.

I think they addressed the quad-core issue; however, EVGA implemented it with their A1 revision. I don't hear of ASUS doing a stalker revision. :X


Anyhow, my current system, thanks to my fubared AR EVGA (I'm starting to hate 680i, but I have the A1 ready to go), runs at 3.2GHz -> 3.6GHz with dual 7900GTs @ 700MHz in SLI. I don't lag at all in Supreme Commander with max settings at 1600x1200.

So what was that guy saying he calls shins? This C2D is seriously that much faster than my Opty 175 @ 3.080GHz, so I'd upgrade to an E6300. That would seriously clear your stutter issues. Also your video card, maybe... I'm still waiting for next-gen ATI stuff to come out before I step on the 8800GTX / 2900 X-fire bandwagon.

OK ... i read your post twice ... three times and i have absolutely *no* idea what you are talking about
:confused:

STALKER is a video game ... :p
... huh?

ack.. im tripping out... I ment to say STRIKER. not stalker.

yeah seriously i think im losing it. :p
 

deadseasquirrel

Golden Member
Nov 20, 2001
Originally posted by: apoppin
look at their conclusion:
The E6400 was also CPU-bound under Lost Coast, with the GeForce 8800 GTS 640MB delivering the exact same performance with the E6400 as the 8800 GTX. Fortunately by 1600x1200 this problem largely went away for the E6400.

.... Therefore if you do intend on pairing a high-end card like the GeForce 8800 GTX with a less expensive CPU, it becomes more important that you OC the processor so the graphics card and CPU are well balanced. You don't want to get stuck in a situation where the graphics card is held up by the processor if you can avoid it.

And, don't get me wrong, I agree with the final statement about not having your GPU held up by a slow CPU if you can avoid it.

What I can't see is the part about resolution being determined by the CPU. CPU affecting max framerate has been mentioned a lot, but, unfortunately, reviewers rarely give max results unless they run FEAR's benchmark (and then it's only sometimes). So, I haven't seen enough reviews to really discuss that topic much.

This part of their conclusion:
the X2 3800+ wasn't able to keep up with the 8800 GTX. As a result, the GeForce 8800 GTX/X2 3800+ system was outperformed in some cases by some configs with slower graphics cards.
while correct, leaves an important piece of information out: the effect goes away (or at least diminishes a lot) as the resolution goes up.

Here is their X2 article showing Dark Messiah. This is a perfect example. At 1280, an FX-62/GTS combo outperforms the X2 3800+/GTX combo by a good margin, just as the conclusion in their other article said. Bump that up to 16x12, and the increase is now less than 10%. Bump it again, and it's now reversed. But the 19x12 graph shows how important CPU power is with a GTX, even at a high res, with this particular game, with the FX-62 stomping all over the 3800+ with the same GTX. One thing to also notice in both articles: as the resolution went all the way up to 19x12, no CPU matched with the GTX had trouble with any game. All performed well. The same cannot be said for fast CPUs matched with the likes of XTXs and 7950GTs.

I don't really see Far Cry as an exception, per se. Dark Messiah is using the same engine as HL2. Both of those games improve greatly in performance with fast CPUs. Valve did a great job of developing an engine that scales very well with all kinds of hardware. I see them as more of an exception than games like FEAR, Oblivion, Quake4, Far Cry, etc.
 

apoppin

Lifer
Mar 9, 2000
the Source engine is the weakest of the modern engines :p ... and you said it yourself, even though you are missing the point ...
Both of those games improve greatly in performance with fast CPUs
the CPU-GPU interaction is complex while playing a game - i am simplifying it ... there is a point [which the charts do not illustrate - since they are trying NOT to bottleneck the videocard] where your *maximum* frame rate is ultimately limited by the CPU ... no matter what video card you put in ... you canNOT get *more* FPS ... only more *details* and AA/AF ... that's *why* the X2 3800+/GTX combo was slower than a Core2Duo/x1900xt in some cases. The CPU is the "limiter" of the benchmark's frame rates at a high [for it] resolution and it "bottlenecks" the GPU

my minor point was that FEAR is less dependent on the CPU than many other modern game engines ... there was a test running it with a slow-ass Celeron and a modern GPU at low resolution - however, IF you crank the resolution above what your CPU can handle, a faster video card won't help the frame rates.
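The same sort of toy arithmetic (hypothetical per-frame costs, not measured data) makes the "limiter" point concrete: once the CPU's share of a frame is the slow part, a faster card adds detail headroom but no FPS, and a faster CPU with a lesser card can come out ahead - the X2 3800+/GTX situation described above.

```python
# Hypothetical per-frame costs in ms at one fixed resolution; the slower stage caps FPS.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

slow_cpu, fast_cpu = 20.0, 10.0   # e.g. X2 3800+ vs. a quicker chip (invented numbers)
big_gpu, mid_gpu   = 11.0, 18.0   # e.g. 8800 GTX vs. a mid-range card (invented numbers)

print(f"slow CPU + big GPU: {fps(slow_cpu, big_gpu):.0f} fps")   # 50 fps - CPU-bound, the GTX is wasted
print(f"slow CPU + mid GPU: {fps(slow_cpu, mid_gpu):.0f} fps")   # 50 fps - same cap with a cheaper card
print(f"fast CPU + mid GPU: {fps(fast_cpu, mid_gpu):.0f} fps")   # 56 fps - beats the slow-CPU/GTX combo
print(f"fast CPU + big GPU: {fps(fast_cpu, big_gpu):.0f} fps")   # 91 fps - only now does the GTX pay off
```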
 

HannibalX

Diamond Member
May 12, 2000
I have an FX-60 (stock 2.6GHz), X1950 Pro 256, 2GB of DDR400 Ram and I run SP at 16x10 with FULL DETAIL. The game runs fine. Will it run maybe 15 fps faster if I had a quad? Maybe.

I will wait for AMD's quad solution though before I go off the deep end.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: Pale Rider
I have an FX-60 (stock 2.6GHz), X1950 Pro 256, 2GB of DDR400 Ram and I run SP at 16x10 with FULL DETAIL. The game runs fine. Will it run maybe 15 fps faster if I had a quad? Maybe.

I will wait for AMD's quad solution though before I go off the deep end.
the benches suggest that your frame rates will *suffer* when the screen is absolutely *full* ...

anyway ... what is "fine" to you is not so fine to others ;)

for example, i am also waiting for Barcelona QC ... with a lesser system than yours - including a less demanding 14x9 LCD that is "also satisfactory" for me - even though i don't even play SC
[i am primarily FPS/RPGer]

--but it is *good* to know the "trend" is for QC ... and that the new games are being coded for it right now.

good to keep in mind if you are considering a CPU upgrade
;)

that's the *purpose* of this thread
[since i am the OP]

 

HannibalX

Diamond Member
May 12, 2000
Originally posted by: apoppin
Originally posted by: Pale Rider
I have an FX-60 (stock 2.6GHz), X1950 Pro 256, 2GB of DDR400 Ram and I run SP at 16x10 with FULL DETAIL. The game runs fine. Will it run maybe 15 fps faster if I had a quad? Maybe.

I will wait for AMD's quad solution though before I go off the deep end.
the benches suggest that your frame rates will *suffer* when the screen is absolutely *full* ...

anyway ... what is "fine" to you is not so fine to others ;)

for example, i am also waiting for Barcelona QC ... with a lesser system than yours - including a less demanding 14x9 LCD that is "also satisfactory" for me - even though i don't even play SC
[i am primarily FPS/RPGer]

--but it is *good* to know the "trend" is for QC ... and that the new games are being coded for it right now.

good to keep in mind if you are considering a CPU upgrade
;)

that's the *purpose* of this thread
[since i am the OP]

By fine I mean 60 fps or better, never dropping below 60. ;)

 

apoppin

Lifer
Mar 9, 2000
Originally posted by: Pale Rider
Originally posted by: apoppin
Originally posted by: Pale Rider
I have an FX-60 (stock 2.6GHz), X1950 Pro 256, 2GB of DDR400 Ram and I run SP at 16x10 with FULL DETAIL. The game runs fine. Will it run maybe 15 fps faster if I had a quad? Maybe.

I will wait for AMD's quad solution though before I go off the deep end.
the benches suggest that your frame rates will *suffer* when the screen is absolutely *full* ...

anyway ... what is "fine" to you is not so fine to others ;)

for example, i am also waiting for Barcelona QC ... with a lesser system than yours - including a less demanding 14x9 LCD that is "also satisfactory" for me - even though i don't even play SC
[i am primarily FPS/RPGer]

--but it is *good* to know the "trend" is for QC ... and that the new games are being coded for it right now.

good to keep in mind if you are considering a CPU upgrade
;)

that's the *purpose* of this thread
[since i am the OP]

By fine I mean 60 fps or better, never dropping below 60. ;)
running FRAPS?

*massive battles* ... screen full? at 16x12

i can only give the experience of a friend with SC ... who benched it with GTX SLI and an x1650p, with dual-core and quad-core ...

his experience was a *significant* improvement with QC over DC on either GPU setup ... better than 10 FPS with GTX SLI/QC over GTX SLI/DC

...and the 3 links i posted in the first post tend to support my comments that QC offers a significant improvement in SC's massive battles at hi-res. ;)

here ... more proof:

http://forums.anandtech.com/messageview...atid=31&threadid=2021035&enterthread=y
 

deadseasquirrel

Golden Member
Nov 20, 2001
Originally posted by: apoppin
there is a point [which the charts do not illustrate - since they are trying NOT to bottleneck the videocard] where your *maximum* frame rate is ultimately limited by the CPU ... no matter what video card you put in ... you canNOT get *more* FPS ... only more *details* and AA/AF ... that's *why* the X2 3800+/GTX combo was slower than a Core2Duo/x1900xt in some cases. The CPU is the "limiter" of the benchmark's frame rates at a high [for it] resolution and it "bottlenecks" the GPU

Yeah, I wish more reviews showed things such as max/min along with avg. Granted, I care more about the average, but it would be nice to know how the differing combinations affect framerate. The only review I quickly found was this one, but it is limited in the settings it shows.

IF you crank the resolution above what your CPU can handle, a faster video card won't help the frame rates.

I guess that's the point I'm trying to dig deeper into. We need more info on what would be considered resolutions that certain CPUs cannot handle. For example, and I know FEAR might not be the best example, but these 2 charts show some interesting things, specifically what you are talking about with max framerates. FEAR has AA/AF added, and, at that res, even the 6300 can handle high-res max frames just as easily as a 3.6GHz X6800. So even a stock 6300 isn't held back from reaching a high res (if FEAR with AA/AF can be used as a proper example of this). The NFS chart has no AA/AF, and you can begin to see some differentiation, granted only 5%, between the CPUs at max fps.

If anybody can find any more good benches with some different games and settings, please post (though I guess this is going way off topic from the OP).

To make a poor attempt at getting my discussion back on topic of SC and quad-core... I think it's great that games are beginning to use multiple cores. I've held off upgrading even my single-core A64 simply because I didn't see much benefit for me. I, like a tiny few others, will likely pass up dual-core completely and hit the cheap Q6600 in August (or whatever Barcelona might be doing by then). Though the upgrade bug has been biting for a while, and a quick C2D build in May could happen if I don't police myself well.
 

apoppin

Lifer
Mar 9, 2000
we discussed this *a lot* in Video ... i am carrying the conclusions from our discussions over to here ...

CPUs ... where they also really belong

too bad search is broken here ... i am *running* off to work right now ... but i will be glad to dig deeper into our discussions in Video and post relevant links

the last one that was *somewhat relevant* was this one on the P4 bottlenecking the x1950p/512M ... quite a few of us tested it and there is quite a bit of info in the OP's charts [which are NOT exhaustive, by any means]

http://forums.anandtech.com/messageview...atid=31&threadid=2017733&enterthread=y

gotta run!
:Q

:clock:
 

Sphexi

Diamond Member
Feb 22, 2005
I'm betting one of the major issues isn't so much that the CPU determines resolution limits, it's that when the screen begins to fill up with units/buildings (as the resolution increases, so does the number of things on the screen), the CPU has to work more to run the AI for those units, calculate what they're doing, and then let the GPU know so it can render it.

Obviously quad-core would help out a lot with that, but the same can be said for almost any game of this magnitude. I pulled up TA after reading this thread, set it to a 5k unit max limit, and started a 4-player skirmish at the highest resolution in the game (1280x1024). I set all the graphics settings on high, built a bunch of standard bot factories, and cranked out some Pee-Wees. Then I sent about 1500 of them out into battle all at once, and when I had the screen centered on that little skirmish the whole thing basically died (1 fps or so). That's a game that's definitely not multi-threaded, but lots of things on the screen had the same effect.

Bottom line, I'm still going to play SC, and I'm not going to buy a QC to do so, and probably not in the next 3 to 4 years that this current rig will be good for.
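A minimal sketch of the effect described above (hypothetical code, not SupCom's or TA's actual engine): per-tick AI/simulation work grows with the number of units, and only a sim that spreads that work across cores gets anything out of a quad.

```python
# Toy per-unit "AI" cost and a sim tick run on 1 worker vs. 4 workers.
import time
from concurrent.futures import ProcessPoolExecutor

def unit_ai(unit_id):
    # stand-in for the pathfinding/targeting work done for one unit each tick
    acc = 0
    for i in range(5_000):
        acc += (unit_id * i) % 7
    return acc

def tick(units, workers):
    start = time.perf_counter()
    if workers == 1:
        for u in units:
            unit_ai(u)
    else:
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(unit_ai, units, chunksize=64))
    return time.perf_counter() - start

if __name__ == "__main__":
    for count in (100, 1500):              # small skirmish vs. the 1500-unit brawl
        units = list(range(count))
        t1 = tick(units, 1)
        t4 = tick(units, 4)
        print(f"{count:5d} units: 1 core {t1*1000:6.0f} ms/tick, 4 cores {t4*1000:6.0f} ms/tick")
```

Process start-up overhead is included in each tick here, so the parallel win shows up mostly at the big unit counts; once the work dwarfs that overhead, the quad-core tick should land at very roughly a quarter to a third of the single-core time, which is the regime SupCom's massive battles live in.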
 

yacoub

Golden Member
May 24, 2005
Originally posted by: Sphexi
I'm betting one of the major issues isn't so much that the CPU determines resolution limits, it's that when the screen begins to fill up with units/buildings (as the resolution increases, so does the number of things on the screen), the CPU has to work more to run the AI for those units, calculate what they're doing, and then let the GPU know so it can render it.

I've said that since day one. That IS the primary reason it bogs down - the AI calculations necessary for larger numbers of units in play.

One way to confirm this is that your orders issued to units will start to become very delayed when there are more units in play than your system can comfortably handle.


It happened in TA (more noticeably on the processors we used back then, of course), it plagued C&C Generals, and now it's even more apparent in SupCom, which is odd since it should make better use of more than one core than those games, which were single-threaded.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: yacoub
Originally posted by: Sphexi
I'm betting one of the major issues isn't so much that the CPU determines resolution limits, it's that when the screen begins to fill up with units/buildings (as the resolution increases, so does the number of things on the screen), the CPU has to work more to run the AI for those units, calculate what they're doing, and then let the GPU know so it can render it.

I've said that since day one. That IS the primary reason it bogs down - the AI calculations necessary for larger numbers of units in play.

One way to confirm this is that your orders issued to units will start to become very delayed when there are more units in play than your system can comfortably handle.


It happened in TA (more noticeably on the processors we used back then, of course), it plagued C&C Generals, and now it's even more apparent in SupCom, which is odd since it should make better use of more than one core than those games, which were single-threaded.

of course ...

QC shows an improvement primarily for that very reason ... i DID state that SC chokes the fastest Dual-core with *massive battles* at high resolutions :p

--now, Sphexi, *drop* your resolution from 12x10 to 6x4 and see if you get better than 1 FPS in TA. ;)

it does not negate the info on CPU-GPU interaction at changing resolutions
 

Mogadon

Senior member
Aug 30, 2004
Originally posted by: apoppin
Originally posted by: Woofmeister
Thanks apoppin (I've said that on other occasions haven't I?). If the Mods aren't paying attention at least you are.

Up until the Enthusiast article, I would have called Quad Core a waste of money. More than that, given how hot Quads run compared to Core 2 Duo, I would have called a Quad purchase a serious mistake for anyone thinking of overclocking. Now, I'm thinking of buying the QX6700 when the April price drops roll around.

Alan Wake and Unreal Tournament 3 beckon.
this is SO important to enthusiasts ... i am glad to be of service

i am figuring ... sometime next year ... we are gonna see a *leap* ... like we did with DX8 > DX9 ... sure we will see Crysis and some other nice DX10-patched games this year ... but then there is gonna be *a game* ... maybe Alan Wake - DX10/Vista only and multi-threaded - that will just *blow* us away ... and an A64/8800gts will only be good for *med* details, and slow ... a Core2/8900gtX will get faster ... but we will *need* QC and SLI or Xfire to make it an extraordinary experience

i think everyone upgrades after that

this is my prediction based on what i see in the gaming industry

[excellent] pre-made Game Engine dev kits are *so polished* it now takes only *one* year to bring a game from "concept to gold" whereas before it took two or three

Dev Kits for DX 10 have been out for about a year, and Alpha HW and emulators long before that ... so ... only another year or ... max two ... for a virtual DELUGE of DX10/multi-threaded games ...

the *first* multi-threaded game using 4 cores is already here ... if one cares to notice ... the *others* can't be far behind

what dev wants to look like he is old-fashioned? ... they always offer the latest .. and the Crysis Devs were quoted as saying it would bring the fastest DC CPU and SLI to its knees - with *everything maxed* at high resolution. Of course it is still playable on a lesser rig.

DX10 clears out the cheap gamers from the serious ones ... sure there will be games *available* for DX9 in 3-4 years ... just like there are games still "playable" on DX8 ... in 2007. ... but less incentive for Devs to code for their pathway. :p


I think your prediction is totally correct, except for the part about 'everyone will upgrade' after seeing how extraordinary that certain game is.

The thing is that most people's upgrades are driven by available money. Sure, everyone wants the latest and greatest, and that's as true now as it will be when that certain game suddenly proves the worth of quad core and SLI/Crossfire.

Unfortunately the majority of people's upgrade paths aren't driven by what they want but by what they can afford. It doesn't make any difference if 'duke nukem 2008' runs 300% better on an octo-core, quad-SLI'd machine w/ 8GB RAM; if you can't afford the dollars for said machine you ain't gonna buy it.

So yes, you're totally correct that DX10 might separate the 'cheap gamers' from the 'serious (i.e. rich) gamers', but there's nothing new about that; they've always been separated.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: MarcVenice
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.

and you're a total retard if you think i think that

:D

Devs program for the LCD ... the lowest common denominator

but the LCD is creeping up - *quickly* compared to years past ... some PC games require SM 3.0 just to run ;)
-x850xt is 'out' even if it is 'powerful' enough

Devs *also* make these same games *challenging* for the fastest rigs - if you crave detail and high resolution

after Next year, the LCD will be "vista and DX10 and multicore"

these devs also realize that the 'weak' PC gamers are likely to have a console ...


times are a changing
 

magomago

Lifer
Sep 28, 2002
Originally posted by: apoppin
Originally posted by: MarcVenice
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.

and you're a total retard if you think i think that

:D

Devs program for the LCD ... the lowest common denominator

but the LCD is creeping up - *quickly* compared to years past ... some PC games require SM 3.0 just to run ;)
-x850xt is 'out' even if it is 'powerful' enough

Devs *also* make these same games *challenging* for the fastest rigs - if you crave detail and high resolution

after Next year, the LCD will be "vista and DX10 and multicore"

these devs also realize that the 'weak' PC gamers are likely to have a console ...


times are a changing

And they also realize that many PC gamers pirate their games - I cannot tell you how many times I see people who get a dual core, crazy cooling, SLI of some sort... as well as massive hard drives, only to pirate all the software.

hence they will move to develop for the consoles as well
times are a changing

:p

that and ultimately the most successful PC games work on a WIDE range of hardware.

If it sucks monkey balls on anything less than the top 15% of hardware...it isn't going to sell well.

Half-Life 2 didn't just have pure hype; the game was still very pretty on my 2200+, 64MB GeForce4 MX420 (think GeForce2 on steroids) and 512MB of RAM. Even Unreal Tournament 2003 ran pretty smooth at 1024x768 - when I moved to a GF4 Ti4200 it was still fairly smooth.
Every wildly successful game scales well
 

yacoub

Golden Member
May 24, 2005
Originally posted by: MarcVenice
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.

If you want your game to still be interesting a year or two down the road, you program a game that can be played well currently but that offers additional features that can be more readily enabled by the higher-performing systems people will have a year from now and two years from now. SupCom has done this by creating a game that is indeed playable now even on my single-core system, yet in a few months when I upgrade to a dual-core rig, I will be able to play with more AI, I will be able to bump up unit counts, and when I upgrade the GPU in several months, I'll be able to max out the graphical settings. Thus the game stays fresh and interesting all along the way. There is also the anticipated DX10 patch that will add further visual enhancements to the game for those running Vista/DX10 and a DX10 GPU.

Bottom line: You make a game playable by the majority of folks but you include features that some folks might have rigs potent enough to use right now but that most people will be able to take advantage of in 6-12 months. That's a smart way to build the game.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: magomago
Originally posted by: apoppin
Originally posted by: MarcVenice
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.

and you're a total retard if you think i think that

:D

Devs program for the LCD ... the lowest common denominator

but the LCD is creeping up - *quickly* compared to years past ... some PC games require SM 3.0 just to run ;)
-x850xt is 'out' even if it is 'powerful' enough

Devs *also* make these same games *challenging* for the fastest rigs - if you crave detail and high resolution

after Next year, the LCD will be "vista and DX10 and multicore"

these devs also realize that the 'weak' PC gamers are likely to have a console ...


times are a changing

And they also realize that many PC gamers pirate their games - I cannot tell you how many times I see people who get a dual core, crazy cooling, SLI of some sort... as well as massive hard drives, only to pirate all the software.

hence they will move to develop for the consoles as well
times are a changing

:p

that and ultimately the most successful PC games work on a WIDE range of hardware.

If it sucks monkey balls on anything less than the top 15% of hardware...it isn't going to sell well.

Half-Life 2 didn't just have pure hype; the game was still very pretty on my 2200+, 64MB GeForce4 MX420 (think GeForce2 on steroids) and 512MB of RAM. Even Unreal Tournament 2003 ran pretty smooth at 1024x768 - when I moved to a GF4 Ti4200 it was still fairly smooth.
Every wildly successful game scales well
the LCD is getting higher .... MUCH quicker than in the past - some games can't even be played on an x850xt because it lacks SM3.0 - although it is plenty powerful enough

it looks like someone is finally - rightly - predicting the DEATH of the Consoles ... the "PC wannabe" consoles like the 360 and especially the PS3 --as PCs get more powerful and cheaper ... and 'wired' in every room

http://forums.anandtech.com/messageview...adid=2033611&STARTPAGE=4&enterthread=y


btw, HL2 looks like crap compared with other modern engines ... no good lighting and shadows, a *joke* of a 'flashlight' and a sssStutter they still can't fffix. ... good 'artists' though at Valve.

Next year Alan Wake will be DX10 and Vista only ... no console will be able to 'touch it' ... nor will Crysis look as good on a console as its PC counterpart, a less demanding game than AW.

pirating is a small reason Devs are programming for the consoles .... there is *money* to be had .... money for nothing

... chicks for free
:D
 

Darrvid

Member
Nov 17, 2005
Originally posted by: MarcVenice
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.

It's already been said, but developers aren't coding for what the majority of people are currently running. They're coding for the future man.... the FUTURE! When you're making a game with as much money invested as developers do, you don't want your product to be outdated within 6 months. There needs to be longevity in the product so people are still compelled to buy it later in its lifespan.

So, try not to cast the first stone, because they often bounce back and cause brain damage :p
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: Darrvid
Originally posted by: MarcVenice
you're a total retard if you think game developers will develop games with eye candy that only hardcore gamers, with the best possible rigs, who make up like 5% of the total gamer population, can see. They would be losing out on like 50% of their sales. Of course at max everything only a few ppl can run it, but everything at near max should be available to the mainstream public.

It's already been said, but developers aren't coding for what the majority of people are currently running. They're coding for the future man.... the FUTURE! When you're making a game with as much money invested as developers do, you don't want your product to be outdated within 6 months. There needs to be longevity in the product so people are still compelled to buy it later in its lifespan.

So, try not to cast the first stone, because they often bounce back and cause brain damage :p

indeed ... here is what Mark Rein has to say about Unreal Engine 3 games ...
[in PART]
[in PART]

http://www.gameinformer.com/News/Story/200701/N07.0126.1423.58507.htm?Page=1
MGD 07: Mark Rein Interview

GI: Will UT3 have DX10 right out of the box?
Rein: Absolutely. It'll support DX10.

. . .

GI: Unreal has always scaled really well, from low-end hardware all the way to the high end. Where do you think the sweet spot is? What do you think it takes for a rig to be able to put Unreal through all of its paces?

Rein: We always aim Unreal for systems that people don't have yet. (laughs) Whether it's UT or any Unreal game, so I think the sweet spot has yet to show up. Again, it's 64-bit and a ton of RAM, like dual NVIDIA 8800s and a Core 2 Extreme Quad processor - you could certainly build a super rig, but UT3 with everything turned up all the way is still going to struggle on that kind of thing. A year from now, it'll still be a game that is a showcase game for whatever hardware you happen to be getting then. That's normal. That's exactly the way we've done it every time from the original. The format hasn't changed there. But you're right, we try very hard to make sure it runs well on what the average gamer has. It'll definitely be hard to reach the bottom this time
:p

the LCD is getting higher ... and quickly

they are gonna dump the PC-HW bottom-feeders :p


try not to cast the first stone, because they often bounce back and cause brain damage
LoL

i'd like to rip that off for a possible sig, Darrvid
:D