Rollo's 6600GT SLI benches


HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Originally posted by: WT
Call me comfortably numb for using my current graphics settings, but I get very acquainted with seeing the screen a certain way and I don't change it ... ever ...
So, on a 19" monitor with plenty of vid card, would I gain ANY advantage in running a game (Enemy Territory mainly) at 16x12? Considering I'm at 1024x768 now, what will I see that makes me accept it as my new standard? Pardon me if this is a dumb question.

I just made the transition from 1024x768 to 1280x1024 in all of the games I currently have installed, so I have an idea what you mean when you say comfortable. The difference? Finer onscreen detail. This equates directly to a sense of immersion, at least for me. Immersion and suspension of disbelief go hand-in-hand with allowing you to feel more entwined with the environment around you.

What will "make" you accept a new standard? Nothing, that I know of. You have to make that decision for yourself, based on your available resources, how committed you are to gaming, what games you want to play and how you want to play them. I can tell you what drives me though, and that's seeing what's coming up next. Currently I'm most interested in BF2 and S.T.A.L.K.E.R. I want to be able to play them at the detail level I've seen in the AVI's and screens. If I have to upgrade I will, even if that means SLI. I guess this is hardcore gaming in a nutshell. :)


 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: fbrdphreak
It's basically a once-you-try-you-can-never-go-back sort of thing.
I never needed vsync until I tried it and noticed tearing.
I never used AA until I had a video card powerful enough to be able to use it, and now I notice jaggies very easily and must have at least 2x enabled.
Same thing with AF... after using it for a while, going back to having it off made games feel blurry. So now I must have AF on as well.
As far as higher resolutions...that's the only thing I'm having a hard time adopting, due to my eyesight. Things just look so small when my screen is set higher than 1024x768, but when I'm playing a game I can manage 1280x960 now with my new monitor, and it's fine on my eyes and looks better to boot.

To directly answer your question...if you can maintain a comfortable refresh rate AND playing experience at that resolution, give it a try. You may never want to go back.

I hear the AA/AF thing, but to be even more OT than this already is, how big of a diff does vsync make to you? I've never tried it because everything I hear says that it reduces performance and can cause compatibility issues. Right now I'm playing HL2 (again) @ 16x12 4xAA 8xAF, Doom 3 (if I ever feel like reinstalling it :roll:) @ 16x12 4xAA 8xAF, and just now Chronicles of Riddick @ 16x12 Shader 2.0 (I'm excited, this game looks sweet!). Recommendations? Thanks

Vsync makes all the difference in the world for me, but I am an extremely picky person. I notice tearing the moment it is off. My games feel "smoother" to me with vsync on. That's merely subjective - I'm not getting higher framerates or anything, but the feel factor is definitely there. However, performance can be an issue. The only game that's stopping my softmodded 6800NU right now is Counter-Strike: Source on de_dust2, around bombsite A. I don't know what it is about that particular spot (abundance of floating particles?), but my video card cannot render as many frames as I have my refresh rate set to (1280x960 @ 85Hz), so my framerate drops in half to 42fps whenever I'm in bombsite A. The drop itself is highly irritating and disruptive, but once I'm in the 42fps zone I can usually manage, as a steady 42fps isn't too bad by my standards. It's steady, and that's what makes it tolerable. But, like I said, the initial dip is brutal. My card is able to keep up in all other games, so it's mostly just a few maps in CS:S that cause me problems. As far as compatibility issues, I've not run into a single one.
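The halved framerate Avalon describes is the classic double-buffered vsync penalty: the display only flips on a refresh boundary, so a frame that misses one interval has to wait for the next. A minimal sketch of that arithmetic in Python, assuming plain double buffering (triple buffering or render-ahead queues would behave differently):

import math

def vsync_fps(render_fps, refresh_hz):
    # With double-buffered vsync, each frame occupies a whole number of
    # refresh intervals, so the output rate snaps to refresh_hz / n.
    intervals = math.ceil((1.0 / render_fps) / (1.0 / refresh_hz))
    return refresh_hz / intervals

print(vsync_fps(90, 85))  # 85.0 -> the card keeps up, locked to the refresh rate
print(vsync_fps(80, 85))  # 42.5 -> one missed interval halves the rate (the "42fps" dip)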

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Isn't that what I essentially said? The question was of price/performance, not overall absolute performance. Do you think that $10K for something only slightly faster than two 6800 Ultra cards would ever offer remotely comparable price/performance? $10K vs. $1K, for not much faster? *boggle*

Paying a ~115% price premium for a ~70% performance increase at the top end of PC hardware would normally be considered a raging bargain. In the processor market you tend to pay ~600% premiums for ~15% performance boosts, so why not go off on Intel and AMD constantly? Their crime is far worse than nV's, using that as a baseline.
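To put Ben's ratios side by side (his rough figures, not measured data), the cost of each extra percent of performance works out like this:

def premium_per_percent(price_premium_pct, perf_gain_pct):
    # Percent of extra price paid for each 1% of extra performance.
    return price_premium_pct / perf_gain_pct

# ~115% more money for ~70% more GPU performance (SLI),
# vs. ~600% more money for ~15% more CPU performance (top-end CPUs).
print(premium_per_percent(115, 70))   # ~1.6% of price per % of performance gained
print(premium_per_percent(600, 15))   # ~40% of price per % of performance gained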

Of course the highest-end is dominated by multi-chip solutions. But they also have a much greater associated cost. Should the mainstream market have to accept that different pricing model, with a lower price/performance ratio than what currently exists in the mainstream market? That's what I meant when I spoke of NV sending the entire graphics market back in time by quite a few years.

Who is being forced into anything, in any way?

I'm not sure why personal attacks are needed here.

Come on man, I had the emoticon there and everything.

Should the "mainstream" of the market, now be centered around the lowest-end of the price/performance curve? Is that really needed? That's what my "agenda" is, and what my rant was about. No "SLI tax" for me, please.

So you are upset that your budget has left you in the dust when compared to the highest end offerings? Is that what this is about? On a realistic level, all of those SLI buyers are doubling up the margins for nVidia, and soon ATi; increased margins mean bigger R&D budgets, and faster advancement of the mainstream parts.

Not certain how the V1 factors into this, as its primary native API was Glide, and only later did it support OpenGL and lastly Direct3D. I'm certain that there were OpenGL games, although most were probably on Macs and SGI workstations. PC gaming was still running primarily on 2D cards, and all early 3D games were based on software renderers, only later retrofitted to support 3D hardware acceleration.

The first real OpenGL based game was GLQuake, which came out after the V1. Having a high end workstation board for either a software rendered or 2D game wasn't going to do anything for you. Macs had utterly horrific OpenGL support, btw; I was working with them as my primary platform back then and they were pretty much unusable.

But the point of the comment was that, even back then, the "high-end" of 3D acceleration hardware did exist, so why the contrast between then and now, in terms of apparent market desire to support/purchase those high-end solutions?

What contrast? Back then the pro level boards utterly sucked for gaming, and if you check out 3DLabs/Intergraph or IRIX based machines the same holds true. Only ATi's and nV's parts are worth anything in the gaming market. Go ahead and drop $2K on a Wildcat; you will get GF2 levels of performance in most games (well, maybe not quite that bad, but nothing remotely close to the big two's parts).

Because that's really what's going on here, in NV-land. Hopefully within six to twelve months, market competition will right this wrong, and "NV SLI" will just be a bad memory, just like the ATI Rage MAXX was six months after it was released, and the additional "SLI tax" will be gone too.

ATi is following nVidia with multi-GPU setups. AGP didn't allow the setups to work properly; PCIe does.

You do have a good point about the giant conspiracy of the developers and graphics card manufacturers though. I heard they are going to use this money to begin financing the Russian Mafia's plans to confiscate nuclear weapons and obliterate Area51 as the aliens have been using that facility to manipulate the world leaders after they are reprogrammed to listen to the lower level signals being sent over NPR :p
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Ben Skywalker:

You had asked about higher resolution, no AA/AF benches:

HL2 19X14X32, all high, reflect world

at_c17__12_rev7...48fps
at_canals_08_rev7...34fps
at_coast_05_rev7...79fps
at_prison_05_rev7...68fps

Canals was sad, but the rest of them ran smooth. Is that what you're after?

BTW- I was surprised at how nice it looked, although I shouldn't have been after the Far Cry experience.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,574
10,211
126
Originally posted by: BenSkywalker
Paying a ~115% price premium for a ~70% performance increase at the top end of PC hardware would normally be considered a raging bargain. In the processor market you tend to pay ~600% premiums for ~15% performance boosts, so why not go off on Intel and AMD constantly? Their crime is far worse than nV's, using that as a baseline.
Well, other than not getting my point when I made my original statement - and thus this part of the discussion diverged a bit - what if both Intel and AMD dropped their lower-end ("mainstream") CPU chips from their roadmaps entirely, and sold only P4EEs and AMD64 FXs, because the profit margins on those were higher?
Originally posted by: BenSkywalker
Who is being forced into anything, in any way?
Uhm, the buyers of graphics hardware, based on the required demands of the software developers (game developers).

IOW, if OS vendors started designing OSes that required a P4EE or an AMD64 FX to perform at even basic levels... wouldn't that effectively "demand" that users of those OSes also purchase the required hardware platform to support it?
Originally posted by: BenSkywalker
Should the "mainstream" of the market, now be centered around the lowest-end of the price/performance curve? Is that really needed? That's what my "agenda" is, and what my rant was about. No "SLI tax" for me, please.
So you are upset that your budget has left you in the dust when compared to the highest end offerings? Is that what this is about? On a realistic level, all of those SLI buyers are doubling up the margins for nVidia, and soon ATi; increased margins mean bigger R&D budgets, and faster advancement of the mainstream parts.
Again, this reinforces that you didn't even get my original point - it was centered around the price/performance curve argument, and you simply ignored that and started discussing the max performance argument, ignoring cost. (Which is why you suggested a $10K budget, which is 10X what current top-end consumer solutions cost.)

And you admit that SLI systems double up NV's profit margins - how are you so certain that NV doesn't plan for SLI to be a new, permanent and required feature? One designed specifically to extract 2X the profit from the customer base? With very little actual gain to the customer - if they want to stay in the "mid-range" segment, they now have to buy 2X mid-range cards, not just one.

And it has nothing to do with my budget compared to the cost of the highest-end offerings - the problem is the eventual total erosion of all of the market segments below the "high-end" one. We've already witnessed that the vast majority of current graphics-card developments, in terms of products being sold on the market, are nearly all at the high-end, with a few token low-end offerings. There are almost *no* good new mid-range gaming card offerings - hence the continued longevity and usefulness of GF4 Ti 4x00 and Radeon 9500/9700/9800 cards. Where are the mid-range cards, at mid-range prices, based on newer chipsets? The GF 6600 is about the only one out there. (Unless you consider the X300 or the 6200TC to be mid-range cards. I don't; I consider them part of the low-end/budget segment.) The fact that the "best" mid-range cards today are actually the high-end cards of yesteryear should tell you something about the rate of recent developments in that market segment.

Originally posted by: BenSkywalker
But the point of the comment was that, even back then, the "high-end" of 3D acceleration hardware did exist, so why the contrast between then and now, in terms of apparent market desire to support/purchase those high-end solutions?
What contrast? Back then the pro level boards utterly sucked for gaming, and if you check out 3DLabs/Intergraph or IRIX based machines the same holds true. Only ATi's and nV's parts are worth anything in the gaming market. Go ahead and drop $2K on a Wildcat; you will get GF2 levels of performance in most games (well, maybe not quite that bad, but nothing remotely close to the big two's parts).
You are comparing today's cards with yesteryear's (and today's) workstation cards. I was talking about the time when there effectively were no consumer-level 3D cards available, and the only (and, relatively speaking, expensive) hardware-accelerated solution was a workstation-class graphics card.

Basically, my argument is about the current erosion of the mid-range market, and the apparent consumer willingness to go along with that and allow themselves to be forced to purchase hardware from the "high-end" market segment in order to obtain acceptable performance from their current software.
Originally posted by: BenSkywalker
Because that's really what's going on here, in NV-land. Hopefully within six to twelve months, market competition will right this wrong, and "NV SLI" will just be a bad memory, just like the ATI Rage MAXX was six months after it was released, and the additional "SLI tax" will be gone too.
ATi is following nVidia with multi-GPU setups. AGP didn't allow the setups to work properly; PCIe does.

You do have a good point about the giant conspiracy of the developers and graphics card manufacturers though. I heard they are going to use this money to begin financing the Russian Mafia's plans to confiscate nuclear weapons and obliterate Area51 as the aliens have been using that facility to manipulate the world leaders after they are reprogrammed to listen to the lower level signals being sent over NPR :p
Ben, please don't be an ass. You and I both know (I hope you do, anyway) that these days, graphics card hardware developers work hand-in-hand with the software developers. There is no malicious "giant conspiracy" at work here - simply the real world as it works today. Some of that work involves deals cut between those hardware mfgs and the game developers, with an eye towards increasing the market share of those mfgs. Or did you think the "The Way It's Meant To Be Played" logos/animations were just added to games as a running gag?

PS. There's nothing inherent technically that prevents multiple GPUs from being put onto a card with an AGP system interface. (Whether it is cost-effective to develop the necessary bridge silicon, compared to the volume of multi-GPU cards that would utilize it, is another question entirely, and is largely the reason such solutions have been rare. In fact, weren't the 3Dfx "Rampage" boards based on just such a configuration?)
 

Muhadib

Member
Jan 11, 2005
168
0
0
I went for an SLI board because I saw a trend of CPU performance not increasing over time as greatly as it once did, and games limited by GPU power more than anything else. The need to upgrade my motherboard, along with those two factors, made it sensible to spend some money now to get ahead of what future games will inevitably require.

Bottom line: Software will expect more from hardware in the future. It always does. But this time, CPU progress is slowing down and GPU power is steadily increasing over time.

I've got a 6600GT overclocked well (see sig); it was 180 bucks and it runs 3D as fast as I need for now. When 6800s go down in price, I'll have a good upgrade path should I need it.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Thanks Rollo, those are the kind of numbers I've been looking forward to seeing. It appears that 6600GTs in SLI don't have quite enough juice to push the really high resolution settings that I like to use for HL2, likely an issue with their 128MB of RAM.

Larry-

Well, other than not getting my point when I made my original statement - and thus this part of the discussion diverged a bit - what if both Intel and AMD dropped their lower-end ("mainstream") CPU chips from their roadmaps entirely, and sold only P4EEs and AMD64 FXs, because the profit margins on those were higher?

What if ATi started charging $500,000 for their low end parts and went up from there in terms of prices? Your point isn't being understood because it makes no sense whatsoever - what if the basics of capitalism one day fall simply due to the will of nVidia or ATi? Not likely.

Uhm, the buyers of graphics hardware, based on the required demands of the software developers (game developers).

What game today won't run on a Ti4200? You need to list those first to see if that line of discussion has any merit whatsoever.

Again, this reinforces that you didn't even get my original point - it was centered around the price/performance curve argument,

Which is precisely my end of the discussion. GPU price/performance factors are significantly in favor of the consumer when compared to any other element of PC technology. SLI makes this even more pronounced, giving a price/performance level at the top end that is completely out of the question in any other area of PC hardware.

And you admit that SLI systems double up NV's profit margins - how are you so certain that NV doesn't plan for SLI to be a new, permanent and required feature?

Because capitalism works.

We've already witnessed that the vast majority of current graphics-card developments, in terms of products being sold on the market, are nearly all at the high-end, with a few token low-end offerings.

This is some of the most ignorant drivel I have seen out of you. If you honestly think that, then you don't know anything at all about the graphics card market. The high end parts still constitute a minuscule portion of the total market, Intel is still the largest player in the graphics market, and nV's low-end products (5200, 6200) significantly outsell both their 6600 and 6800 series of parts combined. Honestly, if you think the high end parts sell remotely close to their low end counterparts, you don't have a clue about how the graphics industry works. We aren't talking a little bit here either; these are orders of magnitude issues.

There are almost *no* good new mid-range gaming card offerings - hence the continued longevity and usefulness of GF4 Ti 4x00 and Radeon 9500/9700/9800 cards. Where are the mid-range cards, at mid-range prices, based on newer chipsets? The GF 6600 is about the only one out there.

The same evil company that brought SLI is the one making the current worthwhile mid range solution. I honestly don't get what you are talking about. The 6600GT launched at $200, 50% less than the 9700/9800, and was reasonably faster than the prior gen's highest end part. How is this not a good thing on the high end? The 6200 is by far the best part we have seen for less than $100, again easily destroying the previous baseline offerings. How is this not significant improvement?

The fact that the "best" mid-range cards today, are actually the high-end cards of yesteryear, should tell you something about the rate of recent developments in that market segment.

They aren't. The 6600GT is the best mid range card (assuming you are talking enthusiast) regardless. Up until quite recently, mid range cards had always been the previous gen's high end offerings with essentially no changes.

I was talking about the time when there effectively were no consumer-level 3D cards available, and the only (and, relatively speaking, expensive) hardware-accelerated solution was a workstation-class graphics card.

There were no hardware accelerated games at that time.

Some of that work involves deals cut between those hardware mfgs and the game developers, with an eye towards increasing the market share of those mfgs. Or did you think the "The Way It's Meant To Be Played" logos/animations were just added to games as a running gag?

Like TR:AoD? Please, just because they work on promotional deals in no way whatsoever means that technical limitations are going to be artificially put in place(well, maybe TwinkieMan :p ).

There's nothing inherent technically that prevents multiple GPUs to be put onto a card that has an AGP system interface. (Whether it is cost-effective to develop the necessary bridge silicon, compared to the volume of multi-GPU cards that would utilize it, is another question entirely, and is largely the reason that such solutions have been rare. In fact, weren't the 3Dfx "Rampage" boards based on just such a configuration?)

Yes, there are technical limitations, which is why the bridge chip is needed. Even then, no multi-rasterizer chip ever worked properly in an AGP slot using an AGP mode under WinNT based OSs.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Ben:
Thanks Rollo, those are the kind of numbers I've been looking forward to seeing. It appears that 6600GTs in SLI don't have quite enough juice to push the really high resolution settings that I like to use for HL2, likely an issue with their 128MB of RAM.

Perhaps 6800NUs for higher res, bargain SLI?
 

BrokenVisage

Lifer
Jan 29, 2005
24,771
14
81
The only issue I'm tapping the brake on before I go for an SLI board is the unknown of, say, 2 6800GTs being enough to run Unreal Engine 3 when it becomes available. I saw some pretty sad estimated results of what an Ultra would look like running it, so I hesitate to think how far a $500+ SLI configuration will take me, because I'd feel pretty crappy playing UE3 at 800x600 just to get 60 fps, you know? I'd rather just pay for a half-decent card now, then play my cards right and fish out a deal for an even better card once I can see results from benches. Can I get a second opinion on how 2 6800GTs might handle that beast of a graphics engine?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: BrokenVisage
The only issue I'm tapping the brake on before I go for an SLI board is the unknown of, say, 2 6800GTs being enough to run Unreal Engine 3 when it becomes available. I saw some pretty sad estimated results of what an Ultra would look like running it, so I hesitate to think how far a $500+ SLI configuration will take me, because I'd feel pretty crappy playing UE3 at 800x600 just to get 60 fps, you know? I'd rather just pay for a half-decent card now, then play my cards right and fish out a deal for an even better card once I can see results from benches. Can I get a second opinion on how 2 6800GTs might handle that beast of a graphics engine?

I doubt you can get any opinions, because only the people making it would have seen it, and it won't be optimized at all yet anyway.

In any case, saying "I'm waiting for UE3" makes no sense as you don't know when/if it will come out.

<remembers all the "I'm getting a V5 for Duke Nukem Forever" posts>
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: Rollo
Originally posted by: BrokenVisage
The only issue I'm tapping the brake on before I go for an SLI board is the unknown of, say, 2 6800GTs being enough to run Unreal Engine 3 when it becomes available. I saw some pretty sad estimated results of what an Ultra would look like running it, so I hesitate to think how far a $500+ SLI configuration will take me, because I'd feel pretty crappy playing UE3 at 800x600 just to get 60 fps, you know? I'd rather just pay for a half-decent card now, then play my cards right and fish out a deal for an even better card once I can see results from benches. Can I get a second opinion on how 2 6800GTs might handle that beast of a graphics engine?

I doubt you can get any opinions, because only the people making it would have seen it, and it won't be optimized at all yet anyway.

In any case, saying "I'm waiting for UE3" makes no sense as you don't know when/if it will come out.

<remembers all the "I'm getting a V5 for Duke Nukem Forever" posts>

Fear not, DNF will be out soon ;)
 
Anthony The Daddy

Feb 18, 2005
32
0
0
Originally posted by: Rollo

3DMark2001SE 10X7 0x0x = 19833
3DMark2005 10X7 0x0x = 5277(!)
(my X800XT PE was 5825)

What I noticed most in these runs was the huge performance leap at what I consider this rig's sweet spot, 12X10 2X8X, compared to 12X10 4X8X.

In any case, I am convinced SLI is the way forward; my five-year-old has my AGP X800 XT PE, and life with SLI has been good so far.

My 6800U at stock speed is beating you, and with a decent overclock it completely blows you away. I am beating you in every single benchmark you have here. I do not know what brand of 6800U you are talking about this setup beating at any setting, because it sure would not beat mine in any scenario.

How is that a huge performance leap?

Looks like no performance leap at all from where I am sitting.

I am not flaming you or trying to put you down at all.

I just do not get it.

You should sell your X800XTPE, your PSU and both of your 6600GTs on eBay and get dual 6800GTs and a 600 watt Enermax PSU. I did the math, and roughly it is almost an even amount.

Now that would be something. That would be the benchmark scores to brag about. You would be stomping the guts out of quite a few machines.

Dual 6600GTs is a so-so setup and a waste of money in my opinion. I am a go big or stay home kind of guy. If they had SLI and the FX-55 out when I built this rig, I would be running an A8N-SLI Deluxe, an FX-55 and a pair of 6800 Ultras.


I am not buying anything new until dual core CPUs are out and perfected. Then I may just have to sell this machine on eBay and build a new 1337 pimp rig.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Anthony The Daddy


My 6800U at stock speed is beating you, and with a decent overclock it completely blows you away. I am beating you in every single benchmark you have here. I do not know what brand of 6800U you are talking about this setup beating at any setting, because it sure would not beat mine in any scenario.

How is that a huge performance leap?

Looks like no performance leap at all from where I am sitting.

I am not flaming you or trying to put you down at all.

I just do not get it.

You should sell your X800XTPE, your PSU and both of your 6600GTs on eBay and get dual 6800GTs and a 600 watt Enermax PSU. I did the math, and roughly it is almost an even amount.

Now that would be something. That would be the benchmark scores to brag about. You would be stomping the guts out of quite a few machines.

Dual 6600GTs is a so-so setup and a waste of money in my opinion. I am a go big or stay home kind of guy. If they had SLI and the FX-55 out when I built this rig, I would be running an A8N-SLI Deluxe, an FX-55 and a pair of 6800 Ultras.


I am not buying anything new until dual core CPUs are out and perfected. Then I may just have to sell this machine on eBay and build a new 1337 pimp rig.

Your advice on what I should do with my hardware is noted.

My five-year-old's A64 3000+/XTPE rig is approximately equal to your Intel platform 6800U rig; if I need to game at that level, I'll use it. :)

I guess I didn't get an FX55/6800U SLI rig for the same reason I drive a Silverado 4X4 and not a Cadillac Escalade - choice of how to spend money and bang for buck?

 

ahurtt

Diamond Member
Feb 1, 2001
4,283
0
0
I got 2 6600GTs in my A8N-SLI dlx with a Winchester A64 3200+ @ 2.3GHz, and I did it for basically one reason: curiosity. I wanted to try SLI, but I didn't want to spend like $800-$1000 on 2 video cards when I could easily build an entire system for that price! So I got 2 6600GT OC cards from BFG and actually spent less at the time than it would have cost to buy 1 6800GT. With this system I am upgrading from a P4 2.53GHz 8x AGP system with a 128MB GeForce 5900XT, and for me, the difference is incredible. This is how I always wanted to game, and I am satisfied that for now, 2 6800s would have been way overkill. I am not a guy who needs to run at 16x12x8x4; I am happy at 11x8x2. In light of Rollo's benchmarks maybe I'll try upping it a little and see how my system fares, but for me anything above and beyond what this setup is already delivering is just icing on the cake. And I didn't break the bank, and I got to try something new to me with SLI.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,574
10,211
126
Originally posted by: BenSkywalker
Your point isn't being understood because it makes no sense whatsoever - what if the basics of capitalism one day fall simply due to the will of nVidia or ATi? Not likely.
My point was that price/performance, for the non-enthusiast gaming market, is as important as or more important than performance alone. I don't know what the heck you are rambling about with the fall of capitalism due to ATI or NV.
Originally posted by: BenSkywalker
Uhm, the buyers of graphics hardware, based on the required demands of the software developers (game developers).
What game today won't run on a Ti4200? You need to list those first to see if that line of discussion has any merit whatsoever.
Anything that requires DX9-level support in hardware won't run. Of course, most commercial games today also implement a DX8 path, just so they won't exclude too many people from their potential customer base, but that won't always be the case. Whether the developer chooses to implement alternate code-paths can largely depend on developer resources, similarity between different API levels, and the difficulty, expense, and relative sizes of the installed base of each generation of hardware. (For example, how many games coded today assume that gaming cards support hardware T&L? I would assume all of them. Soon that will be true for other features as well, and eventually prev-gen cards, no matter what their performance level, become unsupported by current games because of their lack of certain features.)

Doom3 was a well-documented exception to the rule - it was coded such that it would even run on a GF2MX, although doing so made the engine's design far more CPU-heavy than engines that put more of the performance "weight" onto the video card itself (FarCry, for one example). It can also affect what the artists are allowed to create with the art assets - if the game is designed to be playable on the least-common-denominator feature-set, then the artists have to keep that in mind, and cannot be as free with their usage of high-end features. (At least not without expanding the art asset budget and creating two sets of assets. So the issue of card feature levels can figure into more than just the technical side of a game's development costs.)
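A sketch of the fallback logic Larry is describing - an engine probing the card's capabilities and choosing the richest code-path it supports, or refusing to run once the feature floor rises. The capability fields and path names here are hypothetical, purely illustrative:

from dataclasses import dataclass

@dataclass
class CardCaps:
    hw_tnl: bool          # hardware transform & lighting (GeForce 256 onward)
    shader_model: float   # 0.0 = fixed-function, 1.x = DX8-class, 2.0+ = DX9-class

def pick_render_path(caps):
    # Choose the best supported path, mirroring the DX9/DX8 split above.
    if caps.shader_model >= 2.0:
        return "dx9_path"        # the full effects the artists authored for
    if caps.shader_model >= 1.1:
        return "dx8_path"        # reduced effects; a second art/code budget
    if caps.hw_tnl:
        return "fixed_function"  # the Doom3-style GF2MX floor
    raise RuntimeError("unsupported: engine assumes hardware T&L")

print(pick_render_path(CardCaps(hw_tnl=True, shader_model=1.3)))  # a Ti4200 gets 'dx8_path'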

Originally posted by: BenSkywalker
Again, this reinforces that you didn't even get my original point - it was centered around the price/performance curve argument,
Which is precisely my end of the discussion. GPU price/performance factors are significantly in favor of the consumer when compared to any other element of PC technology.
Well, that comparison with any other area of general PC hardware is somewhat outside of the context of this discussion, but I did bring up the question of CPU performance relative to the demands of the software running on the system, so that much at least is fair game.

But I personally find it difficult to believe that a multi-GPU or multi-card solution could possibly be more price/performance efficient than a single-GPU/single-card solution with the same level of capability.

I don't deny that "the market will eventually find the more efficient solution" - in fact, I do indeed feel that was one of the major strikes that caused 3Dfx's demise in the market. And NV is now falling into the same trap. The "curse of 3Dfx" idea might well have merit after all.

Originally posted by: BenSkywalker
SLI makes this even more pronounced, giving a price/performance level at the top end that is completely out of the question in any other area of PC hardware.
Again, you keep holding on to your "top end" argument, when I was arguing along the price/performance curve. We're not even discussing the same curve here, never mind the same area along it! (Which is why I suggested that you seemed to fail to understand my original statement referencing price/performance.)

Originally posted by: BenSkywalker
And you admit that SLI systems double up NV's profit margins - how are you so certain that NV doesn't plan for SLI to be a new, permanent and required feature?
Because capitalism works.
So you are suggesting that the market will indeed eliminate SLI solutions eventually? (Which logically follows from your apparently arguing against them becoming a "permanent and required feature".) I guess that would mean you're agreeing with me, or at least hopeful of the same outcome - that SLI isn't a useful long-term solution, at least for the mid-range market. :)

Originally posted by: BenSkywalker
We've already witnessed that the vast majority of current graphics-card developments, in terms of products being sold on the market, are nearly all at the high-end, with a few token low-end offerings.
This is some of the most ignorant drivel I have seen out of you. If you honestly think that, then you don't know anything at all about the graphics card market. The high end parts still constitute a minuscule portion of the total market, Intel is still the largest player in the graphics market, and nV's low-end products (5200, 6200) significantly outsell both their 6600 and 6800 series of parts combined. Honestly, if you think the high end parts sell remotely close to their low end counterparts, you don't have a clue about how the graphics industry works. We aren't talking a little bit here either; these are orders of magnitude issues.
Obviously you have difficulty following the context of the discussion here. In terms of sheer sales volume, of course the low-end/budget integrated graphics solutions sell in far greater numbers. I was talking about the rate of development, meaning the number of various models introduced. I didn't even bring up sales volumes, which had nothing to do with my point.

(I will give you a point though, Ben: you are a master of the "straw man" argument, aren't you? Introducing something that I never brought up, and then seemingly shooting it down, as if I were the one that mentioned it. No dice for you this time though.)

Additionally, the context of the discussion here was gaming cards. Things like FX 5200s and Intel integrated graphics are not even remotely usable as gaming cards, and so were not part of the context contrasting low-end gaming cards with the mid- and higher-end ones. The variety of different models is clearly more diverse near the high-end, almost non-existent currently at the mid-range, and then there are the low-end cards - still not a lot of variety there.

Originally posted by: BenSkywalker
There are almost *no* good new mid-range gaming card offerings - hence the continued longevity and usefulness of GF4 Ti 4x00 and Radeon 9500/9700/9800 cards. Where are the mid-range cards, at mid-range prices, based on newer chipsets? The GF 6600 is about the only one out there.
The same evil company that brought SLI is the one making the current worthwhile mid range solution. I honestly don't get what you are talking about. The 6600GT launched at $200, 50% less than the 9700/9800, and was reasonably faster than the prior gen's highest end part. How is this not a good thing on the high end? The 6200 is by far the best part we have seen for less than $100, again easily destroying the previous baseline offerings. How is this not significant improvement?
Note the highlighted word from your statement - "the". The simple fact is that there is currently a lack of choices among gaming cards in the mid-range market. In a supposedly highly-competitive market, why is this? And again, you bring up the high-end; I'm talking about the mid-range (primarily), and price/performance comparisons vis-à-vis SLI systems and single-card/GPU solutions.

The 6200 (TurboCache) is an interesting development, and I don't really know enough about it yet to comment. What market segment is it aimed at? Low-end/non-gaming (desktop use), low-end gaming, mid-range gaming? (I know it's nowhere near high-end gaming.) Considering that it seems primarily designed to cut costs by reducing the amount of memory needed onboard, I would probably rate it low-end. Does it give acceptable frame-rates and features in today's games? Is it faster than a Ti 4200/4400 card, or a R9700/9800? If not, then it must be a low-end card, since those prev-gen high-end cards only offer what I would consider mid-range gaming performance today. Which still leaves the 6600GT sitting alone as the only viable mid-range gaming card being sold today with an up-to-date feature-set.

Originally posted by: BenSkywalker
The fact that the "best" mid-range cards today, are actually the high-end cards of yesteryear, should tell you something about the rate of recent developments in that market segment.
They aren't. The 6600GT is the best mid range card (assuming you are talking enthusiast) regardless. Up until quite recently, mid range cards had always been the previous gen's high end offerings with essentially no changes.
That was part of my point, and the reason I put "best" in quotes - the mfgs have essentially stopped development of chips aimed at the mid-range gaming market, instead seemingly focusing only on the high-end, and letting prev-gen cards trickle down to the mid-range. That might be an acceptable market solution purely in terms of performance, but when new APIs get released (SM3.0 being a big one), and new games are developed to support or, more likely, require those features, that renders those prev-gen high-end cards (now considered mid-range for gaming performance) effectively obsolete and useless. Thus the need for cards specifically designed for the mid-range segment in terms of performance, but including the newest features, in order to make them viable for playing the newest games.

The fact that the 6600GT is essentially the "only" de-facto choice for a mid-range gaming card with the newest (SM3.0) feature-set, in a market as competitive as the video-card GPU market, should tell you something right there about the seeming lack of development going on in the mid-range gaming market.

I guess what I'm trying to say is that prev-gen high-end cards may be performance-competitive as mid-range gaming cards, but they aren't necessarily going to be feature-competitive, especially at those points where the major APIs change. Thus the need for continued development of current-gen (in terms of features), mid-range (in terms of performance) gaming cards. (Unless the mfgs want to cut their R&D costs and effectively eliminate the mid-range gaming market segment.)

Originally posted by: BenSkywalker
I was talking about the time when there effectively were no consumer-level 3D cards available, and the only (and, relatively speaking, expensive) hardware-accelerated solution was a workstation-class graphics card.
There were no hardware accelerated games at that time.
Although I admit my example of OpenGL gaming on workstation-class cards was a bit arbitrary, I believe you are wrong there; SGI workstations had several games available to play, although I don't believe any of them were sold commercially as separate software. I'll have to dig up some more on that - I'm certain I've seen something about it. I believe Spectre and Netrek evolved from those. I'll concede that my example was ill-conceived, though, in relation to the current commercial PC games market.

Originally posted by: BenSkywalker
Some of that work involves deals cut between those hardware mfgs and the game developers, with an eye towards increasing the market share of those mfgs. Or did you think the "The Way It's Meant To Be Played" logos/animations were just added to games as a running gag?
Like TR:AoD? Please, just because they work on promotional deals in no way whatsoever means that technical limitations are going to be artificially put in place(well, maybe TwinkieMan :p ).
Who the hell is "TwinkieMan"? Are you referring to Gabe Newell? (Interestingly, I have a friend who looks just like him; I had to do a double-take when I was reading some of the HL2 review material to make sure my friend wasn't actually the author. Scary. :p )
But you are somehow steadfastly suggesting that game devs don't often specifically optimize some features of their games for a particular mfg's hardware? LOL.

Originally posted by: BenSkywalker
There's nothing inherent technically that prevents multiple GPUs from being put onto a card with an AGP system interface. (Whether it is cost-effective to develop the necessary bridge silicon, compared to the volume of multi-GPU cards that would utilize it, is another question entirely, and is largely the reason such solutions have been rare. In fact, weren't the 3Dfx "Rampage" boards based on just such a configuration?)
Yes, there are technical limitations, which is why the bridge chip is needed. Even then, no multi-rasterizer chip ever worked properly in an AGP slot using an AGP mode under WinNT based OSs.
Ahh, now the caveat appears. That's more of a limitation of NT's driver architecture than anything else, certainly not any inherent limitation on the hardware end of things. IIRC, ATI's official excuse for never releasing NT drivers for their Rage MAXX card was that NT's architecture somehow prohibited them from implementing them, but personally I think that was just an excuse. If you recall, ATI's NT driver offerings were very immature across their entire product line at the time (AIW cards only functioning as video cards, no multimedia features, etc.). So ATI probably didn't feel it would be cost-effective to put their driver engineering resources into producing an NT driver for a card that would have been obsolete by the time the drivers were done anyway. (Since they were hard at work on the Radeon in-house at the time.)

I'm pretty certain that there were high-end workstation AGP solutions with multiple GPU chips at the time. The existence of such things (for the workstation market) is what motivated Intel to design the AGP Pro specification, to support the power and cooling requirements estimated for such cards. They just largely didn't exist in the consumer market, because of the inherently poor price/performance ratio of those sorts of solutions.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,574
10,211
126
Originally posted by: Rollo
In any case, saying "I'm waiting for UE3" makes no sense as you don't know when/if it will come out.

<remembers all the "I'm getting a V5 for Duke Nukem Forever" posts>

LOL. Has anyone checked up on the DNF developers lately? They might all be dead from some sort of horrible accident, and the world might never know, and just keep on hoping that the game will be released eventually.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,574
10,211
126
Originally posted by: Avalon
Originally posted by: Rollo
<remembers all the "I'm getting a V5 for Duke Nukem Forever" posts>
Fear not, DNF will be out soon ;)
Want to start a pool on whether or not DNF will be released before NV makes right on the PVP-less 6800 cards? Personally, I put my bets on DNF. :p
 
Anthony The Daddy

Feb 18, 2005
32
0
0
Originally posted by: Rollo

Your advice on what I should do with my hardware is noted.

My five-year-old's A64 3000+/XTPE rig is approximately equal to your Intel platform 6800U rig; if I need to game at that level, I'll use it. :)

I guess I didn't get an FX55/6800U SLI rig for the same reason I drive a Silverado 4X4 and not a Cadillac Escalade - choice of how to spend money and bang for buck?


Actually you wasted money, did not think clearly, and wasted an X800XTPE.

Also, with how I have everything clocked, my machine would smoke a wimpy 3000+/X800XTPE.

N*gg* please.
 

user1234

Banned
Jul 11, 2004
2,428
0
0
Originally posted by: Rollo
Originally posted by: Zebo
Welcome back Rollo.

I've always said two 6600GTs were good bang for buck; despite most people's ridicule, it still offers better bang for buck than a 6800GT PCIe, because they can be had for $350 vs. $500 if you're lucky on a GT. The GT certainly doesn't perform 30% better to justify its price premium; maybe at MSRP it will.

Oh ya and PVP works as well;)

Thanks Zebo, definitely missed the community here.

HardOCP seems to think the 6600GT SLI offers good bang for buck:

http://www.hardocp.com/article.html?art=Njk2LDg=
When looking at the big picture, we can't keep from going back and focusing on our GeForce 6600 GT SLI gaming experience. It took our gaming to levels that we had not even hoped for. The average framerate numbers that we had been exposed to previously that were supplied by NVIDIA simply undervalued the real gaming experience the cards would deliver. Almost across the board, the 6600 GT SLI delivered a better quality gaming experience than did a Radeon X800 Pro or a GeForce 6800 GT. There is more bang for the buck value in a GeForce 6600 GT SLI configuration than anything comparable.


HardOCP is the suckiest, most pathetic hardware website imaginable. They always compare apples to oranges in the most irrelevant way, not to mention questionable testing methodology and, worst of all, errors in performing tests. Like recently, they declared the MSI SLI board to be 10% faster than Asus's, only to admit that they ran the MSI with automatic overclocking turned on... talk about incompetent morons.
 

user1234

Banned
Jul 11, 2004
2,428
0
0
Originally posted by: malak
Originally posted by: Rollo
Malak:

That is not bleeding edge tech. You have 2 budget cards from nvidia using old tech to make them work together. Furthermore, the performance from both combined still isn't as good as your single X800XT PE. Sure, get a couple 6800GTs... waste more money. You know, you didn't have to go do all this. There were already plenty of benchmarks out there...

Actually this stuff IS bleeding edge tech:
nForce 4 SLI- available a whole 2 months now ?
6600GTs- most advanced feature set of any video card (SM3, PVP)

SLI is old tech; it was used years ago. Tech bought by nvidia, not developed by nvidia.

Oh, and let's take a look at those advanced features... SM3... let's see... wait, no games have that. Oh, Farcry has a beta version of it in a patch. Yeah, I can see SM3 being useful. You know, the last person that pulled that "advanced" stuff on me actually posted a link to an interview with developers about it. But I guess he didn't read it himself, since the developer actually said it was useless in current games. :thumbsup:

And 2 budget cards in SLI that can't match the performance of the one card you already had - yeah, that's not a waste of money. I'm sorry, but SLI has one marketable feature: buy one really good card now that can handle everything, then 2 years from now, when it can't play things on high settings, buy another one for a very low price and you have all that performance again. Buying 2 cards now defeats the point. If they are cheap budget cards like your 6600GTs, they can barely handle new games now and won't be worth anything in 6 months. If they are 2 Ultras, they will cost way too much now for performance you can't fully appreciate for at least 6 months.


Couldn't agree more. A single 6800GT, which is overclockable to Ultra speed, is way, way better than 2x6600GT SLI. It's faster 95% of the time - usually by a huge margin, but sometimes by only about 10-15%. And btw, for a single 6800GT, AGP is slightly faster than PCIe. And btw, the socket 939 AGP boards are just as fast as the PCIe boards. And btw, the nForce3 is just as fast as the nForce4. Overall, a 939/AGP/6800GT setup would be faster and cheaper than 939/PCIe/SLI/2x6600GT. The only advantage of the latter is upgradability, but that hardly offsets the fact that it costs $150 more and provides 15% slower performance.
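Taking the poster's claims at face value (~$150 more for ~15% less performance - his numbers, not measurements, with hypothetical rig totals), the performance-per-dollar gap works out roughly like this:

def perf_per_dollar(relative_perf, price):
    return relative_perf / price

agp_6800gt = perf_per_dollar(100.0, 850.0)    # 939/AGP/6800GT baseline rig
sli_6600gt = perf_per_dollar(85.0, 1000.0)    # ~15% slower, ~$150 more
print(round(agp_6800gt, 3))  # 0.118 perf/$
print(round(sli_6600gt, 3))  # 0.085 perf/$, roughly 28% worse value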
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Anthony The Daddy

Your advice on what I should do with my hardware is noted.

My five-year-old's A64 3000+/XTPE rig is approximately equal to your Intel platform 6800U rig; if I need to game at that level, I'll use it. :)

I guess I didn't get an FX55/6800U SLI rig for the same reason I drive a Silverado 4X4 and not a Cadillac Escalade - choice of how to spend money and bang for buck?


Actually you wasted money, did not think clearly, and wasted an X800XTPE.

Also, with how I have everything clocked, my machine would smoke a wimpy 3000+/X800XTPE.

N*gg* please.


Ah, and here we get to the heart of the matter - the Epeenos Warrior wants to brag about his rig and tell everyone why he would have done it differently than the OP.

1. "Wasted money"? You mean like your P4 that offers much lower performance per dollar than a much lesser priced A64 rig? Or your 6800U that offers ~10% more performance than a GT at a 20% price premium? Or your Lian Li case that does nothing at all for performance, but costs much more than other cases? I can see I'm being schooled by the "King of Thrift"......

2. "smoking my wimpy 3000+/XTPE"? Well, as admirable as it is you're on a quest to burn up your high end parts, the fact of the matter is there aren't any settings you can run on that rig I can't run equally well on my A64 3000+/X800 XT PE. Sorry chief- aint happening.
http://graphics.tomshardware.com/graphic/20040721/coolfx_ultra-07.html
There's a 450/1200 6800U running Far Cry at the same speed as an X800 XT PE.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=2
There's a "wimpy" A64 3000+ running HL2 as fast as a P4 at 3.6GHz. (LOL)

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2149&p=7
There's a "wimpy" A64 3000+ running Doom3 as fast as a P4 3.4EE. (LOL)

So it looks like one of us needs advice on not "wasting money" Anthony, and it gives me some satisfaction to know my year old motherboard I paid $450 for back then is as fast as your $1000 Intel uber-board. ;)

Face it Anthony- my five year old is gaming as large as you are. You're thread crapping in the wrong place. :)


 
Anthony The Daddy

Feb 18, 2005
32
0
0
Originally posted by: Rollo
Originally posted by: Anthony The Daddy

Your advice on what I should do with my hardware is noted.

My five-year-old's A64 3000+/XTPE rig is approximately equal to your Intel platform 6800U rig; if I need to game at that level, I'll use it. :)

I guess I didn't get an FX55/6800U SLI rig for the same reason I drive a Silverado 4X4 and not a Cadillac Escalade - choice of how to spend money and bang for buck?

Actually you wasted money, did not think clearly, and wasted an X800XTPE.

Also, with how I have everything clocked, my machine would smoke a wimpy 3000+/X800XTPE.

N*gg* please.

1. "Wasted money"? You mean like your P4 that offers much lower performance per dollar than a much lesser priced A64 rig? Or your 6800U that offers ~10% more performance than a GT at a 20% price premium? Or your Lian Li case that does nothing at all for performance, but costs much more than other cases? I can see I'm being schooled by the "King of Thrift"......

2. "smoking my wimpy 3000+/XTPE"? Well, as admirable as it is you're on a quest to burn up your high end parts, the fact of the matter is there aren't any settings you can run on that rig I can't run equally well on my A64 3000+/X800 XT PE. Sorry chief- aint happening.
http://graphics.tomshardware.com/graphic/20040721/coolfx_ultra-07.html
There's a 450/1200 6800U running Far Cry at the same speed as an X800 XT PE.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2330&p=2
There's a "wimpy" A64 3000+ running HL2 as fast as a P4 at 3.6GHz. (LOL)

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2149&p=7
There's a "wimpy" A64 3000+ running Doom3 as fast as a P4 3.4EE. (LOL)

So it looks like one of us needs advice on not "wasting money" Anthony, and it gives me some satisfaction to know my year old motherboard I paid $450 for back then is as fast as your $1000 Intel uber-board. ;)

Face it Anthony- my five year old is gaming as large as you are. You're thread crapping in the wrong place. :)



First thing, I don't know what analogies you are using, but I am no young punk. I am a 33 yr old man that has been building and overclocking computers since the Pentium 2 and 8MB video card days.

What you are saying is simply not possible, and you are just mad that I said dual 6600GTs are mediocre at best. Sorry, but they simply are mediocre at best.

I have the hardware to back it up. When I run benchmarks I overclock to 3.95GHz stable on the Northwood 3.4 P4, only reaching 60C at load, and 460/1200 on the 6800U, only reaching around 80C. I am more than sure this would completely blow away a 3000+, even overclocked. The X800XTPE is a terrible overclocker with the stock cooling solution and in no way can hang with a 6800U overclocked or flashed to extreme speed. That is why ATI released the X850 - because they were embarrassed. I think everybody knows that, except people that hide under rocks.

And for your info, Mr. Fake 1337 Computer Tough Guy, my machine is faster than last year's Maximum PC Dream Machine, especially when I overclock for bragging benchmarks. So you really do not know what you are talking about.

Now I will talk cases.

Lian-Li is just overpriced? Now you are really showing your noob nature. Every true modder and power computer enthusiast knows that Lian-Li is top notch, if not the best. Very few cases are of this quality.

I am not a 20 yr old punk. I am a 33 yr old hardcore modder and computer enthusiast. You really do not know who you are talking to.

Wasted money??? You're just ticked again that you spent all this money and have a mediocre setup. Enough of your crying, blabbermouth putdowns, shenanigans and links.

Paid how much for a motherboard??? I bought this motherboard wholesale, through the computer business I started a few months ago, for only $120. So again, you really are showing that you do not know me or know what you are talking about.

In the words of Spike from Buffy the Vampire Slayer: "OH BOLLOCKS, I'M OUT OF HERE".

Enjoy that mediocre SLI setup you have, jerk. Sell the majority of the junk you bought, get some real video cards for your SLI setup, and then talk trash.