Hints from Nvidia?


Ackmed

Diamond Member
Oct 1, 2003
Originally posted by: Skott
I'm thinking 3-way SLI is not going to be all that popular. SLI setups aren't that much more powerful than one good high-end card as it is. Maybe a 25% performance increase on average? And that's not in all games or at all resolutions. On top of that, regular SLI is prohibitively expensive, and the return isn't there for most users. Get regular SLI perfected first, then work on 3-way SLI, I say. And the part about waiting a couple of months is most likely just him hinting at a new high-end GTX/Ultra-killer card. Just my two cents' worth.

Overall, it won't be that popular. SLI/CF as it stands is only a very small percentage of video card usage. That doesn't mean some people won't really like it.

And SLI is much more than a 25% increase over a single card, generally speaking. In lots of games it's around 80%. Pile on high levels of AA and it distances itself even more. Of course, the higher the resolution, the better the return. Sure, there are some games where it doesn't scale well, but those are few and far between, and they're generally fixed rather soon. SLI is not for the financially wary. Often a card will come out six months to a year later that's faster than your two cards in SLI, for the same price or less. It's about getting more frames, at a higher resolution, with more AA, no questions asked. Not everyone makes the same amount of money, and for some it's not a big deal to drop $1,000 on video cards.

Article showing the advantage it can have: http://www.firingsquad.com/har...erformance/default.asp
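A quick back-of-the-envelope on what those scaling numbers mean, in Python; the frame rates in this sketch are made-up examples purely to show the math, not benchmark results:

# Rough sketch of SLI scaling math. The frame rates are hypothetical;
# only the formula matters.

def sli_gain(single_fps, sli_fps):
    """Percentage gain of the SLI pair over a single card."""
    return (sli_fps / single_fps - 1.0) * 100.0

# GPU-limited case: single card at 40 fps, SLI pair at 72 fps
print(f"GPU-limited gain: {sli_gain(40.0, 72.0):.0f}%")  # -> 80%

# CPU-limited case (lower resolution): the pair only reaches 50 fps
print(f"CPU-limited gain: {sli_gain(40.0, 50.0):.0f}%")  # -> 25%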
 

ricleo2

Golden Member
Feb 18, 2004
Originally posted by: Ackmed
Originally posted by: Skott
I'm thinking 3-way SLI is not going to be all that popular. SLI setups aren't that much more powerful than one good high-end card as it is. Maybe a 25% performance increase on average? And that's not in all games or at all resolutions. On top of that, regular SLI is prohibitively expensive, and the return isn't there for most users. Get regular SLI perfected first, then work on 3-way SLI, I say. And the part about waiting a couple of months is most likely just him hinting at a new high-end GTX/Ultra-killer card. Just my two cents' worth.

Overall, it won't be that popular. SLI/CF as it stands is only a very small percentage of video card usage. That doesn't mean some people won't really like it.

And SLI is much more than a 25% increase over a single card, generally speaking. In lots of games it's around 80%. Pile on high levels of AA and it distances itself even more. Of course, the higher the resolution, the better the return. Sure, there are some games where it doesn't scale well, but those are few and far between, and they're generally fixed rather soon. SLI is not for the financially wary. Often a card will come out six months to a year later that's faster than your two cards in SLI, for the same price or less. It's about getting more frames, at a higher resolution, with more AA, no questions asked. Not everyone makes the same amount of money, and for some it's not a big deal to drop $1,000 on video cards.

Article showing the advantage it can have: http://www.firingsquad.com/har...erformance/default.asp

Great article on 3-way SLI. Sounds worth it to me, and fun to put together. But I'm still going to wait for the 8800 Ultra killer. Come on Nvidia, release it already!
 

Zap

Elite Member
Oct 13, 1999
Originally posted by: ricleo2
Originally posted by: aznium
Actually, you would be surprised. All game engines are pretty inefficient...

When I think of efficient game engines, Doom 3 and the newer Painkiller: Overdose come to mind.

The id Software games are probably reasonably efficient. At QuakeCon this year (or maybe in some interview), John Carmack talked about optimizing their game engines to remove every last bit of lag possible from the code, providing a solid feel between your mouse movement and the movement you see on screen (some would call this twitchiness).
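For what it's worth, the usual way engines chase that is to sample input as late as possible, right before the frame is rendered. A minimal sketch of the idea in Python; poll_mouse and render here are stand-ins I made up, not anything from id's code:

import time

def poll_mouse():
    # Stand-in: would return the latest hardware mouse delta
    return (0, 0)

def render(view_angle):
    # Stand-in: would submit the frame to the GPU
    pass

def game_loop(frames=3):
    view_angle = 0.0
    for _ in range(frames):
        start = time.perf_counter()
        # ... game logic, AI, physics run here ...
        # Sample the mouse AFTER the heavy work, right before rendering,
        # so the camera reflects the newest possible input.
        dx, _ = poll_mouse()
        view_angle += dx * 0.022  # sensitivity scale
        render(view_angle)
        print(f"frame time: {(time.perf_counter() - start) * 1000:.2f} ms")

game_loop()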

Originally posted by: Slugbait
Personally, I've always felt the Unreal engines seemed the most efficient compared to other current competing engines. Though I admit, the Doom 3 engine was very nice.

Yeah, they've always been able to get decent performance and looks even on old hardware.

Originally posted by: myocardia
Originally posted by: KingstonU
AnandTech and other review sites all highly praised Crysis for scaling very well, making it playable even on lower-end systems.

My single-core A64 4000 and 7600GT played Crysis just fine, with all in-game settings on Medium.

That's because you set out to play the game ;), not to stand around complaining about how it doesn't run fast enough.
 

Slugbait

Elite Member
Oct 9, 1999
Originally posted by: myocardia
Originally posted by: Slugbait
Currently, the fastest PureVideo 2 card is the 8800GT

Actually, it's the 512MB 8800GTS, assuming you meant "fastest for gaming".

No, I actually meant exactly what I said. The fastest PureVideo 2 card is the 8800GT. The 8800GTS/GTX/Ultra are not PureVideo 2 cards...
 

recoiledsnake

Member
Nov 21, 2007
Originally posted by: Slugbait
Originally posted by: myocardia
Originally posted by: Slugbait
Currently, the fastest PureVideo 2 card is the 8800GT

Actually, it's the 512MB 8800GTS, assuming you meant "fastest for gaming".

No, I actually meant exactly what I said. The fastest PureVideo 2 card is the 8800GT. The 8800GTS/GTX/Ultra are not PureVideo 2 cards...
Are you talking about the 640/320MB 8800GTS or the new 512MB G92 8800GTS?
 

Ares202

Senior member
Jun 3, 2007
Originally posted by: KingstonU
Originally posted by: ricleo2
Has Crysis been deliberately designed with an inefficient game engine?

AnandTech and other review sites all highly praised Crysis for scaling very well, making it playable even on lower-end systems.

That's all a matter of opinion.

I call Half-Life 2 a very scalable engine; I've seen it run on a 32MB Radeon 7200, which is what, 8-9 years old? I've seen integrated graphics cards run it perfectly fine too.

I mean, Crysis won't even come close to working on my brother's older Athlon XP 2400+ with 768MB of RAM and a Radeon 9500, whereas that machine runs HL2 and CS:S on high.
 

SniperDaws

Senior member
Aug 14, 2007
Nvidia hasn't finished milking the 8800 yet. That card should have been called the Nvidia GeForce Daisy, because it's Nvidia's biggest cash cow to date.

Talk about dragging your heels. Come on, someone, give us a new high-end card.
 

cmdrdredd

Lifer
Dec 12, 2001
Originally posted by: Zap
Originally posted by: ricleo2
Originally posted by: aznium
Actually, you would be surprised. All game engines are pretty inefficient...

When I think of efficient game engines, Doom 3 and the newer Painkiller: Overdose come to mind.

The id Software games are probably reasonably efficient. At QuakeCon this year (or maybe in some interview), John Carmack talked about optimizing their game engines to remove every last bit of lag possible from the code, providing a solid feel between your mouse movement and the movement you see on screen (some would call this twitchiness).

Originally posted by: Slugbait
Personally, I've always felt the Unreal engines seemed the most efficient compared to other current competing engines. Though I admit, the Doom 3 engine was very nice.

Yeah, they've always been able to get decent performance and looks even on old hardware.

Originally posted by: myocardia
Originally posted by: KingstonU
AnandTech and other review sites all highly praised Crysis for scaling very well, making it playable even on lower-end systems.

My single-core A64 4000 and 7600GT played Crysis just fine, with all in-game settings on Medium.

That's because you set out to play the game ;), not to stand around complaining about how it doesn't run fast enough.

Having to turn down effects when you run a quad core at 3.4GHz and an 8800GTX with 4GB of memory is pathetic, if you ask me. Obviously that's not my system, but the reviews all show me that the game runs like ass on everything.
 

Slugbait

Elite Member
Oct 9, 1999
Originally posted by: recoiledsnake
Originally posted by: Slugbait
Originally posted by: myocardia
Originally posted by: Slugbait
Currently, the fastest PureVideo 2 card is the 8800GT

Actually, it's the 512MB 8800GTS, assuming you meant "fastest for gaming".

No, I actually meant exactly what I said. The fastest PureVideo 2 card is the 8800GT. The 8800GTS/GTX/Ultra are not PureVideo 2 cards...
Are you talking about the 640/320MB 8800GTS or the new 512MB G92 8800GTS?

Hmmm...nVidia's site lumps the new G92 version in with the generic 8800 feature set, specifying PureVideo HD (which isn't PureVideo 2). Two different Wikipedia pages (one for PureVideo, the other for the 8 series) list PureVideo cards, but haven't been updated to call out specifically which version of PureVideo is on the new G92 8800. A quick Google search turned up tons of review sites stating "PureVideo HD" for the new G92...however, I now believe they are possibly all incorrect, probably due to an ill-informed nVidia PR department or something.

Because it's a G92, I will assume (for now) that the new 8800GTS 512 is NOT actually a PureVideo HD card as reported on so many sites, and is instead a PureVideo 2 card, at least until something conclusive is shown, in which case I shall stand corrected.
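To keep my own reasoning straight, here is the mapping I'm assuming, written out as a small Python lookup; the core-to-engine pairing is my inference from the above, not an official nVidia table:

# Which video engine each 8800 variant should have, going by its GPU core.
# This mapping is my assumption, not something nVidia publishes in this form.

CORE_BY_CARD = {
    "8800GTX": "G80",
    "8800 Ultra": "G80",
    "8800GTS 320/640": "G80",
    "8800GT": "G92",
    "8800GTS 512": "G92",
}

ENGINE_BY_CORE = {
    "G80": "first-generation PureVideo HD",
    "G92": "PureVideo 2",
}

def video_engine(card):
    return ENGINE_BY_CORE[CORE_BY_CARD[card]]

print(video_engine("8800GTS 512"))  # -> PureVideo 2, if my assumption holds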
 

ricleo2

Golden Member
Feb 18, 2004
Originally posted by: Slugbait
Originally posted by: recoiledsnake
Originally posted by: Slugbait
Originally posted by: myocardia
Originally posted by: Slugbait
Currently, the fastest PureVideo 2 card is the 8800GT

Actually, it's the 512MB 8800GTS, assuming you meant "fastest for gaming".

No, I actually meant exactly what I said. The fastest PureVideo 2 card is the 8800GT. The 8800GTS/GTX/Ultra are not PureVideo 2 cards...
Are you talking about the 640/320MB 8800GTS or the new 512MB G92 8800GTS?

Hmmm...nVidia's site lumps the new G92 version in with the generic 8800 feature set, specifying PureVideo HD (which isn't PureVideo 2). Two different Wikipedia pages (one for PureVideo, the other for the 8 series) list PureVideo cards, but haven't been updated to call out specifically which version of PureVideo is on the new G92 8800. A quick Google search turned up tons of review sites stating "PureVideo HD" for the new G92...however, I now believe they are possibly all incorrect, probably due to an ill-informed nVidia PR department or something.

Because it's a G92, I will assume (for now) that the new 8800GTS 512 is NOT actually a PureVideo HD card as reported on so many sites, and is instead a PureVideo 2 card, at least until something conclusive is shown, in which case I shall stand corrected.

OK, please correct me if I'm wrong: does PureVideo 2 have to do with any video playback on the computer?