Why do the 7800/7900 series get hurt so badly in Crysis?

nosfe

Senior member
Aug 8, 2007
424
0
0
When they post those comparisons it's usually with about 8-month-old drivers; that's probably the reason why.
 

Paratus

Lifer
Jun 4, 2004
17,536
15,605
146
I actually managed to play Crysis on my P4 X1950 Pro 512 at 10x7 and medium details.

The X19xx series had a lot more SPs (shader processors) than the 78/79 series did.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Paratus
I actually managed to play Crysis on my P4 X1950 Pro 512 at 10x7 and medium details.

The X19xx series had a lot more SPs (shader processors) than the 78/79 series did.
Yep. If I had to take a guess, the G70 parts are bottlenecking on shader power. Although with those numbers, I pity anyone who actually tried to play Crysis like that, X1900 or 7900.
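For a rough sense of the gap, here's a back-of-envelope sketch. The unit counts and clocks below are the commonly published specs, but the "throughput" figure deliberately ignores per-unit ALU width, co-issue and texture stalls, so treat it as illustrative only:

#include <cstdio>

// Very rough proxy for pixel shader throughput: unit count x core clock.
// Deliberately ignores ALU width (vec4 vs vec3+scalar), co-issue,
// texturing and memory bandwidth, so the numbers are only indicative.
struct Gpu {
    const char* name;
    int pixelShaderUnits;
    int coreClockMHz;
};

int main() {
    const Gpu gpus[] = {
        {"GeForce 7900 GTX (G71)",   24, 650},  // 24 pixel shader pipelines
        {"Radeon X1900 XTX (R580)",  48, 650},  // 48 pixel shader processors
        {"Radeon X1950 Pro (RV570)", 36, 575},  // 36 pixel shader processors
    };
    for (const Gpu& g : gpus) {
        long long proxy = static_cast<long long>(g.pixelShaderUnits) * g.coreClockMHz;
        std::printf("%-25s %2d units x %3d MHz = %6lld\n",
                    g.name, g.pixelShaderUnits, g.coreClockMHz, proxy);
    }
    return 0;
}

Even this crude count puts the R580 at roughly twice the G71's pixel shader resources at the same clock, which lines up with how much better the X19xx cards hold up in shader-heavy titles like Crysis.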
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: ViRGE
Originally posted by: Paratus
I actually managed to play Crysis on my P4 X1950 Pro 512 at 10x7 and medium details.

The X19xx series had a lot more SPs (shader processors) than the 78/79 series did.
Yep. If I had to take a guess, the G70 parts are bottlenecking on shader power. Although with those numbers, I pity anyone who actually tried to play Crysis like that, X1900 or 7900.

Interesting, thanks folks :)

I tried it on my 6800GS (native PCI-E core on an AGP card, overclocked comfortably past a 6800 Ultra in terms of core clock and memory bandwidth); that was an 'experience' I won't forget ;)
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
As the previous posters have said, the G70/G71-based cards lack the shader power required for modern games; it was a limitation of the NV4x architecture. That is why their equivalent competitors deliver almost two to three times the performance in today's titles (showing that the R580 was indeed future-proof to some extent).
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
I wonder, does SM 3.0a vs SM 3.0b have anything to do with it? I understand the 7xxx series were pretty bad at Oblivion due to lack of 3.0b support.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
I played at 1680x1050 on my 7800GT and while it wasn't brilliant, I still enjoyed it. Obviously quality settings were quite far down though.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Astrallite
I wonder, does SM 3.0a vs SM 3.0b have anything to do with it? I understand the 7xxx series were pretty bad at Oblivion due to lack of 3.0b support.

I don't think there is such a thing as SM 3.0a or b. The 7xxx series simply doesn't have enough pixel shaders.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
No, there isn't. It was jargon back in the day for "partial 3.0 support" on Nvidia cards, lol. ATI cards had full 3.0 support; reviewers just called the full version 3.0b.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Originally posted by: Astrallite

I wonder, does SM 3.0a vs SM 3.0b have anything to do with it? I understand the 7xxx series were pretty bad at Oblivion due to lack of 3.0b support.
There's no such thing as SM 3.0a or SM 3.0b. You're thinking of SM 2.0a (GF 5xxx series) and SM 2.0b (R4xx series).
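To make that concrete: a Direct3D 9 device just reports a single pixel shader version through its caps, and both the GeForce 6/7 and Radeon X1xxx parts report plain 3.0; there is no "3.0a" or "3.0b" at the API level. The a/b suffixes only ever existed as SM 2.0 HLSL compile profiles (ps_2_a for the GeForce FX, ps_2_b for the X800 series). A minimal caps-query sketch, purely for illustration:

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// Print the pixel/vertex shader versions the driver exposes.
// Every SM3 part (GeForce 6/7, Radeon X1xxx) reports 3.0 here.
int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        std::printf("Pixel shader:  %lu.%lu\n",
                    D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
        std::printf("Vertex shader: %lu.%lu\n",
                    D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                    D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
    }
    d3d->Release();
    return 0;
}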
 

Andrew1990

Banned
Mar 8, 2008
2,153
0
0
Originally posted by: BFG10K
Originally posted by: Astrallite

I wonder, does SM 3.0a vs SM 3.0b have anything to do with it? I understand the 7xxx series were pretty bad at Oblivion due to lack of 3.0b support.
There's no such thing as SM 3.0a or SM 3.0b. You're thinking of SM 2.0a (GF 5xxx series) and SM 2.0b (R4xx series).

Maybe they are talking about that whole HDR + AA thing, which ATI cards could do while Nvidia's couldn't.
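For what it's worth, the "HDR + AA thing" boils down to whether the hardware can multisample FP16 floating-point render targets: the Radeon X1xxx series can, the GeForce 6/7 series can't, which is why HDR together with MSAA (Oblivion being the famous case) was only possible on the ATI side in that generation. Here's a minimal Direct3D 9 sketch of the kind of capability check an engine might do; the exact check is my own illustration, not taken from any particular game:

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// Ask D3D9 whether an FP16 (A16B16G16R16F) render target can be
// multisampled -- the combination people mean by "HDR + AA".
int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,      // FP16 render target used for HDR
        FALSE,                     // fullscreen
        D3DMULTISAMPLE_4_SAMPLES,  // 4x MSAA
        &quality);

    std::printf("FP16 HDR + 4x MSAA: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}

On a Radeon X1xxx this check passes; on a GeForce 7 it fails, so a game either drops MSAA when HDR is enabled or falls back to a non-FP16 HDR path.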