ATI 9100 not substantially better than GF2 GTS-V

OS

Lifer
Oct 11, 1999
15,581
1
76
Well, I got a visiontek 9100 and it's not exactly light years ahead of my old GTS-V. It basically only buys me a res step up in UT2k3 and 32 bit color over 16 bit. I guess since I'm a very casual gamer, that difference is not that impressive at all.
 

clicknext

Banned
Mar 27, 2002
3,884
0
0
Strange, it's quite a bit better to me. I had an overclocked Radeon 32MB before, which shouldn't be far off from a GF2 GTS. I went to an 8500, which is the same as the 9100. Got double the frames in CS and was able to play things like UT2003 at a good speed. The 9100 is already dated though, and doesn't perform that well in the latest games.
 

OS

Lifer
Oct 11, 1999
15,581
1
76
Originally posted by: nemesismk2
What are the other parts to your pc like your cpu, memory etc? Maybe these are limiting the performance of your R9100.

1700 xp, 512 ddr

I might just be jaded; I remember having this feeling when I went from a V3 to the GF2 I had. Probably because the video card market is reaching maturity and hitting diminishing returns. There aren't any more jumps like 2D-only to Voodoo 1, or V1 to V2/TNT, anymore. When I first got into video cards, the visual differences between generations were huge and immediately noticeable. Doesn't seem that way any more.

 

clicknext

Banned
Mar 27, 2002
3,884
0
0
Originally posted by: OS
Originally posted by: nemesismk2
What are the other parts to your pc like your cpu, memory etc? Maybe these are limiting the performance of your R9100.

1700 xp, 512 ddr

I might just be jaded; I remember having this feeling when I went from a V3 to the GF2 I had. Probably because the video card market is reaching maturity and hitting diminishing returns. There aren't any more jumps like 2D-only to Voodoo 1, or V1 to V2/TNT, anymore. When I first got into video cards, the visual differences between generations were huge and immediately noticeable. Doesn't seem that way any more.

Ah yes... all you get is some more frames and flashy flashy now.
 

cobra77

Member
Aug 5, 2003
33
0
0
Check out the Nature scene from 3DMark2001 to see what a true DX8 card like the 9100 is capable of. The GeForce2 can't even run it.

As for speed, unfortunately video card companies use different (slower) GPU and memory clock speeds to save money. They usually keep the same name as the regular-clocked versions and sell them to people who have no idea there is a difference.
In other words, company A's 9100 is not necessarily the same speed as company B's. :|
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I like my Vtek 9100, but I was upgrading from an nForce1 IGP ("GF2MX100") to a $60 64MB card, so I wasn't expecting miracles. ;) All Vteks are 250/250, so card clocking isn't an issue. Perhaps you're just not playing games that will show off your card? Try enabling 16xAF--it's a great improvement in IQ for only a 10% performance hit.

Otherwise, the GF2 and 9100 are very similar for DX7 games. Both are 4x2 architectures, so you'll basically see a speed bump proportional to the 9100's core and memory clock bumps. And UT2K3 is probably CPU-bound on your XP1700+.

Still, my 9100 is good enough at most recent games at 10x7 16xAF, with some more intensive ones (like UT2K3 and Wolf:ET) requiring a bump down to 8x6 16xAF to feel smooth (~50+fps).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Any derivative of the 9100 should annihilate the GTS-V. If it doesn't, you've probably forgotten to uninstall nVidia's drivers before you swapped the cards. Also make sure you have the latest BIOS, chipset drivers and Catalyst 3.6s for your system.

In addition most of today's games are very CPU limited and thus require very fast processors to run well; your 1700+ isn't a particularly fast processor.
 

TerryMathews

Lifer
Oct 9, 1999
11,464
2
0
Did you guys miss the part about 16-bit vs. 32-bit? A res bump plus doubling the color depth is a pretty major increase in bandwidth. As an example:

800x600x16 = 7,680,000 bits per frame. 30 frames per second would yield 230,400,000 bits per second, or ~27.5 MB/sec.
1024x768x32 = 25,165,824 bits per frame. 30 frames per second would yield 754,974,720 bits per second, or 90 MB/sec.

Bear in mind this is raw blit performance and does not take into consideration geometry, lighting, shading, or other GPU effects. That being said, it should be a fairly accurate scale representation of how much bandwidth is required since all (or almost all) GPU effects have to be written into the framebuffer.
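
If anyone wants to sanity-check those numbers, here's the same back-of-the-envelope math as a quick Python snippet. It's just the raw framebuffer figures from the example above (same resolutions, bit depths and 30fps), nothing measured:

def framebuffer_mb_per_sec(width, height, bpp, fps):
    # bits written per frame, times frames per second, converted to MB/sec
    # (ignores geometry, lighting, shading and other GPU work, as noted above)
    bits_per_second = width * height * bpp * fps
    return bits_per_second / 8 / (1024 * 1024)

print(framebuffer_mb_per_sec(800, 600, 16, 30))   # ~27.5 MB/sec
print(framebuffer_mb_per_sec(1024, 768, 32, 30))  # 90 MB/sec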
 

OS

Lifer
Oct 11, 1999
15,581
1
76
Originally posted by: BFG10K
Any derivative of the 9100 should annihilate the GTS-V. If it doesn't, you've probably forgotten to uninstall nVidia's drivers before you swapped the cards. Also make sure you have the latest BIOS, chipset drivers and Catalyst 3.6s for your system.

In addition most of today's games are very CPU limited and thus require very fast processors to run well; your 1700+ isn't a particularly fast processor.


Well, part of it is that you can overclock the GTS-V so damn far, but not really my 9100. I know that's not a fair comparison, but those are the results I see.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
It basically only buys me a res step up in UT2k3 and 32 bit color over 16 bit. I guess since I'm a very casual gamer, that difference is not that impressive at all.
Your system is very underpowered for UT2003. If the 9100 is basically an 8500, you're using a 2-year-old GPU and a 2-year-old CPU to play one of the most resource-hungry games available. UT2003 didn't exist when your hardware was considered good gaming hardware.
Beyond that, the 8500 was about equivalent to the GF3, which WASN'T a huge step up from the GF2. One resolution step higher and maybe some AA/AF is all you should expect. If you wanted to see some major differences, you should have spent $240 and gotten a 9700/5800 (and preferably a faster CPU).
 

mindless1

Diamond Member
Aug 11, 2001
8,761
1,764
136
True enough, the 9100 is only about 60% faster (depending on how far the GF2 was overclocked). After increasing resolution and bit depth the increase is largely cancelled, but it should certainly look better.

 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
Hmmm, I went from a GeForce2 GTS to a GeForce3... and I had a huge performance gain... notably in UT2003...
 

movinslow

Senior member
Jul 15, 2002
246
0
0
hmm, I average about 40fps or higher at 1024x768x32 with close to max quality.

running retail 8500 128mb (275/275) with 1700xp and 512pc2100

I think the 9100s run at 250/250 stock. I'm pretty sure they could hit 275/275 without a problem.
In any case, try overclocking some and see if it helps.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I went from a GeForce2 GTS to a GeForce3... and I had a huge performance gain... notably in UT2003...
I guess that depends on how you define "huge".

One man's huge is another man's small.
The GF3 does offer double the framerate at 10x7, which would be pretty huge if you had a GF2 Ultra. It would be the difference between "playable" and "not".
That doubled framerate is still half of a 9700 Pro, which isn't even the fastest card anymore. If you factor in AA/AF, the gains would be even smaller.

This is the year 2003, he's trying to play UT2003, and doing it on 2-year-old hardware just isn't going to be an optimal setup.
 

BoberFett

Lifer
Oct 9, 1999
37,562
9
81
I had a GF2 GTS and moved to a 9100, and it was a big difference on my Celeron 800. Moved to an overclocked Athlon 1700+ and the Radeon gained even more over a GF2 on the same CPU.