9800 review at guru of 3d

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Interesting.

I'm still disappointed with the 9800's performance tho. It's only marginally better than a 9700.
 

Wag

Diamond Member
Jul 21, 2000
8,288
8
81
All in all ATI once again managed to place a product on the market that's is the fastest thing available, a very pleasant product. The GeForce FX 5800 Ultra however is trying to keep up with the Radeon 9800 Pro the newer Detonator 43.45 drivers definitely close the gap very well.
Is this guy's first language English? Much of it doesn't make sense.

At any rate, I'm looking forward to the 9600 or the 5600 for my HTPC setup. I really hope ATI fixes the "rolling bars" problem or I'll have to go for nVidia, and I suspect the 5600 doesn't have the full MPEG2 decoding their "Go" mobile products have.
 

Malladine

Diamond Member
Mar 31, 2003
4,618
0
71
The 9700 runs every game today at max settings with 40+ fps, so unless you're trading in a 9700, why would anyone shell out for the 9800?! IMO anyone who has a GeForce Ti 500 or better won't need to upgrade at all until Doom 3. Even then it will be another year, possibly two, before enough decent games using the Doom 3 engine are released to warrant spending $250+ on a new VPU.

And SickBeast: marginally better than the 9700?? You need to read this: http://www6.tomshardware.com/graphic/20030306/radeon9800pro-31.html

Wag: I'd recommend the 9500 PRO instead of either the 9600 or the 5600. Granted, the 9600 hasn't truly been tested yet, but with less memory bandwidth I can't imagine it will be able to keep up with the 9500 PRO. And the 5600? :disgust:

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0

Wag

Diamond Member
Jul 21, 2000
8,288
8
81
The 9500s and 9700s are rife with the "rolling bar" problem. ATI's products have had it for almost 3 years now; my 8500's output looks so bad on my HDTV it's almost unwatchable.

I had an MX200 before the 8500 and it didn't have this problem on my HDTV. The PQ wasn't quite as sharp as the 8500 @ 1280x720/60Hz, but it didn't have those damn rolling horizontal bars.
 

Malladine

Diamond Member
Mar 31, 2003
4,618
0
71
Try it at 1024x768 @ 85Hz instead of 60. I recently discovered that, aside from visual glitches, 60Hz can cause a great deal more eye strain...
 

nippyjun

Diamond Member
Oct 10, 1999
8,447
0
0
I wish he had compared all 3 cards with FSAA and AF instead of just showing the non-AA/AF comparison and then the numbers for the 9800 with AA/AF.

The 5800U did very well IMHO in all the tests he showed.
 

Malladine

Diamond Member
Mar 31, 2003
4,618
0
71
Chizow, that's an interesting article, thanks.

Having read it, I'd have to say it's feasible that ATI has "cheated" by making the card seem better in these tests than it would actually prove to be in practice. However, if they have fine-tuned the card to that degree, then surely it would perform better not just in the benchmark of, say, UT2k3 but also in the game itself?
 

Malladine

Diamond Member
Mar 31, 2003
4,618
0
71
It did fine, sure. But nvidia still seems to be playing catch-up. Except for Serious Sam 2, where they have ATI beat :D
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Malladine
Chizow, that's an interesting article, thanks.

Having read it, I'd have to say it's feasible that ATI has "cheated" by making the card seem better in these tests than it would actually prove to be in practice. However, if they have fine-tuned the card to that degree, then surely it would perform better not just in the benchmark of, say, UT2k3 but also in the game itself?

That's a completely logical response, but considering the performance gain was realized on a 9700pro, it indicates to me that ATi is holding back performance on the 9700pro and that there really isn't any difference between the cards other than clockspeeds.

If you've got an hour or so to kill, you can read through this discussion on the 9800 vs 9700.

Chiz
 

Wag

Diamond Member
Jul 21, 2000
8,288
8
81
1024x768 @ 85hz
I would, but that's not an ideal resolution for HDTV; it's 4:3. I have tried 960x720p and lower resolutions, but for some reason my HDTV doesn't format well unless it's 60Hz or a multiple of that.
 

J5im8yo

Senior member
Nov 8, 2002
233
0
0
So the R350 core has basically reached the limit of the .15 process: 380 is only 20 away from 400, which is maxed out. After all, Nvidia was courageous enough to try their hand at .13, and while they had trouble, I'd say that's not bad for a new process. It'll be very interesting to see the R400 core, or whatever comes next, on .13. I wonder if ATi could successfully implement it. :confused: