VIA or ATI powered XBOX 2 ???

ST4RCUTTER

Platinum Member
Wow! I didn't expect Nvidia to be supplanted by anyone else in this market segment. It seems to me that Nvidia is leaps and bounds ahead of the other graphics chip-makers when it comes to gaming exclusively. Apparently Graph-zilla was charging too much per chipset for Microsoft's taste. This could be a boon to either of the two other companies (as it was for Nvidia)...especially if Microsoft puts up some of the R&D costs.

Link
 

nortexoid

Diamond Member
just how is nvidia "leaps and bounds" over ATI?

just check out the piece from Carmack commenting on ATI's vs. Nvidia's hardware implementation for Doom...
(sorry, no linkie - forgot it)

don't get me wrong, nvidia has a solid-performing chip w/ the GF4, but it's nothing too far beyond what ATI currently has for a gaming console, nor what they allegedly have in the pipeline.
 

ST4RCUTTER

Platinum Member
Repost from OT

Damn, I should have included that in my search.


just how is nvidia "leaps and bounds" over ATI?

Well, considering the Radeon 8500 barely squeaks past the GF3 Ti500 with its new drivers and can't stand up to the GF4, I'd say Nvidia is at least a small "leap or bound" ahead of ATI. Not to mention the rate at which Nvidia seems to update its product lineup. I think the phrase "blisteringly fast" comes to mind.

As for John Carmack's opinion on the GF4, he was mainly referring to the deficiencies of the GF4 MX, which is utterly worthless IMO. Here is a snippet:

"Nvidia has really made a mess of the naming conventions here. I always
thought it was bad enough that GF2 was just a speed bumped GF1, while GF3 had
significant architectural improvements over GF2. I expected GF4 to be the
speed bumped GF3, but calling the NV17 GF4-MX really sucks."


Here is a link to the interview Carmack gave about the upcoming Doom title and the GF4/Radeon 8500.

He points out that the ATI card can render Doom's light/surface interactions in a single pass, instead of the two or three passes needed on the GF3/GF4, due to its superior fragment-level processing.

"The fragment level processing is clearly way better on the 8500 than on the
Nvidia products, including the latest GF4. You have six individual textures,
but you can access the textures twice, giving up to eleven possible texture
accesses in a single pass, and the dependent texture operation is much more
sensible. This wound up being a perfect fit for Doom, because the standard
path could be implemented with six unique textures, but required one texture
(a normalization cube map) to be accessed twice. The vast majority of Doom
light / surface interaction rendering will be a single pass on the 8500, in
contrast to two or three passes, depending on the number of color components
in a light, for GF3/GF4."
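
Just to put numbers on it, here's a rough sketch (in C) of the pass math from that quote. The 8500 limits (six unique textures, up to eleven accesses per pass) come straight from Carmack; the four-textures-per-pass figure for GF3/GF4 is my own assumption for illustration, and the struct/function names are made up here. The real engine also varies the split with the light's color components, which this toy model ignores.

```c
/* Toy model of the pass-splitting math from Carmack's quote.
 * The Radeon 8500 limits (6 unique textures, up to 11 accesses per pass)
 * are taken from the quote; the GF3/GF4 figure of 4 textures / 4 accesses
 * per pass is an assumption for illustration only. */
#include <stdio.h>

struct gpu_caps {
    const char *name;
    int max_textures;  /* unique textures bindable in a single pass */
    int max_accesses;  /* total texture fetches allowed in a single pass */
};

/* Minimum passes needed for one light/surface interaction, splitting
 * whenever either the unique-texture or total-access limit is exceeded. */
static int passes_needed(const struct gpu_caps *c, int textures, int accesses)
{
    int by_textures = (textures + c->max_textures - 1) / c->max_textures;
    int by_accesses = (accesses + c->max_accesses - 1) / c->max_accesses;
    return by_textures > by_accesses ? by_textures : by_accesses;
}

int main(void)
{
    /* Doom's standard path per the quote: six unique textures, one of
     * them (the normalization cube map) fetched twice = seven accesses. */
    const int doom_textures = 6;
    const int doom_accesses = 7;

    const struct gpu_caps cards[] = {
        { "Radeon 8500",       6, 11 },
        { "GF3/GF4 (assumed)", 4,  4 },
    };

    for (int i = 0; i < 2; i++) {
        printf("%-20s %d pass(es)\n", cards[i].name,
               passes_needed(&cards[i], doom_textures, doom_accesses));
    }
    return 0;
}
```

Run it and you get one pass for the 8500 and two for the GF3/GF4, which lines up with the quote (the third pass Carmack mentions shows up when a light's color components force another split).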


Carmack's description of the single-pass path indicates to me that while this may be great for Doom, it may not translate as well to other titles. His later statement reinforces my belief:

"I can set up scenes and parameters where either card can win, but I think that
current Nvidia cards are still a somewhat safer bet for consistent performance
and quality."


 

Vegito

Diamond Member
So basically it's the programmers' fault... I guess all these guys just aren't optimizing their code.

He points out that the ATI card can render Doom's light/surface interactions in a single pass, instead of the two or three passes needed on the GF3/GF4, due to its superior fragment-level processing.


 

Syborg1211

Diamond Member
The GeForce 4 is far less than a leap and bound over ATI. I wouldn't even call it a step ahead. ATI and Nvidia are on slightly different release schedules. The Radeon 8500 was beating out the GF3 Ti500 earlier... then came Nvidia's turn to respond... now ATI's. It's all a cycle. Neither company is leaps and bounds above the other, or else one of them wouldn't be here ;)
 

fatbaby

Banned


<< remember ATI's vid card cost considerably less than nVidia's >>



?

Two months ago, ATI's MSRP for a Radeon 7500 was $199, which was Nvidia's MSRP for a GF3 Ti200, a superior card.
 

AGodspeed

Diamond Member
I think the most important thing to take from this is that XBox 2 won't be out for at least another 2-3 years. I don't think that estimate is too far off from reality.

Therefore, there's no reason to think MS is going to choose such and such a CPU, GPU, or chipset now; they'll see which products pan out later. Remember, MS decided at the last minute to go with the Celeron (a smart choice) over an AMD processor. No matter what, AMD didn't have a processor that was well suited for the XBox, simply because of the cooling solution it would have required.

And so, for all we know, MS might use a ClawHammer processor in XBox 2. :D
 

BD231

Lifer
The XBox has a true PIII with 128K cache; the Celeron is a little different from that.