
x800pro or 6800 or 9800pro? Need to order today

SOME MORE on R500
While Microsoft says that the chip created for the Xbox 2 console will be "custom", graphics companies typically tend to reuse almost all of the circuitry they have implemented. NVIDIA built its extremely successful GeForce4 Titanium series on the GPU it developed for Microsoft's Xbox; the same may happen with ATI and the Xbox 2.

It is interesting to note that both ATI and Microsoft need to supply game developers with Shader Model 3.0-enabled hardware as soon as possible, so as to take advantage of the additional functionality in the first batch of Xbox 2 games. Since ATI does not currently have such graphics processors, Microsoft may either supply game developers with NVIDIA-based graphics cards or wait for ATI's SM 3.0 offering. The latter case probably requires ATI to have a graphics processor supporting pixel and vertex shaders 3.0 ready months before the Xbox 2 is scheduled for release. If we assume that the Xbox 2 is to be launched in late 2005, then ATI will have to deliver its chip with SM 3.0 in either late 2004 or early 2005.

ATI Technologies' next product is rumoured to be code-named R480, suggesting that this is yet another Shader Model 2.0 chip with boosted performance. The next-generation product code-named R500 was originally said to be due in 2005. Most probably, it will be the basis for Microsoft's Xbox 2 console.

The reported specifications of the Xbox 2 console's graphics engine are impressive: the chip seems to have 10 times higher geometry performance and 4 times higher pixel performance compared to the RADEON X800 XT. If the same applies to the desktop R500, then next year we will see processors outperforming today's chips in graphics-intensive applications by a factor of at least 3.


Xbox 2 Hardware Specs Sneak into Web
Xbox 2 Graphics Processor to Use Shader Model 3.0

The graphics processor designed for the Xbox 2 console is a custom 500MHz chip from ATI Technologies.

The shader core has 48 Arithmetic Logic Units (ALUs) that can execute 64 simultaneous threads on groups of 64 vertices or pixels. ALUs are automatically and dynamically assigned to either pixel or vertex processing depending on load. The ALUs can each perform one vector and one scalar operation per clock cycle, for a total of 96 shader operations per clock cycle. Texture loads can be done in parallel to ALU operations. At peak performance, the GPU can issue 48 billion shader operations per second.

The GPU has a peak pixel fillrate of 4 or more gigapixels/sec (16 gigasamples/sec with 4x antialiasing). The peak vertex rate is 500 or more million vertices/sec. The peak triangle rate is 500 or more million triangles/sec. Microsoft reportedly states that the figures are attainable with non-trivial shaders.
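For what it's worth, the headline numbers in that leak are internally consistent. A minimal Python sketch: the clock, ALU count, and ops-per-ALU come straight from the spec text above, while the ROP count of 8 is my own assumption, back-derived by dividing the claimed 4 Gpixels/sec fillrate by the 500 MHz clock.

```python
# Sanity-check the quoted Xbox 2 GPU figures against each other.
clock_hz = 500e6      # custom 500 MHz ATI chip (from the spec above)
alus = 48             # unified shader ALUs (from the spec above)
ops_per_alu = 2       # one vector + one scalar operation per clock

shader_ops_per_sec = clock_hz * alus * ops_per_alu
print(f"{shader_ops_per_sec / 1e9:.0f} billion shader ops/sec")   # 48

rops_assumed = 8      # hypothetical: 4e9 pixels/sec / 500e6 clocks/sec
pixel_fill = clock_hz * rops_assumed
print(f"{pixel_fill / 1e9:.0f} Gpixels/sec")                      # 4
print(f"{pixel_fill * 4 / 1e9:.0f} Gsamples/sec with 4x AA")      # 16
```

So the "48 billion shader operations per second" and "16 gigasamples/sec" figures follow directly from the clock and unit counts; they are peak theoretical rates, not measured ones.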

pretty exciting, really
😉
 
Originally posted by: apoppin
Let's look at the FIRST post again . . . carefully:
Originally posted by: FFactory0x

Whats my best bet. I dont want to spend a fortune plus i never really use antialiasing or play over 1024x768 or 1280.

I was leaning towards the 6800 with the option to step up
in 90days since my bday is around the corner


1) IF he NEVER plays with AA and uses 10x7, the 9800p is a good card . . . still "fast" for that res

2) the 6800 - with a View to UPgrading is a "good idea" (getting the "free" upgrade as a B-day present or money is hinted at here)

3) <<"I also might get a new $200 17" or 19" monitor">>, says to ME that IF he saves enough money on the card, he might be able to afford the 9800p AND the monitor now.

so there it is in a "nutshell"

1) the 9800p for cheap at 10x7 (and maybe a new CRT monitor for ~$200)

or

2) the 6800 with a B-day UPgrade.

Since he doesn't want to wait for the x800p - forget that choice.

OK, since i 'contributed', i'm off to play a great videogame that i unfortunately set aside to finish Doom III . . . anyone want my copy of Doom III for $20 + shipping, PM me. 😉

(i'll check in tomorrow)

Well, in response to this: if he isn't using AA or AF, wouldn't the 5900XT be his best bang-for-buck card if he is upgrading in 90 days?

Well, ATI is at 24-bit partial precision whereas Nvidia is at 32-bit full precision. Those rumors mean nothing until we get an official response from ATI. Those rumors are just like the ones the Inquirer created about ATI cancelling the XT(PE) and having the Pro as its top-of-the-line card :\ like that would ever happen.

-Kevin
 
Originally posted by: Gamingphreek
Originally posted by: apoppin
Let's look at the FIRST post again . . . carefully:
Originally posted by: FFactory0x

Whats my best bet. I dont want to spend a fortune plus i never really use antialiasing or play over 1024x768 or 1280.

I was leaning towards the 6800 with the option to step up
in 90days since my bday is around the corner


1) IF he NEVER plays with AA and uses 10x7, the 9800p is a good card . . . still "fast" for that res

2) the 6800 - with a View to UPgrading is a "good idea" (getting the "free" upgrade as a B-day present or money is hinted at here)

3) <<"I also might get a new $200 17" or 19" monitor">>, says to ME that IF he saves enough money on the card, he might be able to afford the 9800p AND the monitor now.

so there it is in a "nutshell"

1) the 9800p for cheap at 10x7 (and maybe a new CRT monitor for ~$200)

or

2) the 6800 with a B-day UPgrade.

Since he doesn't want to wait for the x800p - forget that choice.

OK, since i 'contributed', i'm off to play a great videogame that i unfortunately set aside to finish Doom III . . . anyone want my copy of Doom III for $20 + shipping, PM me. 😉

(i'll check in tomorrow)

Well, in response to this: if he isn't using AA or AF, wouldn't the 5900XT be his best bang-for-buck card if he is upgrading in 90 days?

Well, ATI is at 24-bit partial precision whereas Nvidia is at 32-bit full precision. Those rumors mean nothing until we get an official response from ATI. Those rumors are just like the ones the Inquirer created about ATI cancelling the XT(PE) and having the Pro as its top-of-the-line card :\ like that would ever happen.

-Kevin
um, no . . . the UPgrade is offered thru eVGA and is ONLY for the 6800 series (6800 > 6800GT > 6800U).

Curious, what did FFactory0x order (yesterday) ????

i really DO think we will see the x800XT next month or ATI will really be in the crapper (as nVidia will emerge as the clear performance winner . . . IMpossible from ati's standpoint). 😉
 
No, I called yesterday and the rep said the upgrade is available if you buy from any retailer, as long as you register and provide a receipt. I specifically asked him.


I ordered the eVGA 6800 regular for $278, shipped 2nd day, from zipzoomfly.com.
 
Thanks for the links Apoppin. I just think logically things don't add up for me.

First of all, ATI decided to push back their major generational introductions from 12-15 months to 18-24 months. Secondly, ATI is yet to release its refresh R480 card. If R500 is released in 1H/05, then what is the point of releasing R480 exactly? Thirdly, since the X800XT is not even widely available, releasing a card 2x as fast in early '05 would mean that the X800XT was introduced for nothing?

And then the most important question in my mind: how can a company double its performance in 8-10 months (the X800XT paper-launched in May, so according to those articles, say it is introduced in Feb-March) when such a thing has never happened over the last 5 years? Basically, even if ATI could double the performance of their video cards, why would they have such short product cycles? This would stop people from buying anything high end, because in 6 months it's 2x slower. I understand how in 6 months the high-end card is no longer the best one, but maybe 10-20% slower, not 100%. Besides, RV410 is only about to be introduced. Introducing a new R500 core would mean new midrange RV500s would have to follow shortly. I don't doubt that a new card will come out sometime in 8-10 months, but I highly doubt it will double the performance of the X800XT (saying it will be at least 3 times faster in benchmarks is completely out of whack). I mean, just think about it: if it was 3 times faster, every CPU would majorly bottleneck it, making that card perform much slower than its theoretical 3x performance jump, no?

And finally, if ATI released a card 2x+ faster in 10 months, then they'd surely have to follow up with a card 2x faster again in the next year, or disappoint investors and consumers. That kind of exponential and constant increase in performance is simply too hard to sustain. That would mean that every 2 years, video card performance would have to increase 4x.
Well, considering CPU performance roughly increases 2x every 18 months, according to this logic, when time catches up the CPUs will be so slow it would provide no benefit to step up to the next card, and so on.....
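The compounding argument above can be put into numbers. A rough sketch, assuming the rumoured GPU pace of doubling every ~10 months versus the classic CPU pace of doubling every ~18 months; both rates are the thread's own figures, not measurements.

```python
# Compound the two growth rates over the same window to see how
# quickly a "doubling every 10 months" GPU would outrun the CPU feeding it.
def growth_factor(doubling_months: float, months: float) -> float:
    return 2.0 ** (months / doubling_months)

months = 36  # three years out
gpu = growth_factor(10, months)   # rumoured GPU pace (~12x over 3 years)
cpu = growth_factor(18, months)   # rough CPU pace (4x over 3 years)
print(f"after {months} months: GPU x{gpu:.1f}, CPU x{cpu:.1f}, "
      f"gap x{gpu / cpu:.1f}")
```

Three years out, the hypothetical GPU is roughly three times further ahead than the CPU, which is exactly the widening bottleneck the post is worried about.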
 
10x7 with no eye candy? Really? Get a 9600 NP. That should cover most games at these settings! Seriously though, there's no reason to buy the *admittedly better* 6800 series if you don't play games at high res or with AA/AF on. There is no perceptible difference between 60 FPS and 80, or 200 for that matter. I am not trolling, just stating opinion. I am not trying to be controversial, but to restore logic.

If a given graphics card can give you an acceptable frame rate in the games that you want to play at the resolutions you want to play, then all other factors except price are null and void. There is no point in buying the fastest card available if it increases your FPS from 70 to 140. Twice as fast, but no difference in games. Obviously 20 to 40 is completely different, so please don't flame me on that.

I'm just saying value and performance have an intersection on each individual's chart, and everyone's chart is different.
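The 70-vs-140 point above is easiest to see in frame times rather than FPS; a small sketch:

```python
# Frame-time view of the FPS argument: doubling an already-high frame
# rate saves far fewer milliseconds per frame than doubling a low one.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for low, high in [(70, 140), (20, 40)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: {frame_time_ms(low):.1f} ms -> "
          f"{frame_time_ms(high):.1f} ms (saves {saved:.1f} ms/frame)")
```

Going from 70 to 140 fps shaves about 7 ms off each frame; going from 20 to 40 fps shaves 25 ms, which is why only the second jump is obvious to the eye.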
 
Originally posted by: superkdogg
10x7 with no eye candy? Really? Get a 9600 NP. That should cover most games with these settings! Seriously though, there's no reason to buy the *admittedly better* 6800 series if you don't play games @ high res or with AA/AF on. There is no perceptible difference between 60 FPS and 80 or 200 for that matter. I am not trolling, just stating opinion. I am not trying to be controversial, but restore logic.

If a given graphics card can give you an acceptable frame rate in the games that you want to play @ the resolutions you want play, then all other factors except price are null and void. There is no point in buying the fastest card available if it increases your FPS from 70-140. Twice as fast, but no difference in games. Obviously 20-40 is completely different, so please don't flame me on that.

I'm just saying value and performance have an intersection on each individual's chart, and everyone's chart is different.

You're sort of right. I just sold my 5900 Ultra because I couldn't get hi-res framerates where I wanted them. To me, if I can't play a game "The way it was meant to be played" (no pun intended, it just works for my example) I am not happy. Other people may very well be satisfied playing at lower resolutions with features disabled or lowered. More power to them. If you check out the consolidation thread for Doom 3, you can see my benches on my 5900U and my new 6800GT. Major differences. FarCry and Painkiller as well, not to mention CoD. I like the feeling when my video card can't be slowed down, when it has power to spare. That way I know that I am not missing out on anything. I was always partial to this line of thinking: "If you're going to do it, do it right or don't do it at all." And I don't believe in reincarnation, so we only live once as far as I'm concerned. Life is short, why live it mediocre?
 
Guys, I think apoppin is right about one thing.
If the guy is only playing at 10x7 with no AA/AF, then he might as well have gone for a 9800pro/xt, depending on the best price. We don't even know what his CPU is, for starters. I have a 2.8C and I consider myself CPU-limited with a 6800GT at 10x7, so...
But apoppin, you're blatantly wrong if you believe that R500 will be as fast as the rumours claim. And as I said, this gen of video cards already requires a faster CPU to show its real potential. You might as well consider what the CPU requirements will be for the next gen... Does he have that option to upgrade?
Anyway, the 6800 is a very nice GPU, and WTF? He already has a fast card with SM3.0 support that he can hold on to till Longhorn and DX10 are out, so issue resolved.
 
Originally posted by: Gamingphreek
Originally posted by: apoppin
Let's look at the FIRST post again . . . carefully:
Originally posted by: FFactory0x

Whats my best bet. I dont want to spend a fortune plus i never really use antialiasing or play over 1024x768 or 1280.

I was leaning towards the 6800 with the option to step up
in 90days since my bday is around the corner


1) IF he NEVER plays with AA and uses 10x7, the 9800p is a good card . . . still "fast" for that res

2) the 6800 - with a View to UPgrading is a "good idea" (getting the "free" upgrade as a B-day present or money is hinted at here)

3) <<"I also might get a new $200 17" or 19" monitor">>, says to ME that IF he saves enough money on the card, he might be able to afford the 9800p AND the monitor now.

so there it is in a "nutshell"

1) the 9800p for cheap at 10x7 (and maybe a new CRT monitor for ~$200)

or

2) the 6800 with a B-day UPgrade.

Since he doesn't want to wait for the x800p - forget that choice.

OK, since i 'contributed', i'm off to play a great videogame that i unfortunately set aside to finish Doom III . . . anyone want my copy of Doom III for $20 + shipping, PM me. 😉

(i'll check in tomorrow)

Well, in response to this: if he isn't using AA or AF, wouldn't the 5900XT be his best bang-for-buck card if he is upgrading in 90 days?

Well, ATI is at 24-bit partial precision whereas Nvidia is at 32-bit full precision. Those rumors mean nothing until we get an official response from ATI. Those rumors are just like the ones the Inquirer created about ATI cancelling the XT(PE) and having the Pro as its top-of-the-line card :\ like that would ever happen.

-Kevin

I think you might have got that totally wrong. ATi has full 24-bit precision, while nVidia has 32-bit precision, but that requires twice as much memory, so developers use a mix of 32-bit and partial (16-bit) precision for nVidia GPUs, while for ATi, devs can use the full 24-bit precision....

Here are some statements from nVidia, ATi, and HardOCP

"The design of the pixel shader units is optimized for handling color data in RGBA (Red/Green/Blue/Alpha) format, although it can handle a wide range of alternative data formats. The vector ALUs perform operations on the three color components (RGB), while the scalar ALUs perform operations on the alpha or transparency component (A). The separate texture address ALU allows texture accesses to occur independently of other pixel shader operations. All operations are processed with a full 24 bits of precision for each component. The RADEON X800 driver software ensures optimal utilization of these ALUs by analyzing and reordering incoming pixel shader instructions wherever possible, without affecting the output."

The important key point here: FP24 precision all the time, every time, with no fallback to a lower precision.

Whereas nVidia will use 32-bit precision as much as possible, but it can fall back to 16-bit precision if 32-bit is too demanding. This is what nVidia and HardOCP say...

"The GeForce architecture has always enabled developers to choose the necessary level of precision for each image or scene. Now the choice is simpler, because the performance degradations associated with full 32-bit floating-point precision have been eliminated."

"Floating point operands can be treated in native 32-bit or optional 16-bit format, which are the standard formats in the film industry. Although both modes deliver equivalent performance, the 32-bit floating point mode uses twice as much memory to store the operands. Programmers can choose between native 32-bit mode and the optional 16-bit mode to achieve the required level of precision in each case. Plus, they can efficiently manage memory usage in situations where space is a consideration. Other data formats are also supported."

In DirectX 9.0, Floating Point 24 (FP24) was the minimum precision required for PS/VS 2.0. For PS/VS 3.0 the minimum requirement is FP32.

Floating Point 32 (FP32) is supported across the entire pipeline in NV40. The NV30 series also supported FP32. However, performance of FP32 in the NV30 series was very slow and often precision was lowered to gain performance in games. With NV40, NVIDIA claims that performance has been upgraded and is now very fast. In fact, above it states FP32 performance and FP16 performance are "equivalent", though keep in mind FP32 uses twice as much memory to store operands.

Keep in mind, though, that the human eye cannot detect the difference between 24-bit and 32-bit full precision anyway, so we won't be able to see any notable difference in quality even though there might be one.
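To make the FP16/FP24/FP32 gap discussed above concrete, here is a small sketch. FP16 has a 10-bit mantissa and FP32 a 23-bit one; ATI's FP24 format (16-bit mantissa) has no native numpy type, so it is approximated here by truncating a float32 mantissa, which is my own simplification and ignores the formats' different exponent ranges.

```python
import numpy as np

# Approximate FP24 by dropping the 7 lowest mantissa bits of a float32
# (23-bit mantissa -> 16-bit mantissa), leaving sign and exponent intact.
def to_fp24(x: float) -> np.float32:
    bits = np.float32(x).view(np.uint32)
    return (bits & np.uint32(0xFFFFFF80)).view(np.float32)

color = np.float32(0.123456789)       # an arbitrary shader color value
fp16 = np.float32(np.float16(color))  # round-trip through half precision
fp24 = to_fp24(color)

print(f"fp32: {color:.9f}")
print(f"fp24: {fp24:.9f}  (error {abs(color - fp24):.2e})")
print(f"fp16: {fp16:.9f}  (error {abs(color - fp16):.2e})")
```

The FP24 rounding error is orders of magnitude smaller than the FP16 one, which is why falling back from FP32 to FP16 was the controversial part, not FP24 itself.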
 
Originally posted by: FFactory0x
No, I called yesterday and the rep said the upgrade is available if you buy from any retailer, as long as you register and provide a receipt. I specifically asked him.


I ordered the eVga 6800 regular for $278 shipped 2nd day zipzoomfly.com
Good deal and a nice choice - especially to upgrade to a GT or Ultra within 90 days!


Originally posted by: RussianSensation
Thanks for the links Apoppin. I just think logically things DONT add up for me.

First of all, ATI decided to push back their major generational introductions from 12-15 months to 18-24 months. Secondly, ATI is yet to release its refresh rate R480 card. If R500 is released in 1H/05 then what is the point of releasing R480 exactly? Thirdly, since X800xt is not even widely available, releasing a card 2x as fast in early 05 would mean that X800xt was introduced for nothing? And then the most important question in my mind: how can a company double its performance in 8-10 months (X800xt paper launched in May, so according to those articles, say if it is introduced in Feb-March) when such has never happened over the last 5 years? Basically even if ATI could double the performance of their videocards, why would they have such short product cycles? This would stop people from buying anything high end because in 6 months its 2x slower? I understand how in 6 months, the high-end card is no longer the best one, but maybe 10-20% slower, not 100%. Besides RV410 is only about to be introduced. Introducing a new R500 core would mean new midrange RV500s will have to follow shortly. I dont doubt that a new card will come out sometime in 8-10 months, but I highly doubt it will double the performance of x800xt (saying it will be at least 3 times faster in benchmarks is completely out of whack). I mean just think about it, if it was 3 times faster, every cpu would majorly bottleneck it, making that card perform much slower than its theoretical 3x performance jump, no? And finally, if ATI released their card 2x+ faster in 10 months, then they'd surely have to follow up with a card 2x faster in the next 1 year again or disappoint investors and consumers. That kind of exponential and constant increase in performance is simply too hard to sustain. That would mean that every 2 years, the videocard performance would have to increase 4x. 
Well considering CPU performance roughly increases 2x every 18 months, according to this logic, when time catches up the cpus will be so slow, it would provide no benefit to step up to the next card, and so on.....
let's see, the point of r480 is to add support for Shader Model 3.0 - otherwise X-Box next is gonna be released with some sad-looking games unless the game developers use nVidia cards . . . 😉

as to the x800xt being for 'nothing' . . . no way, it's to compete NOW with the 6800u (a 'better' GPU, IMO)

i think ati got some real cash to work with . . . their X-box next deal must have some real future profit in it. . . .

Remember, the X-Box next GPU is supposed to be REvolutionary - this GPU must support a multi-core CPU . . . likely the r500 "desktop" GPU will also be similar . . . we should be seeing some 4000+ MHz CPUs next year and even multi-core is not out of the question . . . soon. Perhaps ATI has even found some way to address the CPU "bottleneck" - the x800xt does MUCH better with a slower CPU than the 6800u 😉

and i don't think they have to sustain such a high level of doubling every two years . . . when do you think X-Box III is likely due . . . '08?

it's my guess that nVidia is falling behind - that may be why they are readying their SLI (ati ALSO has it inherent in the r300) for their nv50

of course ALL of this is speculation . . . but i believe it is the way it's gonna happen . . .

we'll see
 
Originally posted by: apoppin
Originally posted by: FFactory0x
No, I called yesterday and the rep said the upgrade is available if you buy from any retailer, as long as you register and provide a receipt. I specifically asked him.


I ordered the eVga 6800 regular for $278 shipped 2nd day zipzoomfly.com
Good deal and a nice choice - especially to upgrade to a GT or Ultra within 90 days!


Originally posted by: RussianSensation
Thanks for the links Apoppin. I just think logically things DONT add up for me.

First of all, ATI decided to push back their major generational introductions from 12-15 months to 18-24 months. Secondly, ATI is yet to release its refresh rate R480 card. If R500 is released in 1H/05 then what is the point of releasing R480 exactly? Thirdly, since X800xt is not even widely available, releasing a card 2x as fast in early 05 would mean that X800xt was introduced for nothing? And then the most important question in my mind: how can a company double its performance in 8-10 months (X800xt paper launched in May, so according to those articles, say if it is introduced in Feb-March) when such has never happened over the last 5 years? Basically even if ATI could double the performance of their videocards, why would they have such short product cycles? This would stop people from buying anything high end because in 6 months its 2x slower? I understand how in 6 months, the high-end card is no longer the best one, but maybe 10-20% slower, not 100%. Besides RV410 is only about to be introduced. Introducing a new R500 core would mean new midrange RV500s will have to follow shortly. I dont doubt that a new card will come out sometime in 8-10 months, but I highly doubt it will double the performance of x800xt (saying it will be at least 3 times faster in benchmarks is completely out of whack). I mean just think about it, if it was 3 times faster, every cpu would majorly bottleneck it, making that card perform much slower than its theoretical 3x performance jump, no? And finally, if ATI released their card 2x+ faster in 10 months, then they'd surely have to follow up with a card 2x faster in the next 1 year again or disappoint investors and consumers. That kind of exponential and constant increase in performance is simply too hard to sustain. That would mean that every 2 years, the videocard performance would have to increase 4x. 
Well considering CPU performance roughly increases 2x every 18 months, according to this logic, when time catches up the cpus will be so slow, it would provide no benefit to step up to the next card, and so on.....
letsee, the pojnt of r480 is to add support to Use Shader Model 3.0 - otherwise X-Box next is gonna be released with some sad-looking games unless the game developers use nVidia cards . . . 😉

as to the x800xt being for 'nothing' . . . no way, it's to compete NOW with the 6800u (a 'better' GPU, IMO)

i think ati got some real cash to work with . . . their X-box next deal must have some real future profit in it. . . .

Remember, the X-Box next GPU is supposed to be REvolutionary - this GPU must support a multi core CPU . . . likely the r500 "desktop" GPU will also be the similar . . . we should be seeing some 4000+ Mhz CPUs next year and even multi core is not out of the question . . . soon. Perhaps ATI has even found some way to address the CPU "bottleneck" - the x800xt does MUCH better with a slower CPU than the 6800u 😉

and i doin't think they have to sustain such a high level of doubling every two years . . . when do you think X-Box III is likely due . . . '08?

it's my guess that nVidia is falling behind - that may be why they are readying their SLI (ati ALSO has it inherent in the r300) for their nv50

of course ALL of this is speculation . . . but i believe it is the way it's gonna happen . . .

we'll see

Very true. Since ATi is doing this deal with Microsoft with multi-cores, and it happens to be an AMD processor, and since ATi will be working alongside AMD's multi-cores, I think the R500 would be prepped to work supremely well with a dual-core setup, which I suppose would help, and in the end the GPU won't be the bottleneck. And if I'm correct, AMD's dual cores are being produced at the moment, aren't they, for a release in Q1 2005, which happens to be when the R500 will be out, and if speculation is correct Q2 is the closest time the Xbox 2 will be out also...

So it seems to fit in pretty well, really.
 
Originally posted by: Drayvn
Originally posted by: apoppin
letsee, the pojnt of r480 is to add support to Use Shader Model 3.0 - otherwise X-Box next is gonna be released with some sad-looking games unless the game developers use nVidia cards . . . 😉

as to the x800xt being for 'nothing' . . . no way, it's to compete NOW with the 6800u (a 'better' GPU, IMO)

i think ati got some real cash to work with . . . their X-box next deal must have some real future profit in it. . . .

Remember, the X-Box next GPU is supposed to be REvolutionary - this GPU must support a multi core CPU . . . likely the r500 "desktop" GPU will also be the similar . . . we should be seeing some 4000+ Mhz CPUs next year and even multi core is not out of the question . . . soon. Perhaps ATI has even found some way to address the CPU "bottleneck" - the x800xt does MUCH better with a slower CPU than the 6800u 😉

and i don't think they have to sustain such a high level of doubling every two years . . . when do you think X-Box III is likely due . . . '08?

it's my guess that nVidia is falling behind - that may be why they are readying their SLI (ati ALSO has it inherent in the r300) for their nv50

of course ALL of this is speculation . . . but i believe it is the way it's gonna happen . . .

we'll see

Very true, since ATi are doing this deal with Microsoft with multicores, and it happens to also be an AMD processor also, and since ATi will be be working alongside AMD's multicores, i think the R500 would be prepped to work supremely well with a dual core setup, which i suppose would help, and in the end the GPU wont be the bottleneck, and if im correct, AMD's dual cores are being produced at the moment arent they, for a release in Q1 of 2005, which happens to be when the R500 will be out, and if speculation is correct thats Q2 is the closest time the Xbox 2 will be out also...

You seem to read the same stuff i do 😀

:roll:

. .. seems logical . . . i am planning my next (major) upgrade in late '05 (pciE/a64 dual-core if available/and a SLI-type GPU setup) . . . Longhorn is gonna be out pretty soon after that and a 9800p will probably struggle just to run it . . . 😛
(who really knows?)
. . . however, we are looking for REvolutionary changes in the next couple of years and i believe we will have totally photo-realistic games with the Unreal III engine . . . (not "real", just "photo-real" like Final Fantasy: Spirits Within - the movie)
 
Well, my 9700pro got sold for $150, so basically it cost me like $125 to upgrade, with the possibility to upgrade in 90 days for a small cost.
 