ATi's R500 . . . *Update: Xbox Next info*


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: jiffylube1024
Originally posted by: Insomniak
Originally posted by: Ackmed
Probably because the X800 cards are pretty much two-year-old tech. People love to bring that up all the time. Their two-year-old tech is keeping up with NV's brand-new tech.

IF (very big if) ATi can get the same kind of performance leap from their new tech card, it could very well be a beast of a card. If the NV50 or whatever is a refresh of the NV40, it *could* be trouble for NV.


Saying R420 is two-year-old tech is like saying Prescott is 12-year-old tech.

What is R420? Yes, it's R300-based, but they've doubled the pipelines, added more vertex/pixel units, and changed supported features... the same way that Intel's newest CPUs are just 486DXs with a longer pipeline, faster clock speeds, larger caches, and support for new instruction sets. They're both still x86 processors.

This was one of the stupidest arguments I've ever heard. If you take two-year-old tech, shrink it for more efficient power consumption and heat dissipation, and then double the number of pipelines and double the clock speeds, YES, it SHOULD keep up with modern hardware that has the same number of pipes at slower clocks... If I could overclock an old 486DX to 75GHz and add new cache and instruction support, you'd be amazed at how well it would decimate an Athlon 64...


Seriously, please think before posting.

Touché! Very good points - calling the X800 series "two-year-old tech" is pretty nonsensical. It's new technology, just without SM3 support. If it had SM3, it would suddenly be this magical "brand new, innovative" design.


ATI went 'safe' with the X800 series and, as a side benefit, got higher clock speeds out of their chips. Nvidia went all-new (as they had to - the FX5800/FX5900 series had PS 2.0 deficiencies). Now Nvidia is the market leader with the newest tech, kind of like when ATI sprang the 9700 Pro on the market and caught their competitors with their pants down. Except the jump from PS 1.0 to 2.0 is bigger than the one from PS 2.0 to SM 3.0 (or at least that's what we've seen so far).

With that said, it seems FAR easier for developers to add SM 3.0 support than it was to go from PS 1.0 (1.1-1.4) to 2.0, so SM 3.0 support is a nice feature to have.
it's nVidia's argument. ;)

That said - the R420 IS a direct descendant of the R300/350/360 - it is EXPANDED and REWORKED, but not "new" the way the nV40 is "new".

R500 will be "newer" (w/SM 3.0 support) and R600 BRAND new (a completely new ATi design team).
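
The "double the pipes and raise the clocks" argument quoted above is easy to put rough numbers on. Here's a minimal fill-rate sketch in Python, using the commonly cited pipeline counts and core clocks for each part (treat the exact figures as approximate):

```python
# Theoretical pixel fill rate = pipelines x core clock.
# Commonly cited launch specs; treat the numbers as approximate.
cards = {
    "R300 (9700 Pro)":   {"pipes": 8,  "core_mhz": 325},
    "R420 (X800 XT PE)": {"pipes": 16, "core_mhz": 520},
    "NV40 (6800 Ultra)": {"pipes": 16, "core_mhz": 400},
}

base = cards["R300 (9700 Pro)"]["pipes"] * cards["R300 (9700 Pro)"]["core_mhz"]

for name, c in cards.items():
    fill = c["pipes"] * c["core_mhz"]  # megapixels per second
    print(f"{name}: {fill} Mpixels/s ({fill / base:.1f}x the R300)")
```

On paper, the scaled-up "old" core more than triples the 9700 Pro's theoretical throughput, which is the whole point of the argument: an architecture doesn't have to be new to keep pace.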
 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
Originally posted by: bunnyfubbles
I don't think we'll see R500 in the Xbox 2 unless MS doesn't want top-of-the-line tech right when the Xbox 2 debuts. I'd heard Xbox 2 might be here around winter '05, which sounds pretty early when both Nintendo and Sony have made claims that they'll release in '06 or possibly later. So unless R500 is held off until the end of next year, I think we might see something else in Xbox 2 - that, or we'll see Xbox 2 insanely early, as well as MS's own early death in the console market...

Umm, why would it be the death of MS in the console market? If PS3 is still at least 6-12 months off when the Xbox 2 gets released, it's advantage Microsoft in getting a large gamer base and a good amount of games out. (This is the same reason PS2 dominated this round of the console wars - not because it's got better graphics, just more consoles sold, which means more games made for it.)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Falloutboy
Originally posted by: bunnyfubbles
I don't think we'll see R500 in the Xbox 2 unless MS doesn't want top-of-the-line tech right when the Xbox 2 debuts. I'd heard Xbox 2 might be here around winter '05, which sounds pretty early when both Nintendo and Sony have made claims that they'll release in '06 or possibly later. So unless R500 is held off until the end of next year, I think we might see something else in Xbox 2 - that, or we'll see Xbox 2 insanely early, as well as MS's own early death in the console market...

Umm, why would it be the death of MS in the console market? If PS3 is still at least 6-12 months off when the Xbox 2 gets released, it's advantage Microsoft in getting a large gamer base and a good amount of games out. (This is the same reason PS2 dominated this round of the console wars - not because it's got better graphics, just more consoles sold, which means more games made for it.)
I'll try ONCE more:

Playstation 3 out by March next year
Wednesday 14 July 2004, 09:52

SONY WILL have models of the Playstation 3 in shops by March next year, the Kyodo News service reported earlier this week.
:Q

PS III is DUE in 9 months . . . before X-box II
(if the above is to be believed) ;)


:roll:
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Just think: if ATi hadn't had to cancel R400, NV may well be on its knees.........

Heh, you assume the R400 in the form ATI had it was working as expected. My guess is it wasn't performing as well as expected and thus got pushed back, and the R420, based off the R3.xx core, was used instead.

It could very well be that the things ATI was trying to do would not or could not be done in time to compete with the NV40. If they had released it, the chip could have been a complete dud, throwing ATI back to its pre-R3.xx days.

Note: I believe the R3.xx chip isn't even an ATI design but a design from a company they bought.

I wonder what Nvidia is gonna do next; they spent all this effort on the new 6800, and in less than a year ATI is gonna release this beast. Not like the 6800 is stomping on the ATI cards right now.

NV50 has been in the works since, I believe, the NV30 started its production cycle. Supposedly this is going to be a radical design. If ATI is going to bring out the big guns, they are going to want to do it against the NV50.

As for when the R500 will be released? No earlier than next Spring.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Why don't you just say that Nvidia has a new architecture, while ATI has new "extensions/advancements" to their architecture? Also, seriously, who really cares if it's old? If it's old and it can still keep up with everything else on the market, then why complain?
Yeah, Nvidia's chip is brand-spankin'-new with new features everywhere, but ATI has once again made a very effective, competitive solution.

-Kevin
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: g3pro
.09 micron process? Hahahahahah. Give me a break. Intel and IBM are having bitches of problems making the transition. It just isn't gonna happen for GPUs. .11 micron, but definitely not .09. They would be shooting themselves in the foot for even thinking about it at this point. But then again, this is our friendly ATi, who forgot to make their cards SM3.0-capable. ;)
Nah, it'd be fine. Look at how well NVidia did getting the .13u part out on time!
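
For what it's worth, the reason vendors keep eyeing smaller processes despite the pain is simple geometry: in the idealized case, die area scales with the square of the feature size. A rough sketch (deliberately ignoring the leakage and yield problems g3pro is pointing at; the 100-unit baseline is arbitrary):

```python
# Idealized die-area scaling: a linear shrink by factor s cuts area by s^2.
# Ignores leakage, yield, and redesign cost - exactly the real-world catches.
base_nm = 130
base_area = 100.0  # normalize the 130nm die to 100 area units

for nm in (130, 110, 90):
    area = base_area * (nm / base_nm) ** 2
    print(f"{nm}nm: ~{area:.0f} area units ({area / base_area:.0%} of the 130nm die)")
```

Roughly half the silicon per chip at 90nm is why the jump is so tempting, even when the transition is clearly risky.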
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Insomniak
Originally posted by: Ackmed
Probably because the X800 cards are pretty much two-year-old tech. People love to bring that up all the time. Their two-year-old tech is keeping up with NV's brand-new tech.

IF (very big if) ATi can get the same kind of performance leap from their new tech card, it could very well be a beast of a card. If the NV50 or whatever is a refresh of the NV40, it *could* be trouble for NV.


Saying R420 is two-year-old tech is like saying Prescott is 12-year-old tech.

What is R420? Yes, it's R300-based, but they've doubled the pipelines, added more vertex/pixel units, and changed supported features... the same way that Intel's newest CPUs are just 486DXs with a longer pipeline, faster clock speeds, larger caches, and support for new instruction sets. They're both still x86 processors.

This was one of the stupidest arguments I've ever heard. If you take two-year-old tech, shrink it for more efficient power consumption and heat dissipation, and then double the number of pipelines and double the clock speeds, YES, it SHOULD keep up with modern hardware that has the same number of pipes at slower clocks... If I could overclock an old 486DX to 75GHz and add new cache and instruction support, you'd be amazed at how well it would decimate an Athlon 64...


Seriously, please think before posting.

Tell that to NV then; they said it themselves, as have many reviewers.

Edit: link for you. http://www.elitebastards.com/page.php?pageid=4929&head=1&comments=1

"ATI X800 XT
An Act of Desperation!

-Built on last year's architecture and software
(R300 Shader Model 2.0 Architecture)"

So again, what I said was true. I said it's "pretty much two-year-old tech".
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Regs
Ackmed, with your logic, that would make you a 15-year-old fetus.

It's not my logic; as you can see, it's nVidia's. I said "pretty much 2-year-old tech", which means "just about", or something similar. Nitpick all you want; nVidia has said it, as have many other reviewers. And the fact is, there is hardly anything new about the core since the 9700 Pro, two years ago.

I notice that the people who don't have any positive input resort to personal attacks. Considering I have a 5-year-old daughter, I don't see how I can be 15 years old. But good try?
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Ackmed
It's not my logic; as you can see, it's nVidia's.

And, obviously, NVIDIA is going to be as objective as possible when evaluating their main competitor's new product. :roll:

There's nothing wrong with R420 being "old" -- or, as you might see it from another perspective, "mature". ATI doesn't have to deal with all the driver hassles of a brand-new architecture (somehow this gets twisted around into "well, NVIDIA has more 'headroom' with their drivers", which I still don't understand). They don't have to deal with brand-new hardware that may be buggy. They don't have to deal with implementing and debugging a new feature set of thus far dubious value.

NV40 fixed the glaring flaws of the NV30 (most notably, its SM2.0 performance), and added SM3.0 support (which, so far, is doing absolutely nothing for them) and an onboard video processor (also currently doing absolutely nothing). So unless SM3.0 starts showing some dramatic gains in the next few months (which I doubt, but I guess it could happen), or their video processor starts performing miracles, the differences are not as dramatic as you might think.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: Ackmed
It's not my logic; as you can see, it's nVidia's. I said "pretty much 2-year-old tech", which means "just about", or something similar. Nitpick all you want; nVidia has said it, as have many other reviewers. And the fact is, there is hardly anything new about the core since the 9700 Pro, two years ago.

OK, so Nvidia's wrong too.

(it's only marketing speak to try and put ATi's products in the worst possible light anyway)

What's your point? That you get to be wrong together? Well, yay I guess. Have a blast.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Matthias99
Originally posted by: Ackmed
It's not my logic; as you can see, it's nVidia's.

And, obviously, NVIDIA is going to be as objective as possible when evaluating their main competitor's new product. :roll:

There's nothing wrong with R420 being "old" -- or, as you might see it from another perspective, "mature". ATI doesn't have to deal with all the driver hassles of a brand-new architecture (somehow this gets twisted around into "well, NVIDIA has more 'headroom' with their drivers", which I still don't understand). They don't have to deal with brand-new hardware that may be buggy. They don't have to deal with implementing and debugging a new feature set of thus far dubious value.

NV40 fixed the glaring flaws of the NV30 (most notably, its SM2.0 performance), and added SM3.0 support (which, so far, is doing absolutely nothing for them) and an onboard video processor (also currently doing absolutely nothing). So unless SM3.0 starts showing some dramatic gains in the next few months (which I doubt, but I guess it could happen), or their video processor starts performing miracles, the differences are not as dramatic as you might think.

Nothing you said was news to me; I didn't say having old tech was bad.

The whole point was, someone asked what NV was going to do when the R500 came out. Then another said they would release the NV50.

What I said is that the R500 is (probably) going to be new tech through and through, and if it's the same kind of performance leap as from the R200 to the R300, it's going to be a landmark card. And if NV just does a refresh of some sort of the NV40 with the NV50, they could be in trouble, since a refresh isn't likely to be good enough, and since the NV40 was basically built from the ground up. But then, it's worked for ATi pretty well, so who knows.
 

413xram

Member
May 5, 2004
197
0
0
Does anybody in here ever learn a lesson? Why all the loyalty to a certain company? Both companies have paper-launched and both have lied. But here everyone is, flaming once again! Un-freaking-believable. :) I must be missing something here. Where is my paycheck from one of these companies???? LOFL
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Ackmed
Nothing you said was news to me; I didn't say having old tech was bad.

The whole point was, someone asked what NV was going to do when the R500 came out. Then another said they would release the NV50.

What I said is that the R500 is (probably) going to be new tech through and through, and if it's the same kind of performance leap as from the R200 to the R300, it's going to be a landmark card. And if NV just does a refresh of some sort of the NV40 with the NV50, they could be in trouble, since a refresh isn't likely to be good enough, and since the NV40 was basically built from the ground up. But then, it's worked for ATi pretty well, so who knows.

Sorry, I misread some of your quotes, taken out of context in other people's responses. And you brought up that ridiculous NVIDIA PR presentation as backing you up (which was not a good move, IMO :p).

I agree, although it's all speculation at this point. :beer:
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: SickBeast
Originally posted by: coldpower27
My predictions on NV50 and R500:
0.11 micron process minimum
600MHz core speed to start for both vendors, assuming ATI becomes as efficient as NV
400+ million transistors for NV50, 300+ million transistors for R500
NV Shader Model 3.0+; ATI Shader Model 3.0 support and 32-bit precision
24 pipelines minimum
10 vertex shader units

However, if both parts conform to Shader Model 4.0 specifications, we could have something like 32 total shader units on both cards.

Speculating is fun :D

If Shader Model 4.0 is part of the DX10 spec, you can expect the R500 to support it. Someone above mentioned that it is rumoured to support 128-bit precision, not 32.

Ahh, so now the graphics manufacturers are going to support features a full year before the API's release? So two full years before actual use, and three full years before wide use...
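
Worth untangling the precision terminology in this exchange: "32-bit" and "128-bit" usually describe the same thing counted two ways - bits per colour component versus bits per whole pixel. A one-liner to make it concrete:

```python
# Shader precision counted two ways: per component vs per pixel.
bits_per_component = 32  # FP32 per channel
components = 4           # R, G, B, A
print(bits_per_component * components)  # 128 bits per pixel
```

So a rumoured "128-bit" R500 and the "32-bit precision" prediction above are most likely describing the same spec.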
 

Johnbear007

Diamond Member
Jul 1, 2002
4,570
0
0
Originally posted by: apoppin
Originally posted by: Insomniak
Originally posted by: BFG10K
Something really has to be done about the temperatures as GPUs (and CPUs too I guess) are really getting out of hand. If this continues water cooling will become mandatory.



People are trying to beat Moore's law - that's the problem. Think about how quickly NV40 came after NV30 - really, about 12 months. That's halfway through the cycle.

Prescott cooks, but the A64 series isn't that bad. GPUs though, ATi and NV chips both run hot as the dickens.
What EVER happened to ATI's announcement that they were going to a LONGER product cycle of AT LEAST 18 months? . . . Looks like the graphics wars are heating up again . . . I guess they want us to spend $600 every year. :p

The 9800XT is the first of the really hot GPUs (I think); watercooling will probably become standard . . . eventually . . . that .09 micron process will probably just cram MORE transistors in . . .
. . . I bet nVidia's next GPU has at least 400 million transistors (if it follows its 'history') :Q



I have serious doubts that water cooling will become "standard" any time soon.
Sure, there are heat problems, but when people get worked up about them, it just reminds me of when so many people and sites said that we would NEVER reach 2GHz because of the heat, and that it was PHYSICALLY IMPOSSIBLE for a CD-ROM drive to read at more than 12x.

Water cooling is incredibly impractical for the normal end user. Most of us here buying these high-end video cards are part of a small market.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Johnbear007
Originally posted by: apoppin
Originally posted by: Insomniak
Originally posted by: BFG10K
Something really has to be done about the temperatures as GPUs (and CPUs too I guess) are really getting out of hand. If this continues water cooling will become mandatory.



People are trying to beat Moore's law - that's the problem. Think about how quickly NV40 came after NV30 - really, about 12 months. That's halfway through the cycle.

Prescott cooks, but the A64 series isn't that bad. GPUs though, ATi and NV chips both run hot as the dickens.
What EVER happened to ATI's announcement that they were going to a LONGER product cycle of AT LEAST 18 months? . . . Looks like the graphics wars are heating up again . . . I guess they want us to spend $600 every year. :p

The 9800XT is the first of the really hot GPUs (I think); watercooling will probably become standard . . . eventually . . . that .09 micron process will probably just cram MORE transistors in . . .
. . . I bet nVidia's next GPU has at least 400 million transistors (if it follows its 'history') :Q



I have serious doubts that water cooling will become "standard" any time soon.
Sure, there are heat problems, but when people get worked up about them, it just reminds me of when so many people and sites said that we would NEVER reach 2GHz because of the heat, and that it was PHYSICALLY IMPOSSIBLE for a CD-ROM drive to read at more than 12x.

Water cooling is incredibly impractical for the normal end user. Most of us here buying these high-end video cards are part of a small market.
"Eventually" does NOT = "soon" ;)

However, there IS a real need for better cooling solutions . . . liquid seems a "natural"; of course, a 'breakthrough' could come from anywhere.

I just stuck an Arctic Cooling VGA Silencer R3 on my 9800 Pro 256/256 > XT today. That sucker is a MONster . . . however, it DID drop my GPU temps by 10-15°C . . . not bad for air cooling and $11.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Since the features are supposedly identical, for comparison between the R300 and R420 (R300 / R420):
Core Clock: 325 / 520
Mem Clock: 620 / 1120
Process: 150nm / 130nm

The end result of these differences? Take a look: http://www.anandtech.com/video/showdoc.aspx?i=2044&p=11
The "same core" has scaled very well. By shrinking the process and upping the core/mem clocks (as well as other modifications), ATI has managed to stay on top or neck and neck for 2+ years.
Avoiding comparisons between GPUs and CPUs, things like SM3.0 seem to me to be like SSE3, though GPU features tend to be adopted slightly faster. Eventually it will be used in a lot of places, but it takes some time for developers to make use of the technology. Far Cry is a good example of people expecting too much from a patch. It'll probably be a good year or so until we see a properly implemented SM3.0 game, and when we do, it will no doubt shine on NVidia cards. In that time, I expect ATI to release a newer card with support for it.
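
Putting percentages on the scaling listed above (a quick sketch; it assumes the 256-bit memory bus both parts use, with the memory clocks being effective DDR rates):

```python
# Relative R300 -> R420 gains, plus raw memory bandwidth on a 256-bit bus.
r300 = {"core_mhz": 325, "mem_mhz": 620}   # 9700 Pro
r420 = {"core_mhz": 520, "mem_mhz": 1120}  # X800 XT

for key, label in (("core_mhz", "core clock"), ("mem_mhz", "mem clock")):
    print(f"{label}: +{(r420[key] / r300[key] - 1) * 100:.0f}%")

bus_bytes = 256 // 8  # a 256-bit bus moves 32 bytes per transfer
for name, c in (("R300", r300), ("R420", r420)):
    print(f"{name}: ~{bus_bytes * c['mem_mhz'] * 1e6 / 1e9:.1f} GB/s memory bandwidth")
```

That works out to roughly +60% core clock, +81% memory clock, and memory bandwidth going from about 19.8 GB/s to about 35.8 GB/s - a big part of why the "old" core kept pace.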

Ackmed, just because R500 is intended to be all-new doesn't mean it will be a fantastic success like R300 was. The FX line was "all new" and it was troubled from the start by delays and lacking performance. The original Radeons were all-new as well, and they weren't exactly competitive with their contemporaries.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
People are trying to beat Moore's law - that's the problem.
I don't think that has anything to do with it. The problem is that throwing more transistors and higher frequencies at the problem is nearing its limit, IMO. Also, while a smaller process can help, it also increases leakage and makes cooling harder because the die surface is smaller.

It's about time we dumped electricity and started using lasers instead.

Prescott cooks, but the A64 series isn't that bad.
That's because the 3800+ isn't running at 3.6GHz. If it were, it would be just as bad, if not worse.

Edit: grammar.
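
BFG10K's last point checks out against the usual dynamic-power rule of thumb, P ≈ C·V²·f: higher clocks generally demand higher voltage, and power scales with the square of that voltage. A back-of-the-envelope sketch (the 89W/1.50V baseline is the stock 3800+ rating; the 1.65V target is purely an assumed overvolt):

```python
# Dynamic power rule of thumb: P scales with f * V^2 (capacitance fixed).
base_f_ghz, base_v, base_w = 2.4, 1.50, 89.0  # stock Athlon 64 3800+
target_f_ghz, target_v = 3.6, 1.65            # assumed clock and overvolt

scaled_w = base_w * (target_f_ghz / base_f_ghz) * (target_v / base_v) ** 2
print(f"~{scaled_w:.0f}W")  # ~162W - well past Prescott territory
```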
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: chsh1ca


Ackmed, just because R500 is intended to be all new doesn't mean it will be a fantastic success like R300 was. The FX line was "all new" and it was troubled from the start by delays and lacking performance. The original Radeons were all new as well, and they weren't exactly competetive with their contemporaries.

I never said it would be fantastic. What I said was, "IF (very big if) ATi can get the same kind of performance leap".
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Genx87
Just think: if ATi hadn't had to cancel R400, NV may well be on its knees.........

Heh, you assume the R400 in the form ATI had it was working as expected. My guess is it wasn't performing as well as expected and thus got pushed back, and the R420, based off the R3.xx core, was used instead.

It could very well be that the things ATI was trying to do would not or could not be done in time to compete with the NV40. If they had released it, the chip could have been a complete dud, throwing ATI back to its pre-R3.xx days.

Or that the complexity of a true "next-gen" architecture, produced on their current mfg processes, would have resulted in a "leafblower"-type product for ATI, like the ill-fated NV GF FX 5800 Ultra.

So instead, they "extended" their existing architecture, tweaked it in places (since it was already a more advanced architecture at the time than NV's), and released it as a product refresh with higher overall performance. Nothing really wrong with that, and I'd be willing to bet they are achieving better manufacturing cost-efficiency, due to better yields, than NV right now. A 220-million-transistor GPU (more than the Itanium 2 CPU!) must be pretty difficult for NV to make at the moment.

Originally posted by: Genx87
Note: I believe the R3.xx chip isn't even an ATI design but a design from a company they bought.

ATI did buy out ArtX, a graphics chip design firm that was a spinoff from SGI and whose team had designed the chipset used in the N64. Given all of that, it's probably no surprise that ATI also did the chipset used in the GameCube; in fact, it may have already been partially in development at the time ATI bought ArtX, who knows.

I'm sure that later GPUs designed by ATI incorporated some of the technology that ArtX had; otherwise, why would ATI buy them out?

Originally posted by: Genx87
I wonder what Nvidia is gonna do next; they spent all this effort on the new 6800, and in less than a year ATI is gonna release this beast. Not like the 6800 is stomping on the ATI cards right now.

NV50 has been in the works since, I believe, the NV30 started its production cycle. Supposedly this is going to be a radical design. If ATI is going to bring out the big guns, they are going to want to do it against the NV50. As for when the R500 will be released? No earlier than next Spring.

I admit I don't really consider myself a "fanboy" either way; I've used both ATI and NV cards, usually for different purposes. But I'm really, really impressed with the NV40 arch, especially after the lackluster launch of the almost-useless FX series (useless compared to the existing GF4 Ti series on the games of the time). I'm curious what ATI will pull out of their sleeves with their "true next-gen" product release.

Btw, I doubt that NV would release yet another next-gen part right after the 6800, given the transistor count and the associated difficult yields of the current part. I would assume they will attempt to streamline production, move to a smaller/newer process, and ramp clock rates slightly for their next NV4x-refresh product. That would also allow them to sell an NV4x-based part into the low/mid-range segments, to compete more effectively against ATI in the mainstream market.

I'm going to make a prediction: in 6-8 months, we will all be able to buy a 9800 Pro-level graphics card, with 128MB or more of memory, DX9-capable, for under $100. For you enthusiasts who already own a 6800 or 9800 Pro, that won't make a hoot of difference, but for some of us on a budget, like myself, who just recently "upgraded" to a 9200, it will make a world of difference. Things like GF4 Ti4200 cards will become like GF4 MXs, and anything not DX9-capable will sell for under $50.

Now if only high-res 19" LCD prices would come down out of the stratosphere, I would be all set. :p
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Acanthus
Originally posted by: SickBeast
Originally posted by: coldpower27
My predictions on NV50 and R500:
0.11 micron process minimum
600MHz core speed to start for both vendors, assuming ATI becomes as efficient as NV
400+ million transistors for NV50, 300+ million transistors for R500
NV Shader Model 3.0+; ATI Shader Model 3.0 support and 32-bit precision
24 pipelines minimum
10 vertex shader units

However, if both parts conform to Shader Model 4.0 specifications, we could have something like 32 total shader units on both cards.

Speculating is fun :D

If Shader Model 4.0 is part of the DX10 spec, you can expect the R500 to support it. Someone above mentioned that it is rumoured to support 128-bit precision, not 32.

Ahh, so now the graphics manufacturers are going to support features a full year before the API's release? So two full years before actual use, and three full years before wide use...

While it's not exactly the same example in terms of timeframe, the NV4x series supports SM3.0 while DX9.0c is not out yet.
It would be nice to see a DX10 GPU out early, so developers have a pretty good idea of what they can do with it by the time the API is released with Longhorn. It would help the products mature and improve.
But I agree - I don't believe that this will happen with NV50 and R500.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
It's about time we dumped electricity and started using lasers instead.

Too bad we only read about it working in sci-fi books. Funny thing is, in the early 1900s people thought going over 40 MPH in a vehicle was impossible. Why? Because of the lack of oxygen. Now that's funny.