ATI R520


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Some more speculation:

DirectX.Update: Graphics Accelerators: Half a Step Forward

So we get the following most likely picture in the long run:

Code name: R520
Probable name of the product line: RADEON 900 or RADEON 10000
Process 90nm (already known)
Probable time of the announcement: May 2005
Probable number of pipelines 16+8 or 24+8 or 16+8 universal
Shader Model 3.0
Judging from the core complexity, we'll hardly deal with frequencies higher than 600-650 MHz
MMU, storing textures and rendering in PCI-Express memory when necessary
Likely features: fetching and interpolating FP textures, FP blending
512 MB of memory in desktop solutions (already known), 256 bit GDDR3 memory, typical operating frequency (600-750MHz)*2
Rather high price: the first 512 MB video cards may cost over $600
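As a rough sanity check on the rumored memory specs above (not from the article itself), peak theoretical bandwidth follows directly from bus width times effective DDR clock. A minimal sketch, assuming the quoted 256-bit GDDR3 at 600-750 MHz base clock:

```python
def peak_bandwidth_gbs(bus_width_bits, base_clock_mhz, data_rate=2):
    """Theoretical peak memory bandwidth in GB/s.

    data_rate=2 models GDDR3's double data rate, i.e. the
    "(600-750MHz)*2" notation in the spec list above.
    """
    bytes_per_transfer = bus_width_bits / 8          # 256-bit bus -> 32 bytes
    effective_clock_hz = base_clock_mhz * 1e6 * data_rate
    return bytes_per_transfer * effective_clock_hz / 1e9

# Rumored R520 range: 256-bit GDDR3, 600-750 MHz base (x2 effective)
low = peak_bandwidth_gbs(256, 600)   # -> 38.4 GB/s
high = peak_bandwidth_gbs(256, 750)  # -> 48.0 GB/s
```

So the rumor implies roughly 38-48 GB/s of raw bandwidth, comfortably above an X850 XT's ~37 GB/s.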


Why Provide an AGP solution to the market?

"According to my experience, this evolutionary platform replacement used to require and still requires at least 1.5 or even two years before the volume of accelerators bought for PCI-Express platforms equals the volume of AGP accelerators. Especially as there are still decent processors and cost-efficient memory for the old platform, which continues to encourage buying them for new PCs.

There are other fine points as well: many users have no motivation to replace typical 2.X GHz CPUs (they are still quite sufficient for many users), and there is a more reasonable way to upgrade your accelerator. It can be upgraded to a top AGP solution: at the same cost it will provide a larger gain in games than replacing the entire platform and buying a new middle-end solution for PCI-Express." - digitlife

Also, there are more AGP users than PCIe users who will want to upgrade their graphics card. Certainly anyone with a P4 3.0GHz/A64 3000+ is better off getting an R520 than an A64 4500+ and a 6800GT, and it would be unwise for a company to disregard those users, especially given that they already invested money in the RIALTO chip.

 

ohnnyj

Golden Member
Dec 17, 2004
1,239
0
0
Originally posted by: RussianSensation
Some more speculation:

DirectX.Update: Graphics Accelerators: Half a Step Forward

So we get the following most likely picture in the long run:

Code name: R520
Probable name of the product line: RADEON 900 or RADEON 10000
Process 90nm (already known)
Probable time of the announcement: May 2005
Probable number of pipelines 16+8 or 24+8 or 16+8 universal
Shader Model 3.0
Judging from the core complexity, we'll hardly deal with frequencies higher than 600-650 MHz
MMU, storing textures and rendering in PCI-Express memory when necessary
Likely features: fetching and interpolating FP textures, FP blending
512 MB of memory in desktop solutions (already known), 256 bit GDDR3 memory, typical operating frequency (600-750MHz)*2
Rather high price: the first 512 MB video cards may cost over $600


Why Provide an AGP solution to the market?

"According to my experience, this evolutionary platform replacement used to require and still requires at least 1.5 or even two years before the volume of accelerators bought for PCI-Express platforms equals the volume of AGP accelerators. Especially as there are still decent processors and cost-efficient memory for the old platform, which continues to encourage buying them for new PCs.

There are other fine points as well: many users have no motivation to replace typical 2.X GHz CPUs (they are still quite sufficient for many users), and there is a more reasonable way to upgrade your accelerator. It can be upgraded to a top AGP solution: at the same cost it will provide a larger gain in games than replacing the entire platform and buying a new middle-end solution for PCI-Express." - digitlife

Also, there are more AGP users than PCIe users who will want to upgrade their graphics card. Certainly anyone with a P4 3.0GHz/A64 3000+ is better off getting an R520 than an A64 4500+ and a 6800GT, and it would be unwise for a company to disregard those users, especially given that they already invested money in the RIALTO chip.

But if these chips are as fast as they say they are (I've heard upwards of 3x current gen), then all CPUs will be the bottleneck. Even SLI can push some games pretty hard, and if it is paired with a slow processor the benefits become negligible. A top-of-the-line R520 with an A64 3000+ will most likely not see its full potential unless you start cranking the res and AA and AF, which I suppose is one of the greatest benefits of these upcoming cards. Imagine 2048x1536 8xAA 16xAF at 100+ fps in most current games; now that would have me making an upgrade for sure. And then putting two, three, or more in one system, wow, 300+ fps or so with the graphics as high as they will go :).
 

ribbon13

Diamond Member
Feb 1, 2005
9,343
0
0
1920x1080 8xAA 16xAF :thumbsup: That on a 42"+ plasma/lcd that has 1080p native res. That would be gaming goodness.

75-80 fps is pretty much the point where the eye's peripheral motion sensitivity can't tell the difference anymore. Beyond that isn't necessary.

Until we see chipsets with 40+ lanes per chip, I doubt we'll see anything more than 2x real x16 PCIe for a while. That would also be trace space hell for a motherboard engineer.
 

ohnnyj

Golden Member
Dec 17, 2004
1,239
0
0
Originally posted by: ribbon13
1920x1080 8xAA 16xAF :thumbsup: That on a 42"+ plasma/lcd that has 1080p native res. That would be gaming goodness.

75-80 fps is pretty much the point where the eye's peripheral motion sensitivity can't tell the difference anymore. Beyond that isn't necessary.

Until we see chipsets with 40+ lanes per chip, I doubt we'll see anything more than 2x real x16 PCIe for a while. That would also be trace space hell for a motherboard engineer.

I guess they could add some more layers.
 

ribbon13

Diamond Member
Feb 1, 2005
9,343
0
0
Originally posted by: Aganack1
it would be very nice if it also used hdmi connectors

That would mean two backplane spaces... because you would still need HD15. DVI-I carries analog, so adapters are easy and cheap; not so with HDMI. Also, there aren't that many HDMI devices in the wild. What about dual-DVI monitors for massive resolution? I'm unaware of any dual-HDMI setups.
 

Tab

Lifer
Sep 15, 2002
12,145
0
76
Originally posted by: ribbon13
Originally posted by: Aganack1
it would be very nice if it also used hdmi connectors

That would mean two backplane spaces... because you would still need HD15. DVI-I carries analog, so adapters are easy and cheap; not so with HDMI. Also, there aren't that many HDMI devices in the wild. What about dual-DVI monitors for massive resolution? I'm unaware of any dual-HDMI setups.

If it comes with an HDMI -> VGA/DVI-


Sure...
 

ribbon13

Diamond Member
Feb 1, 2005
9,343
0
0
HDMI-to-VGA adapters would be prohibitively expensive. That would actually require a power supply, a DSP, and DACs. DVI-I still carries an analog video signal; HDMI has none.
 

magomago

Lifer
Sep 28, 2002
10,973
14
76
Originally posted by: IamTHEsnake
I don't.

One more year max; then it will slowly drop to low-end specs. Once these cards truly start taking advantage of the x16 bus, the x8 AGP bus will not suffice.

Just accept it and don't argue that it will be around forever simply because you own an AGP card.
Show me where the 8x bus isn't enough.
 

Googer

Lifer
Nov 11, 2004
12,576
7
81
Originally posted by: Aganack1
it would be very nice if it also used hdmi connectors

HDMI blows! It's got RIAA-like copy protection embedded in it. LONG LIVE DVI!
 

Googer

Lifer
Nov 11, 2004
12,576
7
81
Originally posted by: ribbon13
Originally posted by: Aganack1
it would be very nice if it also used hdmi connectors

That would mean two backplane spaces... because you would still need HD15. DVI-I carries analog, so adapters are easy and cheap; not so with HDMI. Also, there aren't that many HDMI devices in the wild. What about dual-DVI monitors for massive resolution? I'm unaware of any dual-HDMI setups.

My parents' new Panasonic HDTV CRT (52-inch, I think) has DVI on the back. It is appearing on most new devices these days.
 

XBoxLPU

Diamond Member
Aug 21, 2001
4,249
1
0
Originally posted by: IamTHEsnake
One more year max; then it will slowly drop to low-end specs. Once these cards truly start taking advantage of the x16 bus, the x8 AGP bus will not suffice.

:confused:

The x16 PCI Express bus and the 8x AGP bus are two totally different things.
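For what it's worth, the raw numbers behind the x16-vs-8x comparison in this exchange are easy to sketch. None of this is from the thread; the constants are the commonly quoted theoretical peaks for AGP and first-generation PCI Express:

```python
# AGP runs a 32-bit bus at ~66 MHz; 8x strobes data eight times per clock.
AGP_BASE_CLOCK_MHZ = 66.6
AGP_BUS_BYTES = 4

def agp_bandwidth_mbs(multiplier):
    """Peak AGP bandwidth in MB/s for a 1x/2x/4x/8x multiplier."""
    return AGP_BASE_CLOCK_MHZ * AGP_BUS_BYTES * multiplier

# PCIe 1.x: 2.5 GT/s per lane with 8b/10b encoding -> 250 MB/s per lane,
# per direction (PCIe is full duplex; AGP is not).
PCIE1_LANE_MBS = 250

def pcie_bandwidth_mbs(lanes):
    """Peak PCIe 1.x bandwidth in MB/s, per direction, for a given lane count."""
    return PCIE1_LANE_MBS * lanes

agp8x = agp_bandwidth_mbs(8)       # ~2131 MB/s (about 2.1 GB/s)
pcie_x16 = pcie_bandwidth_mbs(16)  # 4000 MB/s each direction
```

So x16 PCIe is roughly twice AGP 8x each way on paper, though as the thread notes, games of this era rarely saturated even the AGP bus.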
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: ohnnyj

But if these chips are as fast as they say they are (I've heard upwards of 3x current gen), then all CPUs will be the bottleneck. Even SLI can push some games pretty hard, and if it is paired with a slow processor the benefits become negligible. A top-of-the-line R520 with an A64 3000+ will most likely not see its full potential unless you start cranking the res and AA and AF, which I suppose is one of the greatest benefits of these upcoming cards. Imagine 2048x1536 8xAA 16xAF at 100+ fps in most current games; now that would have me making an upgrade for sure. And then putting two, three, or more in one system, wow, 300+ fps or so with the graphics as high as they will go :).

That's what dual-core CPUs are for ;) .

The new Xbox is matching three ~3 GHz PowerPC CPUs with one of these GPUs, on a console no less!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Looks to be an exciting new product!

I like how they say "Not much is known on this topic" about ATI AMR, but follow it with "ATI's technology is superior to NVIDIA's SLI in several ways."

I guess something must be known? :roll: I don't think much of that site, but if it's Super Tiling- it must be better! ;)
 

MetalStorm

Member
Dec 22, 2004
148
0
0
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment, of course I wouldn't say no to NV40 or R480, but seriously, with 24 pipes and insane clocks what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: MetalStorm
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment, of course I wouldn't say no to NV40 or R480, but seriously, with 24 pipes and insane clocks what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.

Next generation cards only seem like overkill until new games come around that bring them to their knees. No current generation video cards will be able to run Unreal 3.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
Originally posted by: ZimZum
Originally posted by: MetalStorm
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment, of course I wouldn't say no to NV40 or R480, but seriously, with 24 pipes and insane clocks what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.

Next generation cards only seem like overkill until new games come around that bring them to their knees. No current generation video cards will be able to run Unreal 3.

Yes they will.
 

JBT

Lifer
Nov 28, 2001
12,094
1
81
Originally posted by: ZimZum
Originally posted by: MetalStorm
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment, of course I wouldn't say no to NV40 or R480, but seriously, with 24 pipes and insane clocks what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.

Next generation cards only seem like overkill until new games come around that bring them to their knees. No current generation video cards will be able to run Unreal 3.

Exactly. Personally I can't even run TODAY's games at acceptable frame rates. Like I said in my earlier post, my 6800GT simply doesn't have enough horsepower to run 2xAA at 1920x1200, at least with vsync on. With vsync off it's not bad, but then the game just looks ugly with tearing all over the place. So I guess I will just have to deal with lower resolutions on this thing until these newer cards come out.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Rollo
Looks to be an exciting new product!

I like how they say "Not much is known on this topic" about ATI AMR, but follow it with "ATI's technology is superior to NVIDIA's SLI in several ways."

I guess something must be known? :roll: I don't think much of that site, but if it's Super Tiling- it must be better! ;)

Rollo, do you remember the 4xR300 GPU card for workstations or render farms or something?
Well, the R300 supports multiple GPUs, same as the R4xx and the R5xx will.
Now, because the implementation is the same from the R300 to the R500 (hey, you said yourself they re-use cores; maybe it's not always a bad thing), we can assume that the R300 multi-GPU solution is pretty similar to what the ATi multi-CARD solution will be, and hence any features present on R300 multi-GPU cards/systems will all be present when ATi releases multi-card solutions based on R4xx and R5xx.

Consequently, it is safe to say they can make comments like "super tiling is better than nVidia's method" (assuming it is), because they DO know something. There is some feature which does make it better; I can't remember what, though it may be something to do with AA.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: ohnnyj

But if these chips are as fast as they say they are (I've heard upwards of 3x current gen) then all CPUs will be the bottleneck. Even SLI can push some games pretty hard where if it is paired w/a slow processor the benefits become negligible. A top of the line R520 w/a A64 3000+ will most likely not be seeing its full potential unless you start cranking the res and AA and AF which I suppose is one of the greatest benefits of these upcoming cards. Imagine 2048x1536 8xAA 16xAF at 100+ fps in most current games, now that would have me making an upgrade for sure. And then putting two, three, or more in one system, wow, 300+ fps or so w/the graphics as high as they will go :).

That's a good point. What I was referring to is that if it boils down to the cheapest upgrade, it's almost always better to get a new graphics card rather than a new CPU for gaming. So there are a lot of AGP users who have decent CPUs and could still benefit more from a video card upgrade, even after taking into account some of the CPU bottlenecks. Imagine: they could still play at 1600x1200 6xAA/16xAF and get smooth frames. A CPU upgrade will not give you that.

I also think that no matter how powerful every new generation of cards is, it's only a matter of time before the games catch up to the hardware. Sure, R520 might be fast when it comes out, but soon games based off the HL2 and Doom3 engines will come along, and you'll start to see the current generation struggle while these new cards play the latest games at 1280x1024 and 1600x1200. I think you guys will be able to play at higher res than 1600x1200 with older games, but certainly not new games. Look at Call of Duty, based off the Quake 3 engine: Quake 3 gets 500+ frames, but Call of Duty is like 80 frames at 1600x1200 4xAA/8xAF. I am sure by this fall more advanced games will come out that will push even R520; otherwise, why would consumers upgrade?
 

Cheesetogo

Diamond Member
Jan 26, 2005
3,824
10
81
Originally posted by: JBT
Originally posted by: ZimZum
Originally posted by: MetalStorm
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment, of course I wouldn't say no to NV40 or R480, but seriously, with 24 pipes and insane clocks what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.

Next generation cards only seem like overkill until new games come around that bring them to their knees. No current generation video cards will be able to run Unreal 3.

Exactly. Personally I can't even run TODAY's games at acceptable frame rates. Like I said in my earlier post, my 6800GT simply doesn't have enough horsepower to run 2xAA at 1920x1200, at least with vsync on. With vsync off it's not bad, but then the game just looks ugly with tearing all over the place. So I guess I will just have to deal with lower resolutions on this thing until these newer cards come out.


I agree with you that current cards still don't have enough power. 45 fps in Doom 3 at 1600x1200 with 4xAA and 8xAF is way too slow for top-of-the-line cards.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
This product is too expensive and has too much advanced technology.

Why DXNext support when we barely have DX9C technology??

You are paying for technology that will be very slow when DXNext games are released!
And there probably won't be any visual difference from DX9C to DXNext... and anything that can be done under DXNext can be done under DX9C with more passes.

I'd suggest sticking with older technology, because when this card can use its features, it will be too slow.

/end play on modern day braindead ATI fanboy rant on superior technology