ATI R520


housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Cheesetogo
Originally posted by: JBT
Originally posted by: ZimZum
Originally posted by: MetalStorm
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment; of course I wouldn't say no to an NV40 or R480, but seriously, with 24 pipes and insane clocks, what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.

Next generation cards only seem like overkill until new games come around that bring them to their knees. No current generation video cards will be able to run Unreal 3.

Exactly. Personally I can't even run TODAY's games at acceptable frame rates. Like I said in my earlier post, my 6800GT simply doesn't have enough horsepower to run 2xAA at 1920x1200, at least with vsync on. With vsync off it's not bad, but then the game just looks ugly with tearing all over the place. So I guess I will just have to deal with lower resolutions on this thing until these newer cards come out.


I agree with you that current cards still don't have enough power. 45fps in Doom 3 at 1600x1200 with 4xAA and 8xAF is way too slow for top-of-the-line cards.

You don't need next-gen cards.

Nvidia has you covered.

Pop in another 6800GT.
Can't complain when you can get next-gen performance... today, with SLI.
 

MetalStorm

Member
Dec 22, 2004
148
0
0
Originally posted by: Cheesetogo
I agree with you that current cards still don't have enough power. 45fps in Doom 3 at 1600x1200 with 4xAA and 8xAF is way too slow for top-of-the-line cards.

Oh yeah, those are abysmal frame rates, but how many people actually play at 1600x1200? Then again, I guess the people with deep enough pockets to buy a monitor capable of such a resolution will have enough for the top card...
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: MetalStorm
Originally posted by: Cheesetogo
I agree with you that current cards still don't have enough power. 45fps in Doom 3 at 1600x1200 with 4xAA and 8xAF is way too slow for top-of-the-line cards.

Oh yeah, those are abysmal frame rates, but how many people actually play at 1600x1200? Then again, I guess the people with deep enough pockets to buy a monitor capable of such a resolution will have enough for the top card...

You can easily get a 19" monitor for a couple of hundred bux that will do 1600x1200.
And monitors are the one thing worth splashing out on.
I have a 19" CRT I have been using for 4 or 5 years, it runs 1600x1200 @ 85Hz. I also have a 9800 non-pro.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Those saying "you'll need to run at 2048x1536 8xAA 16xAF to take advantage of the R520!!!ONE!!" are wrong. I highly doubt a 9800XT (last gen's flagship) can even run Doom 3 at 1280x1024 2xAA (or higher) with 60+ FPS, which would be the ideal setting for me. My 9500 PRO is dwarfed by Doom 3 even at 640x480 and medium settings; I can manage maybe 35 FPS, and 60 FPS is the bare minimum. What if a game is optimized for nVidia cards, like Doom 3? Then I'll probably need an R520 to play next year's nVidia-optimized game at decent settings
 

Cheesetogo

Diamond Member
Jan 26, 2005
3,824
10
81
Originally posted by: housecat
Originally posted by: Cheesetogo
Originally posted by: JBT
Originally posted by: ZimZum
Originally posted by: MetalStorm
So what's wrong with the current generation of cards?

I'm quite happy with my 9800pro at the moment; of course I wouldn't say no to an NV40 or R480, but seriously, with 24 pipes and insane clocks, what game are they actually making these cards for???

The reason my 9800 is fine for me at the moment is that I can only run games at a max of 1024x768, so even with AA and AF, the games are still smooth as. So I doubt even I would be able to tell the difference in frame rates, and I HATE anything below 60fps.

Next generation cards only seem like overkill until new games come around that bring them to their knees. No current generation video cards will be able to run Unreal 3.

Exactly. Personally I can't even run TODAY's games at acceptable frame rates. Like I said in my earlier post, my 6800GT simply doesn't have enough horsepower to run 2xAA at 1920x1200, at least with vsync on. With vsync off it's not bad, but then the game just looks ugly with tearing all over the place. So I guess I will just have to deal with lower resolutions on this thing until these newer cards come out.


I agree with you that current cards still don't have enough power. 45fps in Doom 3 at 1600x1200 with 4xAA and 8xAF is way too slow for top-of-the-line cards.

You don't need next-gen cards.

Nvidia has you covered.

Pop in another 6800GT.
Can't complain when you can get next-gen performance... today, with SLI.


OK, let's see: $800+ for two 6800GTs, or $500-600 for one next-gen card that's faster than two 6800GTs.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: housecat
This product is too expensive and has too much advanced technology.

Originally posted by: housecat
Who in the hell is against superior/advanced technology?

Oh wait. I'll answer that.. ATI devotees in this case.


Care to make up your mind which way you stand on the subject?



Originally posted by: housecat
/end play on modern day braindead ATI fanboy rant on superior technology

no comment necessary
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: MetalStorm
Originally posted by: xtknight
Then I'll probably need an R520 to play next year's nVidia-optimized game at decent settings

Get a card made by nVidia perhaps?

The problem presents itself when I want to play Doom 3 AND Half-Life 2, when one performs better on a certain card (ATI = HL2, nVidia = Doom 3). Obviously I have to choose something, and that something had better be good at playing a game optimized for the other vendor's card.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: silverpig
ATI's SLI is also not limited to two video cards. There will be one master card which presents the rendered data to the screen and coordinates the slaves, which transfer their rendered data into the master's framebuffer. A physical PCB to link the video cards is not required; everything is done via the PCI-Express bus.

Imagine 4 of those going at it!

:)

Imagine a motherboard that breaks the PCI-E spec just to use 4 cards!
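
Purely as a mental model of the master/slave scheme silverpig describes, here's a toy sketch (my own invention, not ATI's actual implementation; the strip-split and every name in it are assumptions):

```cpp
// Toy model: each "slave" GPU renders a horizontal strip of the frame and
// copies it into the master's framebuffer (standing in for the PCI-E
// transfer); the master then presents the composited frame.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Strip {
    int y0, y1;                    // scanline range this GPU rendered
    std::vector<unsigned> pixels;  // its share of the frame
};

// Stand-in for a slave GPU rendering its share of the scanlines.
Strip render_strip(int gpu, int y0, int y1, int width) {
    return Strip{y0, y1, std::vector<unsigned>((y1 - y0) * width, gpu)};
}

int main() {
    const int width = 1600, height = 1200, gpus = 4;
    std::vector<unsigned> framebuffer(width * height);  // master's framebuffer
    for (int g = 0; g < gpus; ++g) {
        const int y0 = g * height / gpus, y1 = (g + 1) * height / gpus;
        Strip s = render_strip(g, y0, y1, width);       // slave renders
        std::copy(s.pixels.begin(), s.pixels.end(),     // "PCI-E transfer"
                  framebuffer.begin() + y0 * width);
        std::printf("GPU %d delivered rows %d-%d\n", g, y0, y1 - 1);
    }
    // The master would present the composited frame here.
}
```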
 
Jun 14, 2003
10,442
0
0
Originally posted by: silverpig
ATI's SLI is also not limited to two video cards. There will be one master card which presents the rendered data to the screen and coordinates the slaves, which transfer their rendered data into the master's framebuffer. A physical PCB to link the video cards is not required; everything is done via the PCI-Express bus.

Imagine 4 of those going at it!

:)


my electricity bill NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!

Nah, I suppose with a good watercooling setup you could just hook your heating up to it and do away with the central heating in your home. 4 R520s should be enough to warm a house in the winter.
 

ohnnyj

Golden Member
Dec 17, 2004
1,239
0
0
Originally posted by: jiffylube1024
Originally posted by: ohnnyj

But if these chips are as fast as they say they are (I've heard upwards of 3x current gen), then all CPUs will be the bottleneck. Even SLI can push some games pretty hard, and if it's paired with a slow processor the benefits become negligible. A top-of-the-line R520 with an A64 3000+ will most likely not be seeing its full potential unless you start cranking the res and AA and AF, which I suppose is one of the greatest benefits of these upcoming cards. Imagine 2048x1536 8xAA 16xAF at 100+ fps in most current games; now that would have me making an upgrade for sure. And then putting two, three, or more in one system, wow, 300+ fps or so with the graphics as high as they will go :).

That's what dualcore CPU's are for ;) .

The new Xbox is matching three ~3 GHz Power PC CPU's with one of these GPU's, on a console no less!

But developers have to develop games that take advantage of dual/tri/quad/quint/etc. cores. And if the AnandTech article on multi-core gaming is any indication, it isn't exactly easy to implement. It may work better on the console front, given that developers know everyone using that console will benefit from coding to multiple cores. However, dual core won't trickle into the PC market until later this year and won't be mainstream for a while. So many developers may not see it as worth the higher, longer development costs to code for multiple cores.

In the above-mentioned article, there's a quote from Tim Sweeney of Unreal fame:

"You can expect games to take advantage of multi-core pretty thoroughly in late 2006 as games and engines also targeting next-generation consoles start making their way onto the PC.

Writing multithreaded software is very hard; it's about as unnatural to support multithreading in C++ as it was to write object-oriented software in assembly language. The whole industry is starting to do it now, but it's pretty clear that a new programming model is needed if we're going to scale to ever more parallel architectures. I have been doing a lot of R&D along these lines, but it's going slowly."

He's effectively saying PC games won't be developed to take thorough advantage of multiple cores until late 2006, and by then the R520 will be outdated. Plus, as I've said, development is not exactly a walk in the park.
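
To make "very hard" concrete, here's a minimal sketch of the most basic multithreading bug, a data race on a shared counter (my own toy example using standard C++ threads, nothing from the article):

```cpp
// Two threads bump a shared counter. Without the lock the increments race
// and the final count is unpredictable; with it, the result is always 200000.
#include <iostream>
#include <mutex>
#include <thread>

int counter = 0;          // shared state
std::mutex counter_lock;  // remove the guard below to watch the race

void bump() {
    for (int i = 0; i < 100000; ++i) {
        std::lock_guard<std::mutex> guard(counter_lock);
        ++counter;        // read-modify-write: unsafe without the guard
    }
}

int main() {
    std::thread a(bump), b(bump);
    a.join();
    b.join();
    std::cout << counter << '\n';  // 200000 only because access is serialized
}
```

And that's the trivial case; real game code has to split up work that actually depends on shared state, which is where Sweeney's "new programming model" comment comes in.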
 

Cheesetogo

Diamond Member
Jan 26, 2005
3,824
10
81
It would probably be a little better. But it's not really fair to compare that way, 2 vs. 1. How about two R520s? :D
 

MetalStorm

Member
Dec 22, 2004
148
0
0
Well I'm sure nVidia aren't just going to sit around and watch, maybe eating a few donuts.
What information do we have on their next card? I'm sure it will be an even match for the R520 whatever it is.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: ohnnyj
Originally posted by: jiffylube1024
Originally posted by: ohnnyj

But if these chips are as fast as they say they are (I've heard upwards of 3x current gen), then all CPUs will be the bottleneck. Even SLI can push some games pretty hard, and if it's paired with a slow processor the benefits become negligible. A top-of-the-line R520 with an A64 3000+ will most likely not be seeing its full potential unless you start cranking the res and AA and AF, which I suppose is one of the greatest benefits of these upcoming cards. Imagine 2048x1536 8xAA 16xAF at 100+ fps in most current games; now that would have me making an upgrade for sure. And then putting two, three, or more in one system, wow, 300+ fps or so with the graphics as high as they will go :).

That's what dualcore CPU's are for ;) .

The new Xbox is matching three ~3 GHz Power PC CPU's with one of these GPU's, on a console no less!

But developers have to develop games that take advantage of dual/tri/quad/quint/etc. cores. And if the AnandTech article on multi-core gaming is any indication, it isn't exactly easy to implement. It may work better on the console front, given that developers know everyone using that console will benefit from coding to multiple cores. However, dual core won't trickle into the PC market until later this year and won't be mainstream for a while. So many developers may not see it as worth the higher, longer development costs to code for multiple cores.

In the above-mentioned article, there's a quote from Tim Sweeney of Unreal fame:

"You can expect games to take advantage of multi-core pretty thoroughly in late 2006 as games and engines also targeting next-generation consoles start making their way onto the PC.

Writing multithreaded software is very hard; it's about as unnatural to support multithreading in C++ as it was to write object-oriented software in assembly language. The whole industry is starting to do it now, but it's pretty clear that a new programming model is needed if we're going to scale to ever more parallel architectures. I have been doing a lot of R&D along these lines, but it's going slowly."

He's effectively saying PC games won't be developed to take thorough advantage of multiple cores until late 2006, and by then the R520 will be outdated. Plus, as I've said, development is not exactly a walk in the park.

Not a walk in the park?
Understatement of the century.
Not only do they have to code for multiple cores with the next-gen consoles, they also have to do in-order programming, which will be a pain compared to out-of-order (on PCs).
AMD don't see dual-core CPUs being essential to gamers; anyone who has seen their recent roadmap will know this.
Their FX line will stay single core while the "normal" desktop chips and Opterons move to dual cores. The FX is a gamer's chip, and AMD don't see dual core as necessary for gamers in the near future, apparently.
CPUs will definitely bottleneck us until 2006, and with ATi releasing their dual-card solution on Intel first, the Intel CPUs will definitely be the bottleneck (due to poorer gaming performance).
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: housecat
This product is too expensive and has too much advanced technology.

Yes, damn them for their advanced technology!!! Curses!

Why DXNext support when we barely have DX9C technology??

You are paying for technology that will be very slow when DXNext games are released!
And there probably won't be any visual difference from DX9C to DXNext... and anything that can be done under DXNext can be done under DX9C with more passes.

I'd suggest sticking with older technology, because when this card can use its features, it will be too slow.

/end play on modern day braindead ATI fanboy rant on superior technology

Says the guy who has two GeForce 6800GTs!


I agree that R520 is not for 99.99% of the consumer public out there (I certainly won't be getting one of these, not at >$500 and quite possibly >$600).

However, there is a market for these cards; it's called a niche (exactly the one your dual 6800GTs occupy).

Also, how exactly can you shun multi-GPU-capable next-gen cards? Is it buyer's remorse from the 6800s, bias against ATI, or what? How can you not like the possibility of having 2-4 times your current performance with multiple GPUs?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Nobody said programming for dual/multi core is going to be easy. By all accounts it's going to be a serious bitch.

However, that's where everyone is going. The Xbox 2 has three PPC CPUs in there, and Cell is essentially a 1 + 8 CPU design (basically 9 cores, although the 8 'slave' CPUs only have 256KB cache each).
 

IamTHEsnake

Senior member
Feb 4, 2004
334
0
0
I keep hearing about the hassle of developing software for multiple cores, whether it be for GPUs or CPUs. What makes it so hard?

BTW Rollo, enjoy it while it lasts; $1000 will get you the top spot only until the next gen of cards comes out. It's your choice and that's fine, but I'm stating what I think.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: IamTHEsnake
I keep hearing about the hassle of developing software for multiple cores, whether it be for GPUs or CPUs. What makes it so hard?

BTW Rollo, enjoy it while it lasts; $1000 will get you the top spot only until the next gen of cards comes out. It's your choice and that's fine, but I'm stating what I think.

It's not the stating I have a problem with, it's what you think I have a problem with.

SLI offers far more "bang per buck" than any other performance upgrade. Check out my benches if you don't believe me.

And BTW, if I don't buy the next-gen cards it's not like I won't be "enjoying it". I'll still be gaming at 16x12 with 4xAA/8xAF minimum. To be honest, I don't know how much next-gen cards can offer me, as 19x14 is the max on my monitor.

Until some games come out that stress my 6800GT SLIs, I don't know how much I can really gain.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Rollo

SLI offers far more "bang per buck" than any other performance upgrade. Check out my benches if you don't believe me.

That is questionable:

Case of 6600GT:
Buy 1 6600GT = $200
Buy 2nd 6600GT in 1 year's time = say $100 (estimate)
Cost of an SLI motherboard over a regular S939 board = +$50 on average
Total cost: $350

Today you can get one 6800GT for $340, or $375 for PCIe. Given the time value of money and the opportunity cost of having to play with one 6600GT at slower frame rates, 2 6600GTs are a bad purchase, period, at any point in time (also, 2 6600GTs are often slower than one 6800GT at higher resolutions that require AA, since the dual setup is still 128MB vs. 256MB).

Case of 6800nu:
Buy 1 6800nu = $290
Buy 2nd 6800nu = $150
Additional cost of SLI board = +$50
Total cost: $490

A quick glance at a 6800NU SLI review shows that a 6800GT at Ultra speeds is on par with 2 6800NUs (loses some, wins some). A case can be made that one 6800Ultra, X850XT, or X850XT PE is actually better to buy today, because it'll cost the same and perform as fast, and you'll enjoy 2 years of fast gameplay vs. 1 year with one 6800nu and a 2nd year merely equal to the top-of-the-line cards above.

Case of 2 6800GTs:
Buy 1 6800GT = $375 (PCIe)
Buy 2nd 6800GT in 1 year's time = $200 (estimate)
SLI board = +$50
Total: $625
This seems to be the best option, since the performance increase over today's top-of-the-line cards is great. However, consider a user who buys a 6800GT for $375 today. He can then sell it for $150 in 1 year, making his total cost of ownership $225. That leaves $400 to spend on a new-generation videocard. Chances are it'll be just as fast as (or close to) 2 6800GTs and offer other new features. I will agree that this alternative is fairly reasonable if the majority of future games are supported by Nvidia's SLI driver.

Case of 2 6800Ultras:
Buy 1 6800Ultra = $490
Buy 2nd 6800Ultra in 1 year's time = $250
SLI board = +$50
Total cost: $790
Again, you can simply buy a 6800Ultra or X850XT for $500 and then sell it for $200. That leaves $790 - $300 = $490 for a new videocard purchase (theoretically, for spending to be equal). So for $499 you'll be able to buy an R520.

I think it's unreasonable to assume that 2 6800GTs/6800Ultras are just as fast as the R520 (they could be faster or slower, but we don't know for sure). We do know that not all games support SLI; the R520 will be faster in those games. SLI requires a more expensive PSU, and as Anand noted in the SLI review, stability issues are common. Also, estimated prices in 1 year's time might be higher (or lower). If prices turn out higher, the R520 is faster than 2x X800XT, and a user can sell his old videocard, then SLI is pretty much pointless.
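
If anyone wants to check the arithmetic, here's a quick sketch that plugs in the estimates above (every price is a guess from this post, not an official figure):

```cpp
// Net outlay for each upgrade path, using the estimated prices above.
#include <cstdio>

struct Path {
    const char* name;
    double first_card;   // price of the first card today
    double second_card;  // estimated price of the 2nd card in a year (0 = none)
    double sli_board;    // SLI board premium over a regular S939 board
    double resale;       // resale value of the first card when upgrading
};

int main() {
    const Path paths[] = {
        {"2x 6600GT (SLI)",                  200, 100, 50,   0},
        {"2x 6800nu (SLI)",                  290, 150, 50,   0},
        {"2x 6800GT (SLI)",                  375, 200, 50,   0},
        {"2x 6800Ultra (SLI)",               490, 250, 50,   0},
        {"6800GT now, sell, buy next-gen",   375,   0,  0, 150},
        {"6800U/X850XT now, sell, buy R520", 500,   0,  0, 200},
    };
    for (const Path& p : paths) {
        // For the sell-and-rebuy paths this is the sunk cost before the new
        // card; subtract it from an SLI total to see what's left to spend.
        const double net = p.first_card + p.second_card + p.sli_board - p.resale;
        std::printf("%-34s net: $%.0f\n", p.name, net);
    }
}
```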