ATI R520


nRollo

Banned
Jan 11, 2002
10,460
0
0
Errr, it's nice you wrote all that, Russian Sensation, but I was talking about buying two 6800GTs or 6800 Ultras now.

What performance upgrade offers more framerate increase to a high end machine?
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: MisterChief
Since when has Beyond3d had any accurate info? JK:p

I think the info there is often accurate, but it can also be misleading. Anyway, just supplied as a change of pace.



 

AlexR

Member
Mar 19, 2005
25
0
0
Yeah, I've been looking for a video card solution. Benchmarks for 6800U SLI seem poor. Maybe buy one Ultra and then, like Russian suggested, sell it and buy an R520.

What's your opinion on buying a card now: one or two? Which ones?

I was looking into either a Powercolor X850XT or a BFG 6800U OC.


*Sorry, not trying to steal this thread*
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Rollo
Errr, it's nice you wrote all that, Russian Sensation, but I was talking about buying two 6800GTs or 6800 Ultras now.

What performance upgrade offers more framerate increase to a high end machine?

My bad... I completely agree with you that for users who can afford to buy SLI (high-end only, though), nothing touches SLI. In this case 2 6800GTs / 2 6800 Ultras, then 2 R520s, 2 NV50s, etc. Basically, those who can afford to buy the two fastest SLI cards every year.

However, how many of us can afford that? :(

I think SLI was a strategic move for Nvidia. Not only do they have the fastest graphics solution (thanks to SLI), but there is less pressure on them when the R520 comes out, because 2 6800 Ultras will still be fast (albeit at a higher price). But I think for users other than enthusiasts, SLI isn't a good enough proposition based on ROI.
 

sharkeeper

Lifer
Jan 13, 2001
10,886
2
0
Too bad this thread has become another SLI vs. non-SLI testosterone match. :|

SLI has its merits. Of course, when GPU cycles pick up to a six-month update and become plentiful once again (if ever!), SLI may not be as attractive as it is now. GPUs will go dual-core, and they'll have more than one on a board, etc. Things will only get better and faster. Let's hope the game developers can keep up and actually use this technology!
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: ribbon13
HDMI-to-VGA adapters would be prohibitively expensive. That would actually take a power supply, a DSP, and DACs. DVI-I still carries an analog video signal; HDMI has none.

Like these cards will be invitingly cheap when they debut?

 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: sharkeeper
Too bad this thread has become another SLI vs. non-SLI testosterone match. :|

SLI has its merits. Of course, when GPU cycles pick up to a six-month update and become plentiful once again (if ever!), SLI may not be as attractive as it is now. GPUs will go dual-core, and they'll have more than one on a board, etc. Things will only get better and faster. Let's hope the game developers can keep up and actually use this technology!
:thumbsup:

I, for one, want a Fudo over a 6800U combo. BUT, I am not going for frame rate in some game. I want DX9+ performance with encode/decode hardware + 512MB of memory. Of course, I need an HDV camera too to take advantage of it. So, instead of a $600 upgrade, I need a $3900 upgrade ($600 + $3300 for a Sony HFX-1). What the heck for? My NLE uses DX9 for creating effects and transitions and for HDV playback in real time. My 9600 can do one timeline reliably. An X800 Pro could do two timelines. Since I use up to four for some videos, I need performance, baby. ;) Oh, and I need a new mobo, new processors, new everything else... That really is an expensive card (but so is the SLI).

No SLI? We can't even get them to add SonicFire Pro/SmartSound integration, so you think they will write optimizations for SLI? Yeah, right... ;)

 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: IamTHEsnake
I keep hearing about the hassle of developing software for multiple cores, whether it be for GPUs or CPUs. What makes it so hard?
Multiple GPUs are easy, as what GPUs do is basically the same operations over and over, so the work is almost infinitely parallelisable, whereas CPU code is different (rough sketch below).
Multi-threading games requires everything to be kept in sync between threads/processors, and probably also trying to distribute the workload evenly.
With consoles, there is also the issue of using a totally different type of core (see Anandtech's article on the Cell processor).
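
To make that concrete, here's a toy sketch (my own made-up workload, not code from any real engine): the "GPU-style" loop carves the pixels into slices that need zero coordination between workers, which is exactly what game code can't do.

Code:
#include <thread>
#include <vector>

// Stand-in for per-pixel work: every pixel is computed the same way,
// independently of every other pixel.
static float shade(int pixel) { return pixel * 0.5f; }

int main() {
    const int kPixels  = 1 << 20;
    const int kWorkers = 4;
    std::vector<float> frame(kPixels);
    std::vector<std::thread> workers;

    // "GPU-style" parallelism: carve the frame into independent slices.
    // No locks, no ordering, no communication between workers.
    for (int w = 0; w < kWorkers; ++w) {
        workers.emplace_back([&, w] {
            const int begin = w * kPixels / kWorkers;
            const int end   = (w + 1) * kPixels / kWorkers;
            for (int i = begin; i < end; ++i)
                frame[i] = shade(i);
        });
    }
    for (std::thread& t : workers) t.join();

    // Game code is the hard case: physics, AI, and rendering all touch
    // shared state, so the threads would have to synchronize every frame
    // instead of running freely like the loop above.
    return 0;
}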
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: housecat
This product is too expensive and has too much advanced technology.
Why DXNext support when we barely have DX9C technology??
You are paying for technology that will be very slow when DXNext games are released!
And there probably won't be any visual difference from DX9C to DXNext... and anything that can be done under DXNext can be done under DX9C with more passes.
What in the world are you talking about? You're totally confusing a set of high-level API specs and features with low-level shader pipeline features. Some of the more important hardware features that are going to be required for DXNext are fast GPU state save/restore/reset commands, as well as "fence" opcodes. Kind of hard to explain, but it's almost like read/write fence opcodes for disk cache coherence, except that in this case (AFAIK thus far), they will be used to intermingle software rendering operations with GPU rendering operations, esp. for the 3D layered, alpha-blended desktop UI in Longhorn.

So yes, those new features actually are needed, if you want to be able to run Longhorn with all of the eye-candy enabled.
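
Here's the fence idea in toy code; this is just my own sketch of the concept, not Longhorn's (or any driver's) actual interface. The "GPU" bumps a counter when a command batch retires, and the CPU side waits on it before touching the shared surface.

Code:
#include <atomic>
#include <numeric>
#include <thread>
#include <vector>

// Toy fence: a counter the "GPU" advances after a command batch retires.
std::atomic<unsigned> fence{0};
std::vector<int> surface(256, 0);

void fake_gpu() {
    for (int i = 0; i < 256; ++i) surface[i] = i;  // pretend rendering
    fence.store(1, std::memory_order_release);     // signal: batch 1 done
}

int main() {
    std::thread gpu(fake_gpu);

    // Software rendering on the CPU must not read the surface until the
    // GPU's commands have retired past the fence; it's the same idea as a
    // read/write fence for cache coherence, one level up.
    while (fence.load(std::memory_order_acquire) < 1) { /* spin (or sleep) */ }

    int sum = std::accumulate(surface.begin(), surface.end(), 0);
    gpu.join();
    return sum == 255 * 256 / 2 ? 0 : 1;
}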
Originally posted by: housecat
I'd suggest sticking with older technology, because when this card can use its features, it will be too slow.
Are you suggesting that ATI is totally stupid and cannot learn from NV's first-gen SM3.0-capable GPU parts, such that when ATI's SM3.0 parts are finally released, they somehow will not be better than NV's first-gen SM3.0 parts? I doubt that; ATI generally plans their hardware features carefully for maximal impact while minimizing additional costs. (One reason why they chose the 24-bit FP mode instead of NV's 32-bit: it was both cheaper and faster, and entirely appropriate at the time for the games on the market. NV wanted to push for OpenEXR and HDR rendering instead, but it cost them dearly, as they had to incur a negative performance/IQ tradeoff either way, because they only implemented 16- and 32-bit FP mode support.) ATI is smarter than that.
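
For reference, the back-of-the-envelope precision gap between those formats, assuming the usual s10e5 / s16e7 / s23e8 layouts for FP16 / FP24 / FP32:

Code:
#include <cmath>
#include <cstdio>

int main() {
    // Mantissa widths of the shader float formats of this generation:
    // FP16 = s10e5, ATI's FP24 = s16e7, FP32 = s23e8. Relative precision
    // is roughly 2^-mantissa_bits, which is why FP24 was good enough for
    // game shading while costing much less silicon than full FP32.
    struct Fmt { const char* name; int mantissa_bits; };
    const Fmt fmts[] = { {"FP16", 10}, {"FP24", 16}, {"FP32", 23} };
    for (const Fmt& f : fmts)
        std::printf("%s: ~%.1e relative precision\n",
                    f.name, std::pow(2.0, -f.mantissa_bits));
    return 0;
}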
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Lonyo
Not a walk in the park?
Understatement of the century.
Not only do they have to code for multiple cores with the next-gen consoles, they also have to do in-order programming, which will be a pain compared to out-of-order (on PCs).
Actually, the logical programming model is no different; the difference is in the performance/space tradeoff of the CPU die itself, whether the CPU implements in-order or OOO execution pipelines (toy example below).

But programming for SMT/SMP/multi-core is definitely going to be a completely different world for some game devs. Early first-gen titles will likely be ports that only utilize a single thread/core (and perhaps one more for handling I/O and the system libraries). It would be similar to many games for the ill-fated, heavily multi-CPU Atari Jaguar system. Although it contained several 64-bit pieces of hardware, a pair of 32-bit RISC DSP chips, and a standard boring old 68000 "control" CPU, most of the games released for it were ports of Amiga games, and the game code ran entirely on the 68000, the slowest CPU in the machine, only because it was the easiest way to program games for the system.
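
To illustrate the in-order point with a toy example (generic C++, not console code, and a real compiler may reschedule this on its own): both functions compute the same sum, so the logical model is identical; the difference is who has to overlap the work.

Code:
#include <cstdio>

// One long dependency chain: each add must wait for the previous one.
// An OOO core can't extract parallelism from within the chain either,
// but it will overlap it with surrounding independent work; an in-order
// core just stalls.
static float sum_chained(const float* a, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i) s += a[i];
    return s;
}

// Same sum, hand-scheduled into two independent chains so an in-order
// pipeline can issue one add while the other is still in flight. On an
// OOO core this rewrite buys much less, because the hardware reorders.
static float sum_interleaved(const float* a, int n) {
    float s0 = 0.0f, s1 = 0.0f;
    int i = 0;
    for (; i + 1 < n; i += 2) { s0 += a[i]; s1 += a[i + 1]; }
    if (i < n) s0 += a[i];
    return s0 + s1;
}

int main() {
    const float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    std::printf("%.1f %.1f\n", sum_chained(a, 8), sum_interleaved(a, 8));
    return 0;
}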
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: MisterChief
Originally posted by: ronnn
Back to the OP: Beyond3D seems to suggest that the R520 may be a bit disappointing.
Link

Since when has Beyond3d had any accurate info? JK:p

I think B3D often has good info, although I have to say Dave's position on his "inaccurate" 6800NU SLI numbers surprised me.

I chatted with him about it this weekend, and while he said he agrees the numbers for that rig appear to defy both logic and what others have found with 6800NUs, he was not going to change them until someone supplied him with new 6800NUs (because the results were accurate for the cards he has).

I wish I hadn't sold my 6800NUs so fast after getting the GTs; I would have loaned them to him just so he could post good info.

Sort of a perplexing position for him to take, IMO.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Maybe then HL2 can release the SM3 patch.

You thought HL2 wasn't really taxing? Hah, just wait.
 

MetalStorm

Member
Dec 22, 2004
148
0
0
Originally posted by: Regs
Maybe then HL2 can release the SM3 patch.

You thought HL2 wasn't really taxing? Hah, just wait.

HL2 ran very well on my rig considering a 3200+ Barton and a 9800 Pro; I was impressed!
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: housecat
This product is too expensive and has too much advanced technology.

Why DXNext support when we barely have DX9C technology??

You are paying for technology that will be very slow when DXNext games are released!
And there probably won't be any visual difference from DX9C to DXNext... and anything that can be done under DXNext can be done under DX9C with more passes.

I'd suggest sticking with older technology, because when this card can use its features, it will be too slow.

/end play on modern day braindead ATI fanboy rant on superior technology

HA HA HA HA HA

You crack me up. On another thread you're ranting about how amazing SM3 and DX9C are and how great they are even if most games don't fully utilize them, just so you can be future-proof.

And now you say you don't need the new features of these new cards.

You're just a lame nVidia fanboy, that's all. You say the 6800 is the best because it has SM3, because you're future-proofing yourself. Then you turn around and say that when another card coming out in a few months is gonna support stuff that isn't gonna be used for a year or so, it's lame because it's not needed yet.

Next time, why not think about what you're saying first!