
R520 Faster Than G70 and 6800U SLI

Page 4

The Sly Syl

Senior member
Jun 3, 2005
277
0
0
I just want to know what the retail name of the R520 is going to be.

Nvidia is obvious, the 7 series. 7200-7900, whatever the hell works for them.

What is ATI going to call it? The XI800? The 11800? The Y800? The Z500?

Seriously, I'm curious what sort of naming they could make up at this point.
 

imported_X

Senior member
Jan 13, 2005
391
0
0
I don't blame you my man. I think my preference for Nvidia is pretty well known around here.

Personally, I think brand loyalty is stupid. I go with whatever manufacturer has the best product at the time. I have no idea at this point whether it will be the G70 or the R520, and I don't really care either way. I just want the best, rather than blindly going with one company because I've liked their products in the past.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Acanthus

Like physics, AI, realtime shaders, occlusion culling that isn't fixed point, and swapping textures into and out of memory.

That's still not good enough for me. Four-plus years later, an A64 4000+ and 6800 Ultra SLI are available, yet games often don't look nearly as good as, or only a little better than, the 2001 demo. The fact that every game out there doesn't blow away the demo is already mind-boggling. Sure, the polygon counts are low, but the overall graphics quality is still good. Also, an X850 XT alone is about 4x faster in pure fillrate than my 8500 video card. It should easily handle games that look 2x as good as the demo at the same framerates. I agree graphics have come a long way with Far Cry and Doom 3 -- but in 4.5 years, not really. Medal of Honor looks just as good as HL2, which came 3 years later. Look at Battlefield 2 -- that's progress? Call of Duty and No One Lives Forever 2 look just as good. The difference between Max Payne 1 and Max Payne 2? Barely any. WoW? Not that good. Of course I care more about gameplay, but I'm just making a point: graphics quality has not increased proportionally with the increase in hardware speed.

What I mean is that a 6600GT is good enough to play any game out there. I am not talking about AA/AF at 1600x1200; I am talking pure graphics -- like how good Doom 3 looks at 800x600, simply because of its graphics. Dead or Alive on Xbox 360 looks better than any PC game right now -- it won't matter if the resolution is 640x480, it'll still look better due to the overall graphical complexity. So what I am saying is: why are developers not increasing this "overall graphical complexity" (for lack of a better way to describe high texture, polygon, and shader counts)? I'd rather play an insanely beautiful game at 800x600 with no AA/AF than try to max out an ugly game at 1600x1200 4xAA/16xAF.
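As a rough sanity check on that "about 4x faster in pure fillrate" figure, here's a back-of-the-envelope calculation. The clock speeds and pipeline/TMU counts below are approximate public specs, assumed for illustration rather than taken from this thread; peak texel fillrate is simply core clock x pixel pipelines x TMUs per pipe.

```python
# Rough peak texel fillrate comparison (approximate public specs, not
# measured values): Radeon 8500 ~275 MHz with 4 pipes x 2 TMUs,
# X850 XT PE ~540 MHz with 16 pipes x 1 TMU.

def texel_fillrate_mtexels(core_mhz, pipelines, tmus_per_pipe=1):
    """Peak texel fillrate in megatexels/s = clock * pipes * TMUs per pipe."""
    return core_mhz * pipelines * tmus_per_pipe

radeon_8500 = texel_fillrate_mtexels(275, 4, tmus_per_pipe=2)   # 2200 MT/s
x850_xt_pe = texel_fillrate_mtexels(540, 16, tmus_per_pipe=1)   # 8640 MT/s

print(f"Radeon 8500: {radeon_8500} MT/s")
print(f"X850 XT PE:  {x850_xt_pe} MT/s")
print(f"Ratio: {x850_xt_pe / radeon_8500:.1f}x")  # ~3.9x
```

So the "about 4x" claim holds up for texel fillrate, even though the raw clock difference alone is only about 2x.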
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: X
I don't blame you my man. I think my preference for Nvidia is pretty well known around here.

Personally, I think brand loyalty is stupid. I go with whatever manufacturer has the best product at the time. I have no idea at this point whether it will be the G70 or the R520, and I don't really care either way. I just want the best, rather than blindly going with one company because I've liked their products in the past.


You seem to mistake having a preference for meaning I will buy that IHV's card.

Reading comprehension, friend.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: X
I don't blame you my man. I think my preference for Nvidia is pretty well known around here.

Personally, I think brand loyalty is stupid. I go with whatever manufacturer has the best product at the time. I have no idea at this point whether it will be the G70 or the R520, and I don't really care either way. I just want the best, rather than blindly going with one company because I've liked their products in the past.

Ya, it is real stupid. Just because one product is good doesn't mean the next will be... I don't care if the chip on my card is ATI or Nvidia... as long as it plays Wolfenstein 2 and Quake 4, I couldn't care less... but I'd probably buy from the same board maker again just because Leadtek owns... but so do BFG and eVGA... so I don't really care.
 

imported_X

Senior member
Jan 13, 2005
391
0
0
Reading comprehension, friend.

Back at ya...I never said that you were one of the people that blindly advocates a company based on past performance, rather than going with whatever is best at the time :)
 

klah

Diamond Member
Aug 13, 2002
7,070
1
0
Originally posted by: RussianSensation
Like Dead or Alive on Xbox360 looks better than any PC game right now -- it wont matter if the resolution is 640x480, it'll still look better due to the overall graphical complexity.

I am pretty sure they said the minimum standard for all xbox360 games will be 1280x720 with 4xAA.
 

imported_X

Senior member
Jan 13, 2005
391
0
0
So what I am saying is: why are developers not increasing this "overall graphical complexity" (for lack of a better way to describe high texture, polygon, and shader counts)?

I think the answer is that they have to cater to the lowest common denominator. Resolutions and AA/AF can scale with hardware, but creating an "insanely beautiful game" sets the bar pretty high. Sadly, games will never be optimized for people on the cutting edge.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: X
Reading comprehension, friend.

Back at ya...I never said that you were one of the people that blindly advocates a company based on past performance, rather than going with whatever is best at the time :)



Hm.





I move that our respective misinterpretations cancel each other out, and we start anew. Hi, I'm Insomniak.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Hi Insomniak.

I motion we open this meeting of Fanbois Anonymous, Not.

:D

g3pro owes me a new bs meter. He broke the needle on mine.
 

imported_Rampage

Senior member
Jun 6, 2005
935
0
0
"R520 Faster than G70 and 6800U SLI"


Yeah, and G80 will be faster than Crossfire R520, but they can't seem to get these magical products to market. ;) And the price?

It's a nice pipe dream and a fanboy's gizzum, that's about it.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
You'll be seeing at least G70 results in ~2 weeks. Based on some of the articles I've been seeing lately, it sure sounds like ATi is having yield problems (some massive ones), so a few working cards, when the rest can't even get more than 16 pipes working (if the information is true), doesn't equate to a shelf product.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
I've heard a lot of talk that this may be ATi's NV30 - i.e. very late due to manufacturing problems, and possibly disappointing performance.


But time will tell. I think it's obvious from delay after delay that the manufacturing difficulties, at least, are real.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: keysplayr2003
Originally posted by: Avalon
Originally posted by: keysplayr2003
Originally posted by: X
I have a hard time believing the G70 beats the R520 by "big margins". I would not be equally skeptical if the reverse were being claimed. No offense, but although the Inquirer gets some things wrong, I'll take the word of the one who says ATI wins over yours until hard numbers come out.

Fixed for ya.

And the same could be said for the Nvidia camp for at least 5 people in this thread. What's your point?

Is it even possible to have a point in a pointless thread? Not really gonna try to go all sensible in a thread that really doesn't do anything for us.

Well, I suppose the point of the thread was to speculate, if you can find any point in that at all. Keys, I'm sorry if my reply came off as sounding irritated towards you or anything of the sort. It's just that when I see these threads speculating on the performance of one side or the other, and someone claims card X will be faster than card Y, someone will always be quick to call BS, but if it were claimed that Y is faster than X, they would either not post or just say to wait for benches. Then they deny it.

Just bugs the crap out of me.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Insomniak
I've heard around a lot that this may be ATi's NV30 - i.e. very late due to manufacturing problems, and possibly disappointing performance.


But time will tell. I think it's obvious from delay after delay that at least the manufacturing difficulties are true.

It may be late but I doubt it will be disappointing.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Last year the vendors price gouged the market like crazy. You couldn't lay your hands on a 6800GT for less than 500-600 dollars until eVGA, BFG, and PNY had them in quantity.

By the time the new generation drops to its MSRP, which is already speculated to be very high, ATi will likely have paper launched Fudo, and by that time sites will have reviews of it.

Just something to keep in mind.
 

g3pro

Senior member
Jan 15, 2004
404
0
0
And then you can wait a few months after that when you get poor yields on the R520. Something to keep in mind.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: Killrose
Now all we need are games that will tax these video cards. And I'm not talking about only one or two games. Myself, I think i'll wait for the Ageia physics eng., before I get too excited.

The new home consoles should do just that hopefully.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Noob
I wonder why ATI didn't include SM 4.0's Unified Shader Architecture in its R520.

I think Unified Shaders are planned for the R600 series.