New info on the G70 and R520...


Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Gamingphreek


FP32, AFAIK, was used most of the time. You could opt for 16-bit FP, however that did result in a lower IQ.

Nvidia has always had better AA than ATI. The MAJOR problem was with AF, where Nvidia's did almost nothing and yet suffered a HUGE performance hit.

As for having PS1.1 and not 1.4: AFAIK the FX series supports 1.4. Even the Geforce 4 had PS1.4, I think. The Geforce 3 was the only one stuck at 1.1 (it only supported DX8, not 8.1).

Neither company has really produced a vastly inferior card in comparison to the other. As I said earlier, the flaws lie in the microarchitecture and compilers, not the specs (ie: both had a 256-bit bus, the same amount of RAM, and roughly the same clockspeed (the cards were not 150MHz vs. 400MHz), and both "technically" had 8 pipelines). It was just poor, and for that matter horrible, planning on Nvidia's part. ATI had it down :thumbsup:

-Kevin

Wow, I just don't know what to say about the part I bolded. I just can't imagine anyone could think that. Not only has NV not always had the best AA, it was also much slower with the FX cards. I mean, really, c'mon.

I was talking about the GF3 and 8500 with the 1.4.

Again, I don't understand why you think NV can't slip up again. While it sure doesn't appear it has happened this time, thinking they won't ever "release a solution that is that inferior to the one ATI would release!?" is pretty naive to me.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: ddogg
Originally posted by: CaiNaM
bah.. here we go again.. just because something has more "pipes" does not mean it will be faster.. just like higher clockspeed does not guarantee a faster product. it will also be largely dependent on the rest of the architecture, which at this point we know little about (either product).

yes...but a 32-pipe card still has a very good chance of outperforming the 24-pipe card, even though a lot depends on architecture

as likely as it will be similar or less in performance... we just don't know, do we? which was the point.. even with this "new info", we still know little, and frankly i agree with the majority here that nvidia is likely in a better position having a 4-month advantage; even better if the speculation regarding the availability of r520 is accurate, regardless of who actually has better performance.

after all, despite a few fewer features, the x800 xt pe was as fast or faster than the ultra, however ati's inability to actually ship the product cost them dearly in the enthusiast/desktop market. 2 generations of this in a row is terrible -- not only does it whittle away at the marketshare they gained with the r3xx generation, but it just makes them look bad.. benchmarks aside, winning customers has a lot to do with the "stigma" surrounding a company (add to that their late entry with crossfire). after all, fanboys do not live on benchmarks alone ;)

nv40 was good for nvidia in many ways.. it showed they were finally moving away from brute power and working smarter (something ati did previously w/ r300). as with cpus, clockspeeds only go so far before succumbing to diminishing returns. on top of that, they showed they can deliver the tech, the features, and the parts themselves. ati has stumbled there, and if speculation holds true, they may stumble again -- even more so with talk of high prices for their next 'flagship' part (tho we don't really know g70 pricing yet).

i do think ati scored with their 800xl (remarkable performance in its price class), however despite the positive reactions it is seeing with that, if nvidia delivers with g70 in the next few weeks as stated, the limelight will be short-lived.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gamingphreek
Nvidia has always had better AA than ATI. The MAJOR problem was with AF, where Nvidia's did almost nothing and yet suffered a HUGE performance hit.

i think you have that backwards...
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
So, my G70 arrived today, and I've put it through its paces. While I can't divulge any sort of benchmarks, or any information about the card (courtesy of NDAs), I can tell you that it's a definite improvement. Well, I can say it is in fact a single-slot card. :)

Take it for what it's worth.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Well if it is a definite improvement over your sli rig.......... Good times ahead for Nvidia.
 

DeeSlanger

Member
Oct 9, 2003
136
0
0
All of this is boiling down to WGF2, where everything is unified, and both graphics companies will be at parity.... note ATI and MS in the sack. Who will have the edge in 18 months? :)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: X
When you say it's a definite improvement, do you mean vs. the single or the SLI 6800U?

Obviously he means SLI, as that is what he advertises his system as. Also, he does write about the G70 in the singular. So unless Ronin is fos, the G70 beats 6800 Ultra SLI! :D
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
I can't confirm nor deny that claim, unfortunately. I only have one card, so I am unable to compare SLi to SLi at this point. Plus, drivers are still in development, so my results may not be entirely accurate.

Try not to make conjectures based on what I said, as that was not my intention. :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: ronnn
So what was your intention?

Give the guy a break man. NDA's are no joke. Try not to corner him, I heard these NDA guys get vicious when they get cornered. ;)

 

obeseotron

Golden Member
Oct 9, 1999
1,910
0
0
Pixel Shader Versions, by chronological architecture release
GF3: 1.1
XGPU (Xbox1): 1.1
GF4: 1.3
Radeon 8500: 1.4
R300: 2.0
NV30: 2.0+ (to my knowledge the + was never used because NV30 is so bad at PS2.0)
NV40: 3.0
R420: 2.0b


ATI has had generally better AA, and before NV40, better and faster AF. Things are very similar now; if there is an edge in quality, it would still go to ATI. For reference, my last 7 video cards: 6800GT, Geforce 4 Ti 4400, Geforce 2 Pro, Geforce 1 DDR, Riva TNT/Voodoo2, Riva 128/Voodoo2. Nvidia, which I believe is a larger company overall to begin with, has a great advantage in being able to reuse G70 for the PS3's highly similar RSX, while ATI had to develop R500 and R520 in parallel, and they're completely different.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
I don't think that write up is biased. I just don't know if they really know what they are talking about or not.

I am tired of the scenario they described ATI doing.. release a killer card that is unavailable and pretend ATI ownz. Yeah right.

I've had enough waiting from BOTH the NV40/R420. Thanks. Just get a decent improvement out, buttheads. That goes for both of them, NV to a lesser extent.

I think it's great if the R520 "ultra" has 32 pipes and whoops up on the GF7 Ultra.. fine.. it doesn't bother me.. but I'm buying the one I can get ahold of for a halfway reasonable price.

I think the benefit of having the single fastest card does not improve sales as much as having an entire SOLID lineup that is actually for sale..
Call me crazy. Or better yet, a fanboy for following this business logic. But I'd bet I'd outsell you on video cards thinking this way.
Do it right, and decent, the first time out of the gate and you won't be needing all these refreshes. That's why I'm a Geforce6 fanboy. Good sh*t.

Not this $1000 ebay crap for one card like we saw last time.
Maybe I'm growing old or something.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: housecat

I think it's great if the R520 "ultra" has 32 pipes and whoops up on the GF7 Ultra.. fine.. it doesn't bother me.. but I'm buying the one I can get ahold of for a halfway reasonable price.

Under $1500 for two?
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: keysplayr2003
Originally posted by: ronnn
So what was your intention?

Give the guy a break man. NDA's are no joke. Try not to corner him, I heard these NDA guys get vicious when they get cornered. ;)


If the NDA is no joke, why post?
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
By stating I had the card and basic information, I violated no NDA. Rest assured, however, that once the card has officially launched, you'll have all the information I've collected today.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Ronin
By stating I had the card and basic information, I violated no NDA. Rest assured, however, that once the card has officially launched, you'll have all the information I've collected today.

lol, attention whore.
are you ronin from the NV official forums? if so, i hate you. You tried to stop us from having our widescreen threads, yet flaunted that you had the driver fix.

And even lied and said the 71.89 drivers fixed it. Yeah. I hate you, if that's you.
Now here you are dropping hints you have a G70, just die.
 

ddogg

Golden Member
May 4, 2005
1,864
361
136
Originally posted by: Ronin
Even at overclocked speeds, at idle, it appears to be around 43C (stock cooling, of course).

hmm... that's pretty good!! looks like they have solved their heat problems
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Ronin
By stating I had the card and basic information, I violated no NDA. Rest assured, however, that once the card has officially launched, you'll have all the information I've collected today.

So, when is the card supposed to be officially launched? Or can you not say?