New info on the G70 and R520...

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: ddogg
Originally posted by: Ronin
Even at overclocked speeds, at idle, it appears to be around 43C (stock cooling, of course).

hmm...that's pretty good!! looks like they have solved their heat problems

If an overclocked G70 is 43C at idle on stock cooling I'll donate one of my nuts to medical research :laugh:
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Ronin
Whether I am or not has no bearing on your piss poor, childlike attitude. Go bother someone else, you prepubescent twit.

Yeah. You are. What pathetic attempts at showing off you have a G70 though. I like you lied about the 71.89s worked for you under SLI mode.

Nvidia needs to get a moderator who is actually looking out for the interests of their customers, not hide issues pertaining to their hardware.

It came out ANYWAY didnt it little RONIN? It was all over the Inq all over HardOCP and Nvidia had to fix it didnt they little man?

Punk a$$ b****.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
Originally posted by: Ronin
Even at overclocked speeds, at idle, it appears to be around 43C (stock cooling, of course).

Enclosed case, or motherboard on an open-air test bench?

 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: nitromullet
Originally posted by: Ronin
By stating I had the card and basic information, I violated no NDA. Rest assured, however, that once the card has officially launched, you'll have all the information I've collected today.

So, when is the card supposed to be officially launched? Or can you not say?

That information is actually out there already. :)
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: Killrose
Originally posted by: Ronin
Even at overclocked speeds, at idle, it appears to be around 43C (stock cooling, of course).

Enclosed case, or motherboard on an open-air test bench?

Enclosed case. Same case as my SLi box, actually (I just took the 2 6800's out and put the G70 in).

3 intake fans on the side panel, 1 120mm in the back, and 1 120mm in the front for cooling.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: housecat
Originally posted by: Ronin
Whether I am or not has no bearing on your piss poor, childlike attitude. Go bother someone else, you prepubescent twit.

Yeah. You are. What pathetic attempts at showing off you have a G70 though. I like you lied about the 71.89s worked for you under SLI mode.

Nvidia needs to get a moderator who is actually looking out for the interests of their customers, not hide issues pertaining to their hardware.

It came out ANYWAY didnt it little RONIN? It was all over the Inq all over HardOCP and Nvidia had to fix it didnt they little man?

Punk a$$ b****.


Antagonize me all you want. You simply validate what I described you as. The 71.89's did work. Why would I lie? I have no reason to, and nothing to gain.

What those forums need is less users like you who *think* they know, when it's 100% apparent they don't.

Now, go change your diaper and let the adults talk.

And for the record, I quashed that thread because of the way the users were handling it, and for no other reason. I'm all for issues being brought to light, and reported to the proper folks, but that thread turned far beyond the scope of the rules for the forum. I keep a tight leash over there, because if I didn't, it would become a madhouse. I'm fair, but I will not take any crap from anyone.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
OH REALLY? Can you find me ONE person here who has SLI that has a working 2005FPW/2405FPW rig with the 71.89 drivers?

Yeah, you won't take crap from anyone even when you are being unreasonable; you are a nazi of a mod and an arrogant sack of ****.

I don't give a damn what I validate for you. You suck as a mod, and you are a liar, pure and simple.

There are some hardcore NV guys over there who were looking for help, and you shut them down like a dogmatic nazi. They had been putting up with a problem for MONTHS. It was not right.
You do not represent Nvidia well at all.
Or maybe you do.
 

rise

Diamond Member
Dec 13, 2004
9,116
46
91
yikes, if true, this looks disastrous for ati frankly.

they're losing out every day they don't have their stopgap (imo) amr tech out against sli, and then they give nv a 4-month or so jump on next-gen single cards.

if the g70 in fact outperforms sli'd 6800 ultras, this round will be a slaughter.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: rise4310
yikes, if true, this looks disastrous for ati frankly.

they're losing out every day they don't have their stopgap (imo) amr tech out against sli, and then they give nv a 4-month or so jump on next-gen single cards.

if the g70 in fact outperforms sli'd 6800 ultras, this round will be a slaughter.

We'll see if R520 has 32 pipes (as the rumors say). Letting nVidia get a jump on them is a bad marketing move in my mind, but then again I don't know what kind of a monster ATI has up their sleeve; until we see both cards in action, no one can say for certain.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: Ronin
Originally posted by: Killrose
Originally posted by: Ronin
Even at overclocked speeds, at idle, it appears to be around 43C (stock cooling, of course).

Enclosed case, or motherboard on an open-air test bench?

Enclosed case. Same case as my SLi box, actually (I just took the 2 6800's out and put the G70 in).

3 intake fans on the side panel, 1 120mm in the back, and 1 120mm in the front for cooling.

I gotta say 43C with stock cooling on a single-slot cooler is really good, but Ronin does have some above-average airflow in his case. Not downplaying anything, those temps are really good; I'm just holding on to my usual skepticism.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Anybody post the LATEST?
:p

R520 has 32 pipelines, 24 working; 90-nanometre hot chip
IT MIGHT be the only chip capable of doing H.264 video in hardware for a while, as we believe that not even Nvidia's G70 will be able to do this. H.264 is the next-generation codec used for HD-DVD content. On top of that, we heard more than once that R520, aka Fudo, is the biggest chip ever built.

R520 was shown in Taiwan and it's up and running, showcasing two games, Prey and Alan Wake, at the ATI booth and doing some H.264 video decoding in a different corner.

We learned that R520 might be quite hot, as it is the biggest chip ever with its 300+ million transistors. Since it's 90 nanometre, it's kind of logical that you can place quite a lot of pipelines under its hood, and it turns out that ATI has managed to pack in 32 pipes. Having 32 happy Scottish chaps in kilts playing the pipes at the same time is perhaps the best metaphor.

We wrote about this before, but I don't think ATI will decide on the final pipe number until it sees Nvidia's G70, the GeForce 7800 GTX, in action. ATI will have to choose between a 24-pipeline part or the nasty decision of having all 32 pipes working at once. If R520 manages to defeat G70, ATI will stick to the 24-pipe story and all will be happy in Toronto and ATI's part of Santa Clara. If G70 scores better, then ATI will have to enable all 32 pipes. The real problem is that at 32 pipelines yields are not that good, as we are talking about a very complex chip.

Whatever Nvidia does, ATI has reasonable chances this time around. We even heard from Hexus that Nvidia might have some G70 Ultra card in its pocket, saving it for rainy days.

G70 comes at some cost to make
We learned that Taiwan-based companies are going to pay no more and no less than $357 for the cards.

ATI Crossfire to come in warm summer
ATI plans to sample Crossfire-enabled boards by the end of the month, while retail availability is expected in July. We believe that it might even slip to August.

Nvidia's SLI is its true competitor, and SLI is now in good shape, shipping for months now; let's be fair and say half a year already. SLI might have some problems with games here and there, and no one can say that this won't be the case with ATI's Crossfire, despite the company's claims.

Crossfire indeed looks great, but it might be some time before we see it on the market, and if you think about it, Nvidia will announce its G70 before ATI manages to ship its first Crossfire system; at least that's how it looks now.

So the next big fight is not G70 vs. R520; it's more likely two G70s in SLI versus two Fudos in Multi-VPU, Crossfire.

the inq sucks
:shocked:

:roll:
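To put the Inq's yield remark in perspective, here is a minimal sketch of the textbook Poisson defect-yield argument for why a 24-of-32-pipe part makes sense. Every number in it (defect density, die area, the fraction of the die taken up by the pipes) is invented purely for illustration and is not real ATI or TSMC data:

```python
import math

# Hypothetical inputs -- chosen only to illustrate the shape of the argument.
defect_density = 0.5   # defects per cm^2 (made up)
die_area       = 3.0   # cm^2, a very large die (made up)
pipe_fraction  = 0.5   # assume half the die area is the 32 pixel pipes (made up)

lam_total = defect_density * die_area      # expected defects per die
lam_pipes = lam_total * pipe_fraction      # defects expected to land in the pipes
lam_other = lam_total - lam_pipes          # defects in non-redundant logic

# A fully working 32-pipe part needs zero defects anywhere on the die.
yield_32 = math.exp(-lam_total)

# A salvageable 24-pipe part needs clean non-pipe logic, but can tolerate
# defects in up to 8 of the 32 pipes (those pipes simply get fused off).
p_pipe_ok = math.exp(-lam_pipes / 32)      # chance a single pipe is clean
p_at_most_8_bad = sum(
    math.comb(32, k) * (1 - p_pipe_ok) ** k * p_pipe_ok ** (32 - k)
    for k in range(9)
)
yield_24 = math.exp(-lam_other) * p_at_most_8_bad

print(f"fully working 32-pipe dies:    {yield_32:.1%}")
print(f"usable 24-pipe-or-better dies: {yield_24:.1%}")
```

With those made-up inputs, roughly twice as many dies qualify for the 24-pipe bin as for the full 32-pipe bin, which is the whole appeal of shipping with some pipes fused off and keeping the rest in reserve.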
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: ronnn
Originally posted by: keysplayr2003
Originally posted by: ronnn
So what was your intention?

Give the guy a break man. NDA's are no joke. Try not to corner him, I heard these NDA guys get vicious when they get cornered. ;)


If the NDA is no joke, why post?

You seem angry. What's up?

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: fierydemise
... letting nVidia get a jump on them is a bad marketing move in my mind, but then again I don't know what kind of a monster ATI has up their sleeve; until we see both cards in action, no one can say for certain.

didn't work for em last gen.. they only failed on promise after promise to actually get the cards to market...

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: housecat
Originally posted by: Ronin
Whether I am or not has no bearing on your piss poor, childlike attitude. Go bother someone else, you prepubescent twit.

Yeah. You are. What pathetic attempts at showing off you have a G70 though. I like you lied about the 71.89s worked for you under SLI mode.

Nvidia needs to get a moderator who is actually looking out for the interests of their customers, not hide issues pertaining to their hardware.

It came out ANYWAY didnt it little RONIN? It was all over the Inq all over HardOCP and Nvidia had to fix it didnt they little man?

Punk a$$ b****.

Housecat chill out man. Only Punk a$$ b****es use the phrase "Punk a$$ b****" anyway.
Teenagers today are weird man. Not saying your one of them but I hear them say this phrase all the time and I have to laugh because I think to myself their IQ drops a notch everytime they say it. I mean when I was a teenager, about 20 years ago, we had to deal with guido's ( little italian teenagers with oily slick hair and 75lbs of fake gold chains around their necks. ) I can say Italian because I am one. It was embarrassing to see them act like wise guys.

Sorry to go O/T but you just reminded me of those funny little guys we used to laugh at and beat up for being such Punk a$$ b****es. :beer:

 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: CaiNaM
Originally posted by: fierydemise
... letting nVidia get a jump on them is a bad marketing move in my mind, but then again I don't know what kind of a monster ATI has up their sleeve; until we see both cards in action, no one can say for certain.

didn't work for em last gen.. they only failed on promise after promise to actually get the cards to market...


It didn't work? They outsold the X800 and X800XL/X850XT PE lineup combined with merely the GF6 series... no refresh, which always boosts sales because people think it's really an improvement.
I'd say everything they did worked. And they regained their street cred as the leading performance GPU design house with the NV40/SLI launch.

Didn't work? IT WAS A COUP D'ETAT!
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: keysplayr2003
Originally posted by: housecat
Originally posted by: Ronin
Whether I am or not has no bearing on your piss poor, childlike attitude. Go bother someone else, you prepubescent twit.

Yeah. You are. What pathetic attempts at showing off you have a G70 though. I like you lied about the 71.89s worked for you under SLI mode.

Nvidia needs to get a moderator who is actually looking out for the interests of their customers, not hide issues pertaining to their hardware.

It came out ANYWAY didnt it little RONIN? It was all over the Inq all over HardOCP and Nvidia had to fix it didnt they little man?

Punk a$$ b****.

Housecat chill out man. Only Punk a$$ b****es use the phrase "Punk a$$ b****" anyway.
Teenagers today are weird man. Not saying your one of them but I hear them say this phrase all the time and I have to laugh because I think to myself their IQ drops a notch everytime they say it. I mean when I was a teenager, about 20 years ago, we had to deal with guido's ( little italian teenagers with oily slick hair and 75lbs of fake gold chains around their necks. ) I can say Italian because I am one. It was embarrassing to see them act like wise guys.

Sorry to go O/T but you just reminded me of those funny little guys we used to laugh at and beat up for being such Punk a$$ b****es. :beer:

I'm sorry my youthful disposition alarms and disturbs your aging ears...

from one punk a$$ ***** to another!!! :p
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: housecat
Originally posted by: keysplayr2003
Originally posted by: housecat
Originally posted by: Ronin
Whether I am or not has no bearing on your piss poor, childlike attitude. Go bother someone else, you prepubescent twit.

Yeah. You are. What pathetic attempts at showing off you have a G70 though. I like you lied about the 71.89s worked for you under SLI mode.

Nvidia needs to get a moderator who is actually looking out for the interests of their customers, not hide issues pertaining to their hardware.

It came out ANYWAY didnt it little RONIN? It was all over the Inq all over HardOCP and Nvidia had to fix it didnt they little man?

Punk a$$ b****.

Housecat chill out man. Only Punk a$$ b****es use the phrase "Punk a$$ b****" anyway.
Teenagers today are weird man. Not saying your one of them but I hear them say this phrase all the time and I have to laugh because I think to myself their IQ drops a notch everytime they say it. I mean when I was a teenager, about 20 years ago, we had to deal with guido's ( little italian teenagers with oily slick hair and 75lbs of fake gold chains around their necks. ) I can say Italian because I am one. It was embarrassing to see them act like wise guys.

Sorry to go O/T but you just reminded me of those funny little guys we used to laugh at and beat up for being such Punk a$$ b****es. :beer:

I'm sorry my youthful disposition alarms and disturbs your aging ears...

from one punk a$$ ***** to another!!! :p

are you trying to say "punk ass bitch"?
:Q
[not that I think about it . . . what is it?]
:confused:

is that 'forbidden' here?
:roll:


and while you were flaming each other I posted some "real" news about the r520/g70
:shocked:

:D
 

rise

Diamond Member
Dec 13, 2004
9,116
46
91
Originally posted by: housecat
It didn't work? They outsold the X800 and X800XL/X850XT PE lineup combined with merely the GF6 series... no refresh, which always boosts sales because people think it's really an improvement.
I'd say everything they did worked. And they regained their street cred as the leading performance GPU design house with the NV40/SLI launch.

Didn't work? IT WAS A COUP D'ETAT!

err, i think he was saying what ati did failed.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Meh. It prob is banned... I just figure some really, really young (like 10-15) ppl come here and don't want to leave open profanity...
It could be allowed... but ppl old enough will know what I'm saying anyway.

But it consists of two parts: punk ass (meaning your ass is a punk) and bitch (meaning female dog, but in this case taking it at face value, as in "you are a bitch")... so to slow it down it would be "punk ass, bitch!"
But if you say it fast... it sounds fine too.

But ya, I read those reports on the Inq as well today.
Looks as expected; I don't think the G70 sounds too expensive by any means... what did we expect?

Everyone keeps dreaming of 7800 Ultras coming out and being $200... dream on.
Same goes for the next high-end ATI... you gotta pay to play. That should be both ATI's AND Nvidia's slogan!

Scrap "The Way It's Meant to Be Played" and "Get in the Game"!!!

It's GOT TO PAY TO PLAY!!
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
To move the discussion back: Nvidia has never had faster or better AF. AF has always been bad performance-wise, and until the GeForce 6 it was bad quality-wise as well. As I have stated plenty of times, it is just an architectural limitation: when AF is used, one of the ALUs is taken away from pixel processing in order to do the AF computations.

Nvidia HAS had better AA. They just recently (GeForce 6) switched to rotated grid, which has slightly (almost unnoticeably) less IQ but much nicer performance.

-Kevin
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: keysplayr2003
Originally posted by: housecat
Originally posted by: Ronin
Whether I am or not has no bearing on your piss poor, childlike attitude. Go bother someone else, you prepubescent twit.

Yeah. You are. What pathetic attempts at showing off you have a G70 though. I like you lied about the 71.89s worked for you under SLI mode.

Nvidia needs to get a moderator who is actually looking out for the interests of their customers, not hide issues pertaining to their hardware.

It came out ANYWAY didnt it little RONIN? It was all over the Inq all over HardOCP and Nvidia had to fix it didnt they little man?

Punk a$$ b****.

Housecat chill out man. Only Punk a$$ b****es use the phrase "Punk a$$ b****" anyway.
Teenagers today are weird man. Not saying your one of them but I hear them say this phrase all the time and I have to laugh because I think to myself their IQ drops a notch everytime they say it. I mean when I was a teenager, about 20 years ago, we had to deal with guido's ( little italian teenagers with oily slick hair and 75lbs of fake gold chains around their necks. ) I can say Italian because I am one. It was embarrassing to see them act like wise guys.

Sorry to go O/T but you just reminded me of those funny little guys we used to laugh at and beat up for being such Punk a$$ b****es. :beer:


Paizon! :D

Since the subject of temps was brought up, I'll go ahead and unplug the 3 intakes and see what kind of results I get. I think, with all the pictures of the 7800 out on the net, that I can safely take some to post.

More to follow tomorrow.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: keysplayr2003
Originally posted by: ronnn
Originally posted by: keysplayr2003
Originally posted by: ronnn
So what was your intention?

Give the guy a break man. NDA's are no joke. Try not to corner him, I heard these NDA guys get vicious when they get cornered. ;)


If the NDA is no joke, why post?

You seem angry. What's up?


Angry? Not really, but it is interesting to find that another person promoting how great SLI is gets his cards for free. Makes me wonder if anyone has actually bought dual cards. Besides, I wanted to hear something definite, if for no other reason than to verify it later.
 

ddogg

Golden Member
May 4, 2005
1,864
361
136
Originally posted by: trinibwoy
Originally posted by: ddogg
Originally posted by: Ronin
Even at overclocked speeds, at idle, it appears to be around 43C (stock cooling, of course).

hmm...that's pretty good!! looks like they have solved their heat problems

If an overclocked G70 is 43C at idle on stock cooling I'll donate one of my nuts to medical research :laugh:

LOL... at least some laughter here in this thread, which otherwise has been marred by some serious FLAMING.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Gamingphreek
To move the discussion back: Nvidia has never had faster or better AF. AF has always been bad performance-wise, and until the GeForce 6 it was bad quality-wise as well. As I have stated plenty of times, it is just an architectural limitation: when AF is used, one of the ALUs is taken away from pixel processing in order to do the AF computations.

Nvidia HAS had better AA. They just recently (GeForce 6) switched to rotated grid, which has slightly (almost unnoticeably) less IQ but much nicer performance.

-Kevin

Man, get your facts straight. Nvidia had higher-quality AF with the FX series. ATi's rotated-grid AA was superior, and now that Nvidia has adopted a rotated grid as well, the quality is very close to ATi's. And stop the nonsense about performance: changing the sampling positions does not affect performance at all as long as the number of samples is consistent.
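For what it's worth, that last point is easy to see on paper: a rotated-grid pattern is just the ordered-grid pattern turned at an angle, so the per-pixel sample count (and therefore the fill-rate and bandwidth cost) is identical, while near-horizontal and near-vertical edges pick up more distinct coverage steps. A minimal sketch, using the textbook 4x patterns rather than the exact offsets either vendor ships:

```python
import math

# 4x ordered-grid AA sample offsets, in pixel units from the pixel centre
# (textbook pattern, not any specific GPU's exact offsets).
ordered_grid = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]

# Rotate the same square by atan(1/2) (~26.6 degrees), the classic RGSS angle,
# so every sample lands on its own row and column.
angle = math.atan2(1, 2)
rotated_grid = [
    (x * math.cos(angle) - y * math.sin(angle),
     x * math.sin(angle) + y * math.cos(angle))
    for (x, y) in ordered_grid
]

# Same number of samples per pixel -> same fill-rate/bandwidth cost.
assert len(ordered_grid) == len(rotated_grid) == 4

def distinct(values, eps=1e-6):
    """Count values that differ from each other by more than eps."""
    kept = []
    for v in values:
        if all(abs(v - k) > eps for k in kept):
            kept.append(v)
    return len(kept)

# Edge coverage: distinct horizontal offsets seen by a near-vertical edge.
print("ordered grid distinct x offsets:", distinct([x for x, _ in ordered_grid]))  # 2
print("rotated grid distinct x offsets:", distinct([x for x, _ in rotated_grid]))  # 4
```

Run it and the ordered grid reports only 2 distinct horizontal offsets against the rotated grid's 4, which is exactly why the rotated pattern looks better on near-axis edges at the same per-pixel cost.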