New info on the G70 and R520...

Page 5 - AnandTech Forums

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: ronnn
Originally posted by: keysplayr2003
Originally posted by: ronnn
Originally posted by: keysplayr2003
Originally posted by: ronnn
So what was your intention?

Give the guy a break man. NDA's are no joke. Try not to corner him, I heard these NDA guys get vicious when they get cornered. ;)


If nda is no joke, why post?

You seem angry. Whats up?


Angry? Not really, but it is interesting to find that another person promoting how great sli is gets his cards for free. Makes me wonder if anyone has actually bought dual cards. Besides, I wanted to hear something definite, if for no other reason than to verify it later.

Then you're not aware that I get ATi cards as well? In this box I'm posting on, right this second, there is an x850 XT PE AGP. I maintain an unbiased attitude where I need to, and when it comes to my personal performance, I am extremely picky. I have cards from both manufacturers, and I use cards from both manufacturers. Currently, my SLi configuration is the best performing one, which is why it's in my sig. If that changes, so will my sig, plain and simple.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: housecat
Meh. It prob is banned.. i just figure some really really young (like 10-15) ppl come here and dont want to leave open profanity..
it could be allowed.. but ppl old enough will know what I'm saying anyway.

But it consists of two parts: punk ass (meaning, your ass is a punk), and bitch (meaning female dog, but in this case it's taken at face value, as in "you are a bitch").. so to slow it down it would be "punk ass, bitch!"
But if you say it fast.. it sounds fine too.

But ya, I read those reports on the Inq as well today.
Looks as expected, I dont think the G70 sounds too expensive by any means.. what did we expect?

Everyone keeps dreaming of 7800 Ultras coming out and being $200.. dream on.
Same goes for the next high end ATI.. you gotta pay to play. That should be both ATI AND Nvidia's slogans!

Scrap The Way Its Meant to Be Played and Get in the Game!!!

Its GOT TO PAY TO PLAY!!

thanks for the slang update . . . . i see where the comma makes all the difference :)
[although i would think you would NOT be saying "your ass is a punk"; rather you are a "Punk [jack]ass" . . . . 'bitch' is just added as emphasis to demean the male it is addressed to . . . . oh well . . . .]

and NONE of those words are actually "profanity" . . . . This forum IS 'pg-13' and i hear "far worse" in those movies

well $500 IS expensive for a G70 and probably $550 for the ultra. ;)

and the r520 is the one that looks to be running hot . . . .
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Ronin
Originally posted by: ronnn


Angry? Not really, but it is interesting to find that another person promoting how great sli is gets his cards for free. Makes me wonder if anyone has actually bought dual cards. Besides, I wanted to hear something definite, if for no other reason than to verify it later.

Then you're not aware that I get ATi cards as well? In this box I'm posting on, right this second, there is an x850 XT PE AGP. I maintain an unbiased attitude where I need to, and when it comes to my personal performance, I am extremely picky. I have cards from both manufacturers, and I use cards from both manufacturers. Currently, my SLi configuration is the best performing one, which is why it's in my sig. If that changes, so will my sig, plain and simple.

I am not accusing anyone of bias. Just saying that dual gpus are easier to recommend when you get them for free (along with the special mb and power supply). Anyway, how loud is the g70?


 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: housecat
Originally posted by: CaiNaM
Originally posted by: fierydemise
... letting nVidia get a jump on them is a bast marketing move in my mind but then again I don't know what kind of a monster ATI has up their sleeve, until we see both cards in action then no one can say for certain.

didn't work for em last gen.. they only failed on promise after promise to actually get the cards to market...


It didn't work? They outsold the X800 and X800XL/X850XT PE lineup combined with merely the GF6 series.. no refresh, which always boosts sales because people think it's really an improvement.
I'd say everything they did worked. And they regained their street cred as the leading performance GPU design house with the NV40/SLI launch.

Didn't work? IT WAS A COUP D'ETAT!

hmm.. umm..

the first post stated early release means nv will slaughter ati.. his reply was:

"letting nVidia get a jump on them is a best marketing move"

which is saying it's a good move for ati to let nv go first.. i simply stated it didn't work for ati well last time. i guess i should say "thanks" for expanding on my reply in detail?
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: CaiNaM
Originally posted by: housecat
Originally posted by: CaiNaM
Originally posted by: fierydemise
... letting nVidia get a jump on them is a bast marketing move in my mind but then again I don't know what kind of a monster ATI has up their sleeve, until we see both cards in action then no one can say for certain.

didn't work for em last gen.. they only failed on promise after promise to actually get the cards to market...


It didn't work? They outsold the X800 and X800XL/X850XT PE lineup combined with merely the GF6 series.. no refresh, which always boosts sales because people think it's really an improvement.
I'd say everything they did worked. And they regained their street cred as the leading performance GPU design house with the NV40/SLI launch.

Didn't work? IT WAS A COUP D'ETAT!

hmm.. umm..

the first post stated early release means nv will slaughter ati.. his reply was:

"letting nVidia get a jump on them is a best marketing move"

which is saying it's a good move for ati to let nv go first.. i simply stated it didn't work for ati well last time. i guess i should say "thanks" for expanding on my reply in detail?

I'm very sorry, you guys, I didn't catch that mistake (typo). I meant "letting nVidia get a jump on them is a bad marketing move".
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
I think the spec that may well settle the 32 vs 24 debate will be transistor counts. Supposedly RSX in PS3 is well over 300 million, and so is r520.

Now consider previous gen:

R420: 160 million transistors
NV40: 222 million transistors

Now granted, they are architected differently, but they have the same number of pipelines. The major difference? SM3.0. I think it's quite possible NV could add another 16 pipelines, keep their SM3.0 feature set, and end up around 320-350 mil transistors, and I think ATi could add another 16 pipelines, update their feature set to SM3.0, and end up around the same number.

In short, I see it as very possible that both companies' high-end cards (GTX and whatever ATi's equivalent is) will be 32-pipe cards with SM3.0. I expect the mid-high to be 24 pipes, and the midrange to be 12 pipers. Low end still gets 4, I say.
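The arithmetic behind those 32-pipe guesses can be sketched in a few lines. Note the per-pipeline cost (6M transistors) and the SM3.0 "feature tax" (60M) are invented placeholders of mine, not figures from either company, and real designs don't scale this linearly:

```python
# Back-of-envelope projection of transistor budgets when pipelines are added.
R420_TRANSISTORS = 160_000_000  # 16 pipes, SM2.0b
NV40_TRANSISTORS = 222_000_000  # 16 pipes, SM3.0

def scale_pipes(base_transistors, base_pipes, new_pipes, per_pipe=6_000_000):
    """Naive linear model: fixed core plus a guessed per-pipeline increment."""
    return base_transistors + (new_pipes - base_pipes) * per_pipe

# NV40 grown to a hypothetical 32-pipe SM3.0 part
nv_32 = scale_pipes(NV40_TRANSISTORS, 16, 32)

# R420 grown to 32 pipes, plus a lump-sum guess for bolting on SM3.0
ati_32 = scale_pipes(R420_TRANSISTORS, 16, 32) + 60_000_000

print(f"NV 32-pipe estimate:  {nv_32 / 1e6:.0f}M")   # ~318M
print(f"ATI 32-pipe estimate: {ati_32 / 1e6:.0f}M")  # ~316M
```

Under those made-up assumptions both chips land in the same 300M+ neighborhood, which is all the post is really claiming.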

All in all though, it's all still speculation. We won't know anything about either until they launch, although I have almost no doubt left at this point that Nvidia will launch first (as they usually do). So, we'll see with regard to G70 in short order.

 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: CaiNaM
the first post stated early release means nv will slaughter ati.. his reply was:

"letting nVidia get a jump on them is a best marketing move"

which is saying it's a good move for ati to let nv go first.. i simply stated it didn't work for ati well last time. i guess i should say "thanks" for expanding on my reply in detail?

You're funny. When has anybody ever used the phrase "get a jump on" in a positive context? He obviously meant "bad" (as he has since confirmed). You people are hilarious! :laugh:
 

EightySix Four

Diamond Member
Jul 17, 2004
5,122
52
91
I agree with you insomniak except for the midrange 12 pipers, me thinks 16... We already have 12 pipe midrange cards (Geforce 6800/Radeon x800 pro)
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: crazySOB297
I agree with you insomniak except for the midrange 12 pipers, me thinks 16... We already have 12 pipe midrange cards (Geforce 6800/Radeon x800 pro)


I consider those mid-high end, and so do most of the consumers and IT heads out there. 6800NUs and X800 Pros are gaming cards - you don't find them in the mainstream/low-end Dell packages.

The 6600GT and X700 Pro are the midrange right now, and they're at 8 pipes.

In other words, I expect pricing to work something like this:

$500+ bracket: 32 pipe cards

$300 - $499 bracket: 24 pipe cards

$150 - $299 bracket: 12 pipes

Sub-$150 bracket: Mostly 4 pipes, maybe some 8 pipe models closer to $100+
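Those brackets read as a simple step function. A sketch, purely restating the speculation in this post rather than any real product data:

```python
# Map a hypothetical launch price to the speculated pipeline count.
def expected_pipes(price_usd):
    """Speculated pipeline count for a card at a given launch price."""
    if price_usd >= 500:
        return 32
    if price_usd >= 300:
        return 24
    if price_usd >= 150:
        return 12
    # sub-$150: mostly 4 pipes, with some 8-pipe models closer to $100+
    return 8 if price_usd >= 100 else 4

for price in (599, 399, 199, 129, 79):
    print(price, "->", expected_pipes(price))
```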

 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: ronnn
Originally posted by: Ronin
Originally posted by: ronnn


Angry? Not really, but it is interesting to find that another person promoting how great sli is gets his cards for free. Makes me wonder if anyone has actually bought dual cards. Besides, I wanted to hear something definite, if for no other reason than to verify it later.

Then you're not aware that I get ATi cards as well? In this box I'm posting on, right this second, there is an x850 XT PE AGP. I maintain an unbiased attitude where I need to, and when it comes to my personal performance, I am extremely picky. I have cards from both manufacturers, and I use cards from both manufacturers. Currently, my SLi configuration is the best performing one, which is why it's in my sig. If that changes, so will my sig, plain and simple.

I am not accusing anyone of bias. Just saying that dual gpus are easier to recommend when you get them for free (along with the special mb and power supply). Anyway, how loud is the g70?


Actually, the 2 6800U's in the box listed below are from eVGA, and were paid for. The only thing that wasn't was the A8N-SLi Deluxe.
 

vision33r

Member
Jan 21, 2005
106
0
0
By the end of the year we'll see which cards are out. It doesn't matter whether ATI or Nvidia is better; they will be priced accordingly. If the ATI R520 is the fastest, it will be the priciest, and if Nvidia comes out on top, theirs will be the most expensive. Either way, I'm only interested in whichever company delivers the most bang for the buck.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
I don't understand why everyone is so afraid of admitting bias. It's like a phobia for half the internet. I'll openly admit I prefer Nvidia graphics, but that doesn't mean I can't understand that ATi makes quality cards, is better in many situations, and has strong points and weak points, just as Nvidia does.

Just having a preference does not make you a completely non-credible contributor of information.

Besides, if you read an Nvidia-biased post, then an ATi-biased post, wouldn't it stand to reason that you'd end up with a largely unbiased view of the situation? I personally find bias to be USEFUL.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: Insomniak
Besides, if you read an Nvidia-biased post, then an ATi-biased post, wouldn't it stand to reason that you'd end up with a largely unbiased view of the situation? I personally find bias to be USEFUL.

Anybody can recognize a biased post from either side. The key point is how they choose to respond to such posts.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Insomniak
Originally posted by: crazySOB297
I agree with you insomniak except for the midrange 12 pipers, me thinks 16... We already have 12 pipe midrange cards (Geforce 6800/Radeon x800 pro)


I consider those mid-high end, and so do most of the consumers and IT heads out there. 6800NUs and X800 Pros are gaming cards - you don't find them in the mainstream/low-end Dell packages.

The 6600GT and X700 Pro are the midrange right now, and they're at 8 pipes.

In other words, I expect pricing to work something like this:

$500+ bracket: 32 pipe cards

$300 - $499 bracket: 24 pipe cards

$150 - $299 bracket: 12 pipes

Sub-$150 bracket: Mostly 4 pipes, maybe some 8 pipe models closer to $100+
Pretty close to what the Inq speculates . . . .

We learned that R520 might be quite hot, as it is the biggest chip ever with its 300+ million transistors. Since it's 90 nanometre, it's kind of logical that you can place quite a lot of pipelines under its hood, and it turns out that ATI has managed to pack in 32 pipes. Having 32 happy Scottish chaps in kilts playing the pipes at the same time is perhaps the best metaphor.

We wrote about this before, but I don't think ATI will decide on the final pipe number until it sees Nvidia's G70, the Geforce 7800GTX, in action. ATI will have to choose between a 24-pipeline part and the nasty decision of having all 32 pipes working at once. If R520 manages to defeat G70, ATI will stick to the 24-pipe story and all will be happy in Toronto and ATI's part of Santa Clara. If G70 scores better, then ATI will have to enable all 32 pipes. The real problem is that at 32 pipelines yields are not that good, as we are talking about a very complex chip.
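The yield worry in that last paragraph is the standard defect-density argument. A toy Poisson yield model makes it concrete; the defect density and die areas below are invented numbers, not anything ATI or TSMC has published:

```python
import math

# Toy Poisson model: yield = exp(-defect_density * area_that_must_be_clean)
DEFECT_DENSITY = 0.4  # defects per cm^2 -- an invented figure

def poisson_yield(clean_area_cm2, d0=DEFECT_DENSITY):
    """Fraction of dice with zero defects in the area that must work."""
    return math.exp(-d0 * clean_area_cm2)

full_die = 3.0      # cm^2, a guess for a 300M+ transistor 90nm chip
salvage_area = 2.4  # cm^2 that must be defect-free if a pipe quad can
                    # be fused off and the chip sold as a 24-pipe part

print(f"all 32 pipes good: {poisson_yield(full_die):.0%}")
print(f"24-pipe salvage:   {poisson_yield(salvage_area):.0%}")
```

The point of the sketch: selling a 24-pipe part lets defective-but-salvageable dice ship, which is why enabling all 32 pipes only when forced to is the economically cautious move.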
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
All this speculation is driving me crazy. I can't wait to see these beasts.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: fierydemise
I'm very sorry, you guys, I didn't catch that mistake (typo). I meant "letting nVidia get a jump on them is a bad marketing move".

lol.. ok, i agree with you then! ;)

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Insomniak
I don't understand why everyone is so afraid of admitting bias.

Personally, I find bias to be counterproductive. As far as video card bias goes, I find that biased people will gloss over faults in their favorite video card company's products and exaggerate or even make up defects in their competitor's. And that just leads to confusing or misleading information.

Biased people also tend to be both very defensive and aggressive. And that kind of behavior is helpful to no one.

bias - A preference or an inclination, especially one that inhibits impartial judgment.

biased - To influence in a particular, typically unfair direction; prejudice.


There's nothing wrong with having a preference, however. You can have a preference without being biased. You know what each company has to offer but just happen to like one particular brand for whatever reason.

Anyhow, that's my take on the subject.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Creig
Originally posted by: Insomniak
I don't understand why everyone is so afraid of admitting bias.

Personally, I find bias to be counterproductive. As far as video card bias goes, I find that biased people will gloss over faults in their favorite video card company's products and exaggerate or even make up defects in their competitor's. And that just leads to confusing or misleading information.

Biased people also tend to be both very defensive and aggressive. And that kind of behavior is helpful to no one.

bias - A preference or an inclination, especially one that inhibits impartial judgment.

biased - To influence in a particular, typically unfair direction; prejudice.


There's nothing wrong with having a preference, however. You can have a preference without being biased. You know what each company has to offer but just happen to like one particular brand for whatever reason.

Anyhow, that's my take on the subject.
what people are missing is that Bias does NOT equal preference . . . you quoted the dictionary . . . let me bold it for those who will miss the gist of your post due to their bias.

bi·as
n.

A preference or an inclination, especially one that inhibits impartial judgment.

Bias
:thumbsdown:
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Gamingphreek, you're wrong about the OP's quote being biased one way or the other, you're wrong about nV not having released a "pipeline deficient" card in the past, you're wrong about the FX's "microarchitecture" and compilers being the reason it lags so far behind in DX9 games, you're wrong about nV using FP32 with the FX in most games, you're wrong about nV always having had the edge in AA, and you're wrong about nV having had worse AF until the GF6.

You're just wrong on so many counts, it'd be kind of incredible if you weren't doing this on purpose. My suggestion--at least with respect to nVidia's recent history and this thread in particular--is to post less and read more.

Since when is it biased to think ATI can cram more pipelines in with a smaller process than nVidia? Seems like common sense. Also seems like common sense that they may have yield problems with this new process, at least initially.

Surely the synthetic benchmark numbers and the 3DM03 fiasco taught you that nV must/will use FP16 precision to maintain reasonable performance with their cards?

Surely nV worked their butts off to optimise "DX9" games for their FX architecture. Why is it that Far Cry and HL2 and the like run so much worse on the FX series? In fact, why does the four-pipe FX series run about as fast as the four-pipe 9600 series in those games? It's possible game devs are inept and don't want to maximize FX owners' IQ with their cards. Or it's possible the FX just has some (micro)architectural flaws that can't reasonably be expected to be overcome in the real world of deadlines and "cross-platform" development.

ATI seriously debuted AF with the 8500 *after* nV did with the GF3. ATI's AF had almost no performance hit--compared to the GF3's rather large one--because ATI just used bilinear filtering and angle-dependency, while nV used trilinear filtering and "fully" filtered all angles. nV also went above and beyond with its filtering quality, whereas ATI stuck to the D3D bare minimum. 3DCenter had a good article on this, IIRC, which you should search for.

nV basically had "average" AA until the GF6 series, as did ATI until the 9700 series. 3dfx was first out of the gate with excellent (but at a huge hit) AA with the Voodoo 4/5 series. The 9700 moved ATI from supersample to multisample AA, which improved performance; adopted a jittered grid, which improved quality; integrated gamma-"correction", which also improved quality; and allowed for up to three passes, which allowed for higher max quality at a reasonable speed. The GF3 moved to MSAA, but kept the ordered grid for 4x, so it wasn't that hot (especially compared to 3dfx's pseudo-temporal rotated grid). The FX merely improved speed. The GF6 finally brought rotated grid to both 2x AND 4x, closing the gap to ATI's 2x and 4x modes considerably. Again, 3DCenter had a good article on this, IIRC, which you should search for, but most initial 9700P and 6800U p/reviews should have concise write-ups on their respectively improved AA.
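The ordered-grid vs rotated-grid distinction above comes down to where the samples sit inside the pixel. A toy illustration; the sample positions and rotation angle here are my own choices, not any vendor's actual pattern:

```python
import math

# 4x ordered grid: samples on a regular 2x2 lattice inside the pixel.
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

def rotate(points, angle_rad, cx=0.5, cy=0.5):
    """Rotate sample points about the pixel center."""
    out = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        out.append((cx + dx * math.cos(angle_rad) - dy * math.sin(angle_rad),
                    cy + dx * math.sin(angle_rad) + dy * math.cos(angle_rad)))
    return out

# 4x rotated grid: the same lattice turned ~26.6 degrees, so no two
# samples share an x or y coordinate.
rotated = rotate(ordered, math.atan2(1, 2))

# A near-vertical edge sweeping across the pixel crosses as many distinct
# x positions as there are unique sample x coordinates -- more distinct
# positions means more intermediate coverage levels, i.e. smoother edges.
print(len({round(x, 3) for x, _ in ordered}))   # -> 2
print(len({round(x, 3) for x, _ in rotated}))   # -> 4
```

With the ordered grid, a near-vertical edge only ever covers 0, 2, or 4 samples; the rotated grid exposes all four intermediate steps, which is exactly why rotated-grid 4x looks better on near-axis edges.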

No disrespect intended, but so much of what you posted is wrong. It's been pointed out by other people, but maybe spelling it out in detail will clarify your errors. Don't take it personally, just learn and help the next guy out, like the rest of us try to. :)

Edit: My first two paragraphs seem more inflammatory than I mean them to be. I'll leave my post as is, just know that I didn't mean to come across so, well, mean. We all have to learn somehow, and I hope you learned more from my post than that I'm easily exasperated.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
If Nvidia is having their official G70 unveiling on June 21, I wonder how long it'll be after that until retailers can actually get them in stock. AND in decent quantities.

That last generation ATI/Nvidia paper launch was simply ridiculous.
 

ddogg

Golden Member
May 4, 2005
1,864
361
136
Originally posted by: Avalon
All this speculation is driving me crazy. I can't wait to see these beasts.

lol me 2....just driving me nuts with ppl arguing over which card will be the best.