New info on the G70 and R520...


ddogg

Golden Member
May 4, 2005
1,864
361
136
Originally posted by: Creig
If Nvidia is having their official G70 unveiling on June 21, I wonder how long it'll be after that until retailers can actually get them in stock. AND in decent quantities.

That last generation ATI/Nvidia paper launch was simply ridiculous.

Hopefully at least their high-end and midrange cards will be available... the flagship card will definitely be rare and come at a hefty price.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I think they will be for sale June 22. Will make it to Canada by August. :beer:
 
Mar 19, 2003
18,289
2
71
Originally posted by: ronnn
I think they will be for sale June 22. Will make it to Canada by August. :beer:

I sure HOPE they're available right after the launch (non-paper, if we're lucky :p). Maybe not the next day, but within a matter of weeks, not months. I also hope they're available at MSRP and not price gouged (since the MSRP is already high enough as it is)....

But what are the chances of that? :p:(

Edit: Though the reports of G70 already being in production do give me some hope. We just don't want a repeat of last year's madness. ;)
 

biostud

Lifer
Feb 27, 2003
19,951
7,049
136
Personally, I'm not going to decide which to buy before I know the specs and pricing of the cards. So even if nVidia gets a head start on the launch date, I'll still wait to see what the R520 has to offer before buying. I'd guess most people are going to do the same. So even if nVidia gets their card launched first, it probably isn't going to help them sell lots of G70s before the R520 has been benched. This of course depends on how large the speed margin is and how long we'll have to wait before the cards are actually in stock.
 
Mar 19, 2003
18,289
2
71
Originally posted by: biostud
Personally, I'm not going to decide which to buy before I know the specs and pricing of the cards. So even if nVidia gets a head start on the launch date, I'll still wait to see what the R520 has to offer before buying. I'd guess most people are going to do the same. So even if nVidia gets their card launched first, it probably isn't going to help them sell lots of G70s before the R520 has been benched. This of course depends on how large the speed margin is and how long we'll have to wait before the cards are actually in stock.

I agree. If Nvidia releases a 24-pipe GTX and then (within a few months) ATI releases a 32-pipe card and Nvidia has to bring out a 32-pipe Ultra to compete... then it will be better to have waited for the better card, assuming they're not charging $1200 for it, and assuming you don't NEED a new card immediately, obviously. (Since I already have a 6800GT and therefore don't need to upgrade too badly, I might as well make the most of the upgrade when I get to that point.)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
ATI seriously debuted AF with the 8500 *after* nV did with the GF3.
ATi already had fast 16xAF with the original Radeon, long before the GF3 was in the picture.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Pete
Gamingphreek, you're wrong about the OP's quote being biased one way or the other, you're wrong about nV not having released a "pipeline deficient" card in the past, you're wrong about the FX's "microarchitecture" and compilers being the reason it lags so far behind in DX9 games, you're wrong about nV using FP32 with the FX in most games, you're wrong about nV always having had the edge in AA, and you're wrong about nV having had worse AF until the GF6.

You're just wrong on so many counts, it'd be kind of incredible if you weren't doing this on purpose. My suggestion--at least with respect to nVidia's recent history and this thread in particular--is to post less and read more.

Since when is it biased to think ATI can cram more pipelines in with a smaller process than nVidia? Seems like common sense. Also seems like common sense that they may have yield problems with this new process, at least initially.

Surely synthetic benchmark numbers and the 3DM03 fiasco taught you that nV must/will use FP16 precision to maintain reasonable performance with their cards?

Surely nV worked their butts off to optimise "DX9" games to their FX architecture. Why is it that Far Cry and HL2 and the like run so much worse on the FX series? In fact, why does the four-pipe FX series run about as fast as the four-pipe 9600 series in those games? It's possible game devs are inept and don't want to maximize FX owners' IQ with their cards. Or it's possible the FX just has some (micro)architectural flaws that can't be reasonably expected to be overcome in the real world of deadlines and "cross-platform" development.

ATI seriously debuted AF with the 8500 *after* nV did with the GF3. ATI's AF had almost no performance hit--compared to the GF3's rather large one--because ATI just used bilinear filtering and angle-dependency, while nV used trilinear filtering and "fully" filtered all angles. nV also went above and beyond with its filtering quality, whereas ATI stuck to the D3D bare minimum. 3DCenter had a good article on this, IIRC, which you should search for.

nV basically had "average" AA until the GF6 series, as did ATI until the 9700 series. 3dfx was first out of the gate with excellent (but at a huge hit) AA with the Voodoo 4/5 series. The 9700 moved ATI from supersample to multisample AA, which improved performance; adopted a jittered grid, which improved quality; integrated gamma-"correction", which also improved quality; and allowed for up to three passes, which allowed for higher max quality at a reasonable speed. The GF3 moved to MSAA, but kept the ordered grid for 4x, so it wasn't that hot (especially compared to 3dfx's pseudo-temporal rotated grid; see the quick sketch after this quote for why the grid shape matters). The FX merely improved speed. The GF6 finally brought the rotated grid to both 2x AND 4x, closing the gap to ATI's 2x and 4x modes considerably. Again, 3DCenter had a good article on this, IIRC, which you should search for, but most initial 9700P and 6800U p/reviews should have concise write-ups on their respective AA improvements.

No disrespect intended, but so much of what you posted is wrong. It's been pointed out by other people, but maybe spelling it out in detail will clarify your errors. Don't take it personally, just learn and help the next guy out, like the rest of us try to. :)

Edit: My first two paragraphs seem more inflammatory than I mean them to be. I'll leave my post as is, just know that I didn't mean to come across so, well, mean. We all have to learn somehow, and I hope you learned more from my post than that I'm easily exasperated.
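To make the ordered-grid vs rotated-grid point in the quote concrete, here's a minimal Python sketch. The sample positions below are illustrative only, not any vendor's actual pattern:

```python
# Why a rotated 4x grid resolves near-horizontal edges better than an
# ordered grid: coverage along such an edge can only step through as
# many levels as there are distinct vertical sample offsets.
# (Positions are illustrative, not actual NV/ATI/3dfx patterns.)
ordered = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
rotated = [(-0.375, -0.125), (0.125, -0.375), (-0.125, 0.375), (0.375, 0.125)]

def edge_gradations(samples):
    """Distinct y offsets = coverage steps for a near-horizontal edge."""
    return len({y for _, y in samples})

print("ordered 4x:", edge_gradations(ordered), "gradations")  # 2
print("rotated 4x:", edge_gradations(rotated), "gradations")  # 4
```

On near-horizontal and near-vertical edges (the worst-case stairsteps), ordered 4x behaves about like 2x, while the rotated grid gets the full four steps per pixel; that's the gap being described between the GF3 and the 9700/Voodoo 5.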

Maybe I confused the AA and AF scenarios with this generation. I know that this generation the AF was fixed, but it still suffers a large performance hit due to the loss of an ALU.

I believe there was a thread on this not too long ago (made by me)... something like "What's wrong with Nvidia". That thread is where I first saw the information I posted.

-Kevin
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
For those who were interested in seeing what it looks like in the box (as I said, since the images are already out there, I'm only providing these based on what you've already seen).

http://server.counter-strike.net/images/misc/exteriorflash.jpg
http://server.counter-strike.net/images/misc/exteriornoflash.jpg
http://server.counter-strike.net/images/misc/interiorflash.jpg
http://server.counter-strike.net/images/misc/interiorflash2.jpg
http://server.counter-strike.net/images/misc/interiornoflash1.jpg

Take them for what they're worth.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: Ronin
For those who were interested in seeing what it looks like in the box (as I said, since the images are already out there, I'm only providing these based on what you've already seen).

http://server.counter-strike.net/images/misc/exteriorflash.jpg
http://server.counter-strike.net/images/misc/exteriornoflash.jpg
http://server.counter-strike.net/images/misc/interiorflash.jpg
http://server.counter-strike.net/images/misc/interiorflash2.jpg
http://server.counter-strike.net/images/misc/interiornoflash1.jpg

Take them for what they're worth.


Cool looking case
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: Ronin
For those who were interested in seeing what it looks like in the box (as I said, since the images are already out there, I'm only providing these based on what you've already seen).

http://server.counter-strike.net/images/misc/exteriorflash.jpg
http://server.counter-strike.net/images/misc/exteriornoflash.jpg
http://server.counter-strike.net/images/misc/interiorflash.jpg
http://server.counter-strike.net/images/misc/interiorflash2.jpg
http://server.counter-strike.net/images/misc/interiornoflash1.jpg

Take them for what they're worth.

Ronin, you rock.

:thumbsup:

18 more days, woot. Countdown time.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
This is just my take on it, so don't "prove me wrong". I never said these were facts; this is just my prediction of what will happen.

ATI will have trouble with its new process and 32 pipes, so should a 24-pipe G70 come out ahead, ATI will be in trouble because of an inferior architecture. The generation after that, the situation will reverse, because ATI will already have a 90nm part.

NVIDIA thinks they can get ahead by releasing first, and ATI thinks it's better to wait and release a better card later.

NVIDIA's card will show up a month after its 6/21 "launch" at MSRP (or remotely reasonable) prices. Until then, the cards will all be on pre-order at seriously inflated prices, with just a couple available. Since NVIDIA has already started mass-producing them, they're at least prepared for the worst. So I think this coming generation will still have a paper launch, just not nearly as bad. The exception will be the G70 GTX, which would be analogous to a 6800 Ultra Extreme for a couple of months.

(All these are single-slot.)

Month after 6/21 "launch":

7800 GTX PCI-E (G70, 24 pipes) - $700+
7800 GT PCI-E (G70, 24 pipes) - $550
7800 PCI-E (G70, 20 pipes, unlockable) - $425

Another month thereafter:

7800 GTX PCI-E (G70, 24 pipes) - $600+
7800 GT PCI-E (G70, 24 pipes) - $485
7800 PCI-E (G70, 20 pipes, unlockable) - $400

Later on, 4 months after the 7800 is announced, a 7600 will be announced, and a month later:

7600 GTX PCI-E (G72, 16 pipes) - $325
7600 PCI-E (G73, 12 pipes, not unlockable) - $275

Hey, I can even make up fake codenames. :D

Note that I'm a pessimist. Let's hope my predictions don't come true.
 

bjc112

Lifer
Dec 23, 2000
11,460
0
76
Wonder when the AGP release will happen..

I assume not immediately. But who knows?!?!
 

zendari

Banned
May 27, 2005
6,558
0
0
Too bad these cards will still cost your left nut to buy, with the 6800s still being $400+.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Conjecture?

It's highly possible you won't see one. I'd venture it's going to depend on the adoption rate with this release (and with the PCIe variants of the 6xxx cards), but if you do see any AGP releases, it will probably be only one card from either/both companies, and not much more.
 

nombrecinq

Senior member
May 15, 2005
245
0
0
I just bought the ATI x800xl! And I don't even have the rest of the parts for my computer. I'll probably be ordering the other stuff this month. So... should I sell the ATI card that I haven't even used yet in preparation to buy a new nVidia card? How much will the G70 debut at?
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
On the last image, what chip is the top one? I assume the bottom is the G70, but what is the top one? I can't make out the name on the core, but it looks like a 6600GT, or at least something GT.

-Kevin

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I know that this generation the AF was fixed, but it still suffers a large performance hit due to the loss of an ALU.
The potential performance hit is there if the game is heavily shader-based and you're using high levels of AF. Of course, you can use SM3.0 to lower the workload on the shader units, which helps counter the problem.
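For what it's worth, here's a toy CPU-side sketch of that dynamic-branching idea. Everything in it (the radius, the pixel counts, the fake lighting math) is invented for illustration; a real shader would express the same branch in SM3.0 HLSL:

```python
# Toy model of SM3.0 dynamic branching: pixels a light can't reach take
# an early-out instead of paying for the full shading math. Under SM2.0
# a shader effectively paid for both sides of such a branch.
import math
import random

LIGHT_RADIUS = 10.0  # invented value, illustration only

def shade_pixel(dist_to_light):
    if dist_to_light > LIGHT_RADIUS:   # the dynamic branch
        return 0.0                     # cheap path: no lighting work
    # expensive path: attenuation plus a fake specular-style term
    atten = 1.0 - dist_to_light / LIGHT_RADIUS
    return atten + math.pow(atten, 8)

pixels = [random.uniform(0.0, 30.0) for _ in range(10000)]
colors = [shade_pixel(d) for d in pixels]
skipped = sum(1 for d in pixels if d > LIGHT_RADIUS)
print(f"{skipped}/{len(pixels)} pixels took the cheap path")
```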
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: Gamingphreek

On the last image, what chip is the top one? I assume the bottom is the G70, but what is the top one? I can't make out the name on the core, but it looks like a 6600GT, or at least something GT.

-Kevin

Top one is a 6600GT, bottom is a 7800GTX.

 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: BFG10K
I know that this generation however the AF was fixed however still suffers a large performance hit due to the loss of an ALU.
The potential performance hit is there if the game is heavily shader based and you're using high levels of AF. Of course you can use SM3.0 to lower the workload on the shader units which helps counter the problem.

Oh, that would make sense. If the game isn't heavily stressing the shaders, it isn't really stressing the ALUs and can afford to lose one without much performance loss. Thanks!
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I can't make any predictions for R520 and am not really interested in doing so. I will note that I am reasonably sure that R520 won't be based too heavily on the old R300 design (fanboys and ATi engineers have suggested that all that's needed is to bolt SM3.0 onto the existing design, but it isn't that simple IMO. ATi's current GPUs lack many things that are mandatory for SM3.0, like branching/looping, and take many hardware shortcuts. I think the entire pipeline has had to be redesigned as a consequence, and this is the main reason R520 has taken so long to appear).

On nVidia's side, my predictions are that we will see a full gamut of filtering and AA options for the floating point rendering (OpenEXR) modes.

We should also see a performance increase in shader operations due to further improvements in on-chip resource scheduling (this was a major factor in NV3x performing as poorly as it did, and was corrected for NV40, allowing the driver's JIT compiler to extract the most performance possible from the chip each cycle). A modest increase in registers could yield significant performance gains; NV40 does not have many more registers per pipe than NV3x did.
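A back-of-the-envelope sketch of that register point (all numbers invented, not NV3x/NV40 specs): the register file caps how many pixels a pipe can keep in flight, and fewer pixels in flight means less memory latency gets hidden.

```python
# Invented numbers, purely illustrative -- not NV3x/NV40 specs.
REGISTER_FILE_SLOTS = 64  # temp-register slots available per pipe

def pixels_in_flight(regs_per_pixel):
    """Pixels that can be resident at once, given per-pixel register use."""
    return REGISTER_FILE_SLOTS // regs_per_pixel

for regs in (2, 4, 8):
    print(f"{regs} temp regs/pixel -> {pixels_in_flight(regs)} pixels in flight")
# Doubling a shader's register use halves the pixels in flight, which is
# why even a modest register increase can pay off disproportionately.
```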

Other things that could possibly happen (but that I don't necessarily expect until the 90nm version of G70 hits, if at all) would include adding TMU capabilities to the second large ALU in each pipeline and making both large ALUs capable of performing all math operations. It's also possible that 128-bit floating point rendering will be introduced (the PlayStation 3 GPU features it), but personally I don't think G70 will be fast enough until the 90nm refresh.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
IIRC, it was more than 2 months from "Launch" to the first appearance of a SKU. If history repeats, August for the G70. Widely available in mid/late September.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Ya know what would be funny...

nVidia: "here's our 24 pipeline GTX"
ATI: few months later "here's our 32 pipeline card... haha! take that!"
nVidia: "here's our 48 pipeline Ultra Extreme... forgot we had this..."