Should I wait for the 7 series


Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Lonyo
Originally posted by: keysplayr2003
Originally posted by: Bar81
Inq is claiming it and the R520 will show at Computex and that the 7xxx architecture will be TWICE as fast as the 6xxx series. If that's true, DAMN. I upgrade when there's double the performance to be had, so maybe I'll have to take a look at next gen after all. Of course, in the concern camp is that the new GPU from Nvidia will still run on a .11 micron process, which means it's likely to be *very* hot (hopefully the dustbuster cooling units won't be making a comeback in Nvidia land), while the R520 will be running on the cooler .09 micron process.

Yes, DigiTimes reports the G70 will be built on the 110nm process. And unless they fixed the 90nm process, the R520 can count on about 15% current leakage. Kind of like what happened to Prescott. This is of course only if the problem has not been addressed. Hopefully it has been. I'm sure by now it is a more refined process.
AMD doesn't seem to have had issues with 0.09 micron (90nm) technology, so it could be we see ATI having the situation AMD had, and not Intel.
But who knows?

Both AMD and Intel fab their own parts. ATI and Nvidia use IBM/TSMC, and I believe there is a third widely used company. Starts with a U. Anyway, it all depends on how TSMC does with their 90nm fab process. By now, I think they will do ok.

 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
Originally posted by: keysplayr2003
Yes, DigiTimes reports the G70 will be built on the 110nm process. And unless they fixed the 90nm process, the R520 can count on about 15% current leakage. Kind of like what happened to Prescott. This is of course only if the problem has not been addressed. Hopefully it has been. I'm sure by now it is a more refined process.

Why doesn't nVidia use SOI technology like AMD to keep current leakage to a minimum? For that matter, why doesn't Intel? :)

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
I think AMD got help from IBM WRT SOI on the A64 (acronyms akimbo!). I wonder if SOI would help significantly with the relatively low (compared to CPUs) clock speed of GPUs?

I think both NV and ATI are working with TSMC again, so whether they use SOI or low-k is really up to whether TSMC offers it.

IIRC, that "U" company is UMC, and I believe it tends to lag behind TSMC in terms of processes (and maybe capacity). I'm not sure at all, though--that's a faint memory of another forum post.
 

thegimp03

Diamond Member
Jul 5, 2004
7,420
2
81
If the recent (note: last year's) release of the next-gen video cards is any sign of things to come, you won't be able to find a G70 on shelves at a reasonable price for 3-5 months after its "release". They will cite production line problems in order to keep the prices ridiculously high. It's the same old game. I would just buy a mid-line card right now - 6600 GT/6800NU. They play everything out there, and you can get them for decent prices. Later on, sell it and buy the G70. Otherwise you'll probably be so anxious to buy the next-gen card that you'll pay way over MSRP for it. Just my 2 cents.
 

Tanclearas

Senior member
May 10, 2002
345
0
71
Look for a deal on the 6800GT. Nvidia (more specifically, GeForce) is kind of like the Star Trek movies (God, I'm such a geek). The even-numbered ones always seem to be the best.