NV40 Specs


Goi

Diamond Member
Oct 10, 1999
6,771
7
91
S1 SRAM? SRAM has traditionally been more expensive in large capacities than DRAM, so while it's definitely fast, I don't see how it's cheap.
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
Core process of 0.09 microns... if they're still going to be made at TSMC, I can foresee the delays coming already...
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Originally posted by: Goi
S1 SRAM? SRAM has traditionally been more expensive in large capacities than DRAM, so while it's definitely fast, I don't see how it's cheap.

I meant 1T-SRAM, and no, it isn't that expensive.
 

Goi

Diamond Member
Oct 10, 1999
6,771
7
91
Hmm, I haven't been following the development of 1T-SRAM... but since it only uses 1 transistor per cell, I guess it would be cheaper than conventional SRAM.
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
1T-SRAM is used in the GameCube: on the Flipper chip, for the main memory, and for the A-memory. That's one of the reasons the GameCube is so cheap compared to other consoles.
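For a rough sense of why the single-transistor cell matters for cost, here's a back-of-the-envelope sketch. Assumptions not in the thread: cell transistors only are counted (sense amps, decoders, and refresh logic are ignored), and 24MB is used as the example size (the GameCube's main memory capacity, if memory serves).

```python
BITS_PER_MB = 8 * 1024 * 1024

def cell_transistors(megabytes, transistors_per_bit):
    # Cell-array transistors only -- peripheral and refresh logic are ignored.
    return megabytes * BITS_PER_MB * transistors_per_bit

# Conventional SRAM uses a 6-transistor cell; 1T-SRAM uses a 1-transistor
# (DRAM-style) cell hidden behind an SRAM-like interface.
sram_6t = cell_transistors(24, 6)   # 24MB built from 6T SRAM cells
sram_1t = cell_transistors(24, 1)   # the same 24MB in 1T-SRAM
print(f"6T: {sram_6t / 1e6:.0f}M transistors, 1T: {sram_1t / 1e6:.0f}M")
```

Six times fewer cell transistors means a far smaller die area for the same capacity, which is where the cost advantage over conventional SRAM comes from.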
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
How will they keep an 800 MHz core cool, is my question. Hell, the FX cards are power hogs as it is.
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
Perhaps they'll do some heavy optimisation of the core, or use some type of exotic cooling solution.
 

godmare

Diamond Member
Sep 25, 2002
5,121
0
0
Originally posted by: Regs
How will they keep an 800 MHz core cool, is my question. Hell, the FX cards are power hogs as it is.

.09 micron will run much cooler than .15 or even .13.
 

Glitchny

Diamond Member
Sep 4, 2002
5,679
1
0
Originally posted by: godmare
Originally posted by: Regs
How will they keep an 800 MHz core cool, is my question. Hell, the FX cards are power hogs as it is.

.09 micron will run much cooler than .15 or even .13.

Yes, but since they're having trouble and are only now working out the .13 process, moving immediately to .09 may be a bad move by NVIDIA if the process isn't mature enough yet.
 

godmare

Diamond Member
Sep 25, 2002
5,121
0
0
Originally posted by: Glitchny
Originally posted by: godmare
Originally posted by: Regs
How will they keep an 800 MHz core cool, is my question. Hell, the FX cards are power hogs as it is.

.09 micron will run much cooler than .15 or even .13.

Yes, but since they're having trouble and are only now working out the .13 process, moving immediately to .09 may be a bad move by NVIDIA if the process isn't mature enough yet.

That's a given, of course.
Performance aside, though, the .09 micron process is cooler.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Ok, let me see if I'm just getting the wrong picture.

You have a GPU running at the speed of an Intel P3, with fast RAM that constantly transfers data during gaming, on maybe a 8x4 video card... and you're telling me that a .09 micron process will help make this beast cool?

Either way, I can't see someone running this video card without at least a true 350-watt PSU.
 

godmare

Diamond Member
Sep 25, 2002
5,121
0
0
Originally posted by: Regs
Ok, let me see if I'm just getting the wrong picture.

You have a GPU running at the speed of an Intel P3, with fast RAM that constantly transfers data during gaming, on maybe a 8x4 video card... and you're telling me that a .09 micron process will help make this beast cool?
P3s were made on .25 and .18 micron processes (link), so a .09 micron GPU will definitely be cooler, yes.

To answer your initial question more directly:
They will cool it the same way GPUs are cooled now, the same way CPUs have been cooled since the inception of the Pentium, and the same way hot-running electronics have always been cooled: with heatsinks and fans.

They're not just going to say "well, it's just too hot, we'll scrap this project altogether."
Instead, they'll say "well, you can cool an 80+ watt Athlon XP running at over 2 GHz nearly silently, and it's not even .09 micron, so just maybe this will work."

Edit regarding your psu comment:
Yes, that's likely true. My GF4 recommended a 350-watt PSU on the box, as do higher-end Radeons, AFAIK. This is the direction of computers in general, though. The bare minimum anyone will run on a current ATX setup is 300 watts anyway, so 350 isn't outlandish, IMO.
That segues into a different conversation about the difference between quality psus and poor psus. That's for a different forum, though.
 

Goi

Diamond Member
Oct 10, 1999
6,771
7
91
I just hope the cooling solution won't turn out to be another FlowFX, and I'm sure I speak for everyone here. If the trend continues, they're gonna need to call for a new ATX spec.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Point taken, godmare. I don't see this NV40 coming out any time soon, but I would like to see how they puzzle it together.
 

godmare

Diamond Member
Sep 25, 2002
5,121
0
0
Originally posted by: Regs
Point taken, godmare. I don't see this NV40 coming out any time soon, but I would like to see how they puzzle it together.

Yes, we all would :D
 

Slaimus

Senior member
Sep 24, 2000
985
0
76
Do not forget the PS2 already has 4MB of embedded DRAM in its graphics chip, so 16MB in another year would be more than achievable.
 

akapadia

Junior Member
Feb 28, 2003
7
0
0
These specs don't add up. The NV40 chip will be out well before Q3 '04, and if it includes 16MB of embedded DRAM, it's gonna be much bigger than 350 million transistors, considering that the current GFFXs are around 125 million and the core logic should approximately double in the next generation. Remember that on CPUs, 1MB of memory corresponds to about 40+ million transistors.

I also seriously doubt TSMC will be 0.09-ready by then; it's gonna be much more difficult than .13, since low-k will become a must. My guess is some bored freak on the Futuremark forum decided to write down some creative stuff. However, it does make you realize that by Q3 2004 all of our graphics cards, including the NV35 and Radeon 9800, will suck!
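The transistor arithmetic here can be sketched roughly. Assumptions not stated in the post: embedded DRAM/1T-SRAM at 1 transistor (plus a capacitor) per bit, versus the ~6 transistors per bit of the 6T SRAM caches behind the "40+ million per MB" CPU figure; the 125M count and the doubling of core logic are taken from the post itself.

```python
def memory_cell_transistors(megabytes, transistors_per_bit):
    """Cell-array transistors only; sense amps, decoders, refresh ignored."""
    return megabytes * 8 * 1024 * 1024 * transistors_per_bit

# 6T SRAM cache, as on CPUs: ~50M transistors per MB (the "40+ million" figure)
cpu_cache_per_mb = memory_cell_transistors(1, 6)

# 16MB of embedded DRAM at 1 transistor per bit
edram_16mb = memory_cell_transistors(16, 1)

core_logic = 2 * 125_000_000      # GeForce FX ~125M, doubled for the next gen
total = core_logic + edram_16mb
print(f"eDRAM: {edram_16mb / 1e6:.0f}M, "
      f"total: {total / 1e6:.0f}M vs. rumored 350M")
```

Even at 1 transistor per bit, 16MB of embedded memory alone comes to roughly 134M transistors, so doubled core logic plus eDRAM lands above the rumored 350M, which is the poster's point.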
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Q3 2004 all of our graphics cards, including the NV35 and Radeon 9800, will suck!

That depends on your definition of suck. More powerful graphics cards are being developed faster than games with high-end graphics engines are being produced. Take 8X AGP and DirectX 9: the only game we've heard of lately that will actually use the power of an ATI 9700/9800 or an FX is Doom 3. I'm sure others will follow, but their progress in catching up to the hardware specs seems hindered.