
NV40 Specs

S1 SRAM? SRAM has traditionally been more expensive in large capacities than DRAM, so while it's definitely fast, I don't see how it's cheap.
 
Core process of 0.09 microns..... if they are still going to be made at TSMC, I can foresee the delays coming already.....
 
Originally posted by: Goi
S1 SRAM? SRAM has traditionally been more expensive in large capacities than DRAM, so while it's definitely fast, I don't see how it's cheap.

I meant 1T-SRAM, and no, it isn't that expensive.
 
Hmmn, I haven't been following the development of 1T-SRAM...but since it only uses 1 transistor, it would be cheaper than conventional SRAM I guess.
 
1T-SRAM is used in the GameCube, on the Flipper chip, for the main memory and the A-memory. That's one of the reasons the GameCube is so cheap compared to other consoles.
 
How will they keep an 800 MHz core cool is my question. Hell, the FX cards are power hogs as is with their cores.
 
Perhaps they might do some heavy optimisation to the core, or use some type of exotic cooling solution.
 
Originally posted by: Regs
How will they keep an 800 MHz core cool is my question. Hell, the FX cards are power hogs as is with their cores.

.09 micron will run much cooler than .15 or even .13.
 
Originally posted by: godmare
Originally posted by: Regs
How will they keep an 800 MHz core cool is my question. Hell, the FX cards are power hogs as is with their cores.

.09 micron will run much cooler than .15 or even .13.

Yes, but since they're having trouble and are only now working out the .13 process, moving immediately to .09 may be a bad move by nVidia if the process isn't mature enough yet.
 
Originally posted by: Glitchny
Originally posted by: godmare
Originally posted by: Regs
How will they keep an 800 MHz core cool is my question. Hell, the FX cards are power hogs as is with their cores.

.09 micron will run much cooler than .15 or even .13.

Yes, but since they're having trouble and are only now working out the .13 process, moving immediately to .09 may be a bad move by nVidia if the process isn't mature enough yet.

That's a given, of course.
Performance aside, though, the .09 micron process is cooler.
 
OK, let me see if I'm just getting the wrong picture.

You have a GPU running at the speed of an Intel P3, with fast RAM that constantly transfers data during gaming, on maybe an 8x4 video card..... You're telling me that a .09 micron process will help make this beast cool?

Either way, I cannot see someone running this video card without at least a true 350-watt PSU.
 
Originally posted by: Regs
OK, let me see if I'm just getting the wrong picture.

You have a GPU running at the speed of an Intel P3, with fast RAM that constantly transfers data during gaming, on maybe an 8x4 video card..... You're telling me that a .09 micron process will help make this beast cool?
P3s were made on .25 and .18 micron processes (link), so a .09 micron GPU will definitely be cooler, yes.

To answer your initial question more directly:
They will cool it the same way GPUs are cooled now, the same way CPUs have been cooled since the inception of the Pentium, the same way hot-running electronics have always been cooled: with heatsinks and fans.

They're not just going to say "well, it's just too hot, we'll scrap this project altogether."
Instead, they'll say "well, you can cool an 80+ watt Athlon XP running at over 2 GHz nearly silently, and it's not even .09 micron, so just maybe this will work."

Edit regarding your PSU comment:
Yes, that is likely true. My GF4 recommended a 350-watt PSU on the box, as do higher-end Radeons, AFAIK. This is the direction of computers in general, though. The bare minimum anyone will run on a current ATX setup is 300 watts anyway, so 350 isn't outlandish, IMO.
That segues into a different conversation about the difference between quality psus and poor psus. That's for a different forum, though.
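
Editor's note: to put some rough numbers behind the "smaller process runs cooler" argument in this thread, here's a back-of-envelope sketch using the classic dynamic switching-power estimate P ≈ C·V²·f. The capacitance and voltage figures are illustrative guesses of mine, not published nVidia specs:

```python
# Rough first-order scaling sketch for the "smaller process runs
# cooler" argument, using the classic dynamic switching-power
# estimate P ~= C * V^2 * f.  The capacitance and voltage numbers
# below are hypothetical, chosen only to illustrate the trend.

def dynamic_power(cap, volts, freq):
    """Relative dynamic power: capacitance * voltage^2 * frequency."""
    return cap * volts * volts * freq

# Hypothetical operating points: a .13 micron part at 500 MHz vs.
# a .09 micron part at 800 MHz with shrunk capacitance and voltage.
p_130nm = dynamic_power(cap=1.00, volts=1.5, freq=500e6)
p_90nm = dynamic_power(cap=0.70, volts=1.2, freq=800e6)

print(f"90 nm @ 800 MHz draws {p_90nm / p_130nm:.2f}x the power "
      f"of 130 nm @ 500 MHz under these assumptions")
```

Under those assumed numbers the 800 MHz part actually draws less switching power than the slower .13 micron one, which is the thread's point; in practice leakage current eats into this, so treat it as a trend, not a prediction.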
 
I just hope the cooling solution won't turn out to be another FlowFX, and I'm sure I speak for everyone here. If the trend continues, they're gonna need to call for a new ATX spec.
 
Point taken godmare. I don't see this NV40 coming out any time soon but I would like to see how they puzzle it together.
 
Do not forget the PS2 already has 4MB of embedded DRAM in its graphics chip, so 16MB in another year would be more than achievable.
 
These specs don't add up. The NV40 chip will be out well before Q3 '04, and if it includes 16MB of embedded DRAM, it's gonna be much bigger than 350 million transistors, considering that the current GFFXs are around 125 million and the core logic should roughly double in the next generation. Remember that on CPUs, 1MB of memory corresponds to about 40+ million transistors. I also seriously doubt TSMC will be 0.09-ready by then; it's gonna be much more difficult than .13, since low-k will become a must. My guess is some bored freak on the Futuremark forum decided to write down some creative stuff. However, it does make you realize that by Q3 2004, all of our graphics cards, including the NV35 and Radeon 9800, will suck!
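
Editor's note: the transistor arithmetic in this post can be sanity-checked with a quick script. The per-bit cell counts are assumptions (~6 transistors per bit for conventional SRAM, ~1 per bit for 1T-SRAM/embedded DRAM, ignoring decoders and sense amps), and "core logic doubles" is the poster's own guess:

```python
# Sanity check on the transistor-count arithmetic.  Cell sizes are
# assumptions: ~6 transistors per bit for conventional 6T SRAM,
# ~1 per bit for 1T-SRAM / embedded DRAM (array cells only).

BITS_PER_MB = 1024 * 1024 * 8

def transistors(megabytes, per_bit):
    """Transistor count for a memory array of the given size."""
    return megabytes * BITS_PER_MB * per_bit

core_logic = 2 * 125e6              # poster's guess: double ~125M GFFX
edram_16mb = transistors(16, 1)     # 16MB as 1T cells
sram_16mb = transistors(16, 6)      # 16MB as conventional 6T SRAM

print(f"1MB of 6T SRAM:   {transistors(1, 6) / 1e6:.0f}M transistors")
print(f"16MB as 1T eDRAM: {edram_16mb / 1e6:.0f}M transistors")
print(f"logic + 1T eDRAM: {(core_logic + edram_16mb) / 1e6:.0f}M total")
```

The 6T figure for 1MB comes out around 50M, in line with the "40+ million" rule of thumb quoted above, and even with cheap 1T cells the logic-plus-memory total lands somewhat above 350M, which supports the poster's skepticism.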
 
by Q3 2004 all of our graphics cards, including the NV35 and Radeon 9800, will suck!

Depending on your definition of suck. More powerful graphics cards are being developed faster than high-end graphics-engine games are being produced. Same with 8X AGP and DirectX 9. The only game we've heard of lately that will actually use the power of an ATI 9700/9800 or an FX is Doom 3. I'm sure others will follow, but their progress on catching up to the hardware specs seems to be hindered.


 