D10U-30 from nVidia

MXD TESTL4B

Junior Member
Apr 7, 2008
15
0
0
Hi,

Had Nvidia in-house (kinda late) and NO, it's not a G92 derivative. It's the next NVIDIA chip generation (no name yet),
nothing to do with the G9x chips...

Some notes I made:

D10U-30 -> 1024MB GDDR3 -> 512Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]
D10U-20 -> 896MB GDDR3 -> 448Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]


Supporting Hybrid Power & Hybrid Boost.....get your power supply prepared for the 8pin connectors in the future ^^
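
For context, here's a quick back-of-the-envelope sketch of what that connector combination can actually deliver, using the standard PCI Express power limits (not figures from the post itself):

```python
# Rough power-budget check for the connector setup listed above.
# Per the PCI Express spec: the x16 slot supplies up to 75 W, a 6-pin
# connector up to 75 W, and an 8-pin connector up to 150 W.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

max_board_power = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Maximum deliverable board power: {max_board_power} W")  # 300 W

# The quoted 225-250 W draw fits under that ceiling, but exceeds the 225 W
# that slot + 6-pin + 6-pin could guarantee -- hence the 8-pin connector.
for draw in (225, 250):
    print(f"{draw} W draw leaves {max_board_power - draw} W of headroom")
```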

#####

Besides:

Nvidia is going to support the 9800GTX for a full year! I guess it's only because they will come up with
a die shrink to 55nm in the second half of the year ^^
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
So it's actually a "test" version of the upcoming G200?

Like the 8800GT 512MB was to the 9800GTX, i.e. a test version of G92?
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: MXD TESTL4B
Hi,

Had Nvidia in-house (kinda late) and NO, it's not a G92 derivative. It's the next NVIDIA chip generation (no name yet),
nothing to do with the G9x chips...

Some notes I made:

D10U-30 -> 1024MB GDDR3 -> 512Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]
D10U-20 -> 896MB GDDR3 -> 448Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]

Supporting Hybrid Power & Hybrid Boost.....get your power supply prepared for the 8pin connector ^^

#####

Besides:

Nvidia is going to support the 9800GTX for a full year! I guess it's only because they will come up with
a die shrink to 55nm in the second half of the year ^^


Uhhh, 250W power requirement... Come on Nvidia, that's gotta be a joke. What does the 8800 Ultra use? Half that?

 

bdubyah

Senior member
Nov 20, 2007
541
1
81
So you need an 8-pin and a 6-pin? God, this is getting ridiculous. Why don't they just include a power cable we can plug straight into the wall?
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: bdubyah
So you need an 8-pin and a 6-pin? God, this is getting ridiculous. Why don't they just include a power cable we can plug straight into the wall?

LoL nvi-dfx
 

Rusin

Senior member
Jun 25, 2007
573
0
0
So this would be the fifth Nvidia GPU generation to use GDDR3? The first one being the infamous GeForce FX.
About those power requirements: didn't the GeForce 9800 GX2 already have a 250W TDP [and that card consumes less than the HD 3870 X2]? So these cards wouldn't consume more than the 9800 GX2.

Lithan:
If I remember correctly, the 8800 Ultra's TDP is close to 180W.
 

TC91

Golden Member
Jul 9, 2007
1,164
0
0
I'm not liking how those 8-pin connectors are becoming more common on video cards.
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Originally posted by: MXD TESTL4B
Hi,

Had Nvidia in-house (kinda late) and NO, it's not a G92 derivative. It's the next NVIDIA chip generation (no name yet),
nothing to do with the G9x chips...

Some notes I made:

D10U-30 -> 1024MB GDDR3 -> 512Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]
D10U-20 -> 896MB GDDR3 -> 448Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]

Supporting Hybrid Power & Hybrid Boost.....get your power supply prepared for the 8pin connector ^^

#####

Besides:

Nvidia is going to support the 9800GTX for a full year! I guess it's only because they will come up with
a die shrink to 55nm in the second half of the year ^^
Very interesting. So is this D10U = GT200? (Fudzilla says they're different, but I can't imagine NV releasing two different high-end parts in consecutive quarters.)

The power consumption number is ridiculous, but at least we're finally getting 512-bit/1GB. The G92's memory management is absolutely horrendous.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: lopri

Very interesting. So is this D10U = GT200? (Fudzilla says they're different, but I can't imagine NV releasing two different high-end parts in consecutive quarters.)

The power consumption number is ridiculous, but at least we're finally getting 512-bit/1GB. The G92's memory management is absolutely horrendous.
Why not? The GeForce 8800 GTX was released just 4-5 months after the GeForce 7950 GX2. If these rumours are correct, then the 9800 GX2 would likewise be replaced 4-5 months after release.

In both cases Nvidia needed something at the enthusiast level for a little while before the next generation.

----

I don't know about those rumours saying that GT200 would be made on 65nm; Nvidia is soon launching 55nm versions of G94 and G96. There was a story at VR-Zone about those. At least the first 55nm chips will go to mobile graphics cards.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bdubyah
So you need an 8-pin and a 6-pin? God, this is getting ridiculous. Why don't they just include a power cable we can plug straight into the wall?

Really, a Big-Ass Power Brick that plugs into ANY Nvidia GPU and goes into the wall [or a UPS, if you have several motorcycle batteries wired together to keep it running 'til shutdown in 15 or so seconds].
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
WTF... how many cards do they want to flood us with? Lol. So let's have a look at the 'high end': 8800 GTS 320 > 8800 GTS 640 > 9600 GT > 8800 GT > 8800 GTS 512 > 8800 GTX > 8800 Ultra > 9800 GTX > 9800 GX2 > new thing before GT200?

Phew... so if it's 512-bit/1024MB, can we assume this is a new architecture? The next few months should prove to be interesting.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Sylvanas
WTF... how many cards do they want to flood us with? Lol. So let's have a look at the 'high end': 8800 GTS 320 > 8800 GTS 640 > 9600 GT > 8800 GT > 8800 GTS 512 > 8800 GTX > 8800 Ultra > 9800 GTX > 9800 GX2 > new thing before GT200?

Phew... so if it's 512-bit/1024MB, can we assume this is a new architecture? The next few months should prove to be interesting.

It could be the same architecture, just with more parallelization.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Lithan
Originally posted by: MXD TESTL4B
Hi,

Had Nvidia in-house (kinda late) and NO, it's not a G92 derivative. It's the next NVIDIA chip generation (no name yet),
nothing to do with the G9x chips...

Some notes I made:

D10U-30 -> 1024MB GDDR3 -> 512Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]
D10U-20 -> 896MB GDDR3 -> 448Bit -> 225Watt - 250Watt power consumption [connectors: 2x4 (8pin) & 2x3 (6pin)]

Supporting Hybrid Power & Hybrid Boost.....get your power supply prepared for the 8pin connector ^^

#####

Besides:

Nvidia is going to support the 9800GTX for a full year! I guess it's only because they will come up with
a die shrink to 55nm in the second half of the year ^^


Uhhh, 250W power requirement... Come on Nvidia, that's gotta be a joke. What does the 8800 Ultra use? Half that?

That's how much power the card uses, not the entire power requirement for the system. :light:
 

AzN

Banned
Nov 26, 2001
4,112
2
0
They are talking about GT200.

As far as I know these chips use odd memory configurations because of the memory bus width, similar to the 8800 GTS 640MB, 8800 GTX 768MB, and 8800 GS 384MB.


My educated guess would be:

32 ROPs, either 48 or 96 TMUs, 192 SPs, 512-bit memory controller, 1024MB


28 ROPs, either 40 or 80 TMUs, 160 SPs, 448-bit memory controller, 896MB


Using GDDR3 is not a problem when it has a 448-bit memory controller. The problem with GDDR5 is latency, but in the long haul its clock speed ramps up to 2000MHz, so the latency won't matter.
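
To illustrate the point about bus width driving those "odd" memory sizes, here's a minimal sketch assuming one 32-bit GDDR3 chip per 32-bit channel and 64MB (512Mbit) chips, which is how the G80-era boards were populated:

```python
# Memory size falls out of the bus width: one 32-bit chip per channel,
# so total memory = (bus width / 32) x per-chip capacity.
def memory_config(bus_width_bits, chip_mb=64):
    channels = bus_width_bits // 32        # number of 32-bit chips
    return channels, channels * chip_mb    # (chip count, total MB)

for bus in (512, 448, 384, 320, 256):
    chips, total = memory_config(bus)
    print(f"{bus}-bit bus -> {chips} x 64MB chips = {total} MB")
# 512-bit -> 1024 MB (D10U-30), 448-bit -> 896 MB (D10U-20),
# 384-bit -> 768 MB (8800 GTX), 320-bit -> 640 MB (8800 GTS 640)
```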
 

MXD TESTL4B

Junior Member
Apr 7, 2008
15
0
0
hrhr, good news, eh? ;-)

Additionally, for the GX2: they said that the power consumption in their test lab was around Ultra level, a bit higher, about 180W - 200W... yeah, sure ^^
That's why they keep sticking a warning on it about the danger of serious burns...

They tried to warn us about case heat problems with the new generation, depending on the size of your
case and the level of air cooling... so the cards are going to get very hot ;-)

They still claim that they don't have a marketing name/brand yet. The cards will for sure support NVIDIA's ESA concept.
Can't wait to test such a system with the appropriate power supply etc. ^^
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
This thing must be one big monolithic single-GPU card that's hungry for power. Sounds good enough for me. There were rumours of the two variants of this D10U-codenamed GPU having 1024MB and 896MB of memory. I think it's safe to assume either the memory interface is upped to 512-bit/448-bit, or the memory configuration isn't tied to the memory interface anymore.

The lowest GDDR3 clock is at 1600MHz (3200MHz effective). But I think nVIDIA isn't going to use it, especially with the high end, ever since the disaster with NV30 and GDDR2 memory.

I think what Azn has is close to what the D10U-30 and -20 will look like. This new GPU would definitely support DX10.1, still use scalar shaders, have dual precision (hopefully), or maybe the shaders were buffed to dual MADD. I also think things like the triangle setup and other limiting/bottlenecking parts of G80/G9x have been "fixed".
 

Yanagi

Golden Member
Jun 8, 2004
1,678
0
0
Originally posted by: Cookie Monster
This thing must be one big monolithic single-GPU card that's hungry for power. Sounds good enough for me. There were rumours of the two variants of this D10U-codenamed GPU having 1024MB and 896MB of memory. I think it's safe to assume either the memory interface is upped to 512-bit/448-bit, or the memory configuration isn't tied to the memory interface anymore.

The lowest GDDR3 clock is at 1600MHz (3200MHz effective). But I think nVIDIA isn't going to use it, especially with the high end, ever since the disaster with NV30 and GDDR2 memory.

I think what Azn has is close to what the D10U-30 and -20 will look like. This new GPU would definitely support DX10.1, still use scalar shaders, have dual precision (hopefully), or maybe the shaders were buffed to dual MADD. I also think things like the triangle setup and other limiting/bottlenecking parts of G80/G9x have been "fixed".

Good post except you need to cut your mem speed in half :)
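
Taking that correction into account, here's a rough bandwidth sketch for the rumoured bus widths. The GDDR3 data rates below are illustrative, roughly what high-end cards of the era shipped with; the real D10U clocks are unknown:

```python
# Bandwidth (GB/s) = bus width (bits) x effective data rate (Gbps) / 8.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

# Illustrative GDDR3 effective data rates of ~2.0-2.2 Gbps.
for bus in (512, 448):
    for rate in (2.0, 2.2):
        print(f"{bus}-bit @ {rate} Gbps -> {bandwidth_gbs(bus, rate):.0f} GB/s")
# 512-bit @ 2.2 Gbps gives ~141 GB/s, versus ~104 GB/s for an 8800 Ultra
# (384-bit @ ~2.16 Gbps effective).
```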
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
For those making comments about the ridiculous power requirements... keep in mind technologies such as Hybrid SLI. I certainly wouldn't mind a single-GPU video card consuming 250 watts if it had the performance to justify it; with something like Hybrid SLI, that power-guzzling video card could be shut off entirely when not put to full use.

Originally posted by: TC91
I'm not liking how those 8-pin connectors are becoming more common on video cards.

People used to say the same thing about having any sort of external power connector or dual-slot (or more) cooling.
 

ajaidevsingh

Senior member
Mar 7, 2008
563
0
0
I am quite sure that the new thing will not have 2x GPU on the same wafer... The Tulip Chick said it, and I read it in an interview with some Nvidia guy, but we never know anyway. This is what I think the next true architecture should have:

Core speed: around 1-1.5K
Memory: around 2-3K
Shaders: around 2.5-3.5K
ROPs: around 48-56
Die size: around 200-300mm²
Process: 55nm
Bus: PCIe 2.0
DirectX 11
Pixel fillrate: 25-30 GP/s
Texture fillrate: 70-80 GT/s
Memory type: GDDR4
Bus width: 512-768bit
Memory size: 1-2GB
Bandwidth: 150-200 GB/s
and of course 4-way SLI
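
As a side note, the listed numbers don't all hang together. A quick consistency check using the usual fillrate and bandwidth formulas, reading the memory "2-3K" as 2-3 Gbps effective (an assumption, since the list doesn't say):

```python
# Standard formulas:
#   pixel fillrate (GP/s) = ROPs x core clock (GHz)
#   bandwidth (GB/s)      = bus width (bits) x effective data rate (Gbps) / 8
def pixel_fillrate_gps(rops, core_ghz):
    return rops * core_ghz

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

# Endpoints taken straight from the wish list above.
print(pixel_fillrate_gps(48, 1.0), pixel_fillrate_gps(56, 1.5))  # 48.0 to 84.0 GP/s
print(bandwidth_gbs(512, 2.0), bandwidth_gbs(768, 3.0))          # 128.0 to 288.0 GB/s
# Note the implied pixel fillrate already overshoots the listed 25-30 GP/s.
```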
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: ajaidevsingh
I am quite sure that the new thing will not have 2x GPU on the same wafer... The Tulip Chick said it, and I read it in an interview with some Nvidia guy, but we never know anyway. This is what I think the next true architecture should have:

Core speed: around 1-1.5K
Memory: around 2-3K
Shaders: around 2.5-3.5K
ROPs: around 48-56
Die size: around 200-300mm²
Process: 55nm
Bus: PCIe 2.0
DirectX 11
Pixel fillrate: 25-30 GP/s
Texture fillrate: 70-80 GT/s
Memory type: GDDR4
Bus width: 512-768bit
Memory size: 1-2GB
Bandwidth: 150-200 GB/s
and of course 4-way SLI

Not going to happen this time around.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: Azn
Originally posted by: ajaidevsingh
I am quite sure that the new thing will not have 2x GPU on the same wafer... The Tulip Chick said it, and I read it in an interview with some Nvidia guy, but we never know anyway. This is what I think the next true architecture should have:

Core speed: around 1-1.5K
Memory: around 2-3K
Shaders: around 2.5-3.5K
ROPs: around 48-56
Die size: around 200-300mm²
Process: 55nm
Bus: PCIe 2.0
DirectX 11
Pixel fillrate: 25-30 GP/s
Texture fillrate: 70-80 GT/s
Memory type: GDDR4
Bus width: 512-768bit
Memory size: 1-2GB
Bandwidth: 150-200 GB/s
and of course 4-way SLI

Not going to happen this time around.

A 512-768-bit bus width isn't going to happen, that's for sure.
Increasing bus width isn't a cost-effective way of making a faster card compared to using faster memory, especially in the long run if you want to be able to reduce costs.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Not to mention the transistor count on a chip like that on the current process. Is it even possible? Maybe?

A card like that would cost nearly two grand and have a power consumption of a kilowatt.
 

angry hampster

Diamond Member
Dec 15, 2007
4,232
0
0
www.lexaphoto.com
Originally posted by: Lonyo


A 512-768-bit bus width isn't going to happen, that's for sure.
Increasing bus width isn't a cost-effective way of making a faster card compared to using faster memory, especially in the long run if you want to be able to reduce costs.

It's not cost-effective, but it's extraordinarily effective. We've already seen that G92 is quite limited by its 256-bit bus. A 448- or 512-bit bus with nearly a gig of memory would be fantastic. 768-bit is ridiculous at this point, though.
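
To put rough numbers on the wide-bus versus faster-memory trade-off being argued here (illustrative clocks, not leaked specs):

```python
# Two routes to similar bandwidth: a wide bus with slower GDDR3 versus a
# narrower bus with faster (GDDR5-class) memory.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

wide_slow   = bandwidth_gbs(512, 2.2)   # 512-bit GDDR3 @ ~2.2 Gbps -> ~141 GB/s
narrow_fast = bandwidth_gbs(256, 4.0)   # 256-bit GDDR5 @ ~4.0 Gbps -> ~128 GB/s
print(f"{wide_slow:.0f} GB/s vs {narrow_fast:.0f} GB/s")

# The wide bus needs twice the memory chips (16 vs 8 at 32 bits each) and a
# more complex PCB, which is the cost argument; the narrow bus gets cheaper
# over time as faster memory becomes available.
```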