Nvidia next generation to use odd memory configuration

AzN

Banned
Nov 26, 2001
4,112
2
0
Can it be that Nvidia will bring a real next generation product just a quarter after it released its 9800 series? Well, we don't have the answer to that particular question, but as we reported here, Nvidia is working on a new GPU codenamed D10U-30. The D10U-30 will feature 1,024MB of GDDR3 memory and we learned that there will be one more SKU below it.

The second one is codenamed D10U-20 and it will have 896MB of memory, again of the GDDR3 flavor. This new card indicates that Nvidia can play with the memory configuration and that the new chip might support more than the regular 256-bit memory interface.

This one might support 384-bit or some other memory configurations, but we still don't have enough details about it. It looks like Nvidia doesn't feel that going for GDDR4 is necessary, and the company will instead jump directly from GDDR3 to GDDR5.

http://www.fudzilla.com/index....=view&id=6702&Itemid=1


My guess would be something like this....

32 ROPs, 96 TMUs, 192 SPs, 512-bit memory controller, 1024MB

28 ROPs, 80 TMUs, 160 SPs, 448-bit memory controller, 896MB
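
Back-of-the-envelope math behind those numbers, assuming standard 512Mb (64MB) GDDR3 chips with one chip per 32-bit channel - that chip size and channel width are my assumptions based on typical GDDR3 parts, nothing confirmed:

[code]
# Sketch: backing out bus width from memory size, assuming standard
# 512Mb (64MB) GDDR3 chips with one chip per 32-bit channel. Chip size
# and channel width are typical for GDDR3 of this era, not confirmed specs.
CHIP_MB = 64        # capacity of one 512Mb GDDR3 chip
CHANNEL_BITS = 32   # one chip per 32-bit channel

for total_mb in (1024, 896):
    chips = total_mb // CHIP_MB
    bus_bits = chips * CHANNEL_BITS
    print(f"{total_mb}MB -> {chips} chips -> {bus_bits}-bit bus")

# 1024MB -> 16 chips -> 512-bit bus
# 896MB  -> 14 chips -> 448-bit bus
[/code]

If that's right, the 896MB SKU could simply be the 1GB part with two of its sixteen memory channels disabled.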
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Hey, if they released a GX2 card, then you know a new high-end GPU is on the way...:D
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
There's a 120-day step-up on the GX2, which means the earliest we'll see this card is August 2nd. A bit over a quarter away.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Why would Nvidia adjust release dates so stepping up is not viable? Why did EVGA increase the step-up time last time around, if they want to screw customers as badly as you think they do? I know it's not really relevant to this thread, but keep that in mind, please...

I have no clue what the new video cards will look like; simply guessing ROPs, TMUs, shaders and memory interfaces won't do us any good. I don't understand these increased memory sizes. 512MB is still plenty. I could see 768MB being somewhat useful if you run at really high resolutions and want to future-proof a little for new games, but as the 256-bit memory interfaces on the G92 cards have shown, all we really need is more bandwidth.
 

angry hampster

Diamond Member
Dec 15, 2007
4,232
0
0
www.lexaphoto.com
I'm kind of happy that they're using GDDR3 on the new cards (assuming this is right). With a memory bus that wide, GDDR4 or GDDR5 shouldn't be necessary. These cards will be unbelievable at high resolutions if these specs are anywhere near true.
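
Rough math on why a wide GDDR3 bus can keep up - the data rates below are my own ballpark figures for parts of this era, not leaked specs:

[code]
# Sketch: peak memory bandwidth = (bus width in bits / 8) * effective
# data rate in GT/s. The data rates are ballpark assumptions for
# GDDR3/GDDR5 parts of this era, not anything from the article.
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gbs(512, 2.2))  # 512-bit GDDR3 @ ~2.2GT/s -> 140.8 GB/s
print(bandwidth_gbs(256, 2.2))  # 256-bit GDDR3 (G92-style) -> 70.4 GB/s
print(bandwidth_gbs(256, 3.6))  # 256-bit GDDR5 @ ~3.6GT/s -> 115.2 GB/s
[/code]

At those numbers the wide GDDR3 bus actually outruns a 256-bit GDDR5 setup.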
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: MarcVenice
I have no clue what the new video cards will look like; simply guessing ROPs, TMUs, shaders and memory interfaces won't do us any good.
You might be right about additional shader processors doing little to help, but increases with everything else should help quite a bit.

I don't understand these increased memory sizes. 512MB is still plenty. I could see 768MB being somewhat useful if you run at really high resolutions and want to future-proof a little for new games, but as the 256-bit memory interfaces on the G92 cards have shown, all we really need is more bandwidth.
Doom 3 needs more than 512MB to run completely smoothly at its highest settings... and when you up the bandwidth to allow for higher resolution/AA, you're going to need more memory to go along with it.
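
A quick back-of-the-envelope on just the render targets at high resolution with MSAA - the buffer layout and byte counts are simplifying assumptions on my part, and textures come on top of this:

[code]
# Sketch: rough render-target footprint at high resolution with MSAA.
# Assumes 4 bytes per color sample and 4 bytes per depth/stencil sample;
# real drivers add padding and extra buffers, and textures aren't counted.
def rt_megabytes(width: int, height: int, msaa: int) -> float:
    color = width * height * 4 * msaa   # multisampled color buffer
    depth = width * height * 4 * msaa   # multisampled depth/stencil
    resolve = width * height * 4        # resolved back buffer
    return (color + depth + resolve) / (1024 ** 2)

print(rt_megabytes(2560, 1600, 4))   # ~140MB before a single texture loads
print(rt_megabytes(2560, 1600, 8))   # ~265MB
[/code]

Textures, geometry and double buffering all stack on top of that, which is how a game can blow past 512MB at 2560x1600 with AA.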
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
Seriously, Doom 3 uses uncompressed textures on its highest settings, resulting in barely ANY increase in IQ. Getting massive amounts of VRAM just for that is ridiculous. Some brands will release their own 1GB cards, just like with the 8800GT, but for mainstream and even high-end, it's unnecessary. Look at Crysis, unrivaled in graphics: does it need more than 512MB of VRAM?

And I'm not saying increasing 'everything' else won't help; I'm saying guessing the amounts we will see on G200 is not going to get us anywhere, it's pure speculation. I have no idea at what point increasing the ROPs becomes futile, because a card might then be bottlenecked by something else, etc.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
It's only a matter of time before more games benefit from more than 512MB of video memory. Stalker already uses more than that. I think the original G80 design was a good choice with 24 ROPs, a 384-bit memory interface and 768MB of memory. Increasing the shader and TMU counts to 192/96 or more would offer a sizeable boost in performance using the same memory configuration.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: MarcVenice
Seriously, Doom 3 uses uncompressed textures on its highest settings, resulting in barely ANY increase in IQ. Getting massive amounts of VRAM just for that is ridiculous. Some brands will release their own 1GB cards, just like with the 8800GT, but for mainstream and even high-end, it's unnecessary. Look at Crysis, unrivaled in graphics: does it need more than 512MB of VRAM?
Seems you've made it quite clear you're not in the market for a high-end card. Your solution is already here with the 9600GT; you should be more than happy with it.

My point with Doom 3 is that it's now a pretty old game, and it can use more than 512MB of video RAM.

And we don't really know what Crysis needs right now, seeing as how it's slow on everything at relatively low resolutions; we have yet to crank up the 'above and beyond' resolution and IQ settings that would truly push a memory architecture as wide or as big as 512-bit/1GB...

And I'm not saying increasing 'everything' else won't help; I'm saying guessing the amounts we will see on G200 is not going to get us anywhere, it's pure speculation. I have no idea at what point increasing the ROPs becomes futile, because a card might then be bottlenecked by something else, etc.
Well, if they're increasing everything, then there shouldn't be a bottleneck...

We've gone from 384-bit/768MB and even 512-bit/1024MB back down to 256-bit/512MB, and this started November 8, 2006... It's now April 8, 2008, and you're saying 512-bit/1024MB is too much?

IMO you sound like someone who thinks they won't be able to afford one of these new high-end cards and is futilely complaining about a new feature you think will unjustifiably increase the cost of the product.