Nvidia NV40 specs!

Socio

Golden Member
May 19, 2002
1,732
2
81

--------------------------------------------------------------------------------

This should make ATI fanboys cringe!


Here are the NV40 specs:
300-350 Million Transistors on 90-nm process
750-800 MHz Core
16MB Embedded DRAM (134M trans.)
1.4 GHz 256-512MB DDR-II Memory
8 Pixel Rendering Pipelines (4 texels each)
16 Vertex Shader Engines
204.8 GB/sec Bandwidth (eDRAM)
44.8 GB/sec Bandwidth (DDR-II)
25.6 GigaTexels per Second
3 Billion Vertices per Second
DirectX 10 (or extended 9.1)
Release H2 2004


Found them here:

http://www.quake.co.il/modules/news/article.php?storyid=117
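For what it's worth, the headline numbers in that list are at least internally consistent with each other. A quick back-of-the-envelope check (the 2048-bit internal eDRAM bus and the 1-transistor-per-bit DRAM cell are my assumptions, not part of the rumour):

```python
# Sanity-checking the rumoured NV40 numbers against each other.

core_mhz = 800            # top of the rumoured 750-800 MHz range
mem_mhz = 1400            # effective DDR-II data rate

# 8 pipelines x 4 texels each, at core clock
gigatexels = 8 * 4 * core_mhz / 1000
print(gigatexels)         # 25.6 GTexels/s, matching the list

# 256-bit bus = 32 bytes per transfer
ddr_bw = mem_mhz * 1e6 * 32 / 1e9
print(ddr_bw)             # 44.8 GB/s, matching the list

# 204.8 GB/s of eDRAM bandwidth implies 256 bytes per core clock,
# i.e. a 2048-bit internal bus (assumption)
edram_bw = core_mhz * 1e6 * 256 / 1e9
print(edram_bw)           # 204.8 GB/s

# 16 MB of DRAM at one transistor per bit (assumption)
edram_transistors = 16 * 2**20 * 8
print(edram_transistors)  # ~134.2M, matching the "(134M trans.)" note
```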
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
When I read the info yesterday at a different site (same link to the info, btw), it did say "rumour", so these are unofficial and shouldn't be taken seriously at this time.
 

ScrewFace

Banned
Sep 21, 2002
3,812
0
0
That's what the GeForce FX shoulda been and it would've put nVidia on top for a long time, but they f*cked it up!:|
 

Glitchny

Diamond Member
Sep 4, 2002
5,679
1
0
The fact that this is on the .09 process kinda has me doubting that it's real, or when it's coming out, since nVidia just switched to .13 and still aren't getting decent yields on the FX from what I hear. So moving to a .09 process would seem a little far off since they haven't pushed .13 at all.
 

gregor7777

Platinum Member
Nov 16, 2001
2,758
0
71
I also hear that you will be able to fry eggs on top of your case with one of these installed.
 

43st

Diamond Member
Nov 7, 2001
3,197
0
0
Just imagine the cooling solution for this one... I suspect it'll either run off dedicated three-phase power or possibly a gasoline engine. :)

Amazing how the folks are getting all wet about the NV40 when the NV30 hasn't even dropped yet. That's a bad sign.
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Originally posted by: ScrewFace
That's what the GeForce FX shoulda been and it would've put nVidia on top for a long time, but they f*cked it up!:|

huh yeah, they should've had a 300M transistor .09u part with DX10 compliancy out last summer... :p
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Yeah, ATi fanboys cringe! In a year or so nVidia will have a new part out!
And ATi will have just sat on their asses for that year and won't have something equal or better?
 

Furor

Golden Member
Mar 31, 2001
1,895
0
0
except it'll be moved back to early 2006 when ATI will probably have something similar.
 
Jul 1, 2000
10,274
2
0
ATi "fanboys" in despair? Puhleaze.

This sounds like 3dfx's marketing team at work. Remember the great products promised when 3dfx was in the financial throes of despair?

I would not look forward to seeing an NV40 for quite some time. .13 is not a mature process yet; .09 is a while off.

nVidia can't produce a financially practical high-end NV35 yet.
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
I've seen a very similar post on the official Unreal 2 forums.
The embedded DRAM will speed it up no end. Look at what the embedded DRAM did for the GameCube.
By the time the nV40's out, the R400 will be out as well.
IMO, I seriously doubt that these are the nV40 specs. Can you imagine how much it would cost to make just one board with these specs?
And nVidia should have learnt by then that DDR2 on its own is useless compared to 256bit buses. DDR2 plus a 256bit bus is the way forward for memory.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
Embedded DRAM? I hope they have a licence for it from Bitboys, because I think that is one of their major features, and they no doubt have it surrounded by patents.
 

titanmiller

Platinum Member
Jan 5, 2003
2,123
2
81
WOW! As we all know, initial guesses aren't all that accurate. However, it sure is nice to dream about.
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
As far as I know, embedded DRAM itself is not a special technology that can be afforded a patent... it's just memory embedded into the chip, and it has been done before on many things other than graphics chips. A more accurate thing to think about is whose memory designs they are gonna pimp, or will they come up with their own?

Doubtful about the 90nm process... it may be ready in a year and a half, but considering Intel aren't even there yet, I doubt TSMC will have it in that time frame. Also, even at that process, the chip will be huge in terms of packaging, which also means a lot of heat... I doubt they would push up to an 800MHz core with that many transistors. Possibly with 2x GF FX vacuum-cleaner HS/fan units, but I doubt it; maybe around the 600MHz mark... but who knows.

Can't really comment on the memory speeds... they will have the fastest RAM available at the time... if it's 1.4GHz then it will be on there, but even if it's only still at 1GHz (not likely), it will be on there...
Of course, the same memory will be available to any other graphics chip maker such as ATI, hell even S3, so it's a moot point.

As for it possibly having been out now... that's not a completely untrue statement. There are plenty of things on the drawing board which could be done now, but due to technical limitations or lack of support (ie. DX 69 not being out yet), it's not gonna happen.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
This is from the same site that claimed the NV30 would be a 4x4 architecture with a 256bit memory controller and a specialized voxel rendering unit.


Personally I'd say it looks like BS. .09u is possible, and knowing nVidia's tendency to bet on the latest tech they may well push for it, but it's hardly a guarantee.
A 750-800MHz core is viable enough IMHO.
16MB eDRAM is highly unlikely, especially paired with a 45GB/s main memory bandwidth, not to mention it would require a vast relayout compared to what they've designed in the past.
nVidia's never been one to make significant architectural changes unless absolutely required, and eDRAM would assuredly necessitate that if it's to be used effectively.
An 8x4 architecture? Not a chance in hell. I don't see 4 TMUs as being at all beneficial, especially with pixel shaders partially taking over the job of dedicated TMUs in the long term. I'd bet on 2 TMUs per pixel rendering pipeline at most, and even that may be stretching it; 4 fixed TMUs would almost seem to portend a step back in rendering methodology relative to the NV30. A 16-pixel shared rendering pipeline, or 12-16 dedicated pixel rendering pipelines, would cost less in transistor real estate and yield significantly more dividends.

16 VS? That's clearly not going to happen. nVidia has already dropped dedicated units with the NV30, and gone the P10 route of a flexible array of small shaders working jointly.
Given all of nVidia's claims of improved vertex shading efficiency, it would seem counterproductive to go back to a sheer brute-force method.
In the long run 16 dedicated units is a losing proposition.
I don't expect either the R400 or the NV40 to use dedicated VS.
The NV30 has already begun to follow the P10's path, and I expect the NV40 to complete the journey.

DDR-II is possible, though I'd bet they'll jump to GDDR-III if viable in volume, especially given that ATi is hedging their bets on skipping DDR-II entirely and jumping to GDDR-III.
44.8GB/s of main memory bandwidth initially seems like an inordinately large jump to make in one generation, and dubious given the industry's tendency to make small bumps so as to more easily facilitate milking each generation's chip sales for every $ they can.
On the other hand, a 256bit memory controller is almost a certainty, and 1.4GHz DRAM should be easily viable by then.
I'd look towards 38GB/s as being the likely figure. 1.2GHz RAM will be cost-efficient by then, so they won't be forced to pay a premium for the fastest; paired with a 256bit memory controller it should still provide a significant boost in bandwidth.
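That 38GB/s estimate falls straight out of the bus-width arithmetic (bandwidth = effective data rate x bus width in bytes); a quick sketch using the numbers in the post:

```python
# Memory bandwidth = effective data rate (transfers/s) x bus width in bytes.
bus_bytes = 256 // 8   # 256-bit memory controller = 32 bytes per transfer

# 1.2 GHz effective DDR: the "cost efficient by then" guess
print(1.2e9 * bus_bytes / 1e9)   # 38.4 GB/s, i.e. the ~38GB/s figure

# 1.4 GHz effective DDR: the rumoured spec's memory clock
print(1.4e9 * bus_bytes / 1e9)   # 44.8 GB/s, the rumoured figure
```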

Personally, 16 dedicated vertex shaders + a 256bit memory controller + an 8x4 architecture + 16MB eDRAM looks somewhat unlikely if they expect to hit 350M transistors.

DX10 doesn't look likely until very late 2004 at the earliest; MS put PS/VS 3.0 in the DX9 spec specifically so they could leave DX9 viable for as long as possible, to give them time to work on DX10.
I'd bet on the NV40/R400 being DX9 parts, or DX9.1 should PS/VS 3.0 be renamed DX9.1.


Oh, FWIW, the use of eDRAM in a graphics renderer cannot be patented; the specific implementation can be, however. There is nothing to force nVidia into following the same path as BitBoys in implementation.
In any case, eDRAM is extremely dubious in my mind.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
It is total BS. Maybe if the whole technology industry partnered up and worked their butts off, then we could expect something like the specs above sometime soon.
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
This thread doesn't give us much more than a simple rumor.

One, we don't know what ATi has up their sleeves for the end of this year. You can't just say this looks so good that ATi can't compete; they can and they will.

Two, these specs are just rumors. We don't know for sure if this is what NVIDIA has planned, and even if they do the specs could change because of manufacturing difficulties or a whole host of problems.

With that said, those look like some really impressive specs. The eDRAM could boost performance quite a bit. I would really like to see how this bad boy performs if these are the real specs.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Sure, the specs look impressive. It would look even more impressive if the rumors said they were building them in their new factory on the moon, which, IMHO, would be about as believable as the specs themselves.
 

BoomAM

Diamond Member
Sep 25, 2001
4,546
0
0
eDRAM isn't patented by Bitboys.
ArtX (now part of ATI) used eDRAM on the GameCube's gfx chip, so in theory ATI would know how to cost-effectively implement eDRAM in their future chips. nVidia, on the other hand, would have to work out a cost-effective way of implementing it.
However, eDRAM added to a new gfx core could give it a big performance boost. Even a relatively small amount, say 2-4MB, would provide a boost.
 

capodeloscapos

Senior member
Jan 19, 2002
246
0
0
Just rumours... Instead of talking about the NV40, nVidia should focus on the NV35, because the NV40 is a year away... And ATI will continue dominating this year if nVidia keeps making bad decisions.
Do any of you know what the limit is for the manufacturing process? Could it be that in the future we reach 0.03 microns or something like that?
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Or .003 microns and even lower; anyone who claims there is a limit is just being stubborn and simple-minded. Well, OK, assuming quantum theory is right, there is a lower limit to everything, but at this time we cannot even prove that limit exists, let alone actually say where it is.
 

Slappy00

Golden Member
Jun 17, 2002
1,820
4
81
embedded dram? I hope they have a licence for it from Bitboy's

Yah, you can pay them with food stamps or a sandwich. Don't they do PDA graphics chips now?! Man, they fell off...

Anyhow, I don't find it surprising that nVidia fans are touting the next-gen nVidia card, considering their latest iteration didn't amount to very much. I remember back in November a couple of militant nVidia zealots (no names of course :) ) proclaiming that I, along with other R9700 Pro adopters, were idiots because the NV30 would squash it, blah blah blah... I guess there's always hope for the future, huh :D
 

AtomicDude512

Golden Member
Feb 10, 2003
1,067
0
0
Originally posted by: Socio
This should make ATI fanboys cringe!

Here are the NV40 specs: [snipped — see the first post]

HEHE. Revenge of my favorite company! It was only a matter of time before they started to crush ATI out of the market like they did everyone else. :D
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
I am sorry, AtomicDude512, but do you really want one video card manufacturer on top, with absolutely no competition to drive technology forward or prices down? It just doesn't make sense to me; what motivation do you have?


Personally, I would like to see more companies step up to the plate with nVidia and ATI. Shouldn't we all?