Nvidia's next gen high end

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
Fudzilla link

Nvidia's next gen high end

Nvidia agrees that D8M and D8E are simply evolutionary steps as it shrank its chips from 90 to 65 nanometres, but the real next generation will come next year, probably in Q2 if not later.


This matches Nvidia's financial conference call, where one of the vice presidents said that Nvidia will have at least two new high-end products before the next analyst call. That call usually takes place once a year.


D8E is a dual-PCB, dual-chip card, while D9E is a new chip that we don't know much about. We know it will be much faster than G80, but that is about it. In 2008 the graphics boys can do 55 nanometre, and it will be too soon for 45, at least we think so.

_____________________________

Fudzilla link

January?

Our sources have confirmed that Nvidia said that the D8E (Desktop 8th-generation Enthusiast) dual-PCB, dual-chip product comes in 2008, not in November as many previously expected.


Nvidia simply can't get it ready in time. The D8M (Mainstream) and the D8V (Value) parts are going to be 65 nanometre and ready for November.


The D8E will be a GX2-like card with two PCBs, two chips and multiple power connectors. It will launch very early in Q1, which points to January or February as a possible date.


The funny thing is that the dual RV670 card comes out at the same time, and it will be fun to see which is better.
 

Aznguy1872

Senior member
Aug 17, 2005
790
0
0
Well, this definitely changes my video card upgrade plans. Q2 2008 is too long of a wait... But at least we got a confirmed date for when the next high-end cards will come out.
 

non duality

Member
Aug 8, 2007
36
0
0
I guess this was expected. There is still a lot of steam left in the G80 cards before they need to be replaced by new high-end ones.
Anyway, this does change plans for many who were expecting something big from Nvidia this November.

How will the D8M and D8V compare to the 8800 GTS?
 

CP5670

Diamond Member
Jun 24, 2004
5,666
765
126
Yeah, this is bad news. :( The D8E is of no interest to me either if it relies on SLI. I was hoping for at least a die-shrunk 8800 GTX.

Maybe I'll look for a used GTX on ebay and stick with that for a while, as I don't want to pay the price for a new one at this point.
 
Oct 30, 2004
11,442
32
91
Sounds like those of us who are holding out for hot deals and good values are just going to have to wait until January or February when we'll have more downward pressure on the 8800 GTS and 2900 XTs. I hope I can get away with playing UT3 at 800 x 600 and low settings for a couple months.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: non duality
I guess this was expected. There is still a lot of steam left in the G80 cards before they need to be replaced by new high-end ones.
I'd be genuinely disappointed if anyone was expecting a new architecture from Nvidia this year. Two years has been the sweet spot for GPU architectures: R300 lasted until R520 replaced it, NV20 lasted just short of two years until NV30 replaced it, and NV40 lasted until G80. G80 and R600 will be the leading architectures until at least late 2008.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Hindsight is 20/20. I expected at least an Ubermensch high-end refresh in November. Die shrink, core tweaks, faster memory, some feature improvements, etcetera. Damn you AMD. Damn you......... dammit. :D
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
told you so :p

AMD didn't do it "on purpose", you know
-otoh, nvidia IS [wisely] taking advantage of the situation to maximize their profit


Neither you nor i can feel really bad ... we have had capable cards since early Summer ... i feel i am getting my money's worth out of my upgrade
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Aznguy1872
But at least we got a confirmed date

Link please. :laugh:

edit: Methinks that Nvidia slowed down to optimize drivers. Signs seem to indicate good days ahead for AMD.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
This is very odd for Nvidia. I was under the impression that they were deliberately delaying the release of the next-gen high-end cards b/c they were killing DAAMIT, but it is starting to look more and more like they're having problems. Wouldn't it be funny if ATI came out ahead of Nvidia AND had better performance?
 

Synomenon

Lifer
Dec 25, 2004
10,547
6
81
This is a good thing. It was crazy having new cards coming out every other month.

... Well not every month, but you know what I mean.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Still can't figure out why you guys are taking this as gospel. Except, of course, that cj seems to confirm some of this.
 

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
Hah, I guess that's 4-6 months of the top-end hardware struggling at around 20 fps in the newest games, while companies like Falcon NW or Alienware or VoodooPC keep selling $5000 PCs with a straight face ("extreme performanceeeeeee").
 

Ylurien

Member
Jul 26, 2007
74
0
0
Yeah, if the GTX were even 25% cheaper than it is now, it might be worth it. As it stands, I just don't see how people can justify spending that much money for such performance. Tomorrow's games, starting with Crysis, are going to wipe the floor with the GTX. And the damn thing can't even run Oblivion well (although that game needs a bit more optimizing if you ask me).

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
-otoh, nvidia IS [wisely] taking advantage of the situation to maximize their profit
When they should be taking advantage of the situation by fixing their drivers.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
This likely confirms that we won't be moving from 8??? to 9???, but it doesn't rule out 8900 parts (similar to the 7900s) appearing. I.e. very similar to the 8800s but with higher clock speeds and faster memory. That would be an "evolutionary step". The dual-chip card (like the 7950 GX2) suggests they have high-end parts at 65nm, as it seems pretty unlikely you could stick two 8800 GTSes in one card and not have it melt. If they have a high-end chip at 65nm for the dual-chip card, then they will have high-end 65nm chips for the standard single-chip cards too.

Hence it may well be that we see an 8900 GTX approximately 30% faster than the 8800 GTX at the end of the year.

We can but hope :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
With this news I don't see how ATI can just sit tight until January/February with the "limited edition" HD 2900 Pro. They should surely respond with a widely available, cheaper version of the HD 2900 XT to reap some profits in the $250-and-under market.

If I were to take a wild guess, NV is having a hard time containing power requirements for GF9. Releasing dual GF8s or a 65nm die shrink will allow them to tweak the manufacturing process over time and hopefully allow GF9 to run cooler. Given the excellent scaling produced by the GTX over the GTS from 96 to 128 shaders, it would seem that adding more shaders and texture units will still provide significant gains. Therefore, I am guessing it is not the scaling design of GF8 that's holding back GF9.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: keysplayr2003
Hindsight is 20/20. I expected at least an Ubermensch high-end refresh in November. Die shrink, core tweaks, faster memory, some feature improvements, etcetera. Damn you AMD. Damn you......... dammit. :D

damnit dammit


Yeah I was hoping for the same. Glad I have a G80 for the meantime though
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
This sort of upsets me... I might opt for an Ultra and sell my dad my 8800 GTS. I was actually expecting higher-clocked cards, or a new arch, this year... :-(
 

manowar821

Diamond Member
Mar 1, 2007
6,063
0
0
Originally posted by: ArchAngel777
This sort of upsets me... I might opt for an Ultra and sell my dad my 8800 GTS. I was actually expecting higher-clocked cards, or a new arch, this year... :-(

I'm betting my step-up program on it.... :eek:

I hope these Fud stories are wrong, honestly. I'm not sure if I can use step-up to get from my GTX to an Ultra...
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ArchAngel777
This sort of upsets me... I might opt for an Ultra and sell my dad my 8800 GTS. I was actually expecting higher-clocked cards, or a new arch, this year... :-(

Me too... I sold my 8800 GTX on eBay, and I'm using a 7900 GS. The GS isn't a bad card by any means (if I had a 19" LCD), but with a 24" LCD it really struggles a bit. The best I can muster is WoW with 4xAA/16xAF and NO transparency AA, which makes a huge difference in appearance and, regrettably, performance... I won't even attempt to run anything else on it. It will certainly not be enough for Crysis or Hellgate: London, I'm sure of that.

I'm considering taking the money I got for the GTX and putting it towards an EVGA Ultra. That way, if something does come out before the new year, I have a hefty step-up credit, and if not... well, I have a sweet card until something does come out. Either way, it's kind of a bummer... I really thought that NV was gonna pull the 9800s out before Xmas.

What are the D8M and D8E anyway...? I thought I was pretty current with the video stuff, but I guess not...

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
With this news I don't see how ATI can just sit tight until January/February with the "limited edition" HD 2900 Pro. They should surely respond with a widely available, cheaper version of the HD 2900 XT to reap some profits in the $250-and-under market.

If I were to take a wild guess, NV is having a hard time containing power requirements for GF9. Releasing dual GF8s or a 65nm die shrink will allow them to tweak the manufacturing process over time and hopefully allow GF9 to run cooler. Given the excellent scaling produced by the GTX over the GTS from 96 to 128 shaders, it would seem that adding more shaders and texture units will still provide significant gains. Therefore, I am guessing it is not the scaling design of GF8 that's holding back GF9.

evidently they are :p

that is what the 2900 GT is for - to replace the limited-edition 2900 Pro

if I had to take a wild guess, i'd say nvidia is waiting to see what AMD does and AMD is waiting for nvidia ... whatEVER either company brings to the table first gets trumped by who is second ... like playing 'chicken'

AMD has the 2950XT in development ... maybe they will guess right this time on a less power-hungry process

But i think i am no longer looking at a single GPU solution for DX10 titles ... anybody found a 2900P for $250 yet?

And DX10.1 is probably late next year ... "patched into" games like SM 3.0/HDR was patched into FC ;)

plenty of time to enjoy my crossfired 2900s ... i think

:D

seriously ... bad news for us
 

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
Hah, we're in a situation where even if you had all the money in the world to blow on a new comp today, your system would still be choking trying to play the games coming out in the next few months with the full eye candy on.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Astrallite
Hah, we're in a situation where even if you had all the money in the world to blow on a new comp today, your system would still be choking trying to play the games coming out in the next few months with the full eye candy on.

it has happened before

when DX8>DX9 first came out ... no one could play DE-IW, Thief DS or FC [remember that HDR had to be patched in with SM 3.0] at "ultra" settings at 16x12 with the fastest HW of that gen ... it took about a year

now with 2900 XT Xfire or GTX SLI, there is a 'shot' at playing Crysis on ultra at 19x12 ... maybe with some details turned down ...
[i might be able to come close at 16x10 with Xfire]

--that is a little progress for a DX9>DX10 changeover