nVidia to launch DirectX10 in November?


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RedStar
:) they mean vista itself will not be directX 10 --the aero interface for example.

And yes, crysis is a DX10 game. The word "full" is reserved by you :)

try reading the quote again:
So far, Microsoft has only promised that Windows Vista will run DirectX 9.0, allowing it to be upgraded to DirectX 10 later via Windows Update

it's pretty clear

Crysis only has a "few DX10 features" that may simply allow it to run a bit faster and may also allow for higher resolutions . . . it may be patched to resemble a DX10 game in a year or so after release. :p
 

imported_RedStar

Senior member
Mar 6, 2005
526
0
0
i think this will make it clearer to you:

http://www.extremetech.com/article2/0,1697,1982033,00.asp

"ExtremeTech: Speaking to that point, we know that the desktop in Vista is drawn using DirectX 9. What happens if you have a DirectX 10 card? Does the desktop still use DX9, or does that switch over and use DX10?

Blythe: It continues to use DirectX 9. Largely the reason for that is when we built the desktop, that was being done concurrently with the design of DirectX 10. It becomes somewhat more complicated to build both the low-level technology and the thing on top of it, concurrently. It's better to sort of have a time gap between those. At the same time, we were making some minor tweaks to DirectX 9 to accommodate new features that were needed to do the desktop. For us, it's best to have one consistent platform. Even though we could imagine there being benefit to the desktop using DX10, it's better to do all the debugging and get it to work with DX9 and ship that. Then over time, as the hardware base builds up for DX10, by the time we do the next major release, we'd be looking at trying to move the entire desktop onto 10
"

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RedStar
i think this will make it clearer to you:

http://www.extremetech.com/article2/0,1697,1982033,00.asp

"ExtremeTech: Speaking to that point, we know that the desktop in Vista is drawn using DirectX 9. What happens if you have a DirectX 10 card? Does the desktop still use DX9, or does that switch over and use DX10?

Blythe: It continues to use DirectX 9. Largely the reason for that is when we built the desktop, that was being done concurrently with the design of DirectX 10. It becomes somewhat more complicated to build both the low-level technology and the thing on top of it, concurrently. It's better to sort of have a time gap between those. At the same time, we were making some minor tweaks to DirectX 9 to accommodate new features that were needed to do the desktop. For us, it's best to have one consistent platform. Even though we could imagine there being benefit to the desktop using DX10, it's better to do all the debugging and get it to work with DX9 and ship that. Then over time, as the hardware base builds up for DX10, by the time we do the next major release, we'd be looking at trying to move the entire desktop onto 10
"
i get it . . . you don't, even though you quote it :p

your enthusiasm for a new DX platform has blinded you to reality . . . even the most enthusiastic promoters of Vista/DX10 realize it's gonna be a couple of years:
I think you'll see DX10-only in the next two years, certainly. That's about as close as I can pin it, because part of it is going to depend on adoption and installed base of DirectX 10 hardware.
 

imported_RedStar

Senior member
Mar 6, 2005
526
0
0
once again you fail to see :)

we don't need to be *required* to use DirectX 10 to get a whole lotta tangible benefits beforehand... just like DX9

Don't you dare play a DX10 game for 2 years... or you will owe me that $2000 :)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RedStar
once again you fail to see :)

we don't need to be *required* to use DirectX 10 to get a whole lotta tangible benefits beforehand... just like DX9

Don't you dare play a DX10 game for 2 years... or you will owe me that $2000 :)

"a whole lotta" . . . sure :p

NO one is denying the eventual advantages of DX 10 . . . what does that have to do with your ridiculous "timeline"?

there won't be ANY full DX10 games within 2 years . . .

and your 'bet', like your fantasies, exists only in your own mind.
:thumbsdown:
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: apoppin
we now know it will launch mid-Nov for sure - as has been speculated [and which i doubted].

Actually, I still have doubts after reading the DigiTimes article.



The title of the article proclaims:

Nvidia to launch DirectX 10 chip in mid-November

yet the main body of the article says:


Nvidia is expected to announce the world's first DirectX 10-compliant graphics chip, the GeForce 8800 (codenamed G80), in the middle of November, graphics card makers revealed.


Announcing is FAR different than launching.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Creig
Originally posted by: apoppin
we now know it will launch mid-Nov for sure - as has been speculated [and which i doubted].

Actually, I still have doubts after reading the DigiTimes article.



The title of the article proclaims:

Nvidia to launch DirectX 10 chip in mid-November

yet the main body of the article says:


Nvidia is expected to announce the world's first DirectX 10-compliant graphics chip, the GeForce 8800 (codenamed G80), in the middle of November, graphics card makers revealed.


Announcing is FAR different than launching.

nice catch :)

we can probably [now] infer that nvidia's G80 NDA will expire mid-November and all the "Previews" will publish . . . probably a late-Nov/early-Dec launch of their G80 Ultra X - in time for the holidays.

everything will depend on DX9 performance and nvidian hype to sell it
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Gstanfor
Funny how the fanatics just can't seem to forget nv30, isn't it? Even though most of them had R300's and had no intention of actually purchasing it, even if it was good? No surprise to see such reminiscing occurring in a thread about future nvidia chips either...

Even more interesting when the fanatic in question tells us in an earlier post he has no interest in this upcoming gen of cards *or* the refreshes thereof...


Originally posted by: Gstanfor
There is nothing "to understand", apoppin. You could discuss g80 without mentioning nv30 at all, or stating any of a number of things that you have in this thread so far, but that's not what you are here for, is it? So far as I can tell you are filling in for Ackmed...

And you seem unable to create 5 posts in a row without using the word "fanatic" in at least one of them. Honestly, if the Moderators were to delete every single post of yours that contained the word "fanatic", your overall post count would drop by at least 20%.






Lighten up, Francis.
 

Griswold

Senior member
Dec 24, 2004
630
0
0
Originally posted by: Matt2
Originally posted by: mamisano
Why purchase the "G80" card when it will do nothing for DX10 until DX10 is actually released in Vista? Sit back and wait until the Radeon is out so you can at least compare which is best at DX10. I don't know why people have to jump at any new product to hit the market... what for, bragging rights?

Have you seen the rumored G80 specs?

The fact that G80 is DX10 compliant is just a marketing gimmick as far as I'm concerned.

This card will smoke any DX9c card on the market. G80's bread and butter will be DX9.

DX10 performance will probably be dog slow, but the architecture of this card gives Nvidia the foundation for the real DX10 cards this time next year. Same for ATI and the R600.

Rumored specs indeed. And it most likely won't be dog slow with a DX10 title. The move to DX10 is nothing like the step from DX8 to 9 - the main catch for DX10 is vastly increased efficiency. History doesn't always repeat itself, especially not if it's written on different paper.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Griswold
Originally posted by: Matt2
Originally posted by: mamisano
Why purchase the "G80" card when it will do nothing for DX10 until DX10 is actually released in Vista? Sit back and wait until the Radeon is out so you can at least compare which is best at DX10. I don't know why people have to jump at any new product to hit the market... what for, bragging rights?

Have you seen the rumored G80 specs?

The fact that G80 is DX10 compliant is just a marketing gimmick as far as I'm concerned.

This card will smoke any DX9c card on the market. G80's bread and butter will be DX9.

DX10 performance will probably be dog slow, but the architecture of this card gives Nvidia the foundation for the real DX10 cards this time next year. Same for ATI and the R600.

Rumored specs indeed. And it most likely won't be dog slow with a DX10 title. The move to DX10 is nothing like the step from DX8 to 9 - the main catch for DX10 is vastly increased efficiency. History doesn't always repeat itself, especially not if it's written on different paper.

efficiency :p

you mean 'efficiency' in the future. . . . years off for games ;)

DX9 has barely been tapped by game programmers --at the end of '06 . . . SM 3.0 has "room" to write at least 20 times more shader instructions than you find in ANY current title.

DX10 and SM 4.0 are looking at the distant future - after 2 to 4 years . . . NO ONE - except the most deluded Vista hype-swallowers - expects "DX10 Only" games for at least 2 years.
:thumbsdown:
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: apoppin
Originally posted by: Griswold
Originally posted by: Matt2
Originally posted by: mamisano
Why purchase the "G80" card when it will do nothing for DX10 until DX10 is actually released in Vista? Sit back and wait until the Radeon is out so you can at least compare which is best at DX10. I don't know why people have to jump at any new product to hit the market... what for, bragging rights?

Have you seen the rumored G80 specs?

The fact that G80 is DX10 compliant is just a marketing gimmick as far as I'm concerned.

This card will smoke any DX9c card on the market. G80's bread and butter will be DX9.

DX10 performance will probably be dog slow, but the architecture of this card gives Nvidia the foundation for the real DX10 cards this time next year. Same for ATI and the R600.

Rumored specs indeed. And it most likely won't be dog slow with a DX10 title. The move to DX10 is nothing like the step from DX8 to 9 - the main catch for DX10 is vastly increased efficiency. History doesn't always repeat itself, especially not if it's written on different paper.

efficiency :p

you mean 'efficiency' in the future. . . . years off for games ;)

DX9 has barely been tapped by game programmers --at the end of '06 . . . SM 3.0 has "room" to write at least 20 times more shader instructions than you find in ANY current title.

DX10 and SM 4.0 are looking at the distant future - after 2 to 4 years . . . NO ONE - except the most deluded Vista hype-swallowers - expects "DX10 Only" games for at least 2 years.
:thumbsdown:

This is the point I've been trying to make.

Anyone who buys G80 for anything but DX9 games and maybe some purdy DX10 demos is fooling themselves.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Griswold

Rumored specs indeed. And it most likely won't be dog slow with a DX10 title. The move to DX10 is nothing like the step from DX8 to 9 - the main catch for DX10 is vastly increased efficiency. History doesn't always repeat itself, especially not if it's written on different paper.

Um, the main feature of DX10 is unified shaders. If the card has some kind of dual-processor hybrid unit, it could very well be a poor DX10 performer. Especially if speculation holds true that G80 is really supposed to be a DX9 monster with DX10 "compatibility."

But that's neither here nor there at the moment - anyone anticipating this card in a November/Christmas timeframe is doing so for the DX9 performance; Vista won't even go retail until next year!

I'm just happy I upgraded to a new PSU!
 

oddity21

Member
Aug 1, 2006
45
0
0
Originally posted by: jiffylube1024
Um, the main feature of DX10 is unified shaders.
Hmm. Could those be the same unified shaders that are apparently not required for a video card to be branded DX10-compliant?

That's no main feature. The theoretical 8x efficiency and the geometry shader are the main features.
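
For illustration, here's a rough conceptual sketch in Python (purely illustrative -- real geometry shaders are written in HLSL and run on the GPU; every name and number below is made up) of what the geometry shader stage adds: a program that runs once per primitive, sees all of its vertices at once, and can emit new primitives in its place without a CPU round trip:

# Conceptual sketch of the DX10/SM 4.0 geometry shader idea -- NOT real
# graphics API code. A DX9 vertex shader sees one vertex at a time; a
# geometry shader sees the whole primitive and may emit several new ones.

def geometry_shader(tri):
    """Runs per triangle; returns zero or more output triangles."""
    v0, v1, v2 = tri
    # The centroid needs all three vertices at once -- exactly the kind of
    # whole-primitive access a per-vertex DX9 shader can't do.
    centroid = tuple((a + b + c) / 3.0 for a, b, c in zip(v0, v1, v2))
    # Amplification: fan one input triangle into three around its centroid.
    return [[v0, v1, centroid], [v1, v2, centroid], [v2, v0, centroid]]

mesh = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]]
refined = [out for tri in mesh for out in geometry_shader(tri)]
print(len(refined))  # 1 input triangle -> 3 output triangles, no CPU involved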
 

imported_RedStar

Senior member
Mar 6, 2005
526
0
0
Let's not single out "a main feature", as this DirectX 10 developer explains:

"ExtremeTech: What are the big bullet points?the main, big changes in DirectX 10 graphics over what we have in DX9?

Blythe: That's actually a fairly complicated question in the sense that there's a lot that has changed. I want to be careful not to point out anything as, "oh, this is the defining feature." Some aspects of it are to do with the implementation that, one of the key things that we were trying to solve with DirectX 10 was to make it more efficient. And so there's some structural things in the way the API works where what we tried to improve is what people have commonly referred to as the "small batch problem." So that applications can send a lot more material changes or changes to the description of geometry a lot more frequently. But that doesn't show up at any "bullet point" feature in the API, like "oh turn on this switch and you get better performance." It sort of underpins the design.

And then there's a lot of things about feature consistency in hardware implementations that allow developers to have an easier job of targeting a wider variety of platforms. That doesn't show up as any single feature, but it's a big deal to get agreement on what this common set of features is and it will be available everywhere.

Then when you start getting into individual features, things like adding an integer instruction set, the new geometry shader is a fairly big deal. In terms of adding that in the middle of the pipeline and having access to all the parameters of a primitive -- for example the three vertices of a triangle or the two endpoints of a line -- and being able to operate on that as a whole, sort of changes the game. There are things that we've done where, part of the objective was to make it possible to do more processing on the GPU and to do this in a way where there wasn't a lot of mediation that needed to be done on the CPU. So you can do iterative types of computations on the GPU without having to sort of return data to the CPU, do a little bit more work there, and then return the data to the GPU. The way data formats work and can be moved back and forth between different parts of the pipeline show up as being important features for enabling this, but they're not...


"

http://www.extremetech.com/article2/0,1697,1982032,00.asp
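
To make the "small batch problem" concrete, here's a toy back-of-envelope model in Python (every constant is invented for illustration; real DX9/DX10 overheads vary by driver and hardware) of why per-draw-call CPU overhead, rather than raw GPU horsepower, is what the DX10 API redesign attacks:

# Toy model of the "small batch problem" -- all numbers are made up.
DX9_MS_PER_DRAW = 0.040    # CPU cost to validate/submit one DX9 draw call
DX10_MS_PER_DRAW = 0.010   # DX10 aimed to cut this per-call overhead
GPU_MS_PER_TRI = 0.00001   # GPU time per triangle (same hardware either way)

def frame_time_ms(draw_calls, tris_per_call, ms_per_draw):
    cpu = draw_calls * ms_per_draw                     # API submission cost
    gpu = draw_calls * tris_per_call * GPU_MS_PER_TRI  # actual rendering cost
    return max(cpu, gpu)  # CPU and GPU overlap; the slower side sets the pace

# The same one million triangles, as many small batches vs. a few big ones:
for calls, tris in [(10_000, 100), (100, 10_000)]:
    t9 = frame_time_ms(calls, tris, DX9_MS_PER_DRAW)
    t10 = frame_time_ms(calls, tris, DX10_MS_PER_DRAW)
    print(f"{calls:>6} calls x {tris:>6} tris: DX9 {t9:6.1f} ms, DX10 {t10:6.1f} ms")

With many small batches the DX9 frame is entirely CPU-bound, which is exactly why Blythe calls this something that "underpins the design" rather than a bullet-point feature.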
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Creig
Originally posted by: Gstanfor
Funny how the fanatics just can't seem to forget nv30, isn't it? Even though most of them had R300's and had no intention of actually purchasing it, even if it was good? No surprise to see such reminiscing occurring in a thread about future nvidia chips either...

Even more interesting when the fanatic in question tells us in an earlier post he has no interest in this upcoming gen of cards *or* the refreshes thereof...


Originally posted by: Gstanfor
There is nothing "to understand", apoppin. You could discuss g80 without mentioning nv30 at all, or stating any of a number of things that you have in this thread so far, but that's not what you are here for, is it? So far as I can tell you are filling in for Ackmed...

And you seem unable to create 5 posts in a row without using the word "fanatic" in at least one of them. Honestly, if the Moderators were to delete every single post of yours that contained the word "fanatic", your overall post count would drop by at least 20%.






Lighten up, Francis.

What's the matter, Creig? Does the truth hurt you?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: apoppin

ah yes . . . NV30 . . . brand new architecture to run DX9 and to compete with r300 [9700p] . . .

nvidia launched the NV30 chip, better known as the GeForce FX 5800 Ultra . . . after spending BIG bucks . . . ta da . . .

the DustBuster :p

in case you forgot, Matt . . .
What's under the Dustbuster?

i seriously doubt it will happen again . . . but then ATi might be preparing a surprise . . .

Funny how the fanatics just can't seem to forget nv30, isn't it? Even though most of them had R300's and had no intention of actually purchasing it, even if it was good? No surprise to see such reminiscing occurring in a thread about future nvidia chips either... :disgust:

Even more interesting when the fanatic in question tells us in an earlier post he has no interest in this upcoming gen of cards *or* the refreshes thereof...

you are really not worth talking to

everyone else understands

There is nothing "to understand", apoppin. You could discuss g80 without mentioning nv30 at all, or stating any of a number of things that you have in this thread so far, but that's not what you are here for, is it? So far as I can tell you are filling in for Ackmed...

so now you're back to prove you're 100% clueless. :p
:Q

as i said, everyone else understands ;)

. . . and you are trying to fill in for Rollo . . . too bad . . . you don't have his skills
:thumbsdown:

I fill in for nobody, apoppin. As I said before, there was NO reason for anyone to bring up nv30 in the context of G80, let alone talk down G80 before it's even released - unless, of course, you are a fanatic (and your post history clearly shows that you are). Tell me, is your B3D persona WaltC by chance? Your mentalities and nv30 fixations match rather closely...

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
As I said before, there was NO reason for anyone to bring up nv30 in the context of G80.
Why? Big hype with (so far) uncertain performance? Seems related thus far. If G80 comes out and kills everything on the market, so be it; it won't be another nv30 then.
...let alone talk down G80 before it's even released
How many G80 threads are there / have there been? Heaven forbid we talk about what is coming to market.

I'm not saying that G80 will be the next nv30, but I'm not saying it won't be either. Who knows. Personally, I don't believe it will, but I can't be certain because it hasn't arrived. I just don't see the sense in being so defensive about a bad GPU Nvidia released. Don't be so angry that Nvidia did something wrong in the past.
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: josh6079
I'm not saying that G80 will be the next nv30, but I'm not saying it won't be either. Who knows. Personally, I don't believe it will, but I can't be certain because it hasn't arrived. I just don't see the sense in being so defensive about a bad GPU Nvidia released. Don't be so angry that Nvidia did something wrong in the past.

Let's be honest here. Both companies have had duds (read: failures) in the past. There are plenty of fanboys running around here on either side. I'm going to go with whatever gives me the most FPS in a single card at any given time, and I don't give a damn what name is etched on the GPU.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
As I said before, there was NO reason for anyone to bring up nv30 in the context of G80.
Why? Big hype with (so far) uncertain performance? Seems related thus far. If G80 comes out and kills everything on the market, so be it; it won't be another nv30 then.

Well, we could always mention the X1800... :D

 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: Pabster
Originally posted by: josh6079
I'm not saying that G80 will be the next nv30, but I'm not saying it won't be either. Who knows. Personally, I don't believe it will, but I can't be certain because it hasn't arrived. I just don't see the sense in being so defensive about a bad GPU Nvidia released. Don't be so angry that Nvidia did something wrong in the past.

Let's be honest here. Both companies have had duds (read: failures) in the past. There are plenty of fanboys running around here on either side. I'm going to go with whatever gives me the most FPS in a single card at any given time, and I don't give a damn what name is etched on the GPU.

Most sensible post I've seen in a while. :thumbsup:

Hell, my last card was a Radeon 9700 after the FX debacle. Just go for what's best now, and be doggone happy! :D
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Let's be honest here. Both companies have had duds (read: failures) in the past. There are plenty of fanboys running around here on either side. I'm going to go with whatever gives me the most FPS in a single card at any given time, and I don't give a damn what name is etched on the GPU.
QFT. When I had my 7800GT's, they were one of the fastest graphics-rendering setups you could have--still are pretty nice. I've had several X1900's--one XTX, one GT, and now one XT--and I'll take whatever gives me as much overclocking control as I've got with my X1900, and more FPS.
Well, we could always mention the X1800... :D
We could, and that would be fine. I just don't understand Gstanfor's defensiveness over an Nvidia card that was indeed bad and was considered so by almost everyone. Just because someone brought it up isn't a reason to rant about it and call whoever instigated its discussion a fanATIc.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Well, the G80 won't necessarily be dog slow in DX10. Remember the 9700 Pro? That thing was pretty fast in the first DX9 games and ran DX9 pretty darn well. Hell, it can still play DX9 games today; it's just a tad slower than a 9800 Pro.

So who knows, maybe the 8800 will be the next 9700 Pro (and hopefully it won't be the next 5800 FX).
 

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
Hopefully DX10 in G80 won't be like DX9 was in the FX series, which was absolutely awful. :disgust:


Jason
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
Originally posted by: Matt2
Originally posted by: mamisano
Why purchase the "G80" card when it will do nothing for DX10 until DX10 is actually released in Vista? Sit back and wait until the Radeon is out so you can at least compare which is best at DX10. I don't know why people have to jump at any new product to hit the market... what for, bragging rights?

Have you seen the rumored G80 specs?

The fact that G80 is DX10 compliant is just a marketing gimmick as far as I'm concerned.

This card will smoke any DX9c card on the market. G80's bread and butter will be DX9.

DX10 performance will probably be dog slow, but the architecture of this card gives Nvidia the foundation for the real DX10 cards this time next year. Same for ATI and the R600.



What he said.

DX9 will be where this card will rule. DX10 support is just a selling point.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: josh6079
As I said before, there was NO reason for anyone to bring up nv30 in the context of G80.
Why? Big hype with (so far) uncertain performance? Seems related thus far. If G80 comes out and kills everything on the market, so be it; it won't be another nv30 then.
...let alone talk down G80 before it's even released
How many G80 threads are there / have there been? Heaven forbid we talk about what is coming to market.

I'm not saying that G80 will be the next nv30, but I'm not saying it won't be either. Who knows. Personally, I don't believe it will, but I can't be certain because it hasn't arrived. I just don't see the sense in being so defensive about a bad GPU Nvidia released. Don't be so angry that Nvidia did something wrong in the past.

What big hype? There has been little to no hype from nvidia... ATI only wishes they could say the same, with their engineers talking about R600 in forums for years now...