Gabe Newell hates DX10


cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: nemesismk2
Originally posted by: cmdrdredd
Originally posted by: Chesebert
Originally posted by: tuteja1986
Originally posted by: Chesebert
I am glad he hates DX10 because I hate vista, so I have no interest in DX10 whatsoever; If MS comes out with SP2 for vista I will reconsider :D

There's nothing really wrong with Vista; SP1 is needed, as it was needed so badly with XP because of the boom of broadband. The problem is trashy developers who can't integrate DX10 efficiently. Anyway, Crysis will come out in DX10 and everyone will be saying... "DX10 is okay".

Well, easy for you to say. Half of my programs don't work with Vista and I am not about to spend another $1k on software upgrades. My network is half as fast under Vista vs. XP, my games run slower in Vista, I couldn't get my E-MU sound card to work, my SlimServer is not working properly, etc. The whole system just feels slower than XP. I lived with Vista for a couple of months and I am getting out of the Vista camp for now.

There are 5 people who use Vista and like it over XP for every 1 who tried it and hated it.

Prove it. Where are your facts, proof, etc. to back up your statement?

First of all, let me say this: the people who like Vista and use it daily do not come here to cry about it. That's a fact. It's like with anything - ONLY the people who don't like something, usually for a silly reason such as "I hate DRM so I won't use Vista," will post about it.

That's why it seems like people hate Vista; they don't. In fact, most people who use it, give it some time, and work with it really see the benefits of it as an OS. It's more crash-free for one thing, and that's a godsend to me.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I personally bought Vista to future-proof my rig, and I figured I would need to upgrade to it eventually anyway.

I purchased it after owning my 8800GTS for some time. I thought that Vista would help take advantage of the DX10 features of the card. I had no idea that DX10 would tank this badly and be so poorly accepted by the gaming industry. In retrospect, I probably could have stuck with XP, at the very least until Crysis comes out.

When I first got Vista, I had serious system stability issues, to the point that my GF asked me to uninstall it. For the past few months it has been much better. My gaming performance now matches that of XP, and the system feels more responsive. I do like the new look of the OS.

DX10 does seem like a bit of a bad joke to me. It doesn't add anything new visually. It's supposed to be more efficient with complex scenes, but so far DX9 has proven to be far more efficient. So why use it? Hopefully some good programmers can give us a reason. ;)
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: ConstipatedVigilante
I still just don't get why developers don't dump DX and its idiocy for OpenGL. id and John Carmack (bless his nerdy voice) have shown it to be extremely capable as a graphics API, not only in delivering beautiful effects, but great performance. id is best known for cleanly-coded and smooth-running games with cutting-edge graphics. As advances are made, there's no need to program for new APIs and platforms - you just adapt the same code for all platforms. In contrast, almost all DX games out today run like crap and are filled with graphical bugs.

In conclusion, I hate microsoft.


Yeah, but OpenGL graphics always have that plasticky look that is absolute shit IMO. That goes for Riddick, Doom 3, Quake 4, Quake Wars, etc. That alone is a reason why I like DX. I can't stand those plasticky, cartoonish, fake-looking graphics.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Frackal
Originally posted by: ConstipatedVigilante
I still just don't get why developers don't dump DX and its idiocy for OpenGL. id and John Carmack (bless his nerdy voice) have shown it to be extremely capable as a graphics API, not only in delivering beautiful effects, but great performance. id is best known for cleanly-coded and smooth-running games with cutting-edge graphics. As advances are made, there's no need to program for new APIs and platforms - you just adapt the same code for all platforms. In contrast, almost all DX games out today run like crap and are filled with graphical bugs.

In conclusion, I hate microsoft.


Yeah, but OpenGL graphics always have that plasticky look that is absolute shit IMO. That goes for Riddick, Doom 3, Quake 4, Quake Wars, etc. That alone is a reason why I like DX. I can't stand those plasticky, cartoonish, fake-looking graphics.

Hmmm... IMO, some of the most cartoonish-looking games are running DX. HL2 is a prime example.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: SickBeast
I personally bought Vista to future-proof my rig, and I figured I would need to upgrade to it eventually anyway.

I purchased it after owning my 8800GTS for some time. I thought that Vista would help take advantage of the DX10 features of the card. I had no idea that DX10 would tank this badly and be so poorly accepted by the gaming industry. In retrospect, I probably could have stuck with XP, at the very least until Crysis comes out.

When I first got Vista, I had serious system stability issues, to the point that my GF asked me to uninstall it. For the past few months it has been much better. My gaming performance now matches that of XP, and the system feels more responsive. I do like the new look of the OS.

DX10 does seem like a bit of a bad joke to me. It doesn't add anything new visually. It's supposed to be more efficient with complex scenes, but so far DX9 has proven to be far more efficient. So why use it? Hopefully some good programmers can give us a reason. ;)

Your last sentence sums it up: good programmers. None of the games that use DX10 were done properly. That's a fact.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Matt2
Originally posted by: Frackal
Originally posted by: ConstipatedVigilante
I still just don't get why developers don't dump DX and its idiocy for OpenGL. id and John Carmack (bless his nerdy voice) have shown it to be extremely capable as a graphics API, not only in delivering beautiful effects, but great performance. id is best known for cleanly-coded and smooth-running games with cutting-edge graphics. As advances are made, there's no need to program for new APIs and platforms - you just adapt the same code for all platforms. In contrast, almost all DX games out today run like crap and are filled with graphical bugs.

In conclusion, I hate microsoft.


Yeah, but OpenGL graphics always have that plasticky look that is absolute shit IMO. That goes for Riddick, Doom 3, Quake 4, Quake Wars, etc. That alone is a reason why I like DX. I can't stand those plasticky, cartoonish, fake-looking graphics.

Hmmm... IMO, some of the most cartoonish-looking games are running DX. HL2 is a prime example.

HL2 looked much more realistic than everything shining like it's been dipped in suntan oil (Doom 3 engine).
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
The look of the game has more to do with the artist doing the textures than it does the game engine.
For every texture there can be a texture for specular mapping.
Set the mapping too high and everything is shiny; too low and everything looks flat/matte.
It also depends on whether texture baking is being used and how it's used.

OpenGL and DX have nothing to do with how cartoonish/realistic the output is.
They are simply tools.
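A minimal sketch of the point about specular mapping, assuming the standard Blinn-Phong model (which both APIs' shader languages can express): the specular exponent an artist's specular map supplies is what makes a surface read as shiny or matte, independent of the API.

```python
import math

def blinn_phong_specular(n, l, v, shininess):
    """Specular term of the Blinn-Phong model for unit vectors.

    n: surface normal, l: direction to light, v: direction to viewer.
    shininess: the exponent a specular map would supply per-texel.
    """
    # Half-vector between light and view directions, normalized.
    h = [l[i] + v[i] for i in range(3)]
    norm = math.sqrt(sum(c * c for c in h))
    h = [c / norm for c in h]
    n_dot_h = max(0.0, sum(n[i] * h[i] for i in range(3)))
    return n_dot_h ** shininess

n = [0.0, 0.0, 1.0]  # surface facing the camera
l = [0.6, 0.0, 0.8]  # light off to the side
v = [0.0, 0.0, 1.0]  # viewer straight on

# A high exponent gives a tight "plasticky" highlight that falls off fast;
# a low exponent gives a broad, matte sheen. Same math either way.
shiny = blinn_phong_specular(n, l, v, 64)
matte = blinn_phong_specular(n, l, v, 2)
print(shiny < matte)  # True: away from the peak, the shiny highlight is dimmer
```

The same function could be written as an HLSL or GLSL pixel shader; the artist-chosen exponent, not the API, decides the look.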




 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
I don't know about that. All the OpenGL games I've played have had the distinct look I wrote of above.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Modelworks
The look of the game has more to do with the artist doing the textures than it does the game engine.
For every texture there can be a texture for specular mapping.
Set the mapping too high and everything is shiny; too low and everything looks flat/matte.
It also depends on whether texture baking is being used and how it's used.

OpenGL and DX have nothing to do with how cartoonish/realistic the output is.
They are simply tools.

This man speaks the truth.

IMO DX10 is just another tool for MS to advertise and drive the "Vista" campaign. Having these tools become "OS dependent" is pure BS. OpenGL 3.0 can do everything DX10 can do, except independent of the OS. (We'll need cards that support OpenGL 3.0, though.)

And don't even think about starting the Doom 3 vs. HL2 thing. It brings "shivers" down my spine. Both have their own merits and cons, so let's bring it to a halt here.

The thing I don't like even more is the talk of DX10.1, etc., when we don't even have DX10 games yet; they're already talking about revisions.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Originally posted by: Matt2
Originally posted by: Frackal
Originally posted by: ConstipatedVigilante
I still just don't get why developers don't dump DX and its idiocy for OpenGL. id and John Carmack (bless his nerdy voice) have shown it to be extremely capable as a graphics API, not only in delivering beautiful effects, but great performance. id is best known for cleanly-coded and smooth-running games with cutting-edge graphics. As advances are made, there's no need to program for new APIs and platforms - you just adopt the same code for all platforms. In contrast, almost all DX games out today run like crap and are filled with graphical bugs.

In conclusion, I hate microsoft.


Yeah, but OpenGL graphics always have that plasticy look that is absolute shit IMO. That goes for Riddick, Doom 3, Quake 4, QuakeWars, etcetera. That alone is a reason why I like DX. I can't stand those plasticy cartoonish fake looking graphics

Hmmm... IMO, some of the most cartoonish looking games are running DX. HL2 is a prime example.

HL2 has cartoonish graphics? Are you wearing your oh-so-special Virtual Reality glasses when playing it?
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Cookie Monster


IMO DX10 is just another tool for MS to advertise and drive the "Vista" campaign. Having these tools become "OS dependent" is pure BS. OpenGL 3.0 can do everything DX10 can do, except independent of the OS. (We'll need cards that support OpenGL 3.0, though.)

True, OpenGL can do the DX10 "features" that MS claims can't be done on XP, on XP.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Modelworks
Originally posted by: Cookie Monster


IMO DX10 is just another tool for MS to advertise and drive the "Vista" campaign. Having these tools become "OS dependent" is pure BS. OpenGL 3.0 can do everything DX10 can do, except independent of the OS. (We'll need cards that support OpenGL 3.0, though.)

True, OpenGL can do the DX10 "features" that MS claims can't be done on XP, on XP.

But it takes a lot more work by a developer. It's much easier to make a DX-based game than an OGL game.
 

niggles

Senior member
Jan 10, 2002
797
0
0
Originally posted by: cmdrdredd
Originally posted by: niggles
First off, I installed BioShock on both XP and Vista, and I did in fact like the look better on Vista. The filtering looked nicer, although I have to say that the particle effect differences in DX9 compared to DX10 weren't huge, but they were a lot more seamless under DX10. Performance-wise it was waaaay better under XP, which is why I installed it on both rather than run it in compatibility mode under Vista.

Next up, I'm guessing everyone has heard that DX10.1 is coming out around the same time as SP1 for Vista, some time around March? Apparently it will require all new hardware... well, I'm sorry, I'm not buying it... they make a minor change to DirectX and I'm supposed to buy a couple of new GPUs? I don't think so.

1) The game does not run in "compatibility mode".
2) DX10.1, from what it looks like to me and what I've gathered, is a software-side change. It doesn't affect the hardware, since all hardware that conforms to DX10 specs can do 4xAA and 32-bit floating point. Therefore it will just become standard practice, and required, to implement 4xAA and 32-bit floating point in the engine.

Anyone have any comment on this? I didn't make this up but have been hearing things about it being unrelated to HW at all. If this is wrong don't shoot me... I read this on another forum.

Arrrg. OK,

1. The game *does* run in compatibility mode... if you run it in compatibility mode. Have you tried running it in compatibility mode? Because I have. Give it a whirl and you'll see it can.

2. Please listen to last week's PCGamer podcast. This isn't some rumor heard around the internet; this is Microsoft's press release.

Rest assured I will not shoot you :)

 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: niggles
Originally posted by: cmdrdredd
Originally posted by: niggles
First off, I installed BioShock on both XP and Vista, and I did in fact like the look better on Vista. The filtering looked nicer, although I have to say that the particle effect differences in DX9 compared to DX10 weren't huge, but they were a lot more seamless under DX10. Performance-wise it was waaaay better under XP, which is why I installed it on both rather than run it in compatibility mode under Vista.

Next up, I'm guessing everyone has heard that DX10.1 is coming out around the same time as SP1 for Vista, some time around March? Apparently it will require all new hardware... well, I'm sorry, I'm not buying it... they make a minor change to DirectX and I'm supposed to buy a couple of new GPUs? I don't think so.

1) The game does not run in "compatibility mode".
2) DX10.1, from what it looks like to me and what I've gathered, is a software-side change. It doesn't affect the hardware, since all hardware that conforms to DX10 specs can do 4xAA and 32-bit floating point. Therefore it will just become standard practice, and required, to implement 4xAA and 32-bit floating point in the engine.

Anyone have any comment on this? I didn't make this up but have been hearing things about it being unrelated to HW at all. If this is wrong don't shoot me... I read this on another forum.

Arrrg. OK,

1. The game *does* run in compatibility mode... if you run it in compatibility mode. Have you tried running it in compatibility mode? Because I have. Give it a whirl and you'll see it can.

2. Please listen to last week's PCGamer podcast. This isn't some rumor heard around the internet; this is Microsoft's press release.

Rest assured I will not shoot you :)

The game runs natively on Vista, though. You do not have to specifically check "run it in compatibility mode".
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Originally posted by: Frackal
I don't know about that. All the OpenGL games I've played have had the distinct look I wrote of above.

Most are based on the Doom 3 engine, which just looks like that because that's how it was coded. It has nothing to do with OGL itself, which can do exactly the same things as DX.

Originally posted by: Arkaign
Show me *any* game/demo that runs faster in DX10 vs the DX9 mode.

http://enthusiast.hardocp.com/...w1LCxoZW50aHVzaWFzdA==

BioShock (on Nvidia at least) actually runs faster in DX10 - well, similar fps but better visuals.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: cmdrdredd
Originally posted by: Frackal
I don't know about that. All the OpenGL games I've played have had the distinct look I wrote of above.

I agree here too.

That's because the game developers decided to make it look that way, not because of using DX or OpenGL. OpenGL and DX are just programming interfaces to the hardware, and regardless of which one a developer uses, all those instructions get decoded by the driver into a common machine language that the video card understands. Both OpenGL and DX support shader programs, and the devs who write these programs specify how materials interact with lighting in the game.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: munky
Originally posted by: cmdrdredd
Originally posted by: Frackal
I don't know about that. All the OpenGL games I've played have had the distinct look I wrote of above.

I agree here too.

That's because the game developers decided to make it look that way, not because of using DX or OpenGL. OpenGL and DX are just programming interfaces to the hardware, and regardless of which one a developer uses, all those instructions get decoded by the driver into a common machine language that the video card understands. Both OpenGL and DX support shader programs, and the devs who write these programs specify how materials interact with lighting in the game.

I know, but it's still distracting for everything to look plastic, like clay.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: kakarotxiv
Right now DX10 is meh. Plus ya need the crapfest named Vista for it. Enough to turn everyone off.

*sigh* Go cry about Vista somewhere else, thanks.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
As a hobbyist developer and graphics programmer, I love all the ideas behind DX10; minimizing context switches and state changes, batching, unifying the pipeline, adding and deleting geometry on the fly, etc.

Unifying the vertex and pixel pipeline with the same instructions and registers, and making the GPU a massive vector unit, is a great idea. It eliminates the load-balancing conflict of fill rate vs. vector rate and treats the GPU just like a multi-threaded, multi-core CPU, where any free core (pipe or shader unit in GPU speak) can grab the next chunk of work indifferently.

But I'm not about to switch to Windows XP ME in order to use it.

DRM and hardware/OS-level rights management, memory encryption, and being completely black-boxed out of my own PC without being able to enable or disable it as I please are also an unappealing prospect to me. Sorry, but I don't subscribe to the pay-per-use/view model on anything, much less my PC. I will not participate in a PC platform that encourages a totally controlled, locked-down, and nickel-and-dimed environment that follows the cell phone industry just because a few lazy fat cats are upset that 1% of the population is tired of paying indefinitely to watch Steamboat Willie.

Come to think of it, it is the promise of "perfect" DRM that left a sour taste in my mouth with Vista. I refuse to put something on my computer that will restrict and deny me because I don't have an approved cable or similar nonsense. It's MY fxxking PC, not a cable company's leased pay-per-view box.
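The load-balancing win exdeath describes can be sketched as a single shared work pool - a toy model only, not real GPU scheduling: vertex and pixel jobs go into one queue, and any free worker ("shader unit") takes whatever comes next, so no unit idles while the other kind of work piles up.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy model of a unified shader pool. Job functions are illustrative
# stand-ins, not real graphics work.

def vertex_job(i):
    # Stand-in for transforming a batch of vertices.
    return ("vertex", i * 2)

def pixel_job(i):
    # Stand-in for shading a tile of pixels.
    return ("pixel", i + 1)

# A frame with a lopsided mix: 4 vertex batches, 6 pixel tiles.
jobs = [(vertex_job, i) for i in range(4)] + [(pixel_job, i) for i in range(6)]

with ThreadPoolExecutor(max_workers=4) as pool:  # 4 generic "shader units"
    results = list(pool.map(lambda job: job[0](job[1]), jobs))

# All 10 jobs complete regardless of the vertex/pixel mix; with fixed
# dedicated vertex and pixel units, one side would sit idle.
print(len(results))  # 10
```

With fixed-function hardware the split of units is baked in at design time; the unified model lets the mix vary per frame.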
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Originally posted by: cmdrdredd
Originally posted by: kakarotxiv
Right now DX10 is meh. Plus ya need the crapfest named Vista for it. Enough to turn everyone off.

*sigh* Go cry about Vista somewhere else, thanks.

Having not used Vista, but hearing that 64-bit Vista's performance exceeds XP's outside of games, I believe Vista has an architecture the driver manufacturers can rely on.

It's taken a while for drivers in Vista to get performance on par with XP, and if the performance gains and image quality delivered in the games to come are spotty, I would not upgrade from XP to Vista until the card manufacturers stop making effective drivers for XP.

Since Aero is a DX client, I can only assume that Vista has the ability to help card drivers integrate into the system. So at this point I am looking to the card makers to improve the drivers, not to MS to improve Vista's interface with them.

But the 8800s have shown themselves to be beasts when drivers are not faulty, and seeing BioShock's recent upside-down performance in Vista (2900XT beats 8800GTX; no AA support), I have to wonder why the card makers are stumbling. Whoever is to blame, it is certain that the leap to Vista is just a matter of time.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: exdeath
As a hobbyist developer and graphics programmer, I love all the ideas behind DX10; minimizing context switches and state changes, batching, unifying the pipeline, adding and deleting geometry on the fly, etc.

Unifying the vertex and pixel pipeline with the same instructions and registers, and making the GPU a massive vector unit, is a great idea. It eliminates the load-balancing conflict of fill rate vs. vector rate and treats the GPU just like a multi-threaded, multi-core CPU, where any free core (pipe or shader unit in GPU speak) can grab the next chunk of work indifferently.

But I'm not about to switch to Windows XP ME in order to use it.

DRM and hardware/OS-level rights management, memory encryption, and being completely black-boxed out of my own PC without being able to enable or disable it as I please are also an unappealing prospect to me. Sorry, but I don't subscribe to the pay-per-use/view model on anything, much less my PC. I will not participate in a PC platform that encourages a totally controlled, locked-down, and nickel-and-dimed environment that follows the cell phone industry just because a few lazy fat cats are upset that 1% of the population is tired of paying indefinitely to watch Steamboat Willie.

Come to think of it, it is the promise of "perfect" DRM that left a sour taste in my mouth with Vista. I refuse to put something on my computer that will restrict and deny me because I don't have an approved cable or similar nonsense. It's MY fxxking PC, not a cable company's leased pay-per-view box.

Um... go to the OS forum. And for your information, DRM doesn't affect me, and I use Vista x64 every damn day.
 

kmmatney

Diamond Member
Jun 19, 2000
4,363
1
81
Originally posted by: Modelworks
The reason developers don't just drop DirectX and go OpenGL is that most of the current crop of programmers only know DirectX.
It's a choice of: do I re-learn most of what I know about graphics programming, or do I stick with what I know and make the best of it?

The benefit of OpenGL is that it's multi-platform; it also works on almost any hardware.
OpenGL allows flexibility.
If you're writing a game engine and you want to support pixel shader 3.0, you can do that, and someone who is using a pixel shader 2.0 card can still run the game.
What would happen, though, is the driver for your video card would be responsible for emulating the 3.0 support.
Depending on how well the driver works, the game may or may not display correctly with the 2.0 card.

This is what MS was trying to avoid with DX. They wanted to make it so that developers did not have to concern themselves with what features a card supported, and could just write the game for a standard like DX9; any video card supporting that standard would run it properly.

The problem with that approach is that it's all or nothing. You either comply and can run the game, or you can't. If the standard changes, like DX10.1, you are stuck with a card that is now useless for all future DX10.1 games, even if your card was a really fast DX10 part.

As much as video cards cost now, I think it's both a wasteful and an ignorant way to do a graphics API.
Completely banning a card from a DX version because it lacks a feature that may be no more than better lighting or sharper textures seems wrong.

Let the end user decide if whatever feature is important enough for them to upgrade , not MS.


A quick quote on how OpenGL handles hardware:

If the hardware consists only of an addressable framebuffer, then OpenGL must be implemented almost entirely on the host CPU. More typically, the graphics hardware may comprise varying degrees of graphics acceleration, from a raster subsystem capable of rendering two-dimensional lines and polygons to sophisticated floating-point processors capable of transforming and computing on geometric data. The OpenGL implementor's task is to provide the CPU software interface while dividing the work for each OpenGL command between the CPU and the graphics hardware. This division must be tailored to the available graphics hardware to obtain optimum performance in carrying out OpenGL calls.


I don't know about OGL being that much more flexible than DirectX. For instance, Half-Life 2 was DirectX, but could run happily over several generations of hardware with its DirectX 7 compatibility mode. Doom 3, in contrast, could not run well on older cards. I thought Valve made a very good choice with DirectX.

I'm personally not going anywhere near DirectX 10 until I have to, though. I still use Win XP and Win 2000...
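The graceful-degradation idea in Modelworks's post (write for the highest shader model, fall back on lesser cards rather than refusing to run) can be sketched roughly like this - the table of paths, the model numbers, and the function name are all made up for illustration, not any real engine's API:

```python
# Toy capability-based fallback: pick the best render path a card
# supports instead of the all-or-nothing DX-version gate.
# (Hypothetical path names and shader-model numbers.)
RENDER_PATHS = {
    3.0: "full shaders (HDR, per-pixel lighting)",
    2.0: "reduced shaders (per-pixel lighting only)",
    1.1: "fixed-function fallback",
}

def pick_render_path(card_shader_model):
    """Return the best path the card can run, checking the most
    demanding requirement first and degrading from there."""
    for required in sorted(RENDER_PATHS, reverse=True):
        if card_shader_model >= required:
            return RENDER_PATHS[required]
    raise RuntimeError("card below minimum spec")

print(pick_render_path(3.0))  # full shaders (HDR, per-pixel lighting)
print(pick_render_path(2.0))  # reduced shaders (per-pixel lighting only)
```

This mirrors what Half-Life 2 did with its DX7/DX8/DX9 paths: older cards get a simpler path rather than a refusal to launch.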