"Why DX11 will save the video card industry and...

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Dude, the article was linked, you don't need to post it in its entirety.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Phynaz
Dude, the article was linked, you don't need to post it in its entirety.

Agreed. If you're going to discuss someone's article, at least give them the hits.
 

Negronpope

Junior Member
May 29, 2006
23
0
0
Clearly stated facts. If you're not gaming, then a first-generation DX9 card (like a Radeon 9700) is still a great performer under all current OSes. It runs Aero very well (far better than most current integrated solutions). So why follow the 'mandatory' upgrade path? I went from DX7 straight to DX9, and I haven't upgraded most of my computers because DX9 has proven more than adequate.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Well some points are kind of valid, but DirectX 11 is, in some ways, catering to those points.
There is a "good enough" point for graphics in some ways, and having super amazing graphics doesn't always matter (e.g. TF2, Battlefield Heroes vs Crysis).
One of the things DX10.1 did, IIRC, was make AA more efficient, so you could get better AA performance.
DX11 is adding things like tessellation, multithreaded rendering, new texture compression formats, etc., which in theory could lead to improved performance over current hardware (rough sketch of the multithreaded rendering idea below).

If developers can use DX11 and subsequent releases to improve the efficiency of graphics, then it means that even lower end cards might be able to give acceptable levels of performance with good enough graphics, which would be quite good for gaming but not so much for GPU manufacturers.
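For the curious, the multithreaded rendering part of DX11 mostly comes down to "deferred contexts": worker threads record draw calls into their own command lists, and the main thread replays them on the immediate context. Here is a minimal C++ sketch, assuming a device and immediate context already exist; the function name and the omitted draw calls are placeholders of mine, not anything from the article.

// Minimal sketch of D3D11 multithreaded rendering via a deferred context.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Intended to run on a worker thread: record commands, then hand the
// resulting command list to the immediate context for playback.
void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateCtx)
{
    ComPtr<ID3D11DeviceContext> deferredCtx;
    if (FAILED(device->CreateDeferredContext(0, &deferredCtx)))
        return;

    // ... set state and issue draw calls on deferredCtx here ...

    ComPtr<ID3D11CommandList> commandList;
    if (FAILED(deferredCtx->FinishCommandList(FALSE, &commandList)))
        return;

    // Playback on the immediate context (typically done by the main/render thread).
    immediateCtx->ExecuteCommandList(commandList.Get(), FALSE);
}

The win is that the expensive part of building a frame (state setup and draw submission) can be spread across cores instead of serializing on one render thread.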
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
In less than 10 years, 3D stacked-OLED screens will be on sale, requiring graphics processing 1000 times more powerful than what we have now (for games at least).

I'm imagining 3D cubes made of stacked, 0.2mm-thick, transparent OLED screens running true 3D, as opposed to holograms or glasses. Resolutions of ****x****x****. The screens already exist, as does the idea among manufacturers in Japan; it will happen unless some pseudo-3D manages to offer similar quality. When it does happen, Moore's Law will need some time to catch up.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
I question whether or not the person who wrote the article has any idea what the hell they're talking about.
Nintendo has realized that "good enough" is where the money is
Nintendo actually got it right. Wii all laughed at first but they're the ones who've staged the biggest comeback in gaming history because they figured the market out. They knew that yes HDTVs are awesome and cool, but most households in this generation (2006-2011) will only have one. The kids' rooms will all get the hand me down CRTs, and those are still standard def. So target them, and win the brothers and sisters and parents with interesting game designs that break the mold, following the successful experiment called the DS.
Did he really just imply that you can't tell the difference between a Wii and an Xbox 360 on a 480i television?

People criticize the Wii as being "last-generation", since it does not even pretend to be anything other than the overclocked GameCube Slim with a weird controller that it is. But anyone who believes this means Nintendo "has lost it" or "no longer cares about real gamers" needs to take a look at just how many centuries Nintendo has been in business and what their profit/loss record looks like.
Again, does this person have any idea what the hell they're talking about? The NES was a high-end system at the time. The SNES was a lot more powerful than the Sega Genesis. The N64 was faster than the PS1. All three of these consoles were wildly successful, and they had top-end hardware. The Wii is the one and only time a shitty console has sold the most units.

Nobody in their right mind actually believes a dedicated chip with billions of transistors to render 3D graphics is actually useful to the majority of business users. There's only one reason to have it, and the fact that such things have integrated themselves into video cards and then motherboards and coming soon CPUs only shows what we've known for ages is really the truth. Everyone Plays Games On The Computer. Everyone. Or if they don't, they want to.
This is just flat-out wrong. As with CPUs, the bulk of graphics money comes from workstation computers. While we're here on the forum bitching about which $200 gaming card to get, people using SolidWorks are thinking about buying $2300 graphics cards.

but these days barely anyone but the enthusiasts can tell the difference between a DX9 and DX10 screenshot without a side by side comparison let alone DX10 vs. DX10.1, and even then half of them are faking it to make it sound like they know what they're talking about.
This is easily the most retarded statement I've seen all week. The problem is that he's implying all or most graphical improvements are a result of new DirectX (or OpenGL) features. This couldn't be more wrong. Screen and texture resolution is infinitely more important than DirectX version. Fallout 3 set to ultra quality but run at 800x600 will always look like shit, but turning off all of the features including HDR and shadows still looks beautiful if one uses the high resolution textures and runs the game at 1680x1050. When we run games like Quake 2 at 1680x1050, the walls and floor look a bit shitty not because of the OpenGL version it uses, but because the texture resolution is very low and the models have very few polygons. If we can't tell the difference between DirectX 9 and 10 screenshots, it's because both are using the same textures and models.

Long gone are the days when a new DirectX release (which could happen several times on the same OS even without a Service Pack!) prompted a flurry of downloads and new graphics card purchases.
When has this ever been true? I remember people often wanting to stay on the old version because it always ran faster. We see the exact same mentality right now with NVIDIA's PhysX and ambient occlusion. Ambient occlusion probably does look awesome, but I'm not going to turn it on because I know it will destroy my frame rate. When Half-Life 2 came out, lots of people forced the game to run in DirectX 8 mode in order to get better frame rates.

Nobody is making drooling fanboy articles about the laundry list of DX11 features, and posting screenshots of early build games for people to debate the authenticity of. Because nobody cares anymore. And when DX11.1 and DX12 arrive, they still won't.
Indeed. DirectX 11's GPGPU features are a big enough threat to CPU demand that Intel feels the need to introduce something like Larrabee. Of course, this is because Intel just randomly does things and it has nothing to do with DirectX 11.
 

reallyscrued

Platinum Member
Jul 28, 2004
2,618
5
81
I'm with Shawn; all the cutesy homonyms are annoying as well.

Nintendo = biggest comeback in gaming history? What the hell? I think they were the only console maker in the last 2 generations to freaking post profits. When was the last time Nintendo lost money, the Virtual Boy? Market share =/= profit.....usually.

This article was written by a twelve year old who had 5 friends that had PS2s but the author decided to get a Gamecube.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: reallyscrued

This article was written by a twelve year old who had 5 friends that had PS2s but the author decided to get a Gamecube.

:D
 

Jacen

Member
Feb 21, 2009
177
0
0
Originally posted by: reallyscrued
I think they were the only console maker in the last 2 generations to freaking post profits.

This is way wrong. The 360 has been making a profit since late last year, and the PS2 made a profit, a massive, massive profit, pretty early on, comparatively.

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
To all those who are questioning what is "better" rather than what has been a bigger money maker: I don't think the article boasts the Wii as a "superior" gaming console at all. It's a "superior money maker". Every child who has seen a Wii commercial wanted, or wants, one.

We have a Wii, and what it is is a console packed with entertainment. Nobody is comparing its graphics to the 360 or PS3. But the best graphics may not be what consumers want. They just might want to be entertained, and I think Wii sales prove that this is the case.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It is true that graphics are tied to the consoles much more; that is why everything is still basically DX9c. To say that is all we need graphically, and hence that DX11 is pointless, is a lie, however. I'm sure the next-gen consoles supporting DX12 or whatever will blow the last gen away. Unfortunately, while the PC is just getting ports of 360 games, it may well be that the PC market won't see the full capabilities of anything much beyond DX9c until the Xbox 720 comes out.
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
Nobody in their right mind actually believes a dedicated chip with billions of transistors to render 3D graphics is actually useful to the majority of business users. There's only one reason to have it, and the fact that such things have integrated themselves into video cards and then motherboards and coming soon CPUs only shows what we've known for ages is really the truth. Everyone Plays Games On The Computer. Everyone. Or if they don't, they want to.
This is just flat-out wrong. As with CPUs, the bulk of graphics money comes from workstation computers. While we're here on the forum bitching about which $200 gaming card to get, people using SolidWorks are thinking about buying $2300 graphics cards.
I think he has a point here. The majority of businesses (animation, rendering, etc. aren't the majority of businesses out there) use computers for documents, spreadsheets, records, Internet browsing, mail, etc. While some businesses/users do use the GPU for other purposes, the majority of users who have a graphics card use it to play games.

but these days barely anyone but the enthusiasts can tell the difference between a DX9 and DX10 screenshot without a side by side comparison let alone DX10 vs. DX10.1, and even then half of them are faking it to make it sound like they know what they're talking about.
This is easily the most retarded statement I've seen all week. The problem is that he's implying all or most graphical improvements are a result of new DirectX (or OpenGL) features. This couldn't be more wrong. Screen and texture resolution is infinitely more important than DirectX version. Fallout 3 set to ultra quality but run at 800x600 will always look like shit, but turning off all of the features including HDR and shadows still looks beautiful if one uses the high resolution textures and runs the game at 1680x1050. When we run games like Quake 2 at 1680x1050, the walls and floor look a bit shitty not because of the OpenGL version it uses, but because the texture resolution is very low and the models have very few polygons. If we can't tell the difference between DirectX 9 and 10 screenshots, it's because both are using the same textures and models.
I think he is basing his opinion on the current implementation of DX10. An example would be this RE5 DX9 vs DX10 comparison and this Crysis DX9 vs DX10 comparison. As can be seen, the changes aren't really that noticeable in still images, much less while moving in-game.

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: ShawnD1
Again, does this person have any idea what the hell they're talking about? The NES was a high-end system at the time. The SNES was a lot more powerful than the Sega Genesis. The N64 was faster than the PS1. All three of these consoles were wildly successful, and they had top-end hardware. The Wii is the one and only time a shitty console has sold the most units.

The SNES was barely more powerful than the Genesis, which supported more point sprites and had more sound channels; the SNES was simply more flexible with some early 3D modeling and had a much bigger color palette. The N64 was much more powerful than the PS1, but in the end the PS1 proved to be the most successful console before the PS2 debuted, and it wasn't the most powerful console, since even the Saturn theoretically had more computing power, which was almost impossible to unleash.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Dribble
It is true that graphics are tied to the consoles much more; that is why everything is still basically DX9c. To say that is all we need graphically, and hence that DX11 is pointless, is a lie, however. I'm sure the next-gen consoles supporting DX12 or whatever will blow the last gen away. Unfortunately, while the PC is just getting ports of 360 games, it may well be that the PC market won't see the full capabilities of anything much beyond DX9c until the Xbox 720 comes out.

Interestingly enough though, MS seems to be following the "good enough" graphics with a fancy controller approach with project Natal. IIRC, Natal is supposed to work with the current 360, but MS is also supposedly launching a slightly amped up version of the 360 (overclocked/die shrink? no details yet) that comes with Natal included. While this isn't exactly their "next gen" console, it does show that they are adopting Nintendo's approach.

The point is that it doesn't look like consoles will be pushing the graphics envelope any time soon. Unless NVIDIA decides to leverage its relationship with game developers and its experience with CUDA/physics to build a game console centered around GPGPU, squarely targeted at videophile PC gamers (hi-res graphics with KB/mouse support). Of course, there has been no mention or even rumor of this; just food for thought.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: evolucion8
Originally posted by: ShawnD1
Again, does this person have any idea what the hell they're talking about? The NES was a high-end system at the time. The SNES was a lot more powerful than the Sega Genesis. The N64 was faster than the PS1. All three of these consoles were wildly successful, and they had top-end hardware. The Wii is the one and only time a shitty console has sold the most units.

The SNES was barely more powerful than the Genesis, which supported more point sprites and had more sound channels; the SNES was simply more flexible with some early 3D modeling and had a much bigger color palette. The N64 was much more powerful than the PS1, but in the end the PS1 proved to be the most successful console before the PS2 debuted, and it wasn't the most powerful console, since even the Saturn theoretically had more computing power, which was almost impossible to unleash.
http://www.emulator-zone.com/doc.php/genesis/
http://www.emulator-zone.com/doc.php/snes/

-SNES has 32768 colors, Sega Genesis has 512.
-SNES can have 256 colors on the screen, Sega Genesis can have 64
-SNES has 8 sound channels, Sega Genesis has 6
-SNES can do 128 sprites, Sega Genesis does 80
-SNES had "mode 7" which allowed fast screen rotation for games like F-Zero, but Sega Genesis racing games looked terrible. Even Virtua Racing on my 32X was not as good looking as F-Zero.

The SNES was a hard core system in its day. It launched for $200 in 1991 which is roughly $316 in today's money. I had a Sega Genesis instead because it was only $120 at the time.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Originally posted by: nitromullet
Unless NVIDIA decides to leverage its relationship with game developers and its experience with CUDA/physics to build a game console centered around GPGPU, squarely targeted at videophile PC gamers (hi-res graphics with KB/mouse support). Of course, there has been no mention or even rumor of this; just food for thought.

That would be cool, but then it would just be a PC, only less customizable... sort of like the Xbox is now.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: ShawnD1
I question whether or not the person who wrote the article has any idea what the hell they're talking about.
Nintendo has realized that "good enough" is where the money is
Nintendo actually got it right. Wii all laughed at first but they're the ones who've staged the biggest comeback in gaming history because they figured the market out. They knew that yes HDTVs are awesome and cool, but most households in this generation (2006-2011) will only have one. The kids' rooms will all get the hand me down CRTs, and those are still standard def. So target them, and win the brothers and sisters and parents with interesting game designs that break the mold, following the successful experiment called the DS.
Did he really just imply that you can't tell the difference between a Wii and an Xbox 360 on a 480i television?

No. I think the author was basically saying that, considering the overwhelming number of older SDTV sets, even a system like the Wii, which is basically built for SD resolutions, doesn't look too bad once the video from a PS3 or Xbox 360 is downgraded to SD. The PS3 or Xbox 360 will still look better, of course. Then there is the issue that general consumers, not the "hardcore" or techie crowd, don't care as much about graphics, so long as it doesn't look like sh!t and the game is fun.

but these days barely anyone but the enthusiasts can tell the difference between a DX9 and DX10 screenshot without a side by side comparison let alone DX10 vs. DX10.1, and even then half of them are faking it to make it sound like they know what they're talking about.
This is easily the most retarded statement I've seen all week. The problem is that he's implying all or most graphical improvements are a result of new DirectX (or OpenGL) features. This couldn't be more wrong. Screen and texture resolution is infinitely more important than DirectX version. Fallout 3 set to ultra quality but run at 800x600 will always look like shit, but turning off all of the features including HDR and shadows still looks beautiful if one uses the high resolution textures and runs the game at 1680x1050. When we run games like Quake 2 at 1680x1050, the walls and floor look a bit shitty not because of the OpenGL version it uses, but because the texture resolution is very low and the models have very few polygons. If we can't tell the difference between DirectX 9 and 10 screenshots, it's because both are using the same textures and models.

I think he's arguing that at similar resolutions, and with similar features, DX10 does not make a noticeable improvement in quality vs DX9. I find nothing wrong with that statement. Maybe we just haven't seen a true DX10 game, and I mean one built from the ground up for DX10 only, but games that support both DX9 and DX10 look incredibly similar whether you use DX9 or DX10.

I'm not a programmer, but from reading the DX10 articles it seems DX10 was meant to improve the efficiency of getting the effects you want on screen. The result is that developers can use the time savings to further improve the graphics, or invest it in gameplay, or whatever.
 

s44

Diamond Member
Oct 13, 2006
9,427
16
81
Silly article. The next frontier isn't better *looking* graphics (how much better can it get than retextured Crysis anyway?), it's leveraging GPGPU and the like into better physics, AI, environmental interaction, etc. DX11's Compute Shader is an important step.
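To make the Compute Shader point a bit more concrete, here is a rough C++ sketch of what kicking off a DX11 compute pass looks like from the CPU side (say, a particle physics step). The shader, view, and function names are made up for illustration; only the D3D11 calls themselves are real.

// Dispatch a DX11 compute shader over a particle buffer.
#include <d3d11.h>

void RunParticleStep(ID3D11DeviceContext* ctx,
                     ID3D11ComputeShader* physicsCS,          // compiled compute shader (hypothetical)
                     ID3D11UnorderedAccessView* particleUAV,  // read/write view of the particle buffer (hypothetical)
                     UINT particleCount)
{
    // Bind the shader and give it read/write access to the particles.
    ctx->CSSetShader(physicsCS, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &particleUAV, nullptr);

    // One thread per particle, 256 threads per group; must match the
    // [numthreads(256, 1, 1)] declared in the HLSL.
    const UINT groups = (particleCount + 255) / 256;
    ctx->Dispatch(groups, 1, 1);

    // Unbind the UAV so the graphics pipeline can read the buffer afterwards.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
}

The same mechanism is what would carry physics, AI, and other non-graphics work onto the GPU without going through a separate API like CUDA.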
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: akugami
No. I think the author was basically saying that, considering the overwhelming number of older SDTV sets, even a system like the Wii, which is basically built for SD resolutions, doesn't look too bad once the video from a PS3 or Xbox 360 is downgraded to SD. The PS3 or Xbox 360 will still look better, of course. Then there is the issue that general consumers, not the "hardcore" or techie crowd, don't care as much about graphics, so long as it doesn't look like sh!t and the game is fun.
Having better hardware allows for much farther draw distances and more things on the screen at one time. This isn't something like the PS3 and Wii looking somewhat similar because they're running at the same res and drawing the same things. This is more like the PS3 drawing an explosion at a level of detail that still looks really good on a low-res TV, while the Wii does a half-assed Quake 3 style explosion where three large chunks fly apart. The Wii and the Xbox/PS3 are not even close, even at low res. I use a 480i television with my Xbox (when it works) and it really does look a hell of a lot better than the PS2 and Gamecube/Wii did.

A better analogy of Wii vs Xbox/PS3 would be if you look at the video options in Crysis. Sure you're using the same resolution, but do you want shadows? How many shadows do you want? Do you want the trees to sway in the wind? Should water be transparent like a swimming pool or should it reflect the sun like a real lake? On the Xbox and PS3, you can have those things. On the Wii, you can't. Games like Far Cry Vengeance are not as immersive as Gears of War and Killzone 2 because the whole game looks like the background of a school play.


I think he's arguing that at similar resolutions, and with similar features, DX10 does not make a noticeable improvement in quality vs DX9. I find nothing wrong with that statement. Maybe we just haven't seen a true DX10 game, and I mean one built from the ground up for DX10 only, but games that support both DX9 and DX10 look incredibly similar whether you use DX9 or DX10.
True, but his argument about graphical improvements only shows that he has no idea what he's talking about. DirectX 10 was mostly a change in how things are done (see the quick XInput sketch at the end of this post). According to the Wikipedia page:

(from wikipedia) Many former parts of DirectX API were deprecated in the latest DirectX SDK and will be preserved for compatibility only: DirectInput was deprecated in favor of XInput, DirectSound was deprecated in favor of XACT and lost support for hardware accelerated audio, since Vista audio stack renders sound in software on the CPU
It's a strawman. He tries to get the reader to believe that DirectX 10 was all about graphical improvements, and then he attacks his own delusion.

I'm not a programmer, but from reading the DX10 articles it seems DX10 was meant to improve the efficiency of getting the effects you want on screen. The result is that developers can use the time savings to further improve the graphics, or invest it in gameplay, or whatever.
Exactly.
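Since the Wikipedia quote mentions DirectInput being deprecated in favor of XInput, here is a tiny C++ sketch of the XInput side, purely to make the "how things are done changed" point concrete. It is not from the article, and the function name is my own placeholder; only the XInput call and types are real.

// Poll the first Xbox 360 controller using XInput (DirectInput's replacement).
// Returns true if pad 0 is connected and its A button is currently held.
#include <windows.h>
#include <Xinput.h>
#pragma comment(lib, "xinput.lib")

bool IsAButtonDown()
{
    XINPUT_STATE state = {};
    if (XInputGetState(0, &state) != ERROR_SUCCESS)
        return false;                                   // no controller in slot 0

    // wButtons is a bitmask; check the A button as an example.
    return (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;
}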
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: s44
Silly article. The next frontier isn't better *looking* graphics (how much better can it get than retextured Crysis anyway?), it's leveraging GPGPU and the like into better physics, AI, environmental interaction, etc. DX11's Compute Shader is an important step.

Yeah, there's still one frontier for better-looking graphics I'm waiting for, and that's ray tracing. It's like the icing on the cake for realism, since it makes the shading/lighting look really close to real life (toy sketch of the basic idea at the bottom of this post).

Some awesome ray-traced examples:
http://upload.wikimedia.org/wi...c/Glasses_800_edit.png
http://www.mildlydiverting.com...ng/images/cgi_face.jpg
http://downloads.playwhat.com/files/test/patio.jpg
http://splutterfish.com/images/gallery/images/193.jpg

Little comparison.
http://www.pcmech.com/wp-conte...tracedvsrasterized.jpg
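If anyone wants to see what the technique boils down to, here is a toy C++ sketch of the core of a ray tracer: one ray per pixel, one sphere, one point light, Lambert shading, written out as a PPM image. It is nowhere near what the renderers behind those images do; it's just the basic idea of intersecting a ray with geometry and shading from the exact hit point.

// Toy ray tracer core: one ray per pixel, one sphere, one point light,
// Lambertian shading. Output is written as a plain-text PPM image to stdout.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec a) { double l = std::sqrt(dot(a, a)); return {a.x / l, a.y / l, a.z / l}; }

int main() {
    const int W = 256, H = 256;
    const Vec sphereC = {0, 0, -3};      // sphere center
    const double sphereR = 1.0;          // sphere radius
    const Vec light = {2, 2, 0};         // point light position
    std::printf("P3\n%d %d\n255\n", W, H);

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Ray from the origin through this pixel on a virtual image plane.
            Vec dir = norm({(x - W / 2.0) / W, (H / 2.0 - y) / H, -1.0});
            // Ray-sphere intersection: solve |t*dir - c|^2 = r^2 for t (unit dir).
            Vec oc = sub({0, 0, 0}, sphereC);
            double b = 2.0 * dot(oc, dir);
            double c = dot(oc, oc) - sphereR * sphereR;
            double disc = b * b - 4.0 * c;
            int shade = 0;
            if (disc > 0) {
                double t = (-b - std::sqrt(disc)) / 2.0;
                if (t > 0) {
                    Vec hit = {t * dir.x, t * dir.y, t * dir.z};
                    Vec n = norm(sub(hit, sphereC));   // surface normal at the hit point
                    Vec l = norm(sub(light, hit));     // direction from hit point to light
                    double lambert = dot(n, l);        // cosine falloff
                    if (lambert < 0) lambert = 0;
                    shade = static_cast<int>(255 * lambert);
                }
            }
            std::printf("%d %d %d ", shade, shade, shade);
        }
        std::printf("\n");
    }
    return 0;
}

Real ray tracers add reflection, refraction, shadows, and global illumination on top of this same intersect-and-shade loop, which is why the lighting in those shots looks so natural.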



 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: reallyscrued
This article was written by a twelve year old who had 5 friends that had PS2s but the author decided to get a Gamecube.

The Gamecube was a great console. I have both a PS2 and a Gamecube and I'd pick the Gamecube any day of the week, most notably thanks to its awesome first party lineup. So to me your insult to the author comes off as an insult to yourself.