Originally posted by: Phynaz
Dude, the article was linked, you don't need to post it in its entirety.
From the article:
Nintendo has realized that "good enough" is where the money is. Nintendo actually got it right. Wii all laughed at first, but they're the ones who've staged the biggest comeback in gaming history because they figured the market out. They knew that, yes, HDTVs are awesome and cool, but most households in this generation (2006-2011) will only have one. The kids' rooms will all get the hand-me-down CRTs, and those are still standard def. So target them, and win over the brothers and sisters and parents with interesting game designs that break the mold, following the successful experiment called the DS.
Did he really just imply that you can't tell the difference between a Wii and an Xbox 360 on a 480i television?
From the article:
People criticize the Wii as being "last-generation", since it does not even pretend to be anything other than the overclocked GameCube Slim with a weird controller that it is. But anyone who believes this means Nintendo "has lost it" or "no longer cares about real gamers" needs to take a look at just how many centuries Nintendo has been in business and what their profit/loss record looks like.
Again, does this person have any idea what the hell they're talking about? The NES was a high-end system at the time. The SNES was a lot more powerful than the Sega Genesis. The N64 was faster than the PS1. All three of these consoles were wildly successful, and they had top-end hardware. The Wii is the one and only time a shitty console has sold the most units.
From the article:
Nobody in their right mind actually believes a dedicated chip with billions of transistors to render 3D graphics is actually useful to the majority of business users. There's only one reason to have it, and the fact that such things have integrated themselves into video cards, then motherboards, and soon CPUs only shows what we've known for ages is really the truth. Everyone Plays Games On The Computer. Everyone. Or if they don't, they want to.
This is just flat out wrong. As with CPUs, the bulk of graphics money comes from workstation computers. While we're here on the forum bitching about which $200 gaming card to get, people using SolidWorks are thinking about buying $2,300 graphics cards.
From the article:
...but these days barely anyone but the enthusiasts can tell the difference between a DX9 and a DX10 screenshot without a side-by-side comparison, let alone DX10 vs. DX10.1, and even then half of them are faking it to make it sound like they know what they're talking about.
This is easily the most retarded statement I've seen all week. The problem is that he's implying all or most graphical improvements are a result of new DirectX (or OpenGL) features. This couldn't be more wrong. Screen and texture resolution is infinitely more important than DirectX version. Fallout 3 set to ultra quality but run at 800x600 will always look like shit, while turning off all of the features, including HDR and shadows, still looks beautiful if you use the high-resolution textures and run the game at 1680x1050. When we run a game like Quake 2 at 1680x1050, the walls and floor look a bit shitty not because of the OpenGL version it uses, but because the texture resolution is very low and the models have very few polygons. If we can't tell the difference between DirectX 9 and DirectX 10 screenshots, it's because both are using the same textures and models.
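For concreteness, here is a rough back-of-the-envelope sketch (in C++) of the numbers behind that resolution argument. The texture sizes below are arbitrary illustrative picks, not figures pulled from any particular game:

#include <cstdio>

int main() {
    // Pixels shaded per frame at the two screen resolutions mentioned above.
    const long lowRes  = 800L * 600;    //   480,000 pixels
    const long highRes = 1680L * 1050;  // 1,764,000 pixels

    // Memory for one uncompressed 32-bit RGBA texture at a small and a large size.
    const long smallTex = 256L * 256 * 4;    //     262,144 bytes (256 KiB)
    const long largeTex = 2048L * 2048 * 4;  //  16,777,216 bytes (16 MiB)

    printf("screen: %ld vs %ld pixels (%.1fx more)\n",
           lowRes, highRes, (double)highRes / lowRes);
    printf("texture: %ld vs %ld bytes (%.0fx more)\n",
           smallTex, largeTex, (double)largeTex / smallTex);
    return 0;
}

Either way you slice it, resolution and asset size swing the amount of visual data by roughly 4x to 64x, while an API bump on the same assets changes none of that.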
From the article:
Long gone are the days when a new DirectX release (which could happen several times on the same OS, even without a Service Pack!) prompted a flurry of downloads and new graphics card purchases.
When has this ever been true? I remember people often wanting to stay with the old version because it always ran faster. We see the exact same mentality right now with Nvidia's PhysX and ambient occlusion. Ambient occlusion probably does look awesome, but I'm not going to turn it on because I know it will destroy my frame rate. When Half-Life 2 came out, lots of people forced the game to run in DirectX 8 mode in order to get better frame rates.
From the article:
Nobody is making drooling fanboy articles about the laundry list of DX11 features, and posting screenshots of early-build games for people to debate the authenticity of. Because nobody cares anymore. And when DX11.1 and DX12 arrive, they still won't.
Indeed. DirectX 11's GPGPU features are a big enough threat to CPU demand that Intel feels the need to introduce something like Larrabee. Of course, that's just because Intel randomly does things; it has nothing to do with DirectX 11.
Originally posted by: reallyscrued
This article was written by a twelve-year-old who had five friends with PS2s but decided to get a GameCube himself.
Originally posted by: reallyscrued
I think they were the only console maker in the last two generations to freaking post profits.
Originally posted by: ShawnD1
This is just flat out wrong. Similar to CPUs, the bulk of graphics money comes from workstation computers. [...]
I think he has a point here. The majority of businesses use computers for documents, spreadsheets, records, Internet browsing, mail, and so on (animation, rendering, and the like aren't the majority of businesses out there). While some businesses and users do use the GPU for other purposes, the majority of users who have a graphics card use it to play games.
Originally posted by: ShawnD1
This is easily the most retarded statement I've seen all week. The problem is that he's implying all or most graphical improvements are a result of new DirectX (or OpenGL) features. [...]
I think he is basing his opinion on the current implementation of DX10. Examples would be this RE5 DX9 vs. DX10 comparison and this Crysis DX9 vs. DX10 comparison. As can be seen, the changes aren't really that noticeable in still images, much less while moving in-game.
Originally posted by: ShawnD1
Again, does this person have any idea what the hell they're talking about? The NES was a high-end system at the time. The SNES was a lot more powerful than the Sega Genesis. The N64 was faster than the PS1. [...]
Originally posted by: Dribble
It is true that graphics are tied to the consoles much more now; that is why everything is still basically DX9c. But to say that's all we need graphically, and hence that DX11 is pointless, is a lie. I'm sure the next-gen consoles supporting DX12 or whatever will blow the last generation away. Unfortunately, while the PC is just getting ports of 360 games, it may well be that the PC market won't see the full capabilities of anything much beyond DX9c until the Xbox 720 comes out.
Originally posted by: evolucion8
Originally posted by: ShawnD1
Again, does this person have any idea what the hell they're talking about? The NES was a high-end system at the time. The SNES was a lot more powerful than the Sega Genesis. The N64 was faster than the PS1. All three of these consoles were wildly successful, and they had top-end hardware. The Wii is the one and only time a shitty console has sold the most units.
The SNES was barely more powerful than the Genesis, which supported more sprites and had more sound channels; the SNES was simply more flexible for some early 3D effects and had a much bigger color palette. The N64 was much more powerful than the PS1, but in the end the PS1 proved to be the most successful console before the PS2 debuted, and it wasn't the most powerful console either, since even the Saturn theoretically had more computing power, which was almost impossible to unleash.
http://www.emulator-zone.com/doc.php/genesis/
Originally posted by: nitromullet
Unless NVIDIA decides to leverage its relationship with game developers and its experience with CUDA/physics to build a game console centered around GPGPU, squarely targeted at videophile PC gamers (high-res graphics with keyboard/mouse support). Of course, there has been no mention or even a rumor of this; it's just food for thought.
Originally posted by: ShawnD1
I question whether or not the person who wrote the article has any idea what the hell they're talking about.
From the article:
Nintendo has realized that "good enough" is where the money is [...]
Did he really just imply that you can't tell the difference between a Wii and an Xbox 360 on a 480i television? [...]
This is easily the most retarded statement I've seen all week. The problem is that he's implying all or most graphical improvements are a result of new DirectX (or OpenGL) features. [...]
Originally posted by: akugami
No. I think the author was basically saying that, considering the overwhelming number of older SDTV sets, even a system like the Wii, which is basically built for SD resolutions, doesn't look too bad once the video from a PS3 or Xbox 360 is downgraded. The PS3 or Xbox 360 will still look better, of course. Then there is the issue that general consumers, not the "hardcore" or techie crowd, don't care as much about graphics so long as it doesn't look like sh!t and the game is fun.
Having better hardware allows for much farther draw distances and having more things on the screen at one time. This isn't something like the PS3 and Wii looking somewhat similar because it's the same resolution and they're drawing the same things. It's more like the PS3 will draw an explosion to a certain level of detail that still looks really good on a low-res TV, while the Wii will do a half-assed Quake 3 style explosion where three large chunks fly apart. The Wii and the Xbox/PS3 are not even close, even when it's low res. I use a 480i television with my Xbox (when it works) and it really does look a hell of a lot better than the PS2 and GameCube/Wii did.
Originally posted by: akugami
I think he's arguing that at similar resolutions, and with similar features, DX10 does not make a noticeable improvement in quality vs. DX9. I find nothing wrong with that statement. Maybe we just haven't seen a true DX10 game, and I mean one built from the ground up for DX10 only, but games that support both DX9 and DX10 look incredibly similar whether you use DX9 or DX10.
True, but his arguing about graphical improvements only shows that he has no idea what he's talking about. DirectX 10 was mostly a change in how things are done. According to the Wikipedia page:
(from Wikipedia) Many former parts of the DirectX API were deprecated in the latest DirectX SDK and will be preserved for compatibility only: DirectInput was deprecated in favor of XInput, DirectSound was deprecated in favor of XACT and lost support for hardware-accelerated audio, since the Vista audio stack renders sound in software on the CPU.
It's a strawman. He tries to get the reader to believe that DirectX 10 was all about graphical improvements, then he attacks his own delusion.
Originally posted by: akugami
I'm not a programmer, but from reading the DX10 articles, it seems DX10 was meant to improve efficiency in getting the effects you wanted. This saves time in getting the effects you want on screen. The result is you can use the time savings to further improve the graphics or invest it in gameplay or whatever.
Exactly.
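Since the DirectInput-to-XInput hand-off keeps coming up, here is a minimal sketch of the newer input path: polling an Xbox 360 pad through XInput on Windows. This is just the standard usage pattern for illustration, not code from the article or from anyone in this thread.

// Minimal XInput polling loop, the API that replaced DirectInput for gamepads.
// Windows only; link against xinput.lib. Reads controller slot 0 ten times a second.
#include <windows.h>
#include <xinput.h>
#include <cstdio>
#pragma comment(lib, "xinput.lib")

int main() {
    for (;;) {
        XINPUT_STATE state;
        ZeroMemory(&state, sizeof(state));

        // XInputGetState returns ERROR_SUCCESS while a controller is connected.
        if (XInputGetState(0, &state) == ERROR_SUCCESS) {
            const bool aDown = (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;
            printf("A button: %s  left stick X: %d\n",
                   aDown ? "down" : "up", state.Gamepad.sThumbLX);
        } else {
            printf("no controller in slot 0\n");
        }
        Sleep(100);
    }
}

XInput only covers Xbox 360-style pads; DirectInput still exists for older devices, which is why the SDK keeps it around "for compatibility only," as the Wikipedia excerpt above puts it.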
Originally posted by: s44
Silly article. The next frontier isn't better *looking* graphics (how much better can it get than retextured Crysis anyway?), it's leveraging GPGPU and the like into better physics, AI, environmental interaction, etc. DX11's Compute Shader is an important step.
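To make "Compute Shader" a bit more concrete, here is a small sketch (my own illustration, not anything from the article or the thread) of a DX11 compute shader that nudges particle positions on the GPU, plus the host-side D3D11 calls that compile and dispatch it. Error handling and buffer initialization are stripped to the bare minimum.

// Sketch: a DirectX 11 compute shader doing a tiny bit of "physics" on the GPU.
// Build as C++ on Windows, linking d3d11.lib and d3dcompiler.lib.
#include <windows.h>
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// HLSL source: one thread per particle, simple Euler integration step.
static const char* kParticleCS = R"(
struct Particle { float3 pos; float3 vel; };
RWStructuredBuffer<Particle> particles : register(u0);

[numthreads(256, 1, 1)]
void main(uint3 id : SV_DispatchThreadID)
{
    Particle p = particles[id.x];
    p.vel.y -= 9.8f * 0.016f;   // gravity over one 16 ms step
    p.pos   += p.vel * 0.016f;  // integrate position
    particles[id.x] = p;
}
)";

struct ParticleCPU { float pos[3]; float vel[3]; }; // matches the 24-byte HLSL struct

int main() {
    const UINT kCount = 1024;

    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &ctx)))
        return 1;

    // Compile the HLSL above for compute shader model 5.0 and create the shader.
    ID3DBlob* bytecode = nullptr;
    if (FAILED(D3DCompile(kParticleCS, strlen(kParticleCS), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &bytecode, nullptr)))
        return 1;
    ID3D11ComputeShader* cs = nullptr;
    device->CreateComputeShader(bytecode->GetBufferPointer(), bytecode->GetBufferSize(),
                                nullptr, &cs);

    // A structured buffer of particles: writable by the compute shader as a UAV,
    // readable later by rendering shaders as an SRV.
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = sizeof(ParticleCPU) * kCount;
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS | D3D11_BIND_SHADER_RESOURCE;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(ParticleCPU);
    ID3D11Buffer* buf = nullptr;
    device->CreateBuffer(&bd, nullptr, &buf); // contents left uninitialized in this sketch

    D3D11_UNORDERED_ACCESS_VIEW_DESC ud = {};
    ud.Format = DXGI_FORMAT_UNKNOWN;
    ud.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
    ud.Buffer.NumElements = kCount;
    ID3D11UnorderedAccessView* uav = nullptr;
    device->CreateUnorderedAccessView(buf, &ud, &uav);

    // Bind and run: four groups of 256 threads cover 1,024 particles.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(kCount / 256, 1, 1);

    printf("dispatched %u particles on the GPU\n", kCount);
    return 0;
}

The results stay in GPU memory, so a renderer can read the same buffer through a shader resource view without a round trip to the CPU, which is exactly the appeal for physics and similar per-entity work.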
Originally posted by: reallyscrued
This article was written by a twelve-year-old who had five friends with PS2s but decided to get a GameCube himself.
Originally posted by: Kakkoii
Little comparison.
http://www.pcmech.com/wp-conte...tracedvsrasterized.jpg