What is the point of ATI and Nvidia constantly amping up their video card performance? I have a pair of PNY GTX 465s flashed to GTX 470s that I bought in August 2010, and I'm really not seeing why cards like the GTX 580, GTX 590, and the HD 6990 are relevant. I paid around $400 for the pair, and I know the higher-end cards push the frame rate on larger displays. This isn't a troll post or anything like that; it's more that game development seems stuck in 2006 with the vast majority of console ports, and the only game where I've seen my cards even come close to their limits is Battlefield 3, and I'm only running a Phenom X6 @ 3.8 GHz. Granted, I'm using an older monitor that is only 1280x1024, but I still don't see why video cards have to be 20x more powerful than the consoles the vast majority of these games are being ported from.
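For a rough sense of what "larger displays" actually costs, here's a back-of-the-envelope sketch (just raw pixel counts, assuming GPU load scales roughly with pixels per frame and ignoring shader complexity, AA, and everything else) comparing common resolutions against my 1280x1024 panel:

```python
# Back-of-the-envelope: relative pixel-pushing load at common resolutions,
# using my 1280x1024 monitor as the baseline. Illustrative only.
resolutions = {
    "1280x1024 (my monitor)": (1280, 1024),
    "1920x1080": (1920, 1080),
    "2560x1600": (2560, 1600),
}

base = 1280 * 1024  # pixels per frame on my display

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame, {pixels / base:.2f}x my panel")
```

Even a 2560x1600 monitor only works out to roughly 3x the pixels I'm driving, which is part of why I'm skeptical that the newest cards buy most people much.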
Let's put this into perspective. I'm actually kind of looking forward to the Wii U, but even then we're only going to see roughly a tripling of the fill rate, since I believe it is going to be using something like a Radeon HD 4770 as its GPU, a midrange DX10.1 part that came out in 2009. I'll also go as far as to say that the upcoming consoles from Sony and Microsoft are not going to be all that much faster than the Wii U, since they aren't going to put 500+ watt power supplies into a console. Remember, when the Xbox 360 came out, video cards weren't pushing 100 watts with the exception of a few GPUs; today we have cards that easily go over 300 watts, and I seriously doubt people want that much heat in their living room.
I'm not downing people with high-end rigs with this post, but I want to know why people will spend $1000 on a set of video cards for games that won't utilize them for a good 3-4 years. I know there are applications that use GPU power for things like Bitcoin mining, but that seems more like an excuse for spending that much on a set of GPUs than the real reason you buy them, which is gaming, and for me the fewer than 10 games a year that actually push these cards to their limits don't make them seem like a good purchase. I mean, Skyrim was a DX9c game for crying out loud, and Batman is the only other DX11 game I can point to besides BF3.