Not seeing the point of bleeding edge currently

jmarti445

Senior member
Dec 16, 2003
299
0
71
What is the point of ATI and Nvidia constantly amping up their video card performance? I have a pair of PNY GTX 465s flashed to GTX 470s that I bought in August 2010, and I'm really not seeing why cards like the GTX 580, GTX 590, and HD 6990 are relevant. I paid around $400 for the pair, and I know the higher-end cards push the frame rate on larger displays. This isn't a troll post or anything like that; it's more that game development seems stuck in 2006 with the vast majority of console ports, and the only game where I have seen my cards even come close to their limits is Battlefield 3, and I'm only using a Phenom X6 @ 3.8 GHz. Granted, I'm using an older monitor that is only 1280x1024, but I still don't see why video cards have to be 20x more powerful than the consoles the vast majority of these games are being ported from.

Let's put this into perspective. I'm actually kind of looking forward to the Wii U that is coming out, but even then we are only going to get about a tripling of the fill rate, since I believe it is going to be using a 4770 as its GPU, a midrange DX10.1 part that came out in 2009. I'm also going to go as far as to say that the upcoming consoles from Sony and Microsoft are not going to be all that much faster than the Wii U, since they aren't going to put 500+ watt power supplies into a console. Remember one thing: when the Xbox 360 came out, video cards were not pushing 100 watts with the exception of a few GPUs; today we have cards that easily go over 300 watts, and I seriously doubt people are going to want that much heat in their living room.
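
Rough math behind that tripling claim, in Python (ROP counts and clocks are from memory, so treat these as ballpark figures):

    # Back-of-the-envelope pixel fill rate: ROPs x core clock (approximate figures).
    xenos_fill = 8 * 500e6       # Xbox 360 Xenos: 8 ROPs @ 500 MHz -> ~4.0 Gpix/s
    hd4770_fill = 16 * 750e6     # Radeon HD 4770: 16 ROPs @ 750 MHz -> ~12.0 Gpix/s
    print(hd4770_fill / xenos_fill)  # ~3.0, i.e. roughly a tripling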

I'm not downing people with high-end rigs with this post, but I want to know why people will spend $1000 on a set of video cards for games that will not utilize them for a good 3-4 years. I know there are applications that use the power of the GPU for things like bitcoin mining, but that still seems like more of an excuse for spending that much on a set of GPUs versus what you actually want to be doing with them, which is gaming, and for me the fewer than 10 games a year that push the cards to their limits don't seem like a good purchase. I mean, Skyrim was a DX9c game, for crying out loud, and Batman is the only other game besides BF3 that I see to be DX11.
 

arredondo

Senior member
Sep 17, 2004
841
37
91
PC game makers are holding back the full release of titles powered by next-gen engines until the new super HD consoles come out (lol no, the Wii HD won't count). Check back this time next year to see glimpses of stuff on the horizon that'll visually impress.
 

jmarti445

Senior member
Dec 16, 2003
299
0
71
I sure hope so, dude, but I'm still skeptical, especially since they are talking about ARM cores replacing PowerPC for the CPU. Granted, I don't think the PowerPC processors in the current consoles are all that powerful, since they are in-order designs and frankly were not really all that potent when they came out in the first place, and I'm assuming that if they put about 30 or so cores into the consoles, performance will really start to take off (think about it: a 1 GHz ARM CPU uses about 1.5 watts, and if they can get it to scale to around 3 GHz and maybe shrink the packaging, it may have teeth). They say the 8-core X-Gene processor at 3 GHz has about half the performance of a dual-core Sandy Bridge @ 2.4 GHz. That really isn't bad at all, considering it is going to use about 2 watts per core.
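
A minimal sketch of that perf-per-watt comparison (the 35 W TDP for the dual-core Sandy Bridge is my assumption; the rest uses the figures above):

    # Rough performance-per-watt comparison; all numbers approximate.
    xgene_perf = 0.5        # claimed: half the throughput of a 2.4 GHz dual-core Sandy Bridge
    xgene_power = 8 * 2     # 8 cores @ ~2 W each = ~16 W total
    snb_perf = 1.0
    snb_power = 35          # assumed TDP for a dual-core mobile Sandy Bridge
    print(xgene_perf / xgene_power, snb_perf / snb_power)  # ~0.031 vs ~0.029 perf/W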
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
You run two 470s in SLI for 1280x1024???

Look, buy yourself a good 30" 2560x1600 LCD and then you might get it. The experience is well worth it. There are plenty of games that will make your cards crawl at that resolution. You could also try 3D with your setup.

You're stuck back in time with your 1280x1024 display. Do you know that's a resolution fit for tablets, netbooks, and 10-inch displays? Devices that run on less than 2 watts of power.

As resolution goes up, so does the need for GPU power. You're missing the clarity and crisp, fantastic image quality by using such old technology, and missing a huge part of what PC gaming offers. It's no wonder you're disappointed. With your display, your cards are overkill. Huge overkill.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
1280x1024? Good grief. Not only is that a low res, it's the worst possible aspect ratio ever made. You are missing out on a third of the field of view in modern games. That aspect ratio is so claustrophobic it makes me feel a bit sick to my stomach, especially in a game like Dead Space.

Buy a normal-size monitor like even grandmas use today and you can tax your GPUs more realistically. At 1920x1080 and above on max settings, there are some games that need all the GPU power you have.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
He does have a point regardless of his monitor size. I've purchased three RPGs this fall (Fable III, Two Worlds II, and Skyrim) and I can see the ugly truth of console-oriented dumbing down in every one of them. My two-year-old midrange CPU and one-year-old midrange GPU can max the in-game settings on all three. Makes my hardware dollar go farther, I suppose.
 

amenx

Diamond Member
Dec 17, 2004
4,428
2,751
136
"Granted I'm using an older technology monitor that is only 1280X1024"

You've answered your own question. 465 SLI is a waste of GPU power for that itty-bitty res. Dude, it's 2011. Even 1920x1080 is beginning to look small these days. The biggest impact on anyone's gaming experience, IMO, is first and foremost SCREEN SIZE. Every jump in size I've made since I owned a 1280x1024 monitor many years ago has blown me away more than any single GPU has. Get a new monitor NOW! They are not that expensive anymore.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Get a new monitor and get back to us. Saying you have SLI 470s and game at 1280x1024 makes you look... well, less than smart, to put it nicely.
 

jmarti445

Senior member
Dec 16, 2003
299
0
71
If I had the money I'd get a bigger display; unless AnandTech wants to start a collection for me, I think I need to wait.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You could have bought a 1920x1080 monitor for less than the cost of one of your cards. I would take a single GTX 470 at 1920x1080 with some reduced settings over GTX 470 SLI on an awful 1280x1024 monitor any day.
 

jmarti445

Senior member
Dec 16, 2003
299
0
71
Sigh... dude, they are flash-modded GTX 465 cards, and if I were going to upgrade my monitor, I'd get an S-IPS monitor over another TN display for the image quality and viewing angles. They cost a lot more than a cheap panel. Point me to a good deal on a monitor and I may buy it.

Would it make things better if I told you that I have a 1920x1080 display in the form of a 46-inch Sony PVA-based TV? (Not great for gaming, but great for watching movies; the black levels are amazing.)
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I started reading the OP with an open mind and finished it with a very skewed perception, very much like the experience you are having. You have two mid-high-end cards in SLI for what? To keep warm? The tables will turn rapidly as you up the resolution. Your cards would be better suited to a 4.1-megapixel display than the 1.3-megapixel display you are using.

Your visual experience will greatly improve with a couple million more pixels.
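
For anyone checking, here's the quick math behind those megapixel figures:

    # Pixel counts for the two resolutions being compared.
    current = 1280 * 1024   # ~1.31 megapixels
    target = 2560 * 1600    # ~4.10 megapixels (typical 30" panel)
    print(current / 1e6, target / 1e6, target / current)  # 1.31, 4.10, ~3.1x the pixels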
 
Mar 10, 2006
11,715
2,012
126
Yeah man, 1280x1024 for those cards is... well, it's a joke. Get a nice, cheap 1080p monitor (see my sig if you want a recommendation), crank up the AA/AF, and get a fantastic gaming experience that will tax your cards hardcore.

I had 2x 470s before I sold them to pay for this single GTX 580, and believe me, I could not go back to 1x 470 after having 2x 470s or a GTX 580 -- the performance just isn't there for the settings I play at.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't understand spending $400 on GPUs just to drive a 1280x1024 display.
 

Ika

Lifer
Mar 22, 2006
14,264
3
81
Have you looked at e-IPS screens? A good deal cheaper and most of the perks of IPS, without the flaws of TN. I've used one as my main monitor for a year now, after using a Samsung TN for three, and the difference is night and day.
 

Vic Vega

Diamond Member
Sep 24, 2010
4,535
4
0
Part of me agrees with the OP. I have two 5850s and I can play any game I want maxed out at 1920x1200. Chances are this will be largely unchanged this time next year. The need to drop a grand on video hardware just isn't there for me.

I also know that technology must progress. Today's high end is tomorrow's midrange. :)
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Part of me agrees with the OP. I have two 5850s and I can play any game I want maxed out at 1920x1200. Chances are this will be largely unchanged this time next year. The need to drop a grand on video hardware just isn't there for me.

I also know that technology must progress. Today's high end is tomorrow's midrange. :)

Not Metro 2033.
 

repoman0

Diamond Member
Jun 17, 2010
5,191
4,571
136
Have you looked at e-IPS screens? A good deal cheaper and most of the perks of IPS, without the flaws of TN. I've used one as my main monitor for a year now, after using a Samsung TN for three, and the difference is night and day.

+1. I bought a 22" Dell 2209WA a year and a half ago. It's only 1680x1050, but I just put the taskbar on the left with auto-hide and there's tons of space. These are well under $200 if you want a quick and cheap upgrade, OP.

That said, my $120 HD5770 that I bought a year and a half ago still works great for all my games at this resolution, so you still have a point regardless of monitor.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
The OP has a point for the most part - but only if the following is true:


  • One doesn't use SGSSAA
  • One doesn't use downsampling
  • One doesn't use Eyefinity/NV Surround
  • One doesn't use 3DVision
  • One doesn't need constant 60fps
There definitely is a point to high-end cards; you just have to find the scenario. I play Skyrim at 1280x1024 with 8xSGSSAA, and at times my GTX 580 SLI cannot provide more than 30 fps. Now imagine what would happen if I had a 1080p monitor...
 

Puppies04

Diamond Member
Apr 25, 2011
5,909
17
76
The OP has a point for the most part - but only if the following is true:


  • One doesn't use SGSSAA
  • One doesn't use downsampling
  • One doesn't use Eyefinity/NV Surround
  • One doesn't use 3DVision
  • One doesn't need constant 60fps
There definitely is a point to high-end cards; you just have to find the scenario. I play Skyrim at 1280x1024 with 8xSGSSAA, and at times my GTX 580 SLI cannot provide more than 30 fps. Now imagine what would happen if I had a 1080p monitor...

There are quite a few people complaining about GTX 580s not working properly with Skyrim, dipping down to 40% usage while only delivering 40 fps (driver issue). If you truly believe Skyrim is actually maxing twin GTX 580s at 1280x1024, then I don't even know what to say, lmao.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
There are quite a few people complaining about GTX 580s not working properly with Skyrim, dipping down to 40% usage while only delivering 40 fps (driver issue). If you truly believe Skyrim is actually maxing twin GTX 580s at 1280x1024, then I don't even know what to say, lmao.

But I know: read my post properly. I use 8xSGSSAA, which gives a performance drop of approximately 75% compared to no AA. Skyrim can be highly CPU-limited (obviously not with my settings, but with plain MSAA it surely can), so there is no driver issue at all. My GPU usage is 95%+ on both cards almost all the time when I don't have vsync active. I have a 2600K @ 4.3 GHz.
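
To put rough numbers on why supersampling hurts so much (a sketch only; the measured drop is smaller than the raw sample count suggests, since not every part of the frame scales with samples):

    # 8xSGSSAA shades every pixel 8 times, so estimate the effective shading load.
    base = 1280 * 1024                # ~1.31 MP on screen
    effective = base * 8              # ~10.5 MP of shading work per frame
    print(effective / (1920 * 1080))  # ~5.1x the shading work of plain 1080p with no AA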
 

Puppies04

Diamond Member
Apr 25, 2011
5,909
17
76
So if you can't get over 30 fps (let's say you are averaging 25 fps) with those settings, then taking into account your figure of a 75% drop, you are saying a GTX 580 SLI setup would only produce a 100 fps average in this game at ultra without AA, and those aren't the sort of figures I'm seeing thrown around. I'm trying to find it now, but I've seen a guy with a single 580 pinned at a 60 fps minimum running 50% more pixels than your screen has. Something doesn't add up...
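
For what it's worth, the arithmetic in question, using only figures taken from the posts above:

    # Sanity check of the numbers being debated (inputs from the thread).
    fps_with_ssaa = 25                           # assumed average, per the post above
    drop = 0.75                                  # boxleitnerb's claimed 8xSGSSAA hit
    implied_no_aa = fps_with_ssaa / (1 - drop)   # = 100 fps implied without AA
    pixel_ratio = (1920 * 1080) / (1280 * 1024)  # ~1.58, i.e. ~58% more pixels
    print(implied_no_aa, pixel_ratio)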