
For a GTX 275 owner, a GTX 460 or 470 is *not* attractive

A 470 is significantly faster than a 275, especially for games that have come out in the last two years.
 
I found out quickly that buying a $400/500 graphics card and keeping it for 3 years actually costs more than upgrading more often at $200 price levels and you get better performance too! Just something to think about next time. :awe:
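For anyone who wants to sanity-check that claim, here's a minimal back-of-the-envelope sketch; every price and resale value in it is hypothetical, purely to illustrate the arithmetic:

```python
# Hypothetical 3-year total cost of ownership: one $500 card kept 3 years
# vs. a $200 card replaced after ~18 months. All numbers are illustrative.

def tco(purchases, resales):
    """Total cost = everything spent minus everything recovered by selling."""
    return sum(purchases) - sum(resales)

# Strategy A: buy high-end once, sell it after 3 years for an assumed $75.
high_end = tco(purchases=[500], resales=[75])

# Strategy B: buy mid-range twice, selling each card for an assumed $80.
mid_range = tco(purchases=[200, 200], resales=[80, 80])

print(f"high-end once:   ${high_end}")    # -> $425
print(f"mid-range twice: ${mid_range}")   # -> $240
```

Under these assumed resale values the frequent-upgrade strategy is cheaper, and for the second half of the period you're also on a newer card.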
I agree in general, but in July '08 it was time to replace my April '06 socket 939 system, including the graphics card. At the time the 4870 offered good price/performance at $300, unlike this year's 5870 overpriced at $400.
 
In the end it's not so bad, right? You got a 4870 which lasted 2 years @ $300, while others blow $1800 on an iPhone + data plan. Gaming is a rather cheap hobby.
 
To the OP, right on for realizing this generation of video cards is crap if you already have a decent last gen card.

If you like throwing your money away, bust a move; otherwise, wait for better tech, as this roundup is not only disappointing but overpriced.

+1
 
This current generation is unattractive and strange. Seeing SKUs like the HD 5830 being outperformed by the HD 4890, or the GTX 275 able to match even the GTX 465, shows three things:

1) Current architectures from both GPU vendors are slower on a per-clock basis than previous generations.

2) Games can't challenge current GPUs enough to make a difference, thanks to console ports and heavy CPU dependency.

3) Lack of driver optimizations.

I can tell that the first statement is quite true. Good examples: the HD 5770 usually trails slightly behind the HD 4870; the HD 4550 outperforms the HD 5450 despite the latter's 50MHz core clock advantage; the HD 5830 has more stream processors than the HD 4890 yet barely outperforms it in some scenarios; and the HD 4870X2 tends to be slightly faster than the HD 5870 across scenarios, especially in CPU-bound situations (above 100fps), even considering the CrossFire overhead and that no multi-GPU configuration scales 100% all the time.
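As a rough sanity check on that per-clock point, here is a small sketch comparing theoretical HD 5770 and HD 4870 throughput; the shader counts, clocks, and bandwidth figures below are the commonly cited reference specs, taken from memory, so treat them as assumptions:

```python
# Theoretical single-precision throughput: shaders * 2 ops (MAD) * clock.
# Specs are the usual reference values (assumed, not measured here).
cards = {
    "HD 4870": {"shaders": 800, "clock_mhz": 750, "bandwidth_gbs": 115.2},
    "HD 5770": {"shaders": 800, "clock_mhz": 850, "bandwidth_gbs": 76.8},
}

for name, c in cards.items():
    gflops = c["shaders"] * 2 * c["clock_mhz"] / 1000
    print(f"{name}: {gflops:.0f} GFLOPS, {c['bandwidth_gbs']} GB/s")

# HD 4870: 1200 GFLOPS, 115.2 GB/s
# HD 5770: 1360 GFLOPS,  76.8 GB/s
# The 5770 has ~13% more raw shader throughput yet usually trails the 4870,
# consistent with the per-clock argument -- though its 128-bit bus (vs. the
# 4870's 256-bit) means memory bandwidth explains much of the gap too.
```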

The second statement is quite simple to explain: lots of games are direct console ports with graphics that barely challenge even a GTS 250, and given the multi-core nature of current consoles, they lean more on CPU power than GPU power. Add to that the widespread use of Unreal Engine 3, which can run great even on an HD 4550.

The third statement is trickier. During the entire life of the HD 5x00 series, aside from CrossFire support/scaling improvements, no significant boost in performance has been seen across games, and the GTX 4x0 series is experiencing the same thing; the situations where the GTX 4x0 performed well below expectations were more related to driver bugs than anything else. The HD 5x00 series has had solid drivers since the beginning (considering it's based on the HD 4x00, which is based on the HD 3x00, etc.); it only had to improve in the situations where it was needed and in the CrossFire profiles. The GTX 4x0 series is still very new and hopefully will see improvements over its life.

So, summing everything up, the OP is right: for a GTX 275 owner, heck, even a GTX 260+ owner, going to a GTX 470 isn't attractive, and the same goes for HD 4870/HD 4890 users. And that's not counting multi-GPU users of the cards mentioned above, who don't have a performance-upgrade option unless a lot of money is invested (like HD 5870s in CrossFire or GTX 480s in SLI).
 
Last edited:
Evo8, the reason a 4870X2 would outperform a 5870 is that at heavy AA and high resolutions, you are looking at 230 GB/sec of bandwidth (+50%) over the 5870's 154 GB/sec.
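A quick sketch of where those bandwidth figures come from, using the commonly quoted memory specs (assumed from memory, not measured):

```python
# Memory bandwidth = effective transfer rate (GT/s) * bus width (bits) / 8.
def bandwidth_gbs(transfer_gts, bus_bits):
    return transfer_gts * bus_bits / 8

hd4870 = bandwidth_gbs(3.6, 256)   # 115.2 GB/s per GPU
hd4870x2 = 2 * hd4870              # 230.4 GB/s aggregate (each GPU only sees
                                   # its own pool, but with AFR both are busy)
hd5870 = bandwidth_gbs(4.8, 256)   # 153.6 GB/s

print(hd4870x2, hd5870, hd4870x2 / hd5870)  # 230.4 153.6 1.5 -> +50%
```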

I think point #2 is the most valid. We know the 7900GTX was much faster than the 8600GTS when the 8600 cards were released.
Anandtech: the 8600GTS barely keeps up with a 7900GS, while the 8600GT loses almost every time: http://www.anandtech.com/show/2218/3

What about now?

8600 GTS > 7900GTX in Splinter Cell Conviction by 50%
http://www.gamespot.com/features/6261472/p-4.html

8600 GTS > 7900 GS in COD:MW2 by 78%
http://www.gamespot.com/features/6200616/p-5.html

8600 GTS > 7900 GS in Far Cry 2 by 61%
http://www.gamespot.com/features/6200616/p-5.html

8600 GT > 7900 GS in Crysis by 76%
http://www.gamespot.com/features/6182806/p-5.html

So basically, what was once a joke of a card (especially the 8600GT) is now far faster than any 79xx-series card ever made!
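For reference, percentages like those above are just relative average framerates; a tiny helper shows the arithmetic (the fps inputs here are made up purely for illustration):

```python
# Percent advantage of one card over another from average fps.
def percent_faster(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1) * 100

# Hypothetical example: 27 fps vs 18 fps -> "faster by 50%".
print(f"{percent_faster(27.0, 18.0):.0f}%")  # -> 50%
```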
 
Last edited:
well the 8600gt/gts release drivers were garbage and really put the 8600gt in a bad light when it first came out. every time I get into a debate about the 8600gt people always link to those initial reviews, which were so bad. being a low-end card, no sites really bothered to go back and rerun benchmarks on the next couple of driver revisions. for instance, at release the 8600gt was no better than a 7600gt in some games such as FEAR, but the next two driver revisions nearly doubled the framerate. heck, I even eventually ran Crysis on DX9 medium (water on high) at 1440x900 with a decent framerate.
 
Last edited:
I think last generation --> this generation is similar to 6800GT --> 7800GTX (so about a 50% performance increase), instead of the 70-100% we used to get (say 9800XT --> X800XT). Unless some seriously intensive games come out, I just don't see us getting 70-100% performance increases. Add to this the problems of moving to 28nm and it's going to be even harder to squeeze more brute force out of a die. It's going to be all about efficiency and performance per clock in the next 1-2 years. I just can't see how NV is going to release a GTX580 that's 70-100% faster than the 480 part...in the next 12 months.
 
I bought a GTX 295 right before I came to Afghanistan. Is it still going to be able to hang @ 1920x1200 with the games that are out now? It's paired with a dinky E6600...
 
I bought a GTX 295 right before I came to Afghanistan. Is it still going to be able to hang @ 1920x1200 with the games that are out now? It's paired with a dinky E6600...
why on earth would you put a gtx295 with a cpu like that? that E6600 can't come remotely close to keeping up with a dual-gpu card like that. in fact you are probably getting little to no better performance in some games than you would with a gtx275 because of the cpu overhead. I sure hope you didn't pay very much for that card because if you did it wasn't a very wise purchase.
 
Last edited:
why on earth would you put a gtx295 with a cpu like that? that E6600 can't come remotely close to keeping up with a dual-gpu card like that. in fact you are probably getting little to no better performance in some games than you would with a gtx275 because of the cpu overhead. I sure hope you didn't pay very much for that card because if you did it wasn't a very wise purchase.

Yeah got it for $100 on a shell shocker.
 
why on earth would you put a gtx295 with a cpu like that? that E6600 can't come remotely close to keeping up with a dual-gpu card like that. in fact you are probably getting little to no better performance in some games than you would with a gtx275 because of the cpu overhead. I sure hope you didn't pay very much for that card because if you did it wasn't a very wise purchase.

Jesus, Toyota. Think you can make the guy feel any worse? He'll get along ok until he has the opportunity to upgrade his CPU/platform.
 
I bought a GTX 295 right before I came to Afghanistan. Is it still going to be able to hang @ 1920x1200 with the games that are out now? It's paired with a dinky E6600...
This is like an Aston Martin Vanquish with a 2.0L engine LOL!

The E6600 is a huge bottleneck for your GTX 295.
 
I think last generation --> this generation is similar to 6800GT --> 7800GTX (so about a 50% performance increase), instead of the 70-100% we used to get (say 9800XT --> X800XT). Unless some seriously intensive games come out, I just don't see us getting 70-100% performance increases. Add to this the problems of moving to 28nm and it's going to be even harder to squeeze more brute force out of a die. It's going to be all about efficiency and performance per clock in the next 1-2 years. I just can't see how NV is going to release a GTX580 that's 70-100% faster than the 480 part...in the next 12 months.

Games that use ambient occlusion should favor current cards. Best example is Stalker; a 5770 matches GTX 285 SLI there, and the game scales extremely well in SLI.
 
I think last generation --> this generation is similar to 6800GT --> 7800GTX (so about a 50% performance increase), instead of the 70-100% we used to get (say 9800XT --> X800XT). Unless some seriously intensive games come out, I just don't see us getting 70-100% performance increases. Add to this the problems of moving to 28nm and it's going to be even harder to squeeze more brute force out of a die. It's going to be all about efficiency and performance per clock in the next 1-2 years. I just can't see how NV is going to release a GTX580 that's 70-100% faster than the 480 part...in the next 12 months.
We were right here when DX8 came to town. GeForce 2 owners weren't all that impressed, and with mature GF2 chip OCs being what they were, neither the GF3 nor the Radeon 8500 was all that exciting, unless you had just bought an expensive CRT that could do 1600x1200@100Hz. By the time ATi got things taken care of, the GF4 was out. This time it's nVidia having the issues, coupled with a generation of products defined by feature additions more than performance improvements. New buyers who got GF3 and 8500 cards, though, had the option to wait longer before their next upgrade, as previous-gen cards hit very hard walls: lacking required DX8 features, not working well with a new, bigger monitor, or performing poorly for lack of z-culling mechanisms. Even if nVidia had been competitive, we would not have had massive performance increases with these early DX11 cards.
 
I bought a GTX 295 right before I came to Afghanistan. Is it still going to be able to hang @ 1920x1200 with the games that are out now? It's paired with a dinky E6600...
Overclock the E6600 to 3.4GHz+ and you'll be fine. I have no idea where people get that an E6600 is going to bottleneck you once you clock it.
 
I know what you are saying. I purchased my Radeon 8500 for $500 CDN or something when it had just come out. Then I realized that in 3 years it had dropped to about $40 in value. That's why I now plan to buy the new generation right when it comes out...then end up buying it way later in its life cycle (i.e., a 4890 for $175, a GTX470 for $200). So the total cost of upgrades, after selling the older parts, is reasonable.

I found out quickly that buying a $400/500 graphics card and keeping it for 3 years actually costs more than upgrading more often at $200 price levels and you get better performance too! Just something to think about next time. :awe:

I also ALWAYS *used* to get the "latest and greatest" and did the same thing, buying cards basically the day they came out.

In retrospect, I behaved like a fool.

Most cards always seem to be one generation ahead with their features - i.e., REAL-LIFE games/applications that actually support the features of just-released cards are RARE. I don't say they don't exist, but it's usually only a few games/apps. (Like at the moment with problem-free DX11 support.)

Buying a card a few months later not only means you get it significantly cheaper - it also means more games might support the features the card has.

I remember times where I spent $500ish on a card...that's why I said that spending "only" €200 on that 275, knowing the 295 exists, "felt" more like getting a mid-range card 🙂

I know the DX11 cards have interesting stuff to offer (e.g. TESSELLATION!) - but I also remember the 8500 disaster. That card had an early form of tessellation which never made it into the mainstream and was literally nothing more than a gimmick.

So buying a card solely to be able to run some Nvidia demos is foolish, kind of.

Anyway, in some months things could look different.

(Edit: There are SOME criteria which would still make the new cards interesting, e.g. transparency antialiasing performance AND COMPATIBILITY (see BF2:BC!) and, yes, AO performance. But I don't have enough data right now that would justify the upgrade...in particular, e.g., whether TRAA really works with antialiasing in BF2:BC on the new cards.)
 
Last edited:
Overclock the E6600 to 3.4GHz+ and you'll be fine. I have no idea where people get that an E6600 is going to bottleneck you once you clock it.

It only goes up to 3.0GHz. I didn't notice a difference overclocking it with my 8600GT (my last video card).
 
It only goes up to 3.0GHz. I didn't notice a difference overclocking it with my 8600GT (my last video card).
It'll go much higher than 3.0GHz; read some of the overclocking guides available. And your 8600GT was such a bottleneck I'm not surprised you couldn't tell a difference.
 
I also ALWAYS *used* to get the "latest and greatest" and did the same thing, buying cards basically the day they came out.

In retrospect, I behaved like a fool.

Most cards always seem to be one generation ahead with their features - i.e., REAL-LIFE games/applications that actually support the features of just-released cards are RARE. I don't say they don't exist, but it's usually only a few games/apps. (Like at the moment with problem-free DX11 support.)

Buying a card a few months later not only means you get it significantly cheaper - it also means more games might support the features the card has.

I remember times where I spent $500ish on a card...that's why I said that spending "only" €200 on that 275, knowing the 295 exists, "felt" more like getting a mid-range card 🙂

I know the DX11 cards have interesting stuff to offer (e.g. TESSELLATION!) - but I also remember the 8500 disaster. That card had an early form of tessellation which never made it into the mainstream and was literally nothing more than a gimmick.

The 8500's tessellation (TruForm) was used in a number of games, including the following:

Counter-Strike
Tom Clancy's Rainbow Six
Soldier of Fortune
Soldier of Fortune II: Double Helix
Quake
Quake 2
Unreal Tournament
The Elder Scrolls III: Morrowind
Madden NFL 2004
Bugdom
Return to Castle Wolfenstein
Serious Sam
Unreal Tournament 2003 and 2004
Wolfenstein: Enemy Territory
Command & Conquer: Renegade
Neverwinter Nights

But that's it; it never took off like it should have.

By the time ATi got things taken care of, the GF4 was out. This time it's nVidia having the issues, coupled with a generation of products defined by feature additions more than performance improvements. New buyers who got GF3 and 8500 cards, though, had the option to wait longer before their next upgrade, as previous-gen cards hit very hard walls: lacking required DX8 features, not working well with a new, bigger monitor, or performing poorly for lack of z-culling mechanisms. Even if nVidia had been competitive, we would not have had massive performance increases with these early DX11 cards.

The Radeon 8500 was more advanced than the GeForce4 Ti series: it supported TruForm and Pixel Shader 1.4. Even so, the GeForce4's memory controller was more sophisticated, its anti-aliasing was slightly more usable, and it had faster and better anisotropic filtering. The Radeon 8500 was weak in texturing/filtering power, and its anti-aliasing used supersampling, which even by today's standards incurs a huge performance hit in current games and even in many old ones.
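To put a rough number on that supersampling hit: SSAA shades every sub-sample, so fragment work scales with the sample count. A minimal sketch of this first-order cost model (a simplification that ignores bandwidth and fixed overhead; the resolution is an arbitrary example):

```python
# First-order cost of supersampling AA: shading work scales with samples per
# pixel, since SSAA runs the full pixel pipeline for every sub-sample.
# (Simplified model; real-world hits also depend on bandwidth and overhead.)
width, height = 1600, 1200
for samples in (1, 2, 4):
    shaded = width * height * samples
    print(f"{samples}x SSAA: {shaded:,} shaded samples "
          f"({samples}x the no-AA work)")
# 4x SSAA -> 4x the shading work, which is why the hit is huge even today;
# multisampling (MSAA) avoids most of this by shading once per pixel.
```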
 
I remember 8500 tessellation and Unreal 🙂 The result was that all the characters looked like they were "blown up" 🙂 As for Radeon AA...I never had a problem back in those days with AA performance, and it always looked good. Better than what I see today with Nvidia.
 
RussianSensation said:
Were GeForce 4xxx owners able to play Battlefield 2? I remember there was a minimum required support for PS1.4 or something IIRC?
So you could have been happily gaming away for three years already. Then, when it happened, you could have gotten a 9800 of some kind for a good price, or an FX 5900XT by then (or an X-series card, but those were a touch expensive around that time, IIRC), and you would have been set for another few years. If you're looking for something better around the corner, you can bet it will be there, and you'll almost always be right. Better has just been on vacation for a year, and is still ignoring his cell.

The Radeon 8500 was more advanced than the GeForce4 Ti series: it supported TruForm and Pixel Shader 1.4. Even so, the GeForce4's memory controller was more sophisticated, its anti-aliasing was slightly more usable, and it had faster and better anisotropic filtering. The Radeon 8500 was weak in texturing/filtering power, and its anti-aliasing used supersampling, which even by today's standards incurs a huge performance hit in current games and even in many old ones.
The performance wasn't that great, and it had flaky drivers--these new kids complaining about AMD's graphics drivers just don't know. If you already had a previous gen card, there wasn't a compelling reason to buy a new one, until you finally hit a feature or performance wall. The cases where the new features could shine weren't yet common. By the time they became common, its overall performance would be lacking. However, if you had been two generations behind, it looked pretty good, and a much better option than buying the last gen's high-end. It had features that would get used a good bit in the future, it scaled up in resolution better, the price wasn't too high, and your current card wasn't cutting it, anymore, so it wouldn't make too much sense to wait another year.

Many people want Fermi to be the greatest thing ever, to wipe the floor with the current Radeons and their own last gen, and for less money to boot...yet it doesn't, and it won't. It was a new spin on their already-excellent designs, with DX11 features and a computing emphasis added in, that then got delayed over and over and over (nV's addiction to big dice doesn't help, I'm sure), while AMD managed two nearly perfect rollouts in a row. The GTX 260 and up weren't the best in performance per watt, and were edged out in performance per dollar, but they had excellent raw performance. That doesn't make the GF104 a bad value so much as it makes what you already have a good value. Be happy, and wait for 28nm GPUs.
 
Last edited:
I dunno about compared to a GTX275, but the GTX470 I just got SMOKES my GTX260 at 1920x1080 with everything maxed in newer games. Some of the advanced settings in Just Cause 2 and Metro 2033 really brought my GTX260 to its knees, but the 470 pretty much breezes through them. Overall, very happy with my purchase.
 