So why buy the 6xx series? (August '12)


hokies83

Senior member
Oct 3, 2010
AnandThenMan, do you have any specific info that says that God doesn't exist? If not, then God clearly exists and your argument seems unwarranted.

I like AMD a lot, but their company hasn't made (very much) money in 6 years. Their chief competitor is the biggest/baddest/best in the tech world. Their secondary competitor is the biggest/baddest/best in the gpu market. Besides, I never said that they were going away, I simply said that was one thing that might cross a person's mind when making a purchasing decision.



Do a little homework before spouting this FUD. NV has better sli drivers these days, but AMD has been just as solid if not better with single gpu drivers for years.

I do lots of homework.

I'm an active member of both the GTX 670/680 and 7970 series clubs on OCN. I've owned both, benched both, and played the same games on both. :thumbsup:

Notice those spikes in Afterburner...

[Attached screenshots: SAM_0933.jpg, SAM_0932.jpg, SAM_0921.jpg]

My wonderful random teal screen crash, every hour or so:
[SAM_0914.jpg]
 

bryanW1995

Lifer
May 22, 2007

liar, fanboy /sarcasm

If nvidia's current lineup didn't suck so hard I wouldn't be using this 480. The 660 Ti is crap due to bandwidth, the 670/680 are OC-limited due to throttling at 70C, their top-tier card is barely 20 to 25% faster than their last top-tier card, you only get a 50-watt difference in power consumption, and you get the pleasure of paying out the ass for it all.

Paying a premium for getting next to nothing is unacceptable.

My thoughts exactly. I'm not sure why many of you are so worked up about gpus right now. This gen flat out sucks from both camps.
 

hokies83

Senior member
Oct 3, 2010
liar, fanboy /sarcasm



My thoughts exactly. I'm not sure why many of you are so worked up about gpus right now. This gen flat out sucks from both camps.

Umm, I'm no fanboy, dude. If they had worked like they were intended to, I would still have them.

I had two 7970s, dude, so yeah.

Look at my system... I have Intel, Gigabyte, G.Skill, Asus and Galaxy, and I've had both PowerColor and Sapphire...
 

cmdrdredd

Lifer
Dec 12, 2001
liar, fanboy /sarcasm



My thoughts exactly. I'm not sure why many of you are so worked up about gpus right now. This gen flat out sucks from both camps.

When I am building a new system from the ground up, I am not content with a 4-year-old video card. Sorry, I don't like turning graphics down and stuff.

GTX680 now loses in 5 major games: Skyrim + mods, Batman with MSAA, Dirt Showdown, Sniper Elite and Sleeping Dogs.

You know I'm gonna nitpick here...

So we are hand-picking games? You said previously that Dirt Showdown was basically cheating Nvidia outright. Now you change your mind. Sleeping Dogs just came out and it's an AMD Evolved title to boot. I have never seen Skyrim shown to run better on AMD hardware. As for Batman... PhysX adds a very real effect to the game. There are whole sections of smoke and effects that are gone when PhysX is off. It's a big deal for that title when taken side by side, IMO. That outweighs any performance differences to me.

You also mentioned in this thread that BF3 doesn't matter. You made a thread talking about the new Medal of Honor running better on Nvidia hardware with the Frostbite 2 engine. So now you're going back on those comments? Battlefield 3 does matter because the engine will be in use for a while, it's still a popular title, and a lot of people want to know how it will run on the different cards they might choose. You can't ignore BF3 while at the same time naming titles that needlessly cheat one brand (Dirt Showdown's graphics aren't impressive enough to justify the framerate tank).


But we don't buy these things just for current performance. What if next year you buy another game you love, outside of your current library, that nvidia tanks in because of bandwidth limitations? You have to look at performance in more games than just the ones you play and determine whether the performance delta in the particular set of games you play is worth being shortchanged across the wider variety, because eventually you'll want to try something new.

For example, let's say you're a fan of both FPS and racing games. You got the 670 over the 7970 because of its enhanced ability in BF3 and CoD4, but when you want to try out Dirt Showdown you have to turn down the visual quality, and your experience suffers for it. The same goes for games with DirectCompute components, games that use bandwidth, and if you like high amounts of AA.

http://www.tomshardware.com/reviews/geforce-gtx-660-ti-benchmark-review,3279-2.html

You can never think of it that way, ever. Nobody can predict the future. You must rely on what you play now to determine if something is worth it to you. It's not that hard if you look at the performance of different engines on your hardware: CryEngine 3, Frostbite 2, Unreal Engine 3, etc. Also, have you played Dirt Showdown? It looks almost exactly like Dirt 3 but runs 10 times worse. That's ridiculous... I don't even call that fair at all. Just like I didn't call Crysis 2 very fair to AMD when it used tessellated oceans underground that were unnecessary.

Not as likely, because the next-gen consoles will all be running AMD GPUs, so devs will program more for AMD's strengths than Nvidia's. Given all the factors in play, it will likely shift in AMD's favor.

The future is only a calculation of factors from the past; just do the math and you'll have a grasp of it.

HAHA, are you serious here? You really think making a game for a 7670 (the rumored spec for the next console, which is a rebranded 6670) will run poorly on a GTX 670? For real? What does a 7670 do that is so fantastic? It has nothing in common with a 7950, BTW.

Here's a tip: if that is your standard, then never upgrade again and run everything on high, not ultra, with no AA at below 1080p, because that's about what the next consoles will be.
 

hokies83

Senior member
Oct 3, 2010
LoL, yeah, Dirt Showdown is a cheap AMD ploy... for benchmark cookies.

With one 7970 maxed out I could hold a firm 60 fps...

With two heavily overclocked 680s I'm averaging high 40s to low 50s, with dips into the high 20s, lol.

Still love the game though...
 

cmdrdredd

Lifer
Dec 12, 2001
LoL, yeah, Dirt Showdown is a cheap AMD ploy... for benchmark cookies.

With one 7970 maxed out I could hold a firm 60 fps...

With two heavily overclocked 680s I'm averaging high 40s to low 50s, with dips into the high 20s, lol.

Still love the game though...

I'm talking specifically about the graphics. It doesn't look that different from Dirt 3 but runs much worse, which is pretty lame IMO. I wasn't into Dirt all that much, but it came free with my 6950 a while back.
 

bryanW1995

Lifer
May 22, 2007
So it's a good game but a cheap trick by AMD to optimize for it?

@cmdrdredd: good point, a 4-year-old video card is probably in need of an upgrade. I usually end up buying something each cycle, but sadly I fear that I might have to skip this gen.
 

cmdrdredd

Lifer
Dec 12, 2001
So it's a good game but a cheap trick by AMD to optimize for it?

@cmdrdredd: good point, a 4-year-old video card is probably in need of an upgrade. I usually end up buying something each cycle, but sadly I fear that I might have to skip this gen.

I didn't say that about the game. I just feel the performance isn't justified by the visuals. I mean, if it was photo-realistic or it looked 2x better and more detailed than Dirt 3, I would say OK, that's fair. I just don't see that in the game. I also said similar things about Crysis 2 using tessellation that was not rendered visually. I don't know if that was on purpose, but there was enough of an uproar on the forums about it that I felt it didn't seem very fair when AMD cards were known at the time to have performance issues with tessellation.

Yeah, I was running a GTX 295, which might not sound bad, but the lack of DX11 really made it feel stale to me. I had already sold off my HD6950 to a friend for his build when I was swapping parts for my new system.
 

Rvenger

Elite Member, Super Moderator, Video Cards
Apr 6, 2004
I do lots of homework.

I'm an active member of both the GTX 670/680 and 7970 series clubs on OCN. I've owned both, benched both, and played the same games on both. :thumbsup:

Notice those spikes in Afterburner...

[Attached screenshots: SAM_0933.jpg, SAM_0932.jpg, SAM_0921.jpg]

My wonderful random teal screen crash, every hour or so:
[SAM_0914.jpg]


I really hope you don't run two graphics cards on what looks to be a 19" 4:3 monitor.


FWIW, PowerColor OC cards have a seemingly higher failure rate due to the lack of binning for their higher-end SKUs, i.e. GPUs that have an ASIC score of 65 and below.
 

bryanW1995

Lifer
May 22, 2007
I didn't say that about the game. I just feel the performance isn't justified by the visuals. I mean, if it was photo-realistic or it looked 2x better and more detailed than Dirt 3, I would say OK, that's fair. I just don't see that in the game. I also said similar things about Crysis 2 using tessellation that was not rendered visually. I don't know if that was on purpose, but there was enough of an uproar on the forums about it that I felt it didn't seem very fair when AMD cards were known at the time to have performance issues with tessellation.

Yeah, I was running a GTX 295, which might not sound bad, but the lack of DX11 really made it feel stale to me. I had already sold off my HD6950 to a friend for his build when I was swapping parts for my new system.

Hokies83 wrote that it was a cheap ploy by AMD, right after he said that he liked the game.
 

hokies83

Senior member
Oct 3, 2010
I really hope you don't run two graphics cards on what looks to be a 19" 4:3 monitor.


FWIW, PowerColor OC cards have a seemingly higher failure rate due to the lack of binning for their higher-end SKUs, i.e. GPUs that have an ASIC score of 65 and below.


That was a 17-inch panel I used after I sold my U2711, while I was waiting for my Catleap to come in the mail.

I was hoping the teal screen was due to using VGA...

But the issue still remained with the Catleap.

I also had the Sapphire OC non-reference card. The PowerColor would overclock higher and had a higher ASIC of 65%; the Sapphire's was 63%. The Sapphire's choke whine was also so bad it sounded like something screaming inside the case.
[Attached photos: SAM_1004.jpg, SAM_1003.jpg, SAM_1002.jpg]

And here's a pic of my U2711 before I sold it:

[SAM_0826.jpg]
 

railven

Diamond Member
Mar 25, 2010
Both my 680s have a 100% ASIC rating, which seems to be the norm for Kepler (90%-100%). (I'm sure someone will say ASIC doesn't mean anything or that Nvidia has GPU-Z on their payroll.)

ASIC reading means nothing for Kepler, and that's been stated by W1zzard himself (the author of GPU-Z), about a week ago.

Haha, guess someone is saying ASIC doesn't mean anything for Kepler, or else W1zzard is in nVidia's pocket.
 

RussianSensation

Elite Member
Sep 5, 2003
4) Future Multi-monitor Setups - As far as I know, GTX680s cannot run 5x1 monitor setups and with those Catleaps being so cheap, I was tempted to sell my U2412s and buy 5 of em and start playing in portrait mode. It looks pretty awesome if you've seen the setup.

I would love to see that setup in a pic. Also, if you ever get around to it, maybe you can give a quick update on how GTX680s, or whatever else you're on at that time, cope with 5x 2560x1440 screens. I honestly think for that level of pixel-pushing power you'd want GTX780 Tri-SLI or something. :D

NV has better sli drivers these days, but AMD has been just as solid if not better with single gpu drivers for years.

I agree with that. It's different for 3 monitors though, where NV cards simply don't have the GPU power to handle that many pixels unless you get 3x GTX680s (but Annisman* showed that the 3rd card scales very poorly). Xbitlabs showed that GTX690 SLI is just 5 fps faster overall than a ~$900 HD7970 GHz setup with 3 monitors. For a single monitor, SLI is probably more consistent. For single-GPU cards, I wouldn't even talk about driver problems since they exist on both sides.

AMD's are poverty cards and I would not dream of demeaning myself in such a manner.

Can you please send me the Sapphire HD7970 TOXIC? I'd love one, honestly I would. Since they are poverty cards, you should be able to buy one and send it to me :)

[Chart: perfrel_1920.gif]


The HD7970 TOXIC is 19% faster than a GTX670 FTW = GTX680.
[Chart: perfrel_2560.gif]


If HD7970 TOXIC is a poverty card, I'll take it.
 

RussianSensation

Elite Member
Sep 5, 2003
So we are hand-picking games? You said previously that Dirt Showdown was basically cheating Nvidia outright. Now you change your mind. Sleeping Dogs just came out and it's an AMD Evolved title to boot. I have never seen Skyrim shown to run better on AMD hardware. As for Batman... PhysX adds a very real effect to the game. There are whole sections of smoke and effects that are gone when PhysX is off. It's a big deal for that title when taken side by side, IMO. That outweighs any performance differences to me.

I am not denying that Dirt Showdown, Sleeping Dogs and Sniper Elite are cheating NV. :p AMD Gaming Evolved. What I am saying is that AMD put $ behind their developer relations and now these 3 games perform faster on AMD. In which games is the GTX680 much faster? Shogun 2, Lost Planet 2, Hawx 2, Project Cars, Hard Reset, World of Planes, Wargame: European Escalation, maybe a couple of other games escaping me. That list is getting smaller every day that AMD's driver team is at work trying to claw back the performance delta, while NV has fixed none of their performance issues in Anno 2070, Bulletstorm, Serious Sam 3, Alan Wake, Arma II Operations, Dirt Showdown, Sleeping Dogs, Sniper Elite 2, Metro 2033, Crysis 1/Warhead, etc.

So if you look at 20-30 games, NV barely has any wins left and where it loses, those losses are huge.

As for Skyrim, it has been running faster on the HD7970 series since June 22nd, with Cats 12.7. AT tests it at 1080p with no mods and 4xMSAA, which is CPU-limited for modern cards. If they are going to test Skyrim at 1080p, they should enable ENB mods or something:
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660-ti/24/

Alternatively, take up the settings to shift the load to the GPU:

[Charts: skyrim_2560_1600.gif, 1345736700tJwmf64Bk6_2_4.gif]


As for Batman, I said before that it's one of the best PhysX implementations on the NV side. Not denying that it looks better with PhysX. But it runs worse with MSAA on NV cards:

http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-660-ti/28/

I think PhysX > MSAA in Batman, so overall the NV cards provide a better graphical experience. Then again, it's not just 5 games. Look at the list of games I noted above. My point is I can't think of many games where the GTX680 is actually faster by 20-30%, but I can think of plenty of games where the 7970 GHz edition is.

You also mentioned in this thread that BF3 doesn't matter.

I said BF3 doesn't matter because the performance difference between an OCed 670/680 and an OCed 7970 is in the single digits, percentage-wise. At your resolution, it's just single frames. BF3 is now a game where the performance of an OC 7970 and an OC 680 is so close it's a wash. Before, the GTX680 was killing the 7970 in it. Of course, if you don't overclock, then something like a Gigabyte Windforce 3X 670 > 925MHz 7970 in BF3.

You made a thread talking about the new Medal of Honor running better on Nvidia hardware with the Frostbite 2 engine. So now you're going back on those comments?

Medal of Honor does look great for NV right now in beta, not denying that. But look at what happened with Guild Wars 2. AMD's driver team has time to fix their performance. If the Medal of Honor driver issues are fixed by AMD so that it's again just single digits, then it won't matter either.

Also, have you played Dirt Showdown? It looks almost exactly like Dirt 3 but runs 10 times worse. That's ridiculous... I don't even call that fair at all. Just like I didn't call Crysis 2 very fair to AMD when it used tessellated oceans underground that were unnecessary. I'm talking specifically about the graphics. It doesn't look that different from Dirt 3 but runs much worse, which is pretty lame IMO.

Ya, vs. Dirt 3 I am not seeing anything worth a 10-15 fps hit, never mind a 50-60 fps performance hit. I am hoping someone comes up with a way to get a more realistic global lighting model without such a severe performance hit. At the same time, when HDR came out it hammered the GeForce 6800 series in Far Cry. So maybe we need 2-3 generations of cards to see if this is a problem with modern cards.

I personally think the performance hit in Dirt Showdown is not worth the graphical improvement, so I agree with you on that. But the game is playable on AMD cards nonetheless. I also don't think the performance hit with SSAA and HDAO is worth it in Sleeping Dogs. I think Crysis 1 looks better than Sleeping Dogs and actually runs smoother. But if DirectCompute is the direction future games take more and more, and AMD starts throwing more $ at developers, it could grow from 3 games to 5-10 games, etc.

Actually, I am amazed that since Crysis 1, Metro 2033, Witcher 2 and BF3, hardly any new games have come out that look pretty despite carrying a huge performance hit from their global lighting/HDAO/contact-hardening shadows, etc. From that point of view I am not overly impressed with the level of optimization these new graphical features provide vs. their performance hit. I am not sure if these new extra features will always eat GPU performance or if they are just not optimally coded in games at the moment.
 

hokies83

Senior member
Oct 3, 2010
837
2
76
How about this: I have a 680. Who with a 7970 wants to have a game bench-off?

Third-party results are crap IMO, because on OCN people post the same stuff but it shows a different outcome.
 

railven

Diamond Member
Mar 25, 2010

Read his post that I quoted. He preemptively defended against "GPU-Z ASIC doesn't work" or "GPU-Z is in the pocket of nVidia."

You showed that W1zzard said it doesn't work, so I just jokingly put it together: either it doesn't work or W1zzard is employed by nVidia.

Bah, if I have to explain it, it isn't as funny.

Kepler has 100% ASIC, just remember that part :)
 

cmdrdredd

Lifer
Dec 12, 2001
Hey Russian, thanks for at least being honest in your response. Also, what performance issue in Sleeping Dogs? I run 2560x1440 maxed out, except AA on high not extreme, and get over 80 fps average with minimums around 60. With AA maxed I get averages of 65-ish and minimums of 30.