Best Graphics Card for Phenom II X4 980BE


Clutchc

Member
Dec 12, 2012
A few last questions (maybe)...

1) The rebate that comes with the AMD cards expires May 15. Is that based on the day of purchase, the date the card is received, or the date one mails in the rebate?

2) What AMD cards are Mantle ready?

3) Does a memory bus wider than 256-bit really do any good at 1920x1080?
 

Leyawiin

Diamond Member
Nov 11, 2008
Quote:
The above post is misleading as it cites three cases, two of which use TITAN GPUs. The third case is BL2 with a GTX 680, which excels at that particular game. Going from a 51 fps average to a 61 fps average isn't exactly earth-shattering. A CPU upgrade to Intel means a new mobo, CPU, and possibly RAM at minimum, and isn't something to be taken lightly. The OP's CPU, even when not overclocked, is enough to drive at least a GTX 760 in most games at the OP's resolution without tons of bottlenecking.

Thank you. Certain posters make it sound like the CPU is utterly useless for gaming at 1080p when paired with a fairly capable GPU, and that's not the case. There are many, many recent games it will do just fine with at good quality settings. Yes, it's aging, but you can't compare the cost of the items you laid out with just purchasing a new midrange GPU (which, depending on the old one, could help quite a lot). I'm fairly sure I can get another year or so out of my X4 980.

Clutchc said:
1) The rebate that comes with the AMD cards expires May 15. Is that based on the day of purchase, the date the card is received, or the date one mails in the rebate?

Date of purchase. Even if you buy on the last day of a rebate period you have anywhere from a couple weeks to a month to get the rebate forms in (read the fine print).
 

Makaveli

Diamond Member
Feb 8, 2002
Leyawiin said:
Certain posters make it sound like the CPU is utterly useless for gaming at 1080p ... I'm fairly sure I can get another year or so out of my X4 980.

Date of purchase. Even if you buy on the last day of a rebate period you have anywhere from a couple weeks to a month to get the rebate forms in.

You are correct; it all depends on which games and at what settings.
 

Magic Carpet

Diamond Member
Oct 2, 2011
Leyawiin said:
Certain posters make it sound like the CPU is utterly useless for gaming at 1080p ... I'm fairly sure I can get another year or so out of my X4 980.
I am with you on this one. You upgrade the CPU when you think it is necessary, because upgrading CPUs is often a lot more hassle than swapping out a video card for a faster one. I know many people go for balanced systems and whatnot, but in reality, you decide when it is time for upgrades. After all, it's your computer, and you know better than anybody here how it performs in your games at your settings :thumbsup:

You know, at one point I had an AGP Gainward 7800GS+ (G71) paired with an ancient Pentium III 1.4GHz and 512MB RAM, and it played my games fine at my settings, though I was probably using a quarter of that GPU on its best day. Ah, who cares, I had a blast :D
 

toughtrasher

Senior member
Mar 17, 2013
Magic Carpet said:
You upgrade the CPU when you think it is necessary ... After all, it's your computer, and you know better than anybody here how it performs in your games at your settings.

Shows you don't need much balance to game, though it does put a load on your GPU.

I used to do the same thing with one of my video cards & my Pentium 4 :D
 

Headfoot

Diamond Member
Feb 28, 2008
Clutchc said:
1) The rebate that comes with the AMD cards expires May 15. Is that based on the day of purchase, the date the card is received, or the date one mails in the rebate?
2) What AMD cards are Mantle ready?
3) Does a memory bus wider than 256-bit really do any good at 1920x1080?

1) Can't speak to this, sorry.

2) All of them, but the 260(X) and 290(X) get superior scaling out of Mantle, in Battlefield 4 at least, while all the cards get fairly good scaling in Thief. I don't think we can reliably extrapolate how future games will scale based on just 2 games (both with general bugginess issues, too). Mantle would help significantly on an X4 980.

3) Irrelevant. The memory bus width is pointless to talk about in the abstract. You need to talk about the specific card, and whether that specific card has sufficient memory bandwidth for 1080p. If you are referring to the 7950/7970/280(X) vs. the GTX 680 or 770, then it does not make much of a difference at 1080p right now. Kepler seems to use its memory bandwidth more efficiently, and Tahiti simply has more of it. Different architectures have different memory needs based on the number of cache levels, cache design, cache sizes, speed of the attached memory, etc. So it's not helpful to talk about bus width without the context of the card it's in (see the rough numbers sketched below).
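
To put rough numbers on that: peak memory bandwidth is just bus width times effective memory clock, so a wide bus on slower memory can land near a narrow bus on faster memory. A minimal sketch in Python (the clocks below are the published effective memory clocks for these cards; these are peak theoretical figures, not real-world throughput):

    # Peak theoretical bandwidth = (bus width in bits / 8 bytes) * effective clock.
    # Effective memory clocks are the published specs for each card.
    def bandwidth_gbs(bus_bits, effective_mhz):
        return (bus_bits / 8) * effective_mhz * 1e6 / 1e9  # bytes/s -> GB/s

    print(bandwidth_gbs(384, 5500))  # HD 7970 (Tahiti, 384-bit): 264.0 GB/s
    print(bandwidth_gbs(256, 6008))  # GTX 680 (Kepler, 256-bit): ~192.3 GB/s

Despite Tahiti's roughly 37% raw bandwidth advantage here, the two cards trade blows at 1080p, which is exactly the point: the architecture decides how much of that bandwidth is actually needed.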
 

ColonelBlimp

Member
Jun 12, 2013
I have to say the 280 or 280x here, particularly as the prices have dropped to almost normal levels.
The point about Mantle is a good one if you play BF4.
I have noticed a good increase in performance with my Phenom 965 (at 3.8GHz) and 280X compared to DirectX. I haven't measured it, but it is most definitely there.

Of course, the advantage of getting a higher-performing card now is that you can take it with you when you get round to upgrading the mobo/CPU.
 

Clutchc

Member
Dec 12, 2012
Thank you all for your insight. It was very helpful. I ended up getting the SAPPHIRE DUAL-X R9 280 that Newegg had for $230.
http://www.newegg.com/Product/Produc...82E16814202099
The hardest part was trying to find 3 free games. Ended up getting Hitman, Tomb Raider, and Murdered: Soul Suspect.

@ ColonelBlimp
I'm not much of a fan of playing tag with guns, too old I guess. So I won't get a chance to test out Mantle w/BF4. But I have several other games I will be playing with it.
 

AtenRa

Lifer
Feb 2, 2009


Have a look at the Core 2 Quad Q9450 (2.66GHz at stock) and how Mantle raises its performance.
The Q9450 is slower than the Phenom II 980BE, but with Mantle it can play at a 50 fps average and over 30 fps minimum with an HD 7950 @ 1000MHz.

http://forums.anandtech.com/showthread.php?t=2370857&page=6&highlight=thief+mantle

[Attached benchmark screenshots from the linked thread]
 

Termie

Diamond Member
Aug 17, 2005
Clutchc said:
Thank you all for your insight. ... I ended up getting the SAPPHIRE DUAL-X R9 280 that Newegg had for $230. ... Ended up getting Hitman, Tomb Raider, and Murdered: Soul Suspect. ... I'm not much of a fan of playing tag with guns, too old I guess. So I won't get a chance to test out Mantle w/BF4.

Good choice on the card and on the games. Tomb Raider was one of the best games released last year. Hitman is also great, and both can be played without firing a weapon, if you're very, very good.

It's funny, I was just saying to someone yesterday who was criticizing the violence of BF4 that it's no different than playing tag. Then she reminded me that it involved guns.

Just a tip - Newegg dropped the price on that card today - you might want to ask if they'll credit you the difference: http://www.newegg.com/Product/Produc...82E16814202099
 

ColonelBlimp

Member
Jun 12, 2013
Clutchc said:
... I'm not much of a fan of playing tag with guns, too old I guess.

Too old? How old is too old? I'm 46, is that too old? Good choice with the 280 btw :)
 

Clutchc

Member
Dec 12, 2012
ColonelBlimp said:
Too old? How old is too old? I'm 46, is that too old?
No, 46 is young. I'm older than dirt.
Anyway, I have another 3 questions.

1) I got the R9 280 mentioned above and it isn't performing very well. My FPS is lower than I had with a GTX 750 Ti in several game benchmarks I've tried. The GPU usage never goes over 60%-80%; mostly it is around 50% during game benchmarks. It's not the CPU holding it back either: it never goes above 50%-70%. Mostly the CPU and GPU stay around 50% and 60%. I ran the Metro 2033 benchmark at max settings and got horrible results; stutter and FPS were worse than with any card I've had previously. But I ran FurMark and got the card to run at 98%-100% (according to FurMark). Defective?

2) Certain games will not display over HDMI. I get an "input not supported" message on a black screen. I tried 2 different monitors that work fine with every other card over HDMI, but when I use the DVI output the issue is gone. Another sign of a defect?

3) What is the button with the Sapphire logo on it for? There's nothing in the manual and nothing on the web page about it. It doesn't appear to be the OC button the HD 7950 had; pushing it during a 3D app does nothing to the clock speed.
 

Termie

Diamond Member
Aug 17, 2005
Clutchc said:
1) I got the R9 280 mentioned above and it isn't performing very well. My FPS is lower than I had with a GTX 750 Ti ... The GPU usage never goes over 60%-80% ... But I ran FurMark and got the card to run at 98%-100%. Defective?

Can you download and run either 3DMark 11 or the newest 3DMark (2013)? That will give us some ability to compare benchmark scores.

FurMark is not a benchmark; it's a stress test.
 

Clutchc

Member
Dec 12, 2012
Termie said:
Can you download and run either 3DMark 11 or the newest 3DMark (2013)? ... FurMark is not a benchmark; it's a stress test.

OK, I'll do that now. I already have several of the Futuremark programs installed.
And yes, I realize FurMark is a stress test; I just mentioned it to indicate that, for some reason, it took the card to 100% while the other programs didn't. Maybe as a stress test it forces the GPU to perform.
 
Feb 19, 2009
Clutchc said:
1) I got the R9 280 mentioned above and it isn't performing very well. ... The GPU usage never goes over 60%-80% ... I ran the Metro 2033 benchmark at max settings and got horrible results ... Defective?

Did you run Metro 2033 with PhysX enabled? That would cause major slowdowns.

Also, not all games will use all 4 threads on your CPU, so while your CPU is showing 50-70%, it may actually be 100% load on 2 threads; Windows throws the threads around to balance the load.

GPU usage at 50% indicates a CPU bottleneck. FurMark is a GPU stress test; it doesn't care about your CPU, so that's why you get your 280X to run at 100%.

To identify a CPU bottleneck in your situation: if you overclock your CPU and see a good increase in the game benchmark results, it's definitely the bottleneck.
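
A quick way to check the "two saturated threads hiding behind 50% per core" theory is to sample per-core load at a fine interval while the game runs and sum it. A minimal sketch, assuming Python with the psutil package installed:

    # Sample per-core CPU load once a second for ~30 seconds while the game runs.
    # On a quad-core, a sum hovering near 200% means roughly two threads are
    # saturated, even if each individual core only averages ~50%.
    import psutil

    for _ in range(30):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(per_core, "-> busy-thread estimate: %.1f" % (sum(per_core) / 100))

If the estimate sits near 2.0 while GPU usage sits near 50%, the scheduler is just shuffling two busy threads across four cores, and the CPU really is the limit.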
 

Clutchc

Member
Dec 12, 2012
49
0
0
Quote:
Did you run Metro 2033 with PhysX enabled? ... it may actually be 100% load on 2 threads ... FurMark is a GPU stress test; it doesn't care about your CPU, so that's why you get your 280X to run at 100%. ... if you overclock your CPU and see a good increase in the game benchmark results, it's definitely the bottleneck.
1) Core Temp (for one) showed all threads working at various loads.
2) I have a 280, not a 280X. My 280 never showed 100% except in FurMark.
3) My CPU is already overclocked.
 

Clutchc

Member
Dec 12, 2012
Btw, I just removed my Ph II X2 B59 that I had unlocked and OC'd, and put my Ph II X4 965BE back in; I thought that might be part of the problem. But the 965 is set to the same 3.74GHz the unlocked B59 was, for consistency.

Firestrike score: http://www.3dmark.com/fs/2151542
(it reports my 280 as a 7950; I guess that's expected, since the R9 280 is a rebadged 7950)
 
Feb 19, 2009
That's what I just said: if you see 4 threads at 50%, it's actually more like 2 busy threads, but Windows spreads them across all 4 cores, switching many times per second. What you should see, if your GPU has enough grunt and the game actually supports 4 threads, is all 4 cores pegged >90%.

A Phenom II in a lightly threaded game is not going to push any decent GPU at all.
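
A more direct test is to pin the game to two cores: if frame rates barely change, it was effectively two-threaded all along. A hypothetical sketch using psutil's cpu_affinity (the process name below is an assumption; substitute the actual game executable):

    # Restrict a running game to cores 0-1, then compare benchmark results
    # before and after pinning to see how many cores the game really uses.
    import psutil

    GAME_EXE = "metro2033.exe"  # assumption: replace with the real process name

    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == GAME_EXE:
            proc.cpu_affinity([0, 1])  # pin to the first two cores
            print("Pinned %s (pid %d) to cores 0 and 1" % (GAME_EXE, proc.pid))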
 

Termie

Diamond Member
Aug 17, 2005
Clutchc said:
Btw, I just removed my Ph II X2 B59 that I had unlocked and OC'd, and put my Ph II X4 965BE back in. ... Firestrike score: http://www.3dmark.com/fs/2151542

Right off the bat, that looks spot on. Your card is coming up with a score 5% above my overclocked 7870, exactly as it should. And your CPU is about 40% as fast as my 4770K. Again, totally expected.

I'd say everything is fine, but if you want to provide another few benchmark scores, I can confirm whether they're in the ballpark.

As we mentioned from the start, you bought a card well beyond the capabilities of your platform, but it should still always be faster than what you had before; if it isn't, there may be a problem.
 

Clutchc

Member
Dec 12, 2012
Well, I'll mull that over for a bit. I was expecting better, I guess. I ran the Metro 2033 benchmark at max settings and it was unplayable. Maybe I'll leave the 965BE in, OC it more, and see what the card will do. I'm afraid to take the Ph II X2 any higher; I was lucky it unlocked to a quad and was stable up to 3.7GHz.

What do you recommend for watching CPU and GPU usage during runs?

Can anyone answer my other 2 questions?
 

toyota

Lifer
Apr 15, 2001
Clutchc said:
... I ran the Metro 2033 benchmark at max settings and it was unplayable. ... What do you recommend for watching CPU and GPU usage during runs? Can anyone answer my other 2 questions?
You were asked if you were running PhysX in Metro 2033, but I never saw you answer. And the benchmark for both Metro games will hitch a bit unless you have more than 4 threads.