3700+ or X2 3800+

Golden Leg

Junior Member
May 8, 2006
13
0
0
I'm building a new computer, and I have all of my components selected except for one: the CPU.

I was originally going to go with a 3700+ as it was cheaper and seemed to offer the same gaming performance as the X2 3800+.

My question to you all is if I should go with the 3700+ or the X2 3800+. I am going to use the computer only to play games.

Thanks a lot. Oh, and this is what I'm getting (ignore the CPU).
 

biostud

Lifer
Feb 27, 2003
19,887
6,985
136
Originally posted by: Golden Leg
Would you go into detail as to why?

Games are starting to take advantage of the 2nd core and will do so even more in the future.
 

F1shF4t

Golden Member
Oct 18, 2005
1,583
1
71
Originally posted by: biostud
Originally posted by: Golden Leg
Would you go into detail as to why?

Games are starting to take advantage of the 2nd core and will do so even more in the future.

Even if a game doesn't take advantage of it, it's useful to have the Windows firewall, virus scan, and other background stuff running on the second core. Some games are considerably smoother to play with a dual-core CPU; San Andreas and Neverwinter Nights are a couple of examples.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
If you're overclocking, then the 3800+ X2 will almost equal the 3700+ for most games (single cores tend to OC higher than dualies). If you're running at stock, then the 3700+ will be faster. Despite the advertised dual-core support for future games, I doubt it's going to be a serious issue in the near future.
 

soloz2

Member
Apr 20, 2006
145
0
0
It depends on your goals. The 3700+ will be faster at stock speeds for single-threaded applications like games because it runs at 2.2GHz instead of 2GHz and has double the cache of the X2. However, for multitasking and general use, the X2 will probably be faster.
If you only want a slight OC, you can get the best performance out of the X2 by getting it to 2.4-2.5GHz or so, which should be possible for a 3800+ X2.
If you just want to squeeze out as high an OC as you can, the 3700+ would probably be better, as it should be able to get to 2.8-3GHz.

I would actually suggest a 3rd alternative: get an Opteron 165. While it is only 1.8GHz stock, it does have 1MB of cache per core, so the performance at stock will be very similar to the 3800+ X2. And almost all 165s get to 2.6GHz.
I am currently stress testing my Opty 165 at 2.9GHz on 1.37V.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
I'll repeat what I've said elsewhere... and I've been flamed hard for it as it seems my opinion is in the minority. But... while the CPU does play a factor in gaming performance, the rest of your system also plays a very important part-- i.e. what video card you have, what resolution you play at, etc. I don't know what games you play or want to play, but here is BF2 with chips ranging from an FX57 to a Celeron D. At a mid-range resolution (1280x1024), there is virtually no difference between an FX57 and a Sempron 3400+ in BF2. The difference would be even less at higher resolutions, and likely more pronounced at lower resolutions. The graphics card is a 7800GT.

Here is FEAR. It's even more GPU dependent. As you can see, they even dropped the resolution down lower to 1024x768 and show that an FX57 performs the same as a low-end Celeron D. (Another FEAR example)

As you can see, a simple 3000/3200+ would be a fine gaming CPU, and it would save you some money to ensure you can afford the fastest video card possible (or better monitor or something). If you already have a great display, and have the budget for a GTX or XT (or dual GPU solutions) and STILL have extra money, go ahead and get the dual core (x2 3800+ or 165 opty). All, IMO, of course.
 

soloz2

Member
Apr 20, 2006
145
0
0
Originally posted by: deadseasquirrel
I'll repeat what I've said elsewhere... and I've been flamed hard for it as it seems my opinion is in the minority. But... while the CPU does play a factor in gaming performance, the rest of your system also plays a very important part-- i.e. what video card you have, what resolution you play at, etc. I don't know what games you play or want to play, but here is BF2 with chips ranging from an FX57 to a Celeron D. At a mid-range resolution (1280x1024), there is virtually no difference between an FX57 and a Sempron 3400+ in BF2. The difference would be even less at higher resolutions, and likely more pronounced at lower resolutions. The graphics card is a 7800GT.

Here is FEAR. It's even more GPU dependent. As you can see, they even dropped the resolution down lower to 1024x768 and show that an FX57 performs the same as a low-end Celeron D. (Another FEAR example)

As you can see, a simple 3000/3200+ would be a fine gaming CPU, and it would save you some money to ensure you can afford the fastest video card possible (or better monitor or something). If you already have a great display, and have the budget for a GTX or XT (or dual GPU solutions) and STILL have extra money, go ahead and get the dual core (x2 3800+ or 165 opty). All, IMO, of course.


You make a good point! And I will grant you that... to an extent. There is not much difference between CPUs, since gaming depends much more on the GPU. However, CPU, RAM, and hard drives play an important role in load times and can also help keep the framerate from dropping. And even if you change only the CPU and keep the rest of the system the same, there will be a performance increase or decrease, even if slight.
I went from a 3200+ Venice with 1GB of RAM and a 7800GT to an Opteron 148 (kept the RAM and GPU the same) and I saw an increase in performance across the board. Games played more smoothly and all other applications were faster.
The move from 1GB of RAM to 2GB of RAM also showed a performance boost!
Going from my 7800GT to a 7900GT, as expected, also helped things.

So, while the rest of your system does affect performance, so does your CPU. I now have a dual core and have not had it long enough to fully test it out, but my 165 at 2.88GHz (the same speed I ran my 148 at 24/7) does give me a slight increase in gameplay. My framerates drop less, so overall gameplay is smoother.

In short, do not short yourself on any component if you don't have to.
 

m1ldslide1

Platinum Member
Feb 20, 2006
2,321
0
0
I've never seen the 3800+ in action, but with the 3700+, intuitively you have:
200MHz higher clock
double the L2 cache
probably a better overclock

Weigh that against *maybe* having worthwhile multi-core support in FUTURE games.

The idea of the extra 3800+ core running Windows in the background to improve frame rates is a fallacy. My CPU runs at <1% utilization while staring at my Windows desktop and PCProbe. In other words, the resources required by Windows in the background are negligible and do not outweigh the benefits of extra MHz and extra L2 cache.

Bottom line: my OC'd 3700+ would kick the 3800+ up and down the block every which way but Sunday (don't know if my cliché syntax is right) in strictly gaming performance, at least for the next year or more.

If you buy a 3700+, you can put the money you save toward a video card, and you'll be stoked until it's time to move away from Socket 939 anyway.

Seems like most people don't understand the concept of a strictly gaming rig. I don't even have a DVD burner in mine, and I never ever multitask. I just assemble the best gaming parts I can afford.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: soloz2
So, while the rest of your system does affect performance, so does your CPU. I now have a dual core and have not had it long enough to fully test it out, but my 165 at 2.88GHz (the same speed I ran my 148 at 24/7) does give me a slight increase in gameplay. My framerates drop less, so overall gameplay is smoother.

In short, do not short yourself on any component if you don't have to.

I agree. A system's overall performance is definitely a sum of its parts. However, we are seeing a new trend in gaming lately: games are becoming more graphically demanding. We used to benchmark at 1024x768 in order to show CPU scaling, but as we can see in a benchmark of FEAR, there is little to no CPU scaling at 1024x768. Keep in mind that FEAR is a 7-month-old game now. Newer games, such as Oblivion, are pushing the GPU even harder... and it was even coded (according to the developers) with dual-core optimizations. But as we can see here, all CPUs are created equal at resolutions of 1280x1024 and above.

I'm totally for a balanced system in order for best performance... however, if Oblivion is a sign of things to come, any extra money would be better spent on a high-end video solution rather than a faster (or dual-core) CPU. Just taking some pricing from Anandtech's real-time pricer (and, bear with me... I know I've put waaaay too much thought into this):

If your budget is say $600 for CPU and GPU...

1) x2 3800+ dual core & 7900gt
.......total price-- $567
.......avg fps in Oblivion at 1280x1024.......17.2

2) Now you could save an extra $185 and get a single-core 3000+ because...
.......avg fps in Oblivion at 1280x1024.......the same as the x2 3800+ since Oblivion doesn't care about CPU speed

3) Or, you could spend a little over budget for a 3000+ & SLI 7900GTs
.......total price-- $652
.......avg fps in Oblivion at 1280x1024.......31.7

Now, granted you've just gone over budget by $52 and are paying 15% more than config #1, but you've also increased your performance by 84%. Not a bad trade-off for 15% more money.
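If you want to sanity-check those percentages yourself, here's a quick sketch using the prices and Oblivion framerates quoted above:

```python
# Compare config #1 (X2 3800+ & 7900GT) against config #3 (3000+ & SLI 7900GTs)
# using the prices and avg Oblivion fps at 1280x1024 from the post above.
config1_price, config3_price = 567, 652
config1_fps, config3_fps = 17.2, 31.7

price_increase = (config3_price - config1_price) / config1_price * 100
fps_increase = (config3_fps - config1_fps) / config1_fps * 100

print(f"Price increase: {price_increase:.0f}%")        # 15%
print(f"Performance increase: {fps_increase:.0f}%")    # 84%
```

So the 84% figure is average fps gained relative to config #1, and the 15% is the extra cash relative to config #1's total.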

I know there are a lot of other things to consider-- such as is the PSU good enough to handle the SLI, do you already have an SLI mobo, etc. My point is that if you find yourself with a budget, building a balanced system is a good thing... but, depending on the games you play, you may find that going more lopsided with a cheaper CPU and VERY powerful GPU(s), you could greatly benefit. I'll take that $652 choice #3 over that $567 choice #1 for any purely gaming rig out there (honestly, I'd rather pick the x19xt for Oblivion instead but that's a subject for another thread).

I hope my long-winded example makes sense. There are way too many variables out there for me to say that my idea will work across the board for a gaming rig. After all, they were able to get an amazing 30% improvement in Quake 4 with dual-core optimizations at 1280x1024 (though there is no gain at all at 1600x1200). I hope that trend continues. I don't foresee Oblivion being the oddball in the graphics department either. Games are getting more and more graphically intense, as seen in COD2 and FEAR as well. And we can see that right now, games are chewing up GPU power as fast as they can get it... while levelling most CPUs on equal ground at mid-to-high resolutions.

 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
A 200MHz bump is really irrelevant considering a minor OC will put the X2 in that range.
Also remember that video card drivers now use the second core to offload some of the load from the GPU, so you'll definitely see a boost.