A new AMD build

RD48

Junior Member
Apr 4, 2008
9
0
0
Hello, first time posting here.

I need to put together a new system. My current system is about 7 years old (AMD 1700+). I am on a budget right now.

What I have so far is a Corsair 550W power supply, 2GB of Kingston HyperX 1066 memory, a Seagate SATA 250GB HDD, and a Lite-On SATA DVD-RW drive.

I am looking at completing the build with an AMD Athlon 64 X2 5600+, a GIGABYTE GA-MA770-DS3 AM2+/AM2, and an EVGA 8800GT.

What do you think about this system build? I will be doing some home video/photo editing and some mild gaming. I believe it will be a significant upgrade over my current system. My only concern is the 1066 memory. Will I be able to underclock the memory to 800?

I am hoping that maybe later next year, once they get the bugs worked out of the Phenoms, I will be able to upgrade.

Thanks,
RD48
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: RD48
Will I be able to underclock the memory to 800?

Yes.

Be prepared for a slew of "get Intel not AMD" comments. ;) Are you going to be overclocking? What made you choose AMD?
 

RD48

Junior Member
Apr 4, 2008
9
0
0
I was considering an Intel Q6600, hence the 1066 memory. But my budget changed. I figured I could save some money and keep about the same performance for what I will be using the system for.

Sorry, I'm not getting an EVGA 8800GT; I'm getting an EVGA 9600GT.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
You will indeed be able to underclock the memory to 800MHz, as it will probably default to 667MHz when your computer boots up for the first time. I don't know much about the Gigabyte 770 board, but I have heard good things about the Abit 770 board, the AX78 (notably in this thread).
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Ok, I'll bite.

Intel versus AMD.

AMD is better on the low end if you don't plan to overclock. That 5600+ is a pretty sweet processor for stock systems without breaking the bank. It's slightly faster than an E4500 at stock speeds.

If you are willing to overclock, there's simply no comparison. Even the E2180 (<$80) can typically overclock to ~3GHz and easily stomp the fastest AMD processor into the ground.



A note on memory: there's really no reason to buy the more expensive DDR2-1066 unless you are getting a 1333FSB Intel processor. Memory dividers don't hurt AMD performance much, and the 800/1066FSB Intel CPUs (which correspond to 200/266MHz base clocks) won't typically clock much above a 400MHz base clock (DDR2-800), if they even reach that point. Just get good quality (G.Skill, Geil, Mushkin, Corsair, Crucial) DDR2-800 and you'll be fine.
 

phexac

Senior member
Jul 19, 2007
315
4
81
Seconded for DDR2-800. Even the new Intel chips that run on a 1333 FSB only mean a 333MHz base clock as far as memory speed goes. You can keep overclocking all the way to a 400MHz base clock before you even hit your memory's rated limit, and most quality brand DDR2 will overclock way past that. For most companies, 1066-rated memory is the same as 800-rated; it has just been tested to run that high.
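
For anyone who wants to double-check those ratios, here is a rough sketch (a hypothetical Python snippet; it is nothing more than the standard quad-pumped FSB and double-data-rate arithmetic, and the example speeds are just illustrative):

# Sanity check of the FSB and memory clock relationships discussed above.
# Intel's FSB is quad-pumped: rated FSB / 4 = base clock.
# DDR2 is double data rate: rated speed / 2 = actual memory clock.

def fsb_base_clock_mhz(rated_fsb):
    return rated_fsb / 4

def ddr2_memory_clock_mhz(rated_speed):
    return rated_speed / 2

for fsb in (800, 1066, 1333):
    print(f"FSB {fsb} -> {fsb_base_clock_mhz(fsb):.0f} MHz base clock")

for ram in (533, 667, 800, 1066):
    print(f"DDR2-{ram} -> {ddr2_memory_clock_mhz(ram):.0f} MHz memory clock")

# At a 1:1 memory ratio, DDR2-800 (400 MHz) still has headroom over the
# 333 MHz base clock of a 1333 FSB chip before the RAM even hits its rating.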
 

Ebichan

Junior Member
Jan 30, 2008
14
0
0
I'm not sure why you would need DDR2-1066 RAM for the Q6600, especially if you are not overclocking.
Unless you thought the 1066 FSB meant you needed 1066 RAM.
Even DDR2-533 RAM would have been suitable if you weren't overclocking.
 

RD48

Junior Member
Apr 4, 2008
9
0
0
Corsair 550W power supply, 2GB Kingston HyperX 1066 memory, Seagate SATA 250GB HDD, Lite-On SATA DVD-RW drive.
These are the items I currently have. I bought them when my budget was leaning me towards a Q6600 or an E8400, and I was planning on trying some overclocking.
Due to other circumstances, my budget has been cut.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
If you are overclocking, I'd strongly recommend an Intel setup.

For the price of an X2 5600+ you can get an E4500 or E4600, both of which should overclock to 3.2GHz+ (I have an E4400 @ 3.33GHz on the stock HSF). The X2 5600+ can also hit such clockspeeds, but it is some ~25% slower clock for clock, and it also runs MUCH hotter than the E4x00 chips, whether stock or overclocked.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Denithor
AMD is better on the low end if you don't plan to overclock.

Not trying to disagree or set an "Intel is superior" tone here, but most analyses I have seen that come to this conclusion are looking at the price/performance of only the front-end costs (time-zero purchase prices from the merchant) and ignoring the TCO aspect, which includes electricity costs month after month.

I have no experience with the dual-core systems of either AMD or Intel, but for quads it doesn't matter how much cheaper the Phenom is versus a Yorkfield (for my situation), because over the course of 3 years the Phenom will cost me more in extra electricity, relative to the Yorkfield, than the chip itself costs.

The irony to me is that folks who are on a true budget, such that they must buy a silly-cheap quad-core just to save $50 at time zero, are the same folks who really aren't in a position to have their power bills go up $10/month because of that quad-core system.

I'm sure the same line of thinking works at the dual-core level, albeit with the total prices and power bill differences reduced by 50% or so.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Idontcare
Originally posted by: Denithor
AMD is better on the low end if you don't plan to overclock.

Not trying to disagree or set an "Intel is superior" tone here, but most analyses I have seen that come to this conclusion are looking at the price/performance of only the front-end costs (time-zero purchase prices from the merchant) and ignoring the TCO aspect, which includes electricity costs month after month.

I have no experience with the dual-core systems of either AMD or Intel, but for quads it doesn't matter how much cheaper the Phenom is versus a Yorkfield (for my situation), because over the course of 3 years the Phenom will cost me more in extra electricity, relative to the Yorkfield, than the chip itself costs.

The irony to me is that folks who are on a true budget, such that they must buy a silly-cheap quad-core just to save $50 at time zero, are the same folks who really aren't in a position to have their power bills go up $10/month because of that quad-core system.

I'm sure the same line of thinking works at the dual-core level, albeit with the total prices and power bill differences reduced by 50% or so.

Actually, at the low end the power portion of TCO is close enough that it's not much different at all...
At even $0.10/kWh, the most you'll see is closer to $10/year, and even that is a stretch.
Even on the quads, the price of power for a single system is negligible...

Edit: To be clear, the power difference between a Phenom 9850 (the highest-power of AMD's quads) and a Q6600 (the lowest-power of Intel's quads) would cost you on average (assuming you run them 24/7 and have a decent workload at least 25% of that time) $25/year...
Of course that's not counting the memory (certainly higher-clocked memory draws more power, and if it's a Xeon then you're talking FB-DIMMs...).
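
If you want to redo that kind of estimate with your own numbers, the arithmetic is simple enough to script. A minimal sketch (Python; the 60W load delta and 20W idle delta below are purely illustrative assumptions, not measured figures):

def yearly_cost_delta(load_delta_w, idle_delta_w, load_fraction, dollars_per_kwh):
    # Average extra draw, weighted by how much of the time the box is loaded,
    # assuming it is powered on 24/7.
    avg_delta_w = load_fraction * load_delta_w + (1 - load_fraction) * idle_delta_w
    kwh_per_year = avg_delta_w * 24 * 365 / 1000
    return kwh_per_year * dollars_per_kwh

# Illustrative inputs only: ~60W apart under load, ~20W apart at idle,
# loaded 25% of the time, $0.103/kWh.
print(round(yearly_cost_delta(60, 20, 0.25, 0.103), 2))  # ~ $27/year

With those made-up inputs it lands in the same ballpark as the $25/year figure above; plug in your own wattages and tariff to see where you sit.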
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I've seen several reviews where a 9850-based system draws 50 watts more than a slightly higher performing Q6600 at load, and 10W more at idle. The 10W is probably noise, but at 50W (not even considering that in some tasks the 9850 would have to compute 25% longer) and an 8-hour full-load day, we're talking... 2.5 days to make up a kWh. Depending on where you live (some parts of the EU are at 40c/kWh), that could be as high as $4/month.

The 9300 performs on par with a Q6600 and 9850 as well, and could up the savings to $6/month. On tasks with < 4 parallel threads the 45nm dual core (E8400) will outperform the quads by a wide margin, and even on many highly parallel computations it keeps up with or beats the quads. We could be talking an $8/month savings there if you live in Germany.

The higher clocked Intel 45nm models also have power profiles comparable to a Q6600, but finish some work 20-50% faster. Now we're talking real savings, but the cost differential here is significant enough to make the comparison meaningless.

Triple the savings for workloads such as those IDC and Markfw are running (24x7 crunching), multiply by multiple machines, and you can quickly see how some Phenom models remain the lowest-cost quads. And that's all you can say for them.

If someone *needs* a quad, it's a fair assumption they're going to run it a lot. If they're buying for epeen, then it completely doesn't matter: any CPU is probably sufficient for their needs, and the monthly power usage differences are, as you concluded, noise.

Edit: one last thought. In a server farm environment, using 50 watts more at load (or 25% more total system power) also places a much higher strain on the HVAC. That power translates directly into 25% more heat dissipated from the machine. Getting it out of the building will cost additional $.

Which is why Intel can get away with pricing their higher-end offerings the way they do -- the ROI of upgrading to the next level, when you consider total system cost vs. total system power draw, is under a year. People started cluing into machine room power & HVAC issues during the days of NetBurst P4 servers -- quite ironic when you think about it.
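
For anyone who wants to verify the arithmetic in the first paragraph, a quick sketch (Python, using the same 50W / 8-hours-a-day / 40c-per-kWh figures quoted above):

delta_w = 50                                   # extra draw at load
hours_per_day = 8                              # full-load hours per day
kwh_per_day = delta_w * hours_per_day / 1000   # 0.4 kWh per day
print(1 / kwh_per_day)                         # 2.5 days to accumulate 1 kWh
print(kwh_per_day * 30 * 0.40)                 # ~$4.80/month at 40c/kWh

That is where the "roughly $4/month" figure comes from.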
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: v8envy
I've seen several reviews where a 9850-based system draws 50 watts more than a slightly higher performing Q6600 at load, and 10W more at idle. The 10W is probably noise, but at 50W (not even considering that in some tasks the 9850 would have to compute 25% longer) and an 8-hour full-load day, we're talking... 2.5 days to make up a kWh. Depending on where you live (some parts of the EU are at 40c/kWh), that could be as high as $4/month.

The 9300 performs on par with a Q6600 and 9850 as well, and could up the savings to $6/month. On tasks with < 4 parallel threads the 45nm dual core (E8400) will outperform the quads by a wide margin, and even on many highly parallel computations it keeps up with or beats the quads. We could be talking an $8/month savings there if you live in Germany.

The higher clocked Intel 45nm models also have power profiles comparable to a Q6600, but finish some work 20-50% faster. Now we're talking real savings, but the cost differential here is significant enough to make the comparison meaningless.

Triple the savings for workloads such as those IDC and Markfw are running (24x7 crunching), multiply by multiple machines, and you can quickly see how some Phenom models remain the lowest-cost quads. And that's all you can say for them.

If someone *needs* a quad, it's a fair assumption they're going to run it a lot. If they're buying for epeen, then it completely doesn't matter: any CPU is probably sufficient for their needs, and the monthly power usage differences are, as you concluded, noise.

Edit: one last thought. In a server farm environment, using 50 watts more at load (or 25% more total system power) also places a much higher strain on the HVAC. That power translates directly into 25% more heat dissipated from the machine. Getting it out of the building will cost additional $.

Which is why Intel can get away with pricing their higher-end offerings the way they do -- the ROI of upgrading to the next level, when you consider total system cost vs. total system power draw, is under a year. People started cluing into machine room power & HVAC issues during the days of NetBurst P4 servers -- quite ironic when you think about it.

1. The Q6600 is slightly better in some things and slightly worse in others.

2. I've never, ever seen a system that runs at max load for 8 hours a day, 7 days a week... have you? What was it doing? The usual is max load for maybe 1-2 hours a day, but I stipulated 25% (which is almost 4 times that).

3. I admit I was using the average power cost in the US (which is 10.3 cents/kWh):
Average Power Costs US
My assumption was based on my belief that IDC lives in the US (since he has been working for US companies).
I don't doubt that there are places in the world that charge more for power, and of course they charge more for everything else as well...

4. The 45nm quad cores you speak of can have a MUCH greater power draw (the QX9775 can draw 70W more than the Phenom under load, for example).


At the end of the day, while I'm sure there are scenarios in which you could show that just about anything is more expensive than something else, for the average user here (and the vast majority of people) the power costs are pretty much negligible...even on a quad.

Edit: BTW, while the EU is higher than the US for power, the average cost is closer to $0.12/kWh...
EU Pricing for Electricity 2007
 

Accord99

Platinum Member
Jul 2, 2001
2,259
172
106
Originally posted by: Viditor
4. The 45nm quad cores you speak of can have a MUCH greater power draw (the QX9775 can draw 70W more than the Phenom under load, for example).
That's only because the system contains two QX9775s.

The QX9650 on the other hand barely uses more power under load than the Phenom 9850 at idle.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Accord99
Originally posted by: Viditor
4. The 45nm quad cores you speak of can have a MUCH greater power draw (the QX9775 can draw 70W more than the Phenom under load, for example).
That's only because the system contains two QX9775s.

The QX9650 on the other hand barely uses more power under load than the Phenom 9850 at idle.

Well spotted, and a fair call...
But I take it that you essentially agree with me that the differential cost in power at this level is still fairly negligible?
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
If you're willing to OC a little, you can get a Celeron E1200 (~$55) + something like a P35 DS3L (~ $100), & that will compare to even an OCed X2 5600+ system.


 

bradley

Diamond Member
Jan 9, 2000
3,671
2
81
Originally posted by: Viditor
Originally posted by: Accord99

The QX9650 on the other hand barely uses more power under load than the Phenom 9850 at idle.

Well spotted, and a fair call...
But I take it that you essentially agree with me that the differential cost in power at this level is still fairly negligible?

If we're speaking about isolated CPU power consumption (while using power saving tech), then according to Lost Circuits...

Phenom 9850 (65nm, 2.5GHz) @ idle: 30W (0.012W per MHz)
Phenom 9850 (65nm, 2.5GHz) @ load: 101.2W (0.040W per MHz)

QX9770 (45nm, 3.2GHz) @ idle: 33.6W (0.011W per MHz)
QX9770 (45nm, 3.2GHz) @ load: 114.4W (0.036W per MHz)

QX9650 (45nm, 3.0GHz) @ idle: 21.2W (0.007W per MHz)
QX9650 (45nm, 3.0GHz) @ load: 64.8W (0.022W per MHz)

Phenom 9850 (65nm 2.5GHz) $235.99 Retail @ Newegg
QX9770 (45nm 3.2GHz) $1499.99 Retail @ Newegg
QX9650 (45nm 3.0GHz) $1039.99 Retail @ Newegg
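
For the curious, the W-per-MHz column is just the measured power divided by the clock speed; a quick sketch (Python) that reproduces it from the same figures, rounded to three decimals:

# Reproduce the W/MHz column from the Lost Circuits numbers quoted above.
chips = {
    "Phenom 9850 (2.5GHz)": (2500, 30.0, 101.2),   # (MHz, idle W, load W)
    "QX9770 (3.2GHz)":      (3200, 33.6, 114.4),
    "QX9650 (3.0GHz)":      (3000, 21.2, 64.8),
}
for name, (mhz, idle_w, load_w) in chips.items():
    print(f"{name}: idle {round(idle_w / mhz, 3)} W/MHz, "
          f"load {round(load_w / mhz, 3)} W/MHz")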
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: n7
If you're willing to OC a little, you can get a Celeron E1200 (~$55) + something like a P35 DS3L (~ $100), & that will compare to even an OCed X2 5600+ system.

Really?? benches?
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Cookie Monster
Originally posted by: n7
If you're willing to OC a little, you can get a Celeron E1200 (~$55) + something like a P35 DS3L (~ $100), & that will compare to even an OCed X2 5600+ system.

Really?? benches?

I have to agree that I find that very hard to believe...
Do you have any examples?

Edit: To clarify, the E1200 is 30%+ slower than even the X2 5200+ at stock...I find it very hard to believe that anything would bring it close to parity.
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: Viditor
Originally posted by: Cookie Monster
Originally posted by: n7
If you're willing to OC a little, you can get a Celeron E1200 (~$55) + something like a P35 DS3L (~ $100), & that will compare to even an OCed X2 5600+ system.

Really?? benches?

I have to agree that I find that very hard to believe...
Do you have any examples?

Edit: To clarify, the E1200 is 30%+ slower than even the X2 5200+ at stock...I find it very hard to believe that anything would bring it close to parity.

Actually, n7 is correct: a heavily overclocked E1200 @ 3GHz+ is capable of challenging an overclocked X2 @ 3GHz+.

The Celeron E is actually quite comparable IPC-wise with an X2, with gaming being the exception, where the smaller cache really hurts performance.

Check out Xbitlabs' E1200 review, where they overclocked the chip to 3.4GHz and it challenges a stock E6750 in the majority of non-gaming benchmarks. Keep in mind that an E6750 is faster than an X2 6400+, which at 3.2GHz is near the limit of K8 clockspeeds.

However, with all that being said, there are better options than an E1200 for the budget overclocker. An E2160 is only $70, and the extra 512KB L2 cache improves overall performance immensely, especially in gaming. When overclocked to 3.4GHz, it exceeds the performance of an X6800 - no X2 overclock can come close to that level of performance.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
You mean you agree with n7 ;).

The E6750 trades blows with the X2 6400+. It's marginally faster overall at stock. But claiming that the E21x0 or E1200 will hit something along the lines of 3.4GHz on stock cooling is kind of over the top. You would be running these CPUs with a high vcore, and also a 3rd-party HSF. I'd say that the E1200 would lose to the X2 5600+, especially in gaming, due to the lack of L2 cache. (It might actually hurt performance across the board.)

I mean, we all know Intel chips have a lot of OC headroom, but saying all of them can hit 3.4GHz easily (Xbitlabs had to use 1.5V vcore and a 3rd-party cooler, the Zalman 9700, which means more $$$) and that an OCed E1200 is better than an OCed X2 5600+ is a bit ridiculous, don't ya think?
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Haha, my apologies, yes I meant n7; in my defence, it's been a long day at work and I have bad eyes. :p Post edited to avoid confusion. ;)

I'm not suggesting these chips will all hit 3.4GHz easily, but 3GHz shouldn't be a problem with stock cooling; Xbitlabs hit 2.96GHz at default voltage, so a slight voltage bump will easily bring it to 3GHz. As I've already said, the Celeron E is especially weak in gaming, but in everything else it would stand up to an X2 5600+, clock for clock.

 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: harpoon84
Haha, my apologies, yes I meant n7; in my defence, it's been a long day at work and I have bad eyes. :p Post edited to avoid confusion. ;)

I'm not suggesting these chips will all hit 3.4GHz easily, but 3GHz shouldn't be a problem with stock cooling; Xbitlabs hit 2.96GHz at default voltage, so a slight voltage bump will easily bring it to 3GHz. As I've already said, the Celeron E is especially weak in gaming, but in everything else it would stand up to an X2 5600+, clock for clock.

There are still a few problems there...

1. They used an Engineering Sample E1600 chip from Intel with an unlocked multiplier...not something you can actually buy.

2. It was compared to a stock 5200+, not an overclocked 5600+...



 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Yes, they used an engineering sample.
But every E1200 will do 3 GHz.

That's not at all hard to accomplish; it doesn't take ES chips to do that.
Most will do a lot more, though yes, better cooling is needed.

Will every X2 5600+ do 3 GHz?
I'd say sure.

So let's consider the X2 5600+ basically an X2 6000+, fair enough?

Here's the X2 6000+ (& X2 5600+) vs. an E6750.
http://www.anandtech.com/cpuch...howdoc.aspx?i=3012&p=4
The X2 6000+ is often worse than the E6420 in this review, though in some benches it is close to the E6750...so it varies.

And then yes, we have the Xbit review: http://www.xbitlabs.com/articl...ron-e1200_7.html#sect1
Now since we can assume not every E1200 will do 3.4 GHz, we'd have to imagine scores a little lower.

But as you'll see, it'd be performance around the same level as the E6750 in everything but games, which basically means it'd beat the X2 5600+ (even OCed) in everything but games... granted, they also used CPU-bound resolutions, which means during actual gaming the difference would be even smaller.

I'm obviously not trying to say everyone should go out & buy E1200s.

I'm just pointing out that Intel's weakest chip, the very bottom of their barrel, can beat even the mid-range dual cores AMD is putting out.

And then you have the E2140 & E2160, which are still much cheaper, and with 1MB of L2 now will be even faster @ 3+ GHz.

If no OCing will ever happen, AMD makes some great options right now.
If OCing is going to happen, there's absolutely nothing worth buying from AMD; it's that simple.