
Comparing Apples and Oranges?

superstition

Platinum Member
To be fair to Intel, we are comparing a single core AMD processor to a dual-core processor.
This is being fair?

Frankly, the AMD LE1600 is just on the verge of not being an acceptable processor during HD playback.
Hmm. So, we're talking about a slow single core chip being compared against - what again? A much faster dual core processor?

What happened to the Celeron 440 Conroe?

the review

Also, I admittedly went through it quickly, but I don't recall seeing any emphasis on work per watt.

Since that AMD chip is rated at 45 watts and the Celeron is rated at 35 watts, it would be interesting to see how the Celeron would perform overclocked to 45 watts.
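Fwiw, "work per watt" here is just a benchmark score divided by the chip's rated TDP. A toy sketch of the comparison, where only the 45 W / 35 W TDP figures come from this thread and the benchmark scores are purely hypothetical placeholders:

```python
# Toy performance-per-watt comparison.
# TDPs (45 W Sempron LE-1600, 35 W Celeron 440) are from the thread;
# the "score" values are hypothetical, just to show the arithmetic.
chips = {
    "Sempron LE-1600": {"score": 100, "tdp_w": 45},  # hypothetical score
    "Celeron 440":     {"score": 110, "tdp_w": 35},  # hypothetical score
}

for name, c in chips.items():
    ppw = c["score"] / c["tdp_w"]
    print(f"{name}: {ppw:.2f} points per watt")
```

With numbers like these the lower-TDP chip wins on efficiency even at a similar raw score, which is why an overclocked-to-45 W Celeron run would be the interesting data point.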
 
Originally posted by: superstition
Also, I admittedly went through it quickly, but I don't recall seeing any emphasis on work per watt.

from the 'First Thoughts' section right at the end:
"The next question we will answer is performance per watt and those results are likely to lead us down another path." 🙂


 
"The next question we will answer is performance per watt and those results are likely to lead us down another path."
So the review is missing a single core vs. single core comparison as well as work per watt. What's the point, then?
 
I think the review included a single-core CPU to see how capable the 780G is for HD playback. There are purists when it comes to power consumption and noise for HTPCs, and I think the AMD platform's showing in that review is great. As we know, there is no IGP HD acceleration on the Intel front yet, and you won't be able to watch any 1080p clip with a single-core Celeron on G35.
 
As we know, there is no IGP HD acceleration on the Intel front yet, and you won't be able to watch any 1080p clip with a single-core Celeron on G35.
So that's why a dual-core was chosen? Without IGP HD acceleration the dual-core can keep up? It seems like it would be more sensible to use a less expensive lower wattage Intel CPU like the Celeron 440 with an inexpensive 3rd party PCI-e card for HD playback.
 
You're right, but that wasn't the purpose of the review. As for my opinion, with the cheapest dual-core CPUs going for $50, single-core CPUs are officially dead. And if anything, the review 'chose' a single-core Sempron to demonstrate 780G's capability.
 
Originally posted by: lopri
You're right, but that wasn't the purpose of the review. As for my opinion, with the cheapest dual-core CPUs going for $50, single-core CPUs are officially dead. And if anything, the review 'chose' a single-core Sempron to demonstrate 780G's capability.
The purpose of the review is to show that IGP HD acceleration allows a person to use a single-core AMD chip and get low power, right? Well, then... why choose an Intel dual-core setup without HD acceleration and not a single-core system with an add-on card? It seems that for people who are really concerned about heat and wattage, a single-core Intel setup with a low-power PCI-e card makes more sense than a dual core running HD brute-force.

I agree about single-core chips being dead, which also makes the focus on a single-core AMD odd to me.
 
Originally posted by: superstition
Originally posted by: lopri
You're right, but that wasn't the purpose of the review. As for my opinion, with the cheapest dual-core CPUs going for $50, single-core CPUs are officially dead. And if anything, the review 'chose' a single-core Sempron to demonstrate 780G's capability.
The purpose of the review is to show that IGP HD acceleration allows a person to use a single-core AMD chip and get low power, right? Well, then... why choose an Intel dual-core setup without HD acceleration and not a single-core system with an add-on card? It seems that for people who are really concerned about heat and wattage, a single-core Intel setup with a low-power PCI-e card makes more sense than a dual core running HD brute-force.

I agree about single-core chips being dead, which also makes the focus on a single-core AMD odd to me.

Because the add-on card is going to use more power and generate more heat than the IGP, not to mention add to the cost. Besides, which add-in card are you going to use? There are many to choose from, but this isn't a video card article; if you want to know how well cheap video cards run HD content, I suggest you read an article on that.
 
Because the add-on card is going to use more power and generate more heat than the IGP, not to mention add to the cost.
Same thing with a dual-core CPU.
Besides, which add-in card are you going to use?
Ideally, one that is inexpensive and low-wattage.
There are many to choose from, but this isn't a video card article; if you want to know how well cheap video cards run HD content, I suggest you read an article on that.
Again, what's the point of comparing a board with IGP acceleration and a single-core CPU against a dual-core setup without IGP acceleration, other than to simply show that brute-force HD processing is less energy-efficient? Why use an Intel setup at all?

Instead of spending more money on a higher-wattage, higher-priced dual-core Intel, that money could have been used to get a cheap graphics card. I've seen the GeForce 8400 GS for $28. The Radeon 2400 Pro doesn't use many watts at all.

The point, to me, would be to compare apples to apples: inexpensive, low-wattage single-core (if you really want to go there) systems with HD decoding offloaded to another chip, whether it's an IGP (AMD) or a PCI-e card (Intel).
 
The Radeon 3450 is now available on Newegg for $22.50 after rebate, which makes the IGP obsession even more questionable.
 