i3 pricing


cbn

Lifer
Mar 27, 2009
That statement has a ton of wiggle room. What's a fast dual-core? What's "enough"? Why is a "fast dual-core" equal to a "small triple-core"? And what is a "small" triple-core anyway? And what does the whole statement actually imply?

The results come from a synthetic benchmark that apparently spikes CPU utilization way up because of all the NPCs.

So if that is the toughest part of the game for CPUs, it implies the rest of the game will run even more smoothly.
 

cmdrdredd

Lifer
Dec 12, 2001
An i3 (dual-core) on a quad-CrossFire platform? You're nuts.
I'd love to see how you plan to accomplish that with the limited PCIe lanes on the P55 chipset.

It seems you still have your hard-on for dual cores... Enjoy it while it lasts.

To even manage to feed this setup you need a quad, no doubt about it. Hell, you'd need an X58 board with THREE x16 PCIe slots AND an overclocked i7. Drivers can take advantage of multiple threads, as well as games. Seriously, no P55 board could handle this kind of PCIe bandwidth.
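
For rough numbers, here is a back-of-the-envelope sketch; the per-lane figure is the standard PCIe 2.0 rate, and the lane splits are how P55 and X58 boards are commonly configured:

# Back-of-the-envelope PCIe 2.0 bandwidth per graphics card.
# PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, i.e. about
# 500 MB/s per lane per direction.
MB_PER_LANE = 500

def slot_gbps(lanes):
    """Peak one-way bandwidth of a PCIe 2.0 slot, in GB/s."""
    return lanes * MB_PER_LANE / 1000

# P55 splits the CPU's 16 lanes into x8/x8 for two cards;
# X58 can feed two full x16 slots from its 36 IOH lanes.
print("P55  x8/x8 :", slot_gbps(8), "GB/s per card")    # 4.0
print("X58 x16/x16:", slot_gbps(16), "GB/s per card")   # 8.0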
 

cmdrdredd

Lifer
Dec 12, 2001
I forgot about the limited PCI-E bandwidth.

So with two HD 5870s or a single HD 5890, we are talking about even more of a "GPU bottleneck" with triple monitors.



GPU time + CPU time = frame time, and FPS = 1/frame time <----- I think a person would like to balance these two with the least amount of waste (money, heat, energy)

It's not even a GPU bottleneck, it's PCIe bandwidth starvation. Then you have to look at the CPU to feed these cards, and as it is now even quad SLI (two GTX 295s) will be starved by anything short of an overclocked i7. There is wasted potential with lesser CPUs.

Not saying the experience totally sucks, but come on here...
 

cmdrdredd

Lifer
Dec 12, 2001
That statement has a ton of wiggle room. What's a fast dual-core? What's "enough"? Why is a "fast dual-core" equal to a "small triple-core"? And what is a "small" triple-core anyway? And what does the whole statement actually imply?

For me, it only means: if you already have a decent dual-core processor, don't rush out to buy a new one; just get a better video card if you need one. It isn't in any way meant to say (at least as far as I understood it) that dual-cores are a better deal. It's all over but the shouting: dual-cores are old news, and the way forward is quad-cores now and eventually more cores than that. We can argue in circles about whether dual-cores would have been perfectly fine, but the fact that the manufacturers are moving on has already sealed that deal.

You're right. The present is quad-core; the future is 6, 8, 12, etc. cores.

For the record, there is no such thing as good enough, and nothing is ever fast enough. You can and will always benefit from better, faster hardware.
 

cbn

Lifer
Mar 27, 2009
It's not even a GPU bottleneck, it's PCIe bandwidth starvation.

With triple monitors it is a GPU bottleneck. Heating up a quad core to 4 GHz isn't going to do much in that situation, so the smart money is on the Core i3.

P.S. LCD prices have been dropping. Right now a person can buy a thin-bezel 23" ViewSonic 1080p monitor for $119 AR. The regular price on Newegg is, I think, $149.
 

cbn

Lifer
Mar 27, 2009
Seriously, no P55 board could handle this type of PCIe bandwidth.

Cheap AMD boards can run dual x16 lanes, but unfortunately I don't think AMD has anything that can compete with the Core i3 at the moment.

We have to wait until Fusion for that, and hopefully they will be implementing some sort of switchable graphics solution around or after that time.
 

cbn

Lifer
Mar 27, 2009
You're right. The present is quad-core; the future is 6, 8, 12, etc. cores.

For the record, there is no such thing as good enough, and nothing is ever fast enough. You can and will always benefit from better, faster hardware.

Yeah, but this is just hype until gamers have practical reasons to need the hardware. It has been three years since the first quad-core was released, and only very recently have we seen any sort of optimization.

As far as this hex-core stuff goes, how long until I am able to fully utilize it? Why not buy the hardware that works today? I don't want to buy something that won't start working until three years after I buy it.
 

edplayer

Platinum Member
Sep 13, 2002
Yeah, but this is just hype until gamers have practical reasons to need the hardware. It has been three years since the first quad-core was released, and only very recently have we seen any sort of optimization.

As far as this hex-core stuff goes, how long until I am able to fully utilize it? Why not buy the hardware that works today? I don't want to buy something that won't start working until three years after I buy it.


Gamers never need the hardware; some want it. You can almost always adjust the settings so a game runs at an acceptable frame rate.

You know that getting better FPS costs more money, and the higher you go, the lower the return on your money. That is the way it has always been with computing. You know there are lots of games that show gains from switching to a quad core, and it has been that way for a while, not just recently. Yet you continue to post this BS here.

If you spent a third of the time you've used posting about the i3 working extra, you could have paid for the difference between the i3 and the i5, a better HSF, and a bigger PSU.
 

Voo

Golden Member
Feb 27, 2009
That is going to be in a crappy PSU though

Part of the 80 Plus certification is being 80% or more efficient at 20%, 50%, and 100% load.

Just guessing, but I think in most of the PSUs that hobbyists consider "good/excellent", you will find a difference of around 3% or less.
Well, just taking the first AT review (yep, I'm lazy), we end up with this:
http://www.anandtech.com/casecoolingpsus/showdoc.aspx?i=3516&p=28

I'm not sure where you get your info, but looking at the graph, the difference between 20% and 50% load is measurable, don't you think?
How big the difference is depends on the PSU, but you can't say the tested PSUs were the cheapest stuff they could find :p
You may only need 80% efficiency to get an 80 Plus certificate, but a unit can still be 90% efficient at 80% load and only 80% at 20%.

Efficiency isn't a constant factor - you usually get the best efficiency somewhere between 60% and 80% load... therefore, if you care about it, you'd better buy an appropriately sized PSU.
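
To put rough numbers on that, here is a minimal sketch in Python with made-up efficiency points, not figures from any particular review:

# Wall draw for a fixed DC load under two hypothetical efficiency
# curves. The percentages are illustrative, not measured values.
def wall_watts(dc_load, efficiency):
    """AC power drawn from the wall to deliver dc_load watts DC."""
    return dc_load / efficiency

flat_curve   = {0.20: 0.87, 0.50: 0.88}  # barely varies with load
droopy_curve = {0.20: 0.80, 0.50: 0.88}  # sags at light load

# A 650 W unit at a ~130 W desktop-idle load (20% of capacity):
for name, curve in (("flat", flat_curve), ("droopy", droopy_curve)):
    print(f"{name}: {wall_watts(130, curve[0.20]):.0f} W at the wall")
# flat: ~149 W, droopy: ~163 W -> the curve shape costs ~14 W at idle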
 

edplayer

Platinum Member
Sep 13, 2002
Check out jonnyguru.com and tell me what you think.

I never said it wasn't measurable, just small. I'll change my stance somewhat: around a 3% difference for excellent-quality units, and around 5% for what most of us here would consider a "good/very good" PSU.

If you find a PSU good enough to score 90% at 80% load, I seriously doubt it would be 80% at 20% load.


Look at this review on the Seasonic X-650:

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story3&reid=169

An 82% load scored 90.2% efficiency, while a 20% load scored 90.5% efficiency. And there are not a lot of PSUs that can score 90% to begin with, so this should be easy to check.


And regarding best efficiency ranges, I don't see them at 60-80%. Looks to me like it is usually between 30% and 50%.
 

Voo

Golden Member
Feb 27, 2009
Check out jonnyguru.com and tell me what you think.

I never said it wasn't measurable, just small. I'll change my stance somewhat: around a 3% difference for excellent-quality units, and around 5% for what most of us here would consider a "good/very good" PSU.

If you find a PSU good enough to score 90% at 80% load, I seriously doubt it would be 80% at 20% load.
So you think the ones tested by AT are all "bad" PSUs? At least the Seasonic got an 80 Plus Gold certificate, so it can't be that bad, but it still drops quite a bit.

I'm not saying it's true for every PSU, but it's rather usual, at least...
 

Zoomer

Senior member
Dec 1, 1999
Except that a PSU efficiency curve typically reaches max efficiency in the high middle of its available power range, around ~65% load.

If you have a machine that takes 200 watts max, buying a high-efficiency 350-watt PSU would be ideal in terms of total power consumption.

If it's constantly in the bottom or top 15% of its available power range, then you're operating at a lower level of efficiency.

In other news, operating on 220 V power is more efficient for PSUs than 110 V.

If my electric stove gets 220 V, why can't more 220 V sockets be installed? I'll probably end up using that for my computer and other components with switched supplies. 220 V at 60 Hz would probably be more efficient than the 220 V at 50 Hz Europe uses. :)
 

edplayer

Platinum Member
Sep 13, 2002
So you think the ones tested by AT are all "bad" PSUs?


No

But when I want to learn about PSUs, I will go to whoever I think has the most knowledge about them.

Go over to the power supplies subforum and ask who they think knows the most about PSUs. Try looking at the stickies before you do (and who wrote them)...
 

jvroig

Platinum Member
Nov 4, 2009
You may only need 80% efficiency to get an 80 Plus certificate, but a unit can still be 90% efficient at 80% load and only 80% at 20%.

I get what you mean now. I'm still of the opinion that it won't matter much, since you're getting the promised 80% efficiency either way; it's just that you may get higher efficiency at ideal loading. But I certainly see your point, and I understand how you and some other people would find it a valid concern, since it might net you a couple more watts saved.

I just wouldn't want to do it myself, for upgrade reasons. If I buy a power supply that is ideal for my needs right now, 1 or 2 years later I might decide to get a faster, more power-hungry video card. I'd rather buy one reasonably large, high-quality, efficient PSU that can serve me well for at least 5 years, keeping component upgrades in mind, so that I don't have to replace it as often.

Good point, though, and something I failed to consider initially. Thanks for bringing it up. It certainly added knowledge to this discussion for people like me who fail to consider the varying efficiency at different loads.
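
If anyone wants to put numbers on that sizing logic, here is a tiny sketch; the wattages are assumptions for illustration, not measurements:

# Rough PSU sizing with headroom for a future video card upgrade.
current_draw = 250          # W, assumed full-load system draw today
future_gpu_delta = 100      # W, assumed headroom for a hungrier card
target_load_fraction = 0.6  # land near the typical efficiency sweet spot

needed = (current_draw + future_gpu_delta) / target_load_fraction
print(f"Shop for roughly a {needed:.0f} W unit")  # ~583 W -> a 600-650 W PSU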
 

edplayer

Platinum Member
Sep 13, 2002
So you think the ones tested by AT are all "bad" PSUs?


I might have misunderstood what you said.

Anand's results actually are very similar to jonnyguru's. The chart you posted has this note:


Note:
The first column is 10% load, the second 20%, the third 50%, and the last column represents 100% load.


I would disregard the first column (the 10% load), as you can see they all perform substantially worse at that small a load. And as you already posted:


"therefore if you care about it, you better buy an appropriate PSU. "


Those results are in line with my 3% and 5% claims.
 

cbn

Lifer
Mar 27, 2009
If you spent a third of the time you've used posting about the i3 working extra, you could have paid for the difference between the i3 and the i5, a better HSF, and a bigger PSU.

I don't want a Core i5 750, a tower cooler, and a bigger PSU pushing another 100 watts of heat through the case. That's like saying if I worked harder I could afford to light money on fire. Why do this?
 

cbn

Lifer
Mar 27, 2009
Gamers never need the hardware; some want it. You can almost always adjust the settings so a game runs at an acceptable frame rate.

You know that getting better FPS costs more money, and the higher you go, the lower the return on your money. That is the way it has always been with computing. You know there are lots of games that show gains from switching to a quad core, and it has been that way for a while, not just recently. Yet you continue to post this BS here.

There are not a lot of games showing any practical advantage of a quad core over a dual core. For example, Modern Warfare 2 plays completely fine on an old dual core.

When a person is building a gaming machine, usually they are trying to balance the video card cost against the CPU cost. Obviously this is different from building something strictly for benchmarking (where cost is no object).
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
I don't want a Core i5 750, a tower cooler, and a bigger PSU pushing another 100 watts of heat through the case. That's like saying if I worked harder I could afford to light money on fire. Why do this?

It's not like 100 watts; I will find a link, but it's more like 20.
 

edplayer

Platinum Member
Sep 13, 2002
I don't want a Core i5 750, a tower cooler, and a bigger PSU pushing another 100 watts of heat through the case. That's like saying if I worked harder I could afford to light money on fire. Why do this?


Why make up stuff that has nothing to do with this?

The i5 won't use 100 more watts.

A tower cooler won't affect power consumption; it will allow better transfer of heat away from the CPU. You don't need one.

Bigger PSU? Come on! In the other thread you acknowledged that it won't matter once you get down to cheap, good-quality PSUs. You can't find smaller units of good build quality for a lower price, so why do you PRETEND this is an issue?

Everyone can see now that you are FIXATED on the i3.

Nothing wrong with that, but stop with the silly arguments. The only one that could possibly have some merit is that an i3 on the stock HSF might overclock better than an i5 on the stock HSF. That has yet to be proven, but I suspect it is true.

Since the i3 should use less power, it will expel less heat into the room. How much is the difference? Take a smallish 10x10 bedroom, which is pretty common. How much warmer will an i5 make the room versus an i3?
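
For ballpark physics on that question (my own illustrative numbers, assuming a sealed, unventilated room and ignoring heat soaked up by walls and furniture):

# Temperature rise from ~50 W of extra heat in a 10 ft x 10 ft x 8 ft room.
room_volume_m3 = 10 * 10 * 8 * 0.0283168   # 800 ft^3 ~= 22.7 m^3
air_density = 1.2                          # kg/m^3
cp_air = 1005                              # J/(kg*K), specific heat of air
extra_watts = 50                           # assumed i5-vs-i3 difference

air_mass = room_volume_m3 * air_density            # ~27 kg of air
delta_t_per_hour = extra_watts * 3600 / (air_mass * cp_air)
print(f"~{delta_t_per_hour:.1f} C per hour")       # ~6.6 C/h upper bound
# Real rooms leak heat, so the steady-state rise is far smaller.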

First you claimed that GTA4 was the only game that could utilize four cores, then you changed your stance to no game being able to utilize four cores. Now you claim that for "practical use" dual cores are enough. I said that duals are enough, and if they are struggling, you can lower in-game settings. You change your argument every hour. MANY benchmarks have proven that there is a difference across many different games. Is that worth a premium to you? Nobody here cares...


I'm talking about a 4 GHz Core i5 750.

It still won't be a 100-watt difference unless you are comparing it to an underclocked i3.
 
Dec 30, 2004
If my electric stove gets 220 V, why can't more 220 V sockets be installed? I'll probably end up using that for my computer and other components with switched supplies. 220 V at 60 Hz would probably be more efficient than the 220 V at 50 Hz Europe uses. :)

Actually, it's the other way around. The higher the frequency, the more energy you lose to magnetizing the core each cycle, which resists the next 180 degrees of current. Losses from the windings' internal resistance are minimal. This can be improved with higher-quality/different core materials, but those are of course more expensive.
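
For reference, the textbook core-loss relations behind that claim (standard Steinmetz/eddy-current forms; the constants $k_h$, $k_e$ and exponent $n$ depend on the core material):

$$P_{\text{hyst}} \approx k_h \, f \, B_{\max}^{\,n}, \qquad P_{\text{eddy}} \approx k_e \, f^2 B_{\max}^{2}, \qquad n \approx 1.6\text{--}2$$

with $B_{\max} \approx V_{\text{rms}} / (4.44 \, f N A)$ for a sine-driven transformer, so at a fixed voltage the flux density itself falls as frequency rises, which complicates the picture.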
 

cbn

Lifer
Mar 27, 2009
First you claimed that GTA4 was the only game that could utilize four cores, then you changed your stance to no game being able to utilize four cores.

I never said there wasn't a game that could utilize four cores. I said that if the GPU(s) become a bottleneck with multi-monitor setups, any extra gains from better quad-core CPU scaling could be drowned out by the longer GPU time. This would result in negligible FPS improvement.

Frame time = GPU time + CPU time, and FPS = 1 / frame time.
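
To illustrate with made-up numbers (a toy model that assumes CPU and GPU work are fully serialized per frame):

# Toy frame-time model: when GPU time dominates, shrinking CPU time
# barely moves FPS. All millisecond figures are illustrative.
def fps(cpu_ms, gpu_ms):
    """Frames per second if CPU and GPU work happen back to back."""
    return 1000.0 / (cpu_ms + gpu_ms)

# GPU-bound triple-monitor-ish case: 30 ms of GPU work per frame.
print(fps(cpu_ms=8, gpu_ms=30))   # ~26.3 FPS with a slower CPU
print(fps(cpu_ms=4, gpu_ms=30))   # ~29.4 FPS after halving CPU time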