CPU to GPU Cost Ratio - What should it be?

adamkavon

Junior Member
Nov 17, 2009
19
0
0
So, I'm in the market for a new computer, but I'm on a pretty strict budget. I want the entire thing to come in under $900 (including monitor) and I've allotted around $400 for CPU and GPU. I have another "productivity" machine, so this would be PURELY for gaming. Most of that gaming will be World of Warcraft. Thus, my question:

How much more should I spend on the GPU over CPU? Or should they be equal?

CPU = GPU -- For example, if I get the Core i5 750 ($200), then I'd be looking at the HD 4890 ($200).
CPU < GPU -- If I get a Phenom II X4 940 ($160), then I could get the HD 4850 X2 ($230).
CPU << GPU -- It's even possible for me to go into the $100 CPU range (Athlon X4 620 or Phenom X2 555) and get an HD 5850 ($300).

I want there to be NO bottleneck. I don't want a CPU whose power is wasted because the GPU can't keep up, and I don't want a GPU that's being CPU-limited. Suggestions?
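Here's a quick back-of-the-envelope sketch of those three options against the $400 ceiling (Python, purely illustrative; the prices are just the ballpark figures quoted above, not current street prices):

```python
# Quick sanity check of the three CPU/GPU splits above against the ~$400 budget.
# Prices are the rough figures quoted in this post, not current street prices.
BUDGET = 400

options = [
    ("CPU = GPU",  ("Core i5 750", 200),      ("HD 4890", 200)),
    ("CPU < GPU",  ("Phenom II X4 940", 160), ("HD 4850 X2", 230)),
    ("CPU << GPU", ("Athlon X4 620", 100),    ("HD 5850", 300)),
]

for label, (cpu, cpu_price), (gpu, gpu_price) in options:
    total = cpu_price + gpu_price
    ratio = gpu_price / cpu_price
    status = "fits" if total <= BUDGET else f"${total - BUDGET} over"
    print(f"{label:10} {cpu} + {gpu}: ${total} total, "
          f"GPU:CPU price ratio {ratio:.1f}:1 ({status})")
```

All three land at roughly the same total; the real question is which ratio buys the most frame rate.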
 

Indus

Lifer
May 11, 2002
15,288
10,716
136
I'd say roughly $170 on the CPU and $230 on the GPU is a good split.

Edit: I'm not biased... I'm actually an Intel buyer and go with a 170/230 split, but right now that ratio seems to favor AMD, especially with the expensive Intel motherboards priced into the mix.
 
Last edited:

adamkavon

Junior Member
Nov 17, 2009
19
0
0
Thanks, Indus. It seems that spending $50 to $100 more on the GPU than the CPU is the "sweet spot," but I'm hoping for some confirmation from people.

And my past 3 builds have been Intel, but right now AMD low- and mid-range boards are sooo much cheaper it's hard to justify not switching.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Your CPU should be a lower priority than your video card. I would say get a Phenom II X4 925 for ~$140.

For the video card, I would say the ATI 4890 for ~$260 would be your best bet, though I would rather go with the 5850 for ~$300 if at all possible.

You want a quad-core processor for sure. After that, ATI generally offers the best bang for the dollar.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
There will always be some application that is either CPU- or GPU-limited. WoW, for example, is heavily dependent on the CPU, whereas most other games are GPU-limited.

We're in a video hardware drought at the moment, which limits your choices. If I were in your shoes I'd pick the i5 750 + 5770: DX11, 4870-level performance at low resolutions (I presume you already realize an ultra-budget box isn't going to drive 1920x1200 or higher), and savings on power over the years. I've also seen great deals on a GTX 260 ($130 and under), which would also be a good match for your application. I'd pick a 4870 or GTX 260 at $130, or even a 5770 at $170, over a 4890 at $200 -- stepping up to a 4890 would not let you go up a whole resolution.
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
Thanks, v8. I know WoW loves CPU power, but the game came out 5 years ago, so I feel like even the lowest-end CPU should be good enough, eh? As for resolution, I'm aiming for 1080p. Not the highest, not the lowest, but comfy on a 22" monitor.

Most of this might be made moot in Jan/Feb when Intel and Nvidia both release their new wares:

Core i3 will (presumably) be very fast dual-core processors for ~$125, and Fermi should pump out better graphics than the 5750/5770 for a similar price. What do you guys think?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
First of all, expansions and updates to WoW have been trickling out since the game's release. Like every other MMO developer, Blizzard knows people upgrade their machines and enjoy eye candy. My favorite MMO, Eve Online, has had vastly increased hardware requirements to go along with vastly improved graphics over the years. While the original 2003 release would run great on a Ti 4200 and an XP 1600+, the 2010 version will not.

Second, unlike FPSes, high frame rates aren't required to play MMOs. The designers might make a conscious choice that 10 fps is "playable" for someone on extremely low-end hardware in a raid setting. A 60 fps purist would want a lot more firepower than the bare minimum -- what is your tolerance for single-digit frame rates?

If you can build in the future, then by all means build in the future. Your options will be better and the amount of hardware each dollar buys is likely to be larger. You probably won't see entry-level Fermi boards until well into 2010 (like Q2 or Q3), definitely not Jan/Feb. Some specs for the high-end product (likely to retail for well, well over $300) have been announced, but there's been very little info on budget parts. I try to build in late February myself -- I've found that's when hardware pricing is most favorable.
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
I know Blizz has updated things and will do so even more with Cataclysm. In fact, the new models are stunning compared to the old ones. As for my tolerance of low frame rates: I abhor them.

I don't understand people who call anything sub-30 "playable." I'd much rather drop video quality to "low" and get 30+ FPS than run everything at "high" and get ~10 FPS.

What I'm hoping to do is find a nice middle ground where I get ~30 to ~40 FPS (during a raid) with everything on "high." I imagine it's completely doable for under a grand, but I'm trying to minmax my hardware. :D Guess my WoW-attitude has carried over into my computer-building.

As for waiting, as long as I get something new before Cataclysm comes out, I'm fine. My current gaming machine is about to croak, but there's no sense in buying something new now when I have nothing new to play on it.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,044
3,524
126
I want there to be NO bottleneck. I don't want a CPU whose power is wasted because the GPU can't keep up, and I don't want a GPU that's being CPU-limited. Suggestions?

Dude keep dreaming then...

GPUs have been shown to scale up to a 4 GHz i7 and beyond.

Someone on this forum said it best: the CPU determines your max frame rate in relation to your GPU.

Your GPU holds your minimum frame rate in relation to its power and your resolution.

How about worrying less about the bottleneck relationship and more about the practical one:
WHAT resolution monitor are you running, and what's the most intensive game you intend to play?

If you're running 1024x768, then even a 9600 GT would be overkill.
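If it helps, here's a rough sketch of that max/min idea (a rule-of-thumb model only; the frame-rate numbers are made up for illustration, not benchmarks of any real parts):

```python
# Rule-of-thumb model: the frame rate you actually see is capped by whichever
# component runs out of headroom first. All numbers below are invented purely
# for illustration; they are not measurements of any real CPU or GPU.
def effective_fps(cpu_fps_ceiling, gpu_fps_at_resolution):
    """The CPU sets the ceiling; the GPU (at your resolution) decides how
    close you get to it and is usually what sags first in heavy scenes."""
    return min(cpu_fps_ceiling, gpu_fps_at_resolution)

cpu_ceiling = 70  # hypothetical mid-range CPU that can feed ~70 fps
for gpu_name, gpu_fps in [("weak GPU", 35), ("balanced GPU", 65), ("overkill GPU", 140)]:
    limit = "GPU-limited" if gpu_fps < cpu_ceiling else "CPU-limited"
    print(f"{gpu_name}: ~{effective_fps(cpu_ceiling, gpu_fps)} fps ({limit})")
```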
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
You should be aware there are two major factors affecting WoW fps: your system (CPU/RAM/GPU, in that order) and your network latency. I generally play on an E8400/GTX 260 setup and it's smooth as silk at 1920x1200 with everything turned to high in nearly all zones, except for Wintergrasp and sometimes Dalaran, depending on population. Raids are no problem (even large ones - no hiccups at all, even with lots of shit going on on-screen). The only times I dip into low fps (teens) are really when the network bogs down.

I was on ClearWire for a while and nearly quit playing due to horrible latency (averaged 300-500 ping and it would spike occasionally into the 1000+ region).

If I were building a system for WoW right now I would pick a Phenom II X4, 4 GB/8 GB DDR3 (32-bit or 64-bit OS?) and probably a cheap GTX 260 (or a 4870/5770 if in a similar price range).
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
@aigomorla

I like that: CPU = max frame rate and GPU = min frame rate. I want to get a 22" 1080p monitor, so the resolution I'd be running is 1920x1080. I don't care if my game is hitting 120 fps (because no human can see all those frames), but I definitely don't want it to dip under ~30 to ~40 fps.

The most intense game? Probably BioShock 2. But like I said in the original post, most of my gaming will be WoW (and SCII and DIII, when they're released). I guess having NO bottleneck can't happen in the real world, but since GPU performance scales with CPU power, I want to spend as much as I can on my GPU without sacrificing so much CPU that there's no gain.
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
@Denithor

Thank you! Information like that is exactly what I was looking for! With my budget it's looking like 4 GB of DDR3 1600 RAM and Windows 7 Ultimate (64-bit).

Sadly, I've got Charter internet and it's HORRENDOUS, but there's a good chance I'll be moving soon.
 

SunSamurai

Diamond Member
Jan 16, 2005
3,914
0
0
I would say 3:1.

A $60 CPU won't be a bottleneck for a $180 GPU.
A $100 CPU won't be a bottleneck for a $300 GPU.
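Taken literally, that ratio makes the split for any combined CPU+GPU budget a one-liner (just a sketch of the rule of thumb, not a guarantee against bottlenecks):

```python
# Splitting a combined CPU+GPU budget at a fixed GPU:CPU price ratio.
# The 3:1 figure is only the rule of thumb above, not a measured result.
def split_budget(total, gpu_to_cpu_ratio=3.0):
    cpu = total / (1 + gpu_to_cpu_ratio)
    gpu = total - cpu
    return cpu, gpu

for total in (240, 400):
    cpu, gpu = split_budget(total)
    print(f"${total} total -> ${cpu:.0f} CPU + ${gpu:.0f} GPU")
```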
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
Films run at roughly 24 fps (23.976 when transferred to NTSC video) and appear to be constant motion. If you figure you can perceive differences above that, let's just say 2x that = ~48 FPS. Getting a stable frame rate around there is all I need.

60 FPS might look a little smoother, but paying $100 more for it? Not worth it.
 

SunSamurai

Diamond Member
Jan 16, 2005
3,914
0
0
Yes, that's all YOU need, and that's fine, but don't say silly things like it's not possible for humans to see the difference. That's just silly.

And yes, I'd rather have a stable 60 fps than a 30-120 fps variation. Most people would, but that has nothing to do with your reasoning.

(Some) films running at 23.976 fps has nothing to do with anything.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
For the video card, I would say the ATI 4890 for ~$260 would be your best bet, though I would rather go with the 5850 for ~$300 if at all possible.


I wish I could find people to buy 4890s for $260 each. I would quit my job and sell them all day long.
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
@sunsamurai

No need to get all testy. First of all, film runs at 24 fps (23.976 when transferred to NTSC video); NTSC (US television) is 29.97 fps. That's the low-end threshold. As for the high end, MOST people can't tell the difference between 100 FPS and 200 FPS, just as most people can't tell the difference between a .wav from a CD and a 320 kbps .mp3. Some people can. I'm sorry I've offended you and your supernatural vision.

I came here for some help, but I guess that's too much to ask...
 

adamkavon

Junior Member
Nov 17, 2009
19
0
0
@edplayer

I live in Los Angeles. Never been to one, but I'll Google it later. Gotta go teach some kids right now! Thanks to everyone thus far, btw!
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,044
3,524
126
Bullshit. I can tell the difference between 100 and 120 FPS, just as I can tell the difference between 50 and 60 FPS.

u must be superman.

Sorry bro, but physiology says that on average your eyes can only see 60 fps.
If you're very good, and trained, you can go as far as 90 fps.

100 fps, and your name must be Clark Kent.

@OP: if you're a gamer, get the most expensive, highest-end video card your wallet can afford.
At 24-inch-class resolutions, which is near what you're running, you don't want a mid-tier card or lower.

So I would go all out on the GPU, and then try to get the fastest processor you can overclock.

Gamers should put their money into the video card, while crunchers put it into the CPU.
 

edplayer

Platinum Member
Sep 13, 2002
2,186
0
0
@edplayer

I live in Los Angeles. Never been to one, but I'll Google it later. Gotta go teach some kids right now! Thanks to everyone thus far, btw!


There is one in Tustin. A little drive for you, but the i5 750 is $150 there vs. $200 at most other stores.
 

SunSamurai

Diamond Member
Jan 16, 2005
3,914
0
0
u must be superman.

Sorry bro, but physiology says that on average your eyes can only see 60 fps.
If you're very good, and trained, you can go as far as 90 fps.

100 fps, and your name must be Clark Kent.

This is why you're the Cases and Cooling Moderator, not the Video Cards and Graphics Moderator. :hmm:

So sorry, 'bro', but you just pulled that information out of your hat. Anyone that's played an FPS on a monitor that can handle a high refresh rate can tell you there is a CLEAR difference between 60 and 120 fps.

This kind of discussion is two decades old, going back to the PvP Quake forums. People have gone from "the human eye can't see past 12 fps" to 16, on and on up to whatever number, and all they are really saying is "I don't know anything about human physiology and I am the poster boy for pseudo-intellectualism."

Why do you think DX10 implemented motion blur? Because people are fine with 24-60 fps? No. Because perceived motion depends on the sharpness of the visuals: the sharper the image, the easier it is to see frame-rate differences. If a game feels smoother at lower frame rates, you get more people to buy your game and more developers happy with your OS.

There is more to frame rate than "lol u must b suparman". There is actual science behind it all over the web that contradicts you. Anyone that has taken photography or studied video media in general learns this. It's more complex than that.

By the way, 24 fps became a standard for reasons that have nothing to do with BS like "the eye cannot perceive more." The main reasons were COST and what was AVAILABLE. Crazy, I know.

Before trying to berate someone when you have no basis yourself, think about something obvious for a moment and you might see how foolish it is to say such a thing: what do you suppose the frame rate of reality is? Because I, for one, think it's far more than 24 or 100, and yet I can see perfectly fine.

http://www.100fps.com/how_many_frames_can_humans_see.htm
http://beagle.colorado.edu/courses/3280/lectures/class14-1.html
 
Last edited:

SunSamurai

Diamond Member
Jan 16, 2005
3,914
0
0
@sunsamurai

No need to get all testy. First of all, film runs at 24 fps (23.976 when transferred to NTSC video); NTSC (US television) is 29.97 fps. That's the low-end threshold. As for the high end, MOST people can't tell the difference between 100 FPS and 200 FPS, just as most people can't tell the difference between a .wav from a CD and a 320 kbps .mp3. Some people can. I'm sorry I've offended you and your supernatural vision.

I came here for some help, but I guess that's too much to ask...

No need for you to say things that are wholly untrue, either.

You don't need supernatural vision to be able to see it. How silly.


Back when digital cameras started to come out, we had hordes of people saying the human eye can't see past 2 MP of detail. How dumb is that? It shows a complete lack of understanding. But I guess it's not helpful to tell you when you're wrong. People get all hurt and victimized.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I would say 3:1.

A $60 CPU won't be a bottleneck for a $180 GPU.
A $100 CPU won't be a bottleneck for a $300 GPU.

Bottlenecking is determined by the type of game, the resolution, and the quality settings applied, not just pricing. You can pair a $60 E5300 with a $180 4890 and you will still get subpar performance in GTA4, since it generally needs a quad core to run smoothly. Similarly, if you game at 1024x768 with 0AA on a $100 AMD Athlon II X4 620 with a $300 5850, then you will be bottlenecked in almost every modern shooter because of the CPU. However, try playing Crysis at 2560x1600 with 8AA on the same system and the video card will take you into the teens, making it the limiting component.

There is no hard rule. It depends.
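To make "it depends" concrete, here's a toy comparison (the frame-rate figures are invented for illustration; only the way the limit flips with the scenario matters):

```python
# Toy illustration of how the limiting component flips with the scenario.
# All frame-rate figures are invented for illustration only.
def limiter(cpu_fps, gpu_fps):
    return "CPU-limited" if cpu_fps < gpu_fps else "GPU-limited"

scenarios = [
    # (description,                              cpu_fps, gpu_fps) -- hypothetical
    ("budget quad + 5850, 1024x768 0AA shooter",      60,     200),
    ("same system, Crysis 2560x1600 8AA",             60,      15),
]
for desc, cpu_fps, gpu_fps in scenarios:
    print(f"{desc}: ~{min(cpu_fps, gpu_fps)} fps, {limiter(cpu_fps, gpu_fps)}")
```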

Right now a Core i5 750 or a Phenom X4 provides a good starting point, but they are far from the best bang for the buck for gaming. Even a Core 2 Duo at 3.0-3.4 GHz or an overclocked X3 with a single graphics card will be a decent CPU for a lot of games, but not for the few games that take advantage of quad cores (ArmA 2, GTA4, Resident Evil 5, etc.).

However, one thing is certain -- an overclocked CPU can outlast 2-3 video card generations, and a $500 graphics card today will cost about half that or less in 12 months.
 
Last edited: