CPU to GPU cost Ratio - What should it be?


aigomorla

CPU, Cases & Cooling Mod | PC Gaming Mod | Elite Member
Super Moderator
Sep 28, 2005
This is why you're the Cases & Cooling Moderator, not the Video Cards and Graphics Moderator. :hmm:

So sorry, 'bro', but you just pulled that information out of your hat. Anyone that's played an FPS game on a monitor that can handle a high refresh rate can tell you there is a CLEAR difference between 60 and 120fps.

This kind of discussion is two decades old, going back to the PvP Quake forums. People have gone from the human eye not being able to see 12, 16, on and on up to whatever number, and all they are really saying is "I don't know anything about human physiology and I am the poster-boy for pseudo-intellectualism".

Why do you think DX10 implemented motion blur? Because people are fine with 24-60fps? No. Because perceived motion is dependent on the sharpness of the visual: the sharper it is, the easier it is to see frame-rate differences. If a game feels smoother at lower framerates, you get more people to buy your game and more developers happy with your OS.

There is more to framerate than "lol u must b suparman". There is actual science behind it all over the web that contradicts you. Anyone that has taken photography or studied video media in general learns this. It's more complex.

By the way, 24fps is a standard for reasons that have nothing to do with BS like "the eye cannot perceive more." The main reasons were COST and what was AVAILABLE. Crazy, I know.

Before trying to berate someone with no basis yourself, think of something obvious; indeed, think for a moment and you might see how foolish it is to say such a thing: what do you suppose the frame rate of reality is? Because I, for one, think it's far more than 24 or 100, and yet I can see perfectly fine.

http://www.100fps.com/how_many_frames_can_humans_see.htm
http://beagle.colorado.edu/courses/3280/lectures/class14-1.html

I said OVER 60fps in games. From 1-60fps, I agree you can see the difference, but AFTER 60fps you don't notice jack in GAMES.

Not in full-motion Blu-ray films.

I don't know about you, but I don't see jack squat past 100fps in ANY game I play.

Even the high-end ones with full-on Quadfire enabled.

The OP is clearly talking about GAMES. WoW in particular.

So you wanna tell me you see more detail at 150fps in WoW vs. 60fps?
Have you even played WoW?
 

Denithor

Diamond Member
Apr 11, 2004
However, one thing is certain -- an overclocked CPU can outlast 2-3 video card generations, and a $500 graphics card today will cost about half that or less in 12 months.

Which is exactly why I made the recommendation I did - PhII X4 paired with a $130-150 GTX 260/4870-ish level card. OC the PhII and you're good to go in >90% of today's games. In a year or so upgrade the GPU alone (5870 should be $150 by then) and keep cruising.
 

v8envy

Platinum Member
Sep 7, 2002
Aigo, do this. Find a 60Hz LCD. Now move your mouse in a circle. Move it faster, in a larger circle. See how at some point it starts to *JUMP* between positions? Congratulations, you're seeing faster than 60fps.
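Rough numbers behind that jump effect (a hypothetical sketch of my own; the pointer speed is just an assumed figure for illustration):

```python
def pointer_jump_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Pixels the pointer moves between two consecutive screen refreshes."""
    return speed_px_per_s / refresh_hz

for hz in (60, 120, 240):
    # Assume a quick swipe of roughly 3000 px/s across the screen.
    print(f"{hz:>3} Hz: ~{pointer_jump_px(3000, hz):.0f} px between frames")
# At 60 Hz the pointer skips ~50 px per refresh, which is why it reads as
# discrete after-images rather than a continuous streak.
```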

There is a whole site devoted to debunking the "24 fps" myth. One of their examples was: some fighter pilots can reliably recognize images flashed for 1/500th of a second -- vision at 500 fps!

Do not confuse the high persistence of motion-blurred film with the maximum activation rate of your rods and cones. You don't have to be like my kids, who can spot a rabbit in brown grass from three blocks away, to exceed the limits of low-end gaming.

Considering the low, low prices of midrange computer hardware, it's not very bright to try to save $50 out of a $600-1000 build (storage, OS, memory, case, PSU, optical) by getting the cheapest CPU (or graphics card) you can get your hands on. Upper midrange is where you need to look for both, and screw the ratios.

i7, 4870, and skip a lunch or three if that's what it takes to move up from bargain basement dual core and entry level video.
 

Mothergoose729

Senior member
Mar 21, 2009
It is more about price points than a ratio. Either way, a 5870 and an Athlon II will perform better than an i7 975 and an 8800GT. If buying a cheap CPU allows you to step up from a 4850 to a 4870, then you should do it. If you aren't going to gain more than 20% performance, then get the better CPU, if that makes sense.
 

Swivelguy2

Member
Sep 9, 2009
24 fps on film looks okay because each frame contains all of the motion that took place during that frame. In order to replicate that effect in a computer game, you'd have to render about 500 frames per second, then blur 20 consecutive frames together to make each 1/25th-of-a-second frame. It would look "smooth," but wouldn't look very pretty when the camera is moving. In fact, next time you're in a movie theater, watch what happens when the camera pans: it looks like crap. You can't make out any details.
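Here's a minimal sketch of that averaging idea (purely illustrative; render_frame() is a made-up stand-in, not any real engine API):

```python
import numpy as np

SUB_FPS = 500                 # internal render rate assumed in the post
OUT_FPS = 25                  # film-like output frame rate
GROUP = SUB_FPS // OUT_FPS    # 20 sub-frames blended into each output frame

def render_frame(t: float) -> np.ndarray:
    """Hypothetical renderer: returns an image for time t (just noise here)."""
    rng = np.random.default_rng(int(t * 1e6))
    return rng.random((90, 160, 3))

def blurred_frame(i: int) -> np.ndarray:
    """Average GROUP sub-frames spanning one output-frame interval."""
    t0 = i / OUT_FPS
    subs = [render_frame(t0 + k / SUB_FPS) for k in range(GROUP)]
    return np.mean(subs, axis=0)   # temporal box filter = crude motion blur

print(blurred_frame(0).shape)      # (90, 160, 3)
```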

Sure, a fighter pilot can identify an image shown for 1/500th of a second, because his or her retina is still storing the image for longer than that, and his rods and cones are still wiggling his optic nerve. In this case, the reason he can see that image is precisely because his vision is slow to adjust, not in spite of it. Can he distinguish 5 images consecutively flashed for 1/500th of a second each? No, he'll tell you it was 1 image.

When someone tells you they want to see higher than 60 fps, they're correct, and it can be for one of many reasons:

1. They've got superhuman vision
2. They really like things to be super-smooth. They're like the frame-rate equivalent of audiophiles. Frameratophiles.
3. They spent a lot of [their parents'] dough on their rig, and they want to see a large number displayed in the FPS meter (to compensate for the small number in the kills column?).

Just let it go. It's off-topic too!
 

cbn

Lifer
Mar 27, 2009
So, I'm in the market for a new computer, but I'm on a pretty strict budget. I want the entire thing to come in under $900 (including monitor) and I've allotted around $400 for CPU and GPU. I have another "productivity" machine, so this would be PURELY for gaming. Most of that gaming will be World of Warcraft. Thus, my question:

How much more should I spend on the GPU over CPU? Or should they be equal?

CPU = GPU --- For example, if I get the Core i5 750 ($200) then I'd be looking at the HD 4890 ($200).
CPU < GPU --- If I get a Phenom II X4 940 ($160) then I could get the HD 4850 x2 ($230).
CPU << GPU -- I mean, it's even possible for me to go into $100 CPU-range (Athlon X4 620 or Phenom X2 555) and get an HD 5850 ($300).

I want there to be NO bottleneck. I don't want a CPU that is wasting power because the GPU can't process graphics. And I don't want a GPU that is being CPU-limited. Suggestions?

The GPU to CPU cost ratio is based strictly on your personal preferences.

If a person likes higher resolution, higher detail settings, and higher IQ (image quality), then they increase the GPU/CPU cost ratio.

P.S. I think you are getting some of this confused with the idea of value. For example, at a given resolution/detail/IQ setting, at what point does putting more dollars into the GPU become a waste compared to shifting more money into the CPU?
 

aigomorla

CPU, Cases & Cooling Mod | PC Gaming Mod | Elite Member
Super Moderator
Sep 28, 2005
When someone tells you they want to see higher than 60 fps, they're correct, and it can be for one of many reasons:

1. They've got superhuman vision
2. They really like things to be super-smooth. They're like the frame-rate equivalent of audiophiles. Frameratophiles.
3. They spent a lot of [their parents'] dough on their rig, and they want to see a large number displayed in the FPS meter (to compensate for the small number in the kills column?).

Just let it go. It's off-topic too!

lolz... I see one other person thinks the same way as I do.

Aigo, do this. Find a 60Hz LCD. Now move your mouse in a circle. Move it faster, in a larger circle. See how at some point it starts to *JUMP* between positions?

No, I see many, many mouse pointers from after-images when I do that. :X
 

Idontcare

Elite Member
Oct 10, 1999
The thread title and topic are really relevant to the question of how to determine the balance between any two components on which the performance of a given application depends.

CPU and RAM? Do I spend more on the CPU and less on RAM, and for RAM do I spend more for bandwidth or for lower latency?

RAM and hard drive? Do I spend more money on more RAM and cheap spindle drives, or less RAM and a nice fast SSD?

The search for "balanced performance" in an expense-normalized manner is timeless. Good topic, OP. I hope you got good info somewhere in the course of your thread.
 

GlacierFreeze

Golden Member
May 23, 2005
When I piece together a new computer, my GPU to CPU price ratio is around 2:1 or a little under.

Also, 4GB RAM is the minimum I'd get at this point. More on my next build.
 
Dec 30, 2004
Thanks, Indus. It seems that spending $50 to $100 more on the GPU than the CPU is the "sweet spot," but I'm hoping for some confirmation from people.

And my past 3 builds have been Intel, but right now AMD low- and mid-end boards are sooo much cheaper it's hard to justify not switching.

Yeah I'd say it's the sweet spot too.
 

exar333

Diamond Member
Feb 7, 2004
I don't think there is a ratio for everyone, but I would lean toward a 1:1 price relationship for most gamers. This means a $150-200 CPU would pair nicely with a GTX 275 or 4890 GPU. There is probably more to the "formula" than just comparing the CPU and GPU; these days, your motherboard often costs nearly as much as (or more than) your CPU itself.

The 1:1 works for the gamer who is fine with a fast GPU but doesn't have to have the best. With how well modern CPUs overclock, and assuming you do overclock, you are looking at a 1:2 or 1:3 relationship in most cases to get the best GPU. This would represent an i7 920 ($200-300) with an X2 GPU (~$500).

Edit: Of course any formulas go out the window if you are talking EE $1000 CPUs...
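As a rough illustration of how those ratios carve up a fixed budget (a toy helper of my own, using the OP's $400 CPU+GPU allotment as the example total):

```python
def split_budget(total: float, gpu_to_cpu_ratio: float) -> tuple[float, float]:
    """Return (cpu_dollars, gpu_dollars) for a target GPU:CPU price ratio."""
    cpu = total / (1 + gpu_to_cpu_ratio)
    return cpu, total - cpu

for ratio in (1.0, 2.0, 3.0):
    cpu, gpu = split_budget(400, ratio)
    print(f"{ratio:.0f}:1 GPU:CPU on $400 -> CPU ${cpu:.0f}, GPU ${gpu:.0f}")
# 1:1 -> $200/$200, 2:1 -> ~$133/$267, 3:1 -> $100/$300
```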
 

kalrith

Diamond Member
Aug 22, 2005
It also depends on when you'll upgrade again. I tend to upgrade my video card about twice as often as my CPU, so I would lean toward your first example of spending the same on both the video card and CPU, or maybe even spending a little more on the CPU. On my last upgrade I spent $250 on a Q6600 and $170 on a 2900 Pro. I plan to upgrade to a 5850 around March or April and will stick with the same CPU for a couple more years.
 

Scionix

Senior member
Feb 25, 2009
Bullshit. I can tell the difference between 100 and 120FPS, just as I can tell the difference between 50 and 60 fps.

Just curious, when can you see this? I feel like it's one of those traits that certain people have.

For instance, I use a 24" monitor that has a max refresh rate of 60Hz, and some people cannot stand to look at a refresh rate that low because it hurts their eyes. I cannot tell the difference between 60Hz and 120Hz.

Is it the same way for FPS? In EVERY game I have played, I have never, ever been able to discern anything over 60fps. From 0-60 I can tell a drastic difference, but I couldn't tell you if COD4 was running at 60fps or 260fps. The only time I have ever noticed an effect with something running over 60fps is in WoW. If I don't turn on VSync, the image seems to hitch or stutter when in motion, but when the fps is locked to 60 it appears buttery smooth.

TL;DR: How, and in what games, can you discern a difference between 60 and 100 fps?
 

jvroig

Platinum Member
Nov 4, 2009
CPU that won't hold any of it back.
That's the difficult part: how do you determine which CPU won't hold back your GPU? Is it a quad-core? Is it one with a ton of L3 cache, or can one with no L3 suffice? Would a blazingly fast dual-core do better? Aside from sorting through tons of benchmarks, how does one find out?

The OP was probably interested in a much quicker way to determine it, via a cost ratio.
 

deimos3428

Senior member
Mar 6, 2009
lolz... I see one other person thinks the same way as I do.
The rule of thumb is that your minimum fps should meet (or very slightly exceed, to account for latencies) the refresh rate of the display. Excess frames might be written to the framebuffer, but they'll never be displayed and are essentially wasted frames.

For most people the limiting factor is still the refresh rate of the display and not anything to do with the eyes. You can't tell the difference between 100 fps and 120 fps on a 60Hz display regardless of your vision -- because there is no difference to observe. If you're going by the FRAPS number in the corner, you might get a placebo effect and think one is superior, though.

Now if SunSamurai has a 120Hz display, the limiting factor becomes the quality of his vision and he just might be able to detect 100 fps vs 120 fps.
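A quick back-of-the-envelope sketch of that point (my own toy numbers, not anything from the thread): a fixed-refresh panel caps how many rendered frames can ever be shown.

```python
def displayed_fraction(render_fps: float, refresh_hz: float) -> float:
    """Fraction of rendered frames that can ever reach the screen."""
    return min(refresh_hz / render_fps, 1.0)

for fps in (60, 100, 120):
    shown = displayed_fraction(fps, refresh_hz=60)
    print(f"{fps} fps on a 60 Hz panel: {shown:.0%} of frames displayed")
# 100 fps and 120 fps both collapse to the same 60 visible frames per second,
# so on a 60 Hz display there is literally nothing extra to see.
```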