A Typhoon in a teapot coming?

lopri

Elite Member
Jul 27, 2002
13,310
687
126
According to an 'insider', NV is preparing a massive marketing assault on the CPU, making the case that the GPU matters more than the CPU.

Originally posted by: CJ
NV will soon start a massive new campaign where they're going to point out that you're better off buying a GPU than an expensive CPU. They're directly targeting Intel with this. I don't think Intel will be too happy about it, and I wouldn't be surprised if this means that Intel won't ever support SLI on their chipsets.

http://forum.beyond3d.com/showthread.php?t=47415&page=2
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Interesting... But what I find more interesting is that people will gladly pay $1,000 - 1,500 for dual GPUs, but are scared to shell out more than $300 for a CPU. I have always wondered about that, especially since the CPU has more longevity. They are both very important, and for gaming the GPU is certainly the better buy, so I am not comparing a GPU to a CPU in that way; my point is just that a CPU is upgraded less often than a GPU, yet people are still quite frugal when it comes to their CPUs, myself included.

Though, in all fairness, the GPU comes with a board and memory while a CPU does not, which leaves you purchasing system memory and a system board if your chip changed sockets. Still, spending $270 for my Q6600 was difficult, while spending $355 for my 8800GTS didn't bother me.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
For gamers, yeah. An 8800GTS 512MB on a Celeron will play most games better than a 640MB GTS on Intel's latest XTREMECPUOMGWTFONLY$1200!!!!!!

But what gamer doesn't already know this? I mean shit, buy the lowest-clocked CPU from whatever family you like, overclock it, and the added system bandwidth will let you crush that top-tier CPU that costs four times as much. I always figured the only market for the high-end chips was those guys obsessed with WR SuperPis, the server market... and idiots... but Intel could slap an EXTREME label on a rock and sell it for $800 to the idiots. Hell, FX-60s are still selling for like $400 on eBay. I saw that and it was a true WTF moment.
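
To put rough numbers on the overclocking math, here's a quick back-of-the-envelope sketch in Python (the chip and board figures are illustrative placeholders, not quotes for any specific part):

# Core 2 era clocks are multiplier x FSB, so pushing the FSB raises the core
# clock and the bus bandwidth together. Figures illustrative, not quotes.
def core_clock_mhz(multiplier, fsb_mhz):
    return multiplier * fsb_mhz

budget_stock = core_clock_mhz(8, 200)  # a cheap low-multiplier chip: 1600 MHz
budget_oc    = core_clock_mhz(8, 400)  # same chip, FSB overclocked: 3200 MHz
flagship     = core_clock_mhz(9, 333)  # a ~$1000 stock flagship: ~3000 MHz
print(budget_stock, budget_oc, flagship, budget_oc > flagship)
# 1600 3200 2997 True

And since the FSB doubled along the way, the cheap chip ends up with more bus bandwidth than the stock flagship too, which is the "added system bandwidth" part.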
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Lithan
For gamers, yeah. An 8800GTS 512MB on a Celeron will play most games better than a 640MB GTS on Intel's latest XTREMECPUOMGWTFONLY$1200!!!!!!

But what gamer doesn't already know this? I mean shit, buy the lowest-clocked CPU from whatever family you like, overclock it, and the added system bandwidth will let you crush that top-tier CPU that costs four times as much. I always figured the only market for the high-end chips was those guys obsessed with WR SuperPis, the server market... and idiots... but Intel could slap an EXTREME label on a rock and sell it for $800 to the idiots. Hell, FX-60s are still selling for like $400 on eBay. I saw that and it was a true WTF moment.

Uhhh... You might want to rethink what you wrote here. Just because someone buys an Extreme chip does not make them an 'idiot' or a SuperPi e-peen freak. There are several community mods here on this forum who have Extreme Edition chips, and I doubt they would take kindly to being thought of that way. A lot of people use their PC for more than just a gaming machine...
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Ok, let's add an additional category: people with more money than they know what to do with. But Nvidia's marketing isn't gonna work on them, since the concept of economy is a foreign one to them. That said, 99.99% of the computing forum staff I know with EE chips didn't exactly pay for them.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
IMO the difference between an extreme-budget CPU overclocked and a premium CPU is a 50-100% performance difference in benchmarks and little or no difference in games.

The difference between a budget GPU and an enthusiast GPU can be as much as 2400%. That's why gamers are frugal with CPU purchases. My $65 CPU can handle any game out there, and once it can't, I'll upgrade to a different $65 CPU. By the same token, a $65 video card or on-board video is not going to cut it, not by a long shot. And I can't overclock, say, an X300 to be within 10% of a 3850.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: v8envy
IMO the difference between an extreme-budget CPU overclocked and a premium CPU is a 50-100% performance difference in benchmarks and little or no difference in games.

The difference between a budget GPU and an enthusiast GPU can be as much as 2400%. That's why gamers are frugal with CPU purchases. My $65 CPU can handle any game out there, and once it can't, I'll upgrade to a different $65 CPU. By the same token, a $65 video card or on-board video is not going to cut it, not by a long shot. And I can't overclock, say, an X300 to be within 10% of a 3850.

Those are true statements, but they still only apply to a gaming-only rig. A computer has more uses than games. If you only want to play games and are willing to overclock your CPU, you can skimp on the CPU. This is evident in benchmarks; no one will contest that. But if you buy a Celeron and leave it at stock, you will have a serious limitation in games, and not only games, but in pretty much anything besides writing a paper and light internet surfing.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
Originally posted by: v8envy
IMO the difference between an extreme-budget CPU overclocked and a premium CPU is a 50-100% performance difference in benchmarks and little or no difference in games.

The difference between a budget GPU and an enthusiast GPU can be as much as 2400%. That's why gamers are frugal with CPU purchases. My $65 CPU can handle any game out there, and once it can't, I'll upgrade to a different $65 CPU. By the same token, a $65 video card or on-board video is not going to cut it, not by a long shot. And I can't overclock, say, an X300 to be within 10% of a 3850.

I like the midrange. I bought a 2.4GHz C2D a year ago for $180, and AFAIK it's still good enough for every game. My video card was a 7950GX2 I got for $200 off Craigslist, and now I have an 8800GT 1GB that cost a net of $165 once I sold the old card. I could have saved maybe $100 by going lower end and OCing over the whole past year, but why bother? Going higher end makes no sense, because of the huge expenditures it would take to get significantly better performance.

Midrange worked out well in the past too: Radeon 9800 Pro, GF4 Ti 4400, TNT2, Voodoo Rush...
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
Originally posted by: ArchAngel777
Originally posted by: v8envy
IMO the difference between an extreme-budget CPU overclocked and a premium CPU is a 50-100% performance difference in benchmarks and little or no difference in games.

The difference between a budget GPU and an enthusiast GPU can be as much as 2400%. That's why gamers are frugal with CPU purchases. My $65 CPU can handle any game out there, and once it can't, I'll upgrade to a different $65 CPU. By the same token, a $65 video card or on-board video is not going to cut it, not by a long shot. And I can't overclock, say, an X300 to be within 10% of a 3850.

Those are true statements, but they still only apply to a gaming-only rig. A computer has more uses than games. If you only want to play games and are willing to overclock your CPU, you can skimp on the CPU. This is evident in benchmarks; no one will contest that. But if you buy a Celeron and leave it at stock, you will have a serious limitation in games, and not only games, but in pretty much anything besides writing a paper and light internet surfing.

The trouble is, if you aren't playing games then chances are that Celeron is plenty for everything you do. When was the last time someone had a CPU that was too slow to run Excel?
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
"but why bother?" Because you would have saved $100? Overclocking isn't for everyone, but you can't really call it a "why bother" scenario. The benefits are evident.

I do agree with your second statement though. That's my main point. Hell, look at the EE's intro: it was a, what, $1,000 chip that was STILL beaten in games by the $200ish AMDs of the time, and it was marketed MAINLY to gamers. It was a giant turd for pretty much everyone except benchers who focused on Intel-heavy benches and/or screenshot WR chasers. Now that Intel is soundly in the lead it isn't as bad, but it's still 2x the money for a few percentage points of performance on average, and far less than that in games. As a gaming chip the EE is a massive failure... IF you have to pay for it. If it's on the company dime or something, hell, knock yourself out.

But seriously, what does Intel's top-of-the-line DESKTOP chip cost now... I think I saw it for $1600 at Newegg. I can't believe anyone would EVER think that's a better investment for a gamer than a top-of-the-line gfx card for $400-600. So basically Nvidia's telling everyone what they already know.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: PingSpike

The trouble is, if you aren't playing games then chances are that Celeron is plenty for everything you do. When was the last time someone had a CPU that was too slow to run Excel?


A single-core celery can and will significantly bog down Excel. Add in active anti-virus and office tasks on a single core and you will quickly see just how terrible a CPU it is. The dual-core celery isn't as bad, but it just came out.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Uhh... active antivirus uses less than 1% CPU utilization on every computer I've ever owned... going back to a Pentium 200... (comps before that weren't on the net, so no antivirus). What do you mean by office tasks... what office tasks do you run in the background?

Hell, I don't even know many businesses using Core 2s yet. Most of 'em are still running P3s and P4s. And I'd bet dollars to donuts a Core-based celly crushes a P4. Yeah, there's a difference, but only a stupid boss will approve an $800+ CPU for your workstation because it's a little faster than a $75-300 one.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Lithan
Uhh... active antivirus uses less than 1% CPU utilization on every computer I've ever owned... going back to a Pentium 200... (comps before that weren't on the net, so no antivirus). What do you mean by office tasks... what office tasks do you run in the background?

Hell, I don't even know many businesses using Core 2s yet. Most of 'em are still running P3s and P4s. And I'd bet dollars to donuts a Core-based celly crushes a P4. Yeah, there's a difference, but only a stupid boss will approve an $800+ CPU for your workstation because it's a little faster than a $75-300 one.


Both of those paragraphs prove that you are 1) not very old and 2) not very experienced.

The main indicator that you are young and/or inexperienced is that you cannot understand that there are in fact situations where a faster CPU is crucial to a business. Remember, just because you do not know of any reason does not mean none exist. I know that is difficult for a lot of people, but ignorance of a legitimate use doesn't make it illegitimate. In other words, you not understanding why someone would buy a QX9650 for legitimate reasons doesn't make you right. It just means you don't know it all.

I see this in the tech field all the time. A tech will make a blanket statement along the lines of "Why did this admin do it that way??? What an idiot," and when you actually investigate with an open mind, you find that there are in fact good reasons why he did it that way and he isn't such an idiot after all. Then you walk away humbled, and if you are wise, you learn from it and learn not to make statements like that again.

So, to rehash, there are two ways to respond when you lack the data to draw a conclusion.

What an idiot! I cannot believe someone bought a QX9650!

OR you can be wise and say

I do not understand why someone would pay this much money for a CPU; can someone please help me understand those situations?

To address the claim that anti-virus uses only 1% CPU: I have worked on hundreds of PCs where this is simply not true in practice. Norton and Trend Micro often get stuck processes that take up 100% CPU utilization. On a single-core CPU, this is devastatingly painful to work through. Try using a single core when a process has locked up 100% CPU usage; it is downright painful! Even getting the Start button or Task Manager to come up can be a 5-minute task! And this isn't some rare occurrence, either.

I guess my final statement will be this: once you have worked in the field for a while, you accept that theory often does not reflect reality. If you want to discuss this further, we can do it via PM; otherwise, I'd rather not derail this thread further. I just want people to know that a faster CPU has its purposes, whether or not you know the reasons.

 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
That's not active protection, that's a scan or an update. You should actually learn modern computing terms before you start talking down to people.

It seems to me that you ought to provide some instances where it's efficient to purchase a $1600 CPU for business tasks.
 

jaqie

Platinum Member
Apr 6, 2008
2,471
1
0
Coming at this from another angle here.

I am a gamer and a power user. I transcode video, compress very large (10GB and higher) 7-Zip archives, and also play games. While my rig is a blast from the past as far as high-dollar gamers are concerned, given that I am on a fixed disability check of $650 a month (which also has to cover rent, food, medicine that disability doesn't pay for, et cetera), the logic behind what I say still holds despite my system not being a "high dollar" gaming box.

When the Athlon 64 X2 first came out, I decided it was finally time to upgrade from my Socket A system. I took every ounce of cash I could muster from selling off all of my older hardware and bought a new PCIe mobo and a GeForce 6600 256MB, and I plunged $350 into a retail Athlon 64 X2 3800+. Remember, they had just come out. I am still using most of those parts, though I have since upgraded the video to a 7900GS/OCE.

The system is overpowered CPU-wise for just games, even with my new video card (purchased a couple of months ago), but I use the CPU at 100% on both cores quite often for other things, and so do other people. If I could afford it, I would probably have a top-of-the-line (or maybe a step or two down from it) quad core, perhaps even an Extreme Edition, just so I could do more of what I do in the same amount of time. I care little for what other people think about my computer; I care very much about what I can do with it within a defined amount of time. I don't like waiting the time I have to wait to transcode video and such, but I also want good gaming performance out of my system. Given my financial situation, I would say I am doing quite well hardware-wise for how I use my box, and there is my whole point: some people do both heavy gaming and heavy non-gaming tasks, and that is why they get a high-end CPU.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Lithan
That's not active protection, that's a scan or an update. You should actually learn modern computing terms before you start talking down to people.

It seems to me that you ought to provide some instances where it's efficient to purchase a $1600 CPU for business tasks.

First off, let's use realistic examples. $1600 desktop CPUs pretty much exist to win benchmarks and are only purchased in a few instances:

1: By people who "have to" own the fastest (i.e., most expensive) available hardware at the time. In a business setting, this is often the CEO. :D

2: Professionals whose work tasks are CPU-bound, where the cost of the CPU over time is negligible compared to the value of the worker's productivity.

If we use less extreme examples, like your $800 vs. $300 CPU argument, #2 ends up applying to a significant percentage of workers who use their PC for anything more strenuous than basic web browsing.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Yes, there's no question that if the money were available we'd all want the best proc around. The question is: are the EEs economical? And it's foolish to suggest that they are. Maybe in some extreme, one-in-a-million case... but everyone knows that as a general rule they aren't... especially for gamers, who are obviously the folks this is targeting. Does Nvidia really think they'll find someone (capable of building their own rig) who will listen to them and say, "My god... they're right! I'm now going to buy a Core 2 Duo and 2x 9800GX2s for quad SLI instead of the 3.2GHz quad EE and this 8600GT I WAS buying. Thank you, Nvidia, for letting me know that one was better for gaming than the other! I had no idea!"
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: aka1nas
Originally posted by: Lithan
That's not active protection, that's a scan or an update. You should actually learn modern computing terms before you start talking down to people.

It seems to me that you ought to provide some instances where it's efficient to purchase a $1600 CPU for business tasks.

First off, let's use realistic examples. $1600 desktop CPUs pretty much exist to win benchmarks and are only purchased in a few instances:

1: By people who "have to" own the fastest (i.e., most expensive) available hardware at the time. In a business setting, this is often the CEO. :D

2: Professionals whose work tasks are CPU-bound, where the cost of the CPU over time is negligible compared to the value of the worker's productivity.

If we use less extreme examples, like your $800 vs. $300 CPU argument, #2 ends up applying to a significant percentage of workers who use their PC for anything more strenuous than basic web browsing.


Most of those tasks can be offloaded onto servers... which is a much more economical solution.

And where's this idea coming from that 10.2GHz of CPU power is only capable of web browsing? You're comparing a 12GHz CPU to a 10.2GHz one. Anything taxing enough that there would be a noticeable difference between those two CPUs would be damn foolish to expect a workstation to process.

Edit: it's $1000, not $800, by the way. Just checked.

And need I remind you that there are quad Xeons for well under $300, and dual-socket mobos for ~$200... so even looking at straight-up proc power, the only place the EE's gonna win is a situation where 4 cores are supported but not eight.


Xeon = $800 for 19.2GHz (two 2.4GHz quads plus the mobo) vs. EE @ $1000, plus mobo cost, for 12GHz (one 3.0GHz quad).
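
Spelling that arithmetic out (a rough sketch using only the street prices quoted above; the per-core clocks are inferred from the totals):

# Aggregate-GHz-per-dollar comparison using the figures from this post.
# Assumes 2x quad-core Xeon at 2.4GHz (~$300 each) on a ~$200 dual-socket
# board, vs. one quad-core EE at 3.0GHz for ~$1000 (its mobo not counted).
builds = {
    "dual Xeon": {"cores": 8, "ghz": 2.4, "cost": 2 * 300 + 200},
    "EE":        {"cores": 4, "ghz": 3.0, "cost": 1000},
}
for name, b in builds.items():
    agg = b["cores"] * b["ghz"]
    print(f"{name}: {agg:.1f} aggregate GHz for ${b['cost']} "
          f"(~${b['cost'] / agg:.0f}/GHz)")
# dual Xeon: 19.2 aggregate GHz for $800 (~$42/GHz)
# EE: 12.0 aggregate GHz for $1000 (~$83/GHz)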
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
I don't know why NVIDIA keeps pushing this; not everyone is a gamer, and the GPU>CPU thing only applies to gamers. They are going to have a tough time leveling it all out.

I mean, there's more to computers than just games; CPUs do just about everything else (notice I don't say better, because GPUs can't even do some of the stuff CPUs can do).
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Yeah, but most of Nvidia's market is gamers, so it makes sense for them to target them. I mean, hell, most nongamers use Intel onboard... the % share Nvidia has of nongamers is probably so insignificant they just ignore it.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Lithan
Most of those tasks can be offloaded onto servers... which is a much more economical solution.

And where's this idea coming from that 10.2GHz of CPU power is only capable of web browsing? You're comparing a 12GHz CPU to a 10.2GHz one. Anything taxing enough that there would be a noticeable difference between those two CPUs would be damn foolish to expect a workstation to process.

In theory, this sounds great. The reality of the situation is that there are tons of LOB apps that aren't multi-user capable and/or don't run properly in a terminal-server-type environment.



Originally posted by: Lithan

Edit: it's $1000, not $800, by the way. Just checked.

And need I remind you that there are quad Xeons for well under $300, and dual-socket mobos for ~$200... so even looking at straight-up proc power, the only place the EE's gonna win is a situation where 4 cores are supported but not eight.

Xeon = $800 for 19.2GHz (two 2.4GHz quads plus the mobo) vs. EE @ $1000, plus mobo cost, for 12GHz (one 3.0GHz quad).


What is your point? Are you arguing about whether the EE is a good buy, or whether there are sometimes business cases for users to have more expensive CPUs?

Also, you're being a little dishonest adding up the clock speeds of the individual cores, given the state of multi-threaded support in most business apps at this point.
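
To put a number on why summing clocks misleads: Amdahl's law caps what extra cores can buy you by the serial fraction of the work. A quick sketch (the parallel fractions p here are invented for illustration, not measured from any real app):

# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# the work that parallelizes and n is the core count.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# For a mostly single-threaded business app (p = 0.2), going from 4 cores to
# 8 barely moves the needle, so "19.2 aggregate GHz" flatters the dual Xeon.
for p in (0.2, 0.5, 0.95):
    print(f"p={p}: 4 cores -> {speedup(p, 4):.2f}x, 8 cores -> {speedup(p, 8):.2f}x")
# p=0.2: 4 cores -> 1.18x, 8 cores -> 1.21x
# p=0.5: 4 cores -> 1.60x, 8 cores -> 1.78x
# p=0.95: 4 cores -> 3.48x, 8 cores -> 5.93x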

 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
What line-of-business apps do you run that crunch hard enough that a 3.0GHz x4 proc will be significantly faster than a 2.55GHz x4 proc, enough to create a noticeable increase in productivity and justify an additional $725 per workstation? I ask because it's not typically a technological hurdle that keeps some places/programs from offloading to servers; it's almost exclusively that the processor load is so minute it's not worth the network traffic... which is a case where your argument doesn't apply.

My point is that the EE is clearly NOT a business chip and is at a MASSIVE disadvantage in price and performance to true business chips, so excusing an overpriced gaming chip on the basis of its performance in business apps is ridiculous. And if your processing capability is limited by your software's support for multithreading, I doubt a faster processor will increase productivity, since apparently your client has absolutely zero interest in actually being productive.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Piuc: it's mainly about games, but what about video playback? Or for those who want IGP solutions that can actually be used for casual gaming and utilize all of Vista's capabilities? Quoting AT's review comparing the MCP78 (GeForce 8200), 780G, and the G35 — the G35 is literally miles behind:

More important than the raw numbers is that throughout testing, neither the 780G nor GeForce 8200 once experienced pausing, judder, or outright blank screen events - something we cannot say about the G35. Certainly, our processor choices have a significant impact on CPU utilization rates, but considering our two choices are priced equally we have to give the nod to AMD for having a better media solution in this price range. As far as power consumption goes during H.264 playback, the AMD platform averaged 106W, NVIDIA platform at 102W, and the Intel platform averaged 104W - too close to really declare a true winner.

The Intel G35 platform will show its strength in areas like office applications and video/audio encoding thanks to the Core 2 processor family. However, it has an Achilles heel that keeps it from being an all around champion. Besides dismal H.264 decoding abilities with low-end processors, casual gaming is an almost complete disaster on this platform. This is an area we will report on thoroughly in our next article. In the meantime, we leave you with these screenshots to ponder which platform is best suited for that casual gamer in the household.

Notice that there was stuttering EVEN with a Core 2 Duo, while for AMD/NV there was none even with a weaker CPU (Athlon X2 4850E)? Ever since AMD/NV stepped into the IGP arena, they've proven time and time again that IGP solutions CAN be good. System/app performance may depend on the CPU, but when it comes to video playback and casual gaming, Intel just can't compete.

Point is, there's just no reason to get an Intel IGP solution (especially for an HTPC) when there are alternative solutions at a similar price that offer so much more for your money.

Out of all this, I find it sad that both Intel and NVIDIA are just ignoring AMD/ATI as if they didn't really exist. :D
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Lithan
What line-of-business apps do you run that crunch hard enough that a 3.0GHz x4 proc will be significantly faster than a 2.55GHz x4 proc, enough to create a noticeable increase in productivity and justify an additional $725 per workstation? I ask because it's not typically a technological hurdle that keeps some places/programs from offloading to servers; it's almost exclusively that the processor load is so minute it's not worth the network traffic... which is a case where your argument doesn't apply.

My point is that the EE is clearly NOT a business chip and is at a MASSIVE disadvantage in price and performance to true business chips, so excusing an overpriced gaming chip on the basis of its performance in business apps is ridiculous. And if your processing capability is limited by your software's support for multithreading, I doubt a faster processor will increase productivity, since apparently your client has absolutely zero interest in actually being productive.

I think I agreed with you in my first post regarding the EE. I was referring to the following statement: :confused:

Yeah, there's a difference, but only a stupid boss will approve an $800+ CPU for your workstation because it's a little faster than a $75-300 one.

I would hope that a boss would do a cost-benefit analysis and decide whether it was worth it based on how much the employee's time is worth. For a content-creation guy who is billing a few hundred an hour, it might be worth it if it lets him get his work done 10% faster, while the receptionist playing Bejeweled is probably fine with a Celeron. :)
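
As a rough payback sketch of that math (the billing rate, hours, and the 10% figure are placeholders, not data from anywhere):

# Back-of-the-envelope payback period for a CPU premium, assuming the faster
# chip shaves a fixed fraction off a billable worker's day. Numbers invented.
cpu_premium = 725.0     # extra cost of the faster CPU, in dollars
bill_rate = 200.0       # what the worker bills per hour
hours_per_day = 8.0
time_saved = 0.10       # 10% faster turnaround on CPU-bound work

value_per_day = bill_rate * hours_per_day * time_saved   # $160/day
print(f"payback in {cpu_premium / value_per_day:.1f} working days")  # ~4.5 days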

I'm not sure what you mean regarding the bolded statement. It's often not the client's choice what software they get to run, especially if there are only one or two players producing software for their market. I'm mostly coming from an accounting/tax IT background, and most of the software vendors in that arena are absolutely glacial (grrr, Intuit) about improving their products.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Ok, you got me there. Can we agree that it's a rare employee who is worth a $1000 CPU? By "client" I meant that Bejeweled-playing secretary... only now they're the content-creation guy who's saying he might as well play Bejeweled because he can't work while his processor is doing something... in other words, an intentional situation where he's held up by his comp (not hard at all to avoid) so he can goof off with an excuse in case he gets caught. Basically, the problem isn't that the software is incapable of multithreading; it's that the user is incapable of time management and/or multitasking.

But what accounting software is such a beast that you actually have to sit there waiting for it to crunch something? Hell, I'd expect the US federal government's discretionary spending accounts could be calculated by even a slow processor in a matter of seconds. Shifting numbers around and FP and integer calculations aren't all that processor intensive, I'd guess hard drive access is your biggest lag issue doing that stuff... but maybe I'm mistaken about what you do.

Edit: Not ignoring you, Cookie, I just don't keep up with IGPs.