A Typhoon in a teapot coming?


bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: ArchAngel777
Originally posted by: Lithan
For gamers, yeah. An 8800GTS 512MB on a Celeron will play most games better than a 640MB GTS on Intel's latest XTREMECPUOMGWTFONLY$1200!!!!!!

But what gamer doesn't already know this? I mean shit, buy the lowest-clocked CPU from whatever family you like, overclock it, and the added system bandwidth will let you crush that top-tier CPU that costs four times as much. I always figured the only market for the high-end chips was those guys obsessed with WR SuperPis and the server market... and idiots... but Intel could slap an EXTREME label on a rock and sell it for $800 to the idiots. Hell, FX-60s are still selling for like $400 on eBay. I saw that and it was a true WTF moment.

Uhhh... You might want to rethink what you wrote here. Just because someone buys an Extreme chip, it does not make them an 'idiot' or a superPI e-peen freak. There are several community mods here on this forum that have Extreme Edition chips and I doubt they would take kindly to being thought of that way. A lot of people use their PC for more than just a gaming machine...

intel released the QX9650 four MONTHS before the other 45nm quads, too. plus, an unlocked multi is a nice bonus on those 7.5x and 8x quads :frown:
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: ArchAngel777
Originally posted by: Lithan
Uhh... active antivirus uses less than 1% CPU utilization on every computer I've ever owned... going back to a Pentium 200... (comps before that weren't on the net, so no antivirus). What do you mean by office tasks... what office tasks do you run in the background?

Hell, I don't even know many businesses using Core 2s yet. Most of 'em are running P3s and P4s still. And I'd bet dollars to donuts a Core Celeron crushes a P4. Yeah, there's a difference, but only a stupid boss will approve an $800+ CPU for your workstation because it's a little faster than a $75-300 one.


Both of those paragraphs prove that you are 1) Not very old and 2) Not very experienced.

I agree with Lithan. Does that make me young or inexperienced? I wish. Over 18 years as a code monkey, team lead, business owner, bean counter, manager, investor, contractor, consultant. Sometimes all at once. And all my experience backs up exactly what he said.

The main indicator that you are young and/or inexperienced is that you cannot understand that there are in fact situations where a faster CPU is crucial to business. Remember, just because you do not know of any reason does not mean they don't exist. I know that is difficult for a lot of people, but ignorance of a legitimate use doesn't make it illegitimate. In other words, you not understanding why someone would buy a QX9650 for legitimate reasons doesn't make you right. It just means you don't know it all.

And if the business's success hinges on people having a 20% faster CPU on their desktop, something has gone horribly wrong.

Hardware comes out of one budget -- capital expenditures. Salaries come out of a completely different bucket. The two are accounted for and reported to investors on different line items. So even if a 20% faster CPU would make an employee more productive and in fact have an ROI of less than a week, there may be business reasons to keep said employee on a 4-year-old PC instead. This is the norm, not the exception.

Seriously, show me one business outside of the dotcom money-torch era that bought top-of-the-line equipment for anyone except possibly the CEO.

As Lithan said, the heavy lifting is done on the servers. And a multiplier-unlocked enthusiast CPU is the last thing you'll be shoving in those. Server-socket CPUs, error-correcting RAM.

You know who buys these things? My lawyer friends. They're into conspicuous consumption and are fiercely competitive.

You know who doesn't buy these things? Any of my business clients or IT buddies.

To address the anti-virus using only 1% CPU: I have worked on hundreds of PCs where this is simply not true in practice. Norton and Trend Micro often get stuck processes that take up 100% CPU utilization. On a single-core CPU, this is devastatingly painful to work on. Try using a single core when a process has locked 100% CPU usage. It is downright painful! Even getting the Start button or Task Manager to come up can be a 5-minute task! This also isn't some rare occurrence.

Which means your IT department is not properly supporting your desktops. They should 1. get a patch that fixes these issues, or 2. migrate to a non-broken product, or 3. just ignore whining from people whose opinions and productivity don't matter.

Option 3 is the usual one.
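The stuck-process scenario described above is easy to sketch: given per-process CPU samples such as a monitoring tool might collect, flag anything pegging a core. The process names, figures, and threshold below are made-up illustrations, not real measurements.

```python
# Sketch: flag runaway processes from (name, cpu_percent) samples.
# Sample data and the 95% threshold are hypothetical illustrations.

def find_runaways(samples, threshold=95.0):
    """Return names of processes whose CPU usage meets/exceeds threshold."""
    return [name for name, cpu in samples if cpu >= threshold]

if __name__ == "__main__":
    snapshot = [
        ("winword.exe", 2.1),
        ("rtvscan.exe", 99.8),   # a stuck AV scanner pegging the core
        ("explorer.exe", 0.4),
    ]
    print(find_runaways(snapshot))  # ['rtvscan.exe']
```

On a single-core machine one entry in that list is enough to make the whole desktop unresponsive; on a multi-core machine the other cores keep the UI usable, which is the point being argued in the thread.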



 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Lithan
Yeah, but most of nvidia's market is gamers. So it makes sense for them to target them. I mean hell, most gamers use Intel onboard... the % share Nvidia has of nongamers is probably so insignificant they just ignore it.

that's simply not true. it's true for their GPU business, but what about amd chipsets? nvidia made the best mobos for amd systems for YEARS, so many many people (sadly I wasn't "in the know back then") bought them for oc'ing, stability, etc. Also, how many integrated graphics chipsets does nvidia sell every month? Probably a LOT more than all their 9series trash put together I'd wager...
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: lopri
According to an 'insider', NV is preparing a massive assault on the CPU, declaring why the GPU matters more (than the CPU).

Originally posted by: CJ
NV will soon start a new massive campaign where they're going to point out that you're better off buying a GPU than an expensive CPU. They're directly targeting Intel with this. I don't think Intel will be too happy about this, and I wouldn't be surprised if this means that Intel won't ever support SLI on their chipsets.

http://forum.beyond3d.com/showthread.php?t=47415&page=2

intel will support sli on their chipsets if nvidia will let them because it will help intel to sell more chipsets. not allowing sli on intel chipsets is the only way right now that nvidia can have a reasonable volume of intel-based chipset business because the p35/x38/x48 are so much better than the crap that nvidia has out (although 790i does look promising).
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: bryanW1995
Originally posted by: Lithan
Yeah, but most of nvidia's market is gamers. So it makes sense for them to target them. I mean hell, most gamers use Intel onboard... the % share Nvidia has of nongamers is probably so insignificant they just ignore it.

that's simply not true. it's true for their GPU business, but what about amd chipsets? nvidia made the best mobos for amd systems for YEARS, so many many people (sadly I wasn't "in the know back then") bought them for oc'ing, stability, etc. Also, how many integrated graphics chipsets does nvidia sell every month? Probably a LOT more than all their 9series trash put together I'd wager...

*hugs DFI Lanparty UT Nforce 4*
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: v8envy
Over 18 years as a code monkey, team lead, business owner, bean counter, manager, investor, contractor, consultant. Sometimes all at once. And all my experience backs up exactly what he said.

If you do not see the failure in logic here, I really cannot help you. Essentially what you are saying is "Because I have never seen it, it must not exist," and that would work if you knew everything, but you don't. The argument is not "Does it make sense for most businesses?" The argument is "Does it make sense for any business?"

Even if no one in this forum could provide a good reason (and good reasons have already been explained) for it, you'd still be hard-pressed to make a statement like that, as you have not worked in all industries.

I think people just argue for the sake of arguing at this point.

A few things projected here that I did NOT say were:

"Most people need an EE chip"

"Most business need an EE chip"

Since I never said those things and do not agree with those statements, I don't see what all the fuss is about. My bone of contention is with calling someone an idiot or an arrogant fool for buying an EE CPU, a statement that I find irresponsible.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
well i bought a 3.4 "EE" P4 .. nice CPU at 3.74GHz that blew the doors off my 2.80c@3.31GHz and nicely paired with my x1950p/512MB

of course i didn't pay much for it ... $115 last year and it gave a little life to my old Abit ic7 MB as a gaming platform

loved it too .. for about 2 months, and sold it for $115 when i got my e4300 ... the old EE could keep up with it in single-threaded applications when my 4300 was at stock speeds
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The value of a fast CPU is underestimated when it comes to video games. There are serious benefits to having a faster CPU, such as load times and actual gameplay performance.
The reason $1000 CPUs are not worth it is the lack of competition at the high end. A $1000 Intel CPU is only 20% or so faster than a $200 CPU. With video cards you tend to get more for your money.

And good luck trying to game on the weakest Celeron with the fastest GPU on the market. Your FPS will suck in most games. And it will be unplayable in CPU-heavy games, like anything with the Source engine (Half-Life 2 and derivatives).
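The price/performance point above works out like this. The $200/$1000 prices and the ~20% figure come straight from the post; the relative-performance numbers are illustrative, not benchmarks.

```python
# Back-of-the-envelope perf-per-dollar for the thread's example:
# a $1000 CPU that is ~20% faster than a $200 one.

def perf_per_dollar(relative_perf, price):
    """Relative performance divided by price; higher is better value."""
    return relative_perf / price

budget = perf_per_dollar(1.0, 200)    # baseline $200 CPU
extreme = perf_per_dollar(1.2, 1000)  # $1000 CPU, ~20% faster

# The budget chip delivers roughly 4.2x more performance per dollar.
print(round(budget / extreme, 1))  # 4.2
```

That ratio is why the thread keeps circling back to the same conclusion: the extreme-edition premium buys prestige and an unlocked multiplier, not proportional speed.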
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: bryanW1995
Originally posted by: Lithan
Yeah, but most of nvidia's market is gamers. So it makes sense for them to target them. I mean hell, most gamers use Intel onboard... the % share Nvidia has of nongamers is probably so insignificant they just ignore it.

that's simply not true. it's true for their GPU business, but what about amd chipsets? nvidia made the best mobos for amd systems for YEARS, so many many people (sadly I wasn't "in the know back then") bought them for oc'ing, stability, etc. Also, how many integrated graphics chipsets does nvidia sell every month? Probably a LOT more than all their 9series trash put together I'd wager...

Well yeah, but this isn't selling chipsets, it's selling gfx.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: taltamir
The value of a fast CPU is underestimated when it comes to video games. There are serious benefits to having a faster CPU, such as load times and actual gameplay performance.
The reason $1000 CPUs are not worth it is the lack of competition at the high end. A $1000 Intel CPU is only 20% or so faster than a $200 CPU. With video cards you tend to get more for your money.

And good luck trying to game on the weakest Celeron with the fastest GPU on the market. Your FPS will suck in most games. And it will be unplayable in CPU-heavy games, like anything with the Source engine (Half-Life 2 and derivatives).

The weakest Celeron is faster than Socket A chips, and many of my buddies still play on Socket A. In fact, the only reason they even think about upgrading is to get PCI-E. And their FPS are just fine in HL2. Your claims quite simply aren't true.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: ArchAngel777
Since I never said those things and do not agree with those statements, I don't see what all the fuss is about. My bone of contention is with calling someone an idiot or an arrogant fool for buying an EE CPU, a statement that I find irresponsible.

No one even suggested that you said either of those things. And I most certainly didn't say that everyone who bought an EE was an idiot or an arrogant fool. Perhaps you should reread this thread and then look in a mirror, since you seem to have some kind of victim crisis going on.


Moreover, you've both said that experience can't be used to draw conclusions and that by disagreeing with you someone proves that they lack experience... which, by logical analysis of your stance... is meaningless. It's all very convoluted.

Alright you two, I think you've picked at each other long enough. Tone it down before someone ends up taking a vacation.

-ViRGE
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Originally posted by: Lithan
Well yeah, but this isn't selling chipsets, it's selling gfx.
NV now calls their IGP chipset "mGPU". :D

Also note that this 'campaign' looks likely to overlap with the debut of 'Hybrid SLI'.
 

Dream Operator

Senior member
Jan 31, 2005
344
0
76
Originally posted by: ArchAngel777
Originally posted by: Lithan
Uhh... active antivirus uses less than 1% CPU utilization on every computer I've ever owned... going back to a Pentium 200... (comps before that weren't on the net, so no antivirus). What do you mean by office tasks... what office tasks do you run in the background?

Hell, I don't even know many businesses using Core 2s yet. Most of 'em are running P3s and P4s still. And I'd bet dollars to donuts a Core Celeron crushes a P4. Yeah, there's a difference, but only a stupid boss will approve an $800+ CPU for your workstation because it's a little faster than a $75-300 one.


Both of those paragraphs prove that you are 1) Not very old and 2) Not very experienced.

The main indicator that you are young and/or inexperienced is that you cannot understand that there are in fact situations where a faster CPU is crucial to business. Remember, just because you do not know of any reason does not mean they don't exist. I know that is difficult for a lot of people, but ignorance of a legitimate use doesn't make it illegitimate. In other words, you not understanding why someone would buy a QX9650 for legitimate reasons doesn't make you right. It just means you don't know it all.

I see this in the tech field all the time. A tech will make a blanket statement along the lines of "Why did this admin do it that way??? What an idiot," and when you actually investigate with an open mind, you find that there actually are good reasons why he did it that way and he isn't such an idiot after all. Then you walk away humbled, and if you are wise, you learn from it and learn not to make statements like that again.

So, to rehash, there are two ways to respond when you lack all the data to draw a conclusion. You can say:

What an idiot! I cannot believe someone bought a QX9650!

OR you can be wise and say

I do not understand why someone would pay this much money for a CPU, can someone please help me understand these situations?

To address the anti-virus using only 1% CPU: I have worked on hundreds of PCs where this is simply not true in practice. Norton and Trend Micro often get stuck processes that take up 100% CPU utilization. On a single-core CPU, this is devastatingly painful to work on. Try using a single core when a process has locked 100% CPU usage. It is downright painful! Even getting the Start button or Task Manager to come up can be a 5-minute task! This also isn't some rare occurrence.

I guess my final statement will be this: once you have worked in the field for a while, you accept that theory oftentimes does not reflect reality. If you want to discuss this further, we can do it via PM. Otherwise, I'd rather not derail this thread further. I just want people to know that, whether they realize it or not, a faster CPU has a purpose, whether you know the reasons or not.

Excellent post! I remember the agonizing pain of locking up my single core 3500+.

I use Steinberg's Nuendo and Sony's Vegas. They have the ability to utilize 4 cores. Mixing a song with full instrumentation demands a lot of CPU power. The number of effects and the latency of your audio are determined by CPU speed. Many people recording music would prefer not to enter the realm of overclocking, for obvious reasons. An EE could be justified for them, but hopefully they understand the cost/performance ratio first.
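The latency point above comes down to buffer size over sample rate: a slower CPU forces a larger audio buffer to avoid dropouts, and a larger buffer means more delay between playing a note and hearing it. A minimal sketch (the buffer sizes are typical DAW presets, chosen for illustration):

```python
# Audio buffer latency: latency_ms = buffer_frames / sample_rate * 1000.
# A faster CPU lets you run a smaller buffer without dropouts.

def buffer_latency_ms(frames, sample_rate=44100):
    """One-way latency in milliseconds for a given buffer size."""
    return frames / sample_rate * 1000

for frames in (128, 512, 2048):
    print(frames, round(buffer_latency_ms(frames), 1))
# 128  -> ~2.9 ms  (needs a fast CPU)
# 512  -> ~11.6 ms
# 2048 -> ~46.4 ms (audible delay while tracking)
```

Anything much past ~10 ms is noticeable to a musician monitoring through the DAW, which is why recording rigs are one of the few desktop workloads where paying for raw CPU speed has an obvious payoff.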
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Hah.. speaking of the devil..

NVIDIA CEO: "We're Going to Open a Can of Whoop Ass"

I think this is getting serious. *grabbing popcorn*

Throughout the bulk of the conference call, Huang continued to lament Intel for poor graphics performance. He hints at another Intel slide deck that claims Intel GMA 3100 is Windows Vista Premium compatible -- a claim which was debunked by Microsoft employees in a recent lawsuit.

Intel fired back minutes later, sending emails to analysts detailing NVIDIA's poor track record when it comes to Vista crashes due to incomplete drivers. Almost on cue, Huang responded once again.

"NVIDIA has to support several new titles every week," he said, alleging that Intel's graphics just have to support the basic office packages. "You already have the right machine to run Excel. You bought it four years ago," he said.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Lithan
Weakest celeron is faster than socket a's and many of my buddies still play on socket a's. In fact the only reason they even think about upgrading is to get pci-e. And their fps are just fine in HL2. Your claims quite simply aren't true.

You may very well be right about HL2, but are you sure you'd be happy with 7.1 FPS @ 1024x768 w/ 0x AA & 0x AF, with an 8800 Ultra: http://www23.tomshardware.com/...2&model2=945&chart=421 Yeah, that's a 2 Ghz single-core Athlon 64, not a 1.6 Ghz Celeron 420, but I can assure you that the A64 is at least as fast as a Celeron 420. BTW, my point isn't really that you're wrong, just that not all games are completely GPU-bound, as you seem to think. Most are, and with those games, I agree with you. A few like Supreme Commander, MS's FSX, and Crysis to some extent, aren't.
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
I am partial to NV when it comes to NV vs Intel. Reasons being:

1. (Comparatively) Underdog FTW :D
2. It is true that the GPU matters more than the CPU for gaming and possibly future computing. (Fusion, Cell, Larrabee, ..)
3. I don't want to see 'x86 everywhere' as Intel dreams.
4. This can unintentionally help AMD.
5. etc.

Of course the biggest factor that can decide the winner is Microsoft per usual. Devs will matter but not as much as OS & API.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
This fight is interesting and not unexpected. Nvidia is the odd man out in this situation as AMD/Intel work to build "platforms" around their CPUs/GPUs. Nvidia must be feeling heat from OEMs about the upcoming Intel GPU and felt the need to face it head on right now.

Like I said in the other thread. Intel has a set of arrogant balls to think game devs will build an engine to use ray tracing when they represent 0% of the performance gaming market.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Nvidia had better be somewhat careful. Intel has, and can acquire, vast resources to produce what they want. As they showed with the Core architecture, they can for the most part destroy the competition when they put their weight behind something. I'm actually glad Intel will be joining the GPU market. They may be basically starting from scratch, but with their muscle they may be able to pull it off.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: thilan29
Nvidia had better be somewhat careful. Intel has, and can acquire, vast resources to produce what they want. As they showed with the Core architecture, they can for the most part destroy the competition when they put their weight behind something. I'm actually glad Intel will be joining the GPU market. They may be basically starting from scratch, but with their muscle they may be able to pull it off.

That is very true. Although the company is larger, it does have divisions. Sort of like Samsung and Western Digital. Samsung is a much larger company, but the hard drive division has a set budget that is probably lower than what Western Digital has to work with. I imagine the CEO of the company would have to shift funds into their hard drive division if they wanted to dominate - something they would only do if it paid off in the end.

So a larger company does have more resources, but since often times they have several divisions, it generally means that those extra resources only really come into play if the company itself decides to shift and dominate.

Anyway, I am not arguing with you or disagreeing with you. In fact, I am sure you and everyone else already know this information. Just maybe helpful to some people. I too think Intel, if they put their mind to it, could produce some wicked GPUs. The problem is, Intel has never gone into this full force. They act like they are, but nothing comes of it. They fooled a lot of people with the GMA3000 and GMA3100 into thinking they were behind those products and writing competent drivers - but they weren't. It will be interesting, no doubt, and I'd be more than thrilled to have a CPU and GPU from Intel.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: Blacklash
I like nVidia and we saw what happened when AMD called out Intel. I wish them well.

Slightly different scenario, but I do agree Intel is a giant. Intel is moving onto Nvidia's home turf and not the other way around. I don't think Intel's track record of moving into other markets and dominating is stellar. Though I do applaud them for trying to make integrated graphics catch up to the low end of the current market. That helps the video game industry big time, imo.
 

PingSpike

Lifer
Feb 25, 2004
21,756
600
126
If Intel actually, with more than just words, entered the discrete graphics market and was willing to put the resources behind it, they probably could catch up and become competitive. In the past, it's been all talk... and frankly, their plans so far seem to be about moving the goalposts to a tech no one is really using (all ray tracing when the entire market uses something else) rather than going toe to toe with Nvidia and ATI on rasterization. So I kind of think they're probably all talk again. I don't doubt they could enter the market in force; I just think they don't want to. Because it's a messy market and they're used to striking from a dominant position? Who knows.

It's a strange time for all involved. With AMD purchasing ATI, Nvidia lost a pretty close business relationship. They go running to Intel, but with Intel developing its own division they'll never be tightly knit. Nvidia is afraid they're going to be left in the lurch. Both companies could effectively lock them out of the game if they wanted to, and if that happened there would be nowhere to run to anymore.

I think that's why you see Nvidia so aggressive lately. They know they have to be for the long-term health of the company.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Originally posted by: myocardia
Originally posted by: Lithan
The weakest Celeron is faster than Socket A chips, and many of my buddies still play on Socket A. In fact, the only reason they even think about upgrading is to get PCI-E. And their FPS are just fine in HL2. Your claims quite simply aren't true.

You may very well be right about HL2, but are you sure you'd be happy with 7.1 FPS @ 1024x768 w/ 0x AA & 0x AF, with an 8800 Ultra: http://www23.tomshardware.com/...2&model2=945&chart=421 Yeah, that's a 2 Ghz single-core Athlon 64, not a 1.6 Ghz Celeron 420, but I can assure you that the A64 is at least as fast as a Celeron 420. BTW, my point isn't really that you're wrong, just that not all games are completely GPU-bound, as you seem to think. Most are, and with those games, I agree with you. A few like Supreme Commander, MS's FSX, and Crysis to some extent, aren't.

Yeah, one game is CPU-bound @ 1024x768 with low details. Crysis? Not really. Where is the config info anyway? Because their 2004 numbers aren't even close. I could host with a Socket A three years ago with bots and get better FPS than some of their Core 2 Duos are getting.


And a 1.6 dual-core Celeron would destroy a 2.0 single-core Athlon... just destroy. Using a $45 CPU isn't fair when a $60 one is that much better. Hell, spring the whole $70 and get a Core 2. Or hell, drop a whole <$250 and get a quad if you want to play Supreme Commander. But 99.999999% of games played today still play fine on ANY CPU you can buy for modern platforms if you have enough mem and gfx.