A Typhoon in a teapot coming?


apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: PingSpike
If Intel actually, with more than just words, entered the discrete graphics market and was willing to put the resources behind it, they probably could catch up and become competitive. In the past, it's been all talk... and frankly, their plans so far seem to be about moving the goalposts to a tech no one is really using (all ray tracing, when the entire market uses something else) rather than going toe to toe with NVIDIA and ATI on rasterization. So I kind of think they're probably all talk again. I don't doubt they could enter the market in force, I just think they don't want to. Because it's a messy market and they're used to striking from a dominant position? Who knows.

It's a strange time for all involved. With AMD purchasing ATI, NVIDIA lost a pretty close business relationship. They go running to Intel, but with Intel developing its own graphics division they'll never be tightly knit. NVIDIA is afraid they're going to be left in the lurch. Both companies could effectively lock them out of the game if they wanted to, and if that happened there would be nowhere to run anymore.

I think that's why you see NVIDIA being so aggressive lately. They know they have to be, for the long-term health of the company.

NVIDIA has alternatives besides becoming a [very profitable] niche player like Apple if AMD and Intel successfully squeeze them. And the "squeezing" depends on how much NVIDIA is viewed as a "threat". Tempest in a Teapot describes it, as it appears to me to be mostly "PR" .. a "war of words" .. as Intel and NVIDIA still appear to be exploring and testing their odd and very competitive "partnership".

NVIDIA is cash rich, and they command the performance, enthusiast, and high-end GPU market. They may feel the need to expand into CPUs but would need to buy VIA for the licensing [and that IS risky] .. or even AMD [and that is unlikely]. AND if they rush headlong into CPUs they will meet a very strong response from the entire industry. It appears to me that they want to take the entire add-in PC graphics market away from AMD and will make a push into every price segment with excellent value that AMD will have a difficult time matching.

However, NVIDIA needs to be careful with Intel in taking the IG market. The margins are slim but there is a lot of volume. They want to partner with Intel yet they need to compete with them. I am *hoping* they have a genius to guide them, as this is where it gets tricky, and I will not attempt my analysis here.

Intel is LAZY [again] in my opinion - their talk of Ray Tracing is all Smoke and Mirrors designed to Bullsh!t us - just like with their stupid P4 and nonsense talk of 10GHz when they could barely get a third of it.

NVIDIA can tell if *I* can tell, and here is Intel's weakness for the world to see .. they have no clothes - no substance - and their attempts to "buy" devs are PATHETIC - and their forcefield is wavering.

"Fire on the DeathStar" is my recommendation to NVIDIA
:sun:>>>>:moon:


 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: apoppin
They may feel the need to expand into CPUs but would need to buy VIA for the licensing [and that IS risky] .. or even AMD [and that is unlikely].

Actually, I thought the license can't be transferred even if they're bought out.

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: thilan29
Originally posted by: apoppin
They may feel the need to expand into CPUs but would need to buy VIA for the licensing [and that IS risky] .. or even AMD [and that is unlikely].

Actually, I thought the license can't be transferred even if they're bought out.

You'd keep 'em as Via - a Division of NVIDIA :p

Graphics powered by NVIDIA
CPU Powered By Via/NVIDIA




it might work .. they need to tiptoe ...
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: apoppin
Originally posted by: thilan29
Originally posted by: apoppin
They may feel the need to expand into CPUs but would need to buy VIA for the licensing [and that IS risky] .. or even AMD [and that is unlikely].

Actually, I thought the license can't be transferred even if they're bought out.

You'd keep 'em as Via - a Division of NVIDIA :p

Graphics powered by NVIDIA
CPU Powered By Via/NVIDIA




it might work .. they need to tiptoe ...
Actually you'd keep S3 as a division of NVIDIA. VIA proper doesn't have any notable licenses, while S3 holds some patents Intel uses for Itanium. It's S3 that keeps Intel from marginalizing VIA.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ViRGE
Actually you'd keep S3 as a division of NVIDIA. VIA proper doesn't have any notable licenses, while S3 holds some patents Intel uses for Itanium. It's S3 that keeps Intel from marginalizing VIA.

Thank you!!!

the piece that fits just right

Much [much] better, yes - heck, NVIDIA could buy them both in a single swoop and sell VIA off piecemeal for cash, as they don't seem to be doing so well atm!! Then they make their new S3 division into a CPU-creating powerhouse with the engineers they picked up a couple of years back who specialize in working with graphics, a la Fusion, and NVIDIA is good to go for the next century .. well, ten years barring disaster ... and Intel :p

.. Someone here really gets it, and I think nV already does too, as they are pretty smart long-term strategic planners [unlike the big Lazy Boys, who lucked out last time because AMD got cocky] .. I considered these options about this time last year, when my ideas were simply considered way too radical.



 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Living in Silicon Valley, I find it interesting that the two headquarters are literally a mile apart from each other... I wonder what happens if both CEOs accidentally meet for lunch... ;)

As the AMD vs. Intel fight shows, the clear winner here will be the consumer - I hope innovative new things come to fruition from this fierce competition!
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: myocardia
Originally posted by: Lithan
The weakest Celeron is faster than Socket A chips, and many of my buddies still play on Socket A. In fact, the only reason they even think about upgrading is to get PCI-E. And their FPS are just fine in HL2. Your claims quite simply aren't true.

You may very well be right about HL2, but are you sure you'd be happy with 7.1 FPS @ 1024x768 w/ 0x AA & 0x AF, with an 8800 Ultra: http://www23.tomshardware.com/...2&model2=945&chart=421 Yeah, that's a 2GHz single-core Athlon 64, not a 1.6GHz Celeron 420, but I can assure you that the A64 is at least as fast as a Celeron 420. BTW, my point isn't really that you're wrong, just that not all games are completely GPU-bound, as you seem to think. Most are, and with those games, I agree with you. A few, like Supreme Commander, MS's FSX, and (to some extent) Crysis, aren't.

Those games tend to balance the load more toward the CPU than many other games, but a faster GPU will still give them a bigger performance boost than a faster CPU.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
There has been some speculation in the technical community over the last few months about ray tracing becoming the dominant technique for rendering real-time games.

PC Perspective recently interviewed two of the world's most prominent game developers, Cevat Yerli (Crytek, creators of Crysis) and John Carmack (id Software), and they both cast doubt on the claims about ray tracing. They said rasterization will continue to be the dominant form of rendering in the future, and that although ray tracing may be useful for some effects, GPU technology will easily handle them.

Crytek's Cevat Yerli Speaks on Rasterization and Ray Tracing

John Carmack on id Tech 6, Ray Tracing, Consoles, Physics and more

Editorial comments from Ryan Shrout: (PC Perspective)

"Intel's push for ray tracing is more than likely a result of their upcoming Larrabee architecture and its inherent weakness with current GPU programming models; because it is essentially a many-core x86 compatible device it will be forced to emulate DirectX and OpenGL and probably won't have the power to compete with the dedicated GPU products from NVIDIA and AMD out of the gate. Emulation always produces a performance overhead that will cut back on the utilization of the full power of Intel's Larrabee and obviously they'd rather that NOT be the case.

"Cevat does see a future for ray tracing, but more in the form of a mixed rendering design that probably won't be implemented for several years; five or more. By his count, for the next three years or so rasterization will continue to be the dominant rendering method for games, and thus any potential graphics hardware for this market will need to be compatible with and perform well on rasterization. Cevat thinks that in a time span of three to five years we might begin to see some implementation of ray tracing in games, but not in the pure, classical ray tracing fashion. Instead we will likely see the hybrid rendering techniques that we have discussed several times in previous interviews: ray tracing for shadows, certain reflective objects, etc."

Given the comments of these two renowned developers, ray tracing is not as close as you might think. And even when it does arrive, it may only be a part of the overall rendering process in gaming. In other words, Intel cannot rely solely on ray tracing, but needs to support full rasterization as well. We're talking FIVE years out until even THIS point is reached.
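
To make the "hybrid rendering" idea concrete, here is a toy sketch of the two-stage approach Cevat describes: a raster-style pass that fills a per-pixel G-buffer, followed by traced shadow rays. The scene, the names, and the orthographic "raster" stand-in are all made up for illustration - this is not code from Crytek, id, or Intel.

```python
import math

WIDTH, HEIGHT = 48, 22                       # tiny "framebuffer", printed as ASCII
LIGHT = (4.0, 6.0, -2.0)                     # point light position
SPHERE_C, SPHERE_R = (0.0, 1.0, 0.0), 1.0    # occluder sphere resting on a ground plane


def ray_sphere(origin, direction, center, radius):
    """Smallest positive t where a normalized ray hits the sphere, else None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c                   # a == 1 because direction is normalized
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None


rows = []
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # "Raster" stage stand-in: an orthographic top-down view of the ground plane
        # y=0, so the per-pixel G-buffer is simply the world position and normal
        # (a real engine would get this from the GPU rasterizer).
        x = (i / WIDTH - 0.5) * 10.0
        z = (j / HEIGHT - 0.5) * 10.0
        pos, normal = (x, 0.0, z), (0.0, 1.0, 0.0)

        # Ray-traced stage: one shadow ray from the surface point toward the light.
        to_light = tuple(LIGHT[k] - pos[k] for k in range(3))
        dist = math.sqrt(sum(v * v for v in to_light))
        shadow_dir = tuple(v / dist for v in to_light)
        t_hit = ray_sphere(pos, shadow_dir, SPHERE_C, SPHERE_R)
        in_shadow = t_hit is not None and t_hit < dist

        # Simple Lambert shading, zeroed where the shadow ray is blocked.
        lambert = max(0.0, sum(normal[k] * shadow_dir[k] for k in range(3)))
        shade = 0.0 if in_shadow else lambert
        row += " .:*#"[min(4, int(shade * 5))]
    rows.append(row)

print("\n".join(rows))
```

The split is the point: primary visibility stays cheap and raster-friendly, while only the effects that genuinely need rays (shadows here, reflections in Cevat's examples) pay for ray traversal.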
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Everyone knows that the best bang for the buck is to spend more on the GPU, for gaming anyway. A "massive assault" would be more than a few interviews or investor meetings. I assume when a new super GPU hits the market, this stuff will fade into the viral background.

No doubt Intel has the resources to compete, but one would expect Keys is right and we are a few years away.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
Everyone knows that the best bang for the buck is to spend more on the GPU, for gaming anyway. A "massive assault" would be more than a few interviews or investor meetings. I assume when a new super GPU hits the market, this stuff will fade into the viral background.

No doubt Intel has the resources to compete ...

Bluff

:p

maybe in ten years

analysis

OK, let's *say* that Intel is going to CRIPPLE CPU production and put TEN times more R&D and big $$$ into GPUs.

It takes 5 years to make an architecture work properly with everything - and that's when you already know how to do it.

... and in the meantime Phenom catches up .. ouch!

NOPE .. Intel pulled their OLD *P4 10GHz / NetBust* PR team out of the mail room and gave them this campaign again.

While Intel *figures out* how to do it, we see a Distraction Dance with their PR bullsh!tting us about Ray F'ing Tracing - which every serious dev is LAUGHING at [behind Intel's back; it IS Intel, and they can't be as free as me].

So, Intel's STRATEGY is clearly to bu .. [sorry] influence devs with "research" and throw up a PR campaign of *Obfuscation* with Smoke and Mirrors.

I am famous here for reminding members to look to the past


.. and Intel is just "doing it" again
:disgust:

conclusion:




Fire on the Death Star
:sun: >>>>>>> :moon: <<<<<<<:sun:
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Thank you very much for the links. I was unclear about what exactly ray tracing is and how it differs from rasterization, and the linked interviews and other articles from there helped me understand the two.

In the meantime, NV has begun the campaign - you can check it out on their front page. :D

http://www.nvidia.com/object/balancedpc.html
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Sable
Originally posted by: lopri

In the meantime, NV has begun the campaign - you can check it out on their front page. :D

http://www.nvidia.com/object/balancedpc.html

Bwahahahaahahaha!!!

Cripes nvidia. This is nearly as bad as Creative when they made that "experience graph".

http://images.asia.creative.co...me_audio/xfi_graph.jpg


Total PC performance = 535%? Gimme a break. :laugh:
It's a bit out there, but they do list their testing methodology and it's sound. Creative's graph was BS in every which way as far as I know, while the numbers NVIDIA has posted are plausible. I'd imagine we could even get them to cough up the exact results if we tried hard enough.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Fact 1: GMA sucks utter balls for gaming. Just to give you a reference point, my G33 chipset drops as low as 14 FPS @ 320 x 240 in UT2004. That's right, 320 x 240 in a four-year-old game. And that's with no AA or AF either.

Fact 2: I can't see how a ray tracing approach is going to run existing Direct3D/OpenGL games without a massive performance hit through emulation.

Fact 3: Games are generally GPU-bound far more than CPU-bound. Orders of magnitude more.

Based on these facts, I don't see Intel taking any GPU crown any time soon. Maybe in 5-10 years' time, but by then nVidia may well have moved to ray tracing too.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ViRGE
Originally posted by: Sable
Originally posted by: lopri

In the meantime, NV has begun the campaign - you can check it out on their front page. :D

http://www.nvidia.com/object/balancedpc.html

Bwahahahaahahaha!!!

Cripes nvidia. This is nearly as bad as Creative when they made that "experience graph".

http://images.asia.creative.co...me_audio/xfi_graph.jpg


Total PC performance = 535%? Gimme a break. :laugh:
It's a bit out there, but they do list their testing methodology and it's sound. Creative's graph was BS in every which way as far as I know, while the numbers NVIDIA has posted are plausible. I'd imagine we could even get them to cough up the exact results if we tried hard enough.

It's actually kind of nice to see that someone is finally pushing the concept of the balanced PC. I think that most people on the AT forums already understand this (and therefore take it for granted), but to many consumers this is a 'new' concept. The tough part for NV is that most consumers don't know which card is better between an 8800GT 512MB and an 8600GT 1GB.
 

Lithan

Platinum Member
Aug 2, 2004
2,919
0
0
Holy shit. PCs can only do Multimedia and Games! What a waste of money - I'm buying a TV and an Xbox instead.


But I gotta admit, Nvidia wins at math. According to them, in a budget PC an E4500 costs the same as an 8600GT. In a midrange PC an E4500 costs the same as a 9600GT. And in a high-end PC an E4500 costs the same as an 8800GT.

Hey, who wants to trade me an 8800GT for an 8600GT, since they cost the same?

Oh, and apparently a shader unit counts as a GPU now. :thumbsup:
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Lithan
Holy shit. PCs can only do Multimedia and Games! What a waste of money - I'm buying a TV and an Xbox instead.


But I gotta admit, Nvidia wins at math. According to them, in a budget PC an E4500 costs the same as an 8600GT. In a midrange PC an E4500 costs the same as a 9600GT. And in a high-end PC an E4500 costs the same as an 8800GT.

Hey, who wants to trade me an 8800GT for an 8600GT, since they cost the same?

Oh, and apparently a shader unit counts as a GPU now. :thumbsup:

...assuming you have an Intel integrated-graphics motherboard and the rest of the system is equal (all prices Newegg):

Q6600 $250 + GMA X310 $0 ($250) -- E4500 $120 + 8600GT $75 ($195)
Q6600 $250 + 8400GS $35 ($285) -- E4500 $120 + 9600GT $150 ($270)
Q6600 $250 + 8500GT $50 ($300) -- E4500 $120 + 8800GT $180 ($300)

The point they are making is that if you spend a bit less on the CPU but a bit more on the GPU, you will end up with a better-performing multimedia/gaming PC for your money. I am inclined to agree with them...
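
For anyone who wants to check the build math, a quick throwaway sketch - the prices are just the rough Newegg figures listed above, and the totals are recomputed from the components, nothing official from NVIDIA:

```python
# Recompute the build totals from the component prices listed above.
builds = [
    # (tier, CPU-heavy build: Q6600 + weak GPU, GPU-heavy build: E4500 + better GPU)
    ("budget",   250 + 0,   120 + 75),   # Q6600 + integrated vs E4500 + 8600GT
    ("midrange", 250 + 35,  120 + 150),  # Q6600 + 8400GS     vs E4500 + 9600GT
    ("high-end", 250 + 50,  120 + 180),  # Q6600 + 8500GT     vs E4500 + 8800GT
]

for tier, cpu_heavy, gpu_heavy in builds:
    diff = cpu_heavy - gpu_heavy
    if diff == 0:
        verdict = "same price"
    else:
        verdict = f"GPU-heavy build is ${abs(diff)} {'cheaper' if diff > 0 else 'dearer'}"
    print(f"{tier:9s}: CPU-heavy ${cpu_heavy} vs GPU-heavy ${gpu_heavy} -> {verdict}")
```

At these prices the GPU-heavy build is the same price or cheaper at every tier, which is exactly the comparison NV is drawing.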
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: nitromullet


The point they are making is that if you spend a bit less on the CPU but a bit more on the GPU, you will end up with a better-performing multimedia/gaming PC for your money. I am inclined to agree with them...

Only because currently the sweet-spot mainstream CPUs are pretty darn close to their high-end equivalents. 30% more performance in a *few* applications costs you $1300 more. And for some SKUs it's almost as bad -- 5% more CPU and nearly zero difference in apps for a 100% increase in price (Q6600 vs Q6700). And once you factor in overclocking you've got $60 E2xxx CPUs dancing with their $1500 cousins, at least in a few applications.

With GPUs, the difference between integrated onboard and high end has been as high as 2400%. Last generation you saw abortions like the 8600GTS and 2600XT yielding 25% of the performance of their high-end cousins at 60% of the asking price (at release).

Today the picture is not as clear-cut as NV would like -- the performance difference between a 9600GT and a 9800GTX isn't all that large even on NV's graph, but the difference in, e.g., Supreme Commander between an E4500 and a Q6600 is sizable. A $200 price differential vs $100 - the CPU upgrade is the lower price and gives the better performance.

Once a true high-end GPU is released by either NV or ATI, the picture will change again, of course. But now is not the best time to beat the GPU-vs-CPU drum, IMO.
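
Putting rough numbers on that last-generation example - a tiny illustrative calculation using the 25%-performance-at-60%-price figure quoted above, not fresh benchmarks:

```python
# Performance-per-dollar of a launch-price midrange card relative to its
# high-end sibling, using the rough figures quoted above (25% of the
# performance at 60% of the price) - illustrative, not measured.
def perf_per_dollar(relative_perf, relative_price):
    return relative_perf / relative_price

high_end = perf_per_dollar(1.00, 1.00)   # baseline: the high-end card
midrange = perf_per_dollar(0.25, 0.60)   # e.g. an 8600GTS-class part at launch

print(f"midrange perf/$ = {midrange / high_end:.0%} of the high end's")
# -> about 42%: at those launch prices the midrange part was the worse value,
#    which is the point about last generation's lineup.
```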
 

adlep

Diamond Member
Mar 25, 2001
5,287
6
81
Originally posted by: BFG10K
Fact 1: GMA sucks utter balls for gaming. Just to give you a reference point, my G33 chipset drops as low as 14 FPS @ 320 x 240 in UT2004. That's right, 320 x 240 in a four-year-old game. And that's with no AA or AF either.

Fact 2: I can't see how a ray tracing approach is going to run existing Direct3D/OpenGL games without a massive performance hit through emulation.

Fact 3: Games are generally GPU-bound far more than CPU-bound. Orders of magnitude more.

Based on these facts, I don't see Intel taking any GPU crown any time soon. Maybe in 5-10 years' time, but by then nVidia may well have moved to ray tracing too.

That is also my experience.
It means that, for example, an ATI Radeon 9800 or NVIDIA 6600 series discrete part is still faster than even the latest integrated/chipset video card.

Also, if the CPU speed is not enough, just overclock... :)