
GeForce3.....Is Intel at it again?

paulzebo

Member
Too many things are starting to add up. First, Intel releases a P4 which has pipelining out the wazoo but is slower than the current PIII even at higher megahertz. AMD kills the new P4 in benchmarks and is the choice for all gamers due to its excellent price-to-performance ratio.

Enter Nvidia. We all wondered why there were so many driver releases, until it became known that they had hooked up with Microsoft for the XBox. Drivers, i.e. software, and Microsoft. Maybe the boys at Redmond were just fooling around with the current hardware to make a big leap with the XBox. Maybe not.

Enter Nvidia again. Why oh why do they issue a press release linking them to Apple, when their bread and butter will be from the PC market? Illusions? I think so. Read on.

Enter Nvidia again, with some benchmarks on the new GeForce3 which would seem to make it an awfully expensive option since most games go 16-bit anyway. See the other post in the forum. Comments on slow fill rates compared to what everyone expected: http://www.digit-life.com/articles/gf3/index.html

But, after reading the article above, I could not help but notice the results are from a PIII @ 1000. Now look at this article: http://www.hardocp.com/articles/nvidia_stuff/gf3_tech/

Notice what seems to be the Nvidia PowerPoint presentation. Notice also that it is not until Intel achieves a speed around 1200-1500 megahertz that the GeForce3 really shines. Stop me if I'm wrong, but the only thing Intel has in this area is none other than the poorly selling, slow by current standards, RAMBUS-infected....... P4.

Now ask yourselves, could all three of these companies be in bed together?

Until tests are performed on the three leading CPUs, the PIII, the AMD T-Bird 1200 and the P4-1500, we may never know. But all those pipelines in the P4 and GeForce3, HMMMM. Has the company which brought us RAMBUS been up to it again???

Just when I was thinking of upgrading to a T-Bird@1200. I guess I will wait until more tests are done.
 
It's called advertising. Apple pays Nvidia to release it for Mac first to get the spotlight put on them.
 
I think you might be misunderstanding that graph a little bit. It's NOT saying that "CPU mhz X gives GPU performance Y". Rather, time is on the X-axis, and performance on the Y. It's merely stating that, over time, the performance of graphics cards has grown faster than that of CPUs.
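To put rough numbers on that growth-rate comparison: a quick sketch, assuming CPU performance doubles about every 18 months (the Moore's-law pace) and GPU performance about every 9 months. Both doubling periods are illustrative assumptions on my part, not figures from the slide:

```python
# Both curves grow exponentially over time, but the GPU curve grows
# faster. The doubling periods used here are illustrative assumptions.

def relative_performance(months: float, doubling_months: float) -> float:
    """Performance relative to today, doubling every `doubling_months` months."""
    return 2.0 ** (months / doubling_months)

for years in (1, 2, 3):
    months = years * 12
    cpu = relative_performance(months, 18)  # assumed CPU doubling period
    gpu = relative_performance(months, 9)   # assumed GPU doubling period
    print(f"after {years}y: CPU x{cpu:.1f}, GPU x{gpu:.1f}")
```

The gap widens every period, which is all the graph is claiming.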
 
Plus there's the fact that neither Nvidia nor MS needs to buddy up to Intel. They're both doing wonderfully regardless of what processor a consumer chooses, and hence taking sides in a market that's looking to eventually balance out between AMD and Intel would be a bad move.
 
Sheesh, these conspiracy theorists are hittin' the ole crack pipe a little too hard once again.



<< hence taking sides in a market that's looking to eventually balance off between AMD and Intel would be a bad move. >>

Have you read what AMD's head of CPU operations, Ajay Marathe, said? They are working very hard to maintain the 17-18% market share they have gotten. His exact words were:

<< "With the right price points and the right geographies, we definitely can maintain our market share with the right execution," he said. AMD currently holds 17 to 18 percent share of the global market for microprocessors, although it is "behind the curve" compared with rival Intel Corp in Asia, he said. >>

 
moore's law doesn't apply to video cards because they started out way behind the curve. of course they're going to grow faster when there is technological headroom compared to what the bleeding edge faces. now that nvidia is the bleeding edge, they are finally coming to butt up against moore's law, and i don't think the result will be pretty.
 
Certainly graphics silicon has been playing "catchup" in terms of manufacturing density. I figure that Nvidia can always go multichip if the transistor count gets too high in their top-end designs. At least until the next round of die shrinks comes along again. I bet a dual .13u PCB design is easier than one with 2 chips at .22u like the Voodoo5. It'll certainly use less power.

Oh, and Moore's law refers to the density of silicon ICs in general, not any particular type of processor.

-Aaron
 


<< Sheesh, these conspiracy theorists are hittin' the ole crack pipe a little too hard once again. >>



Dude, you gave me a good laugh with that one. It entered my mind as well:

why are so many of them junior members with no profile?
 
It's accurate for CPUs. The problem is graphics cards are redesigned every 12 months or so, CPUs every 4-5 years.

Consider this: a GeForce DDR was clocked @ 120MHz, a TNT2 Ultra was clocked @ 150MHz. Which was faster? It's not always about clock speed.
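For what it's worth, that comparison comes down to theoretical fill rate, which is pixel pipelines times core clock. The pipeline counts below (4 for the GeForce DDR, 2 for the TNT2 Ultra) are recalled from period spec sheets and worth double-checking before quoting:

```python
# Theoretical peak fill rate = number of pixel pipelines * core clock.
# Pipeline counts here are assumptions recalled from period spec sheets.

def peak_fill_rate_mpix(pipelines: int, clock_mhz: int) -> int:
    """Peak fill rate in megapixels per second."""
    return pipelines * clock_mhz

geforce_ddr = peak_fill_rate_mpix(pipelines=4, clock_mhz=120)
tnt2_ultra = peak_fill_rate_mpix(pipelines=2, clock_mhz=150)
print(geforce_ddr, tnt2_ultra)  # the lower-clocked card comes out ahead
```

Which is exactly the point: more work per clock beats a higher clock.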

I think someone mentioned that Moore's law is going to hit a brick wall somewhere around 0.13 micron just because of other problems that are introduced at that feature size. Otherwise it's about 5-10 years from screeching to a complete stop.
 
Well, Moore's law can definitely be said to be false. A law doesn't fluctuate, it's constant. Moore's law can (and will) eventually not be correct, and therefore cannot be called a law (scientifically speaking). I mean, after humanity is gone (or the Universe converts all energy to kinetic and goes completely dark 🙂) then chip density sure ain't gonna be doubling every 18 months. It also would have to work backwards as well, and there were no ICs more than 40-50 years ago, hence their capacity couldn't have been doubling every 18 months then either. Hence Moore's "Law" really is a prediction at best, a joke at worst, but by no means a law.

Midnight Raider: When I said that AMD/Intel market share was going to balance out, I meant (and said) eventually, meaning over 3-4 years (to me). Market analysts, when making comments like the one you posted, are generally only looking at the upcoming year.
 
Nice response rate. And now to answer some of the questions.

First- Junior member does not always mean what you think. After 24 years in one of the world's five largest corporations, I can say that they, corporations, are always looking to have an advantage in the marketplace. Hmm, 24 years, that's more than likely longer than most senior members have been alive. Sounds like your Dad, doesn't it?

Second- Intel has hit the wall. Here in Phoenix it's common knowledge, and if you have been reading any of the stories in the papers (yes, they still have papers) you would know that. Poor chipsets, slow "new" P4's, and a chip which caused the first massive motherboard recall in history. Think liability there, plus the hit to that corporate profit.

Third- Every major corporation is looking at what it can do to please Wall Street. Intel's stock has been cut by more than half. It's sitting at 29 7/16 right now, down from over 70+.
If you think that Intel will not form strategic alliances with the gaming industry or any industry (RAMBUS) to get an edge on any competitor, you have been smoking that crack pipe, Midnight. The rules in corporations change when their stock goes down this far. Who set the SDRAM specs and bought into the company which held the patents? Not Intel? Be assured they wined and dined anyone who had a say in the position. Don't believe that? Take another hit on the pipe.

Fourth- Games drive the CPU speed race. Whatever you have now is likely to be just fine for the MS Office apps. No need to speed up here if business apps are all you care about. Intel knows this as all of us do. So when you design a processor, would you target office apps or games? The gaming tech passes right down to video, the second reason for a higher-speed processor.

Fifth- Nvidia went from a no-player to king. I'm sure they learned from the trip and will treat any competitor (ATI) as a serious threat to their long-term dominance. Intel started as king when the competitor held less than 1% of the market share. Now they hold 17-18%. That is a big difference, and Nvidia will more than likely work hand in hand with the majority player. I'm very sure that Intel hawked its P4 product to Nvidia while in the design stage. Think Nvidia didn't learn from the market share mistakes of Intel? Build a card around the next-generation CPU or stick with the older x86 systems? My money says this GPU was built around the P4. AMD will be playing catch-up, and the exchange rate will not always benefit us in the USA. When it swings back to the Euro, Intel will be the same price as AMD or less.

Sixth- Everyone used to say you could not go faster than the speed of sound (1950's). Today, it's commonplace. Several months ago I read where they now feel light is no longer the fastest thing in the universe. Moore's law may hit a wall at .13 micron as noted, but experience tends to dictate it will remain in effect long after the youngest member passes on.

Last- Can't do crack. Would not pass the company's drug testing (hair tested, not urine).




 
Videocard and hard drive makers seem to always take a half-step back before making a full step forward. They'll probably drop the speed back to 120MHz when QMR hits the PCB, too. Look at hard drives: how hard is it to make progress in the density of the platters versus the speed of the platter? Sometimes a half-step back is better.
 
Paulzebo: I really think you are looking WAY too deep at these preliminary benchmarks.



<< Notice what seems to be the Nvidia powerpoint presentation. Notice also, it is not until Intel achieves a speed around 1200-1500 megahertz does the GeForce3 really shine >>


This is a moot point. I would HOPE a brand-new graphics card would be on the cutting edge of technology; accordingly, you need the industry's best CPUs in order to take full advantage of it.



<< Build a card around the next generation cpu or stick with the older x86 systems. My money says this GPU was built around the P4. >>


I am not a graphics card expert, but I have serious doubts about whether this so-called GPU is going to give that great of a boost to the P4's crippled FPU.

We will see soon enough 🙂
 


<< Games drives the CPU speed race. Whatever you have now it likely to be just fine for the MS Office apps. No need to speed up here if business apps are all you care about. Intel knows this as all of us do. So when you design a processor, would you target office apps or games? The gaming tech passes right down to video, the second reason for a higher speed processor. >>

Utterly incorrect. The SERVER market drives the CPU speed race. You're out to lunch if you think it's games.

I can write code that will take a P4 to its knees. We're FAR from having "good enough" CPUs.



<< Intel's stock has been cut in more than halve. It's sitting at 29 7/16 right now. Down from over 70+. >>

Considering their P/E ratio is now a more realistic 16, I'd say they don't have anything to worry about.
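The arithmetic behind that figure is just the ratio of share price to earnings per share, using the numbers quoted in this thread:

```python
# Back-of-envelope: P/E ratio = share price / earnings per share,
# using the figures quoted in this thread.
price = 29 + 7 / 16        # $29 7/16, the quote cited above
pe_ratio = 16              # the P/E cited in the reply
eps = price / pe_ratio     # implied trailing earnings per share
print(f"implied EPS: ${eps:.2f}")
```

So even at half the old share price, the stock is priced against real earnings rather than hype.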



<< Moore's law may hit a wall at .13 micron as noted >>

There are already processes under way for .10 micron designs. You're a dope if you think Moore's Law won't apply for at least the next 5 years.

As for your comments on benchmarking, would you feel better if you saw GF3 scores on a Pentium 200?
 
Actually, it's a good analogy:

EDO RAM isn't really in production anymore, and it is far older, and is therefore more expensive and/or harder to find.
 
Shazam: sorry, but that's wrong. The server market is about and will be about stability and reliability, NOT raw speed. For at least the past year or two, CPU speed has become the domain of desktop and workstation processors, not servers.

Example : a few years ago, everyone was sure that Alpha would be first to 1ghz, as, from your comment, I would expect you would be.

Instead, look what happened. AMD went there first with a purely consumer-aimed processor, and Intel followed after with their (non-Xeon) P3. Now the P4 is the clockrate king, but guess what, the P4 is NOT and will not be a server solution, at least until Foster. (Even then, Foster is more of a workstation-aimed solution; IA64 is the server solution.) Most server-related solutions are toiling along at less than half the clockrate of the P4. Of course, clockrate is not the same thing as speed, but that's not the point here.

So. What drives the consumer market? The only consumer-related products that actually push the processing limits right now are games. So I think that comment is a fair one.
 
"Intel has hit the wall. Here in Phoenix it's common knowledge and if you have been reading any of the stories in the papers (yes, they still have papers) you would know that."

What "wall" is that?
 
The "WALL" is the one which every company serves: the ultimate consumer. Intel has hit it hard by losing market share to rival AMD. One could expect the reasonable consumer to pay a premium to see "INTEL INSIDE". But would the average consumer pay a $400 tariff? No. They buy PIIIs and AMDs, but not the high-margin P4. Here in Phoenix we have a little paper that cheers for the hometown industry but does tell of the problems they are having. Yes, Intel has a major presence in Phoenix and New Mexico, our neighbors to the east.

Read the articles below. The first tells that Intel is spending some 12 BILLION on FABs and research. Jeez, that's a lot even to the government. The second tells that Intel is subsidizing Samsung to make RDRAM, essentially giving up some of the profit it makes on its share of RAMBUS so one of the world's largest memory makers will manufacture RDRAM at a lesser cost in quantity. Something RAMBUS cannot do alone. Third, Intel is sticking to RAMBUS as its favorite child but will embrace a stepchild, DDR memory.

http://electic.com/webnews.shtml#983328417

http://www.tomshardware.com/technews/index.html

http://www.xbitlabs.com/news/#983408273

http://www.xbitlabs.com/news/#983231044


Given all this, is it not reasonable to believe this company would court NVIDIA over its new GF3 in the design stage and help them develop it to showcase the P4? Look at the timeline when the GF3 was supposed to be out. Right around the time the P4 came in. No, not right on it, but very, very close. Keep in mind Intel would want backward compatibility with the PIII, which allows AMD to still be a player. Slower, but a player.

I guess we will have to wait until the GF3 is tested side by side with the AMD 1200 o/c'ed to 1500. No, it would not make me feel better if it were tested with a P200. I just believe that if you have access to the product (GF3) you more than likely have access to an AMD 1200 and a P4 1500. So why do you choose to benchmark it against the last-generation PIII? Not unless someone told you to.
 
I don't know paulzebo- you're making a lot of assumptions on pure speculation.

At no time has even a rumor come up that Nvidia is "in bed" with Intel or anti-AMD in the slightest.

And I don't think that it's very likely that anything could be engineered to favor only the PIV besides having SSE2-optimized drivers, which every video card manufacturer will have as well. Also, the next refresh of the Athlon will have SSE2 instructions (which one again, guys?)

I bet we'll find the GF3 nearly equally competitive with the P3, P4, and Athlon- and probably fastest on the 1.2GHz Athlon.

And in any case- the GF3 is significantly faster in high res/32-bit than the GF2 Ultra and smokes with FSAA.

And if you want to be really off the wall- Nvidia would prefer to slow down CPU acceleration. Why? B/c as CPUs get faster and faster, they can do in software what Nvidia's GPUs do in hardware, making having the newest and best Nvidia card moot. So to showcase how their GPU is the best thing since sliced bread, they would prefer to have CPUs advance more slowly than they have.

I don't believe that statement at all, but hey, one harebrained idea deserves another.
 
Process technology is not going to end at 0.13um. It will be very hard to use current UV technology for photolith, but they are busy working on that. I don't see any problems other than power as we scale down below 0.07um. Somewhere around 0.05um it may start to get a little more difficult - of course, this is what engineers have always been saying. I remember people saying that about 0.35um back when the world was at 1um. To paraphrase something Craig Barrett said recently, we can see forward fairly clearly for the next 10 years, and there will be challenges to be overcome as there always are, but it should be business as usual for the next 10 years. That will put us well past 0.07um. Heck, I just started work on a microprocessor that uses process technology smaller than 0.13um.

And, Paulzebo, I wonder where you are getting Intel and AMD's market share numbers. Are you guessing these? Or can you provide a link - or even just a source?

Remnant is totally correct that servers and workstations are all about RAS (Reliability, Availability and Serviceability), not raw speed. Speed is a good thing, but it's not at the top of the list. For most Fortune 500 companies it's not even in the top three.
 
"It will be very hard to use current UV technology for photolith"

As a Lithography technician, I can assure you that we'll be able to handle such challenges. Hell, we are already doing things that aren't "supposed to be" possible... 😉


paulzebo, of those articles that you listed... which one says anything about Intel having problems? Do companies in a bind become that aggressive with R&D and manufacturing? Or invest that much into making RDRAM more viable for the consumer? And how is it bad to allow the consumer their option of which RAM to use?

You say that Intel has "hit the wall", as if that's a bad thing. So far, you've only pointed out how Intel is progressing into the future.
 