Moore's Law is in its own bubble...


rimshaker

Senior member
Dec 7, 2001
Originally posted by: Wingznut
As has already been stated, Moore's Law is an observation... Not really a "law".

But it's also been said many times in this thread that semiconductor manufacturers are being driven by Moore's Law. And that's not really true.

Had Gordon Moore never made the statement in question, technology would still be advancing at the same rate. It's not like the lead engineers hold meetings wondering, "How can we maintain Moore's Law?" Moore's Law is never a factor when designing CPUs. Faster performance and lower cost are pretty much the driving factors.

Of course it was an observation, but when the industry stubbornly tries to keep up with that observation in its future node roadmaps, what difference does it make what exact terminology is used?

You're an employee of the most dominant company in the industry... I'm a little disappointed that your last statement says what it says. Actually, I've seen the opposite. I've read countless papers and articles over the past few years with engineers actually asking, "How can we maintain Moore's Law?" Obviously it has nothing to do with how well a certain CPU family performs or sells... I was pointing more towards the struggle with ever-shrinking process nodes.

 

Wingznut

Elite Member
Dec 28, 1999
Well, of course shrinking nodes are something they want to achieve. But it's in the name of faster processors, as well as cheaper manufacturing... Not to attain Moore's Law's next tier.

Nobody gives a rat's @ss about doubling transistor density if it doesn't mean a significant performance increase and/or a decrease in manufacturing cost.
 

Shalmanese

Platinum Member
Sep 29, 2000
Wingznut: Read the Ars article Buddah linked. Moore's Law originally stated that the number of transistors that can fit on the most economical chip will double. i.e., as chips get larger, your yield goes down; as chips get smaller, packaging costs go up. Somewhere in between there is a "sweet spot". Moore is saying that the transistor count at that sweet spot will double.
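
To make that tradeoff concrete, here's a back-of-the-envelope Python sketch of cost per transistor versus die size. Every constant in it (wafer cost, defect density, packaging cost, transistor density) is invented purely for illustration; it's the shape of the curve, not the numbers, that matters.

import math

WAFER_COST = 3000.0      # dollars per wafer (invented for illustration)
WAFER_AREA = 30000.0     # usable mm^2 per wafer (invented)
DEFECT_DENSITY = 0.005   # defects per mm^2 (invented)
PACKAGE_COST = 2.0       # dollars per packaged die (invented)
DENSITY = 100000         # transistors per mm^2 (invented)

def cost_per_transistor(die_area_mm2):
    dice_per_wafer = WAFER_AREA / die_area_mm2
    # Simple Poisson yield model: larger dice are more likely to catch a defect.
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2)
    good_dice = dice_per_wafer * yield_rate
    cost_per_good_die = WAFER_COST / good_dice + PACKAGE_COST
    return cost_per_good_die / (die_area_mm2 * DENSITY)

# Scan die sizes for the cheapest transistors -- the "sweet spot".
best = min(range(10, 500, 10), key=cost_per_transistor)
print(f"sweet spot: ~{best} mm^2, ${cost_per_transistor(best):.2e} per transistor")

Small dice waste money on packaging, big dice waste wafer area on defects, and the doubling Moore described is of the transistor count at the minimum of that curve.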
 
Jun 26, 2002

The only company that I have ever heard quote Moore's Law is Intel. Whenever Intel releases a new processor, they always have a little spot on the news comparing it to Moore's Law. When they came and talked to my graduating EE class, sure enough the slide with Moore's Law came up with all their processors on it. Kinda fitting, since Moore is an Intel co-founder.

I agree that as the CPU gets faster, the software will need more power. Look at Windows 95: I bet it would fly on a new 2.4GHz machine, but it was slow at the time. More than Moore's Law, I think the way the economy is set up drives the CPU market. The only way to differentiate a CPU now is by performance. With motherboards you can always compete on quality and features, but with Intel and AMD offering the same quality, how can you separate yourself? Speed. Of course, I am including features like MMX in the words "performance" and "speed."
 

dejitaru

Banned
Sep 29, 2002
Originally posted by: TheSnowman
Originally posted by: dejitaru
We can already render 100% realistic images and sounds; we have been able to for years. Simple ones, anyway.
Yeah, I can make an image that looks exactly like a real image rendered on a computer.


No, you can make it look like a picture/video shot in real life.
:D

 

kylebisme

Diamond Member
Mar 25, 2000
Seriously, I do the low-poly game stuff myself, but other people do some amazing photorealistic stuff with current technology. I did a quick Google image search and came up with this render here as a good example. There is also a great video on Alias|Wavefront's site, as well as plenty of other good examples all around if you look for them. Now you may say, "Oh, well, this doesn't look right, or that doesn't look right," but you are forgetting that everyone perceives things a bit differently, so someone else's render is never going to look exactly how you think it should. That does nothing to disprove the fact that photorealistic rendering has been around for quite some time, and it is getting even better every day.
 

dejitaru

Banned
Sep 29, 2002
There was some animation on a Discovery Channel special (after the dinosaurs, or whatever) that looked not just realistic, but real - in some cases. It was by far the best CG I'd ever seen. Proper implementation of blemishes is key.
 

dangereuxjeux

Member
Feb 17, 2003
I agree that CG has reached amazing levels, but when connecting that to the topic of the thread (or what it has become), you have to realize that nobody on a PIII is rendering things for Discovery Channel or ILM, etc. If people want to create things like these on their desktop computers (and who's to say that they won't... I run Maya Personal Learning Edition), then there's a reason for faster processors, system buses, and more RAM. As for the impact of increasing transistor density (Moore's Law), that doesn't apply solely to processors. With increased transistor density there will also be a great increase in the transistors available for system architectures (which leads to much greater performance increases) as well as other components. Interestingly, a similar "law" (or observation) also applies to DRAM (estimates of 16GB DRAM devices in 2007). In its history, every obstacle in the way of the continuation of Moore's Law has been overcome, and it is hard to imagine that similar feats will not occur in the future. Future barriers include quantum limits (essentially, the Heisenberg uncertainty principle stands in the way of further size reduction) and logic problems that will occur when systems are running at microwave frequencies.
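
Just to put numbers on the doubling, here's a minimal Python sketch of the projection. The ~55 million transistor baseline (ballpark for a current P4) and the 24-month doubling period are my own rough assumptions, not figures from any actual roadmap.

def projected_transistors(start_count, start_year, target_year, months_per_doubling=24):
    # Exponential growth: one doubling per fixed period.
    doublings = (target_year - start_year) * 12 / months_per_doubling
    return start_count * 2 ** doublings

# Assumed baseline: ~55 million transistors in 2003 (ballpark for a current P4).
for year in (2003, 2005, 2007):
    count = projected_transistors(55e6, 2003, year)
    print(f"{year}: ~{count / 1e6:.0f} million transistors")

Under those assumptions you get roughly 110 million transistors by 2005 and 220 million by 2007, which is the scale of budget that makes the extra architectural features possible.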

Less scientifically, but IMHO, improvements in GUI, gaming, and video all validate faster processors and other components, and I think even for those who will not use higher-level apps, GUI and usability improvements will continue to drive the need for hardware improvements for some time to come.
 

kylebisme

Diamond Member
Mar 25, 2000
The majority of computer users spend their time reading and writing email, working on Word documents, and surfing the web; a 486 Linux box suits those purposes just fine.
 

rimshaker

Senior member
Dec 7, 2001
Originally posted by: dejitaru
Who in the world needs multi-GHz computing?...
Quite a few people, and they're willing to pay for it.
A PIII machine is still more than enough for like 95% of computer users even today.
Did you wave your wand before pulling this number from a hat? Even the biggest newbie can appreciate the speed of a mid-range system. People ask me all the time why their Win98 box is so slow. It's not just (or even mostly) the techs who buy a machine the day it's released. If you're doing anything beyond word processing, you will utilize the extra speed.
Should manufacturers cripple their processors just because you say so?

I could certainly use the fastest system available, and I appreciate the fact that it exists.

I don't just wave magic wands or spit out opinions without some thought. Browse through the latest Maximum PC magazine; there's an interesting survey in there. I'm not surprised to see that the PIII is still the most widely used CPU and the GF2 MX is the most widely used video card. Yes, that pretty much reflects the 'other' 95% of the population.

 

Snooper

Senior member
Oct 10, 1999
This one always makes me laugh. There are always people that cry "This is fast ENOUGH!!!", people that scream "Why am I waiting on this damn machine!", and the majority who just keep using a semi-modern system, not knowing how fast (or slow) it is and not really caring.

For a VERY long time to come, there will be new programs that WILL take advantage of more power and smarter OSes and applications. If all you want to do is enter some text, you're right. All you really need is an old 286 with WordPerfect and a 12" monochrome screen.

Most people want a little bit more from their computers, and it's NOT just for gaming. I have a question for you: Can your computer understand what you say? Can you ask it a question and have it respond with an answer that actually makes sense? Hell, can your email program actually filter out junk email WITHOUT deleting messages you want to see? I doubt it. These things have a LONG way to go! And it is going to take a LOT of processing power to get there. Not double. Not 10 times. Not even 100 times the speed will get us there. But eventually we will have the hardware, and we will have the software tools (and those tools are going to be far, far, FAR more important to getting our "smart" computers than raw CPU cycles, due to the sheer complexity of the software at that point), and we will have our Star Trek-like computers.

Oh, and we will be able to play Doom XX with some REALLY kickass graphics and sound!
 

Tigerman

Banned
Feb 11, 2003
It takes 4 hours on a P3 1.2GHz to encode a movie into MPEG-1 for VCD in TMPGEnc.
I know this because it's what I do; 2 1/2 hours if I use CCE (but I don't usually, because it is a pain to set up).
So the more GHz the better. I really don't see a lot of improvement (if any) with the P4. The first thing I do when working on a new computer is test how long it takes to encode a VCD (an ENTIRE movie, not just a minute).
At the current rate I don't think I'll see a HAL 9000 in my lifetime (I'm 30).
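
For what it's worth, here's a naive Python sketch of how those encode times would scale if MPEG-1 encoding were purely clock-bound - a big "if", since the P4 observation above suggests architecture matters too. The 4-hour baseline on a 1.2GHz P3 is taken from the post; everything else is just arithmetic.

BASELINE_GHZ = 1.2       # the P3 in the post above
BASELINE_HOURS = 4.0     # full-movie MPEG-1 encode time from the post

def estimated_hours(clock_ghz):
    # Naive linear scaling with clock speed; real encoders rarely
    # behave this cleanly.
    return BASELINE_HOURS * BASELINE_GHZ / clock_ghz

for ghz in (1.2, 2.0, 3.06):
    print(f"{ghz:.2f} GHz: ~{estimated_hours(ghz):.1f} hours")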
 

sxr7171

Diamond Member
Jun 21, 2002
The pricing of Intel's top-of-the-line processors has always stayed constant at around $650, and the company stays profitable while making the huge investments to produce those top-end processors which obey Moore's Law, selling them at that price point. There have always been enough people willing to pay that price to make it worthwhile for Intel to produce these processors. You might be able to argue that sales of lower-end processors essentially serve to mitigate losses on the top-of-the-line processor (because it may be a loss leader; it would be nice if someone who knows for sure could confirm or deny this). In any case, a loss leader has a purpose in a given economic situation; it is useful for marketing. The development of these top-of-the-line products creates technologies that end up trickling down into cheaper products, which consumers buy in sufficient quantity to offset the billions of dollars it takes to develop them. In other words, it is very difficult to say that Moore's Law or even the opinions of engineers are driving the rate of progress in semiconductor technology - it is the free market that is. If companies were delivering products too fast and investing too much in R&D, they would have gone out of business already. Such is the power of economic forces - they are inescapable.


As for whether our computers are too fast today, let me give the example of the consumer video card market. When the R300 came out, I couldn't believe how fast the 9700 Pro was. Finally, here was a card that could run any game I wanted with 4x AA and 8x anisotropic filtering. Then the 9800 Pro was announced, and it is even more ridiculously fast. Then today I saw benchmarks of this card running a currently selling game, Splinter Cell. The card can't even hit 60fps at 1024x768; it does something like 34fps at 1600x1200, and this is as fast as it gets hardware-wise. The point of all of this is that as fast as our hardware gets, software will always come along and use that speed or power.
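
As a rough cross-check on those numbers, multiplying frames per second by pixels per frame gives the pixel rate the card is actually sustaining (ignoring the AA/AF overhead entirely). A quick Python sketch using the figures quoted above:

# Benchmark figures quoted above; 60fps is treated as an optimistic bound.
benchmarks = {
    "1024x768 @ 60fps": (1024, 768, 60),
    "1600x1200 @ 34fps": (1600, 1200, 34),
}

for label, (width, height, fps) in benchmarks.items():
    print(f"{label}: {width * height * fps / 1e6:.0f} Mpixels/s")

Even granting the full 60fps at the lower resolution, the card sustains a higher pixel rate at 1600x1200, which would suggest the low-resolution case is limited by something other than raw fill rate.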

You can do simple word processing and e-mailing on a 486, but you really can't run a modern OS like Win XP on it without getting frustrated waiting for simple things to happen (like opening Word XP). People want, and are willing to pay for, the features of Windows XP over Win 3.1 (which would probably run decently on that 486). People are willing to pay to have these huge, resource-intensive programs on their computers. It takes my anti-virus program an hour to scan my computer at 80% or more CPU utilization, and I have a P4 1.7GHz, which isn't great but is better than your average PIII system, on which it might take 2 hours and noticeably slow down the system for those 2 hours. People are willing to pay to not have to endure that. There are so many examples out there of tasks that average people want to do on their computers these days that are very processor-intensive.

Your point about subsystems not catching up is a valid one, but Moore's Law also estimates the pace of development of all semiconductors. It just happens that our CPUs are ahead of bus speeds and memory speeds, but those are catching up and may one day leave the CPU as the bottleneck in our systems (nothing is impossible).

At the end of the day it is the economy that drives the rate of development of processors.
 

rimshaker

Senior member
Dec 7, 2001
Originally posted by: sxr7171

At the end of the day it is the economy that drives the rate of development of processors.

Really? Ever since Intel broke 2GHz, all I've been seeing every other month is another speed upgrade... 2.2... 2.26... 2.4... 2.5... 2.53... 2.6... 2.66... 2.8... and now 3.06. And it's funny because the entire time I swear I haven't heard anyone yelling 'faster faster faster!!' (other than in the bedroom).
 

sxr7171

Diamond Member
Jun 21, 2002
Originally posted by: rimshaker
Originally posted by: sxr7171
At the end of the day it is the economy that drives the rate of development of processors.
Really? Ever since Intel broke 2GHz, all I've been seeing every other month is another speed upgrade... 2.2... 2.26... 2.4... 2.5... 2.53... 2.6... 2.66... 2.8... and now 3.06. And it's funny because the entire time I swear I haven't heard anyone yelling 'faster faster faster!!' (other than in the bedroom).

People are buying them; that's why they're making them faster, faster, faster!!! The 3.06 is a 50% speed improvement over the 2GHz, and people want them. Some people buy computers that they feel will hold them over for 3 years, and they like to buy the fastest processors. When you buy the fastest processor at any given time, your machine is king of the hill for a while; then it becomes just average, and slowly you'll find that there are some things you simply can't do with your machine.

When I got my 400MHz PII about 4 years ago, I thought it would handle everything thrown at it; I thought it was the supercomputer. Then came DivX, and I couldn't play DivX files with full post-processing; it would just stutter badly. Now my parents use the thing for e-mail and web surfing, and it is perfect for that (after a HD upgrade for speed and a memory upgrade - these newer OSes and newer browsers just crawl on older standard configurations). Now my newer 1.7GHz P4 won't smoothly play the huge hi-res WM9 demo files on www.windowsmedia.com. Already there are things my computer can't do. Now what if I want to do HDTV PVR in the future? Can my machine handle it? I'm fine for now, but when I buy a new machine I expect to get something a good 2-3 times faster. It feels good to have some headroom on the stuff you run every day, as short-lived as that tends to be. I am part of the market, and there are others like me, even people who want the latest technology every year. We are the people expecting faster, and we are the people that Intel expects to buy these faster processors.
 

rimshaker

Senior member
Dec 7, 2001
sxr7171,

I appreciate who you are, be it a computer enthusiast, hardcore gamer, or just someone with deep pockets who loves technology. I've been there, I'm still doing that, I AM that... at least a little, still. But do you have any idea how small a percentage we are of the total population? We're talking less than 5%. Companies survive and thrive thanks to the other 95%, not us. We're not the meat and gravy of their profits and revenue each quarter.
 

ReiAyanami

Diamond Member
Sep 24, 2002
Duh, we need super awesome computing so we can build a Matrix that will be our energy solution for the future ;)
 

DannyBoy

Diamond Member
Nov 27, 2002
I admit that Intel do seem to follow daddy's rule a bit.

I mean, all things considered, a 3000+ Barton XP clocked at 2.2GHz outperforms an HT-enabled 3.06 P4 in numerous processes and benchmarks, which does go to show that the "3.06" perhaps isn't deserved. I personally think that AMD have the more realistic clock speeds.

As for your thoughts on why we need anything quicker than a P3, rimshaker, that's perhaps one of the most ridiculous comments I've ever heard.

That's like putting a stop on technology itself. You also seem to forget that fancy top-end, high-clocked processors aren't just for fanatics and game geeks.
Computers are also used to perform many other tasks in life, some of the most important being medical research, etc.

If you can't appreciate technology and computer advancement when playing a game like Quake, I'm sure someone else might who's had their life saved because a faster processor on a computer somewhere helped discover a cure for the illness they had.

Dan
 

Apotherix

Senior member
Mar 6, 2003
I would agree, but I think there is plenty of demand for high-end computing, and there always will be (video editing, multitasking, hardcore gaming, etc.). I, for one, am not willing to spring for the thousands that it takes to live on the "cutting edge", and am willing to settle for a few months back. But you do have a good point. Software is falling behind in the race...
 

Mist

Member
Feb 19, 2003
Listen, I'm Scottish and many people can't understand what I am saying, never mind a CPU.

It's got no chance!

Michael.
 

ZapZilla

Golden Member
Oct 9, 1999
I for one won't be satisfied until I have no less than an Asimovian Positronic Brain on my desk.

Yup, "intelligent robots" are what I'm looking forward to, and word processing with a PIII isn't even remotely close.