Moore's Law is in its own bubble...

rimshaker

Senior member
Dec 7, 2001
722
0
0
Let's be realistic for a change. A PIII machine is still more than enough for like 95% of computer users even today. The huge tech boom of the late 90's seems to have pushed the chip makers into a horse race, with ridiculously speedy new processors coming out every other month. Other than the scientific community and deep-pocket gamers with no life (yes, that was me once, so don't hate), who in the world needs multi-GHz computing?..... And don't compare that statement with how things were 20-30+ years ago. Back then everything was on a 'linear' scale.

All for what? Just to keep up with what someone said decades ago? So what!!?? It just seems the semiconductor industry has adopted Moore's Law like it was one of the ten commandments or something. And if they continue to base their whole survival on obeying what one guy said, then this economy is really gonna get hit hard in the near future when demand for 'desktop supercomputing' starts leveling off. Take a look at chip density/performance since Moore's Law started. Look at it on a regular scale, not a log scale. Everyone can see a huge exponential curve. Look familiar? I dunno... maybe it's just me... but nothing in nature ever stays stable on an exponential curve.
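Just to put rough numbers on that curve (a quick sketch, assuming the ~18-month doubling period that usually gets quoted for Moore's Law and an arbitrary 1x starting point, not real industry data):

```java
// Rough illustration only: project relative "density" under an ~18-month
// doubling period, starting from an arbitrary baseline of 1x in year 0.
public class MooresCurve {
    public static void main(String[] args) {
        double doublingYears = 1.5;  // ~18 months per doubling
        for (int year = 0; year <= 30; year += 5) {
            double factor = Math.pow(2.0, year / doublingYears);
            System.out.printf("year %2d: about %,.0f x the starting density%n", year, factor);
        }
    }
}
```

Plot those numbers on a regular axis and you get the hockey stick I'm talking about; on a log axis it's just a straight line.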

If anything, I think what really needs to happen now is a new period of expansion and improvement for the other subsystems in a machine so they can catch up with CPU performance. When the processor becomes the bottleneck again (remember those days?)... that's when it's finally safe to buy semiconductor stocks again.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Just because nobody has yet built the "killer app" for multi-GHz systems doesn't mean we should start getting complacent now. Personally, I think the most likely killer app would be fully natural speech recognition. Something like this:

"Okay, lets open up a google window and search for Risc Processors.

Hmm... Gimme the third link on the page. Scroll down a bit... no too far.

Okay, see that paragraph starting with "the architecture is most noted for"?

Copy that entire paragraph and paste it into notepad for me.

Okay, let's open up a Photoshop window.

Okay, go back to the Googled page and get me that image right after the text I copied.

Yup, paste it into photoshop for me.

Okay, now emboss a THG logo down the bottom and post it onto Tom's Hardware.

Done? Okay, shut down then."

Something like that would completely revolutionise computer interfaces and would likely require a buttload of CPU cycles. Naturally, I don't expect it to spring up overnight, but even integrating simple commands like "open explorer" into Windows is a first step.

Apart from that, it seems that most other killer apps (telepresence, for example) require too much bandwidth to be feasible in the short term (look how long it's taken to get everybody to 56K).
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
Speech recognition capability is already out there. But it sounds like you're referring to a Star Trek environment, where the computer can fully comprehend and distinguish all the slang and minute nuances in natural speech. Yeah, I agree, one day this scenario will probably be commonplace. But the artificial intelligence factor would have to improve vastly. I just wanted to comment on Moore's Law, and how it doesn't make sense in these times to continue following it just to keep it alive... or just to say you're following it.
 

AnthraX101

Senior member
Oct 7, 2001
771
0
0
You are simulating nature with a completely logical device, a computer. In order to get a more realistic view, you just need to keep throwing processor cycles at it. You only approach exact reproduction as you approach an infinite number of processor cycles.
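A toy example of that point (my own illustration, nothing more; the seed and sample counts are arbitrary): a Monte Carlo estimate of pi only gets close to the real value as you throw more cycles at it, and you'd need infinitely many samples to nail it exactly.

```java
import java.util.Random;

// Toy illustration of "more cycles => closer to reality": estimate pi by
// throwing random darts at the unit square and counting hits inside the
// quarter circle. The estimate only converges as the sample count grows.
public class MonteCarloPi {
    public static void main(String[] args) {
        Random rng = new Random(42);
        long inside = 0;
        long samples = 0;
        long[] checkpoints = {1_000L, 100_000L, 10_000_000L};
        for (long target : checkpoints) {
            while (samples < target) {
                double x = rng.nextDouble();
                double y = rng.nextDouble();
                if (x * x + y * y <= 1.0) inside++;
                samples++;
            }
            double estimate = 4.0 * inside / samples;
            System.out.printf("%,d samples: pi ~ %.5f (error %.5f)%n",
                    samples, estimate, Math.abs(Math.PI - estimate));
        }
    }
}
```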

20 or 30 years ago, the growth was also exponential, it was just on a smaller scale. We zoomed out a bit.

There will always be a demand for bigger, better, faster, and stronger, if for no other reason than to push the older stuff down to the bottom of the heap. Consider the last processor I bought, a 1900+. I got it for about $250. Before that, I had a 750 MHz T-Bird. That cost me about $250. And before that? A 333 MHz Pentium II. You guessed it, $250.

The market will advance, and people will keep getting newer things. Processors will be the bottleneck sometimes, memory will be the bottleneck at other times, and then it will be the video card's turn. They will all advance in their own time, and others will fall to the back.

What are we going to 'need' all that processing for? Better graphics, faster responses, and more advanced simulations. And if you don't believe that, then why are you using your operating system now? 95 was good enough for anybody. So was DOS. And tubes.

AnthraX101
 

Cycad

Golden Member
Oct 18, 2000
1,406
0
0
I agree that processors are ridiculously fast now and the industry makes them solely to sell in the main market, home computing, where they are almost never needed. Why don't they offer an affordable slower computer in stores? Because they can make more money by selling you a faster CPU. They jack up the prices on all computers simply by putting a faster CPU in them, because most people don't know better. They don't realize that they would be getting a better system if they invested in a better motherboard or better video rather than looking only at the speed of the CPU. To the mainstream, CPU speed is the only benchmark of a good computer. The companies out there feed this idea, and so it perpetuates the need for faster CPUs. I agree that there will be demand for faster processors, but not from people sitting at home sending e-mails; that market will come from large-scale computing. Everyone is trying to screw the consumer, and computer companies are no different.
 

dejitaru

Banned
Sep 29, 2002
627
0
0
who in the world needs multi-GHz computing?.....
Quite a few people, and they're willing to pay for it.
A PIII machine is still more than enough for like 95% of computer users even today.
Did you wave your wand before pulling this number from a hat? Even the biggest newbie can appreciate the speed of a mid-range system. People ask me all the time why their Win98 box is so slow. It's not just (in fact, not mostly) the techs who buy a machine the day it's released. If you're doing anything beyond word processing, you will utilize the extra speed.
Should manufacturers cripple their processors just because you say so?

I could certainly use the fastest system available, and I appreciate the fact that it exists.
 

SuperTool

Lifer
Jan 25, 2000
14,000
2
0
Am I the only one who sees no point in upgrading the home machine?
I don't play games very often. My 1 GHz Athlon XP is fine for going online, occasional games, playing DivXs and MP3s, and StarOffice.
With games moving more and more towards consoles, there are fewer and fewer reasons to upgrade a personal PC. Maybe if you play SimCity 4 or something, but other than that, what's the point?
The last piece of hardware I bought was that front USB bay at Fry's that was $5 or something after rebate. That convenience brought me more enjoyment than a $200 system upgrade would have. It's all about peripherals again.
I would rather run a 1 GHz machine with a nice monitor than a 3 GHz machine with a cheap monitor.
 

vegetation

Diamond Member
Feb 21, 2001
4,270
2
0
Funny, this same argument was going around when the 386 chip came out. It was way faster than the 286 it replaced, and the thought of a 386-DX33 being replaced by anything faster was just mind-boggling. Many said there was no need for anything faster, that it was just a waste, and that only "scientists" would benefit from faster chips. That all changed when newer games like Doom came out and revolutionized the gaming industry. And of course, back then nobody had perceived what the internet would turn into. All these advancements required more power, and I'm sure none of us can accurately predict what the typical consumer will be doing with their computers 10+ years from now.


 

Titan

Golden Member
Oct 15, 1999
1,819
0
0
As a young software engineer, I can guarantee you that if hardware gets faster, guys like me are going to take advantage of that speed and gobble it right up. Slowly but surely, many applications are being coded in an object-oriented manner, and things like determining which virtual function to call in C++ are decided at runtime, not compile time. Look at Java, which is not only object-oriented (and every variable is polymorphic), it has to run every instruction in its language through a run-time virtual machine, the JVM. The more a piece of software moves into runtime, the more flexible, easy to code, and maintainable it can be, and all that run-time processing takes lots of CPU cycles. I recently worked on an old PII 400 MHz machine running a visual Java application, and the response time was unbearable, taking like 5 minutes to get the cursor ready to type (not kidding, 5 minutes!). On my dual 1800+ system, everything ran as fast as I clicked.
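Here's a bare-bones Java sketch of the run-time dispatch I'm talking about (the class names are made up, purely for illustration). The declared type is Shape in both cases; the JVM decides which area() to run at run time, based on the actual object.

```java
// Minimal sketch of runtime polymorphic dispatch: which area() runs is
// decided at run time from the object's actual type, not at compile time
// from the variable's declared type.
public class DispatchDemo {
    interface Shape { double area(); }

    static class Circle implements Shape {
        double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    static class Square implements Shape {
        double s;
        Square(double s) { this.s = s; }
        public double area() { return s * s; }
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape shape : shapes) {
            // The JVM looks up the right area() implementation at run time.
            System.out.println(shape.getClass().getSimpleName() + ": " + shape.area());
        }
    }
}
```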

My point is that AI, graphics, and scientific applications are all fine and dandy examples of the "need" for more computing power. But there is a general need too. I am telling you that your next version of a great web browser or office app will rock your world, make life better, and use up most of the power that the new hardware can dish out. If you're happy with the old technology and it serves your needs, that's great, but all that says is that you are getting old and your dreams have stagnated. There is a bigger picture to some of us, and there always has been. The fact is that new hardware and powerful software are still driving a technological revolution where, every day, dreamers sit down to find ways to make the world a better place, and we will need all the advances we can get. Or did we make fire and the wheel and then continue to dwell in caves, content with our existence?

As for Moore's Law, I think of it as a goal that producers try to reach. We may see a limit reached somewhere around 30 GHz, and then other advances in refinement and parallel computing will have to drive the speed increase.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
I actually don't think that my little spiel would be all that hard to do, but that might just be my naivety. Here's a dissection of how I assume a computer would see it.

"Okay, lets open up a google window and search for Risc Processors.

Computer sees: "open, google, window, search, <Risc Processor>", this bit is pretty obvious, we can get speech apps to do this now.

Hmm... Gimme the third link on the page. Scroll down a bit... no too far.

Computer sees: "give, <third link>, <scroll down>, bit, <pause> <too far>". Google would have embedded code saying how to find the "third link" (possibly modified by user customized nuances). Scroll down is an easy command to recognise, in this exchange, the computer would modify the meaning of "a bit" since it has now learned that what it previously thought was a bit was too much.

Okay, see that paragraph starting with "the architecture is most noted for"?

Computer sees: "<see that paragraph> <the architecture is most noted for>". <see that x> would be a built in macro which basically does a grep.

Copy that entire paragraph and paste it into notepad for me.

Computer sees: "copy, <entire paragraph>, paste, notepad". This also seems fairly easy. Pretty standard for modern speech units.

Okay, let's open up a Photoshop window.

Computer sees: "open, photoshop" again, pretty standard.

Okay, go back to the Googled page and get me that image right after the text I copied.

Computer sees: "<go back>, <Googled page>, get, image, <right after>, <text I copied>". This ones a bit complicated. "go back" is an ambiguous phrase but since the context is a photoshop window, we assume that it means switch window. Googled as a verb may be a bit of a difficulty but, after the first use, the computer can store all of your made-up words with what they mean. Right after the text i coped is a whole bucked of symobolic logic problems but I assume clever heuristics should be able to get it right most of the time.

Yup, paste it into photoshop for me.

Computer sees: "paste, photoshop", again pretty standard.

Okay, now emboss a THG logo down the bottom and post it onto Tom's Hardware.

Computer sees: "emboss, <THG logo>, bottom, post, <Toms Hardware>". emboss might either be a photoshop specific macro or a user defined one. THG logo would be a referenced GIF file. bottom might be a bit complicated. It would get a refined meaning over time. If it were a user defined macro, it might be something like "somewhere in the bottom third of the screen, lower is better, the harder it is to see, the worse" so it would find a good place to put it. post is a fairly simple user-defined macro. Toms Hardware is simply a reference to a URL.

Done? Okay, shut down then."

Computer sees: "done, <shut down>". Done might be a bit of a problem Im not sure.

Well, it seems to me that it's not COMPLETELY infeasible. I'm relying on lots of years of good research on human speech and some damn good heuristics, but I think it is achievable. I think a lot of what we say gives heavy clues to our meaning, from the timing to the use of umms in places. There's currently not enough research done for this to work.
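For what it's worth, here's a very crude Java sketch of just the keyword-spotting step for the first sentence. The word lists and the "<...>" argument convention are made up purely for illustration; real speech understanding would obviously involve far more than this.

```java
import java.util.*;

// Very rough sketch of the keyword-spotting idea above: strip filler words,
// keep known command words, and treat whatever follows "search for" as the
// query. All the word lists here are invented for illustration.
public class CommandSketch {
    static final Set<String> FILLER = new HashSet<>(Arrays.asList(
            "okay", "lets", "let's", "up", "a", "and", "hmm", "me", "the"));
    static final Set<String> COMMANDS = new HashSet<>(Arrays.asList(
            "open", "search", "copy", "paste", "scroll", "google",
            "window", "notepad", "photoshop"));

    public static void main(String[] args) {
        String utterance = "Okay, let's open up a Google window and search for Risc Processors";
        String[] words = utterance.toLowerCase().replaceAll("[^a-z' ]", "").split("\\s+");

        List<String> tokens = new ArrayList<>();
        for (int i = 0; i < words.length; i++) {
            if (words[i].equals("for")) {
                // Everything after "search for" becomes one quoted argument.
                tokens.add("<" + String.join(" ",
                        Arrays.copyOfRange(words, i + 1, words.length)) + ">");
                break;
            }
            if (COMMANDS.contains(words[i])) tokens.add(words[i]);
            else if (!FILLER.contains(words[i])) tokens.add("?" + words[i]);
        }
        // Prints roughly: [open, google, window, search, <risc processors>]
        System.out.println(tokens);
    }
}
```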

 

everman

Lifer
Nov 5, 2002
11,288
1
0
I think with speech recognition, it's more of a problem of writing the code to do that than CPU power.
Eventually games will be able to render 100% realistic images and sounds but we'll need more power to do that.
 

dejitaru

Banned
Sep 29, 2002
627
0
0
Originally posted by: everman
I think with speech recognition, it's more of a problem of writing the code to do that than CPU power.
Eventually games will be able to render 100% realistic images and sounds but we'll need more power to do that.
Exactly! "...It's writing the program, not building the machine."

But seriously, those who are opposed to upgrades don't belong in the HT forum. Isn't this supposed to be a technocracy?

Things will always improve. Industry does not stop, not even with low demand.
A single high end machine may not suffice, so I multicompute.

You personally don't see a need to upgrade? Don't. Take your MS word and AOL app on your PII and have fun.


-Ace
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
we can already render 100% realistic images and sounds, we have been able to for years. simple ones anyway. ;)

Point being, though, that people often look at such things as a goal, and it is a silly goal because it depends completely on how complex the realistic environment to be recreated is. The whole concept is extremely subjective.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Well, I found this here.

Also note that it's a rather liberal stance on the issue, though. Really, Moore's Law is more of an industry standard that they constantly try to live up to; it's far from a natural law like something from Newton.
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
Taken from the linked article:

" but data density has doubled approximately every 18 months, and this is the current definition of Moore's Law, which Moore himself has blessed"

Moore himself has blessed???? :confused: That right there reinforces my comment about the semiconductor industry following Moore's Law like it's one of the ten commandments..... utterly ridiculous. Technology roadmaps should follow the demand and economics that exist today... not what someone said decades ago.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Ya, some people seem to confuse technology with religion; Moore's Law is only one of countless examples I can think of. I agree with you, rimshaker, it is completely absurd.
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
Attention, attention everyone.... I claim that raw processor performance will roughly double every 30 months from now on.... and I have an MSEE specializing in wafer fab technology just for validation purposes.... so who's with me? I'll call this prediction "Rimshaker's Law".

C'mon folks, who's with me??!! There's ultimately no difference between Gordon and me (other than age)... we both just stated a prediction based on an observation. Hope I become famous for this...
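For what it's worth, here's the arithmetic difference between the two predictions (a quick sketch; the 18-month figure is from the article linked above, the 30-month figure is mine):

```java
// Compare the per-decade growth implied by an 18-month doubling (the figure
// quoted from the linked article) versus a 30-month doubling. Pure
// arithmetic, no real data involved.
public class DoublingCompare {
    static double perDecade(double doublingMonths) {
        return Math.pow(2.0, 120.0 / doublingMonths);
    }

    public static void main(String[] args) {
        System.out.printf("18-month doubling: ~%.0fx per decade%n", perDecade(18));
        System.out.printf("30-month doubling: ~%.0fx per decade%n", perDecade(30));
    }
}
```

So the only thing that changes is the constant (roughly 100x per decade versus 16x); the curve is still exponential either way.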
 

Stealth1024

Platinum Member
Aug 9, 2000
2,266
0
0
Don't you dare try telling me computers are too fast! Have you ever tried rendering video or editing multi-track audio?
 

dejitaru

Banned
Sep 29, 2002
627
0
0
we can already render 100% realistic images and sounds, we have been able to for years. simple ones anyway.
Yeah, I can make an image that looks exactly like a real image rendered on a computer.
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
The speed capabilities of the processor, memory, and storage subsystems have too much of a gap between them nowadays. The processor (CPU/GPU) subsystem has been carried along by Moore's Law on an exponential scale for the past decade, while everything else falls farther behind with each new CPU family release. That fact, coupled with how wonderful the general economy is right now, leads me to believe that this whole setup is heading for some unstable point in the future.
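If you want to see the processor/memory part of that gap for yourself, here's a quick-and-dirty Java sketch (array size and timings are arbitrary and machine-dependent; it's an illustration, not a real benchmark). The same number of reads takes far longer once the access pattern falls out of cache and the CPU just sits there waiting on main memory.

```java
import java.util.Random;

// Crude illustration of the CPU/memory gap: the same number of array reads
// runs far slower when the access pattern defeats the caches and the CPU
// has to wait on main memory.
public class MemoryGapSketch {
    public static void main(String[] args) {
        int n = 1 << 24;               // ~16M ints, well past typical cache sizes
        int[] data = new int[n];
        int[] randomIdx = new int[n];
        Random rng = new Random(1);
        for (int i = 0; i < n; i++) {
            data[i] = i;
            randomIdx[i] = rng.nextInt(n);
        }

        long sum = 0, t0 = System.nanoTime();
        for (int i = 0; i < n; i++) sum += data[i];            // cache-friendly
        long sequentialMs = (System.nanoTime() - t0) / 1_000_000;

        t0 = System.nanoTime();
        for (int i = 0; i < n; i++) sum += data[randomIdx[i]]; // cache-hostile
        long randomMs = (System.nanoTime() - t0) / 1_000_000;

        System.out.println("sequential: " + sequentialMs + " ms, random: "
                + randomMs + " ms (sum=" + sum + ")");
    }
}
```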
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
As has already been stated, Moore's Law is an observation... Not really a "law".

But it's also been said many times in this thread, that semiconductor manufacturers are being driven by Moore's Law. And that's not really true.

Had Gordon Moore never made the statement in question, technology would still be advancing at the same rate. It's not like the lead engineers hold meetings wondering, "How can we maintain Moore's Law?" Moore's Law is never a factor when designing CPUs. Faster performance and lower cost are pretty much the driving factors.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: dejitaru
we can already render 100% realistic images and sounds, we have been able to for years. simple ones anyway.
Yeah, I can make an image that looks exactly like a real image rendered on a computer.


No, you can make it look like a picture/video shot in real life.