Moore's Law is in its own bubble...


DannyBoy

Diamond Member
Nov 27, 2002
Well, there are many people who share your view, and there are many people who don't.

I could go on Google and pull up a few thousand results that totally oppose your view, but then again I could also pull up an equal number from people who agree with you.

At the end of the day, everyone has their own views on each and every topic of discussion known to man. Here are mine on Moore's Law: yes, I agree with you, and no, I don't.

Dan

p.s. Isn't this more of a general / OT thread? It doesn't really belong in Highly Technical :confused:

 

pm

Elite Member Mobile Devices
Jan 25, 2000
Moore's Law is an observation of an industrial trend - it's not a true law. It's closer to "Murphy's Law" than to "the Second Law of Thermodynamics".

Whatis.com definition of Moore's Law:

The original Moore's Law derives from a speech given by Gordon Moore, later a founder of Intel, in 1965, in which he observed that the number of microcomponents that could be placed in an integrated circuit (microchip) of the lowest manufacturing cost was doubling every year, and that this trend would likely continue into the future. As this observation and prediction began to be frequently cited, it became known as Moore's Law. In later years, the Law was occasionally restated in terms of that doubling rate. The pace of change having slowed a bit over the past few years, the definition has changed (with Gordon Moore's approval) to reflect that the doubling now occurs only every 18 months.

The first paragraph of the post at the top discussed why higher frequencies are not needed. Whether or not frequency improvements are needed is a matter of opinion, but either way it has nothing to do with Moore's Law.

All for what? Just to keep up with what someone said decades ago? So what!!?? It just seems the semiconductor industry has adopted Moore's Law like it was one of the Ten Commandments or something. And if they continue to base their total survival on obeying what one guy said, then this economy is really gonna get hit hard in the near future when demand for 'desktop supercomputing' starts leveling off.
Moore's Law is not a commandment, and it's not a law like those passed by legislators. It's an observation. For what it's worth, I would highly recommend reading Moore's original paper, since it's very readable and interesting to revisit with the benefit of hindsight. In that paper, Gordon Moore took a snapshot of several key industry statistics over the years and graphed them to see if there was any way to use the past to predict the future.
In 1965, chips could hold 60 transistors. Moore predicted that by 1975 as many as 65,000 transistors could be crammed onto a single chip. In an interview with Dori Jones Yang in 2000, Moore admitted he used to cringe every time someone used the term, and said of his original ten-year prediction, "Really, that's as far ahead as I've ever been able to see." Moore's predicted rate of progress has been embraced by the industry, becoming something of a self-fulfilling prophecy: if a company progresses faster, fabrication and development costs become prohibitive; if it progresses slower, components may be less expensive, but the company falls behind the competition.
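For anyone who wants to sanity-check that arithmetic, here's a minimal Python sketch. The 60-transistor starting point and the 12- versus 18-month doubling periods come straight from the posts above; nothing else is assumed beyond simple compounding:

[code]
def project(start_count, years, doubling_months):
    """Project a component count forward, assuming a fixed doubling period."""
    doublings = years * 12 / doubling_months
    return start_count * 2 ** doublings

# Moore's original 1965 observation: doubling every 12 months for ten years.
print(round(project(60, 10, 12)))  # 61440 - right around his ~65,000 prediction

# The later (Moore-approved) formulation: doubling every 18 months instead.
print(round(project(60, 10, 18)))  # ~6096 - the slower rate compounds far lower
[/code]

Ten doublings from 60 lands at 61,440, which is why the 65,000 figure follows directly from the "doubling every year" observation.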
Rather than being a commandment, Moore's Law is a prescient observation of a general industry trend that has held true far longer than anyone - especially its originator - would have guessed. As for its future, I can say with confidence that it will continue to hold for 10 more years. After that I can only guess, but I think the 10 years after that (putting us in 2023) will continue to track the original observation. I can't see it holding for the next 50 years, though... although I may well be surprised.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
Ultra-informative post as always PM :)

As to the original question concerning who, outside of the scientific community and hardcore gamers, needs several GHz of processing power: I'll remind you that we are all in a position to be part of the scientific community via participation in distributed computing projects, and I can assure you personally that 3.1GHz seems far too slow when applied to one of these ;)
 

RyanM

Platinum Member
Feb 12, 2001
I wasn't going to bother reading the whole thread, but I'd like to take a moment to answer the question someone posed.

Essentially, he asked, "Why does anyone need all these extra MHz?"

Because the more MHz in one machine, the fewer computers the render farm needs to operate at the same efficiency - or, the more efficiently it can operate with the same number of computers (see the sketch at the end of this post).

Because the graphics designers don't want to wait 30 seconds for a single Gaussian blur on a 500 MB file.

Because the video editors don't want to have to get coffee while their machine processes and compresses a 500 MB AVI into a Sorenson 3 MOV.

Because the gamers want to have sex with virtual reality pornstars.

That's why. In any high-end app, you can NEVER have a fast enough computer. Ever.
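To put rough numbers on the render-farm point above, here's a minimal Python sketch. All the figures (frame count, hours per frame, clock speeds, deadline) are invented for illustration, and it assumes per-frame render time scales inversely with clock speed, which real workloads only approximate:

[code]
import math

def machines_needed(frames, hours_per_frame_at_1ghz, clock_ghz, deadline_hours):
    """Machines required to finish a render job by the deadline, assuming
    per-frame render time scales inversely with clock speed."""
    hours_per_frame = hours_per_frame_at_1ghz / clock_ghz
    return math.ceil(frames * hours_per_frame / deadline_hours)

# A made-up job: 10,000 frames, 2 hours per frame on a 1 GHz machine,
# due in one week (168 hours).
print(machines_needed(10_000, 2.0, 1.0, 168))  # 120 machines at 1 GHz
print(machines_needed(10_000, 2.0, 3.0, 168))  # 40 machines at 3 GHz
[/code]

Under those made-up numbers, tripling the per-machine clock cuts the same one-week job from 120 boxes to 40 - exactly the trade-off described above.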