
Now isn't this ironic?

Aenslead

Golden Member
AMD64 was born... from Barton, adding an on-die memory controller and increased cache size. From there, it has seen only minor tweaks and had new instructions added.

So the best processors on the market now are based (90%, according to AT in their first review of the Athlon 64) on almost six-year-old technology.

Then we have Intel's Pentium M technology, which is mostly based on the Pentium III, AND will be the future of Intel's plans (which are SUPPOSED to give them an edge over AMD... by Q2 '06, that is).

So it's ironic that today's best technologies are based mostly on old ones, and the one that was supposed to be innovative is a complete fiasco (read: NetBurst).

And the ones that were supposed to be "different" (Apple/IBM - G5) are dropping out in favor of an OLDER technology.

So when will we see a NEW product?
 
Well, that's the counter-example!!

It was well put on The Inquirer that the CPU wars were like a walk in the park on a sunny Sunday compared to the ACTUAL war the graphics chips are waging.

I mean, those things DOUBLE their performance every 6 months! When do we see that with CPUs, dare I ask?
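For a sense of scale, here's a quick back-of-the-envelope sketch of what that doubling rate would compound to. The 6-month figure is the claim above; the 18-month comparison cycle and the baseline of 100 are purely illustrative assumptions, not benchmark data:

```python
def performance_after(years: float, doubling_period_years: float, base: float = 100.0) -> float:
    """Relative performance after `years`, doubling every `doubling_period_years` years."""
    return base * 2 ** (years / doubling_period_years)

# After 3 years:
gpu = performance_after(3, 0.5)  # doubling every 6 months  -> 2^6 = 64x the baseline
cpu = performance_after(3, 1.5)  # doubling every 18 months -> 2^2 = 4x the baseline
print(f"6-month doubling over 3 years:  {gpu / 100:.0f}x")
print(f"18-month doubling over 3 years: {cpu / 100:.0f}x")
```

The gap is the point: a 6-month cycle compounds to 64x in three years, versus 4x for an 18-month cycle.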
 
Mercedes-Benz still uses very old and proven technology in their engines. No superchargers and that stuff.

Newer and more radical is not necessarily better.

The thing with CPUs is that they gain the most from simply doubling their transistor count along with their raw performance. Only things like 64-bit support, RISC design, or an on-die memory controller like the A64's can significantly improve things beyond that.

I'm wondering if an on-die northbridge is the next step towards efficiency.
 
Originally posted by: SickBeast
I'm wondering if an on-die northbridge is the next step towards efficiency.

Ya know, for the longest time I have wished for upgradable northbridges. Like, have a baseline model, and if you want more options and performance, buy the premium model. Kinda like old-school Socket 7 stuff, many manufacturers making different flavours of chips for the same socket. It'd spawn a whole new market area, or maybe I'm just hoping for more options.

CPUs are just like engines; a good example is the small-block Chevy. Lots of different improvements, but it's still the same old motor as back in '55.
 
What do you mean by new technology?? Are you asking for a product that is not a microprocessor? There is only so much you can do with a processor.

How do you innovate a toothbrush??? Do you take away the brush, or add more bristles?
 
Originally posted by: Aenslead
Well, that's the counter-example!!

It was well put on The Inquirer that the CPU wars were like a walk in the park on a sunny Sunday compared to the ACTUAL war the graphics chips are waging.

I mean, those things DOUBLE their performance every 6 months! When do we see that with CPUs, dare I ask?

ummm, ok

my point is that they still build on older technology

the 9700 Pro, 9800, and X800 series all used basically the same architecture
 
The Wright Brothers invented just about everything that the modern airplane uses: wings, engines, canards, vertical stabilizers, wing warping (a.k.a. flaps), etc. Even more amazing is that da Vinci came up with most of this centuries before the Wright Brothers, and the Greeks developed the steam engine, etc. etc. etc. 🙂

As for GPUs, they do not double their performance every 6 months. For a lot of the new GPUs, the only time you see a performance increase is when the AA/AF/resolution is cranked up. Below 1600x1200, the differences are minor.
 
Originally posted by: lifeguard1999
The Wright Brothers invented just about everything that the modern airplane uses: wings, engines, canards, vertical stabilizers, wing warping (a.k.a. flaps), etc. Even more amazing is that da Vinci came up with most of this centuries before the Wright Brothers, and the Greeks developed the steam engine, etc. etc. etc. 🙂

As for GPUs, they do not double their performance every 6 months. For a lot of the new GPUs, the only time you see a performance increase is when the AA/AF/resolution is cranked up. Below 1600x1200, the differences are minor.

That's not the GPU's fault...
But they don't double every 6 months anyway; it's more like every 12-18 months, when a new product is released (not a rehash/clock-speed bump), that there's a noticeable speed increase.
 
Originally posted by: lifeguard1999
The Wright Brothers invented just about everything that the modern airplane uses: wings, engines, canards, vertical stabilizers, wing warping (a.k.a. flaps), etc. Even more amazing is that da Vinci came up with most of this centuries before the Wright Brothers, and the Greeks developed the steam engine, etc. etc. etc. 🙂

As for GPUs, they do not double their performance every 6 months. For a lot of the new GPUs, the only time you see a performance increase is when the AA/AF/resolution is cranked up. Below 1600x1200, the differences are minor.

Well, of course we will still need processors. I don't mean that we have to CHANGE the concept of the processor or find something else.

My point is that... it's ironic that the best we have in present CPUs are past cores with new features.

When will we see the next NEW architecture? For example, I find the Cell processor to be VERY innovative... but hardly useful for average computing.
 
Originally posted by: Aenslead
Originally posted by: lifeguard1999
The Wright Brothers invented just about everything that the modern airplane uses: wings, engines, canards, vertical stabilizers, wing warping (a.k.a. flaps), etc. Even more amazing is that da Vinci came up with most of this centuries before the Wright Brothers, and the Greeks developed the steam engine, etc. etc. etc. 🙂

As for GPUs, they do not double their performance every 6 months. For a lot of the new GPUs, the only time you see a performance increase is when the AA/AF/resolution is cranked up. Below 1600x1200, the differences are minor.

Well, of course we will still need processors. I don't mean that we have to CHANGE the concept of the processor or find something else.

My point is that... it's ironic that the best we have in present CPUs are past cores with new features.

When will we see the next NEW architecture? For example, I find the Cell processor to be VERY innovative... but hardly useful for average computing.

Well, you said it: new innovations are hard to program for. Some of the latest supercomputers number in the tens of thousands of processors. Ever try mounting a filesystem on that many systems? A lot of what you see in the supercomputing world winds up on the desktop. Multicore? IBM was there first in the supercomputing world.
 
Originally posted by: SickBeast
Mercedes-Benz still uses very old and proven technology in their engines. No superchargers and that stuff.
Actually, Mercedes has been using superchargers for a long time. Pretty much all of their current AMG models, as well as some of the entry models use superchargers (also known as "Kompressor"). They even used this technology in the first half of the last century, if I remember correctly.

I might also add that they use twin-turbos with some of their current V12 engines.
 
Originally posted by: Brunnis
Originally posted by: SickBeast
Mercedes-Benz still uses very old and proven technology in their engines. No superchargers and that stuff.
Actually, Mercedes has been using superchargers for a long time. Pretty much all of their current AMG models, as well as some of the entry models use superchargers (also known as "Kompressor"). They even used this technology in the first half of the last century, if I remember correctly.

I might also add that they use twin-turbos with some of their current V12 engines.

Though, a turbocharger isn't the same as a compressor. Common turbos use the exhaust stream to power a turbine, while a compressor gets its power directly from the engine via a drive belt or similar transmission system. The effect is pretty much the same, though.

But you're right, Mercedes equips many models with compressor technology.

BTW, the turbocharger was patented in 1905 by a Swiss engineer. So it's almost as old as the car itself. 🙂

As for the topic at hand: Intel's Itanium was supposed to replace x86 processors at some point. Look where they are now. The industry didn't adopt it.

It's cheaper to stick with a winning team (and improve it) than to reinvent the wheel.

 
Originally posted by: Griswold
Though, a turbocharger isn't the same as a compressor.
Hehe... I know that. I was just pointing out that Mercedes uses both turbos and superchargers in their current model line-up.

On topic: I don't think there's any particularly good reason for starting out from scratch. It's usually a good idea to evolve an existing design. Sooner or later, after a number of revisions, it won't resemble the original design much anyway.
 
So the technology is showing signs of a plateau. Didn't scientists in the '70s predict we'd reach our limits in the '80s?

Thousands of engineers and scientists have been revising these core cpu architectures for 60 years. We are just at a point where the architecture got ahead of the material science. Once the 10 nm process is available and engineers have 100 billion (exaggeration) transistors to work with, they will be happy to design a new architecture.
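To put a rough number on that "once engineers have 100 billion transistors" remark, here's an illustrative Moore's-law calculation. The ~100-million-transistor starting point and the 2-year doubling period are assumptions for the sketch, not quoted specs:

```python
import math

def transistor_budget(start_transistors: float, years: float, doubling_years: float = 2.0) -> float:
    """Transistor budget after `years`, doubling every `doubling_years` years."""
    return start_transistors * 2 ** (years / doubling_years)

start = 100e6    # assumed ~100M-transistor CPU as a baseline
target = 100e9   # the 100-billion figure from the post above

# Solve start * 2^(t/2) = target for t:
years_needed = math.log2(target / start) * 2.0
print(f"~{years_needed:.0f} years at one doubling per 2 years")
```

At that pace, a thousandfold increase in transistor budget takes about ten doublings, i.e. roughly two decades.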
 
Originally posted by: Brunnis
Originally posted by: Griswold
Though, a turbocharger isn't the same as a compressor.
Hehe... I know that. I was just pointing out that Mercedes uses both turbos and superchargers in their current model line-up.

On topic: I don't think there's any particularly good reason for starting out from scratch. It's usually a good idea to evolve an existing design. Sooner or later, after a number of revisions, it won't resemble the original design much anyway.

Fair enough; I was just trying to make a point. From what I'm told, Mercedes uses very old and proven tech in their engines/transmissions.
 
Originally posted by: osan0001
What's amazing is that much of our current technology is based on 18th-century mathematics and earlier.

What amazes me is that these chips are entirely made up of materials that are about 4.5 billion years old (or older) 😉

m
 
How is NetBurst a complete fiasco again? Intel sold 500 million CPUs based on it over the past 5 years.

Merom/Nehalem are "based" on P6/P3 only in a very loose sense. The pipelines look *similar* only at the highest level, which is meaningless. One step down and you will see the differences, but those details are never published. Every generation the uarch is changed, sometimes just a tiny bit, but the changes from Banias -> Merom and Merom -> Nehalem are quite significant.

So Intel/AMD still rely on "old technologies" like pipelining, out-of-order, renaming, branch prediction and caching... it so happens those ideas still form the basis of the best known method of executing generic x86 code. It's not like new ideas are tossed out, there are plenty of radically new strawman proposals, but they all had gaping weaknesses.
 
Originally posted by: dmens
How is NetBurst a complete fiasco again? Intel sold 500 million CPUs based on it over the past 5 years.

Merom/Nehalem are "based" on P6/P3 only in a very loose sense. The pipelines look *similar* only at the highest level, which is meaningless. One step down and you will see the differences, but those details are never published. Every generation the uarch is changed, sometimes just a tiny bit, but the changes from Banias -> Merom and Merom -> Nehalem are quite significant.

So Intel/AMD still rely on "old technologies" like pipelining, out-of-order, renaming, branch prediction and caching... it so happens those ideas still form the basis of the best known method of executing generic x86 code. It's not like new ideas are tossed out, there are plenty of radically new strawman proposals, but they all had gaping weaknesses.

It was destined for 10 GHz. In that sense, it was a failure.
 