Now isn't this ironic?


dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
Originally posted by: Fox5
I believe if you run CPU-Z, it identifies a Pentium M as something like P6-III, so it's still based on P6. In fact, I'd say that the backend is probably almost exactly identical; rather, it's the front end that has significantly changed.

Actually it doesn't, because I have a Dell Inspiron 8600 with a Banias 1.3GHz next to me.


Originally posted by: Fox5
The real shame was that the 800MHz (or the closest MHz to that) Athlon C outperformed both the 1GHz P3 and the 1.4GHz P4. (Depending on the task, btw; for Intel- or P4-optimized apps, like Quake 3 or encoding, the results would not look like this, but there were quite a few tests where an Athlon 600MHz below a P4 would beat it.)

The Athlon-C was a Thunderbird using a 133MHz FSB. The lowest-speed Athlon-C was 1GHz. But if you want obscure benchmarks, WinRAR has a 1GHz P3-Coppermine beating a 1.8GHz Northwood. Such things do exist, but I think it's fair to say that the P4-Willamette, which wasn't that good on release, got better compared to the Athlon or the Pentium 3 as time passed and more optimized apps came along.

Originally posted by: Fox5
Well, that makes up for the P4's floating point deficiency, but SSE2 is much harder to use than... well, not using it. (Besides, the P3 had SSE; how much better is SSE2 over SSE?)

How is it much harder to use? As time went on, better compilers were written for SSE2. It got to the point where if you recompiled something with SSE2 flags, the performance would increase by double-digit percentages (note the Apple fiasco when they introduced their G5s).

Originally posted by: Aenslead
You are mostly speaking of synthetic apps, and we all know those were HEAVILY optimized for Pentium 4 processors. Remember the fiasco that came from BAPCo because their software was written to take advantage of NetBurst and make it look like the best thing since sliced bread?

How am I speaking of synthetic benchmarks? I meant encoding, media, games, and graphics. In the majority of those benchmarks, the P4-Willamette at 1.3GHz beats a P3-Coppermine at 1GHz, something unheard of when it first launched.

---

As for the integrated memory controller argument, Sun Microsystems and DEC Alpha did that back in the day. Ironically, they switched back to the off-die controller for various reasons. I seriously think a large part of the reason AMD did it was the poor memory controllers produced by VIA and SiS over the Athlon/Athlon XP years.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Actually it doesn't, because I have a Dell Inspiron 8600 with a Banias 1.3GHz next to me.

Maybe so, but CPU-Z's website lists it as the P6+ core, and indeed, that seems to be the official code name.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Originally posted by: Fox5
In fact, I'd say that the backend is probably almost exactly identical; rather, it's the front end that has significantly changed.

It is identical in the sense that the dataflow is roughly similar... but that's about it. The changes are massive in both the front end and the back end. It is convenient to lump the P-M in with the P6 because they are in the same family, but by that definition Merom and Nehalem are also P6+, and that is quite misleading considering there have been, what, five generations since?

On the topic of software optimizations, one of the things Intel was kind of hoping for was mass adoption of certain compilers written in-house (Intel employs more compiler engineers than Microsoft, apparently), but it didn't happen. For various reasons, code optimized for a P6/K7 platform (still the norm?!?) doesn't do so well on the P4 and its loopy speculation.
 

Sixtyfour

Banned
Jun 15, 2005
341
0
0
It's an 8080 with multimedia extensions, 64-bit addressing, an integrated memory controller, and larger caches... ;)
 

TomKazansky

Golden Member
Sep 18, 2004
1,401
0
0
Originally posted by: Brunnis
Originally posted by: SickBeast
Mercedes-Benz still uses very old and proven technology in their engines. No superchargers and that stuff.
Actually, Mercedes has been using superchargers for a long time. Pretty much all of their current AMG models, as well as some of the entry models use superchargers (also known as "Kompressor"). They even used this technology in the first half of the last century, if I remember correctly.

I might also add that they use twin-turbos with some of their current V12 engines.

I concur.
 

ND40oz

Golden Member
Jul 31, 2004
1,264
0
86
Originally posted by: Griswold
Originally posted by: Brunnis
Originally posted by: SickBeast
Mercedes-Benz still uses very old and proven technology in their engines. No superchargers and that stuff.
Actually, Mercedes has been using superchargers for a long time. Pretty much all of their current AMG models, as well as some of the entry models use superchargers (also known as "Kompressor"). They even used this technology in the first half of the last century, if I remember correctly.

I might also add that they use twin-turbos with some of their current V12 engines.

Though, a turbocharger isn't the same as a compressor. Common turbos use the exhaust stream to power a turbine, while a compressor gets its power directly from the engine via a drive belt or similar transmission system. The effect is pretty much the same though.

But you're right, Mercedes equips many models with compressor technology.

BTW, the turbocharger was patented in 1905 by a Swiss engineer. So it's almost as old as the car itself. :)

As for the topic at hand: Intel's Itanium was supposed to replace x86 processors at some point. Look where they are now. The industry didn't adopt it.

It's cheaper to stay with a winning team (and improve it) than re-inventing the wheel.

A turbo is a compressor. It has two sides: the turbine side, which is driven by the exhaust, and the compressor side, which compresses the intake air. If it didn't compress air, you wouldn't get any boost.