Will Intel's 45nm send AMD to the grave?


Howard

Lifer
Oct 14, 1999
47,982
11
81
Originally posted by: mugs
Originally posted by: SampSon
Naw, the market will always contain competition.
I would be willing to bet that intel would dump serious money into AMD just to keep them afloat.

Don't you think they liked it when every personal computer except Mac had Intel processors?
I bet they also liked it when their top of the line consumer CPUs cost well over $1000 each.
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
Originally posted by: Leper Messiah
Originally posted by: DVK916
Originally posted by: Leper Messiah
Originally posted by: dexvx
Originally posted by: jpeyton
Google seems to have faith in AMD. Their new deployable datacenters are using huge clusters of dual-core Opterons.

What's your point? The first Google datacenters were using Transmeta blades.

Originally posted by: Leper Messiah
nope. AMD isn't just standing still lol. Plus their 90nm chips are still cooler than Intel's 65nm...

Are you saying an X2 running at a 90W TDP is better than a comparable Yonah running at about a 30W TDP?

no, but is yonah running on the desktop? That's a pure apples-to-oranges comparison.

Some Viiv desktops will use Yonah.

when is this going to happen? Prolly around the same time that AMD's 65nm process comes out.

You can buy a desktop iMac right now with a Yonah. Several motherboard OEMs are releasing desktop Yonah boards (Aopen with an SLI board) in the next few weeks. Sossaman, which is a server-based Yonah, will be ready in a few weeks as well.

AMD has stated they will not go to full desktop 65nm production until Q1 2007. Conroe is Q3 2006.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
Originally posted by: FoBoT
not without real dual core cpu's

the conroe is a true dual core with a real shared L2 cache that should be far more efficient than the x2.

i even bought intel stock because of this.


amd is pretty far behind intel in process technology. their 65nm chips will not even be available until the very end of 2006 or, in some leaked reports, the 1st half of 2007.

intel's have been out since the end of 2005, and by the time amd gets 65nm out, intel will possibly be doing quad core on 45nm, using dual buses for 4P servers.


i suppose amd will always have its niche. but with intel owning a cost advantage, they already own the "i just need a computer, ANY cpu" market since their chips are cheaper to make, and they will soon retake the mainstream desktop if conroe is as good as speculated (not to mention probably 1/3 less cost to build than an x2, and lower heat dissipation to boot).

anyways, that's just my opinion. ibm and amd better get 65nm up and running well, because the xbox360, ps3, revolution and amd's dual core chips all depend on it.

i know most of the anandtech community is pretty high on AMD. but their cpus are not even that cheap anymore, and they are just riding a huge wave of euphoria right now.

i mean i have owned basically every amd chip ever and have held their stock in the past etc. but for the coming year, it looks like intel is really getting its act together compared to years past.

 

Imaginer

Diamond Member
Oct 15, 1999
8,076
1
0
Originally posted by: 13Gigatons
It seems that AMD has only gotten stronger in the last 2 years, when it was predicted they would leave the CPU market by several analysts who would like to remain nameless. Intel, for its part, has stumbled and run into walls it did not predict, like NetBurst not reaching 5GHz, or even 4GHz for that matter. Intel had to scramble and get the Pentium M out, and Core Solo and Core Duo after, just to keep up. Desktop will take the same approach as well; it will be interesting to see how Intel markets the newer chips now that NetBurst is being phased out.

Scrambled to get the Pentium M out? Originally this was a mobile platform because they realized they can't keep using the power-hungry, heat-producing P4s in lappies forever. I didn't think they scrambled on this one, but they did show some urgency on the matter.

But I see their core solos and duos as a path in the right direction. In the mobile market, you can't doubt that Intel has gained ground with not just their CPU, but their marketing approach as a solid platform.

AMD, however, has a slew of third-party manufacturers to play with, and getting it all integrated nicely is not an easy task. I still trust AMD to have their CPUs power my desktop, with ATI and Nvidia (mainly) as support, but it would be tough on their part to start marketing and solidifying themselves the way Intel has.

It is just a new manufacturing process. I doubt AMD will kick the bucket from this. AMD has long refined its CPU architecture since the Slot A Athlon days and they will continue to do so. Heck, I wish Intel would show up AMD once again in the desktop world so that AMD would be as innovative once more - and hopefully as a platform rather than just CPUs.
 

jimbob200521

Diamond Member
Apr 15, 2005
4,108
29
91
Nope. I think AMD is here to stay. They may take a little longer to get their desktop parts out, but they have usually competed with or beaten Intel's procs. Ever since I switched over to AMD's X2, I haven't looked back.
 

hans007

Lifer
Feb 1, 2000
20,212
18
81
gee, ever since you switched to X2 you haven't looked back? that was what, like 4 months ago tops? haha
 

Furen

Golden Member
Oct 21, 2004
1,567
0
0
Originally posted by: dexvx
Are you saying an X2 running at a 90W TDP is better than a comparable Yonah running at about a 30W TDP?

Heh, hardly comparable... An FX-60 burns like 70W tops (seriously) and destroys any Yonah in performance, has 64 bit execution and has a cooler name. Heck, 2.66GHz Conroe will burn 65W (Intel's desktop TDPs are closer to their load power draw) so AMD's not that far behind in power consumption.
 

BOLt

Diamond Member
Dec 11, 2004
7,380
0
0
Haha, this is a silly little poll for Anandtech forums. Anandtech loves AMD. You're not going to get very balanced results from a poll here. Of course, that doesn't make us wrong. Of course AMD is going to trump Intel...
 

SpeedZealot369

Platinum Member
Feb 5, 2006
2,778
1
81
AMD and Intel are huge companies with huge amounts of capital, and one company coming out with a processor that kicks the other company's processors' ass won't be the downfall of either of those companies. Plus, I have huge confidence in AMD taking over the CPU market with both desktop and mobile CPUs.
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
Originally posted by: Furen
Originally posted by: dexvx
Are you saying an X2 running at a 90W TDP is better than a comparable Yonah running at about a 30W TDP?

Heh, hardly comparable... An FX-60 burns like 70W tops (seriously) and destroys any Yonah in performance, has 64 bit execution and has a cooler name. Heck, 2.66GHz Conroe will burn 65W (Intel's desktop TDPs are closer to their load power draw) so AMD's not that far behind in power consumption.


Where do you get an FX-60 burning 70W tops? Its TDP is rated at over 100W. Its peak consumption is also around 100W:

http://www.xbitlabs.com/articles/cpu/display/athlon64-fx60_3.html

FX-60 destroying a Yonah? Clock for clock, the Yonah is on par with the FX series.
 

DVK916

Banned
Dec 12, 2005
2,765
0
0
Originally posted by: openwheelformula1
Once again Intel and AMD rate TDP very differently. Intel rates lower while AMD rates the maximum.

Check your facts too. Yonah doesn't draw significantly more power, whether idle or at full load.
 

BrownTown

Diamond Member
Dec 1, 2005
5,314
1
0
Why do people always quote the whole "AMD and Intel rate TDP differently" thing when they are looking at actual power numbers? Who the heck cares what a chip is rated for when you have the power numbers right in front of you? Clearly it says 110 watts under load, and that's a real number, not marketing numbers, which are BS no matter who is writing them.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Oh for crying out loud, give that old "intel and amd tdp are different" canard a rest already. What is "maximum" power? Take any CPU, knock up the supply and frequency, and it'll easily exceed the stated TDP.

Do you even know how TDP is calculated? I guarantee you both companies use very similar procedures. Instead of repeating tired myths, go look up something called "power virus"
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
OK... way to disprove your own point. Some "maximum" power; any noob who can press Del at the BIOS can make those CPUs exceed the power draw given by that tool. LOL!
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: dmens
Oh for crying out loud, give that old "intel and amd tdp are different" canard a rest already. What is "maximum" power? Take any CPU, knock up the supply and frequency, and it'll easily exceed the stated TDP.

Do you even know how TDP is calculated? I guarantee you both companies use very similar procedures. Instead of repeating tired myths, go look up something called "power virus"

No, they don't, dmens... if you check the data sheets for both companies you will see the difference immediately.
AMD's spec is

"Thermal Design Power (TDP) is measured under the conditions of TCASE Max, IDD Max, and VDD=VID_VDD, and include all power dissipated on-die from VDD, VDDIO, VLDT, VTT, and VDDA."

This means that TDP, as defined by AMD, is measured at the maximum current the CPU can draw, at the default voltage, under the worst-case temperature conditions.
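
To put rough numbers on that definition (just a back-of-the-envelope sketch; the current and voltage figures below are made-up placeholders, not values from AMD's datasheet), the worst case AMD describes is basically maximum current times default voltage:

# Back-of-the-envelope sketch of the worst-case figure AMD's TDP wording describes.
# Both values below are placeholders for illustration, NOT datasheet numbers.

idd_max = 80.0      # amps: maximum specified current draw (IDD Max)
vdd_default = 1.35  # volts: default core voltage (VDD = VID_VDD)

# Power dissipated on-die is roughly current times voltage; AMD's definition also
# folds in the I/O, HyperTransport, and analog rails (VDDIO, VLDT, VTT, VDDA).
worst_case_watts = idd_max * vdd_default
print(f"Worst-case on-die power: {worst_case_watts:.0f} W")  # ~108 W with these placeholders

With numbers in that ballpark you land right around the 100W+ figures quoted earlier in the thread.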

Intel's spec from Prescott is

"Thermal Design Power (TDP) should be used for processor thermal solution design targets. The TDP is not the maximum power that the processor can dissipate."

Keeping in mind that TDP is a thermal design spec and not a measurement of power, it makes sense that Intel wouldn't need to have a TDP at max because they rely more heavily on throttling for their thermal solutions.

Also from the Prescott data sheet...

"Analysis indicates that real applications are unlikely to cause the processor to consume maximum power dissipation for sustained periods of time. Intel recommends that complete thermal solution designs target the Thermal Design Power (TDP) indicated in Table 26 instead of the maximum processor power consumption. The Thermal Monitor feature is intended to help protect the processor in the unlikely event that an application exceeds the TDP recommendation for a sustained period of time."

BTW, I hear what you're saying about overclocking the supply and frequency...but that really has nothing to do with the given specs.
 

vaylon

Senior member
Oct 22, 2000
219
0
71
Intel's biggest drawback at this point isn't its processors, it's the approved chipsets.
If Intel would open up its designs to all the chipset manufacturers and let them have at it, like AMD does, then I think Intel would have superior chips. But as long as they keep the process in-house and just give other companies the use of their chipsets, they will flounder around a bit. On the other hand, you are starting to see more companies coming out with NEW chipsets for Intel than in the past.

But my money is still on AMD.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
BTW, on the original point of the post...
AMD and IBM have also been developing both 45nm and 32nm for over a year now. The only difference is that they have not seen fit to advertise their step-by-step progress.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
No, it won't send AMD to the grave. It will be interesting to see, though, what will happen when Conroe is released.

Intel will likely arrive on 45nm in Q3-Q4 2007 at the absolute earliest. So there is still quite a bit of time till 45nm.



 

Furen

Golden Member
Oct 21, 2004
1,567
0
0
Originally posted by: dexvx
Originally posted by: Furen
Originally posted by: dexvx
Are you saying an X2 running at a 90W TDP is better than a comparable Yonah running at about a 30W TDP?

Heh, hardly comparable... An FX-60 burns like 70W tops (seriously) and destroys any Yonah in performance, has 64 bit execution and has a cooler name. Heck, 2.66GHz Conroe will burn 65W (Intel's desktop TDPs are closer to their load power draw) so AMD's not that far behind in power consumption.


Where do you get a FX-60 burning 70W tops? It's TDP is rated at over 100W. It's peak consumption is also around 100W:

http://www.xbitlabs.com/articles/cpu/display/athlon64-fx60_3.html

FX-60 destroying a Yonah? Clock for clock, the Yonah is on par with the FX series.

Interesting power consumption graph; I've seen others with lower swings. S&M is a force to be reckoned with, and I haven't seen an X2 go over 80W (including VRM inefficiency) on other loads. Yes, Yonah pretty much matches the X2 clock for clock (I'd say the A64 is around 5% faster or so), but I haven't seen a 2.6GHz 30W Yonah on sale, have you?
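
(For anyone wondering what "including VRM inefficiency" does to that 80W figure, here's a rough sketch; the 85% efficiency is just an assumed typical value, not something I've measured:)

# Rough sketch of how VRM conversion loss inflates a measured CPU power figure.
# The efficiency value is an assumption for illustration, not a measured number.

measured_input_watts = 80.0   # power measured on the 12V feed into the VRM
vrm_efficiency = 0.85         # assumed typical VRM conversion efficiency

# Only the power left after conversion losses actually reaches the CPU.
cpu_power_watts = measured_input_watts * vrm_efficiency
print(f"Approximate power at the CPU itself: {cpu_power_watts:.0f} W")  # ~68 W

That's roughly where my "70W tops" estimate comes from.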


dmens: I'm just inferring the power consumption for Conroe from the launch schedule. Intel said 65W TDP, so I'm assuming the top-end part will match it (basically every top-of-the-line part will match the TDP target, in my opinion, but I admit this is just an assumption), since I doubt Intel would go 2.33 for Merom if it had more headroom.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: BobDaMenkey
I don't think that AMD is going to go away any time soon. AMD will always have some sort of answer to anything Intel does; it might not always match up (Barton 3200+ vs Intel P4 3GHz), but it'll still be an answer, and if it doesn't match up, that only makes them work harder to outdo it.

Barton 3200+ actually outperformed the 3.06GHz P4 it was released to compete against, but the 2.8GHz P4 with the 800MHz FSB outperformed both. Intel has generally exploited more advanced fabrication processes to stay ahead, while AMD has gone for faster memory. (AMD had DDR first, and adopted higher speeds of DDR first, and their dominance right now is due to the integrated memory controller.)

Anyhow, in the past AMD always had far better pricing, even if they didn't have the performance crown; right now AMD is actually more expensive for the performance in some price brackets. (But they're increasing prices and sales, so perhaps performance is more important than value... or maybe it's just because current Intel desktop processors are furnaces.)

Originally posted by: Howard
I bet they also liked it when their top of the line consumer CPUs cost well over $1000 each.

I'd say the EE and FX are analogous to the Pentium Pro.