
[ Bloomberg ] AMD Facing Bleak Future

I would not take any further advice from that professor.

Hey, don't be a doubter. I bought a 760K that came with a little baggie full of spare transistors. It's the world's first CPU that has user replaceable parts inside. The IHS even has a rather clever hinge to facilitate replacement of transistors and a little tray to hold the spare parts. Intel has NO SUCH TECHNOLOGY.
 
I heard they only last something like three months and then they burn out and start blue screening. That, and they catch viruses really easily too.

🙄🙄🙄

That's right. Scholzpdx here used to be able to run that 8350 @ 40.2 GHz with 1.40v. These days, I hear he's needing 140v, and is only able to do 4.2 GHz.😉


Congratulations. I hadn't accidentally spit on my monitor in a while...until I read your post!
 

You must have awesome eyesight. I keep losing track of my transistors when I delid, so I need an electron microscope to keep track of them all. And a lot of spare time.
 

If you can find a modern day transistor with an electron microscope then you should be in the business of selling electron microscopes!

Even with the best electron microscopes that money can buy, the resolution and images they can generate of a 14 nm (or 20 nm) transistor pretty much suck if you care about any kind of precision (thanks to electron charging).

The sweet pictures you, and pretty much everyone else, get to see of transistors maybe 50 atoms wide are created with TEM imaging techniques.

I'm getting off topic here, but I remember well the standard engineering test of "repeatability", in which you measure the dimension of a specific structure on a wafer over and over again, only to watch the numbers themselves change over time because the SEM was adding electrons to the structure (which changes the EMF, so electron paths are distorted on their way to the detector) and also physically altering the sample area by sputtering atoms off of the wafer.
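A toy sketch of why that repeatability test drifts, the way the post describes: each scan both reads the feature and perturbs it. The drift-per-scan numbers here are made up for illustration and are not real SEM physics.

```python
import random

random.seed(1)

def repeatability_test(true_width_nm: float, scans: int,
                       charge_bias_nm: float = 0.05,
                       sputter_loss_nm: float = 0.02) -> list[float]:
    """Simulate repeated SEM measurements of one feature.

    Each scan adds charge (skewing the apparent width upward) and
    sputters material off the sample (shrinking the real width), so
    successive readings drift instead of clustering around one value.
    """
    readings = []
    width = true_width_nm
    bias = 0.0
    for _ in range(scans):
        bias += charge_bias_nm      # accumulated charge distorts the reading
        width -= sputter_loss_nm    # beam sputters atoms off the structure
        noise = random.gauss(0, 0.03)
        readings.append(width + bias + noise)
    return readings

print(repeatability_test(20.0, 5))
```

With these (hypothetical) rates the charging bias outpaces the sputter loss, so the measured width creeps upward over successive scans even though the physical feature is shrinking.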

Destructive sampling sucks, electron charging sucks, and TEM sample prep sucks...but these days an SEM is really only useful for the BEOL. Real people have TEMs. 😉 😀
 

But the "EM" in "TEM" stands for "electron microscope", doesn't it?

BTW, aren't STMs (http://en.wikipedia.org/wiki/Scanning_tunneling_microscope) also used in the field for chip micrographs?
 
I do not think AMD makes good processors. One of my professors in college told me in computer science class that the reason AMD makes their products cheap is that they do not last long.

I've heard other people say similar things.

To be fair, AMD used to have really bad chipsets, and I think the SOI tech they used had lower thermal limits.
 
The problem generally wasn't with AMD's CPUs, but rather the fact that until the K6-2 era (when they really started pushing manufacturers to make good-quality motherboards) they tended to be paired with cheap, godawful motherboards that would blow up if you sneezed in their general vicinity. And even after that, it wasn't really until nForce 2 arrived on the scene that AMD's systems started to be in the same league as Intel in terms of reliability and stability.

If anything, Cyrix were much worse for reliability problems, especially once they started using non-standard voltages and bus speeds; the 83 MHz bus speed on their later chips usually gave a 41.5 MHz PCI speed, for instance, which resulted in all sorts of unpleasantness.
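A quick sketch of the divider arithmetic behind that 41.5 MHz figure: PCI clocks on boards of that era were derived from the front-side bus through a fixed divider, and a /2 divider on a non-standard 83 MHz bus lands well past the nominal 33.3 MHz PCI clock. The /2 divider and 33.3 MHz spec value are the standard period figures, not taken from the post.

```python
PCI_SPEC_MHZ = 33.3  # nominal PCI clock

def pci_clock(fsb_mhz: float, divider: int = 2) -> float:
    """Return the PCI clock produced by a given FSB and clock divider."""
    return fsb_mhz / divider

# Standard 66 MHz bus vs. Cyrix's non-standard 75/83 MHz buses:
for fsb in (66.0, 75.0, 83.0):
    pci = pci_clock(fsb)
    overclock_pct = (pci / PCI_SPEC_MHZ - 1) * 100
    print(f"FSB {fsb:5.1f} MHz -> PCI {pci:5.2f} MHz ({overclock_pct:+.0f}% vs spec)")
```

An 83 MHz bus overclocks every PCI device on the board by roughly 25%, which is why sound cards, IDE controllers, and the like misbehaved on those systems.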
 
Yeahh, no. AMD's products are reliable and last long. Built a lot of Athlon II X3/X4 systems for friends back in my college days years ago, and they're all still running fine.

If one of them has Windows 8, try playing Asphalt 8 or another high-CPU-usage game on the AMD one and on an Intel, and you will see a huge difference: the Intel will work a lot better.
 

I'm not saying AMD CPUs are as good performers as Intel chips. We were talking about reliability and longevity, and I think that both AMD and Intel do a good job there.
 

I know right! Just 7 months old and nearly a tenfold reduction in clock speed and a hundred-fold increase in voltage needed.

I have to use seawater cooling similar to a nuclear power plant to boot into Windows.
 
To think such ignorance can drive a purchase. I have heard from friends that you can't play PhysX games with AMD GPUs. And they weren't talking about just those particular effects; they meant the games don't run AT ALL.


To the other poster above: SOI processors are actually known to degrade more slowly than bulk ones. So in an overclocking scenario, the bulk CPU would likely need a vcore bump earlier in its lifetime to remain stable than the SOI one would.
 

I also heard the new CEO sacrifices virgins weekly and eats babies.
 