Dave2150
Senior member
I would not take any further advice from that professor.
That professor could(!) be right - some users have probably thrown their AMD CPUs into the bin after discovering their lack of performance in CPU-demanding games 😛
I heard they only last something like three months and then they burn out and start blue screening. That and they catch viruses really easy too.
🙄🙄🙄
Hey, don't be a doubter. I bought a 760K that came with a little baggie full of spare transistors. It's the world's first CPU that has user replaceable parts inside. The IHS even has a rather clever hinge to facilitate replacement of transistors and a little tray to hold the spare parts. Intel has NO SUCH TECHNOLOGY.
You must have awesome eyesight. I keep losing track of my transistors when I delid, so I need an electron microscope to keep track of them all. And a lot of spare time.
If you can find a modern-day transistor with an electron microscope, then you should be in the business of selling electron microscopes!
Even with today's best electron microscope that money can buy, the resolution and images they can generate of a 14nm (or 20nm) transistor pretty much suck ass if you care about any kind of precision (thanks to electron charging).
The sweet pictures you, and pretty much everyone else, get to see of transistors that are maybe 50 atoms wide are created with TEM imaging techniques.
I do not think AMD is a good processor. One of my professors in college told me in computer science class that the reason why AMD makes their products cheap is because their products do not last long.
Yeahh, no. AMD's products are reliable and last long. Built a lot of Athlon II X3/X4 systems for friends back in my college days years ago, and they're all still running fine.
I would not take any further advice from that professor.
If one of them has Windows 8, try playing Asphalt 8 or another high-CPU-usage game on the AMD one and on an Intel, and you will see a huge difference: the Intel will work a lot better.
My professor has a doctor's degree.
If I had a PhD in Literature, would you automatically assume anything I have to say about computer architecture is worth listening to?
That's right. Scholzpdx here used to be able to run that 8350 @ 40.2 GHz with 1.40V. These days, I hear he's needing 140V, and is only able to do 4.2 GHz. 😉
I've heard other people say similar things.
To be fair, AMD used to have really bad chipsets, and I think the SOI tech they used had lower thermal limits.
Maybe you can persuade him with your eloquent use of the language.
If you can find a modern day transistor with an electron microscope then you should be in the business of selling electron microscopes!
Maybe you can persuade him with your eloquent use of the language.
Nice talk about things that will be made in the future.
Says the human with inferior eyesight. Tsavo has me beat, though; he can do it with the naked eye.
Orbital Mind Control Lasers. Nothing else will do.
Thread non-sequitur levels critical