AMD GPU will power all 3 major next gen consoles?


Outrage

Senior member
Oct 9, 1999
nVidia has never manufactured a GPU; TSMC does. By your precise logic, bumpgate wasn't nVidia's fault. I honestly don't see the RRoD issue as being AMD's fault; it is the X-clamp that fails (I've fixed enough of them to know). But if you blame nVidia for bumpgate and you have any integrity, you have to blame AMD for the RRoD issues and the billions of dollars of losses they caused.

I say it again: AMD sold MS a DESIGN. MS did the ASIC themselves in-house, and that didn't work too well for them with the RRoD.

MS tried to save some coins but failed, just as Nvidia did with bumpgate.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Are you saying that they're going to spend $225 on a video chip?

Try ~$25. A BRD drive is going to run you ~$25, the CPU ~$25, RAM ~$25, the HDD ~$25, the PCB and other required chipsets around $25, the PSU/case around $25, the controller around $25, packaging/manuals/cables around $10, and assembly around $10. Hit those numbers, add in distribution costs (shipping, channel margins at each level, etc.), and you will hit the market at around $300 with a small loss; $350 would be roughly break even, and $400 would give you some margin to work with. (A faster BRD, a bigger HDD, the amount of RAM, etc. will obviously move those numbers, but even lowballing everything it's tough to get to market under $350.) I work in distribution; people seriously underestimate exactly what kind of costs are involved in the things they buy, and really don't seem to wrap their heads around how much money it takes to move something from a Japanese factory to a US or European retail facility (the EU is actually quite a bit worse).
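
The arithmetic above can be sketched in a few lines of Python. The component prices are the poster's rough estimates; the 35% distribution overhead is a hypothetical stand-in for freight plus per-tier channel margins, not a figure from the post.

```python
# Rough console bill-of-materials sketch, using the poster's ~$25/$10 estimates.
bom = {
    "GPU": 25, "Blu-ray drive": 25, "CPU": 25, "RAM": 25,
    "HDD": 25, "PCB/chipsets": 25, "PSU/case": 25, "controller": 25,
    "packaging/manuals/cables": 10, "assembly": 10,
}

build_cost = sum(bom.values())  # 220

# Hypothetical distribution overhead (shipping + channel margins at each tier).
distribution_overhead = 0.35
break_even_retail = build_cost * (1 + distribution_overhead)

print(build_cost)                # 220
print(round(break_even_retail))  # 297 -- roughly the "$300 at a small loss" figure
```

Even with every component lowballed, the landed cost ends up near the $300 mark the poster describes, which is why a sub-$350 launch price implies selling at or below cost.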

I say it again: AMD sold MS a DESIGN

A design that failed; even after two die shrinks it was still failing. I absolutely consider it MS's fault, but it is no different than bumpgate. In both situations a better cooling solution would have prevented the problem.
 

ZimZum

Golden Member
Aug 2, 2001
A design that failed; even after two die shrinks it was still failing. I absolutely consider it MS's fault, but it is no different than bumpgate. In both situations a better cooling solution would have prevented the problem.

No, the implementation failed. MS was responsible for the implementation, so the fault lies with them. Nvidia was also responsible for the implementation that caused bumpgate, so the fault there lies with them as well. Bumpgate wasn't due to bad silicon; it was due to cost-cutting measures during the assembly of the GPU itself, which AFAIK TSMC has nothing to do with.

Let's say two different people ask you:

One says, "Hey Ben, spec me a nice gaming rig." So you send them a list of parts from Newegg and they build the PC themselves.

Another person says, "Hey Ben, build me a gaming rig." So you build them a PC.

If both of the above computers end up failing, which are you responsible for?
 

BenSkywalker

Diamond Member
Oct 9, 1999
it was due to cost cutting measures during the assembly of the GPU itself which AFAIK TSMC has nothing to do with.

nVidia doesn't own any manufacturing facilities. RRoD and bumpgate are very close to identical.

If both of the above computers end up failing, which are you responsible for?

If they both failed due to lack of cooling, then both of them would be my fault, as I specced/built both of them.
 

Mopetar

Diamond Member
Jan 31, 2011
I don't know the particulars, but it could be that either party is at fault or some combination of both. If AMD gave them a design that exceeded the specified thermal constraints, that would be on AMD. If Microsoft didn't design a cooling system capable of dissipating the known and specified maximum amount of waste heat, it's clearly on Microsoft. Hell, it's possible that some of the chips were produced on a bad line that resulted in chips that leaked more. Then it could be the manufacturer for giving Microsoft bad chips, or even possibly Microsoft's fault for trying to get away with using some chips that should have been tossed.

Regardless, we'll probably never know the full story.
 

Outrage

Senior member
Oct 9, 1999
nVidia doesn't own any manufacturing facilities. RRoD and bumpgate are very close to identical.

Yes, they are close, but you are trying to put the blame on AMD for something that isn't their fault.

I will try to explain it to you.

Nvidia builds ARM SoCs (they don't fabricate them, TSMC does, but they design the ASIC from the design they licensed from ARM). Think of AMD like ARM here.

Would you put the blame on ARM if a Tegra SoC blows up?
 

Outrage

Senior member
Oct 9, 1999
That's a flat out lie. Try reading my posts again.


A few points on the general choice: MS lost billions of dollars due to AMD GPUs overheating. The real cause of this was the shockingly stupid X-clamp design, but what actually caused the RRoD was the AMD-supplied part frying. That's just the reality of the situation. People point to the RRoD as evidence for why the companies would be concerned about performance per watt; they then point to the company whose part failed as the solution to that problem.

nVidia has never manufactured a GPU; TSMC does. By your precise logic, bumpgate wasn't nVidia's fault. I honestly don't see the RRoD issue as being AMD's fault; it is the X-clamp that fails (I've fixed enough of them to know). But if you blame nVidia for bumpgate and you have any integrity, you have to blame AMD for the RRoD issues and the billions of dollars of losses they caused.

?.......
 

BFG10K

Lifer
Aug 14, 2000
If they both failed due to lack of cooling, then both of them would be my fault, as I specced/built both of them.
That's very interesting. So if the Anandtech forums make a recommendation (“help me decide what graphics card I need”) for a part which turns out to have a faulty cooler, you'd blame us rather than the manufacturer?
 

AnandThenMan

Diamond Member
Nov 11, 2004
BenSkywalker you're either trolling, or ignorant of the facts.

The AMD-designed part in the Xbox 360 was never defective; the cooling subsystem was flawed and caused the chip to run way out of spec and crash. Some of the chips ended up permanently damaged. There was nothing flawed at the silicon level; if the chip was cooled properly, it did not fail. In fact, even when the chip greatly exceeded its maximum temperature for long periods, repairing the cooling apparatus would actually make the console run properly again in some cases. The design itself was fine; there was nothing AMD could have done to prevent the RRoD, short of overseeing the actual manufacturing of the Xbox themselves.

In Nvidia's case, they made too many compromises, so even when the chip was run in spec, it still failed. Nvidia and AMD work closely with TSMC, but there are design rules and constraints that need to be followed; Nvidia did not follow them, and as a result they ended up with a defective part on their hands. Nvidia subsequently corrected the design, and there were no more issues. AMD never experienced premature failures, and remember, the silicon is made in the same factories.

It is very clear that Nvidia made mistakes, and in fact they admitted as much by agreeing to replace the defective silicon (unfortunately with the same defective part). BTW, I've repaired several Xbox 360s myself and they are still running fine.
 

tommo123

Platinum Member
Sep 25, 2005
How the hell is the RRoD AMD's fault? That's like me replacing my car tyres with cheap-ass remoulds and blaming the car manufacturer for the failure.
 

BenSkywalker

Diamond Member
Oct 9, 1999
That's very interesting. So if the Anandtech forums make a recommendation (“help me decide what graphics card I need”) for a part which turns out to have a faulty cooler, you'd blame us rather than the manufacturer?

What if the video card people recommended was a GTX 480 running at full clocks with a single-slot heatsink and a small fan? The question as I present it sounds pretty dumb, doesn't it? Look at the cooling solutions we are talking about for these parts; they are in line with my angle on your question (in other words, no one on these forums would ever recommend that setup without getting blasted by the forums).

The AMD-designed part in the Xbox 360 was never defective; the cooling subsystem was flawed and caused the chip to run way out of spec and crash. Some of the chips ended up permanently damaged. There was nothing flawed at the silicon level; if the chip was cooled properly, it did not fail.

The nVidia-designed part in the laptops was never defective; the cooling subsystem was flawed and caused the chip to run out of spec and crash. Some of the chips ended up permanently damaged. There was nothing flawed at the silicon level; if the chip was cooled properly, it did not fail.

Both statements are true.

AMD never experienced premature failures, and remember the silicon is made in the same factories.

The PS3 never experienced premature GPU failures, and the silicon for it and the 360 is made in the same factories. Sony designed a much better cooling solution than MS.
 

AnandThenMan

Diamond Member
Nov 11, 2004
both statements are true.
Now you are just trolling; you know very well your statement is wrong. The Nvidia part WAS defective: it routinely failed even with proper cooling. I've seen the exact same notebook come back three times; each time it got a new motherboard and a revised BIOS to increase the fan speed/fan profile. The GPU kept failing despite the cooling system working harder and keeping the chip relatively cool. Bad design.
The PS3 never experienced premature GPU failures, and the silicon for it and the 360 is made in the same factories. Sony designed a much better cooling solution than MS.
Exactly. And you need to learn the difference between a chip failing because of an internal flaw and a chip failing because of poor cooling. In fact, the fansink on the early Xbox 360s does not make proper contact, so the chip gets hot as hell. By reapplying thermal compound and clamping the HSF properly, the units then work without issue. I've done several myself; it's not a difficult repair.
no, they are both wrong.
Care to explain how my statement is wrong?
 

AnandThenMan

Diamond Member
Nov 11, 2004
Can we please put to bed the argument that Nvidia GPUs were not defective? Nvidia themselves admitted as much:

NVIDIA is committed to providing our customers with quality products that push the edge of technology and also continuously improve product quality and reliability. To help improve the product quality and ensure smooth and uninterrupted product supply during the current “end stage” of life cycle, NVIDIA strongly recommends that customers transition to this latest revision of the NB8E-SET GPUs as soon as possible. These latest revision units utilize “Hitachi” underfill packaging material that improves product quality and enhances operating life by improved thermal cycling reliability.

source

Nvidia apparently has a solution to the problems faced by many in their notebooks powered by GeForce 8M series, that is, to encourage the OEM/ODMs to buy their new problem free chips. Of course, this doesn't solve the issues faced by current users and there won't be any replacement for them. Nvidia has come out with a new GPU called NB8E-SET which is essentially NB8E-SE with the new underfill. NB8E-SE is used on many notebooks most commonly advertised as GeForce 8700M GT, GeForce 8800M GS or GeForce 9650M GS. NB8E-SET (G84-751) uses the new Hitachi underfill and is clocked at 625MHz core and 128-bit 800MHz GDDR2/3 memories similar to the current NB8E-SE.

http://vr-zone.com/articles/nvidia-...us--buy-our-new-chips/6351.html#ixzz1RjFvqzDF
 

notty22

Diamond Member
Jan 1, 2010
The passion for defending AMD: apparently AMD can do no wrong. If the Xbox GPU had been an Nvidia-designed GPU, I'd bet I'd see (armchair) claims of them under-rating its TDP, which is probably why the original cooling solution failed so often.
 

Outrage

Senior member
Oct 9, 1999
Care to explain how my statement is wrong?
The GPU ASIC design was faulty. If it had been done right from the beginning by people who knew what they were doing (i.e., not Microsoft), the problems wouldn't have been as massive as they were.

Microsoft tried to save a few million dollars, and that cost them billions; they had to spend those millions anyway to get the ASIC redone by another, undisclosed company, probably AMD.
 

busydude

Diamond Member
Feb 5, 2010
The passion for defending AMD: apparently AMD can do no wrong. If the Xbox GPU had been an Nvidia-designed GPU, I'd bet I'd see (armchair) claims of them under-rating its TDP, which is probably why the original cooling solution failed so often.

Your blind passion for defending Nvidia actually makes neutral members hate that company even more.
 

AnandThenMan

Diamond Member
Nov 11, 2004
The GPU ASIC design was faulty...
True, but not the silicon itself. The chip was poorly soldered to the motherboard; combine that with a cheaply designed and attached fansink, and it was a recipe for failure. But the Xbox 360's problems are nothing like bumpgate.

To Microsoft's credit, they did a good job of replacing defective units, even though it ended up costing MS over a billion dollars at minimum. Really bad management decisions were made initially; the original Xbox 360 was very poorly designed.

edit - this explains it pretty well
The problem is that the cooling design of the 360 doesn't hold up. The cooling of the CPU was well done, with a heat pipe to draw the heat away from the chip (and accordingly, away from the mainboard). The problem is that the GPU and its low-profile heatsink sit under the DVD drive and are given a very narrow channel for air to be pulled across the heatsink by the fans. When the GPU heats up enough, not only does it reflow the solder in the ball grid array slightly, it can cause the entire mainboard to flex - a phenomenon largely caused by the X-shaped brackets that hold the heatsinks on under the mainboard. They hold the heatsinks down to the chips with a tension fit that presses up directly underneath those chips.

So when the system gets too hot, the combination of loosened solder with a mainboard that flexes from heat causes the GPU or CPU to actually break its connection from the board - resulting in the 3 red lights and secondary error code 0102 (the “unknown hardware error” code).
 

Outrage

Senior member
Oct 9, 1999
The passion for defending AMD, can do no wrong. If the xbox gpu was a nvidia designed gpu, I'd bet I see claims (armchair) of them under-rating its TDP. Which is probably why the original cooling solution failed so often.

[not serious] You could put the blame on Nvidia for MS's RRoD problems: if Nvidia hadn't been so greedy with the GPU in the original Xbox, MS would probably have bought the complete GPU package from a GPU vendor instead of trying to do most of it themselves. :eek: [/not serious]
 

reallyscrued

Platinum Member
Jul 28, 2004
I believe Nintendo already confirmed that the Wii U has an RV770-style GPU in it. Chances are that Microsoft is going to continue with AMD as well, which leaves only Sony as the "unknown". Since consoles have limited cooling space and the original Xbox 360 was heavily criticized for being loud, I believe there will be an added emphasis on power efficiency and heat. Here both the AMD 5000 and 6000 series chips are simply superior in performance per watt to the GTX 4xx/5xx series, so it only makes logical sense that AMD is the preferred supplier for consoles in the next generation.

In addition, NV's higher-end chips need a more costly 320/384-bit memory interface, which will add significant cost to the console motherboard design. It would be far cheaper to implement a 256-bit memory interface with GDDR5, which again AMD provides.

Either way, it's far more important to see how powerful the GPUs will be, rather than what vendor will provide them. I mean if they put an HD6770 or an HD6550 as part of the APU into the PS4, I will not be impressed, regardless of how efficient that chip is. So I am waiting until exact specs are released. At least if all 3 GPUs will be provided by AMD, there will be no more fanboy arguments over which GPU is faster since you would be able to compare them across AMD architectures :)

It is unfortunate that the consoles are launching during this "stagnant GPU" period, though. GPUs haven't really improved in performance that much since the HD5870 was released in September 2009. I am still hoping either Sony or Microsoft squeezes a 28nm Kepler or 28nm HD7000-series part into their 2012-2013 consoles, even if it is one of the mid-range offerings (given the limitations on power consumption). But that's probably unlikely considering development is well under way for both. I am guessing there is a 95% likelihood the next-gen PS4/720 will have an HD5000 or HD6000 derivative GPU.

A very good post. It even had humor.

At least if all 3 GPUs will be provided by AMD, there will be no more fanboy arguments over which GPU is faster

Hahahahah! Are you new to the internet? :D
 

HeXen

Diamond Member
Dec 13, 2009
There is always the PPC 7; I hear it's pretty powerful.
The real issue for all three console makers is all the people making PC vs. console threads and comparing hardware. One nice thing about the Cell, initially, was that direct comparisons weren't too feasible and no one really knew the full extent of it for quite a while, but the GPU wasn't different enough.

I don't care if a new PS4/720 is less powerful than my PC rig, since they always make better use of their hardware than I get out of mine anyway, but hopefully the architectures are different enough that I don't have to see so many stupid VS threads, or better yet, that I won't be shunned for preferring the console.
 

Pantalaimon

Senior member
Feb 6, 2006
If the RRoD problems were ATI/AMD's fault, why did Microsoft foot the bill?

Do people really think a company like Microsoft would have paid the costs of replacing those faulty Xbox 360s themselves if the fault lay with ATI/AMD?

Contrast that with the many laptop manufacturers who had bumpgate problems, where it was NVIDIA who had to pay for the replacements regardless of who manufactured the laptops.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
I predict Nvidia will be the GPU supplier to Sony. Nvidia will give Sony a deal they can't refuse, even if it means losing a bit of money in the process. Nvidia can't afford to take the PR hit, and can't afford to be locked out of the next-gen consolescape from a platform/software perspective.