[hexus.net] AMD claims it will power another gaming device


ShintaiDK

Lifer
Apr 22, 2012
From day one you keep talking trash about the PS4 and XB1 without offering any better alternative for how to make superior gaming consoles. Pretty much at every opportunity you trash those consoles' APUs, even in cases like Unity, which is clearly the worst-optimized game of 2014!

It's a known fact that the two consoles failed to deliver what they should have, aka 1080p/60 FPS. On top of that, we also enjoy sub-1080p cinematics and resolutions, and 30 FPS to sub-30 FPS in large amounts, for the same reason: mainly that the CPU in both is blatantly weak. And yes, there were better AMD alternatives, but they didn't look as good on paper. Funny, since you talk about your "high standards" yet accept subpar products. You've got double morals.
 

mrmt

Diamond Member
Aug 18, 2012
Here's something you might not like to hear: if there is guaranteed profit to be made, an actual shareholder demands it from the company. It seems like many of you have bought into the companies' responses to losing out on contracts: "It just doesn't fit our profit margin."

But that's exactly how it works. Just as your local reseller won't sell you things below a certain price, an IHV will only design for you if you reach a certain return threshold.

For AMD the consoles were really a gift: not only did they bring positive cash flow, something AMD sorely lacks, they also allowed AMD to fulfill most of the WSA commitments that had been haunting the company since 2012. Nvidia, OTOH, had no WSA commitments, did have positive-cash-flow businesses, and was already trying to develop other businesses with higher returns. Why should they shift resources to fight for the console chips?
 

ShintaiDK

Lifer
Apr 22, 2012
Don't be ridiculous. It may not be a silver bullet that cures all of AMD's woes, but it helps. Making money is still better than not making money.

I never said it wasn't good for AMD in their dire situation. Read what I answered to.
 
Apr 20, 2008
If you're naive enough to believe a company that designs video cards wouldn't want its products in every major video game system, you're out of your mind. Not only that, but with guaranteed profit from the beginning. Four-fifths of business is market share. Intel used its brand and marketing to survive Netburst, and it was that large market penetration that kept them afloat. Every company could use some cushion by way of product recognition.

AMD states their APU was semi-custom, meaning they slapped two existing products together and shipped it. Pairing up with Intel was clearly a pride issue, or a rejection. They can spin it to their investors all they want, but losing out on millions in pure profit on a product they had already designed is the worst outcome possible.
 

mrmt

Diamond Member
Aug 18, 2012
Sony, MS, AMD, and owners of those consoles have won. IBM (PowerPC)/NV and IBM's/NV's shareholders are the losers.

Yeah, Nvidia's shareholders are "losing", but laughing all the way to the bank, and I'm sure IBM's shareholders really noticed the lost console contracts in the middle of the $100 billion of revenue they get from much more profitable businesses.

I think it's already clear that nobody else would race to the bottom the way AMD did on console prices. Nobody *needs* the kind of margins AMD gets from the console contracts for the TAM those contracts reach. AMD's poor financial situation, and its willingness to swallow 15-17% operating margins, were a perfect match for the console makers, who didn't want to get burned by high hardware costs like in the last generation.

Just because a firm diversifies into other sectors doesn't mean it has abandoned the PC segment, as you keep implying. Much to your dissatisfaction, AMD's focus on sectors outside the traditional PC market won't stop AMD's 390X from blowing your 980 out of the water in 2015.

It's one thing to add resources to your company and then pursue other markets; that's what Intel and Nvidia have been doing in recent years. It's another thing entirely to cut R&D in half *and then* pursue other markets. AMD is spreading itself thinner, and if it doesn't exit some markets, or severely restrict the scope of the projects it is undertaking, bad things will happen.
 
Apr 20, 2008
You just don't get it. You're clearly not near any sort of upper level management or running your own business. Stable, guaranteed profit on a product that's damn near 100% complete for a semi-custom application provides steady income while you develop in other related markets. Those chips are one-and-dones that keep generating profit for half a decade. Even if it only broke even with all expenses considered (including salaries), you take it to increase market share and further your brand.

What does it tell the average Joe when no console contains nVidia architecture, but their main competitor in the desktop space is in all of them?
 

ShintaiDK

Lifer
Apr 22, 2012
What does it tell the average Joe when no console contains nVidia architecture, but their main competitor in the desktop space is in all of them?

The average Joe has no idea who powers his console. He only knows it's Sony/MS/Nintendo.
 

NTMBK

Lifer
Nov 14, 2011
The average Joe has no idea who powers his console. He only knows it's Sony/MS/Nintendo.

I knew who made the graphics in my Gamecube and Wii, thanks to the little ATi sticker on the side. ;) But yeah, even that seems to have vanished with the latest generation.
 

mrmt

Diamond Member
Aug 18, 2012
You just don't get it. You're clearly not near any sort of upper level management or running your own business.

I don't know what kind of company you run or work for, but in the company I work for, every single "bright" idea someone brings to the upper echelons must have a financial plan attached, and if that financial plan falls below the targeted returns, someone must give a hell of an explanation for the business to go ahead. Things like "stable, guaranteed profit" or "brand awareness" don't cut it, because even those must adhere to the return guidelines.

In my case, if I were proposing a business with subpar returns like the console chips, the first question I would be asked is why the company should shift resources from a more profitable business to a less profitable one, and things would go downhill from there. Unless I could find a pool of idle resources about to be fired, that's a project I would never present to my BoD. That's probably what would happen at Nvidia or Intel as well, but not at AMD, because its other businesses have even lower returns than the console chips.
 

nenforcer

Golden Member
Aug 26, 2008
The last (7th) generation, the PS3 / Xbox 360, was stretched to eight years (2005-2013), three years beyond the normal five-year console lifespan, due in part to the global financial crisis. Both of those consoles had HDMI outputs supporting up to 1080p but could only really render at 720p (PS3) or 900p (Xbox 360).

The new (8th) generation consoles appear a little underwhelming in hardware specs compared to the hype around the IBM Cell and Xbox Xenon, considering they still can barely do 1080p/30 FPS games, with even fewer at 1080p/60 FPS (although it's still very early in the consoles' lives, with much room left for developer optimization).

I don't think this hardware will last as long (or age as well) as the 7th generation did, and we'll probably see the Xbox Next or PlayStation 5 in a shorter timeframe, back on the previous five-year console lifecycle (2018).

I think the 10th generation will get very interesting, with, in all likelihood, an ARM chip powering one or more consoles (possibly an Nvidia Tegra), an AMD x86 APU in there, and probably still some form of an IBM POWER chip for the Nintendo console.
 
Aug 11, 2008
We have heard all these arguments before. It is very simple really: AMD had the best product for the specific application. The stars just sort of aligned for AMD in this case. I don't really see it as any grand technical or marketing feat; neither Intel nor nVidia had both the CPU and GPU available to make an integrated product. The consoles are a compromise, however, and I don't understand why AMD fans get so defensive when it is pointed out that the CPU is weak. That is just a fact; you don't have to be an AMD hater to realize it. And the performance of the new consoles is good, but could be better, as expected from a compromise product.

It was a nice win for AMD, and a desperately needed one in light of their financial situation. OTOH, in the grand scheme of things, compared to the total x86 and GPU markets, consoles are fairly minor, so I doubt Intel and nVidia are losing any sleep over not getting the contracts.
 

cytg111

Lifer
Mar 17, 2008
It's a known fact that the two consoles failed to deliver what they should have, aka 1080p/60 FPS. On top of that, we also enjoy sub-1080p cinematics and resolutions, and 30 FPS to sub-30 FPS in large amounts, for the same reason: mainly that the CPU in both is blatantly weak. And yes, there were better AMD alternatives, but they didn't look as good on paper. Funny, since you talk about your "high standards" yet accept subpar products. You've got double morals.

It is a known fact that rhetoric like "it is a known fact" is used to convey opinion as fact in the absence of evidence.
 

kawi6rr

Senior member
Oct 17, 2013
It's a known fact that the two consoles failed to deliver what they should have, aka 1080p/60 FPS. On top of that, we also enjoy sub-1080p cinematics and resolutions, and 30 FPS to sub-30 FPS in large amounts, for the same reason: mainly that the CPU in both is blatantly weak. And yes, there were better AMD alternatives, but they didn't look as good on paper. Funny, since you talk about your "high standards" yet accept subpar products. You've got double morals.

If it's a known fact they failed to deliver, then post the source! Yawn... I get sleepy reading your posts; same old thing, different day.
 

DrMrLordX

Lifer
Apr 27, 2000
Have you tried running AMD proprietary drivers in Linux Mint 17.1?

Linux Mint has a driver manager under the control panel (unique to Mint; Ubuntu doesn't have it) that is as simple as ticking the box for whichever proprietary driver a person wants to use. All that is required after that is a full reboot of the computer (which they don't tell a person to do... but which is apparently required to get everything working straight).

P.S. One major downside, as far as I can determine, is that the latest proprietary drivers are not an option in the Mint driver manager.

Haven't bothered with Mint, though Ubuntu does offer pushbutton installation of proprietary drivers in Synaptic. You can pick fglrx or fglrx-updates. When I used that, it just sort of locked up and did nothing. I have never had problems doing the same with an Nvidia card.
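
For reference, the pushbutton route maps onto the same two packages from a terminal. A minimal sketch, assuming Ubuntu 14.04-era repositories (install one of the two packages, not both):

    # Stock Catalyst from the Ubuntu "restricted" component:
    sudo apt-get update
    sudo apt-get install fglrx
    # ...or the newer point-release channel instead:
    # sudo apt-get install fglrx-updates
    # Then the full reboot noted above:
    sudo reboot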

The main issues right now with Catalyst under Linux are the wine incompatibility (which is only an issue for those who want to use wine, mind you) and the silly Xorg incompatibility. The Xorg issue ceased to be a problem when they rolled out an Ubuntu-only update to 14.9 that fixed it (14.12 is also compatible with Xorg 1.16 and later). The wine problem... is still there, and may be for a while.

Then there's the monitor autodetect flaw built into fglrx (and, it turns out, the Windows driver too). The driver detects resolution support that is a fiction (example: my junky Gateway monitor supports 1280x1024 max, but the driver thinks it can do 1600x1200). It defaults to the highest detected resolution, which sends the monitor out of range, and for some bizarre reason this breaks xrandr/xset badly enough that the resolution cannot be changed by hand from the command line. Fixing that involves setting up an xorg.conf file, which Ubuntu (and many other distros) no longer even use by default. Fortunately, aticonfig --initial creates a base file that is relatively easy to modify.
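
To make that concrete, here is a minimal sketch of the sections to edit in the generated file, pinning the 1280x1024 mode from the Gateway example above (the Identifier names are placeholders from the aticonfig skeleton; adjust the mode to your own panel):

    # Generate a skeleton /etc/X11/xorg.conf first (requires root):
    #   sudo aticonfig --initial
    Section "Monitor"
        Identifier "Monitor0"
        # Force the panel's real maximum so fglrx can't pick a phantom mode
        Option "PreferredMode" "1280x1024"
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "Monitor0"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "1280x1024"
        EndSubSection
    EndSection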

In Windows, thanks to MS not being criminally insane (all the time), the OS will default to a resolution that will actually work assuming a similar resolution was in use before installing Catalyst. Sure, Catalyst will still offer me non-functional resolutions like 1600x1200, but the important thing is that it doesn't default to any of those.

Bottom line is this: it is theoretically possible to pick a Linux distro with pushbutton fglrx installation, grab a version of 14.9 that is compatible with the standard version of X, and still have problems thanks to the monitor being pushed irreparably into an unsupported resolution. The solution involves hand-editing a .conf file in a directory whose contents generally cannot be altered unless the user asserts root privs. And that doesn't even cover the wine thing.

edit: in all fairness to AMD, they have been rolling out drivers rapidly over the last year, and many things have changed. It isn't that hard to get 14.12 to work once you figure out what's going wrong. It's not as if the driver itself is completely non-functional (though it seems to break some games for people under Linux and Windows... not sure why). In fact, it's very nice compared to Gallium 0.4. The real problem is the shortage of good information on what to do if your driver install goes off the rails. That unsupported-monitor-resolution problem has been reported repeatedly for years, and for most users it's a show-stopper.
 

railven

Diamond Member
Mar 25, 2010
It is a known fact that rhetoric like "it is a known fact" is used to convey opinion as fact in the absence of evidence.

If it's a known fact they failed to deliver, then post the source! Yawn... I get sleepy reading your posts; same old thing, different day.

Are you guys claiming the new consoles are not seriously underpowered in comparison to their predecessors?

I was excited about the Xbox 360's hardware. The day it released, it put PCs to shame. The PS3's Cell was a marvel when it released, although untapped.

Today's consoles are a joke.
 

monstercameron

Diamond Member
Feb 12, 2013
Are you guys claiming the new consoles are not seriously underpowered in comparison to their predecessors?

I was excited about the Xbox 360's hardware. The day it released, it put PCs to shame. The PS3's Cell was a marvel when it released, although untapped.

Today's consoles are a joke.

They are faster in every single aspect: CPU throughput, GPU throughput, memory bandwidth, power consumption, feature set, ease of use, future-proofing, etc.

Just try to remember that a 1.6 GHz Jaguar core outperforms a 3.2 GHz Xenon core, and the GPU has a tonne of compute performance that wasn't available in the last generation of consoles.
 

railven

Diamond Member
Mar 25, 2010
They are faster in every single aspect: CPU throughput, GPU throughput, memory bandwidth, power consumption, feature set, ease of use, future-proofing, etc.

Just try to remember that a 1.6 GHz Jaguar core outperforms a 3.2 GHz Xenon core, and the GPU has a tonne of compute performance that wasn't available in the last generation of consoles.

Yeah, they're faster than their predecessors, but not by the kind of leap with which Gen 7 roflstomped Gen 6.

We're how many years into this "HD era", and our current consoles are struggling to do 1080p/60, often having to settle for tricks such as vertical-resolution upscaling, 30 FPS (or worse), and resolutions that were common in the 360/PS3 days.

The fidelity has improved, but I was hoping consoles would at least hit 1080p/60 this time around.

I'm so disappointed that it is now acceptable to have microstutter and tearing on consoles. It is without a doubt a shame how weak these new consoles are.
 

monstercameron

Diamond Member
Feb 12, 2013
Yeah, they're faster than their predecessors, but not by the kind of leap with which Gen 7 roflstomped Gen 6.

We're how many years into this "HD era", and our current consoles are struggling to do 1080p/60, often having to settle for tricks such as vertical-resolution upscaling, 30 FPS (or worse), and resolutions that were common in the 360/PS3 days.

The fidelity has improved, but I was hoping consoles would at least hit 1080p/60 this time around.

I'm so disappointed that it is now acceptable to have microstutter and tearing on consoles. It is without a doubt a shame how weak these new consoles are.

I'm sorry, but those are design decisions, not a lack of adequate performance. The devs chose to design the games that way; it's not that the hardware magically degrades performance.
 

railven

Diamond Member
Mar 25, 2010
I'm sorry, but those are design decisions, not a lack of adequate performance. The devs chose to design the games that way; it's not that the hardware magically degrades performance.

Of course they are design decisions. Decisions made, possibly after balancing everything, to match a level of "acceptable" performance.

900p at sub-30 FPS for a game is a blazing example of design decisions having to compensate for underwhelming performance/hardware.

You make it sound like the devs are more than happy to scale back their games. A good portion of dev rel has already stated that the consoles are underpowered.

It isn't an uncommon opinion.
 

turtile

Senior member
Aug 19, 2014
It's a known fact that the two consoles failed to deliver what they should have, aka 1080p/60 FPS. On top of that, we also enjoy sub-1080p cinematics and resolutions, and 30 FPS to sub-30 FPS in large amounts, for the same reason: mainly that the CPU in both is blatantly weak. And yes, there were better AMD alternatives, but they didn't look as good on paper. Funny, since you talk about your "high standards" yet accept subpar products. You've got double morals.

Well, the hardware in both systems is capable of 1080p and above. Developers would rather make the game look better and scale back the resolution.

The choice of Jaguar was nothing other than a way to keep costs low. It has more compute power for its size and uses less power than Richland.

Microsoft and Sony sell consoles to make money. The lower the price, the more they'll sell. How many people refuse to buy the next generation due to the resolution? Not many... but price is a huge factor. Why do you think the X1 outsold the PS4 in November? The price cut...
 

monstercameron

Diamond Member
Feb 12, 2013
Of course they are design decisions. Decisions made, possibly after balancing everything, to match a level of "acceptable" performance.

900p at sub-30 FPS for a game is a blazing example of design decisions having to compensate for underwhelming performance/hardware.

You make it sound like the devs are more than happy to scale back their games. A good portion of dev rel has already stated that the consoles are underpowered.

It isn't an uncommon opinion.

Your argument is centered on 1080p/60, yet there are many games that run at 1080p/60 even on a lowly Wii U. Again, these are design decisions the devs made. A perfect example is BF4, which runs nearly flawlessly at 1080p/60 on massive maps on the PS4; that's because they designed their engine to be multithreaded from day one.