AMD's Unified Gaming Strategy and the Future of Graphics

Page 3 - AnandTech Forums
Status
Not open for further replies.

psoomah

Senior member
May 13, 2010
416
0
0
Steam is great no doubt. Somebody had to do it. And in reality someone will always innovate in the PC game market. Maybe Valve maybe someone else. Their Big Picture interface makes it just as easy to use my PC as an Xbox.

But from what little I read, HSA optimizations work only in an iGPU situation, right? If it does work with a discrete GPU, then it's an open standard and it would behoove nVidia to incorporate it.

It will work with any combination of CPU, APU or GPU ... as long as they are architected to HSA specifications.

ARM and Qualcomm, for example, are founding members of the HSA Foundation.

Their future processors are certain to meet HSA specifications.

HSA is designed to provide a common interface that allows integration of an array of processing elements: an SoC with both an AMD APU and an ARM APU, for example. A single-chip solution for a tablet, notebook or laptop could run both Windows and Android AND allow for an outboard GPU, all seamlessly working together.
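In practical terms, that "common interface" means every HSA agent shares one coherent virtual address space, so offloading work is just handing over a pointer rather than copying a buffer to device memory and back. A rough conceptual sketch of that idea (plain Python, not any real HSA API; the `Device`/`dispatch` names are made up for illustration):

```python
import array

class Device:
    """Stand-in for any HSA agent (CPU core, GPU compute unit, DSP...)."""
    def __init__(self, name):
        self.name = name

    def dispatch(self, kernel, buf):
        # In the HSA model the agent operates on the caller's memory in
        # place: no explicit host<->device copy before or after the kernel.
        kernel(buf)

def scale(buf):          # "kernel" run on one agent
    for i in range(len(buf)):
        buf[i] *= 2.0

def offset(buf):         # "kernel" run on another agent
    for i in range(len(buf)):
        buf[i] += 1.0

shared = array.array("f", [1.0, 2.0, 3.0, 4.0])  # one buffer, one address space

Device("CPU").dispatch(scale, shared)   # CPU doubles the data in place
Device("GPU").dispatch(offset, shared)  # "GPU" sees the same memory, adds 1

print(list(shared))  # [3.0, 5.0, 7.0, 9.0]
```

Contrast with the pre-HSA model, where each dispatch would be bracketed by an explicit copy to device memory and a copy back, which is exactly the overhead the shared address space removes.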
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
The PS4 has an APU with 7850 class graphics that runs full bore at 65 watts.

Q4 2014 will see 20nm APUs with at least 7870 graphics in that same 65 watt range.

Nothing you just said makes any sense.

Can you share the video showing the full load power consumption of the final version of the PS4? Even just a kill-a-watt, that would be great!

PS4 has GCN 1.1, AKA a 7790 with more SP.

GCN 1.0 is a power sucker, even nVidia's biggest chip destroys it in perf/w.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
It will work with any combination of CPU, APU or GPU ... as long as they are architected to HSA specifications.

ARM and Qualcomm, for example, are founding members of the HSA Foundation.

Their future processors are certain to meet HSA specifications.

HSA is designed to provide a common interface that allows integration of an array of processing elements: an SoC with both an AMD APU and an ARM APU, for example. A single-chip solution for a tablet, notebook or laptop could run both Windows and Android AND allow for an outboard GPU, all seamlessly working together.

And is still slower than my current PC...so...not replacing dedicated GPU
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Can you share the video showing the full load power consumption of the final version of the PS4? Even just a kill-a-watt, that would be great!

PS4 has GCN 1.1, AKA a 7790 with more SP.

GCN 1.0 is a power sucker, even nVidia's biggest chip destroys it in perf/w.

I believe the PS4 actually graces us with a 7870-class GPU. The Xbox One, on the other hand, has that super wimpy 7790. And the audacity to charge an extra $100 for it. It's the "somebody's watching you" premium.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
It will work with any combination of CPU, APU or GPU ... as long as they are architected to HSA specifications.

ARM and Qualcomm, for example, are founding members of the HSA Foundation.

Their future processors are certain to meet HSA specifications.

HSA is designed to provide a common interface that allows integration of an array of processing elements: an SoC with both an AMD APU and an ARM APU, for example. A single-chip solution for a tablet, notebook or laptop could run both Windows and Android AND allow for an outboard GPU, all seamlessly working together.

It's good tech to have standards.

So Samsung has released a Win 8/Android tablet that dual-boots, but apparently it can switch OSes without rebooting into them. Probably does a resume from SSD. The other feature is that you can put Win apps on the Android desktop and vice versa, and apparently launch apps from either OS. I really want to see how that works in practice.
 

psoomah

Senior member
May 13, 2010
416
0
0
Can you share the video showing the full load power consumption of the final version of the PS4? Even just a kill-a-watt, that would be great!

PS4 has GCN 1.1, AKA a 7790 with more SP.

GCN 1.0 is a power sucker, even nVidia's biggest chip destroys it in perf/w.

Sony specifically said the PS4 has 2013 next-gen (GCN 2.0) graphics. Kaveri and the 8xxx GPUs are coming out concurrently with the PS4, and GCN 2.0 specifically addresses the power/efficiency issue; in fact, AMD says it is substantially more efficient than anything currently on the market. So why would Sony not include that advanced power-saving graphics technology in their SoC?
 

psoomah

Senior member
May 13, 2010
416
0
0
It's good tech to have standards.

So Samsung has released a Win 8/Android tablet that dual-boots, but apparently it can switch OSes without rebooting into them. Probably does a resume from SSD. The other feature is that you can put Win apps on the Android desktop and vice versa, and apparently launch apps from either OS. I really want to see how that works in practice.

Samsung is also a founding member of the HSA Foundation. Expect their next such tablet to contain HSA-compliant AMD/ARM hardware and provide substantially better performance and battery life.
 

psoomah

Senior member
May 13, 2010
416
0
0
And, unsurprisingly, your dog is about the only thing that will find something other than comedic value in your posts.

My dog doesn't worship me for my brilliant posts.

That would be awesome though.

I said, stay on topic.
-- stahlhart
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think the die size of my GPU is nearly 4 times larger than that of my CPU, so I can't wait to see this new CPU that offers all the performance of my CPU and GPU combined. Especially considering they are having enough problems keeping temps in check on Haswell.

While you are right that more people will be fine with their iGPU as time goes on, the discrete GPU will not just die. There will always be a market for it. Maybe a smaller market, maybe prices will go up, but people will always be there to buy a discrete card.

Even if in the end they are pro-level cards trickled down to consumers, like the Titan or GTX 780. People bought those, and nVidia is making marginal revenue from them. The market will exist for a decade at least. There are people still buying vinyl, for heaven's sake.

When the market shrinks enough, devs will stop supporting those few people who would pay for a discrete card. If the devs don't offer a reason to get a discrete card, then you won't likely go out and want to buy one either, and at that point these cards will be awfully expensive.

When IGPs are good enough to support high enough visuals, I don't know how much we are going to care about getting a tiny bit higher IQ. Five years ago, we all had to have Ultra graphics, because the difference between that and Medium was huge. Today, Medium graphics are starting to not look much worse than Ultra. That trend will likely continue: the closer to realism we get, the smaller the steps are.

I could be wrong, but there is a very good chance I'm not.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
When the market shrinks enough, devs will stop supporting those few people who would pay for a discrete card. If the devs don't offer a reason to get a discrete card, then you won't likely go out and want to buy one either, and at that point these cards will be awfully expensive.

When IGPs are good enough to support high enough visuals, I don't know how much we are going to care about getting a tiny bit higher IQ. Five years ago, we all had to have Ultra graphics, because the difference between that and Medium was huge. Today, Medium graphics are starting to not look much worse than Ultra. That trend will likely continue: the closer to realism we get, the smaller the steps are.

I could be wrong, but there is a very good chance I'm not.

Then there's always a dedicated GPU that does things better, faster, higher resolution, with more features, faster memory, and can do higher AA levels. That isn't changing.

iGPUs will only change the low end spec from Intel HD3000 or whatnot, to something a little more substantial. When iGPUs advance so does the rest of the industry. Including a dedicated GPU. You may be fine with medium, I am not. That's why there is a choice and I choose more power.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Then there's always a dedicated GPU that does things better, faster, higher resolution, with more features, faster memory, and can do higher AA levels. That isn't changing.

iGPUs will only change the low end spec from Intel HD3000 or whatnot, to something a little more substantial. When iGPUs advance so does the rest of the industry. Including a dedicated GPU. You may be fine with medium, I am not. That's why there is a choice and I choose more power.

You're making irrelevant arguments. The market for "high-end" graphics (which has already gone from $500 to $1000) will shrink and become unprofitable. Then it will die.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
You're making irrelevant arguments. The market for "high-end" graphics (which has already gone from $500 to $1000) will shrink and become unprofitable. Then it will die.

I bet it won't... because no APU or iGPU will be able to match the power necessary to play the latest games above 1080p resolutions.

If it could, why are people talking about "medium settings becoming acceptable on PC"? You guys are acting like graphics demands will go backwards and people will be content with PS3/360 graphics forever.

For the record, an OC'd GTX 780 > GTX Titan and isn't $1000. It's not exactly cheap, but it is most definitely a high-end card.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
You're making irrelevant arguments. The market for "high-end" graphics (which has already gone from $500 to $1000) will shrink and become unprofitable. Then it will die.

AMD, with their $40 APUs and $300 barely-making-any-profit high-end GPUs, might agree with you.

But with the Steam-best-seller GTX 670 at $400+, with the $1000-range bestseller Titan, with the new $650 because-we-can mid/high-end GTX 780, and with their $4000 pro cards used in $25,000 GRID workstations, NVIDIA probably disagrees.

We are not all the same. Income distribution has never been more dispersed than in the last 40 or so years. What's too expensive for some is quite an acceptable price for others. That there is no money in an XYZ+ dollar market is ludicrous. Top-of-the-line products will ALWAYS, and especially today, find a buyer.
 

Siberian

Senior member
Jul 10, 2012
258
0
0
You're making irrelevant arguments. The market for "high-end" graphics (which has already gone from $500 to $1000) will shrink and become unprofitable. Then it will die.


So high-end graphics are able to sell at a higher price, and that means it's going to die? AMD will be gone long before the high-end graphics market is gone. One is making money, the other is not.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I bet it won't... because no APU or iGPU will be able to match the power necessary to play the latest games above 1080p resolutions.

If it could, why are people talking about "medium settings becoming acceptable on PC"?

For the record, an OC'd GTX 780 > GTX Titan and isn't $1000. It's not exactly cheap, but it is most definitely a high-end card.

So now you're using the "I can overclock my card to get the next higher card's stock performance" argument? Remember that next time someone says their 7970 will O/C past your 780.

O/C 7970 = 98.3 > stock 780 = 96.4


Besides, the 570 (the card that's most comparable) was $350 at launch, compared to the 780 at $650. Nothing to make an argument with.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So high-end graphics are able to sell at a higher price, and that means it's going to die? AMD will be gone long before the high-end graphics market is gone. One is making money, the other is not.

I'm not predicting the future of the two companies. I'm talking about the future of the dGPU. You can talk all the anti-AMD rhetoric you want; it just shows that you can't address the point of the thread, "The Future of Graphics".
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
"iGPs will continue to get more powerful and practical for gaming, and dGPUs will become increasingly less necessary, blah blah blah"

I see this becoming the case for mobile dedicated graphics, but it's pretty unlikely for desktops. And as gamers we'll still happily shell out the cash for high-end dedicated graphics in both laptops and desktops. The higher-end iGPs just continue to eat up die space, which adds to the cost of the processor for desktop users, unless the HSA dream really becomes a reality.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
What I'm saying is that the more advanced GPUs become, the smaller the improvements we see. Even 5 years ago, the difference between low, medium and high was quite large. The difference between low, medium and high today is much less. As time goes on, the difference in image quality between low, medium and high graphics is likely going to shrink, to the point that a discrete GPU will become less and less desirable.

It isn't that you can't do more with a discrete card. It's not that it won't continue to be more powerful, but for gaming purposes the practical difference is shrinking. The end product will start showing less of a tangible difference. At some point, we'll stop caring to spend all that money on higher-end graphics.
 

NIGELG

Senior member
Nov 4, 2009
852
31
91
AMD will be gone long before the high-end graphics market is gone.
You wish.

The original poster sounds like a marketer... or a dreamer.

I don't want either company to dominate. I would like to see a 50/50 share... or even 33/33/33 Intel/Nvidia/AMD.

The Nvidia fanboys don't seem to want ANY competition to their beloved company... they want AMD to be "long gone" etc. etc. o_O
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
If anything, I think AMD stands to gain the most by exiting high-end discrete graphics. It's not really making them money, and they got what they needed by buying ATi.

They are headed in the right direction. What they need to do is out-Atom Intel. Intel is moving slowly enough that ARM owns that market, and AMD might garner a slice of it if they move fast enough. Tablets and ultrabooks are going to be the bulk of the future computing market.

That would leave only nVidia in high-end discrete graphics, and it would be a monopoly. Costs will go up for those who choose PC gaming with a discrete card, but the market will still exist. If anything, nVidia tested it out pretty darn well with the Titan, proving that people will buy a $1000 card for gaming even knowing that a similar-performing card will show up within months.

There are, after all, people dropping thousands on those tube amps.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
There are after all people dropping thousands on those tube amps.

The income from those amps wouldn't make a dent in the R&D costs for dGPUs. Besides, who wants to pay Audio Research prices for video cards?


$55,000 a pair :D
 