The future of AMD in graphics


coercitiv

Diamond Member
Jan 24, 2014
7,250
17,086
136
Even firesale 580s are a tough ask. I bought one here a few months ago for my nephew, and it was really disappointing. Hot, loud, inefficient (XFX XXX I think, have to check my posts in FS/FT to find it again).
The XFX should have no problem performing at silent levels; unfortunately, its Silent BIOS profile was replaced by one with mining-oriented settings (1150 MHz max clocks, better memory timings).

Nevertheless, I would still recommend that your nephew switch to the mining BIOS, then make two adjustments in Wattman: increase max clocks by 100-150 MHz and raise the target temperature from 65°C to 75°C.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
22,751
12,753
136
Okay, @Innokentij , how did Radeon VII not improve power efficiency over Vega64/VegaFE? According to my power numbers posted earlier in this thread:

Radeon VII: delta power of 230W, ~17000 points in Superposition 1080p medium
Vega FE: delta power of 370W, ~15000 points in Superposition 1080p medium

Is that somehow not an improvement? I'd like to hear your thoughts on the matter.
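For a quick sanity check, here's the same comparison expressed as points per watt. This is a minimal sketch using only the delta-power and Superposition figures quoted above; nothing in it is newly measured.

Code:
# Points-per-watt comparison using the delta-power and Superposition
# scores quoted above (Superposition 1080p medium).
cards = {
    "Radeon VII": {"delta_power_w": 230, "score": 17000},
    "Vega FE":    {"delta_power_w": 370, "score": 15000},
}

for name, d in cards.items():
    ppw = d["score"] / d["delta_power_w"]
    print(f"{name}: {ppw:.1f} points per watt")

# Radeon VII ~= 73.9 pts/W, Vega FE ~= 40.5 pts/W: roughly a 1.8x
# improvement in performance per watt by these numbers.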
 

Guru

Senior member
May 5, 2017
830
361
106
Nope, the 1080Ti is solidly ahead of Vega 64 in Vulkan games, Doom and Wolf 2.
No it's not!


Skip to the gameplay footage: not only does Vega 64 match the 1080 Ti and trade blows with it, on occasion it's also faster.

In Doom the 1080 Ti is faster, but only by about 5-6%, usually around 10 fps. Not to mention you left out the whole RX 580 vs. 1060 6GB comparison, which clearly shows AMD's architecture being much better in DX12 and Vulkan in something like 90% of games.

Heck, even the RX 570 beats the 1060 6GB in some DX12 and Vulkan games.

I mean, look at Vulkan and DX12 performance in Turing over Pascal: the RTX 2060 always beats the 1070 Ti in Vulkan and most DX12 games, even though it is on par with or slower than a 1070 Ti in DX11 games. So, as I was saying, it took Nvidia three years to catch up to AMD in terms of low-level API performance!
 
  • Like
Reactions: Zstream

Guru

Senior member
May 5, 2017
830
361
106
I don't know. Definitely for gamers right now AMD doesn't have much. Even firesale 580s are a tough ask. I bought one here a few months ago for my nephew, and it was really disappointing. Hot, loud, inefficient (XFX XXX I think, have to check my posts in FS/FT to find it again).

I've recommended a $250 used Vega 56 over an RTX 2060 for an experienced user who is competent enough to carefully undervolt and 64-mod it. Other than that, it's a rare case-by-case basis. Ten times out of ten, at the same price or even a moderate premium, I'd take an 11Gbps 1060 over a 580, and a 1660 Ti over any Vega. They're just a lot less hassle: no need for a bigger PSU or extra PCIe power, and less noise and bulk. The 1660 Ti gives around Vega 64 performance with tiny cards and near-silent operation; after playing with one on day one from Micro Center, it's my go-to in the $250-300 range for sure. At mostly high settings it's able to run major titles at 1440p easily, which I couldn't say for the 580; I had to really help tweak and dial settings back significantly for 1440p on that one. A couple of years ago the 480 was one of my favorite value cards, until mining ruined things. But games have moved on (BFV, Metro Exodus, AC Odyssey, etc.), and it's just too much for the 480/580 (even the 590, for that matter) beyond 1080p. I do see them for $120 used now, on our FS/FT and Craigslist, and for that price I do think they're still very nice budget 1080p cards, certainly better than a 1050 Ti if you have the PSU for it. Even the 4GB variants, because let's face it, they're too slow to make use of 8GB and 1440p, and 4K is laughable with any of the cards in that segment, even up to the 1660 Ti/Vega 64/2060.

I do think there should be a 1660 Ti-level performer for $199, though. I'm hopeful that Navi 10 can be just that, because Nvidia will obviously just keep charging higher prices without some kind of competition.
What are you even talking about? The GTX 1060 6GB still sells for $250 to this day; admittedly, on sale you can find some single-fan cards as low as $220. Most RX 580s go for $200 or less, and I've not actually seen the 11Gbps 1060s; they basically don't exist. Even if they were still selling, the difference between the GDDR5 and GDDR5X versions is literally 0%. All the GDDR5X does is give more memory overclocking headroom, but you hit the same maximum power limit either way, so whether you have GDDR5X or not you end up at essentially the same overclock speeds because the power limit is the same. These cards can't draw more than 145 W.

RIGHT NOW there are three sales on Newegg for the RX 580 8GB, one for $170 and two for $190, and all of the RX 580s come with two bundled games.

So I don't know what you've been smoking, but the RX 580 is literally twice the value of a GTX 1060.

Most GTX 1660 Ti cards are selling for $290 right now, which is not mainstream-market pricing. $180-250 is the mainstream market in a nutshell; maybe one can stretch the budget by $20, but that's it. Anything over $250 is the high-end market.
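Just to put the value math on the table, here's a minimal price-per-performance sketch using the street prices quoted above; the relative-performance value is a placeholder assumption (treating the two cards as roughly equal on average), not benchmark data.

Code:
# Price-per-performance sketch using the sale prices quoted above.
# rel_perf is an assumed placeholder (rough parity on average), not a benchmark.
cards = [
    ("RX 580 8GB (Newegg sale)", 170, 1.00),
    ("GTX 1060 6GB (typical)",   250, 1.00),
]

for name, price, rel_perf in cards:
    print(f"{name}: ${price / rel_perf:.0f} per unit of relative performance")

# At these prices the 1060 costs about 250 / 170 ~= 1.5x as much for
# similar assumed performance; bundled games or a DX12/Vulkan edge for
# the 580 would stretch that further.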
 
  • Like
Reactions: guachi

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Even back in the day when Tahiti first launched, $200-300 was midrange. The cheapest 7950 was $425, and that was a bare-bones blower model. The 7970 was in the $550 range.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
No it's not!
Old video, using a Vega 64 LC that is also overclocked versus a standard 1080 Ti; that's a pathetic comparison, grasping at straws.
Now here are these games with the latest patches and drivers. These are the real comparisons: the 1080 Ti wipes the floor with Vega 64, and the 1080 is on par or slightly behind.


Wolfenstein_II_average_fps.png


https://techreport.com/review/34105/nvidia-geforce-rtx-2080-ti-graphics-card-reviewed/11

RTX2080-REVIEW-54.jpg

https://www.hardwarecanucks.com/for...a-geforce-rtx-2080-ti-rtx-2080-review-17.html

Wolf2_1.png


https://www.pcper.com/reviews/Graph...2080-Ti-Review/Game-Testing-Far-Cry-5-Wolfens

6jvkmtff6en11.png




https://www.guru3d.com/articles-pages/geforce-rtx-2080-ti-founders-review,21.html


wolfenstein-2_3840-2160.png


100904.png


https://www.anandtech.com/show/1334...tx-2080-ti-and-2080-founders-edition-review/9

And here is DOOM:

https://www.guru3d.com/articles-pages/msi-geforce-gtx-1080-ti-gaming-x-trio-review,19.html

NVIDIA even managed to beat AMD in their favorite benchmark (Ashes) by a good margin; how times have changed!

ashes_0.png


https://www.pcper.com/reviews/Graph...2080-Ti-Review/Game-Testing-Far-Cry-5-Wolfens
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Old video, using a Vega 64 LC that is also overclocked versus a standard 1080 Ti; that's a pathetic comparison, grasping at straws.
Now here are these games with the latest patches and drivers. These are the real comparisons: the 1080 Ti wipes the floor with Vega 64, and the 1080 is on par or slightly behind.

Just a bit of clarification: those benchmarks are not "current patches and drivers" for the older cards. As new cards come out, those sites don't go back and retest 100 different cards. So the Vega 64 and 1080 Ti numbers, for instance, are from when the game came out. The 2080/2080 Ti numbers are from when those cards came out, using a newer version of the game and newer drivers, obviously.

Not that I would expect there to be much of a difference, but it is something to be aware of.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
So the Vega 64 and 1080 Ti numbers, for instance, are from when the game came out. The 2080/2080 Ti numbers are from when those cards came out, using a newer version of the game and newer drivers, obviously.
Nope, these are all retested cards. ALL OF THEM. That's why most benchmarks have a short list of cards.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
The XFX should have no problem performing at silent levels; unfortunately, its Silent BIOS profile was replaced by one with mining-oriented settings (1150 MHz max clocks, better memory timings).

Nevertheless, I would still recommend that your nephew switch to the mining BIOS, then make two adjustments in Wattman: increase max clocks by 100-150 MHz and raise the target temperature from 65°C to 75°C.

Oh my God I forgot lol.

The XFX I got here actually caught on fire and burned his new Corsair 650W up. I had to buy the new PSU as his old one didn't have the right PCIe connector on it.

I ended up giving him my old 1060 FTW and some other PSU I had, figured Corsair wouldn't replace a PSU with literally melted wires.
 

Attachments

  • received_240655109985572.jpeg

coercitiv

Diamond Member
Jan 24, 2014
7,250
17,086
136
The XFX I got here actually caught on fire and burned his new Corsair 650W up. I had to buy the new PSU as his old one didn't have the right PCIe connector on it.

I ended up giving him my old 1060 FTW and some other PSU I had, figured Corsair wouldn't replace a PSU with literally melted wires.
The card caught on fire and you didn't use the warranty to get a new one in return?!

I have an XFX 580 8GB XXX, been running for over a year with no particular problem so far. It isn't noisy at all except during POST, although I did configure the fans in software.
 
  • Like
Reactions: DarthKyrie

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
The card caught on fire and you didn't use the warranty to get a new one in return?!

I have an XFX 580 8GB XXX, been running for over a year with no particular problem so far. It isn't noisy at all except during POST, although I did configure the fans in software.

They rejected the RMA, no real surprise.

It was noisy and not particularly fast anyway.
 

tajoh111

Senior member
Mar 28, 2005
346
388
136
AMD has reached the crossroads where it's picking compute over gaming, because it doesn't have the resources or the architecture to tackle both gaming and compute simultaneously with one architecture.

AMD made a gamble, hoping they could reduce investment in software development and R&D for the architecture because of the console market.

They hoped that, with developers coding for their architecture in consoles, they would have a natural advantage that would carry over to the discrete gaming market.

What they underestimated was the amount of work needed to bring console code to DirectX 12, and the bugginess of DirectX 12 made it unattractive to developers. There is no cheap DirectX 12 port from consoles, and most ports perform worse under DirectX 12 on both vendors' hardware.

Generally, the titles that perform best on AMD under DirectX 12 run poorly on Nvidia, which shows the weakness and closed nature of DirectX 12. DirectX 12 was not designed to be a mutually beneficial open system. It was a trojan horse by AMD, meant to let GCN flourish while Kepler floundered if Nvidia stayed on the same architecture. Luckily for Nvidia, DirectX 12 was problematic to code for in an open environment with a wide variety of hardware configurations, and it was limited to Windows 10 systems. This made DirectX 11 the preferable path for developers, where driver strength and the savings from writing less code allowed Nvidia to flourish thanks to a much stronger GPU driver team, which is reflected in their strong day-one performance.

At this point AMD is in an awkward place. DirectX 12 won't take off until it benefits Nvidia as much as AMD, because Nvidia represents the majority of the market. And AMD doesn't have the manpower to write the DirectX 11 drivers, which is why day-one performance tends to suffer, on top of development teams targeting Nvidia because of its market share.

AMD can shine with DirectX 11, as games like Call of Duty show, but they don't have the resources or market share to go head to head with Nvidia in this space using DirectX 11.

AMD has to make a decision whether to invest more in software and marketing and essentially bribe software developers to make GCN the lead platform in the PC market. It's too late for this, and AMD does not have the resources.

Less likely, but if AMD wants to save money on software development, it could collaborate fully with Nvidia on making a DirectX 13 that benefits both companies mutually, rather than mostly just AMD. This way, developers have an incentive to use the platform because it benefits the whole market, not just AMD.

Unfortunately for us, I think AMD is picking an exit strategy and intends to go full compute and abandon the PC discrete market to avoid getting squashed by Intel and Nvidia. Right now AMD appears to be following this road because they haven't given a damn about the gaming market with their hardware releases. This is reflected in the lack of gaming-focused cards and the current band-aid solution of bundling games rather than releasing new hardware.

I don't blame them. 7nm is such an expensive node that it doesn't leave room for three players to make money; there is not enough revenue to go around. Without mining, the discrete video card market might be worth 6 billion dollars for gaming annually. With the cost of designs and wafers and the fierce competition from Nvidia, continuing to compete in an increasingly expensive discrete market is a risk whose rewards are not worth it: a 6-billion-dollar pie where you need 40% market share to make money, versus a 50-billion-dollar pie (the CPU market) where you just need 15-20% to make money. Intel represents a dark horse at the moment, but I have a feeling they will land Microsoft's next console. I know there are some Ryzen/Navi rumors out there, but Intel, unlike Nvidia, can swallow their pride and give their products away to get a contract. With Apple moving away from Intel, Intel will have extra capacity at their fabs which they need to feed. In addition, with AMD closing in on Intel's CPU IPC advantage, Intel will really, really need their graphics, particularly in laptops, to keep their market share. Getting a console is too important strategically, and if I were Intel, I would be willing to spend billions to get it, because it not only raises the chances of a successful discrete launch, it weakens AMD and Nvidia, who are biting into Intel's markets. In addition, a console trojan horse from Intel has a strong chance to succeed, since the incentive is there for developers.

Unlike AMD, Intel has the market share and money to get developers to code for their system, on top of providing stronger software support through funds. If Intel is able to get something like a 4 TFLOPS part into their laptop and integrated/discrete graphics by 2020-2021, then with Intel's strong chip sales, developers will have a strong reason to code for Intel as the lead platform, since Intel represents 70% of the market overall.

https://www.extremetech.com/gaming/...pu-market-shifts-between-intel-amd-and-nvidia

That means, if they can land a console, it means very big things: it puts uncomfortable pressure on Nvidia and squeezes out AMD altogether. I also feel the integrated graphics market more closely mirrors the closed ecosystem of consoles in terms of hardware, meaning a low-level API is more likely to work in this market.
 
Last edited:

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
AMD has reached the crossroads where it's picking compute over gaming, because it doesn't have the resources or the architecture to tackle both gaming and compute simultaneously with one architecture.
This post is a big load of nonsense.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,574
10,211
126
AMD has to make a decision whether to invest more in software and marketing and essentially bribe software developers to make GCN the lead platform in the PC market.
Unlike AMD, Intel has the market share and money to get developers to code for their system, on top of providing stronger software support through funds.
Contradict yourself much?

AMD is bribing developers to code for their arch., but Intel, UNLIKE AMD, is providing "software support through funds" - aka "bribing" developers. So, is AMD "bribing" developers or not?

Last I knew, gpuopen.com was AMD's effort to open-source things, not bribe developers. Funny how you didn't mention that.

That means, if they [Intel] can land a console
I thought consoles were supposedly a low-margin biz. So low that Nvidia wasn't interested, and Intel, which lives and dies on high gross margins, would never in a thousand years touch a console design. Heck, even Iris Pro was too expensive, and only saw the light of day in a very expensive NUC unit. If Intel consoles cost $1000 like their NUC units, forget it.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Nope, these are all retested cards. ALL OF THEM. That's why most benchmarks have a short list of cards.

No, they aren't. We know for a fact that's not the case with AnandTech; they specifically say they don't retest cards. Out of all the ones you posted, there may be some that have the resources to test 25 cards every time a new card comes out, but that is certainly not the norm.
 

tajoh111

Senior member
Mar 28, 2005
346
388
136
Contradict yourself much?

AMD is bribing developers to code for their arch., but Intel, UNLIKE AMD, is providing "software support through funds" - aka "bribing" developers. So, is AMD "bribing" developers or not?

Last I knew, gpuopen.com was AMD's effort to open-source things, not bribe developers. Funny how you didn't mention that.


I thought consoles were supposedly a low-margin biz. So low that Nvidia wasn't interested, and Intel, which lives and dies on high gross margins, would never in a thousand years touch a console design. Heck, even Iris Pro was too expensive, and only saw the light of day in a very expensive NUC unit. If Intel consoles cost $1000 like their NUC units, forget it.

Not at all.

I used the word bribing because it's a direction developers would normally not take. It takes more incentive for developers to program a DirectX 12 path when it primarily benefits only AMD hardware (where AMD is not the market share leader), requires vastly more work than the DirectX 11 path since you're in charge of all hardware communication and thus have to write the driver-level code yourself, and carries a high likelihood that you still get things wrong in the end and end up with worse performance. This is on top of the resource problem of having to code the DirectX 12 path alongside the DirectX 11 path: either split teams up to write both code paths concurrently and delay your launch, or launch only a DirectX 12 path and sacrifice the install base that does not use Windows 10.

Coding for Intel as the lead platform, if Intel makes strong integrated graphics for once, opens up a gigantic install base. As the ExtremeTech article shows, they represent 70% of the market. Market share means a lot to developers, since it increases their potential revenue. Today Nvidia is the attractive option because of their market share and the ease of programming on the DirectX 11 platform. Intel might be able to pull developers away if they can produce integrated graphics as powerful as something like an Xbox One S, which is something developers can at least work with.

As a result, with a 70% install base for graphics, developers will want to focus their attention on the Intel platform because it opens up their potential sales vastly. That is, they might be willing to give up on DirectX 11 and throw Nvidia and AMD under the bus because of the potential install base. There's a big win in it for developers, unlike coding a pure DirectX 12 path for AMD. They will want to code to the metal because in 2020, 4 TFLOPS will be a bit underpowered. I also suspect it will be easier to code for an Intel chip as the lead platform when there are fewer variables between graphics and CPU; that is, there will be far fewer hardware configurations for a CPU/GPU combination, much like a console.

What I am talking about as far as support is sending teams to work with and help developers on a one-to-one level, something Nvidia already does extensively.

AMD mostly takes a hands-off approach, keeping the software open for use and freely available at the expense of providing the weakest support, e.g. AMD's open Linux drivers. This is a cost-saving measure, since development is mostly done by the community rather than AMD. Providing support in the form of staff and engineering help is not a bribe.

You don't think lowering margins is part of Intel's strategy? Think again.

https://appleinsider.com/articles/1...ly-subsidizing-cheap-x86-atom-android-tablets

Intel lost billions trying to make x86 mobile devices a thing. They were not only selling the chips at near cost, they were paying manufacturers 51 dollars per tablet to use their hardware. Considering some tablets were less than 100 dollars, this was a pure loss, not just a break-even move. A company willing to lose 7 billion dollars to increase its chances of success is one that would spend billions getting into the console market to prevent another Larrabee, which cost them 4 billion dollars.

Selling these chips at cost would be far cheaper than some of Intel's other contra-revenue schemes. Let's not forget the big one, where Intel paid Dell up to a billion dollars a year not to use AMD chips.

https://money.cnn.com/blogs/legalpad/2007/02/suit-intel-paid-dell-up-to-1-billion_15.html

Intel is by far the most ruthless company when it comes to being willing to give something up for a future strategic advantage. If any company is willing to give up margin or accept a loss to gain a competitive advantage, it is Intel.

If Intel gives these chips away at cost, which is relatively cheap compared to some of Intel's other contra-revenue schemes, at the very least Intel will ensure that their discrete chip does not repeat the failures of Larrabee, which makes it completely worth doing. Giving these chips away at cost to prevent another Larrabee failure is actually cheap considering the billions they are likely pouring into their next discrete graphics effort.

The NUC is a proof of concept that was never meant to translate into real volume; selling it at low cost has no strategic value since Intel is already dominating in mobile PCs and home theater computers. If Intel is able to get into consoles and, as a result, turn every 2020 or 2021 laptop/desktop with integrated graphics into a gaming PC, you have a tremendous advantage Intel can leverage over AMD's APUs, remove volume from Nvidia at the lower end, and have Intel's full discrete chips perform at a higher level because they were the lead system in development. That's why selling at cost is nothing. Fragmenting the market to remove revenue from AMD and Nvidia is in Intel's best interest. AMD is quickly catching up with Intel's CPUs, and Nvidia is encroaching on and taking away from Intel's data center market. Remove cash flow from your competitors and they will start to struggle (particularly at 5nm and beyond).

On top of this, with Apple moving away from Intel, Intel will have to fill the empty capacity of roughly 20 million PCs' worth of chips that used to go into Apple's computers annually. A console sounds like the right fit.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
5,151
5,537
136
With Apple moving away from Intel, Intel will have to fill the empty capacity of roughly 20 million PCs' worth of chips annually. A console sounds like the right fit.
Seeing that console graphics are coded at a lower level, do you really think it'll be easy to replace GCN optimizations? That sounds almost as hard as Nvidia using ARM to break into the market for console APUs. x86 and GCN have strong persistence due to their long-established presence. I'm not saying it can't happen, just that the barriers are very high, and as long as AMD does not get greedy, it should remain in use for quite a while.
 
  • Like
Reactions: DarthKyrie and Ajay

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
Correct, an entire ecosystem built around GCN and its evolutionary offspring + X86 CPUs
It's not even about their SIMD ISA.
AMD is nice and fluffy to deal with, and their semi-custom biz proved to be a well-oiled machine, which makes the likes of Sony 300% happier.
 
  • Like
Reactions: DarthKyrie