Intel CEO confirms first dGPUs in 2020

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Mopetar

Diamond Member
Jan 31, 2011
7,843
5,999
136
I mean, AMD has been selling their "Game Crate" at Newegg, and they often get sold out. (Ryzen 5 1600 CPU, MSI RX 580 GPU, MSI AM4 mobo).

Those sell out precisely because you can't buy an AMD GPU at MSRP due to miners. I wouldn't even be surprised if the miners were buying the crates as well and trying to resell the CPU and board.
 

HutchinsonJC

Senior member
Apr 15, 2007
465
202
126
25% of a small pie is better than 0% of a big pie. No shareholder will complain about additional profit either. They only care about margins if your market isn't growing.

I guess I'm thinking of it from the other side of the table, too.

Few buyers will be interested in a third-rate product unless the price/performance ratio is good enough that they can accept the transaction for their purposes, or the power consumption relative to performance is more in line with what they're interested in.

The customers that Intel is losing... are largely going to Nvidia (for parallel compute). They aren't being lost in the same way to AMD. If they're not being lost to AMD the same way they're being lost to Nvidia, and Nvidia is getting top dollar for its sales, then that leads me to conclude that Intel has already lost any kind of low-end price-ratio attempt.
 

Mopetar

Diamond Member
Jan 31, 2011
7,843
5,999
136
My point is that Intel isn't going to be able to compete in PC graphics. The low-end of that market is being squeezed by APUs already and Intel is probably good enough for business users, so the kind of discrete card that Intel is capable of making won't have enough performance to compete against AMD or NVidia cards unless Intel takes incredibly tiny margins along with a low sales volume that results in very little profit.

The market for compute tasks (AI, deep learning, etc.) is likely to continue to grow and the margins here are already insane. Even if Intel is the third-rate solution that has to offer the best performance/price ratio because they can't compete on performance alone, there's still a decent amount of money to be made, simply because the existing margins are so good.

Honestly, if they made cards that just targeted miners, I think everyone would be happy. Intel would get to sell a lot of cards and gamers would be able to get GPUs at reasonable prices. It's probably a lot easier for Intel to build a card that doesn't worry quite as much about graphics (it can still do them) or about competing with AMD/NV when it can carve out its own niche and give them a large number of sales.
 

jpiniero

Lifer
Oct 1, 2010
14,618
5,227
136
The only way I could see this working is if Gen12 is a scalable design (i.e. EMIB)... and of course if 10nm is fixed.
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
I'll be happy if they use a Freesync-type feature (something within the DisplayPort or HDMI specs, etc., rather than anything G-Sync). I want more pressure on nV to give up on G-Sync and just work with monitor makers to give everyone the benefits of adaptive sync tech without having to be stuck inside or outside an ecosystem.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I agree that the big issue will be the drivers. It's hard to care about how fast the card is if it's unreliable when it comes to playing games. See Titan V. And that's for the fastest chip there is, which Intel won't deliver on the first round.
 

jpiniero

Lifer
Oct 1, 2010
14,618
5,227
136
I'll be happy if they use a Freesync-type feature (something within the DisplayPort or HDMI specs, etc., rather than anything G-Sync). I want more pressure on nV to give up on G-Sync and just work with monitor makers to give everyone the benefits of adaptive sync tech without having to be stuck inside or outside an ecosystem.

There were rumors that Intel added AdaptiveSync support to Gen10 but nothing ever came of it. Maybe they will add support for HDMI 2.1 VRR at some point.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
One thing we know is that Raja is in charge.

We have a lot of interviews with him laying out a vision for his graphics nirvana: extremely high-res, high-refresh-rate, low-power, HDR virtual reality indistinguishable from reality. He's still the same person, and he probably sees Intel as having the resources to achieve it vs. AMD, who starved him.

A reasonable assumption is that he sold this same vision to Intel. In that case we will see them aiming high.


But it's also the same Raja that focused entirely on gaining market share with mid-range/low-end products with Polaris.


Yeah, and if we consider that Intel is looking at 2020, I'm leaning toward thinking they may lose even more ground by then.



In another thread about AMD's Ryzen/Epyc, talking in terms of the big business world, where upper management kinda dictates the tried and true to you based on experience and reputations... you end up with a lot of IT folks having this "AMD is off limits" pushed on them.

Reputation, future compatibility, etc. being the big points there.

With that said, and with Intel's GPU coming in 2020, how many folks will jump off the Nvidia or AMD wagon to even try an Intel "1060"-level card? How many people will try something new unless it's priced very (emphasis) well? And we know how Intel can be with margins. Reviews need to be kinda glowing to get people to take interest, I think. Otherwise, I think there's a huge chunk of GPU buyers that will stick with what they know.

I'm kinda looking forward to the bits and pieces of "leaked" info or official info that will come out from now until then in regard to Intel's renewed push for graphics acceleration. Should be interesting, anyway.

Intel already is the "leader" in terms of graphics due to their IGPs; their tech is proven to a point. And I think the Epyc situation is different: enterprises are more conservative with that, it's a large investment, and AMD is coming from a very bad place...

Certainly Nvidia is the Intel of dGPUs right now, and it's not going to be easy to convince gamers not to buy GeForce and professionals not to buy Quadro and so on, if that's what Intel is going for.

But I would think Intel will be more successful with this than they were trying to take over mobile.

I agree that the big issue will be the drivers. It's hard to care about how fast the card is if it's unreliable when it comes to playing games. See Titan V. And that's for the fastest chip there is, which Intel won't deliver on the first round.

I haven't gamed on Intel IGPs for a while, but I was under the impression that their current stuff is not that problematic in terms of drivers and compatibility.
 

maddie

Diamond Member
Jul 18, 2010
4,746
4,687
136
But it's also the same Raja that focused entirely on gaining market share with mid-range/low-end products with Polaris.
Was it him or Lisa and others? He continually spoke about the extreme performance parts needed for his vision. I believe he simply got frustrated and finally realized that he wouldn't achieve what he personally wanted anytime soon.

The ironic thing is that Intel's margins are going to come under very serious attack starting now, and accelerating with Zen 2. They have a large cash stash, but with a high share price to maintain, they might start to cut the fat soon. I hope he didn't, as they say, jump from the frying pan into the fire.

Intel has used their extreme profits from server parts to fund many questionable ventures. Those days are over.
 

ksec

Senior member
Mar 5, 2010
420
117
116
That is, unless Intel decides to open up the GPU instead of working like a black box behind drivers.
 
  • Like
Reactions: Headfoot

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Really embracing open source on the driver side would be a super interesting play. Still, I doubt they will commit to it on account of the same IP concerns you hear from hardware companies.
 

HutchinsonJC

Senior member
Apr 15, 2007
465
202
126
Intel already is the "leader" in terms of graphics due to their IGPs, their tech is proven to a point

First, I disagree; I don't think most sane people would call Intel a leader in graphics. No one is looking at Intel to build a gaming rig. No one is looking at Intel for parallel compute in cars. People look at Intel as the cheap option to get by; it's good at that.

Their tech is prepacked with CPUs for buyers who may or may not even want the GPU portion.

Their tech is probably, by and large, used by businesses, where graphics is the last real thought or consideration, just making Office or a WebEx show up on two different screens.

Their tech doesn't work all that well with offloading a DirectX 8 or earlier game to a dedicated GPU, even if you have one installed. People have resorted to the likes of dgVoodoo as a wrapper to upconvert old DirectX games to newer DirectX APIs so the game runs on their dedicated GPU. Most people probably wouldn't realize the option exists, though it works well; I've tried it.

And as far as Intel's tech being proven, I think the antithesis of that statement is that Intel started using AMD graphics instead of their own and brought an ex-AMD graphics guy on board.
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
So, what's on your Intel video card wish list?

Personally, I'd want it to:

Be able to handle 4K resolutions at 120fps
Be under $500
Use less than 150W of power
Suck at cryptocurrency mining, so the miners don't jack up the prices on them :)
 
  • Like
Reactions: whm1974 and psolord

cytg111

Lifer
Mar 17, 2008
23,220
12,861
136
I can't make sense of this move.

Is 2020 the year when VR/AR growth goes exponential?
Dropping "Phi" in favor of dGPUs?
It is most certainly not IoT.

Making a huuuuge investment to gain entry to a declining market (unless you think crypto is going to run forever), the very market that the established players already there are doing everything they can to branch out of.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
First, I disagree; I don't think most sane people would call Intel a leader in graphics. No one is looking at Intel to build a gaming rig. No one is looking at Intel for parallel compute in cars. People look at Intel as the cheap option to get by; it's good at that.

Their tech is prepacked with CPUs for buyers who may or may not even want the GPU portion.

Their tech is probably, by and large, used by businesses, where graphics is the last real thought or consideration, just making Office or a WebEx show up on two different screens.

Their tech doesn't work all that well with offloading a DirectX 8 or earlier game to a dedicated GPU, even if you have one installed. People have resorted to the likes of dgVoodoo as a wrapper to upconvert old DirectX games to newer DirectX APIs so the game runs on their dedicated GPU. Most people probably wouldn't realize the option exists, though it works well; I've tried it.

And as far as Intel's tech being proven, I think the antithesis of that statement is that Intel started using AMD graphics instead of their own and brought an ex-AMD graphics guy on board.

Their IGP is more often enabled than disabled, and it's in almost every PC sold in the past 10 years (sure, because it comes with the CPU). It's in all sorts of devices, including higher-end laptops, some with some gaming ambition (with Iris Pro and so on), while their whole line is often used for casual gaming on cheaper laptops. The driver work for gaming is certainly a thing, and plenty of games also make the effort to support those IGPs. Outside of gaming they have a competent GPU; their video encode/decode block is probably as good as or better than Nvidia's. It can only get better...

The product that includes the AMD GPU on package now looks very temporary, since they plan to make their own discrete GPUs that could possibly be used in that manner (same package, separate die with fast RAM). Also, who knows when that project started and when they decided to make dGPUs... It was low volume, and it probably made sense when they couldn't make an IGP that performs like that (not even AMD can, with your typical dual-channel DDR4) and were not making dGPUs just for that product.
Well, perhaps that's the product that convinced Intel that they need dGPUs? Good job, Raja?

Older games with DX8 and below are often problematic for many reasons; I don't think that's a focus for any company right now.
 
Last edited:

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
I wonder if they'll take another shot at hardware-accelerated ray casting? I remember them working on it a while back, and predictions of when it would be viable hardware-wise are in the same time frame they're estimating.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
So, what's on your Intel video card wish list?

Personally, I'd want it to:

Be able to handle 4K resolutions at 120fps
Be under $500
Use less than 150W of power
Suck at cryptocurrency mining, so the miners don't jack up the prices on them :)

My Wishlist:

Support Freesync and generic VRR over HDMI 2.1
Better performance-per-watt and dollar than the competing Nvidia GX104
Driver suite with built in OCing, voltage control, fan curves, gameplay recording, frame rate cap, etc
Driver suite with built in Reshade/Sweetfx post processing effects like injecting AA and lighting

Dreamlist:
Comes to a licensing agreement with Nvidia to support G-Sync
 

ksec

Senior member
Mar 5, 2010
420
117
116
My Wishlist:

Open up whole-GPU access: a way to free the GPU from the driver black box, which is not sustainable in the long term. We can't rely on a driver team to tune for every AAA game. Nvidia, even before CUDA, had many more software engineers than hardware engineers. Intel is not likely to invest as much, and probably doesn't want to. This also means it would be easier to program in GPGPU. There is absolutely no way Intel can catch up to Nvidia's CUDA or drivers within a 2-3 year time frame, even with the resources of Intel. So I think that is a logical way of doing it.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Nothing about this thing matters, because it's going to absolutely suck IMO. It's going to be a weak integrated graphics replacement and probably go into OEM machines so Intel can claim "upgraded graphics" in their Best Buy specials without having to rely on AMD or Nvidia for the chips. That's it. This thing will absolutely blow.
 
  • Like
Reactions: VirtualLarry

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Nothing about this thing matters, because it's going to absolutely suck IMO. It's going to be a weak integrated graphics replacement and probably go into OEM machines so Intel can claim "upgraded graphics" in their Best Buy specials without having to rely on AMD or Nvidia for the chips. That's it. This thing will absolutely blow.
It only needs to be halfway decent, I think. Just needs to be better than the Intel IGPs.
Maybe GT1030/GTX1050 class but cheaper than NV?

People using an Intel IGP might more readily upgrade to an Intel DGPU.
The name recognition on the card might be worth something.

Plus more "Intel Inside" stickers... :D
 
  • Like
Reactions: moonbogg

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
It only needs to be halfway decent, I think. Just needs to be better than the Intel IGPs.
Maybe GT1030/GTX1050 class but cheaper than NV?

People using an Intel IGP might more readily upgrade to an Intel DGPU.
The name recognition on the card might be worth something.

Plus more "Intel Inside" stickers... :D
This is true. If it runs very cool (passive), then the pre-mades will gravitate towards adding it for the upsell.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
It only needs to be halfway decent, I think. Just needs to be better than the Intel IGPs.
Maybe GT1030/GTX1050 class but cheaper than NV?

People using an Intel IGP might more readily upgrade to an Intel DGPU.
The name recognition on the card might be worth something.

Plus more "Intel Inside" stickers... :D

Yeah, exactly. That's what I'm thinking: just an Intel-branded graphics upgrade. This will also make Intel appear more independent, because they can provide the major parts of the PC, like the CPU and GPU. No Nvidia or AMD stickers needed to sell those Best Buy "gaming" computers. This would actually have a big impact by preventing all that free marketing Nvidia and AMD get from having those stickers on all those cheap store-bought computers. It will just have Intel plastered all over it. Will it suck? Oh hell yeah. But I think you nailed it with the stickers comment. Intel is planning to make more Intel stickers. The GPU is just a side effect that happens to be necessary to accomplish that goal. It's not like their GPU will be able to play games or anything, lol. Let's not get carried away here.