News Intel GPUs - Intel launches A580


positivedoppler

Golden Member
Apr 30, 2012
1,103
171
106
Quickly? It took AMD forever to catch up. I get your point, though: they caught up with a single release. Do you see that happening with Intel?

Who knows, but AMD is not in a position to be caught with its pants down the way Intel was. Either way, I doubt they will be complacent.
 

positivedoppler

Golden Member
Apr 30, 2012
1,103
171
106
If you consider 5 1/2 years to be quick.

Yes, it took five years for Ryzen to reach the market, but right up to Ryzen's release most people here, and probably at Intel, were blindsided by how competitive it turned out to be. That's what I mean by overnight. Maybe Intel has had something in the works for years now. I hope AMD treats the next two years with a sense of urgency; that's roughly how long they have to not only keep pace with Nvidia but to greatly outpace Intel.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Intel is in a much better place to do well in GPUs than in their previous attempts. They've got a team that's been executing on a pretty decent iGPU over many iterations, and they've got two iterations of experience scaling up and building fabric for Phi, which is GPU-like in many respects (fast attached memory, fabric-heavy). If they dedicate the resources, they could do it. Not to mention their usual advantage: direct access to leading-edge process tech.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
None of those need a full x16 link (apart from some high-end video capture, which I would not count as consumer). I already mentioned storage, USB is built into the chipset, and Ethernet falls under the "miscellanea" that can be serviced by an x1 or x2 link.

USB in the chipset is implemented using PCIe internally. Oh, and Thunderbolt can use 8 lanes.
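
For context, here is a rough back-of-the-envelope look at the lane math being argued over. The PCIe 3.0 and interface rates below are the standard published figures; the script itself is just an illustrative sketch, not anything from the thread.

```python
import math

# Back-of-the-envelope link-bandwidth math (illustrative only).
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding,
# which works out to roughly 0.985 GB/s per lane in each direction.
PCIE3_GB_PER_LANE = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane

def gen3_lanes_for(gb_per_s):
    """Smallest PCIe 3.0 lane count that covers a given bandwidth."""
    return math.ceil(gb_per_s / PCIE3_GB_PER_LANE)

# Published interface rates (Gbit/s converted to GB/s where needed).
# Note: real Thunderbolt 3 controllers attach over a Gen3 x4 link, since
# part of the 40 Gbit/s budget carries DisplayPort rather than PCIe traffic.
interfaces = {
    "USB 3.1 Gen 2 (10 Gbit/s)": 10 / 8,
    "Thunderbolt 3 (40 Gbit/s raw)": 40 / 8,
    "NVMe SSD on a Gen3 x4 link": 4 * PCIE3_GB_PER_LANE,
    "dGPU on a full Gen3 x16 link": 16 * PCIE3_GB_PER_LANE,
}

for name, gb in interfaces.items():
    print(f"{name}: ~{gb:.2f} GB/s (>= x{gen3_lanes_for(gb)} Gen3 lanes)")
```

On those rough numbers, everything in the "miscellanea" bucket fits comfortably in one or two lanes; the only consumer device that actually saturates a wide link is the dGPU itself, which is the crux of the disagreement here.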
 
Mar 10, 2006
11,715
2,012
126
How many other consumer applications of it need a full x16 Gen3 link? Sound is in the chipset, WiFi is getting integrated into the chipset... Leave an x4 link for storage, a couple of x1 links for miscellanea, and you're done.

Look, bottom line is there's a big market for discrete GPUs, and if Intel is dumb enough to not include a full x16 link on its CPUs, then AMD will capitalize on that mistake and become the de facto gaming CPU vendor overnight.

Not gonna happen.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I welcome our new graphics card overlords.

When one company (AMD) keeps on slacking and can't offer much competition, it's good that someone sees an opportunity to try to take over some market share :shortcake:
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
They already do that; console semi-custom designs are merely beefy APUs.

No, they are not. Anything with Jaguar cores is not beefy by modern standards. The only reason consoles can get away with it is that they are built for a single purpose (gaming).

ARM is not going to do anything, even in the 15W market.

Pretty much every analyst agrees that the threat of switching to ARM is one of the key reasons Apple has been able to make Intel dance to its tune and design products custom-made for Apple's needs (and largely useless for other OEMs), most recently Kaby Lake G.

Remember that Apple only holds about 10% of the notebook market, and even less of the desktop market, so it's hardly market share that has made Intel so eager to please them instead of bigger OEMs such as Dell, Lenovo, or HP. It's because Apple is seen as a trendsetter by consumers, and if Apple does something, other OEMs will inevitably follow.

Riding the meme learning boom. It's a bubble that will eventually pop.

Spoken like a true Luddite.
 

senseamp

Lifer
Feb 5, 2006
35,783
6,187
126
Intel is just reacting to the compute market shifting from CPUs to GPUs. CPUs have hit a wall, meaning it's only a matter of time before AMD and others catch up and commoditize that market. GPUs, on the other hand, are just getting started. The problem is that gluing your CPU to a second-ranked competitor's GPU is not a GPU strategy. It's a tactic at the fringes, a Hail Mary pass.

Intel's bean counters don't have the stomach for a real GPU strategy. A real GPU strategy would mean that, after being wrong about GPUs for decades, Intel would have to believe they have the best vision for what GPUs will look like in five to ten years, and bet the company on that vision. Not dabble in GPUs until they need to cut costs to meet earnings expectations. Not play anticompetitive games to try to force customers to buy whatever crap GPU they make. It takes the right vision, commitment to it, and perseverance through thick and thin over many, many years of losses.

I don't think Intel has it in them. Once their CPUs come under competitive pressure (now) and GPUs start taking over the gravy train in the datacenter (soon), Intel management will start cost cutting to keep up short-term profits for Wall Street.
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
Intel is just reacting to the compute market shifting from CPUs to GPUs. CPUs have hit a wall, meaning it's only a matter of time before AMD and others catch up and commoditize that market. GPUs, on the other hand, are just getting started. The problem is that gluing your CPU to a second-ranked competitor's GPU is not a GPU strategy. It's a tactic at the fringes, a Hail Mary pass.

Intel's bean counters don't have the stomach for a real GPU strategy. A real GPU strategy would mean that, after being wrong about GPUs for decades, Intel would have to believe they have the best vision for what GPUs will look like in five to ten years, and bet the company on that vision. Not dabble in GPUs until they need to cut costs to meet earnings expectations. Not play anticompetitive games to try to force customers to buy whatever crap GPU they make. It takes the right vision, commitment to it, and perseverance through thick and thin over many, many years of losses.

I don't think Intel has it in them. Once their CPUs come under competitive pressure (now) and GPUs start taking over the gravy train in the datacenter (soon), Intel management will start cost cutting to keep up short-term profits for Wall Street.

I think your first part, about using Vega as a stop-gap of sorts, is correct. I think your other part might be a bit off. I'm sure they have a vision and a plan, but you can't just hire one person or throw money at a problem and expect results. They need experienced GPU engineers, and quite a few of them, to catch Nvidia. In the coming months, I imagine we will see Nvidia increase its R&D expenditure and perhaps branch out to Samsung's fabs more (if it makes sense). Not that Nvidia has been idle in the past few years (quite the contrary), but with AMD falling a generation behind in perf/W and performance, having Intel enter the race will help keep competition healthy and in check.
 

senseamp

Lifer
Feb 5, 2006
35,783
6,187
126
With Google, Facebook, DL startups, etc., top engineering talent in parallel computing is not exactly lining up to join Intel to work on yet another GPU attempt. GPUs will always play second fiddle to CPUs at Intel. Top talent doesn't like playing second fiddle.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
With Google, Facebook, DL startups, etc., top engineering talent in parallel computing is not exactly lining up to join Intel to work on yet another GPU attempt. GPUs will always play second fiddle to CPUs at Intel. Top talent doesn't like playing second fiddle.
As if the people writing distributed computing software are the same people doing hardcore electrical engineering and ASIC design... the overlap in that Venn diagram is small.
 

senseamp

Lifer
Feb 5, 2006
35,783
6,187
126
As if the people writing distributed computing software are the same people doing hardcore electrical engineering and ASIC design... the overlap in that Venn diagram is small.
These companies are all building their own ASICs.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
These companies are all building their own ASICs.
Small skunkworks teams are, at companies that employ thousands. I'm sure there's a tiny bit of overlap, but not enough to make any macroeconomic difference.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
...

Intel may have packaged up an EMIB commercial proof of concept using a Radeon dGPU chip, but you can bet that future integrated designs from Intel will be all Intel once they get their dGPU ready, and AMD will be packaging all-AMD solutions.

Where does that leave NVidia? Mostly on the outside without a CPU partner for laptops.

NVidia dGPUs will be pushed into niche hardcore gamer laptops that run a GTX 1070 or above.

So the real loser here is NVidia, not AMD.

Those parts now have code names:

Intel Preps Their Own Discrete GPUs For Gen 12 and 13 Codenamed Arctic Sound and Jupiter Sound – Will Be Featured in Post-Cannonlake Chips Replacing AMD’s dGPU Solutions
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
Man... Those are lame code names for a GPU. They need something like "Rampage" (an old 3dfx codename) or "Groove Chicken" (because it just sounds cool) :)
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
Yeah, you gotta figure that mining performance is a concern for any modern GPU at this point, even if the chip makers want their products to be used for gaming. Something for the “Project Groove Chicken” engineers to think about, anyway.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Yeah, you gotta figure that mining performance is a concern for any modern GPU at this point, even if the chip makers want their products to be used for gaming. Something for the “Project Groove Chicken” engineers to think about, anyway.

If Intel really wanted to break into the GPU card business, they could intentionally block mining on their cards; even if the cards aren't technically as good as AMD's or NVidia's, they would have much better real-world price/performance and availability because of it.

Then Intel would gain lots of mind share and market share among gamers, and give developers more reason to optimize for Intel GPUs...
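
For what it's worth, "blocking mining" would have to mean some kind of driver-side workload heuristic. The sketch below is entirely hypothetical: the counters, thresholds, and the idea that Intel would ship anything like it are assumptions for illustration, not anything Intel has announced.

```python
from dataclasses import dataclass

@dataclass
class KernelStats:
    working_set_gb: float       # size of the buffer the kernel keeps hammering
    dram_bw_utilization: float  # fraction of peak memory bandwidth sustained
    pcie_gb_per_min: float      # host<->device traffic while the kernel runs
    runtime_minutes: float      # how long the same kernel has been looping

def looks_like_mining(s: KernelStats) -> bool:
    """Ethash-style miners do random reads over a multi-GB DAG, saturate
    memory bandwidth, barely touch PCIe, and loop for hours on end."""
    return (s.working_set_gb > 2.0
            and s.dram_bw_utilization > 0.8
            and s.pcie_gb_per_min < 0.1
            and s.runtime_minutes > 30)

# A game workload, by contrast, streams assets over PCIe and finishes each frame in milliseconds.
print(looks_like_mining(KernelStats(3.0, 0.9, 0.01, 120.0)))   # True
print(looks_like_mining(KernelStats(0.5, 0.4, 2.0, 0.001)))    # False
```

Whether any vendor would actually ship something like that, and whether miners would simply work around it, is another question entirely.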