
News Intel GPUs - we've given up on B770, where's Celestial already?

If you consider 5 1/2 years to be quick.

Yes, it took 5 years for Ryzen to reach the market, but right up to Ryzen's release it seems most people here, and probably at Intel, were blindsided by how competitive it turned out to be. That's what I mean by overnight. Maybe Intel has had something in the works for years now. I hope AMD treats the next 2 years with a sense of urgency, i.e., 2 years is the amount of time they have not only to keep pace with Nvidia but to greatly outpace Intel.
 
Intel is in a much better place to do well in GPUs than in their last attempts. They've got a team that's been executing on a pretty decent iGPU over many iterations. They've got 2 iterations of scaling up and building fabric for Phi, which is GPU-like in many respects (fast attached memory, fabric-heavy). If they dedicate the resources, they could do it. Not to mention their usual advantages, like direct access to leading-edge process tech.
 
None of those need a full x16 (apart from some high-end video capture, which I would not count as consumer). I already mentioned storage, USB is built into the chipset, and Ethernet falls under the "miscellanea" that can be serviced with an x1 or x2 link.

USB in the chipset is implemented using PCIe internally. Oh, and Thunderbolt can use 8 lanes.
 
How many other consumer applications need a full x16 Gen3 link? Sound is in the chipset, WiFi is getting integrated into the chipset... Leave an x4 link for storage and a couple of x1 links for miscellanea, and you're done.
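For a rough sense of what that lane split actually buys, a PCIe 3.0 lane moves about 0.985 GB/s per direction (8 GT/s with 128b/130b encoding). A back-of-the-envelope sketch (generic Gen3 math, not tied to any specific platform):

```python
# Approximate PCIe 3.0 bandwidth per link width.
# Gen3: 8 GT/s per lane with 128b/130b encoding -> ~0.985 GB/s each way.
GEN3_GBPS_PER_LANE = 8 * (128 / 130) / 8  # GB/s per lane, one direction

def link_bandwidth(lanes: int) -> float:
    """Approximate one-direction bandwidth in GB/s for a Gen3 link."""
    return lanes * GEN3_GBPS_PER_LANE

for lanes in (16, 4, 1):
    print(f"x{lanes:<2} Gen3 ~ {link_bandwidth(lanes):.2f} GB/s per direction")
```

So a full x16 slot offers roughly 15.75 GB/s each way, an x4 storage link about 3.9 GB/s, and an x1 link just under 1 GB/s, which is why an x1 or x2 link is plenty for NICs and other "miscellanea."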

Look, bottom line is there's a big market for discrete GPUs, and if Intel is dumb enough to not include a full x16 link on its CPUs, then AMD will capitalize on that mistake and become the de facto gaming CPU vendor overnight.

Not gonna happen.
 
They already do that, console semi-custom designs are merely beefy APUs.

No they are not. Anything with Jaguar cores is not beefy by modern standards. The only reason why consoles can get away with it is because they are built for a singular purpose (gaming).

ARM is not going to do anything, even in the 15W market.

Pretty much every analyst agrees that the threat of switching to ARM is one of the key reasons why Apple has been able to make Intel dance to their tune, and design products that are custom made for their needs (and largely useless for other OEMs). Most recently with Kaby Lake G.

Remember that Apple only sits on about 10% of the notebook market, and even less than that for the desktop market, so it's hardly because of market share that Intel has been so eager to please them (instead of bigger OEMs such as Dell, Lenovo or HP). Instead it's because Apple is seen as a trendsetter by consumers, and if they do something then other OEMs will inevitably follow.

Riding the meme learning boom. It's a bubble that will eventually pop.

Spoken like a true Luddite.
 
Intel is just reacting to the compute market shifting from CPUs to GPUs. CPUs have hit a wall, meaning it's only a matter of time before AMD and others catch up and commoditize that market. GPUs, on the other hand, are just getting started. The problem is that gluing your CPU to a second-ranked competitor's GPU is not a GPU strategy. It's a tactic at the fringes, a Hail Mary pass.

Intel's bean counters don't have the stomach for a real GPU strategy. A real GPU strategy would mean that, after being wrong about GPUs for decades, Intel would have to believe they have the best vision for what GPUs will look like in five to ten years, and bet the company on that vision. Not dabble in GPUs until they need to cut costs to meet earnings expectations. Not play anticompetitive games to try to force customers to buy whatever crap GPU they make. Have the right vision, commitment to it, and perseverance through thick and thin over many, many years of losses. I don't think Intel has it in them. Once their CPUs come under competitive pressure (now) and GPUs start taking over the gravy train in the datacenter (soon), Intel management will start cost-cutting to keep up short-term profits for Wall Street.
 

I think your first part, about using Vega as a stop-gap of sorts, is correct. I think your other part might be a bit off. I'm sure they have a vision and plan, but you can't just hire one person or throw money at a problem and expect results. They need experienced GPU engineers, and quite a few of them, to catch Nvidia. In the coming months, I imagine we will see Nvidia increase its R&D expenditure and perhaps branch out to Samsung's fabs more (if it makes sense). Not that Nvidia has been idle in the past few years (quite the contrary), but with AMD falling a generation behind in perf/W and performance, having Intel enter the race will help keep competition healthy and in check.
 
With Google, Facebook, DL startups, etc, top engineering talent in parallel computing is not exactly lining up to join Intel to work on yet another GPU attempt. GPUs will always play second fiddle to CPUs at Intel. Top talent doesn't like playing second fiddle.
 
As if the people writing distributed computing software are the same people doing hardcore electrical engineering and ASIC design... the Venn diagram overlap there is small.
 
...

Intel may have packaged up an EMIB commercial proof of concept using a Radeon dGPU chip, but you can bet that future integrated designs from Intel will be all Intel once they get their dGPU ready, and AMD will be packaging all-AMD solutions.

Where does that leave NVidia? Mostly on the outside without a CPU partner for laptops.

NVidia dGPUs will be pushed to the niche of hardcore gamer laptops that run a GTX 1070 or better.

So the real loser here is NVidia, not AMD.

Those parts now have code names:

Intel Preps Their Own Discrete GPUs For Gen 12 and 13 Codenamed Arctic Sound and Jupiter Sound – Will Be Featured in Post-Cannonlake Chips Replacing AMD’s dGPU Solutions
 
Man... Those are lame code names for a GPU. They need something like "Rampage" (an old 3dfx codename) or "Groove Chicken" (because it just sounds cool) 🙂
 
Yeah, you gotta figure that mining performance is a concern for any modern GPU at this point, even if the chip makers want their products to be used for gaming. Something for the “Project Groove Chicken” engineers to think about, anyway.
 

If Intel really wanted to break into the GPU card business, they could intentionally block mining on their cards, and even if they aren't technically as good as AMD's/Nvidia's, they would have much better real-world price/performance and availability because of that.

Then Intel would gain lots of mind/market share among gamers, and more reason for developers to optimize for Intel GPUs...
 