News Intel GPUs - more reviews coming in!


Kenmitch

Diamond Member
Oct 10, 1999
8,504
2,245
136
If Intel really wanted to break into the GPU card business, they could intentionally block mining on their cards. Even if they aren't technically as good as AMD's/Nvidia's, they would have much better real-world price/performance and availability because of that.

Then Intel would gain lots of mind/market share among gamers, and more reason for developers to optimize for Intel GPUs...
I don't think it would work out the way you're thinking. Internet fanboys are set in their ways, don't go for change, use the past to predict the future, constantly move goalposts, and trash the competition with any silly thing they can think of. Don't believe me? Just look at the many AMD-related threads in the forums.

Heavy compute and exceptional mining performance would be what it takes for Intel to maintain its margins, I'd imagine. Why go after the gamer? Data centers and mining farms are where the money is in the end.
 

jpiniero

Lifer
Oct 1, 2010
12,579
3,982
136
My understanding is that you won't see any of these discrete GPUs until 2020 at the earliest. Have to figure mining will be long dead by then, and they seem more interested in adding HPC/DL features to Core as opposed to the current Phi. Maybe the real target is ProViz, but that's kind of a small market.
 

jpiniero

Lifer
Oct 1, 2010
12,579
3,982
136
Still don't think it's going to end well unless they have a scalable design ready to go with Gen12.
 

Despoiler

Golden Member
Nov 10, 2007
1,965
769
136
Still don't think it's going to end well unless they have a scalable design ready to go with Gen12.
Hey man you don't think they are ready to rock the industry with their prototype scalable graphics shroud? I know I'm already expecting HUGE things now that they've confirmed it's a thing. /s
 

Kenmitch

Diamond Member
Oct 10, 1999
8,504
2,245
136
Jeez... You'd think they could have at least made a better-looking teaser image.

After viewing the image again... I get it, they're dreaming!
 

AtenRa

Lifer
Feb 2, 2009
13,832
3,068
136
I would really like to see how this goes. To be honest, I don't expect a lot from Intel in the GPU department, but never say never.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,574
126
I sure hope they fire a shot across the bow of NV/AMD and get a third competitor into the market.
 

SPBHM

Diamond Member
Sep 12, 2012
5,023
377
126
I'll probably be looking for a new card around 2020, so I'm hoping for the best. My first gaming experience with D3D was with the Intel 810, and I have many good memories of that one, so I'm fairly positive. They have improved massively since the GMA 950 days, and if they are serious about it and allocate the resources, I'm sure they can deliver something decent.
 

NTMBK

Diamond Member
Nov 14, 2011
9,976
4,365
136
I still think PowerVR could come back to the dedicated market, especially with their previous work in dedicated ray tracing IP blocks.
The Atom chips with PowerVR graphics apparently had terrible drivers for Windows. They'd need to significantly improve their software to compete.
 

beginner99

Diamond Member
Jun 2, 2009
5,083
1,459
136
The Atom chips with PowerVR graphics apparently had terrible drivers for Windows. They'd need to significantly improve their software to compete.
I owned such a first-gen netbook. It was barely usable in Windows. In Linux, under a now-extinct netbook OS (Jolicloud), with some tweaking it could almost play 720p MKV files, mostly fine with just some frame drops now and then. That made it much more usable.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
The Atom chips with PowerVR graphics apparently had terrible drivers for Windows. They'd need to significantly improve their software to compete.
And the drivers were nonexistent for the BSDs and Linux.
 

Despoiler

Golden Member
Nov 10, 2007
1,965
769
136
Ehh. AMD's marketing was never that good. I think AMD is better without their old guard.

Gregory Stoner left AMD for Intel also

https://twitter.com/angstroms/status/1056887553859706880?s=21

AFAICT he was a big developer/contributor to AMD's ROCm
He was. If you look at the ROCm GitHub repository, he is likely the one doing the commits and also interacting with people trying to get things working, reporting bugs, etc. This could be good for AMD, or just bad for Nvidia, if he takes the open-source, common-tools ecosystem mentality with him. It would be two companies vs. one on the software front.
 

Dribble

Golden Member
Aug 9, 2005
1,999
527
136
It's all relative: they are going to Intel, where they are getting a lot more than 25% extra money. In addition, it's a greenfield build: no existing stuff to support, no keeping Sony/MS happy. They get to build the GPU they've wanted to for years but couldn't, because AMD never had the money/manpower to do it.
 

EXCellR8

Diamond Member
Sep 1, 2010
3,894
792
136
First time seeing that teaser image... which kind of just looks like a Vega Frontier that they blacked out or de-branded.

2020 is practically a decade away in tech years, so we'll see it when we have it.
 
