What can we expect from Intel's first foray into discrete graphics?

Page 2

Ajay

Diamond Member
Jan 8, 2001
5,240
240
136
#26
If I had to guess, pretty good hardware, lackluster drivers. Intel will need to put a lot into developer support, like NV does, to get anywhere.
 

Fir

Senior member
Jan 15, 2010
396
23
116
#28
Maybe they'll bring Matrox back.
You know because 2D is soooo important!
Solitaire never moved smoother or faster when you win! ;)
 

Stuka87

Diamond Member
Dec 10, 2010
4,211
165
126
#29
Maybe they'll bring Matrox back.
You know because 2D is soooo important!
Solitaire never moved smoother or faster when you win! ;)
Matrox had good 3D cards too; they just were not able to keep up with ATI during those years. nVidia was still playing catch-up at that time as well. Matrox then moved to the pro market. But Matrox still exists, and they still make add-on cards.
 
Jul 1, 2001
21,047
138
126
#30
^ This. Intel may eventually have a place in the market, but drivers are going to take a LONG time to mature, and as we've seen with NVIDIA, software makes a huge difference. I have extremely low expectations.
They've been making Integrated graphics drivers for a long time now. I'm hoping that experience will help them have stable, secure, and fast drivers out of the box or soon after.
 
Apr 27, 2000
12,356
1,318
126
#31
They've been making Integrated graphics drivers for a long time now. I'm hoping that experience will help them have stable, secure, and fast drivers out of the box or soon after.
I know at least Intel's OpenCL drivers are reputed to be terrible. Not sure about the rest of the driver stack.
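If you want to see what your own machine reports, here's a minimal sketch (assuming an OpenCL SDK and ICD loader are installed) that just lists each platform and the driver version string it advertises:

```c
/* Minimal sketch: list OpenCL platforms and their version strings.
 * Assumes an OpenCL SDK and ICD loader are installed.
 * Build (Linux): cc probe.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint count = 0;

    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        fprintf(stderr, "no OpenCL platforms found\n");
        return 1;
    }
    if (count > 8)
        count = 8; /* we only allocated room for 8 */

    for (cl_uint i = 0; i < count; i++) {
        char name[256], version[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof name, name, NULL);
        clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION,
                          sizeof version, version, NULL);
        printf("platform %u: %s (%s)\n", i, name, version);
    }
    return 0;
}
```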
 

Insomniator

Diamond Member
Oct 23, 2002
6,260
11
106
#32
It's going to be delayed, slow, buggy, and expensive.
 
Apr 27, 2000
12,356
1,318
126
#34
Don't think this was posted yet.

https://www.anandtech.com/show/1428...y-tracing-acceleration-on-data-center-xe-gpus

Intel to Support Hardware Ray Tracing Acceleration on Data Center Xe GPUs
Huh. Okay, I'm gonna show my ignorance here: why would anyone want/need hardware raytracing support on a datacenter GPU? I might be able to muddle through the thought process halfway-competently, but surely someone here knows better than I. Does it have something to do with 8-bit integer performance?
 

maddie

Platinum Member
Jul 18, 2010
2,690
630
136
#35
Huh. Okay, I'm gonna show my ignorance here: why would anyone want/need hardware raytracing support on a datacenter GPU? I might be able to muddle through the thought process halfway-competently, but surely someone here knows better than I. Does it have something to do with 8-bit integer performance?
Movies and related stuff? CAD renderings?
 
Apr 27, 2000
12,356
1,318
126
#36
Movies and related stuff? CAD renderings?
Maybe. I generally associate "data center GPU" with AI or HPC tasks, since a lot of serious rendering work still takes place on CPU clusters.
 

Innokentij

Senior member
Jan 14, 2014
237
17
81
#37
I'm thinking it will be just like it is at the moment: for laptops, and beneath even AMD's offerings in performance per watt. But I'm glad we're getting more competition, just as I'm glad AMD is trying to compete in CPUs again. That way I can get better and cheaper Intel and NVIDIA products.
 

Krteq

Senior member
May 22, 2015
744
62
136
#38
Don't think this was posted yet.

https://www.anandtech.com/show/1428...y-tracing-acceleration-on-data-center-xe-gpus

Intel to Support Hardware Ray Tracing Acceleration on Data Center Xe GPUs
Nah, it's only saying that Xe will be supported by the Intel Rendering Framework APIs.

Full quote from press release:
Intel® architecture processors are the flexible, large memory capable, performance engines that drive the end-to-end creative process for visual effects and animated feature films. Today’s available GPUs have architecture challenges like memory size limitations and performance derived from years of honing for less sophisticated, “embarrassingly parallel” rasterized graphics use models. Studios continue to reach for maximum realism with complex physics processing for cloth, fluids, hair and more, plus modeling the physics of light with ray tracing. These algorithms benefit from mixed parallel and scalar computing while requiring ever growing memory footprints. The best solutions will include a holistic platform design where computational tasks are distributed to the most appropriate processing resources.

David Blythe’s recent blog provided initial insights into our exciting new Intel® Xe architecture currently under development. We are designing the Intel® Xe architecture as a cohesive acceleration companion to our continuing roadmap of Intel® Xeon® processors. As David closed his blog he mentioned, “We will look forward to sharing more details on the Intel® Xe architecture in the months ahead.” I’m pleased to share today that the Intel® Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel® Rendering Framework family of API’s and libraries.

Your existing investments in graphics and rendering solutions based on Intel® Rendering Framework open source products will seamlessly map to the exponential performance benefits of these flexible accelerated platforms. Further, ray tracing as a general computational technique for a variety of simulation computation beyond rendering is rapidly growing. To put it succinctly in my own words “Leave no transistor behind” by creating a holistic software and compute environment ready to maximize your workflow for exponential benefits.
No mention of DXR, etc.
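For context, the Intel Rendering Framework the release refers to is built around CPU-side libraries such as Embree, Intel's ray tracing kernel library. Here's a minimal sketch of what code targeting it looks like today (assuming Embree 3 is installed; the single-triangle scene is hypothetical), the kind of "existing investment" the release says will map onto accelerated Xe hardware:

```c
/* Minimal Embree 3 sketch: intersect one ray with one triangle on the CPU.
 * Build: cc tri.c -lembree3 */
#include <embree3/rtcore.h>
#include <stdio.h>

int main(void) {
    RTCDevice device = rtcNewDevice(NULL);  /* default config */
    RTCScene scene = rtcNewScene(device);

    /* One triangle in the z = 1 plane. */
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* v = (float*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3,
        3 * sizeof(float), 3);
    v[0] = -1; v[1] = -1; v[2] = 1;
    v[3] =  1; v[4] = -1; v[5] = 1;
    v[6] =  0; v[7] =  1; v[8] = 1;
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3,
        3 * sizeof(unsigned), 1);
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);  /* builds the acceleration structure */

    /* Shoot one ray from the origin along +z. */
    struct RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    struct RTCRayHit rh;
    rh.ray.org_x = 0; rh.ray.org_y = 0; rh.ray.org_z = 0;
    rh.ray.dir_x = 0; rh.ray.dir_y = 0; rh.ray.dir_z = 1;
    rh.ray.tnear = 0.0f; rh.ray.tfar = 100.0f; rh.ray.time = 0.0f;
    rh.ray.mask = 0xFFFFFFFFu; rh.ray.flags = 0;
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;
    rtcIntersect1(scene, &ctx, &rh);

    if (rh.hit.geomID != RTC_INVALID_GEOMETRY_ID)
        printf("hit at t = %f\n", rh.ray.tfar);  /* tfar holds hit distance */
    else
        printf("miss\n");

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return 0;
}
```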
 

Dribble

Golden Member
Aug 9, 2005
1,699
126
126
#39
What it suggests is that their primary market is pro users, not gamers.
 
Jan 17, 2019
138
28
51
#40
For gamers, I'd say nothing. I just hope it works well and that we eventually get more competition. Screw the crypto card pricing; it sucks.
 

railven

Diamond Member
Mar 25, 2010
6,534
230
126
#41
Hoping for: Intel to compete with AMD/NV in the current 2080 performance range, for probably the same price.

What I think will happen: Intel puts out something to cover the bottom tier, sells it as the new eSports king, and offers stupid bundles with Intel processors just to get it into every eSports category out there.

Either way, I've got low hopes for their first card. But if it does well, it can paint a positive future. If it barely makes a splash, I'll just finish the NV tattoo on my arm. (This is a joke, relax!)
 

jpiniero

Diamond Member
Oct 1, 2010
6,496
295
126
#42
If it's fabbed at TSMC or Samsung, the compute product has a decent shot.
 
Mar 13, 2006
10,129
130
126
#43
They've been making Integrated graphics drivers for a long time now. I'm hoping that experience will help them have stable, secure, and fast drivers out of the box or soon after.
Their IGP drivers still suck. They are stable, but slow and featureless.
 

ozzy702

Senior member
Nov 1, 2011
980
203
136
#44
Their IGP drivers still suck. They are stable, but slow and featureless.
This is why I have very, very low expectations for discrete Intel GPUs in the short term. Long term, I'm sure Intel can be a competitor in the space (gaming, HPC, etc.) if they stick with it, but that's likely quite a ways out.
 

Head1985

Golden Member
Jul 8, 2014
1,701
183
136
#45
Well, if Intel wants market share and wants to sell some cards to gamers, they will need, for at least the first two generations, something like the 4870 or Ryzen: not top performance, but close, and at half the price of the competition.
 

