What can we expect from Intel's first foray into discrete graphics?

Ajay

Lifer
Jan 8, 2001
15,431
7,849
136
If I had to guess, pretty good hardware, lackluster drivers. Intel will need to put a lot into developer support, like NV does, to get anywhere.
 

Fir

Senior member
Jan 15, 2010
484
194
116
Maybe they'll bring Matrox back.
You know because 2D is soooo important!
Solitaire never moved smoother or faster when you win! ;)
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Maybe they'll bring Matrox back.
You know because 2D is soooo important!
Solitaire never moved smoother or faster when you win! ;)

Matrox had good 3D cards too. They just were not able to keep up with ATI during those years; nVidia was still playing catch-up at that time as well. Matrox then moved to the pro market. But Matrox still exists, and they still make add-on cards.
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
^ This. Intel may eventually have a place in the market but drivers are going to take a LONG time to mature and as we've seen with NVIDIA, software makes a huge difference. I have extremely low expectations.

They've been making integrated graphics drivers for a long time now. I'm hoping that experience will help them have stable, secure, and fast drivers out of the box or soon after.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,830
136
They've been making integrated graphics drivers for a long time now. I'm hoping that experience will help them have stable, secure, and fast drivers out of the box or soon after.

I know at least Intel's OpenCL drivers are reputed to be terrible. Not sure about the rest of the driver stack.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,830
136
Don't think this was posted yet.

https://www.anandtech.com/show/1428...y-tracing-acceleration-on-data-center-xe-gpus

Intel to Support Hardware Ray Tracing Acceleration on Data Center Xe GPUs

Huh. Okay, I'm gonna show my ignorance here: why would anyone want/need hardware raytracing support on a datacenter GPU? I might be able to muddle through the thought process halfway-competently, but surely someone here knows better than I. Does it have something to do with 8-bit integer performance?
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
Huh. Okay, I'm gonna show my ignorance here: why would anyone want/need hardware raytracing support on a datacenter GPU? I might be able to muddle through the thought process halfway-competently, but surely someone here knows better than I. Does it have something to do with 8-bit integer performance?
Movies and related stuff? CAD renderings?
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
I'm thinking it will be just like it is at the moment: for laptops, and beneath even AMD's offerings in performance per watt. But I am glad we get more competition, just as I am glad AMD is trying to compete in CPUs again. That way I can get better and cheaper Intel and NVIDIA products.
 

Krteq

Senior member
May 22, 2015
991
671
136
Don't think this was posted yet.

https://www.anandtech.com/show/1428...y-tracing-acceleration-on-data-center-xe-gpus

Intel to Support Hardware Ray Tracing Acceleration on Data Center Xe GPUs
Nah, it's only saying that Xe will support the Intel Rendering Framework APIs.

Full quote from press release:
Intel® architecture processors are the flexible, large memory capable, performance engines that drive the end-to-end creative process for visual effects and animated feature films. Today’s available GPUs have architecture challenges like memory size limitations and performance derived from years of honing for less sophisticated, “embarrassingly parallel” rasterized graphics use models. Studios continue to reach for maximum realism with complex physics processing for cloth, fluids, hair and more, plus modeling the physics of light with ray tracing. These algorithms benefit from mixed parallel and scalar computing while requiring ever growing memory footprints. The best solutions will include a holistic platform design where computational tasks are distributed to the most appropriate processing resources.

David Blythe’s recent blog provided initial insights into our exciting new Intel® Xe architecture currently under development. We are designing the Intel® Xe architecture as a cohesive acceleration companion to our continuing roadmap of Intel® Xeon® processors. As David closed his blog he mentioned, “We will look forward to sharing more details on the Intel® Xe architecture in the months ahead.” I’m pleased to share today that the Intel® Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel® Rendering Framework family of API’s and libraries.

Your existing investments in graphics and rendering solutions based on Intel® Rendering Framework open source products will seamlessly map to the exponential performance benefits of these flexible accelerated platforms. Further, ray tracing as a general computational technique for a variety of simulation computation beyond rendering is rapidly growing. To put it succinctly in my own words “Leave no transistor behind” by creating a holistic software and compute environment ready to maximize your workflow for exponential benefits.
No mention of DXR, etc.
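The press release's point about ray tracing benefiting from "mixed parallel and scalar computing" can be illustrated with a minimal ray-sphere intersection test (a hypothetical sketch, not Intel code): every ray is independent, so millions can be traced in parallel, but the per-ray math ends in scalar branches, which is the kind of work dedicated RT hardware accelerates.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest intersection of a ray with a
    sphere, or None if the ray misses. `direction` must be normalized."""
    # Vector from the sphere center to the ray origin
    oc = [o - c for o, c in zip(origin, center)]
    # Coefficients of the quadratic t^2 + 2*b*t + c = 0
    b = sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - c
    if disc < 0:                     # scalar branch: ray misses the sphere
        return None
    t = -b - math.sqrt(disc)         # nearest root
    return t if t > 0 else None      # ignore hits behind the ray origin

# A ray along +z from the origin hits a unit sphere centered at z=5
# at distance 4.0; a ray along +y misses it entirely.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
print(ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0))  # None
```

Rasterization, by contrast, streams triangles through a fixed pipeline with very uniform control flow, which is the "embarrassingly parallel" model the press release contrasts against.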
 

Furious_Styles

Senior member
Jan 17, 2019
492
228
116
For gamers I'd say nothing. I just hope it works well and eventually we get more competition. Screw the crypto card pricing, it sucks.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Hoping for: Intel to compete with AMD/NV in the current 2080 performance range, for probably the same price.

What I think will happen: Intel puts out something to cover the bottom tier, sell it as the new eSport king, and offer stupid bundles with Intel processors just to get it into every eSports category out there.

Either way, I got low hopes for their first card. But if it does well, it can paint a positive future. If it barely makes a splash, I'll just finish the NV tattoo on my arm. (this is a joke, relax!)
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
They've been making integrated graphics drivers for a long time now. I'm hoping that experience will help them have stable, secure, and fast drivers out of the box or soon after.

Their IGP drivers still suck. They are stable, but slow and featureless.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Their IGP drivers still suck. They are stable, but slow and featureless.

This is why I have very, very low expectations for discrete Intel GPUs in the short term. Long term, I'm sure Intel can be a competitor in the space (gaming, HPC, etc.) if they stick with it, but that's likely quite a ways out.
 

Head1985

Golden Member
Jul 8, 2014
1,864
688
136
Well, if Intel wants market share and to sell some cards to gamers, they will need, for at least the first two generations, something like the 4870 or Ryzen: not top performance, but close, and at half the price of the competition.