News Intel GPUs - more reviews coming in!


Tup3x

Senior member
Dec 31, 2016
714
612
136
DG2 uses a different architecture, so how DG1 performs doesn't really matter. DG1 is essentially an integrated (laptop) GPU with a higher power budget, and it's not exactly a new thing any more.
 
  • Like
Reactions: Tlh97 and mikk

Glo.

Diamond Member
Apr 25, 2015
5,114
3,727
136
TUM_APISAK looks through publicly available benchmarks. There are only two where you'd actually see a 3070 beaten by a 6700 XT: Fire Strike and AoTS.

If the benchmark is the latter, the results won't even matter. Even on an N22/GA104 die, that "game" is CPU-bound at 4K.

If it's the former, Xe graphics scores well above where it actually stands. For example, the 96EU TGL often scores about 60% higher than the 4800U, despite the real-world difference between the two capping out at about a 30% lead for TGL.

Without knowing what the test is, I would not advise jumping to the conclusion that MLID is actually correct on DG2's performance just yet.
MLiD has Intel sources, and he reports what they are feeding him. Keep this in mind, guys, when bashing MLiD :p.
 
Feb 4, 2009
32,769
13,566
136
DG2 uses a different architecture, so how DG1 performs doesn't really matter. DG1 is essentially an integrated (laptop) GPU with a higher power budget, and it's not exactly a new thing any more.
Per what I saw in the video it appears to be a fairly competent card for what it is IMO.
 

guidryp

Platinum Member
Apr 3, 2006
2,663
3,535
136
Still doesn't change the fact that MLiD only reports what his "sources", for better or for worse, are feeding him
MLID is an inaccurate irritant who became the impetus for me to find a YT channel blocker, so I would never accidentally see his crap again.

Sad that some of you keep posting his BS like it matters.
 

Glo.

Diamond Member
Apr 25, 2015
5,114
3,727
136
MLID is an inaccurate irritant who became the impetus for me to find a YT channel blocker, so I would never accidentally see his crap again.

Sad that some of you keep posting his BS like it matters.
And your post is exactly what I was implying.

Don't attack the messenger, attack the message that he/she brings.

Why?

Your post brought absolutely zero to the discussion. Only you care about your opinion of him. Nobody else.
 
Feb 4, 2009
32,769
13,566
136
Then why did people from AMD, in effect, publicly thank him for RDNA2?

Given the timelines, he and his teams would have had to have been heavily involved. Also, if you listen to any of the podcasts with Jim Keller, he has positive things to say.

I think it was probably internal politics, with AMD's hierarchy being very much on the CPU side of the business and CPU-focused. There is no doubt that in the lead-up to Zen it was AMD's number one priority. Seems like he tried for a power play, it didn't work, and he moved on. Remember, within Intel he had already effectively been promoted.
First off, why the F does he get a card? How about getting some to consumers?
Second, he is a successful guy at a successful company; what is up with the crappy laminate countertop and backsplash?
I'll issue a pass on the textured wall, which I hate to do. Textured walls have no place in a break room and no place in a home kitchen. Very gauche.
Finally, and this is pure speculation, the white cabinet is far too bright for that wall and countertop. I do suspect it is a camera flash thing...

Above is mostly sarcasm
 

jpiniero

Lifer
Oct 1, 2010
12,580
3,982
136
First off, why the F does he get a card? How about getting some to consumers?
Second, he is a successful guy at a successful company; what is up with the crappy laminate countertop and backsplash?
I'll issue a pass on the textured wall, which I hate to do. Textured walls have no place in a break room and no place in a home kitchen. Very gauche.
Finally, and this is pure speculation, the white cabinet is far too bright for that wall and countertop. I do suspect it is a camera flash thing...

Above is mostly sarcasm
He's in the Bay Area, right? Probably can't afford a nice place.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,518
3,567
136
You know who has a better track record?

Believe me I know who I'd be willing to bet on more between the two.
Hmm, I expected a bit more from Tum_Apisak.

LPDDR4x? What are these, Tiger Lake SoCs with the CPU disabled?

I've seen more comprehensive tests of Tiger Lake vs. AMD APUs, and the AMD APU wins; I expect it would win here as well.

IMO it doesn't bode well for their GPU when AMD's old Vega 8 APUs are beating it.
Tiger Lake can be 20-30% faster in the GPU test, but not at low power levels.

I have a feeling the CPU is a sore spot for power efficiency, especially at the lower power levels, so it can't clock as high as Zen 2/3 mobile chips. The Zen 2/3 cores are very power efficient.

Also, specifically for the DG1, you can see it performs upwards of 50% faster than the mobile Iris Xe even with the EU deficit. Certainly the CPU and the dedicated memory can be a huge deal in some cases.
 
Last edited:
  • Like
Reactions: Tlh97

PingSpike

Lifer
Feb 25, 2004
21,706
532
126
Any DG1s in existence that don't require special OEM motherboards yet? If not then this product doesn't exist for me. I really don't care about a discrete card that only works in a crappy OEM box.
 

jpiniero

Lifer
Oct 1, 2010
12,580
3,982
136
Any DG1s in existence that don't require special OEM motherboards yet? If not then this product doesn't exist for me. I really don't care about a discrete card that only works in a crappy OEM box.
Intel isn't intending to sell DG1 at retail so it doesn't matter.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,554
507
126
Any DG1s in existence that don't require special OEM motherboards yet? If not then this product doesn't exist for me. I really don't care about a discrete card that only works in a crappy OEM box.
No, none of them have a BIOS on board; they rely on the motherboard to provide it.
 

mikk

Diamond Member
May 15, 2012
3,682
1,560
136
He seems to have a lot of sources lol:

According to a couple of sources
According to a couple of sources at Intel even told me
I have verified with a couple of people now
I'm hearing
One person told me not so bad....however another told me pretty bad

He doesn't sound trustworthy to me.
 

PingSpike

Lifer
Feb 25, 2004
21,706
532
126
No, none of them have a BIOS on board; they rely on the motherboard to provide it.
That's what I thought; I was hoping it would change, though. It's too bad, because it would be a nice QSV card for servers, but if it's locked to an OEM board it's basically still an iGPU.
 

Glo.

Diamond Member
Apr 25, 2015
5,114
3,727
136
The sad thing is that if they launched them NOW (like right now), they could STILL sell them for MSRP because you simply can't get a GeForce 3060 Ti for anywhere near retail price.
Rumors reported by both RGT and MLiD say that those GPUs will be priced very low. Paul said that midrange GPUs will be around $200-300 MSRP, while MLiD said that what he hears is $200-400, with a possible top-end SKU anywhere between $350 and $500 (kek...).

So the fact that those GPUs could be slower than the 3060 Ti would fit with the rumors about prices.

Even in Q1 2022 they will not be bad deals.
 

insertcarehere

Senior member
Jan 17, 2013
515
450
136
With the state of GPUs now, the top DG2 could be a dog that brings 3060 Ti performance at 3090 power; as long as the MSRP is reasonable, it will sell, and sell well.
 

Saylick

Platinum Member
Sep 10, 2012
2,169
3,848
136
With the state of GPUs now, the top DG2 could be a dog that brings 3060 Ti performance at 3090 power; as long as the MSRP is reasonable, it will sell, and sell well.
*deep sigh* This is just sad to think about, if true. Nvidia might as well re-release the good ol' FX5800 Ultra and it could probably sell like hot cakes in this market.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,518
3,567
136
GamersNexus has a review now.

The review of the Cyberpower system shows a board power of only 12W for the DG1. Even if we consider that to be chip power, the thing must be well under 30W.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,829
1,501
136
*deep sigh* This is just sad to think about, if true. Nvidia might as well re-release the good ol' FX5800 Ultra and it could probably sell like hot cakes in this market.
The really sad part is that, compared to modern GPUs, the FX5800 has a very modest TDP of 44W. Today, nobody bats an eyelid at 200-250W GPUs.

All the old hairdryer jokes and memes just seem quaint compared to what's happening today.
 
  • Like
Reactions: PingSpike
