News Intel GPUs - Intel launches A580


Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
It's not about wanting anything; it's that many people don't have confidence in Intel's driver skills and expect them to resort to tricks to get higher fps, as they have done in the past.

My only experience with Intel on systems I personally used (not something I got for someone else or related to my work) was an i5-2500K and a Z3735F Bay Trail Windows tablet, and both were very positive experiences, including both of those iGPUs, which I used for gaming from time to time. Outside of that I have always used AMD systems (a K6-266 was my first), and some people say I like to complain when I point out bad things about AMD :confused_old:
 
  • Like
Reactions: Tlh97 and Leeea

Tuxon86

Junior Member
Jan 13, 2010
6
9
81
I want Intel to succeed. I want a GPU that can deliver mid-to-high 3D performance with a good amount of fast VRAM, and that is Quick Sync enabled; I'll leave the ultra-performance tier to NVidia and AMD at first.

I'd buy a mid-range Intel GPU at MSRP, or a bit over, vs an NVidia/AMD card at present mining prices.
 
Feb 4, 2009
34,553
15,766
136
I want Intel to succeed. I want a GPU that can deliver mid-to-high 3D performance with a good amount of fast VRAM, and that is Quick Sync enabled; I'll leave the ultra-performance tier to NVidia and AMD at first.

I'd buy a mid-range Intel GPU at MSRP, or a bit over, vs an NVidia/AMD card at present mining prices.

This guy gets it
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
It is not that people want Intel to fail; rather, people hold cynical views toward Intel and are just not ready to accept the hype, smoke, mirrors, and "promises" at face value any more.

Especially since we all know whom this hype is coming from. Poor Volta...

And like all such projects, it will be rushed in the end. I can guarantee they will have lots of issues with older or unpopular games, which would be a problem for me.
 
  • Like
Reactions: Tlh97 and Leeea

Ajay

Lifer
Jan 8, 2001
15,429
7,849
136
Lack of confidence in Raja and Intel's driver team doesn't really mean that posters want them to fail in the end.

It's a big hill to climb is all.
I don't want them to fail, but there is certainly a lack of confidence in Raja and their driver team. At best, it's going to be a learning curve which could easily take several product cycles. At worst, they auger in and give up as they have done before. As some have pointed out, their best business opportunity for now would be putting out GPUs with very high hashrates (mining) - that would get them off the ground fast and give Intel time to play catch up.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126
their best business opportunity for now would be putting out GPUs with very high hashrates (mining)
Hey, why not? That would probably play well to the HPC crowd too, that is, if they could get any after the miners grabbed theirs.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,764
3,131
136
Well....Was it AMD holding Raja back or was it Raja holding AMD back?

Viewing the recent evidence, it looks like Raja was the guilty party; viewing his postings, it looks like the zebra hasn't changed its stripes.
Then why did people from AMD, in effect, publicly thank him for RDNA2?

Given the timelines, he and his teams would have had to be heavily involved; also, if you listen to any of the podcasts with Jim Keller, he has positive things to say.

I think it was probably internal politics and AMD's hierarchy being very much on the CPU side of the business and CPU-focused. There is no doubt that in the lead-up to Zen it was AMD's number 1 priority. It seems like he tried a power play, it didn't work, and he moved on. Remember, within Intel he has already been effectively promoted.
 
  • Like
Reactions: lightmanek

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Continuing on what @itsmydamnation has said;

HardOCP covered how there was a struggle between Raja's team and AMD's CPU team. There was also the more fanciful claim that Raja was a big fan of Intel and had always wanted to work there.

You cannot blame the future success or failure of Xe dGPUs solely on Raja, because the Xe architecture had been in the works well before he arrived. Maybe he changed course on a few things (the rumor being that he convinced them to bring dGPUs to market), but the overall architecture, which determines power efficiency and performance?

You can see Xe is very much an Intel Gen GPU.

Also, if you think he affected Xe that much, then we have to accept that Raja contributed to RDNA2, and possibly to parts as far in the future as RDNA4/5. I certainly wouldn't be surprised if RDNA2 has lots of Raja's work in it.

I also don't think what I'm saying steers Xe dGPUs toward success either. Intel has other problems. They've had driver issues for a while (even though it's better now), and their hardware was weak, not just in performance (a big contributor being that it was an iGPU) but in compatibility as well.

Intel has problems with internal politics and infighting ten times worse than anything AMD ever had with Raja and Lisa Su.
 
  • Like
Reactions: lightmanek

psolord

Golden Member
Sep 16, 2009
1,911
1,192
136
Then why did people from AMD, in effect, publicly thank him for RDNA2?

Given the timelines, he and his teams would have had to be heavily involved; also, if you listen to any of the podcasts with Jim Keller, he has positive things to say.

I think it was probably internal politics and AMD's hierarchy being very much on the CPU side of the business and CPU-focused. There is no doubt that in the lead-up to Zen it was AMD's number 1 priority. It seems like he tried a power play, it didn't work, and he moved on. Remember, within Intel he has already been effectively promoted.

The cheapos didn't even send him a 6900XT? Heck not even a 6800XT? pff xD
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Yeah, the whole concept of AGP using main memory was bad, BUT AGP allowed graphics chips to be placed outside the 133MB/s shared bandwidth of the PCI bus; that alone was a huge win.

Indeed.

I even remember a few specialised video editing cards using it to get data in quickly from the source.

Edit: found it. ATi All-in-Wonder Radeon 8500DV.
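The bandwidth gap described above can be sketched with some back-of-the-envelope arithmetic. A minimal illustration; the `bus_bandwidth_mb_s` helper is my own, and the clock/width figures are approximations of the classic PCI and AGP specs:

```python
# Rough peak-bandwidth comparison of shared PCI vs. dedicated AGP (illustrative figures).
def bus_bandwidth_mb_s(clock_mhz: float, bytes_per_transfer: int,
                       transfers_per_clock: int = 1) -> float:
    """Peak bandwidth in MB/s: clock rate x bus width x transfers per clock."""
    return clock_mhz * bytes_per_transfer * transfers_per_clock

pci = bus_bandwidth_mb_s(33.33, 4)        # 32-bit PCI @ 33 MHz: ~133 MB/s, shared by all devices
agp_1x = bus_bandwidth_mb_s(66.66, 4)     # AGP 1x: ~266 MB/s, dedicated to the graphics card
agp_4x = bus_bandwidth_mb_s(66.66, 4, 4)  # AGP 4x: ~1066 MB/s

print(f"PCI: {pci:.0f} MB/s, AGP 1x: {agp_1x:.0f} MB/s, AGP 4x: {agp_4x:.0f} MB/s")
```

Even at 1x, the graphics card got double the PCI bus's entire bandwidth, and it didn't have to share it with disk controllers and sound cards.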
 
Last edited:
  • Like
Reactions: Tlh97 and Leeea

Tuxon86

Junior Member
Jan 13, 2010
6
9
81
Indeed.

I even remember a few specialised video editing cards using it to get data in quickly from the source.

Edit: found it. ATi All-in-Wonder Radeon 8500DV.

I had one of those! It was the start of my favorite hobby: HTPC, video capture/ripping/conversion... It cost me $899 CDN back in the day, which was way more than a good gaming card was worth. But it was a dead-end product, and I went with a dedicated capture card, an Osprey-210, after that. Combined with VirtualDub and a small shareware program to schedule recordings, it was really the best for SD capture. I replaced that with a Hauppauge Colossus when they became available.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Not that hard to understand. It means the cut-down 448EU version is ~5% away from the RTX 3070, and the 128EU version is 14% faster than the GTX 1650.

Update: That's pretty much linear scaling between the 128EU version and the 448EU version.

Then I can't see why the full 512EU version wouldn't perform like the 3070 Ti.

MLiD is right on Intel leaks again, then.
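The linear-scaling argument above amounts to a simple extrapolation. A minimal sketch, assuming perfectly linear EU scaling; the `estimate_score` helper is my own, and the relative score is a placeholder normalized so that a result ~5% behind an RTX 3070 (= 100) lands at 95:

```python
# Back-of-the-envelope: if performance scales linearly with EU count,
# extrapolate the full 512EU part from the cut-down 448EU result.
def estimate_score(known_eus: int, known_score: float, target_eus: int) -> float:
    """Linear extrapolation: per-EU score times the target EU count."""
    return known_score * (target_eus / known_eus)

score_448 = 95.0  # hypothetical relative score (RTX 3070 = 100)
score_512 = estimate_score(448, score_448, 512)

print(f"Estimated 512EU relative score: {score_512:.1f}")
```

In practice GPUs rarely scale perfectly linearly (bandwidth and clocks intervene), so this is an upper bound on what adding EUs buys you.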
 
Last edited:

uzzi38

Platinum Member
Oct 16, 2019
2,622
5,879
146
Not that hard to understand. It means the cut-down 448EU version is ~5% away from the RTX 3070, and the 128EU version is 14% faster than the GTX 1650.

Update: That's pretty much linear scaling between the 128EU version and the 448EU version.

Then I can't see why the full 512EU version wouldn't perform like the 3070 Ti.

MLiD is right on Intel leaks again, then.

TUM_APISAK looks through publicly available benchmarks. There are only two where you'd actually see a 3070 beaten by a 6700XT: Fire Strike and AoTS.

If the benchmark is the latter, then the results won't even matter; even on an N22/GA104 die, that "game" is CPU-bound at 4K.

If it's the former, Xe graphics scores well above where it actually stands. For example, 96EU TGL often scores about 60% over the 4800U, despite the real-world difference between the two capping out at about a 30% lead for TGL.

Without knowing what the test is, I would not advise jumping to the conclusion that MLID is actually correct about DG2's performance just yet.
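The Fire Strike caveat above (a ~60% synthetic lead vs a ~30% real-world lead) can be put as an overstatement factor. A small illustrative sketch; the `overstatement` helper and the ratio formulation are my own:

```python
# How much a synthetic benchmark inflates a real-world lead:
# ratio of the two performance ratios, (1 + synthetic lead) / (1 + real lead).
def overstatement(synthetic_lead: float, real_lead: float) -> float:
    return (1 + synthetic_lead) / (1 + real_lead)

# TGL 96EU vs 4800U, per the post: ~60% Fire Strike lead, ~30% real-world lead.
factor = overstatement(0.60, 0.30)
print(f"Fire Strike overstates the lead by a factor of ~{factor:.2f}")
```

By that measure, a Fire Strike result would flatter an Xe part by roughly 20%+ relative to games, which is exactly why a single leaked score proves little.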
 
  • Like
Reactions: Tlh97

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
Intel DG1 retail system review:

LPDDR4x? What are these, Tiger Lake SoCs with the CPU disabled?

I've seen more comprehensive tests of Tiger Lake vs AMD APUs, and the AMD APU wins; I expect it would win here as well.

IMO it doesn't bode well for their GPU when AMD's old Vega 8 APUs are beating it.
 
  • Like
Reactions: Tlh97