News Intel GPUs - Intel launches A580


linkgoron

Platinum Member
Mar 9, 2005
2,300
821
136
In defense of Raja, he seems to gravitate toward lost causes or insurmountable challenges in this day and age. It's easy to judge from a distance, but think of the scope of the tasks he has recently tried to take on:

- Taking over the Radeon group at AMD at the absolute nadir of AMD's profitability and competitiveness in all markets. Vega and Polaris certainly were not all they were cracked up to be, but they kept AMD competitive, and the driver packages were solid during his tenure. RDNA and CDNA were also developed or managed under him.

- Bringing Intel's GPU department up from piddling iGPUs to full-fledged discrete GPUs in 5 years, especially at a company like Intel, is no small feat. Will Raja ultimately succeed? All indicators point to no, but god damn if it doesn't look like he's giving it his all.

Raja's biggest downfall seems to be managing expectations. Dude loves to hype, and given some of the externalities surrounding his projects, the hype is rarely met. The gap between the perception he likes to create around his projects and what they can actually accomplish does not do him any favors.
I wrote a post quite a while ago about why I really, really dislike Koduri: https://forums.anandtech.com/thread...ectures-thread.2579999/page-137#post-40322782

People take on challenges and fail; that's fine and even commendable. The issue is that Raja has kept over-hyping his projects since the AMD days, even very close to launch, and when things don't pan out he just stays silent and starts hyping the next big thing.

Note that AMD made a huge (IMO) return to form once Koduri left - with RDNA and RDNA2.
 

andermans

Member
Sep 11, 2020
151
153
76

A 768 ALU, 64-bit GPU in a 30W thermal envelope scoring 3500 pts in Time Spy, the same score the GTX 1650 achieves in a 50W thermal envelope. In mobile, the GTX 1650 has 1024 ALUs and a 128-bit bus.

This architecture should be alright after all. Graphics throughput should be at least around Turing levels.

The 2900 score was at 30W, no? Do we know the power for the 3500 score? That would be a huge swing if it's also at 30W.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The first video is not encouraging. It's showing crashes and graphical artifacts. It's not as bad as some feared, but there's still a ton of work to do.

I don't have a good feeling about them releasing it in notebooks first. Sure, most GPU attention comes from enthusiasts and gamers using dGPUs. But with notebooks, reviews are usually stuck with whatever drivers shipped when the manufacturer sent the notebook out, which are older. So lots of less technical notebook users will feel more of the bad side, since they don't know much about drivers and such.

In contrast, dGPU reviews are held to a much higher standard and the reviewers are much more knowledgeable, so they know to use the latest drivers. The nature of dGPUs also requires the user to know more about computers, since they have to get the card and install it both physically and in software.

The 2900 score was at 30W, no? Do we know the power for the 3500 score? That would be a huge swing if it's also at 30W.

The earlier leaks showed a Time Spy GPU score of 2400 in Default mode and 3100 in Performance mode. Overall Time Spy scores come out higher than the Time Spy GPU score, so the first review is likely using the lower power mode. Maybe it's 40W.
 

mikk

Diamond Member
May 15, 2012
4,141
2,154
136
They could easily update the drivers, although most notebook reviewers won't do this. A lower power mode at 40W? The A350M is rated at 25-35W. In the first video you can see it drawing between 20 and 24W in some of the games, if that's accurate.
 

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
Right now the only thing that appears to work properly with Intel's GPU drivers is the 3DMark benchmark.

Let's do some maths.

768 ALUs, 64-bit bus, 2.2 GHz - 3500 pts.

Now add 33.3% more ALUs (1024), a 192-bit memory bus with at least 60% higher bandwidth, and at best 10-12% higher clock speeds (2.45 GHz).

What do we reckon, guys: will it be able to get to 4500-4700 pts in 3DMark Time Spy?

I think the rumors were telling the truth, and on desktop the A380 will land around the GTX 1650 Super to GTX 1660.

In 3DMark Time Spy...
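
For the curious, here's that back-of-envelope estimate as a quick Python sketch. The inputs are the rumored figures above, and the linear-scaling assumption is the big caveat; real GPUs rarely scale that cleanly:

```python
# Back-of-envelope Time Spy projection from the figures above.
# All inputs are the quoted rumor numbers; linear scaling is a
# best-case assumption, not how GPUs actually behave.
baseline_pts = 3500            # 768 ALUs, 64-bit bus, 2.2 GHz

alu_scale       = 1024 / 768   # +33.3% ALUs
clock_scale     = 2.45 / 2.2   # +11% clocks (the "at best" case)
bandwidth_scale = 1.60         # +60% bandwidth (192-bit bus)

# Ceiling if purely compute-bound: ALUs x clocks.
compute_bound = baseline_pts * alu_scale * clock_scale
# Ceiling if purely bandwidth-bound: memory bandwidth alone.
bandwidth_bound = baseline_pts * bandwidth_scale

print(f"compute-bound ceiling:   {compute_bound:.0f} pts")    # ~5200
print(f"bandwidth-bound ceiling: {bandwidth_bound:.0f} pts")  # ~5600
# A 4500-4700 guess sits below both ceilings, i.e. it assumes
# real-world scaling lands well short of the naive linear case.
```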
 
  • Like
Reactions: lightmanek

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The first video is the lower-scoring one, and it doesn't include the 3DMark tests.

Right now the only thing that appears to work properly with Intel's GPU drivers is the 3DMark benchmark.

Let's do some maths.

768 ALUs, 64-bit bus, 2.2 GHz - 3500 pts.

Now add 33.3% more ALUs (1024), a 192-bit memory bus with at least 60% higher bandwidth, and at best 10-12% higher clock speeds (2.45 GHz).

What do we reckon, guys: will it be able to get to 4500-4700 pts in 3DMark Time Spy?

I think the rumors were telling the truth, and on desktop the A380 will land around the GTX 1650 Super to GTX 1660.

In 3DMark Time Spy...

3DMark has the same flaw as Geekbench: it runs a GPU-intensive scenario, then a CPU-intensive scenario, and derives an arbitrary final score from a bizarre combination of the two. They should present the GPU score as the final score and report the CPU score separately. The GPU-intensive scenario already has to take the CPU into account, so it's a terrible way of doing things.

It's as if a car manufacturer took horsepower numbers and actual performance numbers (say, 0-100 acceleration times) and combined the two into one final figure.
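
To illustrate the complaint: 3DMark's technical guide describes the overall result as a weighted harmonic mean of the graphics and CPU scores. A rough Python sketch, where the exact 0.85/0.15 weights are an illustrative assumption rather than the published constants:

```python
# Sketch of a 3DMark-style combined score: a weighted harmonic mean
# of the graphics and CPU sub-scores. The 0.85/0.15 weights below
# are illustrative assumptions, not 3DMark's published constants.

def overall_score(graphics: float, cpu: float,
                  w_gpu: float = 0.85, w_cpu: float = 0.15) -> float:
    """Weighted harmonic mean of the two sub-scores."""
    return 1.0 / (w_gpu / graphics + w_cpu / cpu)

# Identical GPU score, different CPUs: the overall score moves anyway,
# which is exactly the objection above.
print(round(overall_score(3100, 7000)))   # ~3383
print(round(overall_score(3100, 12000)))  # ~3488
```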

33% more ALUs and 10% higher clocks works out to roughly a 1.46x compute uplift (1.33 × 1.1), and with only 50% more bandwidth the realistic gain is about 50%. So you'd get 4500-4700 points in Time Spy GPU (3100 × 1.5 ≈ 4650), versus the 3100 the A350M gets in high power mode.
 

Tup3x

Senior member
Dec 31, 2016
965
951
136

A 768 ALU, 64-bit GPU in a 30W thermal envelope scoring 3500 pts in Time Spy, the same score the GTX 1650 achieves in a 50W thermal envelope. In mobile, the GTX 1650 has 1024 ALUs and a 128-bit bus.

This architecture should be alright after all. Graphics throughput should be at least around Turing levels.
It will definitely take them time to get the drivers right. I guess they are refactoring and rewriting large parts of their drivers, and for that reason they just aren't at the level they could be. In any case, I think the future looks quite promising. Things should get much more heated when the next generation of cards from all manufacturers is out. This generation is more or less a test run, and the main point is just to get the drivers ready for Battlemage (which might come out surprisingly soon, I guess).

It's been ages since we had three competitive players on the field.
 
  • Like
Reactions: Tlh97 and psolord

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
The issue is that Raja has kept over-hyping his projects since the AMD days, even very close to launch, and when things don't pan out he just stays silent and starts hyping the next big thing.

The way you want companies to behave is to under-promise and over-deliver.

He seems to have a history of doing the opposite.
 

xpea

Senior member
Feb 14, 2014
429
135
116
The drivers seem really rough. Things like the control panel showing "unknown GPU" with the latest driver are embarrassing. And the same arch (Iris Xe / DG1) having two driver branches that aren't compatible sounds very fishy too.
And we're talking about a product that has already been delayed half a year. Wow, just wow. It will be a loooooooong road for Intel...
 
igor_kavinski

Jul 27, 2020
16,340
10,352
106
Note that AMD made a huge (IMO) return to form once Koduri left - with RDNA and RDNA2.

David Wang was responsible for R300. Incidentally, it was the first ATi GPU I bought (coz the Nvidia alternative was so crap at the time). These star performers really make or break a product.
 
  • Like
Reactions: lightmanek

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
Isn't Lisa Pearce responsible for the driver division for Intel GPUs?
 
igor_kavinski

Jul 27, 2020
16,340
10,352
106

Shader replacement to optimize performance. No matter how well they do it, it won't compare to shader code written specifically to take advantage of Arc. Down the line we can expect performance improvements as more and more developers get on board and cozy with Arc, but getting there is likely to be a pretty drawn-out uphill battle.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Isn't Lisa Pearce responsible for the driver division for Intel GPUs?

Yes.

@igor_kavinski

And Raja gets too much flak/credit for what he does. He joined ATI in 2001, and David Wang joined in 2000. R300 came out in 2002. They were responsible for different parts of the division, and one person can't be faulted or credited for everything.

Also, if you consider product development cycles, Raja had a part in RDNA and RDNA2 as well. Raja didn't like working at AMD, which would have contributed to issues. Then again, you'll have a bigger impact at a smaller company like AMD than at a place like Intel.
 

moinmoin

Diamond Member
Jun 1, 2017
4,954
7,672
136
He joined ATI in 2001, and David Wang joined in 2000. R300 came out in 2002.
That's the wrong way to put it. Koduri joined ATi in 2001 after working at S3 Graphics since 1996. Wang worked at SGI starting in 1993, then left with his team to form ArtX in 1997, which ATi bought in 2000. The former is somebody jumping to the competition, the latter is a continuous line, and R300 was the first ATi product from the former ArtX team.
 
  • Like
Reactions: Tlh97 and maddie

maddie

Diamond Member
Jul 18, 2010
4,747
4,691
136
That's the wrong way to put it. Koduri joined ATi in 2001 after working at S3 Graphics since 1996. Wang worked at SGI starting in 1993, then left with his team to form ArtX in 1997, which ATi bought in 2000. The former is somebody jumping to the competition, the latter is a continuous line, and R300 was the first ATi product from the former ArtX team.
Never knew Wang went back so far. ArtX basically saved ATI, if I remember correctly.
 

majord

Senior member
Jul 26, 2015
433
523
136
Yes.

@igor_kavinski

Raja didn't like working at AMD, which would have contributed to issues. Then again, you'll have a bigger impact at a smaller company like AMD than at a place like Intel.

Raja only joined AMD because he was able to do his side gig(s) at the same time (i.e., they were happy for him to do so). That says a lot in hindsight.
 
  • Like
Reactions: Tlh97