News Intel GPUs - Intel launches A580


psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
These graphs, where the first segment runs 1-10 and the fifth segment runs 10,000-100,000, are the stupidest thing I have seen since Nvidia's off-base graphs that made going from 55 to 60 look like a 100% increase.
 
Jul 27, 2020
15,759
9,821
106
75W product that shouldn't be worth more than $75 if GPU prices weren't crazy. It's overall a bit slower than the 6500 XT, and the bad drivers could make it worth much less.
 

Grabo

Senior member
Apr 5, 2005
240
40
91
Yet we are here

Yes, and it'll be interesting to see some game/graphics-relevant benchmarks (not OpenCL) further along the line, and what the market is like once the Arc discretes launch in the northern hemisphere summer. Probably somewhat like now, so will they have realistic MSRPs or not? Well, lots of time to speculate before the crazy dragon lands.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,174
126
You see, this is why Intel is failing today.
They completely missed the launch opportunity window.
Had they released these GPUs 4 months ago, people would not have cared whether they performed better than the competition.
People would have cared that they were an alternative, a possible replacement or carry-over given the unavailable selection, which would have let them sell even if they didn't perform as well as intended.
Money from those sales could then have funded the R&D to make the GPU better in the next cycle/generation, so it could perform as well. Instead, they are delaying as long as possible to tweak their way into an already difficult segment that team green and team red have locked up like Fort Knox.

These GPUs needed to be released 4 months ago, not 4 months from now.
4 months from now, people will start looking heavily at metrics instead of at "$" signs.

I sometimes question Intel's marketing.
It really feels like it's a team of metaverse idiots who think brand will push them to the next checkpoint, when Intel has already lost most of that.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
These GPUs needed to be released 4 months ago, not 4 months from now.
4 months from now, people will start looking heavily at metrics instead of at "$" signs.

They wanted to launch it earlier but couldn't, because it's hard. Releasing it with immature hardware and drivers would be far worse.

By the way, I don't expect prices to plummet. They'll probably drop over time, but the authoritarianism *cough* I mean lockdowns *cough* I mean covid, plus the threat of war, are not going to help. In fact, prices might drop for a bit and then rise even further in future years.

Results are a bit all over the place and don't really tell us anything about actual gaming performance, but considering it's a 75W product, that doesn't look too bad.

Worst benchmark software ever tries to stay relevant.

SiSoft figures are relevant to nobody. I'm pretty sure claiming a "first" will get them attention, but the data is useless.
 
Last edited:
  • Like
Reactions: NTMBK

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
By the way, I don't expect prices to plummet. They'll probably drop over time, but the authoritarianism *cough* I mean lockdowns *cough* I mean covid, plus the threat of war, are not going to help. In fact, prices might drop for a bit and then rise even further in future years.

The war will only lower demand, since shipments to Russia have simply stopped, and obviously there are Russia's economic problems on top of that. I doubt it will really matter, however.

Another issue with further delay is that RDNA3 and Lovelace will then be too close. So if you've waited out the extreme prices, why not wait another 6 months for a huge increase in performance/$?
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
Performance yes. Performance/$... maybe compared to current prices

Of course compared to current prices. Even if they don't go down, we will probably get a 1.5x-1.7x performance increase per tier. I don't see how they can hike prices further; people simply won't buy anymore.

EDIT: I also think AMD really low-balled GPU production. When I look at the used market here, it's about 10:1 in terms of the 3000 series vs RDNA2. I think NV basically had to supply the market almost alone, and AMD was happy to just make big bucks from server CPUs.
 
Jul 27, 2020
15,759
9,821
106
When I look at the used market here, it's about 10:1 in terms of the 3000 series vs RDNA2.
Or there was enough production, but there's a lot of shrink-wrapped, unsold RDNA2 inventory that scalpers are hanging onto. Maybe more people are paying scalper prices for Nvidia GPUs than for AMD GPUs.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
The war will only lower demand, since shipments to Russia have simply stopped, and obviously there are Russia's economic problems on top of that. I doubt it will really matter, however.

You really believe that? Stopping shipments during wartime is very common, but they will resume after the war is over without telling anyone. Like many, many others, they are doing it just to say they are doing something, but in the end it accomplishes nothing.
 
Jul 27, 2020
15,759
9,821
106
I suppose Ukraine has thousands, if not millions, of viable GPUs tied up in mining? They will either get destroyed in the bombing or be smuggled out. The latter might provide some relief to the gamer community, assuming the Ukrainians don't resume their mining operations wherever they take refuge.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
You really believe that? Stopping shipments during wartime is very common, but they will resume after the war is over without telling anyone. Like many, many others, they are doing it just to say they are doing something, but in the end it accomplishes nothing.

Depends on the outcome of the war. Sanctions might last for a long time.

Anyway- let's not get sidetracked into politics please. P&N has a thread about the ongoing war.
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
Yup. They could have created special mining rigs with DG1 GPUs. Large companies are so dumb.

Not talking about DG1. Talking about DG2. The idea that Intel is going to make big money on this is dead even without a real price decrease on nV/AMD cards.

Of course compared to current prices. Even if they don't go down, we will probably get a 1.5x-1.7x performance increase per tier. I don't see how they can hike prices further; people simply won't buy anymore.

To be clear, the 3080's real price is like $1300-1400. The $1300-1400 MSRP card could very well be 50-70% faster than the 3080; the $699 card won't be.
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
EDIT: I also think AMD really low-balled GPU production. When I look at the used market here, it's about 10:1 in terms of the 3000 series vs RDNA2. I think NV basically had to supply the market almost alone, and AMD was happy to just make big bucks from server CPUs.
While making very tiny bucks from console chips!

 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Another issue with further delay is that RDNA3 and Lovelace will then be too close. So if you've waited out the extreme prices, why not wait another 6 months for a huge increase in performance/$?

You know it doesn't work that way. The same logic could have been applied to DG1. Getting DG2 out there will stabilize a lot of things for them, so things will be a lot better for DG3: market positioning, pricing, vendor relationships, and drivers are a few I can think of. In fact, rumors suggest Intel's ambitions are much greater for DG3, and you won't get there without getting a feel for the market first.

Why didn't AMD skip RDNA and go straight to RDNA2? Because the foundation RDNA laid allowed RDNA2 to happen.

Also, DG3 isn't 6 months away, so you might end up with "why not wait 6 months for next gen" forever.
 

maddie

Diamond Member
Jul 18, 2010
4,723
4,628
136
Not talking about DG1. Talking about DG2. The idea that Intel is going to make big money on this is dead even without a real price decrease on nV/AMD cards.



To be clear, the 3080's real price is like $1300-1400.

Can you expand on this?
 

eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
To be clear, the 3080's real price is like $1300-1400.

Can you expand on this?

Tell that to all the folks who paid $699 for their 3080. Just 2 weeks ago I had the 'opportunity' to buy one for $919. I passed, since I had bought a 3090 for near the price you're quoting.
 

mikk

Diamond Member
May 15, 2012
4,112
2,108
136
Several different sources have told me of another Intel card postponement, at least as far as SKU 1 to 3, the three performance models, are concerned. There are currently consistent rumors that the first models will be released between May 2, 2022 and June 1, 2022. Furthermore, the tape-out of the QS (qualified sample) should not take place for 1-2 weeks.


I think this is true, between May and June, unless there is another delay. SOC2 should come earlier though; this gives us a first glimpse.

Igor suspects they have software issues. I have to say that Intel has released only one driver this year for their iGPU lineup. To me, that's a big hint that their driver team is mainly focused on DG2 drivers, and that's why Intel hasn't released iGPU drivers frequently in the last few months.
 
Jul 27, 2020
15,759
9,821
106
Raja Koduri interviewing candidates for the driver team:

Raja: Why should we hire you?

Candidate 1: I have years and years of experience writing drivers. I understand hardware more than most and I am able to debug the most obscure issues because I'm relentless in my approach to troubleshooting and finding the root cause.

Raja: That's great. What else do you like to do, other than writing code?

Candidate 1: I love tinkering with Raspberry Pis. I have quite a few fun projects in various stages of development in my garage.

Raja: Oh nice. Well, I guess you meet a lot of our criteria. We'll get back to you soon.

Raja: Why should we hire you?

Candidate 2: I love games, man. I grew up playing Commander Keen and Doom. I'm a very good C coder. I haven't done hardware programming per se but I can get up to speed pretty quickly using Github. You will find me very passionate. I'm passionate about everything. By the way, I love that you are the chief technical advisor for Makuta VFX. Man, the GPU render farms to do those effects in "Baahubali", that must have required phenomenal GPU power.

Raja: Oh yeah. That's always been my focus. I want to bring GPUs into every aspect of our lives. GPUs should be everywhere. You know what? I'm a great judge of someone's untapped potential and your interests and passion seem right up our alley. When can you join???

'Baahubali' Tech Wizard Raja Koduri Discusses Making Movie And VR Magic (forbes.com)
Baahubali 2: 'Baahubali VR' - How AMD's ambitious effort for the visually breathtaking epic could be a game changer (indiatimes.com)
 
  • Haha
Reactions: psolord