News Intel GPUs - Intel launches A580


DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
No thanks, playing 4K will require me to get a 4K display and pay more for a high-end GPU than I ever did before. I think $320 is the most I have ever spent on one.

I don't blame you for that. If all you want is 1440p 60Hz then things get cheaper. But take a look at this:

https://pcpartpicker.com/products/video-card/#c=438,436,425&sort=price

Cheapest 1660Ti is $280, cheapest 2060 is $350, cheapest 2070 is $485. That's a pretty big gap between the ~$250 range (the 1660Ti is barely there) and the ~$500 range (2070). All you have is the 2060. Rest assured that NV will continue pushing that price point upwards when they release a successor card (3060 or whatever). My guess is the successor will be $380-$400, a price you could easily pay for a 2060 right now if you like. And that's for a 6GB card! NV has more-or-less deprecated their other products, making them eventually irrelevant to the pricing discussion.

AMD, on the other hand, has put its 7nm top dog at $700. Navi10 will probably be in the $200-$250 price range (basically on par with where the 590 is today). Once Navi10 hits, all Polaris and most RX Vega will be considered deprecated as well, making them irrelevant to the price discussion. AMD may have a massive price gap of $500 between product tiers unless they can launch something faster/more expensive than Navi10.

Intel can see that clearly, so they want their top-dog product sitting near where the 2070 is today - $500.

Yesterday's GPU pricing tiers are gone. You have cards for 1080p and 60 Hz 1440p gamers sitting at $250 and less, while everything else is being pushed to the stratosphere - $500 and higher. Entry level, midrange, and high-end no longer mean anything.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Yesterday's GPU pricing tiers are gone. You have cards for 1080p and 60 Hz 1440p gamers sitting at $250 and less, while everything else is being pushed to the stratosphere - $500 and higher. Entry level, midrange, and high-end no longer mean anything.
I don't see most gamers going to 4K anytime soon due to the expense. I see 1440p/1600p as the sweet spot, as it has a moderate cost compared to 4K gaming.

1080p will still be the mainstream for the foreseeable future.
 
  • Like
Reactions: psolord

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
I don't see most gamers going to 4K anytime soon due to the expense.

That's the thing: NV, AMD, and now Intel already know that. There's a limit to the number of cards they can sell to people chasing top performance. So the trick is to push prices ever higher, and maybe drag everyone else along for the ride wherever possible.

When I first got into PC gaming heavily, the first card I bought for a custom system was a GeForce 2 Ti 200 (I had an ATI card in my pre-built P5-100 machine). It was a decent midrange card that performed below GeForce3 solutions, but outperformed some previous GeForce2 products, and definitely sat above the entry-level category. I don't remember what I paid for it, but it had to be $150 or less. Here's an old roundup of GeForce 2/3 products from AT:

https://www.anandtech.com/show/873/24

The GeForce 2 Ti 200 products ranged in price from $115 to $125 depending on which one you got.

Can you get a midrange card today for $115? No. Midrange is slowly vanishing anyway, but if you want to think of a 1660Ti as midrange, then you can see how prices have moved.

I see 1440p/1600p as the sweet spot, as it has a moderate cost compared to 4K gaming.

Only if you're willing to make compromises on framerates and/or image quality. Take the used card market out of the picture and look at what's available today: can you guarantee 1% or 0.1% lows of 60 fps @ 1440p in current titles with something like an RX 590?

https://www.anandtech.com/show/13570/the-amd-radeon-rx-590-review

For many of those games, the answer is "no". Even RX Vega 64 struggles in a few areas. My overclocked Vega FE had issues staying above 60 fps all the time in a few games that didn't like it, like Fallout 4 (though that may have been more my CPU). Compare that to the Quake III Arena numbers from the old 2002 AT roundup: every card tested was way over 60 fps! Resolutions were lower back then, but still. If you had a monitor that supported 1280x1024 or 1600x1200 then you could run at that higher resolution and probably still not drop below 60 fps very often.
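
Side note for anyone who hasn't dug into how reviewers get those numbers: the 1% and 0.1% lows come from per-frame render times, not from the average framerate. Here's a minimal sketch of one common convention (my own illustration, not AT's exact methodology - some tools average the worst frames instead of taking the percentile frame time):

```python
# Minimal sketch: deriving average fps and "1% / 0.1% lows" from a
# list of per-frame render times. This uses the percentile-frame-time
# convention; real benchmark tools differ in the details.

def percentile_low_fps(frame_times_ms, pct):
    """fps at the frame time slower than all but the worst pct% of frames,
    e.g. pct=1.0 gives the '1% low'."""
    ordered = sorted(frame_times_ms)  # slowest frames at the end
    idx = min(len(ordered) - 1, int(len(ordered) * (100 - pct) / 100))
    return 1000.0 / ordered[idx]      # ms per frame -> fps

# Toy data: mostly 12 ms frames (~83 fps) with occasional 25 ms stutters.
frames = [12.0] * 980 + [25.0] * 20
print(f"average: {1000.0 * len(frames) / sum(frames):.1f} fps")
print(f"1% low: {percentile_low_fps(frames, 1.0):.1f} fps")    # 40.0
print(f"0.1% low: {percentile_low_fps(frames, 0.1):.1f} fps")  # 40.0
```

The point being: a card can post a healthy average while its lows sit well under 60 fps, which is exactly what happens to the RX 590 in some of those titles.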

So yes, I respect that you (and many others) have set price limitations on what you will pay. It's smart not to cross those lines. We are still all paying more for less. Intel knows that, and will price accordingly.
 
  • Like
Reactions: Tlh97

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
So yes, I respect that you (and many others) have set price limitations on what you will pay. It's smart not to cross those lines. We are still all paying more for less. Intel knows that, and will price accordingly.
Thanks. I'm on disability, so I have limits on what I can spend on hardware, along with the stuff I do need to spend money on. However, I have a heavy soda addiction that I'm struggling to quit or at least control. Between HUD and what I get, I can live decently and still save up money for some decent hardware if I manage my money properly.

And I'm concerned about the health effects of drinking so much soda.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
And I'm concerned about the health effects of drinking so much soda.

Not exactly on-topic, but yeah, I would be too. I dumped most soda from my diet years ago since it was making me feel kinda funny. Now I just get water at restaurants. Plus if/when I need an energy drink at work, the caffeine kicks harder if I consume less recreationally.

If only Intel would sell us a midrange card for $125. How many 12-packs of Coke would you have to skip to buy something like that?
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Not exactly on-topic, but yeah, I would be too. I dumped most soda from my diet years ago since it was making me feel kinda funny. Now I just get water at restaurants. Plus if/when I need an energy drink at work, the caffeine kicks harder if I consume less recreationally.

If only Intel would sell us a midrange card for $125. How many 12-packs of Coke would you have to skip to buy something like that?
If I could quit altogether, or at least limit myself to one 2L bottle a day, I could afford to replace my 6 y/o system if I saved my money for a few years.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
It would seem to me there is a good reason people are buying up the stack: if you want meaningful improvements, you have to, as cards are not increasing in performance like they used to.

Oh, the improvements are there, but again, they moved up the stack at least one level. If you want x50-level performance, you now need to pay an x60-level price, and so forth.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Oh, the improvements are there, but again, they moved up the stack at least one level. If you want x50-level performance, you now need to pay an x60-level price, and so forth.
Good grief, if Nvidia keeps this up, a large number of gamers will be priced out of the dGPU market. Goodbye AAA titles?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
Once Navi10 hits, all Polaris and most RX Vega will be considered deprecated as well, making them irrelevant to the price discussion.
What are the odds that, if Navi10 drops, it will be a "gaming-focused" architecture and will therefore basically suck at compute (aka mining)? Maybe the RX 570/580 has some life in it yet.

Or if they're still producing Radeon VII cards.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
When I first got into PC gaming heavily, the first card I bought for a custom system was a GeForce 2 Ti 200 (I had an ATI card in my pre-built P5-100 machine). It was a decent midrange card that performed below GeForce3 solutions, but outperformed some previous GeForce2 products, and definitely sat above the entry-level category. I don't remember what I paid for it, but it had to be $150 or less. Here's an old roundup of GeForce 2/3 products from AT:

https://www.anandtech.com/show/873/24

The GeForce 2 Ti 200 products ranged in price from $115 to $125 depending on which one you got.

Can you get a midrange card today for $115? No. Midrange is slowly vanishing anyway, but if you want to think of a 1660Ti as midrange, then you can see how prices have moved.

That's $125 in 2002 dollars. Adjusted for inflation, that's $175 in 2019 dollars - enough to buy you an RX 580.
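
For anyone who wants to sanity-check that, the adjustment is just a ratio of CPI values. A quick sketch (the CPI figures below are approximate US annual averages I'm plugging in for illustration, so treat the result as ballpark):

```python
# Inflation adjustment as a ratio of CPI values. The CPI figures are
# approximate US annual averages (CPI-U, 1982-84 = 100); exact numbers
# vary slightly by source.
CPI = {2002: 179.9, 2019: 255.7}

def adjust(price, from_year, to_year):
    return price * CPI[to_year] / CPI[from_year]

print(f"${adjust(125, 2002, 2019):.0f}")  # ~$178 in 2019 dollars
```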
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
but the RX 580 is not a midrange card in 2019; that goes to the RTX 2060/2070 and Vega 56/64 at $350-500
$500? Mid-range? That's SOME price inflation. I feel mid-range should be around $250. At least, my GTX 460 1GB OC cards were like $200 or so, and those were mid-range back in the day.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
$500? Mid-range? That's SOME price inflation. I feel mid-range should be around $250. At least, my GTX 460 1GB OC cards were like $200 or so, and those were mid-range back in the day.

I mean that's what today's midrange cards cost. $200-250 midrange cards were from the time when top-end was $500-550. Today top-end is more than $1K.
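
To put rough numbers on that (my own back-of-the-envelope, using midpoints from the figures above):

```python
# Back-of-the-envelope: hold midrange at the same fraction of the
# top-end price and see where it lands against today's flagships.
# Midpoints are rough, taken from the figures in the post above.
old_mid, old_top = 225, 525   # ~$200-250 midrange vs ~$500-550 top-end
new_top = 1000                # today's $1K+ flagships

fraction = old_mid / old_top  # midrange was ~43% of the top-end price
print(f"implied midrange today: ${fraction * new_top:.0f}")  # ~$429
```

Which lands squarely in that $350-500 window.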
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
If only Intel would sell us a midrange card for $125. How many 12-packs of Coke would you have to skip to buy something like that?
One thing you can guarantee - Intel will not be "cheap". They have very high margins to maintain.

Imo whether Intel is successful heavily depends on whether they yet again insist on using x86. Intel has a real quandary in that they get their wins by trying to make everything x86 and then using patents to lock everyone else out of that market other than AMD. However, x86 is an ancient, dated architecture and really isn't suitable for GPUs. But every time Intel tries to break away from x86 (e.g. they made some ARM chips for a while, and there was Itanium), they fail.

If they were really brave they'd make a new architecture and then try to catch up, but that would be a long, hard fight and take many years and a lot of money (seeing as Nvidia has a huge head start in hardware and software). If they are less brave they'll go x86 again, and fail again (see Larrabee).
 

TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
One thing you can guarantee - Intel will not be "cheap". They have very high margins to maintain.

Imo whether Intel is successful heavily depends on whether they yet again insist on using x86. Intel has a real quandary in that they get their wins by trying to make everything x86 and then using patents to lock everyone else out of that market other than AMD. However, x86 is an ancient, dated architecture and really isn't suitable for GPUs. But every time Intel tries to break away from x86 (e.g. they made some ARM chips for a while, and there was Itanium), they fail.

If they were really brave they'd make a new architecture and then try to catch up, but that would be a long, hard fight and take many years and a lot of money (seeing as Nvidia has a huge head start in hardware and software). If they are less brave they'll go x86 again, and fail again (see Larrabee).
Wait, what?!? Are current Intel iGPUs based on x86?
No, they aren't, so why on earth would you think that the GPUs would be x86-based?
All they have to do is fit enough units into a small enough TDP and they will be competitive.
Even the cost factor is pretty irrelevant, because Intel could just sell at a loss to gain market share.
 

maddie

Diamond Member
Jul 18, 2010
4,722
4,625
136
Wait, what?!? Are current Intel iGPUs based on x86?
No, they aren't, so why on earth would you think that the GPUs would be x86-based?
All they have to do is fit enough units into a small enough TDP and they will be competitive.
Even the cost factor is pretty irrelevant, because Intel could just sell at a loss to gain market share.
Love that line. So simple.
 
  • Like
Reactions: Tlh97

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
What are the odds that, if Navi10 drops, it will be a "gaming-focused" architecture and will therefore basically suck at compute (aka mining)? Maybe the RX 570/580 has some life in it yet.

I would say that Navi is probably not going to be a great compute card. We'll see.

Discrete GPUs, for Integrated Graphics prices? I'll go for that, maybe. Depends if the drivers suck or not.

That's the rub, isn't it? Intel drivers for their iGPUs are not always the bee's knees.

That's $125 in 2002 dollars. Adjusted for inflation, that's $175 in 2019 dollars - enough to buy you an RX 580.

Okay, that is true. $175 doesn't get you 120 fps at 1080p or 1440p in the latest titles though, does it? What a GeForce 2 Ti 200 could do back then (in Quake III Arena) compared to what an RX 580 can do today in AAA titles is a vast difference.

One thing you can guarantee - Intel will not be "cheap". They have very high margins to maintain.

I'll agree with that. We also know that Intel will not use x86 in their dGPU - it will be based on Gen12. So we can already put aside that speculation.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,743
734
136
I would say that Navi is probably not going to be a great compute card. We'll see.



That's the rub, isn't it? Intel drivers for their iGPUs are not always the bee's knees.



Okay, that is true. $175 doesn't get you 120 fps at 1080p or 1440p in the latest titles though, does it? What a GeForce 2 Ti 200 could do back then (in Quake III Arena) compared to what an RX 580 can do today in AAA titles is a vast difference.



I'll agree with that. We also know that Intel will not use x86 in their dGPU - it will be based on Gen12. So we can already put aside that speculation.

Yup, at 1600x1200x32 in Quake III Arena the GeForce3 Ti 200 nets you a whopping 92.7 fps. Impressive considering the average resolution back then was 1024x768.

https://www.anandtech.com/show/831/7
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Wait what?!?! Are current intel iGPUs based on x86?
No they aren't, so why on earth would you think that the GPUs would be x86 based?
All they have to do is to fit enough units into a small enough TDP and they will be competitive.
Even the cost factor is pretty irrelevant because intel could just sell at a loss to gain market share.
Because their focus is compute, not PC gamers - the margins and big money are in compute cards. Intel's ecosystem is based around x86, so you can see the draw of trying to make some new compute GPUs using x86 - exactly what they did with Larrabee.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136

Yup, at 1600x1200x32 in Quake III Arena the GeForce3 Ti 200 nets you a whopping 92.7 fps. Impressive considering the average resolution back then was 1024x768.

https://www.anandtech.com/show/831/7

That takes me back. And I'd be floored if a $175 video card today would get you 92.7 fps at 1080p max settings, much less at 4K.

Because their focus is compute, not PC gamers - the margins and big money are in compute cards. Intel's ecosystem is based around x86, so you can see the draw of trying to make some new compute GPUs using x86 - exactly what they did with Larrabee.

Not trying to be repetitive here, but Intel's upcoming dGPU - Intel Xe - is Gen12. In fact, Gen11 is the last "Gen"-named iteration of Intel's graphics architecture. Xe is the new name. I can assure you that it is not another Larrabee, just as Gen11 and earlier are also not Larrabee. Intel killed off the last of their x86 compute cards... last year, I think?
 
  • Like
Reactions: Tlh97

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
Because their focus is compute, not PC gamers - the margins and big money are in compute cards. Intel's ecosystem is based around x86, so you can see the draw of trying to make some new compute GPUs using x86 - exactly what they did with Larrabee.

I don't think Intel would be dumb enough to make that mistake twice. Xeon Phi is being killed off because x86 isn't competitive with GPUs in that environment.
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
I don't think Intel would be dumb enough to make that mistake twice. Xeon Phi is being killed off because x86 isn't competitive with GPUs in that environment.

The Phi is being killed off, but it is being replaced by the AP line.