News Intel GPUs - more reviews coming in!


gdansk

Golden Member
Feb 8, 2011
1,097
913
136
So currently we have nvidia on Samsung 8nm, AMD on TSMC 7nm and Intel ARC on TSMC 6nm, so they are not competing for the same wafers. But next gen both nvidia and AMD will be on TSMC 5nm, and Intel?
There are rumors of AMD refreshing Navi 2x on 6nm. We'll see.
And I expect next generation Intel to be on TSMC N4 or N3, as they'll likely be beat to market by AMD and Nvidia again.
 

biostud

Lifer
Feb 27, 2003
16,579
2,083
126
There are rumors of AMD refreshing Navi 2x on 6nm. We'll see.
And I expect next generation Intel to be on TSMC N4 or N3, as they'll likely be beat to market by AMD and Nvidia again.
I doubt they will refresh on 6nm, except for mobile GPU/CPUs.
 

eek2121

Platinum Member
Aug 2, 2005
2,054
2,502
136
So, in a gamer's mind, buying general-purpose PC hardware on the open market to use how one wants is now "stealing"? Gamers are straight-up delusional creatures, and should be locked up until de-programmed.

The gamers that I know IRL are either not employed, or if they are, they're living at home, where their parents cook and clean for them.

At least I'm providing for myself, trying to be more financially free, by mining.

I wonder if "gamer homelessness" is going to be a major thing in a few years, once most of their supporting boomer parents pass away. There's going to be a lot of delusional people wandering the streets, looking for a "power up".
You have a rather inaccurate view of gamers.

I am a gamer and I make over a quarter million a year. I am married and have kids. I game regularly with 6-8 different friends and we are all well off. Most of them are married or are in long-term relationships, and 3 of them also have kids. We travel to unique locations all over the world with a gaming laptop, take in the sights and sounds, and game with each other in person. We do this 1-2 times a year.

The gaming demographic is quite different today than what it was 20 years ago.
 

LightningZ71

Golden Member
Mar 10, 2017
1,404
1,578
136
I don't know about a refresh, but for a while the rumor was that they would return to waterfalling models: the current Navi 21 from the 6900/6800 would get refreshed on N6 and become the new 7700 XT/7600 XT, Navi 23 would be moved to N6 and become the 7500 XT, and the existing Navi 24 might become the 7400 XT. Memory speeds would go to 18Gbps, and clock speeds would increase a bit.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,383
3,333
136
First, this is an Intel dGPU thread.

Second, I am a miner, or I used to be. I am for freedom and against further regulation of the already oversized government.

I would not pay scalper prices. Why would I? The GPUs don't fall from heaven. I gotta go buy them, and if crypto prices drop I lose money. Who the hell would make money from buying a $2000 GPU to get $5 a day, maybe?

If you regulate every little aspect of human life, you are asking for Nazi Germany or North Korea-like regimes again. I don't have to tell you that's a terrible thing, unless you're one of those unreasonable people, and maybe you are. But most likely you are swept up by emotion and misled.

Don't confuse individuals like us with those who set up warehouses or make deals with governments and electrical companies to do it.
 

DrMrLordX

Lifer
Apr 27, 2000
19,998
8,984
136
In reality, ARC GPU stocks are not large + poor Intel GPU drivers.
Blasphemy! Raja can satisfy many gamers with but one dGPU and a loaf of bread.

Miners want to mine, pay for the damn specialized hardware
What specialized hardware? Are you sure you know what you're talking about? And boy howdy, if miners make you that angry, just wait until you find out how much silicon the server market is "stealing" from the consumer market because of its higher margins! Some hyperscaler has my Raphael right now as part of his Genoa.

So currently we have nvidia on Samsung 8nm, AMD on TSMC 7nm and Intel ARC on TSMC 6nm, so they are not competing for the same wafers. But next gen both nvidia and AMD will be on TSMC 5nm, and Intel?
See below.

There are rumors of AMD refreshing Navi 2x on 6nm. We'll see.
Confirmed for mobile dGPUs, but unknown for desktop dGPUs. It would be really nice to see AMD bring the RDNA2 family to N6.
 
  • Like
Reactions: Tlh97 and Leeea

eek2121

Platinum Member
Aug 2, 2005
2,054
2,502
136
First, this is an Intel dGPU thread.

Second, I am a miner, or I used to be. I am for freedom and against further regulation of the already oversized government.

I would not pay scalper prices. Why would I? The GPUs don't fall from heaven. I gotta go buy them, and if crypto prices drop I lose money. Who the hell would make money from buying a $2000 GPU to get $5 a day, maybe?

If you regulate every little aspect of human life, you are asking for Nazi Germany or North Korea-like regimes again. I don't have to tell you that's a terrible thing, unless you're one of those unreasonable people, and maybe you are. But most likely you are swept up by emotion and misled.

Don't confuse individuals like us with those who set up warehouses or make deals with governments and electrical companies to do it.
Does crypto contribute to the GPU shortage? Yes. Is it the reason for the shortage? No.

When the pandemic hit the US, our economy changed drastically. 2 things in particular helped cause the shortage:

  1. Most of us got multiple rounds of stimulus, even though we were working, even though we had six-figure incomes.
  2. Tons of new jobs opened up thanks to remote jobs and a record number of retirements. This meant that salaries had to go up in order to compete. In some areas, including mine, salaries have doubled or even tripled.
Those two things left gamers flush with cash, which means that they had money to spend on new builds or upgrades. The supply chain quickly became overwhelmed by the sudden spike in demand, starting with incorrect projections leading to lower fab allocations than needed.
Confirmed for mobile dGPUs, but unknown for desktop dGPUs. It would be really nice to see AMD bring the RDNA2 family to N6.
I would love for AMD to release a Radeon 6950X with faster memory, faster clocks, and a shrink to 6nm. I wouldn’t mind it if TGP were raised to 350W. I would pay decent money for such a product.

I do hope that AMD, NVIDIA, and Intel focus on doing more with less in the future, however. Die sizes are increasing regardless of shrinks, even for Intel (DG3 dies rumored to be bigger than DG2). This drives prices and thermals up. I dislike the uber high thermals in particular. Long gaming sessions or mining warms up my office considerably, and it is in a basement that stays cool.
 

Leeea

Platinum Member
Apr 3, 2020
2,183
3,037
106
Who is paying the scalpers then? What do they get for just sitting on GPUs? Are they mining with GPUs until they can sell them? Someone has to be feeding this vicious cycle for it to go on. Die-hard gamers may pay scalper prices but most won't.
The answer is everyone.

The miners, the big miners, they run the numbers and figure it still makes sense. Miners also look at resale value of the card when determining their return. The numbers make sense for them, even at scalper prices.
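That back-of-the-envelope calculus can be sketched as follows. All the figures here are illustrative assumptions, not actual market data: the scalper price, the daily earnings net of electricity, and the expected resale value are made up for the example.

```python
def breakeven_days(card_price: float, daily_net: float, resale_value: float) -> float:
    """Days of mining needed to recover the card's cost,
    counting the expected resale value against the purchase price."""
    effective_cost = card_price - resale_value
    return effective_cost / daily_net

# Illustrative figures: $2000 scalper price, $5/day net of
# electricity, $1200 expected resale value down the road.
days = breakeven_days(2000.0, 5.0, 1200.0)
print(f"Break-even after ~{days:.0f} days of mining")  # → ~160 days
```

The point of the resale term is exactly the one made above: a miner isn't weighing $2000 against $5 a day alone, but against the card's remaining value when they're done with it, which is why scalper prices can still "run the numbers" for them.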

The gamers, the ones with jobs and a house, $2000 for a video card makes them blink, but what else are you going to do in a socially distanced society? Gaming lets you get together with your friends, even if it is just virtually.

The professionals, especially the YouTube video makers or wannabes: Adobe Premiere Pro benefits greatly from an Nvidia GPU. They use "PugetBench" to measure performance, and they see good results even from a 24GB RTX 3090.

3D animation, 3D rendering, 3D art, etc. all benefit from a nice GPU.

Same for scientific modeling, neural nets, AI, particle interactions, etc.

Lane centering / adaptive cruise on newer vehicles. The "I bought a GPU from Tesla, but their packaging was a bit excessive" joke exists for a reason. Those things are running around with Navi 23 GPUs, the same chip that is in the RX 6600 XT.


GPUs are for more than just gamers and miners. Everyone wants one now.
 

mikk

Diamond Member
May 15, 2012
3,642
1,513
136
Vega 8 vs Xe comparison.

Look at the CPU clock speed, power, and GPU clock speed results. AMD's GPU clocks are very stable, and CPU clocks are very low, generally under 2GHz.

Intel's CPUs often go over 3GHz, and some reach 4GHz! Comparing against the power meter, you can see that the Intel setup generally uses more power where the CPU clock frequency is high.

It also performs worse where the iGPU is clocked low, in the 1-1.1GHz range rather than the 1.3GHz it's supposed to reach, whereas on AMD the GPU clocks are very stable.

So this is the traditional Intel iGPU issue: the CPU is prioritized over the iGPU, so it steals the available power and the iGPU doesn't perform as well in games. Zen mobile also has a very efficient CPU architecture, and running Tigerlake at 4GHz certainly doesn't help.

With CPU clock speeds more than twice as high, power efficiency suffers badly. I wonder what happens if he disables the CPU turbo so it can't hit those crazy clock speeds; the GPU should get more power budget.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,710
677
126
Not only have crypto miners destroyed the GPU market and damaged the environment, but they are now stinking up this thread. No one cares about your dumb pretend pyramid-scheme money; go post about it on Reddit or something. This is a thread for GPU discussion, in particular the upcoming Intel GPUs. Hopefully these babies help stabilize the market from its madness.
 

DrMrLordX

Lifer
Apr 27, 2000
19,998
8,984
136
No one cares about your dumb pretend pyramid scheme money go post about it on reddit or something.
You will when they buy all "your" cards. Lolz. Seriously though, Intel's support for mining software (and vice versa) has been abysmal up to this point, so why is anyone really worried about it? Intel clearly wants in on the BTC ASIC game, so I doubt they'll spend all that much R&D bringing ARC to the mining set. dGPU mining is about to go to pot anyway.
 

DrMrLordX

Lifer
Apr 27, 2000
19,998
8,984
136
What are the odds that some new crypto optimized for sucking GPUs will rear its ugly head once Ethereum goes PoS?
Already been discussed in other threads, but a) other GPU-minable projects already exist, and b) they don't have the critical mass necessary to absorb enough hashpower to take on even the existing Ethereum miners, much less more of them. The only way this situation perpetuates itself is if the EF says "ehhh, never mind!" and abandons PoS, which would probably be suicidal for them at this point.

Intel is probably as aware of this fact as anyone else.
 

beginner99

Diamond Member
Jun 2, 2009
5,064
1,409
136
Then they should stick a dagger into the hearts of miners by clearly stating that their GPU's will be crippled in mining. Miners want to mine, pay for the damn specialized hardware rather than steal hardware that was never meant for their usage.
You do know that Bitcoin can only be mined on ASICs right now, as GPUs are way too slow? So Bitcoin mining cards and GPUs from Intel don't really relate to each other.
Mining is software, and Intel can't make a general statement that mining will not work; it's basically impossible to block "mining" in general. They could try to block one specific algorithm (Ethereum, for example), but Ethereum will move to proof of stake around the time Intel Arc finally releases, so such a limit would be pointless. And AFAIK Nvidia's LHR has also been cracked.

NV's LHR is a joke anyway. It is, and has always been, an "easy" solution to appease gamers while not affecting NV's bottom line. If NV actually cared, they would have strong-armed their AIBs (like they have many times before) and punished them severely for bulk-selling from the factory to miners: any AIB caught once -> no more chips. I bet this would have stopped immediately, and if not, NV could have just used the chips for Founders cards or sold them to other AIBs. But that would have been hard (legal effort), pissed off AIBs, and cost some money. LHR? Cheap. Put a couple of engineers on it and done.

And AMD? Let's be honest: they keep up the scarcity "on purpose". As long as prices are high, there's no incentive to produce more GPU chips; the wafers go to CPU chiplets instead, which even at current GPU prices are more profitable.
 

igor_kavinski

Diamond Member
Jul 27, 2020
4,881
3,077
106
And AMD? Let's be honest: they keep up the scarcity "on purpose". As long as prices are high, there's no incentive to produce more GPU chips; the wafers go to CPU chiplets instead, which even at current GPU prices are more profitable.
Aren't the transistor libraries for CPUs and GPUs incompatible? I'm not sure they can just switch unused capacity from one to the other.
 
  • Like
Reactions: Tlh97 and Leeea

NTMBK

Diamond Member
Nov 14, 2011
9,942
4,256
136
Aren't the transistor libraries for CPUs and GPUs incompatible? I'm not sure they can just switch unused capacity from one to the other.
As far as I'm aware, transistor libraries are a thing at the design stage, not the manufacturing phase. If they have a 7nm TSMC mask set for CPUs and a 7nm TSMC mask set for GPUs, I don't think TSMC really cares, so long as they're using the same production line.
 

KompuKare

Senior member
Jul 28, 2009
813
433
136
As far as I'm aware, transistor libraries are a thing at the design stage, not the manufacturing phase. If they have a 7nm TSMC mask set for CPUs and a 7nm TSMC mask set for GPUs, I don't think TSMC really cares, so long as they're using the same production line.
Exactly.
The only case where this *might* be a factor (and we'll have to wait for someone with more fab knowledge) is if a given design requires more layers/doping of (let's call it "Y") for some reason.

So imagine a certain part of a chip requires an extra layer of some metal for speed, or a part that needs more current to drive something, and for yields the customer (e.g. AMD) and the fab (e.g. TSMC) have come up with a particular process step. Then when processing those wafers, that extra step gets applied; for other wafers it doesn't.

I would imagine swapping out masks is no trivial matter, so from the fab's perspective it makes sense to keep a given "line" running certain wafers. From the customer's PoV too, due to yields, lead time, etc.

However, in the fab for a given node (N7, N7P, etc.), changing from "CPU" to "GPU" wafers shouldn't really matter.
 

maddie

Diamond Member
Jul 18, 2010
4,149
3,419
136
I don't know if the manufacturing process is agnostic to the design. Can N6, for example, be used interchangeably with CPU and GPU, without any changes required to the process flow?
Production happens in minimum-sized batches (for economic reasons), where the steps are identical across the batch. You can switch to anything on that node when changing to a new batch of wafers: masks, diffusion chemicals, etches, etc. are changed as needed for the new batch. CPUs, GPUs, whatever: it's all silicon now.
 
  • Like
Reactions: igor_kavinski

maddie

Diamond Member
Jul 18, 2010
4,149
3,419
136
What he is saying is a production line to make GPUs cannot be switched to make CPUs one day, and then back to GPUs the next.
With batch production, you can't even switch between different CPU designs mid-batch, so the CPU-vs-GPU distinction is irrelevant.

I don't know the minimum batch size or how long it takes to clear a production stage, which is NOT the same as clearing the total production process.
 
  • Like
Reactions: igor_kavinski
