Rumor: NV pressuring AIBs not to work with Intel


KompuKare

Golden Member
Jul 28, 2009
1,025
969
136
This doesn't stop anyone with an Nvidia GPU from experiencing it the way it is meant to be played, so it is fairly uncontroversial.

Okay I promise I'll stop being salty and sarcastic now.
Or maybe the appropriate words are more like:
This doesn't stop anyone who has overpaid Nvidia for a GPU from experiencing their GPU the way they have been played, after paying handsomely for it.
The other thread does often seem like people complaining that even though they have paid to win, the promised saviour - upscaling, because their card is not fast enough to run at native - is being held back at the door because some generic guy is being let in instead!
 

Mopetar

Diamond Member
Jan 31, 2011
7,868
6,097
136
What's stopping them from running FSR on their NVidia GPU? If they've got an older Pascal GPU that's pretty much their only choice.

The performance difference isn't noticeable since both do essentially the same thing at this point. In HUB's testing they found games where DLSS looked better than FSR to them, but there are plenty of games where they couldn't tell any difference or both had similar image quality issues.
 

moinmoin

Diamond Member
Jun 1, 2017
4,960
7,691
136
What's stopping them from running FSR on their NVidia GPU? If they've got an older Pascal GPU that's pretty much their only choice.

The performance difference isn't noticeable since both do essentially the same thing at this point. In HUB's testing they found games where DLSS looked better than FSR to them, but there are plenty of games where they couldn't tell any difference or both had similar image quality issues.
In before the "nooo, there's a huge difference there, there and there!" and "I can't live without muh DLSS!" throng.
 

Saylick

Diamond Member
Sep 10, 2012
3,194
6,489
136
None of those people are posting in this thread, let alone acknowledging its existence. You could necro the thread three years from now with the same post and you'd still be in before them. :p
That approach is very Nvidia-like: just ignore and refuse to acknowledge the existence of something that potentially could be damaging to your case and wait until people forget or it fixes itself, a la the new 12-pin power connector.
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
I think the reason this thread is not picking up traction is that the evidence is weaker and the actual reasons to block out Intel are simply not there, particularly not for Nvidia.

Intel's current graphics performance, lack of performance at launch, and long-term outlook on continued support are likely enough to deter partners as it is.

Looking at the leaks, I suspect Battlemage with its current rumored specs tops out at RTX 4070 Ti performance, but I would expect something more along the lines of RTX 4070 performance if they keep the 256-bit bus. Nvidia will launch a new graphics card lineup fairly soon that will make this performance look ancient.

This is from a die of almost 400mm2 on 4nm in 2024. This is not a threat to Nvidia and is not particularly attractive to partners either. A big die with high power consumption (more complex board and bigger cooler) and mid-range performance that has to sell for a low price due to skepticism about the Intel brand in the GPU space is not something attractive to partners. Add in doubt about Intel staying in the discrete GPU space to support the card, and if you're a partner, I think you don't want to deal with the risk of selling Intel cards.

I think if anything AMD should feel more threatened by Intel. Current AMD pricing has made it easy for Intel to be the value player and take customers who bought AMD graphics cards in the past for value. Volume is key for AMD because they rely on low volume with high margins to stay in the video card game. Compress their volume any further and their prices will have to be higher than Nvidia's to sustain the division, which is not possible. With AMD having less mindshare than Nvidia and not as much AI to pivot to, and with Intel attacking AMD's primary strength in graphics cards (value), Intel presents a bigger risk to AMD.
 
  • Like
Reactions: GodisanAtheist

gdansk

Platinum Member
Feb 8, 2011
2,142
2,666
136
There are already vendors selling both Intel and AMD graphics cards (Acer, ASRock). How about any selling both Nvidia and Intel graphics cards? Just Gigabyte? Are they still shipping Arc cards? I can't find them anywhere. Sus, as the kids say.
 
Last edited:

Tup3x

Senior member
Dec 31, 2016
967
953
136
I have a somewhat hard time believing that NVIDIA really cares about Intel at this point. If this is true then I guess they might be a bit scared. They did set the bar really low, so Intel could be rather successful with Battlemage. All they need is 16GB of VRAM and performance that is around RTX 4070 (Ti) level.

Not sure how legal this kind of anticompetitive stuff is. If Intel suspects something like that, I'd imagine that they could escalate this.
 
  • Like
Reactions: Leeea

Mopetar

Diamond Member
Jan 31, 2011
7,868
6,097
136
Their worry is probably that with both AMD and Intel having their own CPUs, they'd be effectively frozen out of a lot of the market. JHH isn't stupid and has been around long enough to know how the game is played.

It's certainly in NVidia's interests to keep Intel dependent on them for more than just bare bones entry-level graphics. Intel would absolutely love it if all of the manufacturers paired their Intel notebook CPUs with Intel GPUs. AMD is probably offering some incentives given they're at least more competitive in graphics.
 
  • Like
Reactions: Tlh97 and Leeea

dr1337

Senior member
May 25, 2020
341
595
106
It's certainly in NVidia's interests to keep Intel dependent on them for more than just bare bones entry-level graphics.
This is exactly what I would assume causes this line of thinking. What blows so many of us away is that a company with so much money and brand influence would rather waste cash on anti-competitive measures instead of just delivering better products to consumers.

And I just want to put this out there: the main difference between something like this and an exclusivity sponsorship, like the other controversy in the news right now, is that Nvidia isn't paying board partners to design and build cards. In a software sponsorship, the sponsor actually does and pays for work that the developer doesn't have to do. A hardware manufacturer influencing an AIB partner makes a bigger impact on more people than sponsoring a single piece of software.
 

gdansk

Platinum Member
Feb 8, 2011
2,142
2,666
136
A hardware manufacturer influencing an AIB partner makes a bigger impact on more people than sponsoring a single piece of software.
Yes, it would be actual anti-competitive behavior if true. But since no one buys AMD/Intel GPUs regardless of availability I'm not sure it will make an impact. The impact for graphics card owners is gonna be smaller than the impact of AMD's self-defeating behavior. Impact might be the wrong way to look at it.
 
  • Like
Reactions: Leeea

Leeea

Diamond Member
Apr 3, 2020
3,631
5,369
136
But since no one buys AMD/Intel GPUs regardless of availability I'm not sure it will make an impact.
Depends on how you look at it.

If you include consoles and iGPUs, Nvidia suddenly becomes a minor player.


What is really odd is the phone market. The whole subject may be a tempest in a teapot about to get swept away by an ocean of devices and users. This race to ever larger heat sinks and power draws may be the race to irrelevance.

At some point someone will release a must-install cultural phenomenon for phones, and the sheer number of potential users will make that the dominant platform. Once someone shows phones give the most yield per investment dollar, AAA games are dead. At the end of the day it's not about the graphics or the power draw, it is about the gameplay, and gameplay can happen on a potato. The most common potato is a phone.

We are living in the glorious last days of the PC master race.
 
Last edited:

dr1337

Senior member
May 25, 2020
341
595
106
Yes, it would be actual anti-competitive behavior if true. But since no one buys AMD/Intel GPUs regardless of availability I'm not sure it will make an impact. The impact for graphics card owners is gonna be smaller than the impact of AMD's self-defeating behavior. Impact might be the wrong way to look at it.
Impact is the prime motivator for marketing; it's what drives "no PR is bad PR". Yes, there is no evidence, as with many things right now, but the old Intel v AMD case that was already brought up in this thread is a good example.

Nvidia lost their strongest AIB partner almost a year ago, and that had an impact on this business in a lot of ways. This move would impact the people wanting a slice of that 80% market share; in other words, it would impact Nvidia's biggest customers, the board partners.

Perhaps in a crazy world someone at Nvidia is playing 4D chess and doing this hoping they end up with fewer AIB partners, further cutting out the middleman to be replaced with more FE sales. There are a lot of angles on this kind of move other than anti-consumer, though it does impact us at the end of the line regardless.
 
Last edited:
  • Like
Reactions: Leeea

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
But since no one buys AMD/Intel GPUs regardless of availability I'm not sure it will make an impact.

I have been doing nothing but recommending AMD GPUs as of late, as you could not get a better GPU than a 6750 XT at that price point.

In fact, unless you're going after the prom queens of the GPU world, I still think the 6750 XT is probably hands down the best mid-level card... (yes, I downgraded it from upper-lower to mid) dollar for dollar.

But it's still funny to see how the 1650, 3060, and 1060 are the most common GPUs used on Steam.
The highest-ranked AMD GPU is the iGPU. lmao.
 

moinmoin

Diamond Member
Jun 1, 2017
4,960
7,691
136
What is really odd is the phone market. The whole subject may be a tempest in a teapot about to get swept away by an ocean of devices and users. (...)
Once someone shows phones give the most yield per investment dollar, AAA games are dead. (...)
We are living in the glorious last days of the PC master race.
I personally think this is a misinterpretation of today's reality.

Mobile gaming already passed the market size of AAA gaming years ago. The huge difference is that the biggest profiteer currently is Apple with its App Store, while for developers, achieving financial success gets harder and harder in the huge ocean of offerings. AAA gaming, on the other hand, is more predictable.

As for "PC master race", high end graphics on PC wouldn't exist anymore if it weren't for its relevancy for developing more potent compute acceleration for data centers. It's the sole reason Intel suddenly rejoined the dGPU race. I'd call its state "on life support".
 
Last edited:

tajoh111

Senior member
Mar 28, 2005
298
312
136
Impact is the prime motivator for marketing; it's what drives "no PR is bad PR". Yes, there is no evidence, as with many things right now, but the old Intel v AMD case that was already brought up in this thread is a good example.

Nvidia lost their strongest AIB partner almost a year ago, and that had an impact on this business in a lot of ways. This move would impact the people wanting a slice of that 80% market share; in other words, it would impact Nvidia's biggest customers, the board partners.

Perhaps in a crazy world someone at Nvidia is playing 4D chess and doing this hoping they end up with fewer AIB partners, further cutting out the middleman to be replaced with more FE sales. There are a lot of angles on this kind of move other than anti-consumer, though it does impact us at the end of the line regardless.

The Intel and AMD case made more sense at the time because during the years the Dell bribes were going on, AMD launched the Athlon/Athlon 64, which was a better product than Intel's Pentium 4 of the same era. Intel would have likely bled significantly more market share, hence the bribes made sense for Intel.

Around the time these bribes stopped is when Intel launched Core 2 Duo, at which point the bribes were no longer necessary since Intel had the superior product. Everyone wanted to put Intel in their systems during this time. Intel's Core 2 Duo lineup annihilated AMD's product stack. The bribes and their timing made sense.

With a 400mm2, 4nm, 8192-shader chip on a 256-bit bus, Battlemage does not sound like a spec monster when looking at the current performance of the 4096-shader A770. The scaling from the A380 to the A770 is already showing issues, so with its rumored specs, Nvidia does not have much to worry about, and AMD with their chiplet design can do well with price drops. Also, with prices for cards plummeting as much as they are at the moment, Intel is going to be launching their card in a pricing environment where they are selling at or below cost again. This is not good for partners or Intel. With Intel's history with discrete graphics and Intel aggressively cutting projects and divisions as of late, if I were a board partner, I would be wary of selling Intel graphics. It's why this rumor does not make sense and seems like something made up to take attention away from the AMD DLSS controversy at the moment.

Intel blocking AMD made sense when we compare Athlon 64 vs Pentium 4. Nvidia blocking Intel based on the performance of ARC vs Ampere/Lovelace/Blackwell seems irrational.
 

Mopetar

Diamond Member
Jan 31, 2011
7,868
6,097
136
The best time to kill the competition is in its infancy. Letting Intel get any kind of toehold and into any kind of position to offer incentives to manufacturers to buy strictly Intel parts would be a bad outcome.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,852
7,228
136
The Intel and AMD case made more sense at the time because during the years the Dell bribes were going on, AMD launched the Athlon/Athlon 64, which was a better product than Intel's Pentium 4 of the same era. Intel would have likely bled significantly more market share, hence the bribes made sense for Intel.

Around the time these bribes stopped is when Intel launched Core 2 Duo, at which point the bribes were no longer necessary since Intel had the superior product. Everyone wanted to put Intel in their systems during this time. Intel's Core 2 Duo lineup annihilated AMD's product stack. The bribes and their timing made sense.

With a 400mm2, 4nm, 8192-shader chip on a 256-bit bus, Battlemage does not sound like a spec monster when looking at the current performance of the 4096-shader A770. The scaling from the A380 to the A770 is already showing issues, so with its rumored specs, Nvidia does not have much to worry about, and AMD with their chiplet design can do well with price drops. Also, with prices for cards plummeting as much as they are at the moment, Intel is going to be launching their card in a pricing environment where they are selling at or below cost again. This is not good for partners or Intel. With Intel's history with discrete graphics and Intel aggressively cutting projects and divisions as of late, if I were a board partner, I would be wary of selling Intel graphics. It's why this rumor does not make sense and seems like something made up to take attention away from the AMD DLSS controversy at the moment.

Intel blocking AMD made sense when we compare Athlon 64 vs Pentium 4. Nvidia blocking Intel based on the performance of ARC vs Ampere/Lovelace/Blackwell seems irrational.

- Can never bank on stuff like this. I don't disagree that the odds are on Battlemage being lackluster vs its contemporaries, but it's also possible, if improbable, that Intel fixes whatever is holding back Alchemist.

How many times has AMD/ATi suddenly pulled a rabbit out of its hat?
 

MrTeal

Diamond Member
Dec 7, 2003
3,572
1,710
136
- Can never bank on stuff like this. I don't disagree that the odds are on Battlemage being lackluster vs its contemporaries, but it's also possible, if improbable, that Intel fixes whatever is holding back Alchemist.

How many times has AMD/ATi suddenly pulled a rabbit out of its hat?
It's definitely not impossible, but I think it's much harder now for Intel, with them not having a process lead and competing with Nvidia and AMD for cutting-edge TSMC silicon.
If Intel had dedicated these kinds of resources to GPUs back in early 2010, when they were launching Westmere on their 32nm process and Nvidia was struggling to get the GTX 480 out the door on TSMC 40nm, it might have been a very different story.
 
  • Like
Reactions: moinmoin

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,566
20,850
146
It's definitely not impossible, but I think it's much harder now for Intel, with them not having a process lead and competing with Nvidia and AMD for cutting-edge TSMC silicon.
If Intel had dedicated these kinds of resources to GPUs back in early 2010, when they were launching Westmere on their 32nm process and Nvidia was struggling to get the GTX 480 out the door on TSMC 40nm, it might have been a very different story.
Despite how much crap he takes, maybe they couldn't have done it without Raja? Or at least someone as experienced as him, of which there can't be many? So they ramped up once they had a leader for the division who knew how to set it up and run it.

"Run it right into the ground" I here some grumbling. I don't know, I think maybe it he didn't do all that bad all things considered. Better upscaling and ray tracing than his old company right out of the gate. Sure, they are late to the party, and there is still a lot of wood to chop in software. But despite all that, ARC is a hell of a first shot across the bow.
 

MrTeal

Diamond Member
Dec 7, 2003
3,572
1,710
136
Despite how much crap he takes, maybe they couldn't have done it without Raja? Or at least someone as experienced as him, of which there can't be many? So they ramped up once they had a leader for the division who knew how to set it up and run it.

"Run it right into the ground" I here some grumbling. I don't know, I think maybe it he didn't do all that bad all things considered. Better upscaling and ray tracing than his old company right out of the gate. Sure, they are late to the party, and there is still a lot of wood to chop in software. But despite all that, ARC is a hell of a first shot across the bow.
Sure, though big piles of cash have been known in the past to help people move from one job to another. Raja left AMD in 2009 for Apple so it's not like he couldn't move if that was the linchpin needed to get things rolling.
 
  • Like
Reactions: Leeea

KompuKare

Golden Member
Jul 28, 2009
1,025
969
136
Despite how much crap he takes, maybe they couldn't have done it without Raja? Or at least someone as experienced as him, of which there can't be many? So they ramped up once they had a leader for the division who knew how to set it up and run it.

"Run it right into the ground" I here some grumbling. I don't know, I think maybe it he didn't do all that bad all things considered. Better upscaling and ray tracing than his old company right out of the gate. Sure, they are late to the party, and there is still a lot of wood to chop in software. But despite all that, ARC is a hell of a first shot across the bow.
I suspect that if Sony or Microsoft had demanded more RT grunt for the current console generation, AMD could have delivered on that, provided the console vendors were willing to pay for it. For RT, it seems more like they don't want to spend the transistors.

Hardware-assisted upscaling is probably something AMD did not think of until recently.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,566
20,850
146
I suspect that if Sony or Microsoft had demanded more RT grunt for the current console generation, AMD could have delivered on that, provided the console vendors were willing to pay for it. For RT, it seems more like they don't want to spend the transistors.

Hardware-assisted upscaling is probably something AMD did not think of until recently.
Yeah, the console market wasn't clamoring for ray tracing. It still isn't, for that matter. And FSR does well enough for consoles. I don't see console gamers griping about how bad their 9th-gen console games look. Usually quite the opposite, in fact.

BTW, I wasn't taking cheap shots at AMD. Simply pointing out that Intel made a strong debut in those areas, worthy of noting, and a little praise IMO.
 

moinmoin

Diamond Member
Jun 1, 2017
4,960
7,691
136
I don't know, I think maybe he didn't do all that badly, all things considered. Better upscaling and ray tracing than his old company right out of the gate. Sure, they are late to the party, and there is still a lot of wood to chop in software. But despite all that, ARC is a hell of a first shot across the bow.
I disagree, and not because of software but due to hardware. While absolute performance is respectable for a first try and at the price point demanded (!), power efficiency and performance per area are the worst of the three choices currently. So an unfavourable take on Arc would call it pure brute force, which is unsustainable for both Intel (higher cost than the competition due to bigger dies) and its users (higher cost for power and cooling).

I suspect that if Sony or Microsoft had demanded more RT grunt for the current console generation, AMD could have delivered on that, provided the console vendors were willing to pay for it. For RT, it seems more like they don't want to spend the transistors.
At what cost, though? The current RT approach lives in a fantasy world of huge, power-sucking, high-end Nvidia cards that alone cost more than multiple current-gen consoles, not to speak of power consumption. How is that ever going to work in any console, never mind a handheld?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,566
20,850
146
I disagree, and not because of software but due to hardware. While absolute performance is respectable for a first try and at the price point demanded (!), power efficiency and performance per area are the worst of the three choices currently. So an unfavourable take on Arc would call it pure brute force, which is unsustainable for both Intel (higher cost than the competition due to bigger dies) and its users (higher cost for power and cooling).
I don't disagree with anything you wrote. Not certain what we disagree on either? :p You did rightly point out that the hardware has issues and for what reasons.

I suppose I am approaching it from the POV that both AMD and Nvidia have had generations that were power hungry, with poor performance per watt and loud blower models. ARC is a good first salvo given that history, at least IMO.