News Intel GPUs - Intel launches A580


Hitman928

Diamond Member
Apr 15, 2012
5,177
7,628
136
Other than assigning addresses to all the units, what else is there to consider? They will all be on the same board, or even the same chip, so you won't have all the problems that come from trying to sync up two cards in different slots/buses.

We've had lots of graphics cards with two GPUs on the same board, but they still had to work in a CrossFire/SLI-type setup for gaming. I'm not saying it's impossible that someone could make an MCM-type GPU, and we know AMD has been working on such a solution, but it's a very different thing from expanding a single-chip design.

I'm not a GPU designer, but from what I do know of digital design and what I've read from professionals on the matter, the main problem is data coherency. In order to have a multi-GPU system appear as a single unit, you are adding an extra layer of abstraction in the data management between hardware and software.

So you have to drastically rework how the individual GPUs handle data, and then you have to create a new piece of hardware that can intelligently manage the data flow. Where in the stack that new hardware lives, and whether it's one circuit or several at different layers, is, I imagine, all being researched for optimal performance.

From a compute perspective, this is a lot easier and can be done even today with the current hardware implementation of GPUs. With gaming, however, you need real-time response with constant data flow and low latency requirements (the more fps you want, the lower the latency you need).
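To put rough numbers on that last point, here's a back-of-the-envelope sketch (the 0.5 ms inter-GPU hop is an invented figure purely for illustration, not a measurement of any real design):

```python
# Rough frame-time budgets: the higher the fps target, the less room there is
# for any extra latency a multi-GPU data-management layer might add.
# The 0.5 ms inter-GPU hop below is a purely hypothetical figure.

HYPOTHETICAL_INTER_GPU_OVERHEAD_MS = 0.5

for fps in (30, 60, 144, 240):
    frame_budget_ms = 1000.0 / fps
    share = HYPOTHETICAL_INTER_GPU_OVERHEAD_MS / frame_budget_ms * 100
    print(f"{fps:>3} fps -> {frame_budget_ms:5.2f} ms per frame; "
          f"a fixed 0.5 ms hop eats {share:4.1f}% of that budget")
```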
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
So you're comparing game requirements from the early 2000s to game requirements from 2019 and somehow reach the conclusion that the ~80mm2 chip from 2001 was a better deal than the ~230mm2 chip from 2019, because the latter is not "midrange" anymore. Don't you see a problem here?

No. I see no problem. You take the hardware of the day and use it to run the games of the day, and you see what you get for your money. Whether it's 80mm2 or 230mm2 makes no difference to the consumer. Just goes to show that each transistor is doing a lot less for the consumer today than it did nearly two decades ago.

Should be interesting to see if Intel can reverse that trend. Somehow I doubt it.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Should be interesting to see if Intel can reverse that trend. Somehow I doubt it.

Reversing the trend like releasing a <80mm2 die to play today's games? It seems you have not understood the argument from both coercitiv and tajoh111...which was that gaming requirements and expectations have far outpaced Moore's law over the last decade, resulting in increased die sizes and ultimately in pricing within a comparable relative performance range.
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
Reversing the trend like releasing a <80mm2 die to play today's games? It seems you have not understood the argument from both coercitiv and tajoh111...which was that gaming requirements and expectations have far outpaced Moore's law over the last decade, resulting in increased die sizes and ultimately in pricing within a comparable relative performance range.

Multiple GPUs on an interposer could seriously drop prices and seems like the next step. As has been mentioned, AMD is working on it, and there's no reason to believe that Intel and NVIDIA aren't/won't work on it as well. Only time will tell.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
gaming requirements and expectations have far outpaced Moore's law over the last decade

On what basis do we prove or disprove that statement? IQ and resolution weren't going to stay the same, but then transistor density wasn't either.

I will agree that modern GPUs have more non-rendering capabilities than they did in 2000-2002, so that's going to bloat your transistor count whether or not those particular transistors do anything to help with rendering.

So it’s been confirmed by Intel that Tom “TAP” Petersen is joining them:

Interesting. I wonder how well all these people are going to mesh with Intel's existing corporate culture. Do they all report directly to Raja?
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,681
136
On what basis do we prove or disprove that statement?
On the basis of "midrange" chips moving from 80mm2 on 150nm to more than 230mm2 on 14nm. Had the pace been in line with Moore's law we would've seen roughly the same die area, while any loss due to compute "bloat" would have been easily (over)compensated by the 10X frequency gains.
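A quick back-of-the-envelope on that, treating ideal density scaling as the square of the nominal node ratio (a simplification, since marketing node names don't map cleanly to real density):

```python
# Idealized scaling check: if density scaled with the square of the nominal
# feature-size ratio, a constant transistor budget would need a *smaller* die
# at 14nm, yet midrange die area roughly tripled instead.

old_node_nm, new_node_nm = 150.0, 14.0
old_die_mm2, new_die_mm2 = 80.0, 230.0

ideal_density_gain = (old_node_nm / new_node_nm) ** 2   # ~115x
actual_area_growth = new_die_mm2 / old_die_mm2          # ~2.9x

print(f"Ideal density gain, 150nm -> 14nm: ~{ideal_density_gain:.0f}x")
print(f"Midrange die area grew ~{actual_area_growth:.1f}x, so the transistor "
      f"budget grew roughly {ideal_density_gain * actual_area_growth:.0f}x")
```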
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
On what basis do we prove or disprove that statement? IQ and resolution weren't going to stay the same, but then transistor density wasn't either.

I will agree that modern GPUs have more non-rendering capabilities than they did in 2000-2002, so that's going to bloat your transistor count whether or not those particular transistors do anything to help with rendering.



Interesting. I wonder how well all these people are going to mesh with Intel's existing corporate culture. Do they all report directly to Raja?

What you need to realize is that it's much easier to scale up workloads for graphics and GPUs, which creates high variability in performance requirements. This means you can make a game that is nice in terms of performance or one that is nice in terms of image quality.

As a result, the limiting factor, and the reason all games cannot run at high settings on a small low-end chip, is software.

There are games that run at 100 fps at max settings on midrange hardware, and there are some that will run at 30 fps at max settings on the same hardware and resolution.

The reason for this is that it is so easy to make graphics scale in terms of workload.

https://www.slideshare.net/pixellab...-and-trends/11-PS1_generation_560_polygons_4x

I think this slide demonstrates this well. With graphics workloads you can increase the amount of work a GPU has to do exponentially, unlike office workloads.
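As a rough illustration of how fast that multiplication compounds (all numbers below are invented for the example, not taken from any real game):

```python
# Toy model: per-frame work scales multiplicatively with geometry, resolution
# and per-pixel shading, so "a bit more of everything" compounds quickly.
# Every figure here is invented for illustration only.

def relative_cost(polys, vertex_ops_per_poly, pixels, shader_ops_per_pixel):
    return polys * vertex_ops_per_poly + pixels * shader_ops_per_pixel

modest = relative_cost(100_000, 10, 1280 * 720, 50)
heavy = relative_cost(2_000_000, 20, 2560 * 1440, 400)

print(f"The heavy scene is roughly {heavy / modest:.0f}x the work of the modest one")
```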

The best demonstration of this is the original Crysis. Crysis was a title that focused purely on image quality rather than performance, which allowed its visual fidelity to be vastly better than anything else at the time. As a result, it had high system requirements and still ran like crap. That doesn't mean there were no games that ran at high FPS with lower fidelity released in the same year. What this demonstrates is that software is the variable which mostly determines frame rate, not hardware.

As a result, there are games with 10 times the performance requirements of other games developed in the same year, because you can make games that focus on performance and potential install base (like Fortnite) or on pure image quality (like the most recent Metro).

What you're asking for is for programmers to develop games for the same hardware and the same performance requirement across all games. That's impractical and impossible outside of the console market. It would ultimately decrease the image quality of all software and ensure that graphics never evolve beyond the console market, because consoles have the largest install base, which would mean developers only code for performance on a 100mm2 GPU die and nothing else.

Programmers can code for hardware with 100mm2 dies and get 100 fps at max settings today with low-end hardware, but those games would look like crap in today's market.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
On the basis of "midrange" chips moving from 80mm2 on 150nm to more than 230mm2 on 14nm. Had the pace been in line with Moore's law we would've seen roughly the same die area, while any loss due to compute "bloat" would have been easily (over)compensated by the 10X frequency gains.

See below.

What this demonstrates is that software is the variable which mostly determines frame rate, not hardware.

Software requirements are still generational. Even when Crysis came out, it existed within general reference to the hardware of the day. The only practical way to judge hardware of any generation is to see how well it ran software that launched to run on said hardware. Crysis is a bit of an outlier since, as you said, it actually pushed existing hardware to its limits and still delivered substandard frames. By the time "game engine" programming (popularized by people like Carmack) became a thing, it was fairly normal to tailor software to whatever hardware was current or immediately on the horizon. Nobody wanted to deliver 20 fps to gamers holding top-end hardware, with perhaps a few exceptions.

At least the link you posted attempted to introduce objectivity by putting some numbers to the complexity of models over the years (PS1 in the mid-to-late 90s to PS3 in 2006). What I am saying is, no developer - short of crazy or incompetent ones - wants to deliver a product that completely outstrips everything out there. That was true in the Quake III Arena days, and that's true today. I mean hell we have a lot of PC games that are just console ports today, from x86 systems with AMD hardware no less. It's not like they are really pushing the boundaries on what high-end PC hardware can do. Those are the games that are quite literally programmed to run at 30-60 fps at max settings on low-end hardware. In many cases, they don't really look like crap.

And yet somehow, a "midrange" card today costs $250 and has problems pulling its weight in some titles at 1080p (read: can't achieve 60 minfps @ 1080p max settings, much less 1440p or 1600p).
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Intel is bringing in some important people...

Really makes me curious whether this will be a boat with too many captains. Eager to see their first dGPU product. I know it isn't aimed at me, but if it delivers, it will at least give an idea of their high-end products (I hope).

I guess if Intel rises to even match NV, my wallet is screwed. :(
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
And yet somehow, a "midrange" card today costs $250 and has problems pulling its weight in some titles at 1080p (read: can't achieve 60 minfps @ 1080p max settings, much less 1440p or 1600p).
How much of that is the API overhead of Windows, though?
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
How much of that is the API overhead of Windows, though?

Hard to say. I would think API overhead would have more to do with CPU bottlenecking though, wouldn't it? Unless the API is just badly-designed.
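One toy way to see why (the per-draw-call CPU costs here are assumed for illustration, not measurements of any real API): if the CPU-side submission cost per frame exceeds the frame budget, the API layer caps your fps regardless of how fast the GPU is.

```python
# Toy model: CPU-bound fps cap from per-draw-call submission overhead.
# Per-call costs are invented illustrative values, not benchmarks.

def cpu_bound_fps_cap(draw_calls_per_frame, cpu_cost_per_call_us):
    cpu_ms_per_frame = draw_calls_per_frame * cpu_cost_per_call_us / 1000.0
    return 1000.0 / cpu_ms_per_frame

for api, cost_us in (("high-overhead API", 20.0), ("thin API", 2.0)):
    cap = cpu_bound_fps_cap(5_000, cost_us)
    print(f"{api}: 5,000 draw calls/frame -> CPU-side cap of ~{cap:.0f} fps")
```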

And I'm still a bit confused/intrigued by the Ashraf hire. Technical marketing strategist? Okay, Ashraf generally knows his stuff, but hiring somebody that I generally associate with being a media analyst while there's obviously a problem with the product stack strikes me as a bit weird. What exact help do they need with their upcoming strategy for Xe? Or anything else, for that matter? Intel is awash in marketing talent.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
And I'm still a bit confused/intrigued by the Ashraf hire. Technical marketing strategist? Okay, Ashraf generally knows his stuff, but hiring somebody that I generally associate with being a media analyst while there's obviously a problem with the product stack strikes me as a bit weird. What exact help do they need with their upcoming strategy for Xe? Or anything else, for that matter? Intel is awash in marketing talent.
Maybe to silence their potential critics? Give him a salary, keep him under NDA... "Keep your friends close, and your enemies closer"?
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Maybe to silence their potential critics? Give him a salary, keep him under NDA... "Keep your friends close, and your enemies closer"?

Maybe, but critics are a dime a dozen. It takes time for them to develop a following, but still . . .
 

senseamp

Lifer
Feb 5, 2006
35,783
6,187
126
Bummer, I like Ashraf's articles. But I get it: people grow older, have more responsibilities and whatnot, need a stable job, and Intel just keeps hiring various journalists and analysts for whatever reason. Might as well milk the cow while it's here. Can't blame him.
 

maddie

Diamond Member
Jul 18, 2010
4,722
4,625
136
I'm curious as to the US law.

What is the legality of, say, Raja Koduri, now at Intel, revealing all of AMD's future plans up to when he resigned?
Can you just hire competitors' senior staff and get the scoop on what's up with them?
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,478
14,434
136
I'm curious as to the US law.

What is the legality of, say, Raja Koduri, now at Intel, revealing all of AMD's future plans up to when he resigned?
Can you just hire competitors' senior staff and get the scoop on what's up with them?
It's based on the contract they signed when they were hired. MOST companies will have it in the contract that, should they be terminated, they are NOT to reveal any company confidential material. That material is marked as such.
 

senseamp

Lifer
Feb 5, 2006
35,783
6,187
126
Non-competes are unenforceable in California. But NDAs and trade secret laws are enforceable, including against companies inducing violations thereof. I am sure Intel has made it clear to Raja et al. that they don't want to hear any of AMD's trade secrets.
 

maddie

Diamond Member
Jul 18, 2010
4,722
4,625
136
Non-competes are unenforceable in California. But NDAs and trade secret laws are enforceable, including against companies inducing violations thereof. I am sure Intel has made it clear to Raja et al. that they don't want to hear any of AMD's trade secrets.
Is there supposed to be a (/s) somewhere?
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
Hard to say. I would think API overhead would have more to do with CPU bottlenecking though, wouldn't it? Unless the API is just badly-designed.

And I'm still a bit confused/intrigued by the Ashraf hire. Technical marketing strategist? Okay, Ashraf generally knows his stuff, but hiring somebody that I generally associate with being a media analyst while there's obviously a problem with the product stack strikes me as a bit weird. What exact help do they need with their upcoming strategy for Xe? Or anything else, for that matter? Intel is awash in marketing talent.

He has spent years pointing out crappy Intel messaging. Seems like a useful skill to have on a marketing team!

EDIT: Just wait until they hire Charlie Demerjian ;)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Maybe to silence their potential critics? Give him a salary, keep him under NDA... "Keep your friends close, and your enemies closer"?

This seems like exactly what they are doing.

Ashraf has been spot on with his information. He had genuine sources. And while I felt that he liked Intel, he also didn't shy away from telling the truth about the company.

I hate to say this, but maybe he was "bought out" by Intel. You can see it in the tech world: top journalists have gone to work at many corporations.

When it comes to morality, the vast majority of people tread the grey area. If they are given enough financial incentives, they will likely say "Oh, I deserve a break, what the hell" and take it.