Mark Rein: “PC innovation suffered” from Intel’s decisions to not fix their graphics

csbin

Senior member
Feb 4, 2013
http://www.dsogaming.com/news/epic-games-mark-rein-pc-innovation-suffered-from-intels-decisions/



Now this is really interesting. An article about the battle between Intel and Samsung caught the eye of Mark Rein, Epic Games' vice president, who afterwards claimed that PC innovation suffered from Intel's decision not to fix (or improve) its graphics.

This is really… well… bizarre, as Epic Games has not lately been pushing PC boundaries or supporting our platform as much as it did in the past. But could this really be the reason behind it?

As Mark Rein said:

“For years we tried to convince Intel to fix their graphics but their data said they were good enough. PC innovation suffered for it.”

In case you're wondering: no, AMD and Nvidia – regardless of their graphics cards being more powerful than Intel's – have not topped Intel in terms of the number of GPUs powering current PC systems. When Peter Tilbrook suggested as much, Rein replied that Intel 'still owned the lion's share of the graphics market with its integrated chips.'

Our guess is that Rein was referring to a technical/graphical push similar to the first Unreal; otherwise we can't see how innovation suffered from Intel's decision to keep its integrated graphics as they were rather than upgrade them.

Innovation comes in many forms, and most indie titles have proven that they can be innovative while being far less demanding than most triple-A games.

What's also funny is that PC gamers have been asking for a proper sequel to Unreal Tournament, yet Epic has failed to deliver one. And don't get us started on Unreal Tournament 3.

Or how about a game based on the Samaritan tech demo? At least that Blade Runner-themed demo was more innovative than our usual games, right? But did Epic decide to scrap that project (or perhaps move it to next-generation platforms) because Intel did not upgrade its graphics? Sounds plausible.

Do note that Epic has not revealed any plans for a game based on the Samaritan tech demo; we're simply making assumptions.

All in all – according to Epic Games – Intel's decision has hurt the PC platform. The question is whether this is merely an excuse for the exclusive console deals Epic has signed. What do you think?
 

blackened23

Diamond Member
Jul 26, 2011
I can state with near certainty that intel's decision to enter the graphics market via iGPU had little to do with gaming. They want to drive graphics in ultra portable, ultra slim computing devices with high resolution or "retina" displays. For this market, the vast majority of consumers don't care about gaming performance, believe it or not.

It's just rather funny when someone states that a rMBP (the 13 inch variant uses HD4000) can't run bf3 very well. My first thought is, "yeah? So what?". That isn't what the device is intended for. Everyone here needs to get OUT of the mindset that everything and anything is designed for gaming. The mass market doesn't care, only the 1% of tech nerds do.
 

Lonyo

Lifer
Aug 10, 2002
It might be part of the reason they haven't pushed PC graphics as much as in the past, combined with console release cycles.

But it's not the reason they have been putting out mediocre to bad games. Graphics don't make the game. The games they have been making haven't been particularly well received for reasons other than graphics.
There are many many Unreal Engine 3 games people have found to be great, and also older games (e.g. Bioshock 1) which have nice added stuff above the base Unreal engine which have been good as well graphically and in other ways.

Epic can blame Intel for not pushing the boundaries of PC graphics, but they are the ones putting out mediocre games overall outside the engine/graphics.
 

2timer

Golden Member
Apr 20, 2012
I can state with near certainty that intel's decision to enter the graphics market via iGPU had little to do with gaming. They want to drive graphics in ultra portable, ultra slim computing devices with high resolution or "retina" displays. For this market, the vast majority of consumers don't care about gaming performance, believe it or not.

It's just rather funny when someone states that a rMBP (the 13 inch variant uses HD4000) can't run bf3 very well. My first thought is, "yeah? So what?". That isn't what the device is intended for. Everyone here needs to get OUT of the mindset that everything and anything is designed for gaming. The mass market doesn't care, only the 1% of tech nerds do.

Do you have a source from Intel confirming this? If that were the case, then why would Intel demo Haswell's GT3 at CES using a GAME? http://www.geek.com/chips/intel-demos-haswell-with-gt3-integrated-graphics-at-ces-1535712/

Kind of contradicts your theory, no?
 

Spjut

Senior member
Apr 9, 2011
Raising the bar for the lowest common denominator is most important.

But yeah, what has Epic done apart from some fancy tech demos?
 

futurefields

Diamond Member
Jun 2, 2012
The mass market doesn't care, only the 1% of tech nerds do.

So we have people in PC gaming forum talking about how they are doing "better things" than gaming, and now we have people in the Video Card forum referring to said gamers as "tech nerds" ...


Mods - there are obviously some people that do not want to be on Anandtech anymore, can we do something about that? Instead of letting them troll around and passive-aggressively insult the rest of us?
 
Nov 26, 2005
But yeah, what has Epic done apart from some fancy tech demos?

Somewhere a while back I read that Epic felt held back by hardware, and I think this was coming from Tim Sweeney. Maybe they were developing some lighting tech that needed better hardware to run their code. I also remember it being mentioned in the thread "How the PS4 will be better than a PC" that Epic's lighting technology was too difficult to run on the hardware they were testing. I'm not sure if it was the PS4 or a PC, but it's somewhere in that long thread.
 

tential

Diamond Member
May 13, 2008
I can see what he's saying a little bit but not in terms of gaming. I think it was very naive of them to blame intel for gaming. Is intel supposed to dominate the CPU, GPU, Ram, Mobo, and just build the whole PC as fast as possible just for gaming for them? Nvidia and ATI are better equipped at the moment for doing that.

Maybe I can see it in terms of just general PC use. Better integrated graphics would allow for more beautiful interfaces and other types of innovation (I just can't think of them because I'm not an innovator). Would it be nice if Intel graphics were better so that software developers could bring some neat things to the average user? Of course; that's why Intel is making better graphics. For them to blame Intel for not being able to run good video games, though, is absurd.
Intel doesn't dominate the GPU market because they're trying to. They dominate it because people expect PCs to just WORK, and if Intel didn't have an integrated GPU, PC prices would go up and Intel would sell fewer CPUs. Better to throw in an integrated GPU that does exactly what people need it to do: the minimum of running a display.
I think we'd ALL be mad here if Intel GPUs tripled in performance, raised the price of CPUs by say 40 dollars, and the average user still didn't do anything with this power while we all had to spend 40 extra dollars on an integrated GPU we wouldn't use anyway.
 

Lonyo

Lifer
Aug 10, 2002
How does lighting tech hold back innovation that much?
Hell, Epic bought into NV's PhysX thing and then did nothing with it, and are blaming Intel for stuff. If Intel are holding things back, and AMD plus NV isn't enough, why bother adding proprietary stuff to your game (UT3 PhysX levels)?
Does that help innovation, vendor locking additional content?

Bioshock managed to pretty much innovate on UE2.5/3 with really immersive environments, so how has UE3 been held back by Intel on the innovation front when game developers are the problem?

Mirror's Edge. Innovative.
Dust 514. Innovative.
Borderlands. Almost unique graphics (cell shading was around before, but not many big games with it)

Loads of games have made great use of Unreal Engines with varying degrees of "innovation". Prettier graphics does not an innovative game make, and you don't need a super engine to make an immersive world, see Bioshock.
 

blackened23

Diamond Member
Jul 26, 2011
So we have people in PC gaming forum talking about how they are doing "better things" than gaming, and now we have people in the Video Card forum referring to said gamers as "tech nerds" ...


Mods - there are obviously some people that do not want to be on Anandtech anymore, can we do something about that? Instead of letting them troll around and passively aggressively insult the rest of us?

I trolled nothing; I pointed out hard truths - 100% of MacBook Airs and ultrabooks with integrated Intel graphics aren't designed for gaming. If you feel that statement is somehow offensive, I'd suggest that you're the one with issues, not me - because it's the truth. MacBook Airs aren't designed for gaming, yet sell by the millions, and they're all using integrated Intel graphics. And when I say tech nerds - if you find THAT term offensive I'd have to laugh, because I actually group myself in the "tech nerd" category. Again, you're the one with issues; you're the one TRYING to find reasons to be offended. Maybe the truth is bothersome? I don't know. Whatever.

If you feel anything I said was against guidelines, please do report it. Or if it bothers you THAT much please by all means put me on ignore. I promise I won't lose any sleep over it. Promise. It won't hurt my feelings or give me second thoughts.
 

blackened23

Diamond Member
Jul 26, 2011
Do you have a source from Intel confirming this? If that were the case, then why would Intel demo Haswell's GT3 at CES using a GAME? http://www.geek.com/chips/intel-demos-haswell-with-gt3-integrated-graphics-at-ces-1535712/

Kind of contradicts your theory, no?

No, it really doesn't. It simply caters to what the PC tech-centric audience expects. We can rewind back to the reason why Intel decided to use integrated graphics: it was due to Apple's urging (numerous articles here at AT point this out, FYI) - Apple wanted cheaper, better graphics to put in their MacBooks and MacBook Pros. And that is precisely what Intel gave them: graphics to drive higher-resolution displays and provide a good user experience to Apple's audience. Apple's audience doesn't care about gaming, and it is wrong to think every ultrabook and MacBook is designed specifically for gaming. Sure, SOME people care about gaming, but ultraportables as a whole - I'm talking MacBook Airs and whatnot - are not designed for gaming at all. *Most* of the audience that purchases these devices does not game.

The mass market buys devices for basic computing tasks, not gaming. Now, obviously intel is increasing their graphics performance exponentially over time and their chips are slowly becoming more suitable for gaming and I think that's great. Yet, I don't believe for a second that gaming is their sole motivation for iGPU - the fact of the matter is, when you look at the market buying portables as a whole (macbook airs, macbook pros, etc) the vast majority of the buyer base does not game. Therefore it stands to reason that gaming is a secondary benefit, not the primary purpose of iGPU.
 

2timer

Golden Member
Apr 20, 2012
No, it really doesn't. It simply caters to what the PC tech centric audience expects. We can rewind back to the reason why intel decided to use integrated graphics: it was due to Apple's urging (numerous articles here at AT point this out, FYI) - Apple wanted cheaper, better graphics to put in their macbooks and macbook pros. And that is precisely what intel gave them: graphics to drive higher resolution displays and provide a good user experience to Apple's audience. Apples audience doesn't care about gaming, and it is wrong to think 100% of every ultrabook and macbook is designed specifically for gaming. Sure, SOME people care about gaming, but ultra portables as a whole - i'm talking, macbook airs and what not - are not designed for gaming at all. *Most* of the audience that purchases these devices, do not game.

The mass market buys devices for basic computing tasks, not gaming. Now, obviously intel is increasing their graphics performance exponentially over time and their chips are slowly becoming more suitable for gaming and I think that's great. Yet, I don't believe for a second that gaming is their sole motivation for iGPU - the fact of the matter is, when you look at the market buying portables as a whole (macbook airs, macbook pros, etc) the vast majority of the buyer base does not game. Therefore it stands to reason that gaming is a secondary benefit, not the primary purpose of iGPU.

Very true, I wouldn't doubt the Apple factor one second.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
www.frostyhacks.blogspot.com
This is the biggest load of rubbish I've ever read.

What has stifled growth and innovation in the PC space has been the explosion in popularity of consoles, and almost all PC game development moving towards being merely ports or cross platform games.

If anything, it's engines like Unreal increasing cross-platform support that have pushed developers towards making one game fit all; when you do this, you inherently move towards game design targeted at the lowest common denominator, which is consoles.

I'm not blaming Epic for that, but they had their hand in this; they made their bed and now they have to sleep in it.
 

Phynaz

Lifer
Mar 13, 2006
1. It's bad form to copy an entire article in a forum post.

2. Is a wall of bolded text necessary?
 
Nov 26, 2005
This is the biggest load of rubbish I've ever read.

What has stifled growth and innovation in the PC space has been the explosion in popularity of consoles, and almost all PC game development moving towards being merely ports or cross platform games.

If anything it's engines like the Unreal engine increasing cross platform support that has aided developers towards making 1 game fits all, when you do this you inherently move towards game design which is targeted at the lowest common denominator, which is consoles.

I'm not blaming epic for that, but they had their hand in this, they made their bed and now they have to sleep in it.

Well put, and it seems pretty accurate.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
I can see this being especially true during the Intel "extreme" graphics days. Those integrated chips suffered on the basics, like Warcraft III. If the bar had been raised a bit more, more people could have been exposed to PC gaming, and thus the market wouldn't have been so exclusive.
 

cmdrdredd

Lifer
Dec 12, 2001
It might be part of the reason they haven't pushed PC graphics as much as in the past, combined with console release cycles.

But it's not the reason they have been putting out mediocre to bad games. Graphics don't make the game. The games they have been making haven't been particularly well received for reasons other than graphics.
There are many many Unreal Engine 3 games people have found to be great, and also older games (e.g. Bioshock 1) which have nice added stuff above the base Unreal engine which have been good as well graphically and in other ways.

Epic can blame Intel for not pushing the boundaries of PC graphics, but they are the ones putting out mediocre games overall outside the engine/graphics.

This ^^

Crytek doesn't seem to have an issue pushing graphics, or DICE with their engine in the upcoming Battlefield 4. What about Tomb Raider's TressFX? Yeah, it's buggy, but it's a step forward from mud textures and flat faces with no hair movement. Epic can only make excuses. Intel never set out to revolutionize gaming. That's the whole reason Nvidia and AMD exist in the GPU market with products like the GTX 600 series and HD 7900 series cards.

Epic is to blame for Epic's lack of innovation and push to further the possibilities. They were the ones happy with "good enough" (by using their engine so long on consoles with limited capabilities compared to a full-fledged PC with a dedicated GPU). Why do you think UE3 will have been out for 7 years this November with only minor DX11 updates, the first of which was dog slow (the first game using UE3 was Gears of War in November 2006), while DICE has already updated Frostbite 2 for Battlefield 4 and Crytek has updated its engine at least twice in that same time frame? Crysis came out in November 2007, and I seem to remember it kicking the crap out of UE3 in terms of capabilities and visual quality.

Epic talks a lot. To me that's all they are, talk. They haven't given us a real game in a long time and their engine is still riddled with stutter issues and is generally not well refined IMO.
 

sxr7171

Diamond Member
Jun 21, 2002
I can state with near certainty that intel's decision to enter the graphics market via iGPU had little to do with gaming. They want to drive graphics in ultra portable, ultra slim computing devices with high resolution or "retina" displays. For this market, the vast majority of consumers don't care about gaming performance, believe it or not.

It's just rather funny when someone states that a rMBP (the 13 inch variant uses HD4000) can't run bf3 very well. My first thought is, "yeah? So what?". That isn't what the device is intended for. Everyone here needs to get OUT of the mindset that everything and anything is designed for gaming. The mass market doesn't care, only the 1% of tech nerds do.

Well said.
 

Blitzvogel

Platinum Member
Oct 17, 2010
I think I agree with everyone's sentiments on Epic not being a great developer in reference to the PC over the past few years, but.......

Intel could've had much better iGPs sooner than it did, which could've meant a better lowest common denominator in the PC gaming space. Granted, Intel has made great inroads the past three years, no doubt. The standards just should've been raised much earlier.
 

cmdrdredd

Lifer
Dec 12, 2001
I think I agree with everyone's sentiments on Epic not being a great developer in reference to the PC over the past few years, but.......

Intel could've had much better iGPs sooner than they did, which could've meant a better lower common denominator in the PC gaming space. Granted, Intel has made great inroads the past three years, no doubt. The standards just should've been risen much earlier.

Since the dawn of this console generation, developers have made a game for the Xbox 360 or PS3 and ported it. No iGPU could play these games at 1080p. Even the first Gears of War on PC would not be playable at console-equivalent settings on the Intel HD4000. Skyrim on Low at 720p with an HD4000 gets 40fps with an i7-3820QM. That's much worse looking than the console version of the game. So when a developer is making a game based on console specifications, which are already above every iGPU available on the market, how can they blame Intel for keeping things down? You should aim for a similarly powerful GPU made for gaming purposes, because it is clear to me Intel isn't here to win the GPU wars for the hearts of gamers.
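The "aim for the lowest common denominator" idea that keeps coming up in this thread can be sketched in code. This is a purely illustrative Python sketch - the function name, thresholds, and preset names are all made up, not from any real engine - of how a game might pick a quality preset so that weak integrated GPUs still land on a playable configuration:

```python
# Hypothetical sketch: map coarse hardware hints to a graphics preset so
# integrated GPUs (the "lowest common denominator") stay playable.
# All names and thresholds are illustrative, not from any real engine.

def pick_preset(vram_mb: int, is_integrated: bool) -> str:
    """Choose a quality preset from coarse hardware hints."""
    if is_integrated or vram_mb < 512:
        return "low"       # e.g. HD 4000-class iGPUs sharing system memory
    if vram_mb < 1024:
        return "medium"
    if vram_mb < 2048:
        return "high"
    return "ultra"         # discrete cards with plenty of dedicated VRAM

# An HD 4000-style iGPU lands on the lowest preset:
print(pick_preset(vram_mb=1024, is_integrated=True))  # prints "low"
```

In a real engine the hints would come from the graphics API at startup, but the shape is the same: the floor of the hardware range dictates what the "low" preset must be able to run.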
 

Blitzvogel

Platinum Member
Oct 17, 2010
It wasn't just about gaming; it was also about HD video playback while minimizing CPU usage, and giving the user an overall smooth visual experience in everyday work. The added value of a better IGP in those earlier years would've given Intel a head start on taking on ATI/AMD and Nvidia like it is now in the desktop and mobile space.

Oh, and interestingly enough, it's not 1080p, but it's close. Earlier console ports, especially UE3 ones, were a different ballgame. The 8600GT could run GeoW sub-1080p maxed out well enough.

HD4000 vs 8600GT

I wouldn't have ever considered Intel IGPs good enough for any kind of light PC gaming until the HD4000 came along, but you can thank this console generation for dragging on far too long, too.
 

StrangerGuy

Diamond Member
May 9, 2004
I can see this being especially true during the Intel "extreme" graphics days. Those integrated chips suffered on the basics, like Warcraft III. If the bar had been raised a bit more, more people could have been exposed to PC gaming, and thus the market wouldn't have been so exclusive.

Forget about games. Pre-Arrandale iGPUs were craptacular even at HW H264 decoding or Flash.

Now, just when they're finally getting respectable iGPU gaming performance, the general audience has moved away from PCs. That is just deliciously ironic, to say the least.
 

blackened23

Diamond Member
Jul 26, 2011
I pretty much agree with others that Epic is part of the problem, not part of the solution. As mentioned earlier, Intel created the iGPU as a value-add for Apple; it was never designed with the sole intention of gaming, so blaming things on Intel is laughable to say the least.

In the meantime, it would be nice for Epic and company to produce an engine that isn't near-broken when using any DX11 features. Intel's iGPU sure didn't stop the Frostbite engine, the 4A engine (which is REALLY good in Last Light, I have to admit) or CryEngine 3. Epic needs to be part of the solution; instead they're deflecting blame onto Intel? Why? Intel never asked or sought to be the premier gaming GPU.