You need an nVidia GPU to make Intel machines capable of running games reasonably and Haswell will not change this.
I think Ivy Bridge integrated graphics will play about 99% of PC games ever released.
Sooner or later the ROI for discrete cards will be too low to continue.
I guess "16.8 fps 1280x800 2xAA" technically falls into the category of 'will play', though I don't know many people who would actually attempt to play through the game on a slideshow like that.
http://www.techpowerup.com/reviews/Intel/Core_i7_3770K_Ivy_Bridge_GPU/19.html
If you're happy only playing games released before 2006 then Ivy Bridge's iGPU should be fine.
Clearly 100% of the tablet market is looking for a tablet or ultrabook that can play Crysis 3 at max settings.
In fact, it's clear that gaming is the primary reason that MacBooks and iPads are not selling well /sarcasm. They can't play Crysis 3 at max settings, folks! Intel should just call it a day and give up. Getting their chips into the smallest ultra-portable devices is a fruitless endeavor, because of Crysis 3.
You're missing the point, which was that the iGPU still can't play games well. If you don't want to play games, you're fine with an iGPU; if you want to play modern games, you need a real GPU or at the very least an A10, end of story.
AMD will hold a big price advantage with a part whose graphics beat the majority of the iGPU range found in Haswell, paired with a more than good enough x86 (CPU) part - a 2.5-3.5GHz Trinity will do anything you throw at it, albeit a lot slower than an 8-thread Haswell could do it, for a fraction of Haswell's price.
So my point is that I have no clue where Intel will slot the GT3e mobile parts, or who will buy this kind of hardware setup if there is no Optimus switchable graphics in there. It means you're effectively buying only the CPU part of Haswell, and the GT3e will do nothing for you in anything GPU related (maybe QuickSync can be used, I don't know).
If Intel can get GT650M-level graphics into a product with 10 hours of battery life, that is a product that will sell like hotcakes.
If a 15"-17" notebook with GT3e ends up $150-$300 cheaper than one with a dgpu, I'd be interested in one.
The way I see it, GT3 > Richland > new GT2. And considering that AMD keeps saying Kaveri will be a 2013 chip, and that the GT3e is rumored to be out around October, I'll wait for more info.
True, but Intel can't. Don't mix up the high-performance quad cores with the Ultrabook chips... which don't even have the GT3e.
Benchmarks please. No making stuff up.
Intel doesn't care about chasing the gaming market - that isn't the intent of GT3E. I'm sure it will game fairly well since it matches the nvidia GT650M, yet that isn't intel's sole intent.
I'll raise my hand here on something!
Can someone please explain to me how the higher-performing Haswell-U part has a lower model number than the lower-performing one? --> i7-4650U (15W) vs i7-4558U (28W) <--
Isn't it confusing?
It does not match the GT650M, at least not by 3DMark11 score.
So Intel will end up destroying PC gaming eventually. How can it survive on Intel's integrated GPUs as the only option left on the market?
Fortunately in order for iGPUs to destroy the PCIe card market they have to actually become powerful enough to compete with standalone cards.
No they don't.
They just have to become powerful enough that the standalone (discrete) card market becomes unsustainably small. Eliminate the low-end cards where most of the sales volume is, and then all the costs for the companies producing the chips and cards have to be spread over a much smaller number of cards.
So now the entry point of a decent gaming card isn't $150 anymore, now it's $400. Very few people will spend that much money.
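To put rough numbers on that amortization argument, here is a minimal back-of-the-envelope sketch in Python. The R&D figure, unit volumes, BOM cost, and the card_price helper are all made-up assumptions for illustration, not real industry data.

```python
# Hypothetical illustration: fixed R&D for a GPU generation has to be
# amortized over however many cards are actually sold, so shrinking
# volumes push up the price needed to break even.

def card_price(rnd_cost, units_sold, bom_cost, margin=0.30):
    """Price per card that covers its share of R&D plus components, with a margin."""
    rnd_per_unit = rnd_cost / units_sold
    return (rnd_per_unit + bom_cost) * (1 + margin)

RND = 500e6   # assumed R&D spend for one GPU generation
BOM = 60.0    # assumed per-card component cost

# Healthy market: low-end cards drive big volumes.
print(card_price(RND, units_sold=20e6, bom_cost=BOM))   # ~$110 per card

# iGPUs eat the low end: the same R&D spread over far fewer cards.
print(card_price(RND, units_sold=2e6, bom_cost=BOM))    # ~$400 per card
```

With the same R&D bill spread over a tenth of the volume, the per-unit share of that R&D dominates the price, which is exactly the squeeze from $150-class cards toward $400-class cards described above.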
lagokc, the analogy you need to conceptualize here is one Phynaz has direct and personal experience with... namely the utter slaughter that low-cost, (relatively) low-performing x86 CPUs from Intel inflicted on the big-iron processor segment in the '90s.
x86 was like a wind-blown wildfire in that market segment, and it left every legacy big-iron provider in bankruptcy or irrelevance (other than IBM and Sun, although one could argue neither is all that relevant now).
Now the point here is not that x86 was special or made magically disruptive market moves, because it wasn't. x86 was simply disruptive enough to destabilize the tenuous economics of the existing big-iron market segment... and once it tipped that balance past the point where revenue could no longer fund the R&D needed to keep low-volume big-iron CPUs ahead of the performance advances of low-cost, high-volume x86 CPUs, the big-iron market as it was then known was dead.
The same slippery slope can happen in discrete GPUs. Once low-cost iGPU/APU products hit critical volumes and manage to revenue-starve the R&D engines behind future discrete GPU products, the game will be over for them: their amortized costs will spiral out of control as the volumes sold get lower and lower.
It is a tale that is not unique to x86; it happens the world over, in industry and nation alike. See what cheap labor in Asia and open markets via the World Trade Organization did to manufacturing in Europe and North America. It is a rather generic and universal phenomenon.
That is possible, but what about workstation cards? Will IGPs ever replace those? If they continue to be made, a big part of the R&D could be amortized across them.
