You forget it's an optional part of DP.
NVIDIA haven't done it and don't have any plans for it. Instead they launched mobile G-Sync for laptops.
Even if Intel implements it, they also need driver support. And for what, 30-40 FPS minimums? Anyway, it's not going to happen anytime soon, even if they commit. Skylake plus Skylake Refresh is what you get on desktops, so we are deep into 2017 before there is even a possibility with Icelake.
If you want adaptive sync, you buy AMD.
No, I didn't forget at all. Intel and NVIDIA both have the option - RIGHT NOW - to support it via driver updates.
NVIDIA may not be officially stating anything, but I guarantee* that they have an internal test bed that is operational. They basically proved it with the leaked notebook driver a few months back. G-Sync without a module? Sounds familiar.
Intel is... being Intel. Intel is actually the perfect customer for ASync given the wide range of frame rates its IGPs produce.
Adaptive Sync will keep gaining momentum; then Intel will support it out of the blue with little to no fanfare; and when NVIDIA realizes it can make more money supporting both, it will announce a driver update that enables Adaptive Sync.
*not guaranteed
It requires hardware support, not just drivers. AMD itself is an example: only its newer GPUs with the updated display controllers support FreeSync, while older GCN cards don't.
I never saw any test of this beyond someone toggling an option in a driver, working or not. Could you link me the tests?
Intel is obviously doing the right thing, attacking the GPU market from the bottom.
Going for the high end right now is pointless anyway. Give or take 5 to 10 years, and even high end systems will slowly start phasing out dGPUs.
Why? Because APUs and SoCs are the future. Especially once we approach the sub-10nm area, it will just be the economically smart thing to do.
Sure, for super high end enthusiasts and supercomputers dGPUs will remain a thing.
But AMD and Intel are both doing the right thing. I actually feel like it's going to be Nvidia that will be at a disadvantage in a few years.
Sure, they dominate dGPU share right now and dipped their feet into cars... but what else is there? They half-heartedly try some stuff with their Tegra chips, and while the GPU for smartphone-sized gaming is actually neat, in this area Nvidia will be behind the competition for years to come.
IF AMD manages to get through its current struggle... we might end up seeing Nvidia slowly vanish into different markets and super-enthusiast-only territory.
Once Intel has a few more revisions of its iGPU and AMD's APUs start using at least 14nm and HBM2... even the desktop market will change toward this. I would bet my dog on it.
That said... the iGPUs/APUs are not quite where I want them yet, but I expect them to "master" 1080p gaming (thanks to DX12 as well) by the end of 2016. (Maybe not with SSAO + 4xMSAA... but close xD)
P.S. Of course, god forbid Nvidia and Intel actually teamed up to make desktop SoCs... that stuff would just be straight-up insane. A man can dream.
It wasn't so much tests; someone hacked the drivers and managed to get Async to work, which goes to show that the hardware to support it already exists. He also made the drivers available for others to try.
I can't personally confirm it truly worked, but others claim it did.
You mean G-Sync.
And by hacked, was it actually working? Or did someone just hack the driver to expose the setting without effect?
A few seem to say it works.
The problem is they are charging $120 to $200+ for half a die's worth of GPU transistors. It is getting to the point where so much money is wasted on useless GPU transistors that at some point people will have to seek an alternative. Every single notebook that uses one of these chips plus an NVIDIA discrete GPU is getting ripped off by Intel for at least $120.
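Spelling out that arithmetic (the full-chip prices here are my assumptions, chosen to show how the claimed range falls out; Intel's real pricing and die split aren't public):

```python
# Back-of-the-envelope for the "paying for unused IGP silicon" claim.
# Chip prices below are illustrative assumptions, not Intel's actual BOM.
GPU_DIE_FRACTION = 0.5  # "half a die's worth of GPU transistors"

for chip_price in (240, 400):  # assumed full-chip prices, USD
    wasted = chip_price * GPU_DIE_FRACTION
    print(f"${chip_price} chip -> ~${wasted:.0f} paid for an IGP the dGPU notebook never uses")
```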
Haswell Iris Pro was a $90 cost add, Broadwell is $56. Logic dictates it's significantly cheaper now, right?
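For what it's worth, those two quoted figures do work out to a big drop:

```python
# Percentage drop in the quoted Iris Pro eDRAM cost add across generations.
haswell_add, broadwell_add = 90.0, 56.0  # quoted cost adds, USD

drop = (haswell_add - broadwell_add) / haswell_add
print(f"Cost add fell {drop:.0%} from Haswell to Broadwell")  # ~38%
```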
No there isn't: http://www.overclockers.ua/news/cpu/115939-intel-broadwell-m-2.png
Are we discussing the 6200 or the 5200? Because as far as I saw in the reviews, there's quite a difference between the two of them.
I'll take that as your opinion on the matter.
I guess we'll know more on performance and efficiency when reviews come in for the new Broadwell SKUs.
ShintaiDK said:
Not that long:
It requires hardware support, not just drivers. AMD itself is an example: only its newer GPUs with the updated display controllers support FreeSync, while older GCN cards don't.
I never saw any test of this beyond someone toggling an option in a driver, working or not. Could you link me the tests?
Adaptive sync needs to support well below 30 or 40 FPS minimums before it's worthwhile for IGPs (see the sketch below for why).
And let's see when or if Intel will support it. We are talking 2017 at the soonest, with Icelake.
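To illustrate why that minimum matters: below the panel's minimum refresh rate, an adaptive sync driver has to fall back to multiplying frames (what AMD brands Low Framerate Compensation), and that only works when the panel's refresh range is wide enough. A toy sketch with made-up panel limits; none of this reflects any vendor's actual code:

```python
# Toy model of adaptive sync frame pacing. Panel limits are hypothetical;
# a real driver would read them from the display's EDID.
PANEL_MIN_HZ = 40.0
PANEL_MAX_HZ = 75.0

def refresh_strategy(fps: float) -> str:
    """Decide how frames rendered at `fps` get scanned out."""
    if fps >= PANEL_MIN_HZ:
        # Inside the variable refresh window: scan out each frame as it completes.
        return f"native VRR at {fps:.0f} Hz"
    # Below the window: repeat each frame enough times to land back inside it.
    multiplier = 2
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    effective = fps * multiplier
    if effective > PANEL_MAX_HZ:
        # Range too narrow to multiply into: stuck at fixed refresh again.
        return f"fall back to fixed {PANEL_MAX_HZ:.0f} Hz (judder/tearing)"
    return f"repeat each frame x{multiplier} -> {effective:.0f} Hz scanout"

for fps in (60, 38, 24, 15):
    print(f"{fps:>3} FPS: {refresh_strategy(fps)}")
```

With a 40 Hz floor and a narrow range, the bottom of the 30-40 FPS band an IGP typically lives in is exactly where this model gets fragile.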
This is actually a really good point. To add to it: Intel only managed 20% despite a 22nm to 14nm die shrink. Even Intel themselves admit it; Iris Pro 6200 looks better because they refuse to compare it with the Iris Pro 5200 parts it directly replaces. 20%. Whoopee-doo.
Also, the eDRAM is a one-trick pony: once you have it, you won't get that kind of boost again. In other words, unless Skylake moves to HBM, I expect the same ~20%, especially since they're staying on 14nm.
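Just to spell out what that assumption implies (the baseline index and the flat 20% per generation are the premise above, not measured data):

```python
# Compounding the assumed ~20% per-generation IGP gain.
perf = 100.0  # arbitrary index for Iris Pro 5200 (Haswell)
for gen in ("Iris Pro 6200 (Broadwell)", "Skylake IGP, if the pattern holds"):
    perf *= 1.20
    print(f"{gen}: index {perf:.0f}")
```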
THG just posted a review of the new Broadwell desktop CPUs. Nothing too interesting on the CPU side, but the integrated Iris Pro 6200 puts up some pretty startling numbers for a low-power integrated GPU. It's certainly not going to replace a 980 Ti, but it destroys anything AMD has in its integrated portfolio.
http://www.tomshardware.com/reviews/intel-core-i7-5775c-i5-5675c-broadwell,4169-6.html
Looks like one helluva HTPC chip that can legitimately game at 1080p.