[Sweclockers] AMD opens up about Freesync


BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
We don't know how g-sync works or what that huge 768MB of memory is for; all we know is what it does, to what degree, and what it can't do. Nvidia aren't forthcoming with such details, so my question is: where do some on this thread get off demanding AMD come out with details on how their unique software/hardware approach to VRR works? :confused:

We have a lot of details of how G-sync works. We know how the vblank signal is used, we know how the polling is being used, the what and why of minimum frame rates, and even how G-sync monitors will be detected. From the perspective of understanding the interface - how the GPU basically hands off a frame and tells the monitor when a new frame is ready - we have more than enough information to know more or less how it works and what its performance characteristics are. So actually we know quite a bit. The intricate details of the module and how it drives the LCD we don't know about, but then I can't think of a single monitor where we do know those sorts of details. The bits we are missing are all module-to-screen related, not GPU-to-module related.
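
A rough toy model of that handoff (purely illustrative - not Nvidia's actual implementation, and the panel limits are assumptions):

[code]
import random
import time

MIN_INTERVAL = 1 / 144  # assumed 144 Hz panel maximum
MAX_INTERVAL = 1 / 30   # assumed 30 Hz panel minimum

def render_frame():
    # Stand-in for the GPU: render time varies from frame to frame.
    time.sleep(random.uniform(0.004, 0.045))

def vrr_loop(frames=10):
    last_scanout = time.monotonic()
    for _ in range(frames):
        render_frame()
        waited = time.monotonic() - last_scanout
        if waited < MIN_INTERVAL:
            # Frame finished too quickly: hold vblank so the panel's
            # maximum refresh rate is never exceeded.
            time.sleep(MIN_INTERVAL - waited)
        elif waited > MAX_INTERVAL:
            # Frame took too long: a real module would re-scan the previous
            # frame to respect the panel's minimum refresh rate.
            print("slow frame: module re-scans the old frame")
        # GPU signals "new frame ready"; the monitor ends vblank and scans it out.
        print(f"scanout {time.monotonic() - last_scanout:.3f}s after the last")
        last_scanout = time.monotonic()

vrr_loop()
[/code]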

On the other hand we have literally nothing from AMD/VESA in terms of a technical brief about the interface between the two. Can you honestly not see the huge difference in quality of information?
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
BrightCandle said:
We have a lot of details of how G-sync works. We know how the vblank signal is used, we know how the polling is being used, the what and why of minimum frame rates, and even how G-sync monitors will be detected. From the perspective of understanding the interface - how the GPU basically hands off a frame and tells the monitor when a new frame is ready - we have more than enough information to know more or less how it works and what its performance characteristics are. So actually we know quite a bit. The intricate details of the module and how it drives the LCD we don't know about, but then I can't think of a single monitor where we do know those sorts of details. The bits we are missing are all module-to-screen related, not GPU-to-module related.

On the other hand we have literally nothing from AMD/VESA in terms of a technical brief about the interface between the two. Can you honestly not see the huge difference in quality of information?
The only difference I see is that we knew nothing of G-sync until it was ready, and yet here we are demanding info on how AMD's use of adaptive sync works several months out.
As for what you say we know, did it come from Nvidia, or is it just forum and reviewer speculation? I ask because I know they were deliberately vague on the details in their big unveil.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
The reason you're hearing so much about A-Sync is so you forget about G-Sync, or hold off on purchasing a G-Sync equipped monitor until an A-Sync one arrives. This is smart marketing regardless of when A-Sync equipped monitors will actually be ready.

Yes, there are a lot of marketing and half-truths behind the released information (isn't there always?), but the one thing that's abundantly clear is that A-Sync will have a much lower bill of materials, as no FPGA module is required to perform similar functionality to G-Sync.

This in turn means A-Sync monitors will cost monitor manufacturers much less money to sell.

The HD DVD vs. Blu-ray comparison is a silly one. Blu-ray won mainly because it had huge industry support and was the default drive in the PS3, which was released at a crucial time during the infancy of the tech. G-Sync really only has Nvidia behind it, trying to sell expensive FPGA modules.

If both technologies are comparable (which they appear to be), monitor manufacturers will simply choose the simpler and cheaper option, as they stand to make more money with it. I don't see why this is so hard to understand.

Yes, there's the initial R&D cost that both AMD (in the cards and drivers) and Nvidia (the FPGA module) have to swallow, but monitor manufacturers will favour the easier and cheaper solution, which I'm betting will be AMD's.

Either way it'll be interesting to see how this all plays out :)
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Leadbox said:
The only difference I see is that we knew nothing of G-sync until it was ready, and yet here we are demanding info on how AMD's use of adaptive sync works several months out.
As for what you say we know, did it come from Nvidia, or is it just forum and reviewer speculation? I ask because I know they were deliberately vague on the details in their big unveil.

Why are you lying like this? Even if we argue that the big reveal from Nvidia didn't contain enough technical detail (it had a lot more than anything AMD has shown), just a little while after came the pcper.com interview with Tom Petersen that lays it all out. Someone uploaded it to youtube: https://www.youtube.com/watch?v=KhLYYYvFp9A

That was October, days after the big reveal, and it's responsible for a lot of the technical info we have. That was 3 months before the release. AMD hasn't done anything like this, not even remotely. It matters as well: we need to know that Freesync is genuinely a competitor, because they haven't yet shown that it is; in fact they have made a few noises that suggest it isn't.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Madpacket said:
The reason you're hearing so much about A-Sync is so you forget about G-Sync, or hold off on purchasing a G-Sync equipped monitor until an A-Sync one arrives. This is smart marketing regardless of when A-Sync equipped monitors will actually be ready.

Yes, there are a lot of marketing and half-truths behind the released information (isn't there always?), but the one thing that's abundantly clear is that A-Sync will have a much lower bill of materials, as no FPGA module is required to perform similar functionality to G-Sync.

This in turn means A-Sync monitors will cost monitor manufacturers much less money to sell.

The HD DVD vs. Blu-ray comparison is a silly one. Blu-ray won mainly because it had huge industry support and was the default drive in the PS3, which was released at a crucial time during the infancy of the tech. G-Sync really only has Nvidia behind it, trying to sell expensive FPGA modules.

If both technologies are comparable (which they appear to be), monitor manufacturers will simply choose the simpler and cheaper option, as they stand to make more money with it. I don't see why this is so hard to understand.

Yes, there's the initial R&D cost that both AMD (in the cards and drivers) and Nvidia (the FPGA module) have to swallow, but monitor manufacturers will favour the easier and cheaper solution, which I'm betting will be AMD's.

Either way it'll be interesting to see how this all plays out :)

You're right, it will be interesting, but as a correction: the G-Sync DIY kit is expensive because the FPGA is expensive. From what NV has stated, G-Sync will transition to being ASIC based, which has a far lower BOM, so G-Sync should be much cheaper down the road. I'd expect some of the upcoming displays that have G-Sync built in to be reasonably priced compared to the competition; the Asus ROG Swift panel will be using an ASIC-based G-Sync module, and I'd assume the Acer, BenQ and Viewsonic models will be as well. Think of ASIC-based G-Sync as being pre-integrated into monitors, whereas the FPGA solution is the DIY kit, which can be installed by a third party or done yourself. In either of those cases, the DIY kit costs too much (due to being FPGA).

Essentially, the FPGA was used for the G-Sync DIY kit to speed it to market. Once it transitions to being integrated into monitors and ASIC based, it will be cheaper. I agree that the DIY kit cost was a big negative, but I don't think that will be a long-term thing - I personally believe the upcoming panels with integrated G-Sync will be fairly priced. They may not be priced exactly the same as non-G-Sync panels, but they should be reasonably priced without a $200 FPGA markup. But we'll see.

So... yeah. It will be interesting to see how it pans out. The ASIC-based G-Sync panels should be out in the next couple of months; FreeSync, not sure when. My personal wish about FreeSync is that AMD would just shut up and deliver the goods. Once the goods are released and purchasable on the market, by all means, make some noise then. But the pre-hype is in such poor taste; they could have done so much better. In any case, like I said, the idea is a sound one. Just shut up about it, though. It's been a continual foot-in-mouth type of thing on AMD's end, with misinformation.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
blackened23 said:
So... yeah. It will be interesting to see how it pans out. The ASIC-based G-Sync panels should be out in the next couple of months; FreeSync, not sure when. My personal wish about FreeSync is that AMD would just shut up and deliver the goods. Once the goods are released and purchasable on the market, by all means, make some noise then. But the pre-hype is in such poor taste; they could have done so much better. In any case, like I said, the idea is a sound one. Just shut up about it, though. It's been a continual foot-in-mouth type of thing on AMD's end, with misinformation.

Poor taste? Maybe, but also smart in some ways, because even without a market-ready product they have taken a bit of the wind out of NVIDIA's sails: suddenly they've planted the idea in the consumer's head that what NVIDIA has been hyping for the last few months might become an industry standard that doesn't lock you into a specific vendor.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
SimianR said:
Poor taste? Maybe, but also smart in some ways, because even without a market-ready product they have taken a bit of the wind out of NVIDIA's sails: suddenly they've planted the idea in the consumer's head that what NVIDIA has been hyping for the last few months might become an industry standard that doesn't lock you into a specific vendor.

It was in poor taste because of outright lies (or unintentional misinformation, I don't know) on the part of their marketing, as I detailed in an earlier post. At this point, due to that, I can't take anything they say seriously. It's hard NOT to be harsh on AMD based on some of the ridiculous claims they've made about FreeSync. BUT what I can take seriously is real hardware that is purchasable. So once that happens, they can yell as loud as they want without having to resort to half-truths. Conversely, while NV's marketing isn't completely innocent (their Titan Z marketing was a screwup), they did not utter a word about G-Sync until it was, essentially, done. It was on the market as a DIY kit shortly afterwards. Granted, I completely agree that the FPGA kit is too costly. No argument here. However, this will be fixed with the ASIC-based integrated G-Sync panels.

As far as taking the wind out of G-Sync, maybe, but it's coming much sooner than FreeSync (my impression anyway), and ASIC-based G-Sync panels should be hitting in the next month or so. Like I said, ASIC-based G-Sync will be far less costly than the FPGA-based DIY kit. So we'll see how things pan out. One thing I am NOT against is more competition. I am all for competition; if AMD steps it up, hey, that's great IMO. A VESA-based standard is all good in my book. But after the half-truths, AMD really does just need to shut up, and start yelling once they are able to get vendors with purchasable monitors on the market. All indications are that won't happen for a long time yet.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
BrightCandle said:
Why are you lying like this? Even if we argue that the big reveal from Nvidia didn't contain enough technical detail (it had a lot more than anything AMD has shown), just a little while after came the pcper.com interview with Tom Petersen that lays it all out. Someone uploaded it to youtube: https://www.youtube.com/watch?v=KhLYYYvFp9A

That was October, days after the big reveal, and it's responsible for a lot of the technical info we have. That was 3 months before the release. AMD hasn't done anything like this, not even remotely. It matters as well: we need to know that Freesync is genuinely a competitor, because they haven't yet shown that it is; in fact they have made a few noises that suggest it isn't.
It was ready; how far out it was is not relevant.
Again, you knew nothing of G-sync until it was ready, yet here we are wanting to know more about something several months out :confused:
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
blackened23 said:
You're right, it will be interesting, but as a correction: the G-Sync DIY kit is expensive because the FPGA is expensive. From what NV has stated, G-Sync will transition to being ASIC based, which has a far lower BOM, so G-Sync should be much cheaper down the road. I'd expect some of the upcoming displays that have G-Sync built in to be reasonably priced compared to the competition; the Asus ROG Swift panel will be using an ASIC-based G-Sync module, and I'd assume the Acer, BenQ and Viewsonic models will be as well. Think of ASIC-based G-Sync as being pre-integrated into monitors, whereas the FPGA solution is the DIY kit, which can be installed by a third party or done yourself. In either of those cases, the DIY kit costs too much (due to being FPGA).

Essentially, the FPGA was used for the G-Sync DIY kit to speed it to market. Once it transitions to being integrated into monitors and ASIC based, it will be cheaper. I agree that the DIY kit cost was a big negative, but I don't think that will be a long-term thing - I personally believe the upcoming panels with integrated G-Sync will be fairly priced. They may not be priced exactly the same as non-G-Sync panels, but they should be reasonably priced without a $200 FPGA markup. But we'll see.

So... yeah. It will be interesting to see how it pans out. The ASIC-based G-Sync panels should be out in the next couple of months; FreeSync, not sure when. My personal wish about FreeSync is that AMD would just shut up and deliver the goods. Once the goods are released and purchasable on the market, by all means, make some noise then. But the pre-hype is in such poor taste; they could have done so much better. In any case, like I said, the idea is a sound one. Just shut up about it, though. It's been a continual foot-in-mouth type of thing on AMD's end, with misinformation.

Interesting, thanks for the info. I wasn't aware Nvidia was going the ASIC route, but I guess it makes sense given enough volume can be moved. Hopefully AMD demos something soon showing they actually have something good, as it's going to be difficult to hold off if I start seeing an influx of G-Sync capable monitors.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Leadbox said:
It was ready; how far out it was is not relevant.
Again, you knew nothing of G-sync until it was ready, yet here we are wanting to know more about something several months out :confused:

I want to know more if AMD thinks it can poke holes in G-Sync. If AMD wants to say "Yeah, Nvidia beat us to it, enjoy their product for the next year. We think ours might be better, but maybe not," then that would be an honest approach.

Instead, everything they've done has been a calculated attack on G-Sync, one designed solely to spread FUD about G-Sync and get people to not buy it because "FREE OPEN WEEEEE." There's marketing spin, and then there's outright deception to sabotage a competitor when you have absolutely nothing to compete with them yourself. That's what AMD is doing, and I don't like it. What I like even less is that it's working, which is why I make posts like these, to explain the actual truth that they are intent on obscuring as best they can.

And we know a lot about how G-Sync works, and did from the very beginning. The two aren't even remotely comparable. Not only do we have detailed information about what G-Sync does, we have, you know, actual hardware that has undergone third-party testing and characterization. To compare that to FreeSync, which for all we know is nothing but a line in a Word document, based on the presented information, is so blatantly wrong that you'd have to be deliberately trying to generate false equivalency by doing so.
 

NomanA

Member
May 15, 2014
134
46
101
Considering the quote below from the article,

Make no mistake, providing dynamic refresh rates to users still takes a lot of ‘secret sauce’ from the hardware and software ends of our products, including the correct display controllers in the hardware and the right algorithms in AMD Catalyst.

I hope that folks realize the AMD rep is talking about the display controllers in their graphics cards, and not the monitor control board. They specifically mention that while all cards from the 7000 series onwards support adaptive refresh rates to some extent (video, power saving), only the R9 290 and R7 260 series have the additional display controller updates to support dynamic refresh in games.

The way some of the comments are worded here, it seems as if the "secret sauce" is needed in the monitor, which isn't what they said. Of course the monitors have to support this optional part of the DisplayPort 1.2a spec, but that's not what the AMD rep was talking about.

Consider this comment,

This guy being interviewed is marketer scum through and through, complete with half-truths and vague statements. AMD originally stated that free-sync doesn't require new hardware. Guess what, it requires monitors with different controllers, which is ............... new hardware.

That's what I was talking about. I think the poster here who did the point-by-point rebuttal for the article mistook that part a bit, and then a few others ran off with it. Like I said, monitors have to get through the spec compliance requirements and have the additional control logic (not expensive or proprietary, according to the rep), but that much was known from the beginning and is hardly a surprise.

Secondly, the AMD rep mentioned the Hawaii and Bonaire development timeline only because he was asked whether FreeSync was an answer to GSync. He didn't say that AMD came up with the idea first, rather that these two series of long-in-development GPUs have specific display controller tweaks (the secret sauce he mentions later) to handle the dynamic refresh rates used for FreeSync.
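
To make the distinction concrete, here's a toy sketch of the support matrix as described (the GPU grouping is my illustration, not AMD's actual driver tables):

[code]
# Illustrative only: per the interview, all GCN cards do some adaptive refresh
# (video playback, power saving), but only Hawaii/Bonaire display controllers
# handle the per-frame dynamic refresh that gaming FreeSync needs.
DRR_SUPPORT = {
    "video/power-saving": {"HD 7000 series", "R9 280", "R9 290", "R7 260"},
    "gaming":             {"R9 290", "R7 260"},  # Hawaii / Bonaire
}

def freesync_capability(gpu: str) -> str:
    if gpu in DRR_SUPPORT["gaming"]:
        return "dynamic refresh in games"
    if gpu in DRR_SUPPORT["video/power-saving"]:
        return "adaptive refresh for video/power saving only"
    return "unsupported"

print(freesync_capability("R9 290"))  # dynamic refresh in games
print(freesync_capability("R9 280"))  # adaptive refresh for video/power saving only
[/code]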

In any case, the article and the interview answers are fairly straightforward. The anger and hostility shown in the feedback by certain posters are quite idiotic.
 

sirroman

Junior Member
Aug 23, 2013
17
0
0
If you want proof, evidence, and cited sources, please read this thread:

http://forums.anandtech.com/showthread.php?t=2382628

Everything I mentioned is contained there. Real, sourced evidence, most of it quotes from AMD itself.

But, because they say things that certain people here don't like, they feel the need to harass me and claim I have some kind of untoward agenda.

As for calling them out on their lie: not only did they say otherwise in this very article, which you seem to be ignoring, they've said the same thing as in that quote in other interviews. Repeating the lie, and repeatedly backing away from it when pressed by people who actually looked at what they said, does not make the lie true or the backing away false.

Read for yourself, think for yourself. There's no point in rehashing the same arguments. This interview gives very little that's new, and what is new does not substantively affect their claims or the position of FreeSync as nothing more than words on a press release.

But, of course, rather than accepting that AMD might just really be horrendously misrepresenting the true nature of FreeSync, it's far simpler to just attack me because you don't like what I'm saying. I mean, how could I possibly have useful things to say, if I make such a long post that people can't be bothered to read it?

I asked for proof of one of your accusations, and you asked me to hunt for it in a closed topic with 210 posts.

I will ask again: provide one source or proof that G-Sync is (or will be) cheaper than A-Sync.

If you don't respond, I'm sure we can all make up our minds about you "by ourselves".
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,809
1,289
136
All we need to see is "Adaptive Sync Ready" monitors. Ready = firmware-upgradeable to support Adaptive-Sync. Monitors already out with DisplayPort 1.2a would just need to redo the compliance check for Adaptive-Sync. Then, if they pass, the marketing for those already-released monitors can be changed to say "Adaptive Sync Ready."
 

Mand

Senior member
Jan 13, 2014
664
0
0
NomanA said:
Considering the quote below from the article,

Make no mistake, providing dynamic refresh rates to users still takes a lot of ‘secret sauce’ from the hardware and software ends of our products, including the correct display controllers in the hardware and the right algorithms in AMD Catalyst.

I hope that folks realize the AMD rep is talking about the display controllers in their graphics cards, and not the monitor control board.

No, I hope folks don't realize that, because that would be completely wrong, even according to AMD's statements. The display controller in the display (you know, because it's a display controller) is what is doing the work. AMD itself said that FreeSync would require a compatible controller in the display. In the display.
 

Mand

Senior member
Jan 13, 2014
664
0
0
sirroman said:
I asked for proof of one of your accusations, and you asked me to hunt for it in a closed topic with 210 posts.

I will ask again: provide one source or proof that G-Sync is (or will be) cheaper than A-Sync.

If you don't respond, I'm sure we can all make up our minds about you "by ourselves".

I never claimed it will be cheaper. What I said was that AMD's claim that their version will be cheaper can't be substantiated.

You can't be mad when I don't provide proof for something I didn't claim. Well, you could, but then we can all make up our minds about you "by ourselves."

Stop setting fire to the strawmen, please.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,809
1,289
136
The monitor needs to support DP 1.2a in its ASIC.
The GPU needs to support DP 1.2a in its integrated ASIC.

If the monitor supports DP 1.2a, it already has the "secret sauce."

The reason Southern Islands and older GPUs do not support Adaptive-Sync is that they only support DP 1.2/1.1. If you don't support MST or HBR2, you don't support Adaptive-Sync.
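
A toy version of that check, under this post's assumptions (the DPCD register offsets are the ones the DisplayPort spec defines, as mirrored in Linux's drm_dp_helper.h; read_dpcd() is a hypothetical stand-in for an AUX-channel read):

[code]
DP_MAX_LINK_RATE   = 0x001  # 0x14 == HBR2 (5.4 Gbps per lane)
DP_DOWNSTREAM_PORT = 0x007  # bit 6: MSA_TIMING_PAR_IGNORED
DP_MSTM_CAP        = 0x021  # bit 0: sink is MST capable

def supports_adaptive_sync(read_dpcd) -> bool:
    hbr2 = read_dpcd(DP_MAX_LINK_RATE) >= 0x14
    mst = bool(read_dpcd(DP_MSTM_CAP) & 0x01)
    # A sink that can ignore the MSA timing parameters can accept a variable vblank.
    variable_vblank = bool(read_dpcd(DP_DOWNSTREAM_PORT) & (1 << 6))
    return hbr2 and mst and variable_vblank

# Example with a fake sink that advertises all three capabilities:
fake_dpcd = {0x001: 0x14, 0x007: 1 << 6, 0x021: 0x01}
print(supports_adaptive_sync(lambda reg: fake_dpcd[reg]))  # True
[/code]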
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
NomanA said:
In any case, the article and the interview answers are fairly straightforward. The anger and hostility shown in the feedback by certain posters are quite idiotic.

AMD said they have supported variable refresh for two or three generations.
The reality is that the relevant part - for gaming - will only be supported by Hawaii and Bonaire.

He explained that this particular laptop's display happened to support a feature that AMD has had in its graphics chips "for three generations": dynamic refresh rates.
http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

AMD said that displays don't need new controllers or logic boards.
The fact is that display manufacturers need to change the board to support Adaptive-Sync.

Adding support in a monitor should be essentially "free" and perhaps possible via a firmware update.
http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech

6-12 months for retail monitors.

Now that the DP spec has been amended with DPAS, scaler vendors must update their firmware/scaler roadmaps to incorporate the spec, which will lead to DRR-ready scalers that monitor vendors can incorporate, which will ultimately lead to DRR-ready monitors.

The spec paves the way for R&D on working hardware.
http://forums.overclockers.co.uk/showpost.php?p=26336247&postcount=4

/edit: Computerbase.de did an interview (in German) with Asus, and Asus said that they need a new scaler to support Adaptive-Sync, just like G-Sync:
http://www.computerbase.de/2014-05/adaptive-sync-auch-fuer-asus-und-iiyama-ein-thema/
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Just a couple of things:

1. The monitors are what need to come first. AMD doesn't make monitors, so they are not the ones who are behind or need X number of months to catch up. They used their leverage to get the standard in place quickly. I'm sure the optional status was to help speed it up.

2. If G-Sync is such great tech and it's been available for all these months, who here talking it up has actually bought it?

Adaptive-Sync monitors will arrive soon enough. When that happens, AMD will be able to introduce FreeSync. AMD has gotten the standard in place and is offering support to the monitor companies. There's nothing more they can do at the moment. In the meantime, people can pay $500 for a $260 monitor with the G-Sync feature, or soon $800 for a 1440p TN panel with it. For those who don't mind paying additional premiums for proprietary nVidia tech and being locked into only nVidia cards on budget TN panels, go ahead and buy it.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
With Nvidia not supporting Freesync, it will basically be proprietary to AMD. Since it's an optional part of the spec, it will only be on some monitors specifically targeting AMD's hardware. Despite the fact that it's written into the spec, the reality for the coming year is that both implementations are going to end up proprietary, the main difference being that AMD isn't making the scaler changes whereas Nvidia is. It's funny how that works out: unless both companies can agree to do this the same way and support a common standard, there isn't really a standard at all. Who cares what Intel does; there are 2 GPU makers, and if we want to be able to migrate from one to the other when we buy cards, then they both need to implement a common thing, and we need monitors that support both. That isn't happening right now.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
BrightCandle said:
With Nvidia not supporting Freesync, it will basically be proprietary to AMD. Since it's an optional part of the spec, it will only be on some monitors specifically targeting AMD's hardware. Despite the fact that it's written into the spec, the reality for the coming year is that both implementations are going to end up proprietary, the main difference being that AMD isn't making the scaler changes whereas Nvidia is. It's funny how that works out: unless both companies can agree to do this the same way and support a common standard, there isn't really a standard at all. Who cares what Intel does; there are 2 GPU makers, and if we want to be able to migrate from one to the other when we buy cards, then they both need to implement a common thing, and we need monitors that support both. That isn't happening right now.

On pure volume alone you can't discount Intel.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Freesync is AMD's unique hardware/software adaptation and use of the optional Adaptive-Sync spec in the DP 1.2a standard. This is a very important distinction.
A lot of monitor manufacturers use the same scalers, so going forward (assuming the tech works as advertised) there's no reason why, once the scalers that support this come out, any new monitor won't offer this functionality, unless it's G-Sync capable instead.
 

DigDog

Lifer
Jun 3, 2011
14,449
2,874
126
I think Mand's post is very reasonable and intelligent. So here is my bet:
in two years, G-Sync will be in pretty much all gaming-grade monitors and FreeSync will not exist at all.

– We expect Project FreeSync-ready monitors to be available in retail within 6-12 months, and prototypical monitors suitable for tradeshows or press demonstrations to be ready within 4-10 months.

You know what's sad about this? Not only are they admitting to basically being incapable of getting a product on the market, but if *any* of that interview is real, and all they do is a bit of algorithm wizardry with open standards, then Nvidia will have it figured out and on the market way before FreeSync hits the tradeshows.
 

caswow

Senior member
Sep 18, 2013
525
136
116
DigDog said:
I think Mand's post is very reasonable and intelligent. So here is my bet:
in two years, G-Sync will be in pretty much all gaming-grade monitors and FreeSync will not exist at all.

– We expect Project FreeSync-ready monitors to be available in retail within 6-12 months, and prototypical monitors suitable for tradeshows or press demonstrations to be ready within 4-10 months.

You know what's sad about this? Not only are they admitting to basically being incapable of getting a product on the market, but if *any* of that interview is real, and all they do is a bit of algorithm wizardry with open standards, then Nvidia will have it figured out and on the market way before FreeSync hits the tradeshows.

So the same can be said about DX12?
 

MeldarthX

Golden Member
May 8, 2010
1,026
0
76
DigDog said:
I think Mand's post is very reasonable and intelligent. So here is my bet:
in two years, G-Sync will be in pretty much all gaming-grade monitors and FreeSync will not exist at all.

– We expect Project FreeSync-ready monitors to be available in retail within 6-12 months, and prototypical monitors suitable for tradeshows or press demonstrations to be ready within 4-10 months.

You know what's sad about this? Not only are they admitting to basically being incapable of getting a product on the market, but if *any* of that interview is real, and all they do is a bit of algorithm wizardry with open standards, then Nvidia will have it figured out and on the market way before FreeSync hits the tradeshows.


Computex - we know a prototype will be there.

Second, G-Sync in all gaming monitors? Seriously, you mean like Nvidia's 3D is in all gaming monitors?

Not going to happen; G-Sync doubles the cost of a monitor, while putting a 1.2a-standard DP into a monitor and then fixing the firmware to use Adaptive-Sync won't double the cost of the monitor, but will most likely add a few dollars/pounds.

Which one do you think will be used? Common sense: 1.2a, because it's the cheapest solution. That's why 16:9 won out - it matches the TV format and was the cheapest solution for monitor makers to do.

G-Sync will have to come down in price, or it will only make it to a couple of models, if that, before passing into the night - unless it incorporates Adaptive-Sync, which I think it will.

Sadly, people are forgetting Intel; they own half the market, and this is something that would seriously help their craptastic iGPUs. They are the dark horse here.
 