[TechPowerUp article] FreeSync explained in more detail


Arkadrel

Diamond Member
Oct 19, 2010
Kinda off topic but...

I don't believe the performance cost of 4K is worth the visual quality improvement at this point in time.

How many people can run newer games at 4K resolution? Very, very few.

I honestly wonder if at some point PC monitors will reach the point of "good enough", just like we're seeing with CPUs, where each upgrade's improvements are smaller and smaller, and cost more and more.

Something like G-Sync or FreeSync is needed.
The impact these technologies can have matters more than 4K does.


If I were in the business of developing monitors, my R&D wouldn't be going into 4K or ever-higher resolutions, but instead into making them as power-efficient as possible, and compliant with both G-Sync and FreeSync, so I could sell to both consumer bases either way.

I think that'd be a safer bet for a monitor maker than chasing 4K or higher resolutions.
 

XiandreX

Golden Member
Jan 14, 2011
The issue is that for most people, TN panels are the better fit. IPS screens, while far superior in color and viewing angles, really stink in response time. IPS screens that overclock to improve response time/lag are few and far between. Most IPS screens simply don't have a focus on gaming.

Small edit to clarify things: I understand there are some IPS panels that are decent in response time, but most are not.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
The issue is that for most people, TN panels are the better fit. IPS screens, while far superior in color and viewing angles, really stink in response time. IPS screens that overclock to improve response time/lag are few and far between. Most IPS screens simply don't have a focus on gaming.

Small edit to clarify things: I understand there are some IPS panels that are decent in response time, but most are not.

Depends what games you play. Not everybody is an FPS gamer; RTS games look pretty glorious on IPS.
 

XiandreX

Golden Member
Jan 14, 2011
Depends what games you play. Not everybody is an FPS gamer; RTS games look pretty glorious on IPS.

I will give you that point for sure. However, if you look at games in general, most genres have more motion than RTS games.

Flight, FPS, sandbox, 2D/3D fighting, adventure, driving, sports, platformer, MMO, and RPG games all prefer a faster response time. I understand there are exceptions, and some people might not mind a bit of latency/lag, but it definitely bothers me and I know it bothers a fair number of people out there.

I also want to mention that I love IPS screens; I just wish they had lower input lag.
 

blackened23

Diamond Member
Jul 26, 2011
This is what Asus had to say about panel technologies, IPS versus TN, in terms of gaming:

http://www.bit-tech.net/news/hardware/2014/01/07/asus-pg278q-g-sync-pb287q-gaming-monitor/1

"Not all TN’s are made the same: the premium panel used in the PG278Q is of very high quality. IPS panels (and their derivatives like PVA/MVA etc) are not suitable for a multitude of reasons: 1) the response rate is simply not fast enough to react to the active change in refresh rate and 2) They cannot reliably achieve >60Hz without significantly affecting the quality of the image. IGZO technology (and LTPS – low temperature polysilicon – likewise) – yields 100′s of times faster electron mobility versus standard amorphous silicon panels – and thus can provide a response rate comparable to TN (up to 60Hz currently), but, however desirable this technology is, it is still currently cost prohibitive for many PC gaming enthusiasts in 2014, which is why ROG has used a better price:performance, high quality TN panel."

So Asus confirms that a strong response time is not possible on IPS panels. But it is possible on Sharp IGZO panels, which are comparable to TN panels in terms of response time.

Following my gut instinct and the logic from this thread, I guess that means true gamers will be getting 4K IGZO panels? IGZO 4K panels provide the most immersion, the best viewing angles, and the best color quality. Since the IGZO 4K panels cost $3,500, that's a drop in the bucket for proper response times and true gaming. That's what the majority of gamers will buy, I highly suspect: the $3,500 4K IGZO. But if G-Sync adds another $200 on top of that, I'm not so sure. Eh. $3,500 is acceptable, but $3,700 isn't.
 
Last edited:

NIGELG

Senior member
Nov 4, 2009
This is what Asus had to say about panel technologies, IPS versus TN, in terms of gaming:

http://www.bit-tech.net/news/hardware/2014/01/07/asus-pg278q-g-sync-pb287q-gaming-monitor/1



So Asus confirms that a strong response time is not possible on IPS panels. But it is possible on Sharp IGZO panels, which are comparable to TN panels in terms of response time.

Following my gut instinct and the logic from this thread, I guess that means true gamers will be getting 4K IGZO panels? IGZO 4K panels provide the most immersion, the best viewing angles, and the best color quality. Since the IGZO 4K panels cost $3,500, that's a drop in the bucket for proper response times and true gaming. That's what the majority of gamers will buy, I highly suspect: the $3,500 4K IGZO. But if G-Sync adds another $200 on top of that, I'm not so sure. Eh. $3,500 is acceptable, but $3,700 isn't.
What is a true gamer??
 

blackened23

Diamond Member
Jul 26, 2011
I don't know, bro, you tell me! I've read pages and pages telling me that true gamers only buy the more expensive technologies, and that they're buying 4K IPS panels by the droves. I'm just following the logic presented in this thread! I don't have all the answers; sometimes I make mistakes. I could be wrong. If I made a mistake, let me know, cuz that's how we all learn and grow, right? That's what 3dvagabond stated, and I completely agree with him and respect that. We learn and grow from repeated mistakes.

Right? I don't know what the true gamer is. I know what I prefer, but I'm just going by what the popular opinion in this thread is. :| Clearly that opinion is:

#1) Buy IPS
#2) Buy the most expensive tech possible
#3) 4k is selling like hotcakes

With 1-3 being the case, and given what Asus indicated, clearly 4K IGZO panels are the way to go for gaming. That's what I've inferred from this thread. So, following that logic, I guess that's what people participating in this thread think the true gamer is!

Personally I don't know! You tell me!
 
Last edited:

XiandreX

Golden Member
Jan 14, 2011
This is what Asus had to say about panel technologies, IPS versus TN, in terms of gaming:

http://www.bit-tech.net/news/hardware/2014/01/07/asus-pg278q-g-sync-pb287q-gaming-monitor/1



So Asus confirms that a strong response time is not possible on IPS panels. But it is possible on Sharp IGZO panels, which are comparable to TN panels in terms of response time.

Following my gut instinct and the logic from this thread, I guess that means true gamers will be getting 4K IGZO panels? IGZO 4K panels provide the most immersion, the best viewing angles, and the best color quality. Since the IGZO 4K panels cost $3,500, that's a drop in the bucket for proper response times and true gaming. That's what the majority of gamers will buy, I highly suspect: the $3,500 4K IGZO. But if G-Sync adds another $200 on top of that, I'm not so sure. Eh. $3,500 is acceptable, but $3,700 isn't.

Blackened... you make me chuckle. It's true though... that's not a lot of money. :hmm:
 

NTMBK

Lifer
Nov 14, 2011
I don't know, bro, you tell me! I've read pages and pages telling me that true gamers only buy the more expensive technologies, and that they're buying 4K IPS panels by the droves. I'm just following the logic presented in this thread! I don't have all the answers; sometimes I make mistakes. I could be wrong. If I made a mistake, let me know, cuz that's how we all learn and grow, right? That's what 3dvagabond stated, and I completely agree with him and respect that. We learn and grow from repeated mistakes.

Right? I don't know what the true gamer is. I know what I prefer, but I'm just going by what the popular opinion in this thread is. :| Clearly that opinion is:

#1) Buy IPS
#2) Buy the most expensive tech possible
#3) 4k is selling like hotcakes

With 1-3 being the case, and given what Asus indicated, clearly 4K IGZO panels are the way to go for gaming. That's what I've inferred from this thread. So, following that logic, I guess that's what people participating in this thread think the true gamer is!

Personally I don't know! You tell me!

We're in a forum full of people who buy multiple Titans and six-core Intel processors. What did you really expect?

I'm pretty happy with my HD7770 and cheap 1080p TN monitor, personally.
 

NIGELG

Senior member
Nov 4, 2009
I don't know, bro, you tell me! I've read pages and pages telling me that true gamers only buy the more expensive technologies, and that they're buying 4K IPS panels by the droves. I'm just following the logic presented in this thread! I don't have all the answers; sometimes I make mistakes. I could be wrong.

Right? I don't know what the true gamer is. I know what I prefer, but I'm just going by what the popular opinion in this thread is. :| Clearly that opinion is:

#1) Buy IPS
#2) Buy the most expensive tech possible
#3) 4k is selling like hotcakes

That's what I've inferred from this thread. So, following that logic, I guess that's what people participating in this thread think the true gamer is!

Personally I don't know! You tell me!
A true gamer is simply anyone who plays video games. Period.

New, cool tech like FreeSync or G-Sync is welcome, but ultimately it's about playing video games. When I look at some posters here, I have to wonder whether they actually play video games, or just post on forums and push marketing agendas...
 

Lonbjerg

Diamond Member
Dec 6, 2009
A true gamer is simply anyone who plays video games. Period.

New, cool tech like FreeSync or G-Sync is welcome, but ultimately it's about playing video games. When I look at some posters here, I have to wonder whether they actually play video games, or just post on forums and push marketing agendas...

Try reading the Mantle thread :whistle:
 

sushiwarrior

Senior member
Mar 17, 2010
Well, my two cents here.

1. G-Sync sounds like the more advanced (i.e. likely to perform better) implementation, just because it uses more hardware and a frame buffer.

2. Proprietary tech is the last thing I ever want to see in the world. It ends up either killing technologies with potential (PhysX) because it ruins adoption and fragments the market...

3. ...or becoming a giant money grab, since companies (Nvidia in particular has a history here) will dig their hands as far into your pockets as they can. You're an exploitable, walking bag of money to them. And they will charge as much as possible for G-Sync for as long as they can. Now, me? I'm cheap. I want my money to go a long way. I don't want to pay $200-300 for this technology. Maybe G-Sync will get that cheap once they actually make an ASIC for it (not that likely, considering the 768MB of RAM as well, and the fact they only use it with TN panels that cost twice what any TN panel ought to, but whatever).

4. The OTHER option is that we have a (likely) less effective but essentially free technology, which BOTH companies can use. Knowing Nvidia, they likely won't adopt it, since they only want their customers to have the option through them and at a high cost, but I digress. I like this option. If the feature is included in future standards, or at least in some monitors which sell at a somewhat reasonable rate, then this is great news. Maybe the monitor will be more expensive, maybe it will be a 1080p TN panel, maybe it will cost $300... oh wait, that's what you pay for a G-Sync monitor BEFORE you even factor in the cost of the module ;D. My point is, I don't know why anyone would not want a technology which (ideally; maybe it won't work well at all, what do any of us know?) performs almost identically and is simply a checkmark in a feature-set box. Or as simple as getting a DP 1.3 monitor. You get the point. That's what I want.

5. G-Sync should be excellent on IPS monitors, because the "useful range" of 30-60Hz is where tearing is most visible, and IPS monitors support refresh rates in that region. Tearing at 120Hz+ is much less of a visible issue...
 

blackened23

Diamond Member
Jul 26, 2011
I was just watching a CES presentation done by Tech of Tomorrow, where he was interviewing some guy from Nvidia. Funnily enough, that dude employed by Nvidia seemed to think that G-Sync is a much more advanced implementation. He also went on to mention that there are a lot of "hardcore" PC gamers working at Nvidia, and that they work hard to bring the best technology to PC gamers.

I would think that an Nvidia employee would have a skewed interpretation of things, but that's just my suspicion. So take any interview with an Nvidia employee with a grain of salt; he's gotta be biased. I mean, he's always going to be in favor of his employer and never view things objectively. Just like AMD employees: I don't know if they're biased, but there's always the possibility. But I haven't found a video of an AMD employee being interviewed about FreeSync. Anyone know of an AMD employee that has been interviewed about FreeSync? I mean, they'll give us the unbiased scoop, right? AMD employees are full of objectivity? Anyway, despite the bias of the Nvidia employee in question, here's a link to the YouTube video:

Like I said. Grain of salt. Nvidia employee:

http://www.youtube.com/watch?v=A_ha6RKoNOc&list=TLPKEOid0Gx1XfzF4KjiuFgQz7UvJK0L7B

From what I understand, G-Sync is coming to every monitor resolution from 1080p to 4K. Since it's going to 4K, that would obviously include IPS? I would think. Someone correct me if I'm wrong. Or maybe that's IGZO.

So it's not like G-Sync is limited to 1080p. It's coming to 4K as well, and is working right now on 4K. So for any nuts that want to spend a ton of cash on an IGZO panel, it's coming :p

Was there an AMD demo of FreeSync working on 4K? Just curious. I just saw the demo on a laptop running 1366x768. I don't think that's the resolution AMD is aiming for with FreeSync, is it? I mean, they're not aiming it at laptops, I'd assume. I'd think they're gunning for desktop screens. But AMD hasn't said either way, so I don't really know. Anyone know?
 
Last edited:

Mand

Senior member
Jan 13, 2014
I honestly wonder if at some point PC monitors will reach the point of "good enough"

Yes, undoubtedly. Any display technology is ultimately limited by our eyes' ability to see it. Beyond a certain threshold, adding more pixels or refresh rate just won't make a difference.

Note, though, that while the limit on resolution is relatively easy to determine (based on display size and distance from the viewer), limits on refresh rate are a whole lot more complicated. There is what is called the flicker fusion frequency, after which something that is pulsing on and off looks continuous, but that depends hugely on what you're actually looking at: brightness, color, other things in your field of view, etc. And on-off flicker is different from anything that's moving. Even if you can't see flicker in a static image at a particular refresh rate, you may start to notice jumpiness in something that's moving.
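To put rough numbers on the resolution side, here's a quick back-of-the-envelope sketch, assuming the common ~1 arcminute visual acuity rule of thumb (the real threshold varies per person and per content, so treat this as illustrative only):

```python
import math

def max_useful_ppi(viewing_distance_inches, acuity_arcmin=1.0):
    """Pixel density beyond which a viewer with the given visual acuity
    can no longer resolve individual pixels at this viewing distance."""
    # Smallest feature the eye can resolve at this distance (inches)
    feature_in = viewing_distance_inches * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / feature_in

# A desktop monitor viewed from about 24 inches:
print(round(max_useful_ppi(24)))  # ~143 PPI
# For reference: a 27" 1440p panel is ~109 PPI; a 27" 4K panel is ~163 PPI.
```

By this crude measure, a 27" 4K panel viewed from two feet is already past the point where extra pixels stop being resolvable.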

But even for refresh rate there is an upper limit, because there's an upper limit to how fast things can move on the screen while we can still track them. The 120Hz-144Hz range is probably going to be enough for a good long while. A 240Hz display will be useful for something like 3D, but there's not going to be nearly as much benefit for normal viewing in going from 120 to 240 as there is in going from 60 to 120, and going from 60 to 120 isn't nearly as big as going from 30 to 60.
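The diminishing returns are easier to see in frame times than in hertz; a small worked example:

```python
# Time saved per frame at each refresh-rate doubling (milliseconds).
# Each doubling halves the absolute improvement of the previous one.
for low, high in [(30, 60), (60, 120), (120, 240)]:
    gain_ms = 1000 / low - 1000 / high
    print(f"{low} -> {high} Hz: {gain_ms:.1f} ms shorter frame time")

# 30 -> 60 Hz: 16.7 ms, 60 -> 120 Hz: 8.3 ms, 120 -> 240 Hz: 4.2 ms
```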

So, yes. There will be a point where monitors have enough refresh rate and resolution that any further improvement is just number porn and doesn't actually translate to an improved user experience. The talk about cable standards to support 8K displays, for example... 8K is just going to be total overkill. If your eye can't register the difference, it doesn't help.
 

sushiwarrior

Senior member
Mar 17, 2010
I was just watching a CES presentation done by Tech of Tomorrow, where he was interviewing some guy from Nvidia. Funnily enough, that dude employed by Nvidia seemed to think that G-Sync is a much more advanced implementation. He also went on to mention that there are a lot of "hardcore" PC gamers working at Nvidia, and that they work hard to bring the best technology to PC gamers.

I would think that an Nvidia employee would have a skewed interpretation of things, but that's just my suspicion. So take any interview with an Nvidia employee with a grain of salt; he's gotta be biased. I mean, he's always going to be in favor of his employer and never view things objectively. Just like AMD employees: I don't know if they're biased, but there's always the possibility. But I haven't found a video of an AMD employee being interviewed about FreeSync. Anyone know of an AMD employee that has been interviewed about FreeSync? I mean, they'll give us the unbiased scoop, right? Anyway, despite the bias of the Nvidia employee in question, here's a link to the YouTube video:

Like I said. Grain of salt. Nvidia employee:

http://www.youtube.com/watch?v=A_ha6RKoNOc&list=TLPKEOid0Gx1XfzF4KjiuFgQz7UvJK0L7B

From what I understand, G-Sync is coming to every monitor resolution from 1080p to 4K. Since it's going to 4K, that would obviously include IPS? I would think. Someone correct me if I'm wrong. Or maybe that's IGZO.

So it's not like G-Sync is limited to 1080p. It's coming to 4K as well, and is working right now on 4K. So for any nuts that want to spend a ton of cash on an IGZO panel, it's coming :p

Was there an AMD demo of FreeSync working on 4K? Just curious. I just saw the demo on a laptop running 1366x768. I don't think that's the resolution AMD is aiming for with FreeSync, is it? I mean, they're not aiming it at laptops, I'd assume. I'd think they're gunning for desktop screens. But AMD hasn't said either way, so I don't really know. Anyone know?

Every person in the world is biased; there is no such thing as "unbiased", just varying degrees of bias.

I agree that G-Sync is a more "advanced" implementation - it has a frame buffer and an FPGA. Expensive, complicated, and advanced, yes.

I suppose it might need a larger frame buffer for higher resolutions (but then again, the cost of G-Sync obviously is not as much of an issue compared to the cost of a 4K monitor; Nvidia will be more than happy to raise the price there).
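For scale, a single raw frame is small next to the module's reported 768MB. A quick sketch, assuming uncompressed 24-bit color (the module's actual internal format isn't public, so these are illustrative numbers only):

```python
def frame_mb(width, height, bytes_per_pixel=3):
    """Size of one uncompressed 24-bit frame, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {frame_mb(w, h):.1f} MB per frame")

# 1080p: 5.9 MB, 1440p: 10.5 MB, 4K: 23.7 MB
```

Even a 4K frame fits in that memory dozens of times over, so if the price rises with resolution, buffer size alone probably isn't the reason.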

The implementation AMD is using has nothing to do with resolution, but rather with the display connectivity. It could be 8K resolution, or IMAX, or 640x480. All that matters is that the display has the ability to use VBLANK, with a secondary packet sent along with the image to the display. G-Sync has more of a hardware limitation, since Nvidia is avoiding standards and going about it in a roundabout way.
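Conceptually, the variable-refresh part is pure timing, which is why resolution doesn't matter. A minimal sketch of the driver-side logic (all names here are hypothetical, not any real driver API):

```python
def refresh_interval_ms(frame_render_ms, panel_min_hz=30, panel_max_hz=60):
    """Stretch the vertical blanking interval so the panel refreshes
    exactly when the next frame is ready, clamped to the refresh
    range the panel says it supports."""
    shortest = 1000.0 / panel_max_hz  # can't refresh faster than the panel's max
    longest = 1000.0 / panel_min_hz   # can't hold a frame past the panel's min
    return min(max(frame_render_ms, shortest), longest)

# A frame that took 22 ms to render is displayed for exactly 22 ms
# (~45 Hz): no tear, and no waiting for a fixed 16.7/33.3 ms boundary.
print(round(refresh_interval_ms(22.0), 1))  # 22.0
print(round(refresh_interval_ms(5.0), 1))   # 16.7 (clamped to 60 Hz)
print(round(refresh_interval_ms(50.0), 1))  # 33.3 (clamped to 30 Hz)
```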

FreeSync is, in its current state, a step away from being vaporware. But I'm glad they have an idea, at least. I'd rather have the hope of a free solution than Nvidia coming right out of the gate demanding you empty your wallet if you want a fix. Every product is "vaporware" until it gets to a certain point.
 

blackened23

Diamond Member
Jul 26, 2011
Well, I certainly don't disagree that FreeSync is closer to vaporware and proof of concept. In fact, I agree with you completely, sushi, that FreeSync is vaporware at this point. :) I just wonder why they chose to demonstrate it when they did. Nvidia didn't show G-Sync until it was done. I'd assume that AMD wouldn't show FreeSync until it's done either, but it's increasingly looking like it won't be around until 2015.

We'll see, though. I do have to question the claim that only Nvidia's approach relies on hardware, because PCPer questioned AMD's vice president of visual computing, Koduri, and he stated and confirmed that AMD's FreeSync will require the monitor to have a variable refresh control board, just like G-Sync does. Also, the FPGA is only being used for now; it will transition to an ASIC later in Q1. FPGAs are expensive; ASICs aren't. At least, that's what Nvidia is stating. So their assumption is that G-Sync costs will be substantially lower by the end of Q1 2014, since monitor manufacturers will switch from FPGA to ASIC G-Sync modules. Does anyone have concrete data points on FPGA versus ASIC costs? If not, I'll go dig that up.

Someone correct me if I'm wrong. Since FreeSync requires monitors to have a variable-refresh-aware module, just like G-Sync does (this is per AMD's Koduri, BTW), that would indicate that FreeSync isn't free? Wouldn't that mean that FreeSync costs money in literally the same area that G-Sync does? I dunno. I don't expect FreeSync to exist until 2015, so maybe costs for the control board will be substantially lower by then. The "free" part I'm not sure about, however, unless someone can clarify that for me. I could really be missing something here; I'm not sure.
 
Last edited:

Mand

Senior member
Jan 13, 2014
sushiwarrior,

I'm not clear on why the DP 1.3 standard, which is the core idea (if there is one) behind FreeSync, matters. If Nvidia can do it with DP 1.2, why do we need DP 1.3? FreeSync is still going to require new hardware in the displays themselves; it's not just a cable spec issue. So how does the DP 1.3 standard translate into actual use in a free way? If it requires display manufacturers to include as-yet-undeveloped hardware in their displays (and it does require it), how is that cost not going to be passed on to us, the consumers? How could it possibly be free?

I find it rather odd that people are demanding that Nvidia give them, for free, something it undoubtedly spent a good deal of money on in internal R&D and hardware manufacturing. They developed it; why shouldn't you pay for it if you want it? Isn't that how companies work - they make something you want, and you then pay to get what you want? It's their effort, their IP, their product. Why do you deserve it for free? Why do their competitors deserve it for free?
 

blackened23

Diamond Member
Jul 26, 2011
It could be that Nvidia saw the problem, since they have a lot of core PC gamers employed, and wanted to fix it. So they tackled it head-on. They spent their millions on R&D for their solution, which requires a variable refresh control board installed in monitors.

Just my personal opinion here, and only that: AMD is passing the buck to everyone else to fix the problem for them. It seems like they're saying, "Hey guys, here's a great idea, go and fix it for us!" This is much like the approach AMD took with HD3D. I'm not sure what ended up happening to HD3D, though - anyone have an idea? Either way, like you said, both FreeSync and G-Sync require compatible monitors to have a variable refresh control board. Maybe there are charity hardware companies that will provide that variable refresh control board for free. I don't know. But I kinda doubt it.

I guess the key difference is that Nvidia spent millions in R&D developing G-Sync. So you're right, they can't freely give it away. Because of AMD's financial weakness, I'm thinking that AMD isn't in a position to spend millions of dollars in R&D like Nvidia can. So they're asking everyone else to fix the problem for them. At least that's the impression I get. AMD didn't really do any design work on FreeSync; they literally just took an eDP laptop and tested it. ** Again, this is speculation on my part, but all of AMD's actions thus far seem to indicate this.

It's too early to talk about anyway, since FreeSync is pretty far from being done at this point.
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
Yeah, I agree, nobody in the world uses or would upgrade to a 1080p or a TN panel. I guess we need some concrete data points, perhaps sales figures from the largest e-tailer in the 300+ million strong US? I don't know who the top e-tailer is; I'm coming up empty-handed. So I gathered some amazon.com top 100 data, even though amazon.com sales data is completely irrelevant. At least, I've been told that amazon.com is irrelevant here.

http://www.amazon.com/Best-Sellers-E...ics/1292115011

For some reason nearly every monitor on there is 1080p or less? And most of them are TN? I don't think the data is correct. I do recall someone telling me that the #1 position on this list sells hundreds of units per minute on Amazon. But who knows. They could be lying.

Someone help me out here. This top-100 selling list is updated hourly, but I can't seem to find the 1440p and 4K panels on there. If Amazon is to be believed, that would indicate that the vast majority of users are upgrading to 1080p-or-less TN panels. That can't be right. They're all upgrading to 4K. I think their database is erroneous, though. I'm sure some 4K panels should be there in the top 100, but I only found a couple of models in the top 100,000 sales rank range. But, like I said, I'm sure the data is erroneous. Amazon.com claims the data is updated hourly? Who knows if they're being truthful, though?

Anyone else have concrete data points? I understand that 4K panels are the new thing now and are selling like hotcakes. 1440p IPS monitors are around the $300 mark now, but according to the Steam hardware survey - which pinpoints gamers specifically - only around 0.5% of all Steam users are using 1440p, with greater than 70% using 1080p. Again, while this is from Steam's December 2013 data, I'm thinking that Steam has erroneous or manipulated data. Not sure what's going on there. Steam has been known to provide false data; at least I've heard that around here. Not quite sure.

I do think FreeSync was demoed on a 1366x768 panel. I'm not sure what the sales data is on 1366x768 panels; I'm coming up short on finding concrete data points for this resolution. How well do 1366x768 panels sell? Anyone? Just curious. I don't think AMD would target that resolution for FreeSync whenever it's released in 2015, though, would they?

Nice wall of text trying to discredit something that was never said.

We all agreed 1080p is the most-used monitor resolution around. The whole point I was making is that since so many people already have 1080p monitors, they would be silly to upgrade to another expensive 1080p TN monitor just for G-Sync. Displays last a long time, and the point was that since most people keep their displays for such extended periods, they would be smart to wait for 4K, 20nm GPUs that can run 4K, and more news on G-Sync and FreeSync.

Instead of automatically attacking people, read what they are saying.
 

sushiwarrior

Senior member
Mar 17, 2010
Because PCPer questioned AMD's vice president of visual computing, Koduri, and he stated and confirmed that AMD's FreeSync will require the monitor to have a variable refresh control board, just like G-Sync does. Also, the FPGA is only being used for now; it will transition to an ASIC later in Q1. FPGAs are expensive; ASICs aren't. At least, that's what Nvidia is stating. So their assumption is that G-Sync costs will be substantially lower by the end of Q1 2014, since monitor manufacturers will switch from FPGA to ASIC G-Sync modules. Does anyone have concrete data points on FPGA versus ASIC costs? If not, I'll go dig that up.

Someone correct me if I'm wrong. Since FreeSync requires monitors to have a variable-refresh-aware module, just like G-Sync does (this is per AMD's Koduri, BTW), that would indicate that FreeSync isn't free? Wouldn't that mean that FreeSync costs money in literally the same area that G-Sync does? I dunno. I don't expect FreeSync to exist until 2015, so maybe costs for the control board will be substantially lower by then. The "free" part I'm not sure about, however, unless someone can clarify that for me. I could really be missing something here; I'm not sure.

Yes, you're completely wrong ^_^ FreeSync requires a controller which can RECEIVE the signals for variable refresh rates.

All that is needed for this to work, as AMD explained it, is an eDP connection between the discrete GPU and the display, a controller for the screen that understands the variable refresh rate methods of the eDP 1.0 specification, and an updated AMD driver to properly send it the signals. The panel can communicate that it supports this variable refresh technology to the graphics card through the EDID, just as resolutions and timings are communicated today, and then the graphics driver would know to send the varying vblank signals to adjust panel refresh times on the fly.

The panel controller must support a feature in the display specification. Since all of the "prediction" and "logic" happens on the PC's end, there is no need for an ASIC/FPGA or a frame buffer to store anything. Nvidia's G-Sync module does more than just understand variable refresh rates. Like I said earlier, it's more advanced. My opinion is that it is more advanced than necessary, and is an example of poorly planned engineering - they have a good implementation of a poor idea, while AMD has a poor implementation of a good idea.
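As a toy model of that division of labor (illustrative only; real EDID parsing and driver plumbing are far more involved, and every name here is made up):

```python
class Panel:
    """A display as the driver sees it after reading the EDID."""
    def __init__(self, edid_caps, min_hz, max_hz):
        self.edid_caps = edid_caps  # capability flags advertised via EDID
        self.min_hz = min_hz
        self.max_hz = max_hz

def enable_variable_refresh(panel):
    """All the logic lives on the PC side: check the advertised
    capability, then send per-frame vblank timing instead of a
    fixed refresh clock. The panel just has to understand it."""
    if "variable_vblank" not in panel.edid_caps:
        return False  # fall back to a fixed refresh rate
    print(f"Variable refresh enabled: {panel.min_hz}-{panel.max_hz} Hz window")
    return True

edp_laptop_panel = Panel({"variable_vblank"}, min_hz=30, max_hz=60)
enable_variable_refresh(edp_laptop_panel)
```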

FreeSync is "free" in the same way that getting a display that supports DP or DVI or something similar is "free". Any extra cost is likely due to the niche nature of supporting the feature, rather than the difficulty of understanding the signals. I am doubtful that it would require any significant amount of logic.
 

blackened23

Diamond Member
Jul 26, 2011
Nice wall of text trying to discredit something that was never said.

We all agreed 1080p is the most-used monitor resolution around. The whole point I was making is that since so many people already have 1080p monitors, they would be silly to upgrade to another expensive 1080p TN monitor just for G-Sync. Displays last a long time, and the point was that since most people keep their displays for such extended periods, they would be smart to wait for 4K, 20nm GPUs that can run 4K, and more news on G-Sync and FreeSync.

Instead of automatically attacking people, read what they are saying.

Clearly most people are "upgrading" to 1080p panels, per the sales data. Like you said, monitors last a while. I know most of my friends have sub-1080p panels, and their last purchase was in 2007-2008.

But that's the beauty of having choice in the market: choice between FreeSync and G-Sync, although it seems FreeSync won't exist until 2015. G-Sync has monitors in production now, with resolutions from 1440p up to 4K, so there's consumer choice there as well. But I do think that most people upgrade to 1080p. I know plenty of hardcore competitive gamers who did a sidegrade for LightBoost. It seems most people love LightBoost, although I think a few in this thread may hate it.

I don't disagree with you in terms of market choice. G-Sync has all of the bases covered (every resolution), and I'd assume that FreeSync will have those bases covered in 2015 as well.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
I agree. I think Hawaii and Kepler are good ideas, but given a few hours and an Easy-Bake Oven, I could do better.
 

blackened23

Diamond Member
Jul 26, 2011
Yes, you're completely wrong ^_^ FreeSync requires a controller which can RECEIVE the signals for variable refresh rates.

Well, you should align this statement with the statement AMD's VP of visual computing gave to PCPer, because that's not what they were told. PCPer discussed this in depth in one of their podcasts last week: a new variable refresh controller is required, and no monitor currently on the market has one. So it has to be added. So it isn't free, unless I'm missing something. Adding something that doesn't currently exist doesn't seem free, although there could be charity companies that will do this for AMD. I'm not entirely sure.

Adding something to monitors that doesn't currently exist.

I'm thinking of how this can be free.

Coming up short here.

The G-Sync module isn't free because Nvidia is adding something to monitors that doesn't currently exist. Desktop monitors do not have variable-refresh-aware control boards. All of which was confirmed by AMD's Koduri. Unless we're to think that Nvidia's engineers are idiots and added it for no reason at all. There's always that possibility. :) You never know. Maybe they're over-engineering G-Sync because their engineers are stupid. Always a possibility, I guess, right?
 
Last edited: