Although I've been wondering what that experience would be like using an all-AMD build with a G-Sync monitor, lol.
FreeSync and G-Sync are overrated.
Ex: take your current monitor, flip G-Sync off, flip vsync on, and push your FPS over 90. It is unlikely you will notice the difference.
Some monitors* have additional low-latency settings that can be toggled on when G-Sync/FreeSync is turned off. If yours is one of these, you could get improved results flipping G-Sync off; the UFO** test results get interesting.
*Looking at you, Samsung, with your PWM strobing.
** https://www.testufo.com/
Hmm... FreeSync Premium Pro (FreeSync 2) lets me control the actual configuration of my monitor on a per-game/app basis and ensures HDR is good to go.
Honestly, it's a huge deal and I wouldn't want to switch away from it now. It's more than just the frame rates, but that is so noticeable as well.
I am still firmly in the camp that you need to be playing with an HDR enabled monitor if you truly care about the visuals of your games. It might be a small camp. IDK.
The purpose of variable refresh rate is for when your FPS drops BELOW your refresh rate. If your refresh rate is 60 Hz and you drop to 55, G-Sync/FreeSync are there to make it so you don't notice. Or, let's say you are gaming at 120 Hz and you drop to 110: it will be there to make it so you don't notice.
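A rough way to picture what VRR buys you, as a toy model (illustrative numbers only, not any vendor's actual pipeline): with plain vsync a finished frame has to wait for the next fixed refresh tick, so a 55 FPS frame on a 60 Hz panel slips to a 30 FPS cadence, while VRR just refreshes when the frame is ready.

```python
# Toy model of frame presentation: fixed refresh (vsync) vs. VRR.
# All numbers are illustrative; real displays and drivers are messier.

def vsync_display_interval(frame_time_ms: float, refresh_hz: float) -> float:
    """With vsync, a finished frame waits for the next fixed refresh tick,
    so the displayed interval rounds UP to a whole number of ticks."""
    tick_ms = 1000.0 / refresh_hz
    ticks_needed = -(-frame_time_ms // tick_ms)  # ceiling division
    return ticks_needed * tick_ms

def vrr_display_interval(frame_time_ms: float, refresh_hz: float) -> float:
    """With VRR (inside the panel's range), the display refreshes as soon
    as the frame is ready, so the interval simply tracks the frame time."""
    return max(frame_time_ms, 1000.0 / refresh_hz)

frame_ms = 1000.0 / 55  # a game rendering at 55 FPS on a 60 Hz panel
print(round(vsync_display_interval(frame_ms, 60), 1))  # 33.3 -> 30 FPS cadence
print(round(vrr_display_interval(frame_ms, 60), 1))    # 18.2 -> smooth 55 FPS
```

Same idea at 120 Hz: a 110 FPS frame would snap down to a 60 FPS cadence under vsync but stays at ~9.1 ms intervals with VRR.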
Yup, it's supposed to make it so you don't see it. I can tell you now: I booted up Watch Dogs 2, and at 1440p on the highest settings you can definitely tell when it goes from 70 FPS to 50. It's very subtle because of the G-Sync, but it's there. It just handles the drop smoothly.
HDR on Windows is a mixed bag. I run SDR on the desktop even with my HDR monitor.
Just a different camp. I'm in the anti-blur camp, and I like this test: the TestUFO moving-photo test with a street map. With a strobing display, you can read the street names plain and clear with zero eye strain. That really is night and day. Even at 120 Hz you have little chance of reading the names; you need 120 Hz plus strobing.
My refresh rate is 144 Hz. Anyone using an Nvidia 3000-series or AMD 6000-series card will be able to peg the FPS well above 60 all the time, and will likely be running a 144 Hz or better monitor. 144 Hz monitors are less than $200; buying a $600+ graphics card and using it with a 60 Hz monitor seems odd.
G-Sync and FreeSync are all but unheard of on TVs, and many TVs are still stuck at 60 Hz. Flipping vsync on allows any of the above cards to peg 60 FPS. Radeon Anti-Lag or Nvidia's equivalent is more valuable in those scenarios, allowing the system to extract everything it can out of a weak display.
I have found that I run FreeSync off, vsync on, and strobe on. Strobe is way nicer than FreeSync when things start moving.
Maybe G-Sync is way better, but on my FreeSync monitor I can definitely see it. With FreeSync, it automatically disables when the FPS drops below 48; G-Sync disables at 30. Either way, in my experience, just push the FPS past 90 and embrace strobe.
It would seem the proper thing to do would be to try and sell my current G-Sync-only ultrawide and get an ultrawide that has the newer module which supports both FreeSync and G-Sync. This is, of course, if I decide to rebuild and go all AMD.
Holding out for independent reviews of the AMD GPUs, and then a little longer to see the timeframe of that supposed 3080 Ti.
The G-Sync Compatible/FreeSync displays don't have a module. It's a VESA standard (Adaptive-Sync) that's built into the display controller. It's why they are so much cheaper than the older displays that used a module.
I am such a weakling. I figure 10GB probably isn't enough for the next few years, but I guess I'll find out the hard way if it isn't. My local Microcenter had some 3080 stock come in yesterday while I was in the store, and I couldn't resist the new shiny. Also, the size of these cards doesn't hit you until you hold one. Holy cow, they are huge and chonky!
Does sound a lot less than what I paid for this 34" Alienware monitor. I do want at least 34 or 36 inches, and ultrawide would be nice. I think LG has some nice ultrawides if I do happen to ever get rid of my current monitor.
Yeah man, the monitors that Stuka and I have are in the $280-$350 range.
My monitor has been ~$330 at Best Buy lately. 32" 1440p Curved VA Freesync Premium Pro with Ultrasharp build quality and similar panel warranty. It's just a tremendous value IMHO. I know it isn't an ultrawide (I also wanted an ultrawide) but it checks a ton of boxes.
Hey, if you are paying MSRP for one at least, no judgment at all. They're new, shiny, and top tier, even if they do consume a lot of power. Enjoy!
MSRP minus 5% because I have their credit card. $777.73 out the door after taxes.
Yep, my Zotac 3080 just about fit in my Antec E-ATX case; hope they don't increase card length much, or I will have to get a new case next time around.
Bruh... Tarkov at 3440x1440 fills 12GB on my Vega with HBCC on (it stutters if off) and eats all of my 16GB of RAM on the newest Reserve map. Textures maxed, but other settings on med/low/off. 10GB is a low-end amount. I expect SUPER cards soon.
At what settings, though? Turn down texture quality and suddenly VRAM isn't an issue anymore. But for people who want to run at the absolute highest quality settings, it's a very important consideration. As an anecdote: I had a 780 Ti for a while, and it could run R6 Siege just fine, but I absolutely could not run it with the high-resolution textures enabled; that was no problem for my buddy with an 8GB 290. Tarkov is also infamous for being terribly optimized, but I also know people that play it on 8GB cards without stuttering.
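For a back-of-envelope sense of why texture quality dominates VRAM (rough math only; real engines stream, compress, and reuse memory): an uncompressed 4K RGBA texture with a full mip chain costs about 85 MB, which is why block compression matters so much.

```python
def texture_vram_mb(width: int, height: int,
                    bytes_per_texel: float = 4, mipmaps: bool = True) -> float:
    """Rough VRAM cost of one texture. A full mip chain adds about 1/3 on
    top of the base level (1 + 1/4 + 1/16 + ... = 4/3). Illustrative only."""
    base_bytes = width * height * bytes_per_texel
    total = base_bytes * 4 / 3 if mipmaps else base_bytes
    return total / (1024 * 1024)

print(round(texture_vram_mb(4096, 4096), 1))                     # 85.3 (RGBA8)
print(round(texture_vram_mb(4096, 4096, bytes_per_texel=1), 1))  # 21.3 (BC7-ish)
```

A hundred or so unique maxed textures resident at once, plus framebuffers and geometry, and a 10GB card starts to feel snug; drop one texture tier and the budget roughly quarters.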
Haha, you know you're a hardware junkie when you keep a Microcenter credit card in your pocket.
Still should be taken into consideration even if it's not optimized. This game is four years old; there are people who want to play it, especially on the newest map. And one step down on textures in this game makes it look like a 2005 game.
Well, that makes at least 3 games where 10 GB is not enough.
If these were midrange cards, midrange settings would be reasonable.