Sigh... Vega 64 disappointment..

Candymancan21

Senior member
Jun 8, 2009
278
3
81
So not too long ago I bought a 1070 Ti on impulse because of its $470 price tag; my R9 290 was killing my FPS at 2560x1440 on my new FreeSync monitor. But I forgot all about FreeSync when I got this 1070 Ti...

So I bought an Asus ROG Strix Vega 64 OC edition with next-day shipping. Since I had both the Vega 64 and the 1070 Ti, and return policies are 30 days at Amazon and Newegg, I decided: why not test them both and see which one is actually the better card?

The games I used were Subnautica, Wolfenstein II, Total War: Warhammer II, World of Warships, The Witcher 3, and MechWarrior Online, which are the games I normally play.
All games were run with details maxed out at 2560x1440; my monitor is FreeSync and 144 Hz.


In Subnautica, the Vega 64 averaged 65 FPS along a marked path I follow in game. FreeSync worked, but the game had very bad microstuttering that hurt my eyes and drove me insane; the framerate was good, yet it looked like it was running at 15 FPS. Turning FreeSync off didn't change anything. Some searching online shows the Vega 64 is known to microstutter in this game. On top of that, I had to use a driver release five versions old just to get the game to launch, so there are some serious driver problems here.
The Nvidia 1070 Ti averaged 70 FPS with zero microstutter, but again, no FreeSync.


In Wolfenstein II: The New Colossus, average FPS on the Eva's Hammer was around 130 with spikes into the 150s. FreeSync looked BEAUTIFUL, but sadly it was doing that flickering FreeSync sometimes does. I recorded a video of it and will upload it. I think the flickering is a bandwidth or driver issue: it stops at 1920x1080 or in windowed mode, but happens in full screen. It looks terrible, like a long fluorescent tube light flickering slightly. With FreeSync off it ran fine, but then of course you get that input lag and weird slowdown on moving objects, almost like ghosting but not quite; if you've seen FreeSync on and off, you know what I mean.
The Nvidia card got roughly the same 130-150 FPS, and of course no FreeSync, but no complaints other than that.


In Total War: Warhammer II, the Vega 64 averages 60 FPS in the campaign map benchmark, 55 FPS in the battle benchmark, and 49 in the second battle benchmark. In a real game on the campaign map the Vega 64 also has horrible microstutter, just like in Subnautica: one second it's smooth as butter, the next it looks like you're getting 15 FPS even though you're not, and your eyes start hurting. FreeSync flickers only sometimes here, maybe 20% of the time.
The 1070 Ti got 73 average in the campaign benchmark, 61 in the first battle benchmark, and 53 in the second battle benchmark. No microstutter at all; it ran smooth as silk on the same campaign map where the Vega 64 stuttered. But again, no FreeSync, so you're stuck with V-Sync or Fast Sync.


In The Witcher 3, both cards ran smoothly and had zero issues, and FreeSync worked great; the average FPS in Kaer Morhen was 75 on the Vega 64 and 80 on the 1070 Ti.
In MechWarrior Online, the Vega 64 again had serious microstuttering; the game looked like it was running at 15 FPS and was killing my eyes, yet the counter said 60-70+. FreeSync seemed to work, but with the microstutter I couldn't really tell. The 1070 Ti got the same framerate and ran smooth as butter.
In World of Warships, the Vega 64 averaged 70 FPS; on the same map, moving along the same route, the Nvidia card got 81. In the main menu at port the Nvidia card also gets about 10 FPS more. FreeSync works great and looks beautiful in this game (I remember it used to flicker on my R9 290).


I also forgot to mention that I played Resident Evil 7 as well; FreeSync worked in that game too, and FPS was in the 130-140 range on both cards.


So in every single game the 1070 Ti got more FPS than the Vega 64, and this is at stock boost clocks on both cards. The Strix Vega 64 can't even hold the 1590 MHz boost clock Asus advertises: when you start a game it jumps to 1550, then slowly drops until it settles around 1440 MHz at 82°C on the core, and that's with Asus's fancy three-fan cooler and "40% better contact" heatsink design on the Strix ROG. Overall, the Nvidia 1070 Ti averaged roughly 10-20% more FPS than the Vega 64 in every game I played except Wolfenstein II.
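
As a rough sanity check on that 10-20% figure, here is the arithmetic on the per-game averages quoted above (a small sketch; it only uses the numbers from this post and ignores run-to-run variance):

Code:
# Per-game FPS deltas from the averages reported above: (Vega 64, 1070 Ti).
averages = {
    "Subnautica":            (65, 70),
    "TWW2 campaign bench":   (60, 73),
    "TWW2 battle bench 1":   (55, 61),
    "TWW2 battle bench 2":   (49, 53),
    "The Witcher 3":         (75, 80),
    "World of Warships":     (70, 81),
}
for game, (vega, nv) in averages.items():
    delta = (nv - vega) / vega * 100
    print(f"{game}: 1070 Ti ahead by {delta:.0f}%")
# Prints deltas from about 7% (Subnautica, The Witcher 3) up to roughly 22%
# (Warhammer II campaign), which is where the 10-20% ballpark comes from.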

So my question is this: the Vega 64 is clearly slower for me on my system. I don't get where all these benchmarks are coming from saying the Vega 64 is faster and that the 1070 Ti competes with the Vega 56; that is not at all what I'm seeing, at least in my experience. The Vega 64 also STILL suffers from FreeSync flickering. Why did I just spend $600 on a card, thinking I was going to return the 1070 Ti I'd bought on impulse, only to find out it flickers in two of the games I tested? And then the microstuttering, in three games: Subnautica, Warhammer II, and MechWarrior Online. This microstutter is game breaking. It looks bad and it hurts your eyes. Have you ever seen a game run at 15-20 FPS? That's what this looks like, even though the games were getting 60+ FPS.

The Vega 64 also runs hot, 82°C, and that was just from playing each game for 30 minutes or so. Imagine if my room were warm from six hours of gaming. It won't even reach its advertised clock speeds, and it actually downclocks itself by almost 200 MHz.

Can someone please try to convince me NOT to return the Vega 64? Why shouldn't I? Why should I keep a card that got lower framerates than a card that was $130 cheaper? I told myself, eh, 10-20% is only 5-10 FPS less, and FreeSync looks so good, who cares... but why should I keep a card that flickers with FreeSync on? And why does this thing get so hot, 82°C with the fan maxed out, even with three 120mm fans on the case door alone, not counting the six other case fans I have?

I dunno. Sorry for the long post; I'm sure I won't get many replies. I have 2-3 days to decide which card to keep, so please help me out here. I really, REALLY want the Vega 64. If you've seen FreeSync working versus standard V-Sync, it looks 10x better; games run like real life. It's like comparing a 640x480 video to 1440p on YouTube, it's that much better. I had a friend who knows nothing about this stuff watch Wolfenstein II with FreeSync off and on, and she said the game made her feel sick with FreeSync off and plain V-Sync, because of the input lag and that weird ghosting-like effect, while with FreeSync on she said it flowed like a smooth stream of water.

But with all the negatives, the lower framerates, the FreeSync flickering, and the microstutters, why should I keep this thing? Originally I thought I could live with half my games not working with FreeSync and the other half working, but the microstutter is what put the last nail in the coffin halfway down for me.

And before you say, well, if I can afford a $600 Vega 64 and a $470 1070 Ti, why not get a 1080 Ti: yes, I have the money, but only for one or the other, and I need the $500 or $600 back from the one I return. I can't spend $800 on a 1080 Ti.
 
Last edited:

richaron

Golden Member
Mar 27, 2012
1,357
329
136
FreeSync with your monitor obviously has value. No one (other than you) can quantify that value in dollars.

I got my Vega 64 for considerably less than a 1080, before the 1070 Ti was released, so I was lucky with where I landed on the price/performance curve; it sounds like you paid above it for the Vega (which is normal during the mining craze). But when I bought it, and still now, I believe in the "fine wine" aspect of AMD GPUs; that is, I believe it takes a while for AMD's hardware capabilities to be utilized. In the past, AMD GPUs have gained up to 10% performance year over year versus the competition, so in the long run AMD has appeared to be the better choice, even though in the short term the Nvidia options look better.

I guess you have to balance it all out and make a decision. For you the Nvidia option is faster and cheaper, so that seems an obvious choice. But the FreeSync advantage should also be considered, and personally I rate tear-free gaming highly. And if you're going to keep the GPU long term, the fact that AMD cards tend to age better can maybe come into consideration too (although that's less important and more speculative).
 

Jackie60

Member
Aug 11, 2006
118
46
101
Just remember the Vega, like a fine wine, will just keep getting better. I haven't owned an AMD card since my 295X2s and have been consistently impressed with every Nvidia card since (980 Tis, Titan XPs and 1080 Tis). Vega has been a laughably bad gaming card, reminiscent of the 3870, and I pity Intel if Raja produces the same sort of abortion for them. I used to buy AMD whenever I could possibly justify it, but they are not worth looking at in GPU terms now, imho. I really can't see the situation changing, as they seem focused on creating cards that require an awful lot of effort from developers to work well. Don't subsidise overpriced mediocrity; send the Vega back. AMD have charged a fortune for their products when ahead, as have Nvidia, so I won't feel sorry for either company or buy out of charity. Driver support is another thing Nvidia gets right; reward success, not failure. The fine wine analogy was tongue in cheek btw, but there is the 'overclockers dream' factor as well as the 'poor Volta' element to take into account..... Oh wait!
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
AMD "fine wine" is real and provable, but the extent of this is up for debate. Especially with regards to future projections.

I fear for people who stick their heads in the sand and ignore reality by completely denying its existence.

But as I stated in my post it's "less important and more speculative" than other factors in the purchasing decision. So it's odd to have people jump into this thread and pick out the least important aspect of my post to rage over...
 
  • Like
Reactions: ryan20fun

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
An issue might be playing all the games maxed out, especially games with Nvidia's GameWorks ("CrapWorks") effects on top, which are supposedly there for enhanced graphics but in reality mostly cripple performance on AMD cards, mainly through excessive tessellation. Nvidia cards lose performance too, just less than AMD's. So it's certainly possible that in some of these games, disabling just one or two of those features will solve the issue with the Vega card.

Besides that, the Vega 64 is a waste of money. You can overclock a Vega 56 (especially the memory) and undervolt it, and you get near-Vega 64 performance at much lower power (and noise). AMD cards usually need more tweaking.
 

Guru

Senior member
May 5, 2017
830
361
106
Vega 64 is much better than the GTX 1080 in Resident Evil 7 and Wolfenstein 2. I have no clue how you managed to get the same FPS in those two games with a 1070 Ti, when Vega handily beats the 1080 in both of them.

Total War: Warhammer also has a DX12 option; with it, AMD cards run much better than Nvidia ones do on DX11.
 

EXCellR8

Diamond Member
Sep 1, 2010
3,979
839
136
Big Vega also needs a LOT of power to perform adequately. Depending on your PSU, you might need to rethink your cabling: do NOT daisy-chain the connections, use two separate runs. I'm not sure what's causing the flickering though... neither of my Vega 64s does that, and I'm using FreeSync too.
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
OP first of all thanks for the info.

I was wondering if you checked the GPU load on the two cards. It is known that the AMD driver has higher CPU overhead, and maybe your CPU is not fast enough for the Vega 64. It's not bad per se, but it's not a Ryzen 2 / Coffee Lake either.

Maybe a 3DMark API Overhead test would shed some light as well.
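
Another cheap check is to log GPU load during a play session and see whether the card actually sits near 99%; if it does not, the CPU (or the driver's CPU overhead) is the limiter. A minimal sketch that parses a GPU-Z sensor log; the file name and the "GPU Load [%]" column header are assumptions and can differ between GPU-Z versions:

Code:
import csv

LOG = "GPU-Z Sensor Log.txt"     # GPU-Z's default log name; adjust to yours
COLUMN = "GPU Load [%]"          # assumed column header

loads = []
with open(LOG, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        # GPU-Z pads its CSV headers/values with spaces, so strip the keys.
        for key, value in row.items():
            if key and key.strip() == COLUMN:
                try:
                    loads.append(float(value))
                except (TypeError, ValueError):
                    pass

if loads:
    avg = sum(loads) / len(loads)
    pegged = sum(1 for x in loads if x >= 95) / len(loads) * 100
    print(f"{len(loads)} samples, average GPU load {avg:.1f}%")
    print(f"time at >=95% load: {pegged:.1f}% (low values point to a CPU/driver bottleneck)")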
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
Big Vega also needs a LOT of power to perform adequately. Depending on your PSU, you might need to rethink your cabling and do NOT daisy chain the connections. Use two separate runs. I'm not sure what's causing the flickering though... Neither of my Vega 64's do that... and I'm using Freesync too.

Not really. If you OC a Vega 56 and undervolt it, performance per watt is very similar to a 1070.
 
Mar 11, 2004
23,031
5,495
146
OP, I feel like there's something not quite right. I'm not saying you should be seeing spectacularly better results, but I have a hunch that if you tweak some settings you'll see a noticeable improvement. Even then I won't say it will be worth it; I personally can't justify spending $600 on any card, and the 1070 Ti is a good card, so it being almost 25% cheaper is kind of a no-brainer in comparison, even with the FreeSync monitor.

Sorry if you've already done this stuff, but try tweaking some settings:
- Make sure Radeon Chill is off, and that Power Efficiency is off (not sure if it's an option on Vega or only on the Polaris cards).
- Check that no framerate cap/limiter is enabled.
- Make sure GPU Workload is set to Graphics.
- I'll let others chime in on the FreeSync options (I wasn't sure whether things like VSync and triple buffering need adjusting there; I assume you'd want both off).
- In Wattman, raise the Power Limit (it's with the GPU temperature settings) all the way (I think it goes up to +50). Even if you're not overclocking, this seems to improve performance, and coupled with undervolting you get better temps, better efficiency, and better performance too, especially sustained. Adjust the temperature and fan targets to what you prefer (maybe set the max to 80 so it kicks the fans up higher), then look up solid undervolt settings. That will help temps, boost clocks, and sustained clocks.

Another possibility is that you got a card with an issue (like the HSF not being properly seated; IIRC that was a problem with some of the Vega cards, especially the aftermarket ones, since there's a slight height difference between the top of the HBM and the top of the core that can keep the heatsink from making proper contact). The fact that it's not even hitting its rated boost clocks is a sign that something is off. It could be power related, like the other user commented, or airflow in your case causing elevated temps (or just that it's dumping a lot of heat into the room; undervolting will help, even when not gaming. On the RX 480 I have, it makes a very noticeable difference even at idle, although I've had driver updates reset those settings).

One last point: since it seems there's a solid chance we'll see new Nvidia cards in a month or so, I'd send both back, use the 290 a bit longer, and see what the 1170 offers. If nothing else, the 1070 Ti or 1080 (or maybe even the 1080 Ti) will likely become more affordable, so you can either get a better card or save some money to put toward a G-Sync monitor if you want to.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Heads up: I have a G-Sync + 1080 Ti combo right now, and I still get flickering when you go under the G-Sync threshold. The two technologies behave the same here, except that the FreeSync minimum threshold is per monitor rather than uniform across all monitors.

I get flickering in Total War: Warhammer II, for example: after you end turn and it's computing the other AI turns, the FPS drops to about 7 or 9 (by design), and you can perceive some flickering.

The fact that you have good ventilation yet it still overheats even while throttling is concerning though. That doesn't sound right
 

Jackie60

Member
Aug 11, 2006
118
46
101
AMD "fine wine" is real and provable, but the extent of this is up for debate. Especially with regards to future projections.

I fear for people who stick their heads in the sand and ignore reality by completely denying its existence.

But as I stated in my post it's "less important and more speculative" than other factors in the purchasing decision. So it's odd to have people jump into this thread and pick out the least important aspect of my post to rage over...
I wasn't responding to you, and I actually hadn't read your post; I was just cataloguing the things AMD has previously said about their cards. I accept that the architecture usually shows its merit with age, but lately the longer-term improvements seem to have stalled. I used to always back AMD but have been very disappointed with their recent uncompetitive offerings. The power of AMD cards is demonstrated by their mining prowess; failing to translate that into gaming performance is what really irks me. Apologies if you think I was in any way raging at your post; I wasn't, and it wasn't about you. I was raging against statements from AMD that have been overly optimistic at best and outright deceptive at worst, hence the references to Fiji being an 'overclockers dream' (it wasn't) and to 'Poor Volta', which implied Vega might be superior to a future Nvidia architecture (it wasn't) rather than inferior to a previous Nvidia architecture, as was the case.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
AMD "fine wine" is real and provable, but the extent of this is up for debate. Especially with regards to future projections.

I fear for people who stick their heads in the sand and ignore reality by completely denying its existence.

But as I stated in my post it's "less important and more speculative" than other factors in the purchasing decision. So it's odd to have people jump into this thread and pick out the least important aspect of my post to rage over...

Vega 64 at release (August 2017) vs. GTX 1080: https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/31.html
Vega 64 on May 22 vs. GTX 1080: https://www.techpowerup.com/reviews/ASRock/RX_580_Phantom_Gaming_X/31.html

TL;DR: even with Wolf 2 included in the May 22nd benchmarks (and obviously not in the launch ones), no ground has been gained. In fact, Vega 64 ever so slightly LOST ground.

The fine wine analogy WAS true. It was due to the differences in architectures and to developers maximizing console performance and then porting to PC, but the console optimization effect is already baked into the new cards (Vega, Polaris, Pascal), and therefore we're seeing Polaris only eking out minor gains vs. GP104 and Vega basically still just trading blows (by and large) with the GTX 1080.

The days of AMD gaining 10-15% on Nvidia 6-9 months down the road are gone, at least until the next-gen consoles and the coinciding generation of GPUs come out.
 
  • Like
Reactions: Muhammed
Mar 11, 2004
23,031
5,495
146

That's within the margin of error, and Vega gained at 4K compared to the 1080.

I think that is true of Vega, but AMD has likely moved on (meaning they're not focused on improving the Vega that's currently released). I'm not sure whether that's because there's something wrong in the chips, or in their implementation of the memory and interposer, that isn't really fixable via software, or just because it's practical to focus on their next products (if there are big performance improvements, it's better to hold them for the next product launch to show the biggest gains, and then backport them if possible). Mobile Vega will likely operate at a much better efficiency sweet spot, and Vega Instinct is on 7nm, so it's likely tailored quite differently and has more compute-focused bits (which might have been the problem with gaming/consumer Vega). Navi will be interesting: it could be Vega reworked properly (if nothing else, more effort put into pushing up clock speeds, plus 7nm offering a substantial improvement over 14nm), or it might be a bigger shift (and so would need more development time already). My hunch is it will mostly be Vega reworked, with Infinity Fabric added and possibly a new multi-chip approach (not the modular, multiple-dies-acting-as-one idea people were expecting from the early Navi info; by multi-chip I mean something like Crossfire, and also pairing APUs with dGPUs for mobile and OEM designs). I expect Navi will use GDDR6 for all the consumer parts (maybe not some mobile ones, but that would likely be the Intel partnership) and HBM for the pro parts, where if you really want top-line performance you could buy the "Frontier"-like cards.

Didn't Raja say they were busy reworking the driver/software to actually take advantage of the new features they'd added (even with Polaris there was hardware that wasn't being fully utilized)? I'm sure his leaving has thrown a huge wrench into that. They've already said they're not going to do the work for one of those features and will leave it up to developers to enable, which I'd guess won't happen much until the next-gen consoles, since those will likely include it as standard. If they are still reworking the software/driver side, the focus is probably on APUs, since that's where the benefits would help the most and it would be the simplest place to start (and then build up from there to the larger chips).
 

Candymancan21

Senior member
Jun 8, 2009
278
3
81
This isn't a flicker from going under the FreeSync range; my monitor's range is 48-144 Hz for FreeSync. In Wolfenstein II I capped FPS at 143 because I got over 140 with the Vega 64, and guess what: sit in a spot on the Eva's Hammer (the submarine base) getting, say, 115 FPS, and the monitor flickers. Hell, it flickered in the game menus too.
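
For reference, here is a small sketch of how a 48-144 Hz FreeSync window and low framerate compensation interact, assuming the usual frame-repetition behaviour (the driver's exact multiplier logic is not public). It illustrates why 115 FPS should sit comfortably inside the variable refresh window:

Code:
def effective_refresh(fps, vrr_min=48, vrr_max=144):
    """Refresh rate the panel should run at for a given framerate."""
    if fps >= vrr_max:
        return vrr_max                   # capped at the panel maximum
    if fps >= vrr_min:
        return fps                       # inside the window: refresh tracks FPS
    if vrr_max / vrr_min >= 2:           # LFC is possible on this range
        mult = 2
        while fps * mult < vrr_min:      # repeat each frame until back in range
            mult += 1
        return fps * mult
    return vrr_min                       # no LFC: falls back to fixed refresh

for fps in (7, 30, 70, 115, 143):
    print(fps, "fps ->", effective_refresh(fps), "Hz")
# 7 fps only gets back in range via 7x frame repetition (49 Hz), which is where
# below-threshold transitions can look like flicker; 70, 115 and 143 fps are
# all inside the 48-144 window, so flicker there points at something else.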

I recorded a video of it flickering in the game and in the menus; if you want to see it, just ask and I'll upload it to prove it isn't a FreeSync range flicker, it's just a flicker with AMD cards. And yes, I had the flicker with my R9 290 too, but I originally thought that was a range flicker because the R9 290 was only getting 30-50 FPS in games on this 1440p monitor. The Vega 64 gets way more FPS, so it isn't an FPS range issue.

Here is just one big thread on AMD's forums; it's been going on for years, and there's no fix in sight. AMD is aware of it and hasn't posted back since they originally said they would look into it. No explanation, just silence, because they know there's no way to fix it. It's flawed.

https://community.amd.com/thread/215821.

I spent three days TWEAKING, sometimes 10 hours a day until 5 AM, sleeping until 4 PM and going at it again when I woke up, trying to get the flickering to go away. I tried six different drivers and eight different games, and they all flickered except The Witcher 3.

I'm not about to spend weeks tinkering and tweaking to get something that's supposed to just work to actually work, when AMD can't even figure it out themselves. In the older threads, once they became aware of it, they "fixed" it with the 18.3.2 driver (I think that was the one; it listed something like four fixes for flickering), and guess what, it didn't fix jack squat. I'm not ranting like some Nvidia fanboy; as I said, I've had nothing but ATI and AMD cards for almost two decades now, minus an 8800 GTS. FreeSync is a failure in my eyes, and it shouldn't be hit or miss, or need tweaking, to work when you've spent $1000 on a monitor and video card.


So I returned the Vega 64 to Newegg, and because I'm a 14-year customer they gave me cash back instead of store credit (crappy business practice, I won't shop there again), and I returned the 1070 Ti to Amazon. I just said screw it and bought a 1080 Ti for $770 at Micro Center, and guess what: my FPS is so high I don't need V-Sync or Fast Sync; I'm getting 120-140 FPS in 80% of my games now. Sure, I have a FreeSync monitor, but that's OK; there are no good G-Sync monitors that are 27 inches, curved, with ~120 PPI like this one, so whatever.

The Vega 64 was a trash card anyway. It never reached its boost clock speeds, and even when I managed to cool it down to 75°C with a giant room fan blowing on the thing, it wouldn't overclock past 3% before the drivers crashed; it also microstuttered. Looking at GPU-Z, the core clocks and especially the memory clocks were jumping all over the place; the memory would bounce 800-950-800-950 at 100% GPU load and just wouldn't stop. I also can't explain why a game would run and look like it was getting 15 FPS while FRAPS said 70. If I turned V-Sync on to check, the FPS would in fact drop to 15, proving the game really was running that slowly, which explains why it looked slow and hurt my eyes. Alt-tab the game and boom, it's fast again; then a few minutes later it does it again.
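
If anyone wants to reproduce that memory clock observation, the same kind of GPU-Z sensor log mentioned earlier in the thread can be scanned for how often the memory clock swings between samples while the GPU is loaded. A rough sketch; the "Memory Clock [MHz]" and "GPU Load [%]" headers are assumptions and vary by GPU-Z version:

Code:
import csv

LOG = "GPU-Z Sensor Log.txt"
MEM_COL, LOAD_COL = "Memory Clock [MHz]", "GPU Load [%]"   # assumed headers

def col(row, name):
    # Return a named column as float, tolerating GPU-Z's padded headers.
    for k, v in row.items():
        if k and k.strip() == name:
            return float(v)
    return None

prev, swings, samples = None, 0, 0
with open(LOG, newline="", encoding="utf-8", errors="ignore") as f:
    for row in csv.DictReader(f):
        try:
            mem, load = col(row, MEM_COL), col(row, LOAD_COL)
        except (TypeError, ValueError):
            continue
        if mem is None or load is None or load < 95:
            continue                      # only count samples under full load
        samples += 1
        if prev is not None and abs(mem - prev) > 25:
            swings += 1                   # a >25 MHz jump between samples
        prev = mem

if samples:
    print(f"{swings} memory clock swings across {samples} full-load samples "
          f"({swings / samples * 100:.1f}% of the time)")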

My R9 290 did this in Subnautica only, but the Vega 64 did it in three games. The 1080 Ti I have now, and the 1070 Ti, didn't do it at all.

And it wasn't my OS or HDD or anything; I reinstalled the OS and started fresh.

So why anyone would buy a Vega 56 or 64, when you have to mess with undervolting, flash the BIOS so your 56 can pretend to be faster than a 1070 Ti, and put up with higher power draw, the card running at 83°C, the flickering, and the stutters, is beyond me. I'm sorry, but this video card is TRASH, and yes, that's coming from an AMD fanboy.

Just like their CPUs have been for 10 years now, yeah, TEN, and you guys know it. Ryzen still lags in gaming; my almost four-year-old CPU is faster, so what gives.

I'm DONE with AMD; that was it. The Vega 64 was my way of showing support for the underdog, but you know what, it's good to have an Intel CPU I won't need to upgrade for years, and a GPU that is fast and has no issues. I wish AMD the best and hope they stay in the fight, because Nvidia is showing its worst side by not releasing its next-gen GPUs, holding tech back because of how slow AMD is. It just shows that without competition there is no innovation. But I can't give AMD another dime until they prove to me, once and for all, that they can put out something that works as advertised and is, maybe once in a decade, faster than Intel and Nvidia.





No profanity allowed in tech.


esquared
Anadtech Forum Director
 
Last edited by a moderator:

Candymancan21

Senior member
Jun 8, 2009
278
3
81
Heads up, I have a Gsync+1080ti combo right now and I still get flickering when you go under the Gsync threshold. This is the same between the technologies, except that FreeSync minimum threshold is per-monitor, not uniform across all monitors.

I get flickering in Total War Warhammer II, for example, after you end turn and it's computing the other AI turns and the FPS drops to about 7 or 9 (by design). You can perceive some flickering.

The fact that you have good ventilation yet it still overheats even while throttling is concerning though. That doesn't sound right


Yes, that's what the flickering looks like on the Vega 64. I had the same thing: flickering when you end turn and the FPS drops to 7-9 while the CPU does its turns. But when the FPS came back up after the end turn, FreeSync kept right on flickering; it wouldn't stop. It did this in World of Warships, World of Tanks, Wolfenstein II, DOOM, sometimes Subnautica, and also MechWarrior Online. Imagine seeing that end-turn flicker from Warhammer II on your Nvidia system 24/7; you'd be pissed too, wouldn't you?
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Yeah that seems like something is wrong - my brother has a FreeSync system and it works just like my Gsync one with respect to flickering.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
Vega 64 is much better than the GTX 1080 in Resident Evil 7 and Wolfenstein 2. I have no clue how you managed to get the same FPS in those two games with a 1070 Ti, when Vega handily beats the 1080 in both of them.
That's old news; the 1080 is now equal to the Vega 64 in Wolf 2 after game patches and driver updates. Same for RE7.
The fine wine analogy WAS true. It was due to the differences in architectures and developers maximizing console performance then porting to PC,
It was true during the Kepler era, because Kepler aged badly. It wasn't that AMD gained on top of it; Kepler just regressed in new games.

Fiji suffered the same fate also, especially during 2017 and now 2018.
https://hardforum.com/threads/furyx-aging-tremendously-bad-in-2017.1948025/
My R9-290 did this in subnautica only but the Vega 64 did it in 3 games.. The 1080ti i have and the 1070ti didnt do it.
Most of the games you play are not AAA titles, and AMD's drivers don't get optimized for those titles as much as NVIDIA's do, since AMD doesn't have the resources to optimize and fix every single title out there. That's why you'll see your Vega 64 trailing or merely equaling a regular 1070 in dozens of games already on the market, like Frostpunk, Conan Exiles, Crysis 3, etc.
 
Last edited:

gamervivek

Senior member
Jan 17, 2011
490
53
91
Looking at GPU-Z, the core clocks and especially the memory clocks were jumping all over the place; the memory would bounce 800-950-800-950 at 100% GPU load and just wouldn't stop.

Did you monitor the hotspot temperature? I don't get memory downclocking unless the card is power limited by going highly negative (below -20%) on the Wattman settings, or is throttling because the hotspot goes over 105°C.

AMD's boost clock really is a 'boost' clock that you'll only hit sparingly, while Nvidia's boost clock is basically a floor above which the card normally operates.

And in regards to microstuttering, I think the 1080 Ti might actually be worse at it; looking at duderandom84's comparison videos, the Nvidia cards show more variable frametimes.
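
Frametime variability is easy to quantify if you capture a frametime log (FRAPS and OCAT can both export one). A minimal sketch, assuming the log has been reduced to one frame time in milliseconds per line (real logs carry headers and extra columns you would strip first):

Code:
import statistics

# Summarise a frametime capture: average FPS, 1% low FPS, and frame-to-frame
# variability, which is what "microstutter" shows up as.
with open("frametimes.txt") as f:            # one frame time (ms) per line
    ft = sorted(float(line) for line in f if line.strip())

avg_fps = 1000 / statistics.mean(ft)
p99_ms = ft[int(len(ft) * 0.99)]             # 99th percentile frame time
print(f"average: {avg_fps:.1f} fps, 1% low: {1000 / p99_ms:.1f} fps")
print(f"frame time stdev: {statistics.stdev(ft):.2f} ms")
# A 70 fps average with 1% lows down in the 15-20 fps range would look exactly
# like what the OP describes: a healthy counter that still feels awful.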



Both Ryzen and GCN are a clockspeed bump away from catching up to their competition, at least before the software side comes into the picture.

 

slashy16

Member
Mar 24, 2017
151
59
71
There is no reason to buy a Vega-based card unless you are into mining. They are terrible cards on every level compared to the Nvidia alternatives.
The only AMD cards worth purchasing today are the RX 580/570 series.
 

kondziowy

Senior member
Feb 19, 2016
212
188
116
When looking at GPU-Z the core clocks and specially memory clocks were jumping all over the damn place.. Memory would jump 800-950-800-950-800-950 at 100% gpu load and it wouldnt just fricken stop..
I have seen something like this on my card, but only if I applied too little voltage to the memory or overclocked it too much; then it liked to jump between 800 and 1100 MHz.

At stock I have no problem, and I don't even bother undervolting lately because it works so well on the out-of-the-box profiles. I always did that on my past cards to get temperatures down; now temperatures are low and noise is low. But that's on the Sapphire Nitro, which has no problem keeping Vega cool. I had an Asus Fury and had an awesome experience with it too, but I knew the Vega Strix was not the way to go. Why would you buy it when reviews showed it isn't even faster than the reference card? Imagine how people with the Gigabyte OC version feel; that may be even worse. And PowerColor had problems with the heatsink making no contact with the HBM2 at all; who knows if they fixed that, nobody tested it later. For Vega there are really only the Sapphire Nitro and Pulse worth buying.