
GN: Beating the 2070 with modded Vega 56

Paratus

Lifer
Steve gives Vega 56 Jeremy Clarkson levels of powah using a registry hack (power limit raised to 241%, up from 50%).

At that point, Vega 56 @ 1710/950 beats a stock 2070 and comes close to an OC'd 2070 in several games.

Of course it’s using 200 W more power, but still. 😀


[Attachment: vega-56-mod-f118-4k.png]
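For scale, the registry tweak just raises the driver's power-limit ceiling. A quick sketch of what those slider percentages work out to, assuming a ~165 W stock GPU power target for Vega 56 (an illustrative figure, not GN's exact numbers):

```python
# Rough power-limit math for the Vega 56 registry mod.
# Assumption: ~165 W stock GPU power target (typical Vega 56 spec);
# the hack raises the power-limit slider cap from +50% to +241%.
STOCK_POWER_TARGET_W = 165  # assumed stock GPU power target

def gpu_power_cap(offset_pct: float) -> float:
    """GPU power ceiling at a given power-limit offset (percent)."""
    return STOCK_POWER_TARGET_W * (1 + offset_pct / 100)

stock_cap = gpu_power_cap(50)    # normal slider maximum
modded_cap = gpu_power_cap(241)  # after the registry hack

print(f"+50%  cap: {stock_cap:.0f} W")
print(f"+241% cap: {modded_cap:.0f} W")
```

Under those assumptions the ceiling goes from roughly 250 W to well over 550 W for the GPU alone, which is consistent with the huge wall-power numbers discussed later in the thread.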
 
It actually does better than that chart shows if you look at avg FPS. It beats the 1080 Ti and 2080 FE. But at 200 extra watts, I doubt anyone would rush out to get one.
 
Man, have the forums changed over the years. I remember when free performance through overclocking was always welcomed. Power and heat considerations were ancillary concerns, and something we happily addressed, since it is a hobby after all. Raw performance for the dollar was the main point. Certainly the silent-PC crowd were unlikely to jump in, but the rest of us were always gung ho.

I was considering getting a 56 just for 64-level performance on the cheap. I kept putting it off because all of our displays are still 1080p, but now I am going to order one shortly, along with a FreeSync 1440p monitor. I won't be pushing it as hard as they are here, but enough to beat a stock 64 for certain.
 
It actually does better than that chart shows if you look at avg FPS. […]
Would it beat a max-overclocked 1080 Ti and 2080 FE?
I can't watch the vid.
 
Man have the forums changed over the years. I remember when free performance through overclocking was always welcomed. […]

Then the forum got invaded by people paying German power rates while living in Phoenix with a 30-year-old air conditioner.

Also, the Vega is still expensive: after all the work modding, you could just get a 1080, put a mild overclock on it, and still be faster.
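The power-rate jab is tongue-in-cheek, but the running cost is easy to ballpark. A sketch with assumed numbers (roughly €0.30/kWh for German residential power, the ~200 W extra draw cited above, and a few hours of gaming a day):

```python
# Ballpark yearly cost of an extra ~200 W under gaming load.
# Assumptions: 0.30 EUR/kWh (rough German residential rate),
# 4 hours of gaming per day, 365 days/year.
EXTRA_WATTS = 200
RATE_EUR_PER_KWH = 0.30
HOURS_PER_DAY = 4

extra_kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 365
cost_per_year = extra_kwh_per_year * RATE_EUR_PER_KWH

print(f"Extra energy: {extra_kwh_per_year:.0f} kWh/year")
print(f"Extra cost:  ~{cost_per_year:.0f} EUR/year")
```

Under those assumptions the mod adds on the order of €85-90 a year at German rates, which is real money relative to the card's price; at typical US rates it would be roughly a third of that.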
 
Man have the forums changed over the years. I remember when free performance through overclocking was always welcomed. […]
I remember the infamous Nvidia FX 5800 Ultra, which drew all sorts of jokes about heat and power usage. A quick Google shows it used approx. 75 W. We are now discussing a card that must be using 400 W!
 
LoL, and with the 2080 Ti's recent massive problems with artifacts ( https://imgur.com/h0m1LZq ), there is no real upgrade path from Vega 64. Guess I'll keep it for a few years... I didn't expect to have no upgrade path at the same price after a next-gen release.
 
I remember the infamous Nvidia FX 5800 ultra, which had all sorts of jokes about heat and power usage. […]
I had a 5800U; I traded it to the now-infamous Rollo (a covert, but not really, viral marketer for Nvidia) for his 9800 Pro at the time. Still the loudest video card I ever owned. When they say it sounded like a hair dryer, they were not joking; it was LOUD! We did the deal here in this forum.

I have a Ryzen 1600 + RX 580 8GB FreeSync system and a Ryzen 1600 + GTX 1060 6GB system, and the FreeSync experience is better, even in games that favor the 1060. FreeSync is something reviews and many users sometimes gloss over, but my experience is that it offers the smoothest experience for the best price. I'm not going to buy a 1080 when I know a modded 56 and a FreeSync monitor will deliver a buttery-smooth experience.
 
I kinda wish I'd gotten a Vega 64. Stock-clocked versus my OC'd 1070, it is 33% faster in R6 Siege; about 90 fps more.

It just seems like a fun card to mess around with. I definitely wouldn't give it this registry hack, but having the option is really nice. I hate how locked down NV is compared to AMD.

It seems like Vega provides very smooth frametimes. I could have gotten a 240 Hz FreeSync for $240 refurbished, the same price I paid for my 180 Hz G-Sync refurb. Those deals are gone with the wind now.

I have this feeling a Vega 56/64 owner will be able to hold onto their card a long time.
 
Man have the forums changed over the years. I remember when free performance through overclocking was always welcomed. […]

Don't you know? It's all about ball bearings these days... I mean performance per watt. Efficiency, man! Save the earth! Except when that doesn't matter, because raw performance means everything!
 
I don't think many people care about minor differences in power draw between cards, but when it's massive (+100 W), it can be a legitimate concern, because very high power draw usually requires more robust cooling and means managing higher noise levels and temps.
 
I had a 5800U, traded it to the now infamous Rollo (covert, but not really, viral marketer for Nvidia) for his 9800 pro at the time. […]

I have Ryzen 1600 + 580 8GB freesync system and a Ryzen 1600 + GTX 1060 6GB system, and the freesync experience is better, even in games that favor the 1060. […]

Does the 1060 system have G-Sync or not? You don't mention it, but the comparison is apples and oranges if not.
 
Does the 1060 system have G-Sync or not? You don't mention it but the comparison is apples & oranges if not.
Exactly! I do not have a G-Sync monitor. Why is that? Because the price premium is too high, and the comparison would be normally priced apples to really expensive apples (similar flavor/experience, but one costs a lot more). The whole point is value for my money, hence my comments about overclocking providing free performance and the smoothest experience at the best price. I ordered an AOC 32" 1440p 75 Hz FreeSync monitor for $230; I could not find a similarly specced G-Sync anywhere close to that price. If you see one, hit me up with a link in a PM so I can grab it for my 1060 system.
 
Man have the forums changed over the years. I remember when free performance through overclocking was always welcomed. […]

Well, the forums have definitely changed, just not the way you think. When was it ever the case that people would buy a card ranked 6 or 7 from the top, for 90% of the money, then overclock it, and still not beat a stock 1080 Ti, let alone an overclocked one? Sure, the Opteron 165 that overclocked on less-than-stock volts was great, and the graphics cards that could unlock shaders were great! But I have never seen anyone buy hardware so far from the top, crank the power limit 200%, and still lose to most of the main cards. What's the point, other than to say it's been done? The purpose of overclocking is extra performance for the same price! When you have to push the volts so high that you need custom cooling and your hardware will degrade, it's a one-time stunt, not settings you keep permanently. What do I love even more than overclocking? Lower-wattage, higher-performing parts! If you are serious about overclocking, you buy the best and overclock that.

If there were a way to cool it for a low price, then yes, this might be a good deal.
 
Where is a new 1080 Ti for 10 percent more than a Red Dragon 56 at $400, i.e. $440?
 
Well the forums have definitely changed not like you think. When was the day that people would buy a card ranked 6 or 7 from the top for 90% of the $$ and then overclock it not even beating a 1080ti stock let alone overclocked. […]

First off, I guess you weren't around when people would buy X800 GTOs, unlock the extra pipes, OC, and get something in the neighborhood of an X800 XL or X800 XT for minimal money. That was still several cards back from the top end (X850 XT/PE, X800 XT PE, GeForce 6800 Ultra).

Second, performance brackets are nothing like they used to be. The top end used to be priced at $380-$450, then it was $650. Now:
  • $3000 Titan Volta
  • $1400 Titans
  • $1000-1200 2080Tis
  • $800-$900 2080s


Lastly, and I’ve been saying this for a while: AIB Vega 56s, Vega 64s, 1070 Tis, 1080s, and 2070s are ALL basically in the same performance bracket. That bracket is now 3-5 places from the top, depending on how you want to count. So no one should expect a 56 to tie a card that costs 2-3 times as much.

The GN article is simply an interesting example of how far Vega can be pushed, and, to me, of how well the boards are built that they can handle that much power.
 
Yes, the Vega cards usually have good-quality components, but IMO this comparison only makes sense because the new cards are way too expensive.

It's that simple to me, unfortunately. To me it's depressing, but it's good you guys take the positive side, looking at the OC'd Vega rather than the overpriced garbage.
 
[Image: 2vhvjlt.jpg (power-consumption graph)]


The difference is more like the 280-300 W range when you consider that each gradation in the graph is 80 W and there are typically 3 to 4 bars' difference in power consumption between the RTX and the RX. The RTX 2070 is consuming somewhere around 300 W while the benchmark is running, while the Vega 56 is consuming about 600 W.

That's a bit excessive considering there is only a $100 difference in price between the two cards. Even overclockers in the past would have choked on a difference like that, since power supplies were much smaller then.
 