Getting Vega 56 and 1070 Ti at the same price, which one should I buy?




Poll closed. Total voters: 38.

tamz_msc

Diamond Member
Jan 5, 2017
I wish the Steam Hardware Survey could detect and report VRR-enabled monitors and monitor refresh rates. Then we could really see how much of an impact internet wisdom has on people's HW purchases.

It's ridiculous and tiresome - "dude, you can't play an FPS without that kewl Razer mouse!", "dude, you can't play an FPS without a 144Hz monitor, heck, you need a 240Hz one for CS:GO", "Get 4K, bro, the new Assassin's Creed looks so shiny".

This really needs to stop.
 

nurturedhate

Golden Member
Aug 27, 2011
tamz_msc said:
I wish the Steam Hardware Survey could detect and report VRR-enabled monitors and monitor refresh rates. Then we could really see how much of an impact internet wisdom has on people's HW purchases.

It's ridiculous and tiresome - "dude, you can't play an FPS without that kewl Razer mouse!", "dude, you can't play an FPS without a 144Hz monitor, heck, you need a 240Hz one for CS:GO", "Get 4K, bro, the new Assassin's Creed looks so shiny".

This really needs to stop.

That was the same rabbit hole another poster was going down. The OP doesn't need a new video card to "dude play an FPS". A 280X is still a functional card at 1080p with lowered settings. If someone offered me a top-of-the-line 4K 144Hz VRR monitor or the new Titan X collector's edition today, I'm taking the monitor without question.

It really boils down to this: the OP wants a 1070 Ti/V56. If the OP is stuck at 1080p 60Hz, both cards are hamstrung by the monitor, and the OP may as well get a 1060/580 and save the money. If the OP is interested in a monitor upgrade, then a 1070 Ti/V56 is a solid match for a 1440p 144Hz monitor. If the OP decides to upgrade the monitor, then VRR tech needs to be brought into the discussion, which means the choice of card should also be weighed against the choice of monitor. It's not simply a "you have to have VRR now!" argument. It's a "being on 1080p 60Hz in 2017/2018 and wanting a mid/high-end card to pair with it" argument. If someone is monitor shopping today, then VRR needs to be discussed. This isn't much different from someone asking whether they should buy a 1070 Ti/V56 for their i5-2300 rig.
 
  • Like
Reactions: tential

tamz_msc

Diamond Member
Jan 5, 2017
nurturedhate said:
That was the same rabbit hole another poster was going down. The OP doesn't need a new video card to "dude play an FPS". A 280X is still a functional card at 1080p with lowered settings. If someone offered me a top-of-the-line 4K 144Hz VRR monitor or the new Titan X collector's edition today, I'm taking the monitor without question.

It really boils down to this: the OP wants a 1070 Ti/V56. If the OP is stuck at 1080p 60Hz, both cards are hamstrung by the monitor, and the OP may as well get a 1060/580 and save the money. If the OP is interested in a monitor upgrade, then a 1070 Ti/V56 is a solid match for a 1440p 144Hz monitor. If the OP decides to upgrade the monitor, then VRR tech needs to be brought into the discussion, which means the choice of card should also be weighed against the choice of monitor. It's not simply a "you have to have VRR now!" argument. It's a "being on 1080p 60Hz in 2017/2018 and wanting a mid/high-end card to pair with it" argument. If someone is monitor shopping today, then VRR needs to be discussed. This isn't much different from someone asking whether they should buy a 1070 Ti/V56 for their i5-2300 rig.
You buy a more powerful GPU for one of the following:
  • Playing at a higher resolution at the same refresh rate
  • Playing at the same resolution at a higher refresh rate
  • Both
All of these options call for a suitable monitor, unless your monitor preferences are already settled. Whether or not that monitor should have VRR is a different question, one that boils down to subjective preference. It is absolutely OK to get a 1070 to keep FPS above 100 at all times in Quake Champions, for example, when previously you were only getting 60 FPS with lowered settings, even if your monitor is still 1080p 60Hz. That is because tolerance for screen tearing is a subjective matter.
 

nurturedhate

Golden Member
Aug 27, 2011
tamz_msc said:
You buy a more powerful GPU for one of the following:
  • Playing at a higher resolution at the same refresh rate
  • Playing at the same resolution at a higher refresh rate
  • Both
All of these options call for a suitable monitor, unless your monitor preferences are already settled. Whether or not that monitor should have VRR is a different question, one that boils down to subjective preference. It is absolutely OK to get a 1070 to keep FPS above 100 at all times in Quake Champions, for example, when previously you were only getting 60 FPS with lowered settings, even if your monitor is still 1080p 60Hz. That is because tolerance for screen tearing is a subjective matter.

1) This is a massive pile of contradictions.
2) A 1060 exceeds the requirements you listed.
 
  • Like
Reactions: ZGR

tamz_msc

Diamond Member
Jan 5, 2017
nurturedhate said:
1) This is a massive pile of contradictions.
2) A 1060 exceeds the requirements you listed.

1) No, it isn't.
2) Barely.

QC was just an example off the top of my head. You won't get >100 FPS in Battlefront 2 or Destiny 2 with an RX 580/GTX 1060; heck, you can't even keep a locked 60 in Nioh with those cards if gaming at the highest detail is your priority.

"Get an RX 580/GTX 1060 for a 1080p 60Hz monitor" is not one-size-fits-all advice.
 
  • Like
Reactions: PeterScott

ZGR

Platinum Member
Oct 26, 2012
Running a game at 100 FPS on a 60Hz monitor without adaptive sync is tear city. I do not understand the point of that...

Gaming at ultra on a 1080p 60Hz display looks awful compared to a 4K 60Hz display at high settings. I cannot see the reasoning behind upgrading a GPU past RX 580/GTX 1060 levels if the user refuses to upgrade their ancient display.
 
  • Like
Reactions: Elfear

tamz_msc

Diamond Member
Jan 5, 2017
<10 ms frame latency with tears is still better than 16.7-33.3 ms frame latency without tears.
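For reference, a minimal sketch (Python) of the frame-time arithmetic behind those numbers; it assumes nothing beyond frame time = 1000 ms / FPS:

```python
# Frame time in milliseconds at a given frame rate: 1000 / fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 100, 144):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 100 FPS -> 10.0 ms, 144 FPS -> 6.9 ms
# Above 100 FPS each frame is under 10 ms old when it reaches a 60Hz panel;
# without VRR the buffer swap can land mid-scanout, hence the tearing trade-off.
```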
 

ZGR

Platinum Member
Oct 26, 2012
tamz_msc said:
<10 ms frame latency with tears is still better than 16.7-33.3 ms frame latency without tears.

If the monitor supports a refresh rate higher than 60Hz, I completely agree. Otherwise, I see no point in doing that.
 

tamz_msc

Diamond Member
Jan 5, 2017
ZGR said:
If the monitor supports a refresh rate higher than 60Hz, I completely agree. Otherwise, I see no point in doing that.

Even if the monitor is 60Hz, more FPS is always better. Tolerance for screen tearing is subjective, like I said, but everybody would feel the difference that higher FPS makes.
 

Thinker_145

Senior member
Apr 19, 2016
What a load of BS. You can easily bring a 1070 to its knees at 1080p 60Hz. Jesus Christ, where do these stupid arguments come from?

144Hz, you say? Tell me again how many games you actually play at that frame rate. None of the AAA games, that's for sure, unless you like to play with seriously compromised IQ settings.

1080p 60Hz is not only still an incredibly demanding setting, but sticking to that resolution also helps the longevity of a card. 144Hz is fine, since you can always play at a lower FPS, but once you go 1440p you are permanently stuck in a much more expensive upgrade cycle. You think 1080p looks like trash? Well, try some SSAA, or better yet shove your elitist ass somewhere else.
 

Guru

Senior member
May 5, 2017
A GTX 1060 6GB or RX 580 8GB is absolutely fine for 1080p 60Hz gaming; in fact, you'd be able to max out 95+% of games, no doubt!

But if you want to max all games and have more longevity, then a 1070 is definitely a better buy if you have the money.

Getting a 1080 for a 1080p 60Hz display might be overkill. Sure, you get even more longevity, but there is little reason beyond longevity to invest so much when the added frame rate would essentially be wasted; you could invest in a 1080p 120Hz monitor instead, for example.
 
  • Like
Reactions: tential

ZGR

Platinum Member
Oct 26, 2012
How long have you guys been playing at 1080p 60Hz? It has been a cheap resolution for nearly 10 years. Here we are arguing in late 2017, when there are affordable 4K monitors (under $400) that make 1080p look incredibly bad. No amount of AA will fix a low native resolution.

It is so weird to me that an enthusiast forum is so attached to such an old resolution. We are talking about 1070 Ti/Vega 56 performance here. These two GPUs are excellent at 4K 60Hz.
 
  • Like
Reactions: tential

PeterScott

Platinum Member
Jul 7, 2017
ZGR said:
How long have you guys been playing at 1080p 60Hz? It has been a cheap resolution for nearly 10 years. Here we are arguing in late 2017, when there are affordable 4K monitors (under $400) that make 1080p look incredibly bad. No amount of AA will fix a low native resolution.

It is so weird to me that an enthusiast forum is so attached to such an old resolution. We are talking about 1070 Ti/Vega 56 performance here. These two GPUs are excellent at 4K 60Hz.

This kind of snobbish PC Master Race, ego-stroking, mine-is-better-than-yours nonsense gets tiresome fast.

My monitor is 1920x1200, which is the perfect pitch for text without scaling, since I use my computer for more than just gaming. It looks perfectly fine and sharp. Hell, until recently most movie theaters were running less resolution than that on 50-foot screens, and no one was complaining.

There is nothing wrong with 2K-resolution screens. They look great.

When I do get a new monitor I will be looking for a ~32" 1440p one. It's the same pitch; the added resolution just gives a larger screen.

4K for a computer monitor I won't do, because it would need to be too large to handle text without scaling.
 

tential

Diamond Member
May 13, 2008
PeterScott said:
This kind of snobbish PC Master Race, ego-stroking, mine-is-better-than-yours nonsense gets tiresome fast.

You tried extremely hard there to take his comments in an offensive manner.

ZGR said:
How long have you guys been playing at 1080p 60Hz? It has been a cheap resolution for nearly 10 years. Here we are arguing in late 2017, when there are affordable 4K monitors (under $400) that make 1080p look incredibly bad. No amount of AA will fix a low native resolution.

It is so weird to me that an enthusiast forum is so attached to such an old resolution. We are talking about 1070 Ti/Vega 56 performance here. These two GPUs are excellent at 4K 60Hz.

People simply don't understand that maxing out 1080p by ticking every setting isn't how you max out fidelity (I was in this crowd until I experimented with VSR). Increasing resolution/refresh rate can increase IQ far more than ticking that last FPS-killing feature. You can only educate so much.
I'd imagine people with high-end GPUs at 1080p/60FPS are getting less visual fidelity than a GTX 1060/RX 580 user at 1440p/144Hz with balanced settings...

About the bold/underlined... so true. You see a higher resolution and it just looks amazing. So crisp. I can't wait for 8K/150+Hz.

We need a new thread for this conversation... I thought this was a completely different thread.

Answer to this thread, taking the question directly without considering any external factors:
"Either GPU... it's essentially the same thing in practice."
 
  • Like
Reactions: ZGR

PeterScott

Platinum Member
Jul 7, 2017
tential said:
Increasing resolution/refresh rate can increase IQ far more than ticking that last FPS-killing feature. You can only educate so much.

Stop implying that you are smarter than everyone else and need to educate the ignorant masses, all the while assuming your opinion is fact.

I have seen higher resolution; it's higher resolution, not the second coming.
 

Guru

Senior member
May 5, 2017
Resolution doesn't increase graphical fidelity, and depending on the size of the monitor it may even be worse. For example, 1440p on a 28" monitor has roughly the same pixels per inch as a 21" 1080p monitor.

Second, higher resolution doesn't improve graphical fidelity even on a smaller monitor, nor does it straighten rough edges. What it does give is more clarity, so if you compare high enough resolutions, like 4K vs 1080p, it's going to look slightly better.
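For what it's worth, pixel density is simple arithmetic; here is a minimal sketch (Python), using the monitor sizes quoted in this thread as examples:

```python
import math

# Pixels per inch: diagonal pixel count divided by diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'21" 1080p: {ppi(1920, 1080, 21):.1f} PPI')  # ~104.9
print(f'28" 1440p: {ppi(2560, 1440, 28):.1f} PPI')  # ~104.9 (a wash)
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} PPI')  # ~91.8 (lower than 21" 1080p)
print(f'24" 1080p: {ppi(1920, 1080, 24):.1f} PPI')  # ~91.8
```

The 28"-vs-21" pair comes out essentially equal, while 1440p only drops below the 21" 1080p density at larger sizes; incidentally, 32" 1440p matches the pitch of a 24" 1080p panel, which is the "same pitch" point made a few posts up.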
 
  • Like
Reactions: PeterScott

amenx

Diamond Member
Dec 17, 2004
Higher resolutions allow you to increase the size of your monitor. That's its main advantage for me. When you can see tiny details on a 32"+ 4K display that do not show up well on a 24" 1080p one, I might count that as part of 'graphics fidelity'. No matter how good 'graphics fidelity' may be, IMO it's enhanced on larger, higher-res monitors (more immersive, more involving).
 
  • Like
Reactions: ZGR and tential

DXDiag

Member
Nov 12, 2017
There are several types of fidelity: temporal, spatial, and graphical.

Temporal covers FPS and a tearing/stutter-free experience. This is important, as it affects your ability to enjoy the game. You get it by having the latest adaptive-sync displays with the highest refresh rates, e.g. a 165Hz G-Sync monitor.

Graphical covers how each pixel is lit/shadowed and receives detail such as textures and polygons. This is important, as it affects immersion in the game; you want to create the best possible virtual world by maxing every graphics setting available. You obtain this through more processing power, e.g. a 5GHz 8-core CPU and a 1080 Ti.

Spatial covers the resolution of the game, along with anti-aliasing: the higher the resolution, the sharper the image, the less aliasing, and the more detail you can see at large monitor sizes. HDR is a property of the monitor as well, delivered through technologies in the panel itself (dimming, brightness). This affects your ability to see the game without distraction. You get this fidelity by buying a bigger HDR screen and more processing power, e.g. a 40-inch 4K HDR-enabled screen.

Every fidelity type is important, and you get all of them by pairing the best possible screen with the best hardware. The screen portion is mostly passive; HDR, G-Sync, and 165Hz won't cost you processing power. The hardware side, however, is costly: you want the best GPUs for native resolution and maxed-out graphics.

The best fidelity is obtainable like this:
1- A 34-inch 4K 120Hz+ screen (the resolution-to-screen-size ratio is high here, to maximize sharpness and reduce aliasing)
2- A Titan Xp or 1080 Ti; sadly this is your only choice, as it is the most powerful single-GPU solution
3- The screen should be HDR-enabled (1000 nits) with G-Sync (your only choice after a 1080 Ti)

You can of course compromise by choosing a smaller screen:
1- A 27-inch 1440p 120Hz+ screen, HDR-enabled
2- A Vega 64 or GTX 1080 / Vega 56 or GTX 1070 Ti
3- The screen should be FreeSync or G-Sync, depending on your choice of GPU (see the sketch below)
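A hedged aside on point 3: as of this thread's 2017 timeframe, the adaptive-sync tech simply had to match the GPU vendor. The helper below is hypothetical, purely to illustrate that pairing rule:

```python
# Hypothetical illustration of the pairing rule in point 3 (circa 2017):
# G-Sync requires an NVIDIA GPU, FreeSync requires an AMD GPU.
def sync_tech(gpu: str) -> str:
    nvidia = {"GTX 1070 Ti", "GTX 1080", "GTX 1080 Ti", "Titan Xp"}
    amd = {"RX 580", "Vega 56", "Vega 64"}
    if gpu in nvidia:
        return "G-Sync"
    if gpu in amd:
        return "FreeSync"
    raise ValueError(f"unknown GPU: {gpu}")

print(sync_tech("Vega 56"))      # FreeSync
print(sync_tech("GTX 1070 Ti"))  # G-Sync
```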