AMD Polaris 10 Samples work at 1.27 GHz

Page 8 - AnandTech Forums

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Better solution for bottlenecking is to upgrade to higher res. At 1440p you also end up with better image quality. Seeing 980ti owners still on cheap 1080p displays just rubs me the wrong way, but sadly there are still many of them.

RS, any 1440p benches from that site?

I agree. It's ludicrous how some are sticking to 22-24" 1080p 60Hz monitors and eyeing a GTX1070/1080 upgrade. The context has been completely lost on these PC gamers.

Not from that site, but TechSpot just did a review of 980 Ti SLI and it shows incredible bottlenecks on a Skylake i7 6700K @ 4.5GHz at 1440p with slower DDR4 memory. We know that a Skylake i7 6700K @ 4.5GHz with DDR4-2133 is still faster than anything other than an i7 4790K with DDR3-2400. That means every CPU below that is automatically bottlenecking 1070/1080-level cards at 1080p.

Test System Specs
Intel Core i7-6700K Skylake @ 4.50GHz
Asrock Z170M OC Formula
G.Skill TridentZ 8GB (2x4GB) DDR4-4000
2x GeForce GTX 980 Ti SLI
Samsung SSD 950 Pro 512GB
Silverstone Strider Series ST1000-G Evolution 1000w
Windows 10 Pro 64-bit

ARMA 3 sees a 10% increase in minimum frame rates when going from DDR4-3000 to DDR4-4000.
[Charts: ARMA3.png, BlackOps.png, Civilization.png]


Although DDR4-4000 wasn't a huge step forward over the 3600MT/s memory, it was a whopping great step up from 3000MT/s, delivering a 19% higher minimum frame rate.
[Charts: Fallout.png, TheDivision.png, Witcher.png]

GTX 1070/1080/980 Ti OC are all CPU-limited at 1080p. These are 1440p cards. Right now, a Skylake i7 6700K @ 4.5GHz doesn't bottleneck a 980 Ti-level card at 1440p.

[Chart: GTX980Ti.png]


As GPUs get even more powerful (think 2017 Big Pascal/Vega), the CPU and DDR4 bottlenecks will only grow. People buying an i5 6500 + DDR4-2133, or still using Sandy/Ivy Bridge i5/i7 CPUs, are not accepting reality if they think their CPUs aren't limiting the full potential of GTX 1070 and higher-level cards at 1080p.

http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page4.html
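To put the quoted uplifts in concrete terms, here's the percentage math. The FPS values below are illustrative placeholders, not TechSpot's actual results:

```python
# Percent gain in minimum frame rate between two memory speeds.
# FPS values are illustrative placeholders, not TechSpot's data.
def uplift_pct(base_fps, faster_fps):
    return (faster_fps - base_fps) / base_fps * 100

# A jump like DDR4-3000 -> DDR4-4000 delivering ~19% higher minimums:
print(round(uplift_pct(58.0, 69.0), 1))  # -> 19.0
```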

This is the first time in roughly 5 years that CPU and memory bottlenecking is a genuinely real issue that needs to be discussed, and almost everyone is avoiding it because they're too busy discussing perf/watt and the launch of Pascal GPUs.

If one argues that they're using a 1080p 120-144Hz monitor, they'll need the fastest CPU + DDR4 memory possible to maximize that monitor's potential. OTOH, if someone is using a 1080p 60Hz monitor, this level of GPU power is simply wasted outright.

AMD needs to deliver the message that Polaris 10 is the perfect 1080p/60Hz card, while 1070 and Vega-level cards are for 1080p 120-144Hz and 1440p and above. But their marketing department isn't that competent...
 
Last edited:

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
RS, would these same memory bottlenecks be present in a quad channel X99 Broadwell-E machine? I realize that Skylake is a bit faster per core than Broadwell but would the quad channel memory at least remove the memory bottleneck?

Just curious since I'm seriously considering going with a Broadwell-E system and either a GTX 1070/1080 or Top Polaris 10 GPU in July so that I can still go free Win10 and sell my current system to my father. I'm on 1440p @60hz so it still remains to be seen if Polaris 10 will be the right upgrade for me. I suspect Polaris 10 will just have to be a placeholder if I go with it but will still be a decent upgrade from my current GTX 780.



Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
RS, would these same memory bottlenecks be present in a quad channel X99 Broadwell-E machine? I realize that Skylake is a bit faster per core than Broadwell but would the quad channel memory at least remove the memory bottleneck?

.

No, it doesn't. Quad channel is almost useless; it can't replace faster memory in games.
There's a huge gap in memory bandwidth between ST and MT, and it seems that games really need all of that bandwidth available to a single thread.
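For context on the numbers involved, theoretical peak bandwidth is channels × 8 bytes per transfer × transfer rate. Paper specs only; as the post argues, games rarely exploit the extra channels:

```python
# Theoretical peak memory bandwidth in GB/s: channels * 8 bytes * MT/s.
# Paper figures only; games rarely come close to using extra channels.
def peak_bandwidth_gbs(channels, mt_per_s):
    return channels * 8 * mt_per_s / 1000

dual_ddr4_3200 = peak_bandwidth_gbs(2, 3200)   # 51.2 GB/s
quad_ddr4_2133 = peak_bandwidth_gbs(4, 2133)   # ~68.3 GB/s
print(dual_ddr4_3200, quad_ddr4_2133)
```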
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
No, it doesn't. Quad channel is almost useless; it can't replace faster memory in games.
There's a huge gap in memory bandwidth between ST and MT, and it seems that games really need all of that bandwidth available to a single thread.
Same in dx12?
 

wege12

Senior member
May 11, 2015
291
33
91
Would a bottleneck be present gaming at 1080p and up using an i7 5820k @ 4.6 GHz, Fury X oc'd to 1160 MHz core and 545 MHz HBM and 16GB DDR4 @ 2800 MHz?
 
Last edited:

ZZZAAA

Member
May 17, 2016
161
0
0
How is the 4790K faster than a 6700K? Also, are Skylakes limited to 2133 DDR4 or what? I happen to have a 6700K with 2133 (which hopefully OC's to at least 2400).
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Would a bottleneck be present gaming at 1080p and up using an i7 5820k @ 4.6 GHz, Fury X oc'd to 1160 MHz core and 545 MHz HBM and 16GB DDR4 @ 2800 MHz?

It depends. If you are targeting 120Hz or higher then sure.

ps. you really can't assume an OC of 4.6GHz. Mine doesn't go above 4.3GHz with the best cooling, not that a paltry 300MHz would make a noticeable difference.

Same in dx12?

The quad-channel memory is only useful if you run many threads that need lots of memory bandwidth; gaming loads just aren't like that, so the benefit of a quad-channel setup over dual channel is minimal.

ps. to alleviate the memory bottleneck in games on a socket 2011-3 machine you have to overclock the uncore. In Fallout 4, the most memory-bottlenecked game that I know of, overclocking the uncore is even more important than overclocking the core.
 
Last edited:

DeathReborn

Platinum Member
Oct 11, 2005
2,753
749
136
I am assuming that 5760x1080 won't exactly bottleneck a 4770K @ 4.8GHz, 32GB 2666MHz CL10 DDR3 & 480X. Might have to upgrade to Zen 8C16T/7700K on release to avoid Russians 1080p doom...

I'll jump to 4K when Vega hits, not before.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I am assuming that 5760x1080 won't exactly bottleneck a 4770K @ 4.8GHz, 32GB 2666MHz CL10 DDR3 & 480X. Might have to upgrade to Zen 8C16T/7700K on release to avoid Russians 1080p doom...

I'll jump to 4K when Vega hits, not before.

That's triple the 1080p resolution and it's even higher than what I have (3440x1440). No way you will be CPU-bound.

ps. 5760x1080 is about 6.2MP and 4K is about 8.3MP, but due to the aspect ratio the performance might be very similar between the two resolutions.
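The pixel math checks out, give or take rounding; a quick sketch:

```python
# Total pixel counts for the resolutions being compared in the thread.
resolutions = {
    "5760x1080 (triple 1080p)": (5760, 1080),
    "3840x2160 (4K UHD)": (3840, 2160),
    "3440x1440 (ultrawide)": (3440, 1440),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
```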
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
RS, would these same memory bottlenecks be present in a quad channel X99 Broadwell-E machine? I realize that Skylake is a bit faster per core than Broadwell but would the quad channel memory at least remove the memory bottleneck?

Just curious since I'm seriously considering going with a Broadwell-E system and either a GTX 1070/1080 or Top Polaris 10 GPU in July so that I can still go free Win10 and sell my current system to my father. I'm on 1440p @60hz so it still remains to be seen if Polaris 10 will be the right upgrade for me. I suspect Polaris 10 will just have to be a placeholder if I go with it but will still be a decent upgrade from my current GTX 780.

I would imagine the quad-channel memory on BW-E will provide sufficient bandwidth for games. The comparison of Skylake to BW-E is a bit different since they are different architectures.

You can buy 16GB (4x4GB) DDR4-3200 for $85 now. DDR4-3200 is the sweet spot right now as far as pricing goes. The next level up, DDR4-3400, costs $130, which is too much.

For 1440p/60Hz, I'd pick an aftermarket 1070. The 1080 is too expensive, and looking at raw FPS rather than just percentages (sure, the 1080 will beat the 1070 by 20-25% on a chart), the 1070 is more than fast enough.
http://wccftech.com/nvidia-geforce-gtx-1070-titan-x-killer/

Then in 2018-2019, just sell the 1070 and get a faster Volta/AMD's equivalent GPU.

The question you should be asking is whether the BW-E i7 6800K even makes sense. In games it might lose to an i7 6700K OC; let's wait for reviews. Besides the ASRock X99 Extreme4, almost all X99 mobos are more expensive. 4x4GB DDR4-3200 costs more than 2x8GB DDR4-3200, and the i7 6800K will also cost more than the i7 6700K. You only need a $20 air cooler to hit 4.6-4.8GHz on an i7 6700K. Add up all the cost differences and you may end up better off with an i7 6700K + $20 cooler + 16GB DDR4-3200 and a $130 Z170 board, rather than the X99 platform plus the more expensive $50-60 cooling needed to hit 4.5-4.8GHz on it.
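To make the cost comparison concrete, here's a rough tally. The RAM and cooler figures come from the post; the CPU and board prices are my assumed mid-2016 street prices, not quoted anywhere above:

```python
# Rough platform-cost tally. Entries marked "assumed" are hypothetical
# mid-2016 street prices, not figures from the post.
z170_build = {
    "i7-6700K (assumed)": 330,
    "Z170 board": 130,
    "16GB DDR4-3200 (2x8GB)": 85,
    "air cooler": 20,
}
x99_build = {
    "i7-6800K (assumed)": 440,
    "X99 board (assumed)": 200,
    "16GB DDR4-3200 (4x4GB, assumed)": 100,
    "tower cooler": 55,
}
print(sum(z170_build.values()))  # 565
print(sum(x99_build.values()))   # 795
```

Even with generous assumptions, the Z170 route comes out a couple hundred dollars cheaper, which is the post's point.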

When we recommended the i7 5820K over the i7 6700K for 6 months, the reverse was true. From a cost perspective, the i7 5820K platform made a lot of sense due to the jacked-up price of the i7 6700K, very expensive DDR4-3200 at the time, and very few great $130 Z170 mobos.

Would a bottleneck be present gaming at 1080p and up using an i7 5820k @ 4.6 GHz, Fury X oc'd to 1160 MHz core and 545 MHz HBM and 16GB DDR4 @ 2800 MHz?

The bottleneck here is better described as CPU-limited. Your GPU is way too fast for 1080p 60Hz gaming, and it's also held back by the DX11 API.

Look at Fury X vs. GTX 980:
1080p: 9.4% faster
1440p: 22% faster
4K: 27% faster
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/26.html
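The pattern in those numbers is the telltale sign of a CPU/API limit: the Fury X's lead grows as higher resolutions shift the work onto the GPU. In relative terms:

```python
# Fury X lead over GTX 980 at each resolution (figures quoted above).
lead_pct = {"1080p": 9.4, "1440p": 22.0, "4K": 27.0}
for res, pct in lead_pct.items():
    # Relative frame rate: 1.094x at 1080p grows to 1.270x at 4K.
    print(f"{res}: {1 + pct / 100:.3f}x")
```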

At 1080p 60Hz, between the DX11 API overhead and your CPU+memory, the system can't keep up with a Fury X; a much slower GTX 980 is almost as fast.

This is why I call 1080p 60Hz 'peasant resolution' for $350+ GPUs of 2016. The resolution isn't demanding enough for such high-end cards. You need a 1440p 60Hz, 1440p 144Hz, or 4K monitor upgrade. :D :thumbsup:

1080p 60Hz needs to die or be relegated to budget builds. :thumbsdown:

How is the 4790K faster than a 6700K? Also, are Skylakes limited to 2133 DDR4 or what? I happen to have a 6700K with 2133 (which hopefully OC's to at least 2400).

Yes, did you not see the graphs I provided?

Skylake requires DDR4 3000-4000 to shine. Even DDR4 3000 isn't fast enough for i7 6700K OC.

Every single review at launch that tested the i7 6700K with DDR4-2133/2400 made me erroneously conclude that Skylake was crap. The reality is that none of those reviewers tested the true potential of a Skylake CPU.

To get the most out of Skylake, get DDR4 3200 or faster
http://www.purepc.pl/pamieci_ram/te...pamieci_ram_wybrac_do_intel_skylake?page=0,11

I am assuming that 5760x1080 won't exactly bottleneck a 4770K @ 4.8GHz, 32GB 2666MHz CL10 DDR3 & 480X. Might have to upgrade to Zen 8C16T/7700K on release to avoid Russians 1080p doom...

I'll jump to 4K when Vega hits, not before.

No, you are fine. A 480/X isn't strong enough to be CPU-limited across 3x1080p monitors with your CPU. You are better off getting a 1070/1080 for those 3 screens due to Simultaneous Multi-Projection. Read up on that tech; it's amazing.
 
Last edited:

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
I would hardly recommend a 1070 for those 3x1080p monitors... it will only drive two flawlessly. The best case is the 1080.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
1080p 60Hz is less CPU-bottlenecked than 1080p 144Hz, since most CPUs can hold 60 FPS minimums while far fewer can hold 80+ FPS minimums. The actual bottleneck is the refresh rate.
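That point can be sketched in one line: delivered frame rate is capped by whichever is lower, the system's output or the panel's refresh (ignoring tearing and VRR for simplicity):

```python
# Delivered FPS on a fixed-refresh panel (vsync, no VRR): capped by
# whichever is lower, the system's frame rate or the refresh rate.
def delivered_fps(system_fps, refresh_hz):
    return min(system_fps, refresh_hz)

print(delivered_fps(130, 60))    # 60Hz panel: GPU headroom is wasted
print(delivered_fps(130, 144))   # 144Hz panel: CPU/GPU are the limit again
```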
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I would hardly recommend a 1070 for those 3x1080p monitors... it will only drive two flawlessly. The best case is the 1080.

Imho, the 1080 makes no sense, just like 970 SLI / R9 290/290X CF >>> 980. $760 for 1070 SLI is $150 more than the cheapest 1080 and will smash it. The only high-end cards from NV that ever make sense are the Big Die ones (780 Ti/980 Ti). The 1080 won't be fast enough for 1440p 144Hz, not fast enough for 4K, but a 3584-3840 core Big Pascal = 1070 SLI easy.

Looking back, we already have proof too.

680 never outlasted the 670
980 never outlasted the 970

The $600-700 mid-range x04 cards are good for web benchmarks and not much else. In the real world, 1080 Ti / 1070 SLI are the way to go unless one has money to burn on 2x 1080 SLI right away. A 1070 OC ~ 980 Ti OC will slice through 1080p/1440p 60Hz games until Volta anyway.

Another way to look at it: buy a $400 1070, then buy a $400 Volta 2070, and have great performance over the entire 4 years. The 1080 makes sense for the top 3% of gamers and those who are experts at reselling cards with minimal loss in resale value by perfectly timing the market every gen. It took only 2.5 years before $699 780 Ti-level performance could be bought in a $250-280 GTX 970 / R9 390! In just 12 months the $650 980 Ti lost $200 in resale value; one would be lucky to sell one for $450 now. When the 1070 comes out in 2 weeks, the 980 Ti won't be worth more than $350, almost half its retail launch price from last summer. Ouch. This applies to AMD too: the Fury X won't even be worth $325 next month.
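The depreciation examples above work out roughly as claimed; a quick check:

```python
# Resale depreciation math from the examples above.
def loss_pct(launch_price, resale_price):
    return (launch_price - resale_price) / launch_price * 100

print(round(loss_pct(650, 450)))  # 980 Ti today: ~31% down
print(round(loss_pct(650, 350)))  # 980 Ti at $350: ~46% down, nearly half
print(round(loss_pct(699, 280)))  # 780 Ti perf in a $280 970: ~60% cheaper
```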

The more expensive next gen mid-range cards are, the more they will be stomped in price/performance by 2017 cards.
 
Last edited:

DeathReborn

Platinum Member
Oct 11, 2005
2,753
749
136
No, you are fine. 480/X isn't strong enough for 3x1080p monitors and your CPU. You are better off getting a 1070/1080 for those 3 screens due to Simultaneous Multi-Projection. Read up on that tech; it's amazing.

Well, it would be 2x 480X really, going from 2 PowerColor PCS+ R9 290s @ 1150/1400 (max stable clocks). I don't think it'll be a huge performance uplift, but the 8GB and lower power consumption will be very welcome.
 

CakeMonster

Golden Member
Nov 22, 2012
1,414
515
136
What happened with memory performance in the last 12 months? I remember reading a ton of tests around the Skylake release and I pretty much concluded that memory was not where it was worth putting my money. Is it only these new titles, or did I miss something? I've played at 1600p for years, so I always check the review scores for 1440/1600p.
 

C@mM!

Member
Mar 30, 2016
54
0
36
Well it would be 2x 480X really, going from 2 Powercolor PCS+ R9 290's @1150/1400 (max stable clocks). I don't think it'll be a huge performance uplift but the 8GB and lower power consumption will be very welcome.

The SMP stuff is pretty much necessary for VR, and I won't be surprised if Polaris supports something similar.

However, if you're doing Eyefinity, just get the biggest GPU you can. If you can't get the performance you want at that point, add another.

That being said, ask yourself if a single 21:9 monitor would suffice. Better support, lower requirements, and no bezels to deal with. I certainly made the switch.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
What happened with memory performance in the last 12 months? I remember reading a ton of tests around SL release and I pretty much concluded that memory was not where it was worth putting my money. Is it only these new titles or did I miss something? I've played at 1600p for years so I always check the review scores for 1440/1600p..

Like DDR3, DDR4 wasn't much of an upgrade when first released. Now it is, because we have affordable 3GHz+ DDR4; before, it was slow and overpriced.

It is gonna keep getting faster and cheaper.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What happened with memory performance in the last 12 months? I remember reading a ton of tests around SL release and I pretty much concluded that memory was not where it was worth putting my money. Is it only these new titles or did I miss something? I've played at 1600p for years so I always check the review scores for 1440/1600p..

[Chart: RAM.png]


What happened is reviewers didn't test Skylake properly, and we didn't have DDR4-4000 memory available in August 2015.

[Charts: Excel.png, 7-Zip.png, Photoshop.png, HandBrake.png]


Skylake is the first Intel CPU architecture/platform in a decade that actually benefits massively from faster memory. Keep in mind, if you have a 1070/980Ti level card for 1440p gaming and crank everything to the max, it's still going to be GPU limited for the most part in the most demanding games.

[Chart: GTX980Ti.png]


However, it's important to keep in mind that faster DDR4 truly does benefit SKL as many end up keeping the CPU/platform for 4-5 years. As DDR4 3600-4000 becomes more affordable, it'll be a good upgrade.
 

Heatshiver

Member
Jun 9, 2013
39
1
71
As long as AMD brings its A game, and NVIDIA adjusts accordingly, this could be a huge win-win for those of us waiting for price drops.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
After reading RS's posts, I just realized my buying rules are perfect :) 1080p + under-$300 GPUs.