New GTX 1080 owner here in need of some help understanding a few things...

Le Québécois

Senior member
Dec 1, 1999
560
3
81
So just before Xmas, I was both "wise" (I saw the 1070 prices go up so I decided to buy a 1080 in case it did the same) and lucky (got the card for only $520): I bought an MSI GTX 1080 Gaming X 8G, and coming from an old HD 7970 GHz, I love it.

Now at the time, I was still using an old 22" 1680x1050 @60Hz monitor, so I didn't pay too much attention to it, but I thought the card was underperforming even for that low resolution. I kind of assumed my old i5 3570K @ 4GHz was the bottleneck and left it at that.

A few weeks ago, I finally bought a new monitor to go with the card, an acer predator xb271hu. Love it.

Of course, with the new resolution, I started testing my games again, figuring that at 1440p the CPU shouldn't really be a bottleneck, or if it was, only by around 5% (from what I've been reading, anyway).

To my surprise, in many of the games I've benchmarked, I'm well under what I should be getting, at least according to my research, which leaves me confused.

My worst results are mostly with Ubisoft's games (Siege, Wildlands and Far Cry 5), where I'm getting performance closer to a 1070 Ti (sometimes slower). With other games, like Deus Ex: Mankind Divided, DiRT Rally or Company of Heroes 2, it seems to be closer (barely a few frames slower) to a normal 1080.

Which brings me to my second question. My card isn't a "base" version, so I thought it might be throttling or something like that, but no, it's actually the opposite, which is also confusing. The MSI GTX 1080 Gaming X 8G in OC mode should boost to 1847MHz, but according to both MSI Afterburner and GPU-Z, I'm boosting at 1961MHz minimum (and I've seen it go as high as 1999MHz). If you're wondering, I've never seen the temperature go above 72C, so it stays pretty cool.

So here are my questions:

1- Did I really overestimate my CPU that much (even slightly OCed), to the point that in the worst case it looks like I have a base 1070? And if not, wth is going on here?

2- Well, wth is going on with my GPU? I'm not complaining about the extra OC, considering the low temperature and not having to change anything (like voltage). Still, I'd like to understand.

PS: While this isn't my only source, it's a good video to see what I mean when I say that I don't believe my CPU should be that huge of a bottleneck at that resolution:

https://www.youtube.com/watch?v=6dHCQOt5Nns&t=

I'm mostly looking at the base clock results.

Thank you in advance for any help.
 

Campy

Senior member
Jun 25, 2010
785
171
116
What kind of benchmarking have you been doing? Are they built-in benchmarks in the games or have you just been playing the game normally?

To answer your second question, Pascal GPUs have something called GPU Boost 3.0, which will change the core clock dynamically to give you maximum performance, as long as you have thermal and power limit headroom. A card such as yours which has a good cooler on it will tend to clock pretty high on its own.
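
If you want to watch that behaviour yourself outside of Afterburner/GPU-Z, here's a rough Python sketch using the pynvml bindings to log the core clock, temperature and power draw once a second. The package choice and the one-second polling interval are just my assumptions for illustration, not anything NVIDIA or MSI prescribe.

```python
# Rough sketch: poll the GPU core clock, temperature and power draw once a second
# so you can watch GPU Boost move the clock around under load.
# Assumes the pynvml bindings are installed (pip install nvidia-ml-py) and an
# NVIDIA driver is present; the readings come from the driver, not from MSI's
# advertised boost number.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)    # MHz
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)  # degrees C
        power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0                      # mW -> W
        print(f"core: {clock} MHz  temp: {temp} C  power: {power:.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```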
 
  • Like
Reactions: Le Québécois

DigDog

Lifer
Jun 3, 2011
13,496
2,122
126
how did you get a 1080 for 500 bucks? that card should be over 700.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
What kind of benchmarking have you been doing? Are they built-in benchmarks in the games or have you just been playing the game normally?

To answer your second question, Pascal GPUs have something called GPU Boost 3.0, which will change the core clock dynamically to give you maximum performance, as long as you have thermal and power limit headroom. A card such as yours which has a good cooler on it will tend to clock pretty high on its own.

I've been using the built-in benchmarks when there's one available, as it's usually the best way to get stable results that you can compare with other people's results or with articles.

For the GPU Boost, I thought it was similar to the other "turbo" technologies out there. So you're telling me that my card can actually boost itself beyond the manufacturer's boost numbers? That's pretty cool. And yes, this is a great cooler, I love it. That's why I went with this model, quiet and cool.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
how did you get a 1080 for 500 bucks? that card should be over 700.

Long story short, I bought it from Amazon.ca the week before Xmas, when it was on sale at 750 CAD (only $50 more than the basic cooler reference models). At the time, it was supposed to come with Destiny 2, but for some reason I never quite understood, they were never able to send me the code for the game, so instead they offered a 10% refund, meaning I got it for 675 CAD. In USD, which is what I assume most people are using around here, that is/was (the exchange rate has been stable over the last couple of months) around 520-530 USD. 10 days later, it was up to 1250 USD, so I'm pretty happy with the price I paid :p.
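
(Rough math, if anyone wants to check it; the CAD to USD rate below is an assumed ballpark on my part, roughly what it was at the time:)

```python
# Back-of-the-envelope price check. The 0.78 CAD->USD rate is an assumed
# ballpark figure, not the exact rate on the day of purchase.
sale_price_cad = 750.0
after_refund_cad = sale_price_cad * 0.90   # Amazon's 10% refund for the missing game code
cad_to_usd = 0.78                          # assumed exchange rate
print(f"{after_refund_cad:.0f} CAD  ~  {after_refund_cad * cad_to_usd:.0f} USD")
# -> 675 CAD  ~  527 USD, in line with the 520-530 USD range above
```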
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
Your i5 is bottlenecking your card.

Have you watched the video, or do you have any data to point to? Because everything I was able to find tells me that it shouldn't make that huge of a difference at 1440.

I'm not trying to argue with you, I'm just saying that all the information I've been able to find so far shows that the slower CPU (which is slightly OCed to 4GHz) shouldn't have as big an impact as the one I'm seeing.

I'm just looking to find a definitive answer as to what's happening. If it really is the CPU without a doubt, I'll be "happy" (in the sense that I'll know why and will just wait for my next system upgrade).
 

Adawy

Member
Sep 9, 2017
79
24
41
Do you mind telling me your RAM? I wanna know the capacity, frequency and timing.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
My system RAM?

Sure thing:
Got 16GB: 4 x 4GB of G.Skill F3-12800CL9D running at XMP timings (not using the profile in the BIOS, but using the same timings), so 1600MHz @ 9-9-9-24.
 

Adawy

Member
Sep 9, 2017
79
24
41
As I thought, it's your RAM.

All right, now you have 2 choices:

1) Get faster DDR3 RAM, 2400 to 2666MHz CL10.
2) Wait for Ryzen 2, check the reviews, see what platform is best for you and get an entirely new one with fast RAM (3200MHz CL14 is recommended).
 

DigDog

Lifer
Jun 3, 2011
13,496
2,122
126
Long story short.

i was asking because there are stories around of people buying "great deal" cards online that turn out to be fake cards from china, older models flashed to look like new.

despite what my egregious colleague here Adawy thinks, your ram is fine. ram is never really a problem unless you run out of it, which is what is happening with your VRAM on the 460 cards.

i politely disagree with @happy medium, as i've run DX10 games with a celeron; unless you are playing stuff like Supreme Commander, it's likely that your games offload most of the work onto the card, and a 3570K @ 4GHz is a very respectable processor.
http://cpu.userbenchmark.com/Compare/Intel-Core-i5-3570-vs-AMD-Ryzen-3-2200G/m793vsm441832
people around here don't really like userbenchmark, but you can see that your 3570K isn't too far from a brand new 2400.

it's just the cards. 1GB of VRAM is just not enough, and you can simply see this yourself by getting something like Process Explorer, running your game, and then watching the graphs. whichever one is maxed out is where you have a bottleneck.
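
(if you'd rather log it than stare at graphs, here's a rough Python sketch of the same idea, using psutil for the CPU side and pynvml for the GPU side; both package names and the one-second sample window are my own assumptions, and per-core load is what matters for games that lean on one or two threads.)

```python
# Rough bottleneck logger: sample per-core CPU load and overall GPU utilization
# while the game is running. Whichever side sits pinned near 100% is the likely
# limit. Assumes psutil and pynvml are installed (pip install psutil nvidia-ml-py).
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        # cpu_percent() blocks for the 1-second sample window, so no extra sleep is needed
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)   # % load per CPU core
        gpu_util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu   # % GPU busy
        print(f"GPU: {gpu_util:3d}%  busiest CPU core: {max(per_core):5.1f}%  all cores: {per_core}")
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```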
 
Last edited:
  • Like
Reactions: Le Québécois

Campy

Senior member
Jun 25, 2010
785
171
116
For the GPU Boost, I thought it was similar to the other "turbo" technologies out there. So you're telling me that my card can actually boost itself beyond the manufacturer's boost numbers? That's pretty cool. And yes, this is a great cooler, I love it. That's why I went with this model, quiet and cool.

Yeah, pretty much. The stated numbers are what the card is guaranteed to reach, but in reality all GPUs will go higher than that as long as the cooling is sufficient. If you increase your power limit and fan curve, you can increase your avg. core clock without even using +clock offsets.
 
  • Like
Reactions: Le Québécois

Jackie60

Member
Aug 11, 2006
118
46
101
It's your CPU and RAM. Wildlands is very well threaded, as is Far Cry 5 afaik. I am pretty sure you'd have 20-25% better results on a 6700K at 4/4.4GHz or better and with 2600MHz memory or faster. Moving from an i7 920 at 4GHz to a 4770K at 4.6 to a 5960X at 4.6 was a lesson in how much faster newer architectures are in games. The GPU is most important, but the CPU is also. I assume you are running games from a decent SSD?
 
  • Like
Reactions: Headfoot

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
Siege loves fast low latency RAM. It also performs badly with quad cores.

Going from an i5 4690k to my current i7 5775c got rid of my CPU bottleneck.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
As I thought, it's your RAM.

All right, now you have 2 choices:

1) Get faster DDR3 RAM, 2400 to 2666MHz CL10.
2) Wait for Ryzen 2, check the reviews, see what platform is best for you and get an entirely new one with fast RAM (3200MHz CL14 is recommended).

It's my RAM?

I know it can have a somewhat high performance impact when using an IGP for gaming, but with a discrete card, the difference is usually negligible. Well, unless things have changed drastically in the last few years.

Do you have any sources to back this up? Don't be afraid to link some heavy technical stuff; I've been a member here for almost 20 years, so I'm used to reading long and very technical articles. I do plan on changing my mobo/cpu/ram in the next 18 months or so, so any suggestions for good reading on the topic would be welcome.

PS: It's not so much that I'm questioning the validity of what you say, it's more about reading it for myself so I can understand the how and the why.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
i was asking because there are stories around of people buying "great deal" cards online that turn out to be fake cards from china, older models flashed to look like new.

despite what my egregious colleague here Adawy thinks, your ram is fine. ram is never really a problem unless you run out of it, which is what is happening with your VRAM on the 460 cards.

i politely disagree with @happy medium, as i've run DX10 games with a celeron; unless you are playing stuff like Supreme Commander, it's likely that your games offload most of the work onto the card, and a 3570K @ 4GHz is a very respectable processor.
http://cpu.userbenchmark.com/Compare/Intel-Core-i5-3570-vs-AMD-Ryzen-3-2200G/m793vsm441832
people around here don't really like userbenchmark, but you can see that your 3570K isn't too far from a brand new 2400.

it's just the cards. 1GB of VRAM is just not enough, and you can simply see this yourself by getting something like Process Explorer, running your game, and then watching the graphs. whichever one is maxed out is where you have a bottleneck.

I had a really long day, so it's not impossible that I'm just too tired to understand your reply; if that's the case, I apologize in advance for any confusion.

I bought my card a week before the cryptomining craze started to affect the 1080 series. At that time, all the 1080s were selling for similar prices (if you don't account for the extra 10% off I got because of the Amazon screw-up with the game code), so I highly doubt I bought a fake card, especially considering the seller was Amazon (it wasn't a marketplace item) and that the serial number registration process went without any problem with MSI.

That said, I don't know much about the fake cards thing, so would there be a way for me to confirm whether it's a "real" card or not?

Or are you just telling me this to explain why you were asking (and not to tell me I might have a fake card)?

Now comes the confusing part: the rest of your post seems to be written as if I had a GTX 460. Or am I just too tired to understand what you wrote?
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
Yeah, pretty much. The stated numbers are what the card is guaranteed to reach, but in reality all GPUs will go higher than that as long as the cooling is sufficient. If you increase your power limit and fan curve, you can increase your avg. core clock without even using +clock offsets.

OK, I understand, thanks for explaining it in simple terms. I'm glad I chose that model knowing that. You gave me a good answer to my second question!

Any ideas on the first question (why I'm getting results closer to a 1070 Ti than to my actual card)?
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
I'm going to go against the general opinion of this thread, but I think your CPU/RAM is what is causing the lower performance. Now granted, the 3570K is still a capable CPU, but it doesn't compete with a Kaby Lake / Coffee Lake i5 (or i7). The problem is finding the "proof" you mention with every response. The CPU was released in 2012, so there are not many major review sites still comparing modern CPUs to older CPUs. When Skylake CPUs launched, most hardware review sites compared them to Sandy Bridge CPUs, saying it was finally worth upgrading from those based on the performance gained.

So what we are left with here to answer your question is just a bunch of user YouTube videos and many "should I upgrade my 3570k" type posts across various message boards. First, I looked at AnandTech Bench to see if they had results comparing your CPU to a 7600K. Unfortunately, they only had a couple of results, with no direct game data: https://www.anandtech.com/bench/product/701?vs=1828

So next I looked to YouTube and the message boards: https://www.reddit.com/r/buildapc/comments/5zfwoo/is_an_upgrade_to_a_7700k_worth_it_with_a_3570k/
http://cpu.userbenchmark.com/Compare/Intel-Core-i5-7600K-vs-Intel-Core-i5-3570K/3885vs1316
https://www.youtube.com/watch?v=JQHlMLgqMT8

And what I can see is that your CPU is around 25% - 35% slower than, say, an i5-7600K in single-thread performance. Along with DDR4 offering more bandwidth, it is my opinion that this is what's holding back your gaming performance. So my advice would be to keep gaming and not worry about the lower FPS so much, or go ahead and pull the trigger on a Coffee Lake build. As far as factory overclocked cards vs. Founders Edition cards go, every review I have seen shows a small increase in performance, but nothing that dramatic. Of course, different games and engines behave differently, so there are a few games here and there where it is more impactful than others.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
It's your CPU and RAM. Wildlands is very well threaded, as is Far Cry 5 afaik. I am pretty sure you'd have 20-25% better results on a 6700K at 4/4.4GHz or better and with 2600MHz memory or faster. Moving from an i7 920 at 4GHz to a 4770K at 4.6 to a 5960X at 4.6 was a lesson in how much faster newer architectures are in games. The GPU is most important, but the CPU is also. I assume you are running games from a decent SSD?

Did you watch the video I linked in my original post?

If not, here's a screenshot from the Wildlands benchmark:

[Screenshot: Wildlands built-in benchmark results, 2600K vs 8700K with a GTX 1080 Ti]


As you can see, while there is a difference of 7-8% between the 2600K and 8700K (both running at their stock speeds, meaning the 8700K is clocked at a much higher frequency), well, that's just it: 7-8%, not 20%. Not only that, but that's with a 1080 Ti, not a 1080, where the difference should be even less noticeable, considering it's almost gone when looking at a 1070.

At 1920x1080, I would agree with you, but when we get to 2560x1440, the CPU seems to be much less of a factor, at least when not using a very old and slow CPU like, say, a Pentium 4 :p.

As for my drives, Siege is installed on my OS drive, a SanDisk Extreme Pro 480GB, while the other games are installed on my "gaming" drive, a SanDisk Ultra 3D 1TB. Both are pretty much some of the best SATA drives on the market. I'm not exactly sure how they would affect performance once the loading is done.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
Here's some recent CPU benches with a GTX 1080. No 3570K, but a 2600K should give a rough approximation at 1440p.

http://www.guru3d.com/articles_page...0kthe_2018_review_time_for_an_upgrade,19.html

You sir pretty much read my mind... or the article I just stumbled upon while I was writing my other replies.

My previous tests, with the exception of Far Cry, were done a few weeks ago with older drivers (from late December), when I first got the monitor (late February), and were, for the most part, compared to articles that were written using driver versions that could be more than a year apart and very different from what I had.

The results we see in this article are pretty much on par with the results I am now getting, well, for the 3 games we both tested at least (Wildlands, Shadow of Mordor and Deus Ex: Mankind Divided). They're using newer driver versions that are pretty close to what I installed today to run a new batch of benchmarks.

I find it odd they went through the trouble of testing 1440 yet make no mention of those results in the conclusion. Still, the 1440 results in the article speak for themselves, and their findings are also very similar to the HardwareCanucks video I posted in my first post, except this time they're actually using a 1080, just like me, and not a 1070 or 1080 Ti like HardwareCanucks did.

Looking at their 720 and 1080 results, it also accounts for the "lower" than expected results I was getting when I first got the card, while I was still using a 1680x1050 monitor.

I'm glad to see that I didn't make any mistakes in my original hypothesis/research back in November, when I first began thinking about finally (that old 22" served me well for almost 12 years) buying a 2560x1440 monitor and a fast video card to go with it. Well, at least for gaming.

I guess my previous results were either due to some "bad" drivers or a bad driver installation. Considering I did a clean install, I'm surprised, but still, I don't see why I had the results I had before.

I now feel like a fool for making a few mistakes (probably a bad driver install) and wasting some of your time. I apologize for that.

In retrospect, I should have rerun my whole benchmark suite before writing my post last night. Again, I apologize for that.


Thank you to everyone who tried to help me and in one case, succeeded in helping me understand the frequency "boost" behavior I was seeing. I really appreciate all the time you put into doing some research and writing those replies.

It's always a pleasure to use these forums and interact with this community. Thanks again! :)

EDIT: I'd like to point out that I'm very aware we're looking at averages here and that the minimums will be lower with a slower CPU but it will suffice for now.
 
Last edited:

Le Québécois

Senior member
Dec 1, 1999
560
3
81
I'm going to go against the general opinion of this thread, but I think your CPU/RAM is what is causing the lower performance. Now granted, the 3570K is still a capable CPU, but it doesn't compete with a Kaby Lake / Coffee Lake i5 (or i7). The problem is finding the "proof" you mention with every response. The CPU was released in 2012, so there are not many major review sites still comparing modern CPUs to older CPUs. When Skylake CPUs launched, most hardware review sites compared them to Sandy Bridge CPUs, saying it was finally worth upgrading from those based on the performance gained.

So what we are left with here to answer your question is just a bunch of user YouTube videos and many "should I upgrade my 3570k" type posts across various message boards. First, I looked at AnandTech Bench to see if they had results comparing your CPU to a 7600K. Unfortunately, they only had a couple of results, with no direct game data: https://www.anandtech.com/bench/product/701?vs=1828

So next I looked to YouTube and the message boards: https://www.reddit.com/r/buildapc/comments/5zfwoo/is_an_upgrade_to_a_7700k_worth_it_with_a_3570k/
http://cpu.userbenchmark.com/Compare/Intel-Core-i5-7600K-vs-Intel-Core-i5-3570K/3885vs1316
https://www.youtube.com/watch?v=JQHlMLgqMT8

And what I can see is that your CPU is around 25% - 35% slower than, say, an i5-7600K in single-thread performance. Along with DDR4 offering more bandwidth, it is my opinion that this is what's holding back your gaming performance. So my advice would be to keep gaming and not worry about the lower FPS so much, or go ahead and pull the trigger on a Coffee Lake build. As far as factory overclocked cards vs. Founders Edition cards go, every review I have seen shows a small increase in performance, but nothing that dramatic. Of course, different games and engines behave differently, so there are a few games here and there where it is more impactful than others.

Hey, thanks for taking the time to do all that research and write this reply.

I read the links you posted, and the "mistakes"... well, they're not really mistakes, more like hypotheses that get proven wrong... but the problem I saw most often when doing my original research last year (and in the links you posted) was that most people assume the 1920x1080 results will somehow translate to 2560x1440, or at least be somewhat similar, when in reality, with most games, well, action games (meaning not RTS or strategy games where a good CPU is needed), that WQHD resolution is so demanding on the GPU that the CPU is more often than not "waiting" on the GPU to do its thing.

Even today, 2560x1440 displays aren't that common. According to Steam, only 5% of the displays are at a resolution above 1920x1200.

When asked how something should perform, people go with what they know and use and extrapolate from that. Without having done all that research in November, I would probably have given the same answer most people gave, as it's the most logical thing to do.

It doesn't help that pretty much all the review sites either review new CPUs with the fastest GPU on the market to remove any GPU bottleneck, or review GPUs with the fastest CPU, again to make sure it's not the bottleneck. This makes perfect sense for reviewing new hardware, but it gives an unrealistic, or at least incomplete, picture of what most users will actually end up using.

This sadly creates an information gap where almost no one tests configurations where more than one component can be the bottleneck; in my case, a slow CPU could in theory very well render new GPU benchmarks useless simply because of the extra non-GPU-related bottleneck.

I spent a lot of time doing my original research (I only posted a single video because it showed what I had found, but I can assure you I spent weeks reading on the topic) and had the hypothesis that at a WQHD resolution, my CPU shouldn't matter that much, so long as I was willing to accept the occasional major frame rate drops that occur when the CPU is the actual bottleneck. Turns out my hypothesis was mostly correct, but I still couldn't understand the slightly lower results I was getting, hence why I finally decided to ask for help here... and then I read that article just a few hours ago and retested my benchmarks.

Of course, I could have been wrong in my hypothesis, which is why I created this post. I was starting to believe that was the case and was preparing to buy a new cpu/mobo/ram.

Again, a big thank you to everyone who spent time helping me :).

EDIT: In case someone is curious as to how I came up with my hypothesis... The idea came from the new Xbox One X. The CPU in the original Xbox One is slow, at least compared to a PC, even compared to my old i5 3570K. To my surprise, the one in the X is still pretty slow, yet by massively boosting the GPU, they were able to take something that ran games at 1080p@30fps and turn it into something that could run games at 4K (still at 30fps), so this got me thinking about how much today's games rely on the GPU and how little they use the CPU.
 
  • Like
Reactions: Campy

DigDog

Lifer
Jun 3, 2011
13,496
2,122
126
right;

i have to apologize to pretty much everyone, i mixed up two threads, one where a guy has an i7 and two 460s, and this one, so my reply for one went into the other. please disregard all my comments regarding ram, vram, aaand everything else.
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
I spent a lot of time doing my original research (I only posted a single video because it showed what I had found, but I can assure you I spent weeks reading on the topic) and had the hypothesis that at a WQHD resolution, my CPU shouldn't matter that much, so long as I was willing to accept the occasional major frame rate drops that occur when the CPU is the actual bottleneck. Turns out my hypothesis was mostly correct, but I still couldn't understand the slightly lower results I was getting, hence why I finally decided to ask for help here... and then I read that article just a few hours ago and retested my benchmarks.

Of course, I could have been wrong in my hypothesis, which is why I created this post. I was starting to believe that was the case and was preparing to buy a new cpu/mobo/ram.

There's only one sure-fire way to test your hypothesis: run all your benchmarks, then replace the motherboard/CPU/RAM with something current, and rerun them all. Outside of that, it's all just speculation about what is causing it. Like I mentioned in my first reply, all the "legit" or traditional hardware review sites are not comparing an i5-8600K to an i5-3570K, or any other older CPU. They generally only test new CPUs against the previous 1-2 generations, and at this point those tests all involve DDR4 builds. So we can only base our guesses on the roughly 8-10% average gain in performance each generation gets us.
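
(If you do go the swap-and-rerun route, even a tiny script like this keeps the before/after comparison honest; every game name and number below is a placeholder, not a real result:)

```python
# Compare two benchmark runs (same games, same settings, same driver) and print
# the percentage change. All figures here are placeholders to show the idea.
before = {"Wildlands": 62.0, "Siege": 151.0, "Far Cry 5": 78.0}   # avg FPS, old platform
after  = {"Wildlands": 74.0, "Siege": 188.0, "Far Cry 5": 95.0}   # avg FPS, new platform

for game, old_fps in before.items():
    new_fps = after[game]
    delta = (new_fps - old_fps) / old_fps * 100.0
    print(f"{game:10s}  {old_fps:6.1f} -> {new_fps:6.1f} FPS  ({delta:+.1f}%)")
```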

Anyways, good luck with your quest.
 

Campy

Senior member
Jun 25, 2010
785
171
116
GamersNexus did some good benchmarking about a year ago where they compared new CPUs to Sandy Bridge; you might find it interesting.

Another thing you could do is overclock your 3570K higher; it should go to 4.3-4.5GHz at least, no? If your results improve, then you know the CPU/memory is part of the problem.