New GTX 1080 owner here in need of some help understanding a few things...


Guru

Senior member
May 5, 2017
You are using a 3000-series CPU with DDR3 RAM. As far as I remember, some of the fastest DDR3 at the time was 2133 MHz, and even with that you are still short of the DDR4 standard of 2400 MHz.

And you are looking at benchmark tests; usually these are done on clean machines with nothing but the Windows OS installed and a basic set of the latest drivers. So right out of the gate they are going to show slightly better FPS than anyone running an everyday machine, which is often bloated with various applications and Windows cruft, maybe fragmented drives, possibly even a few viruses running in the background.

Second, the 3570K is anywhere from 10 to 20% slower than a new processor; even clock for clock it's going to be about 10-15% slower than the newest processors like the 8600K or 8700K. Add in another 5 to 10% slowdown due to your RAM, add in the OS bloat factor as well (you are also running an antivirus, background tasks, etc.), and that is another 5% or so performance hit.
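For a rough sense of how those estimates stack up, here's a quick back-of-the-envelope sketch (midpoint values assumed; these are ballpark guesses, not measurements):

```python
# Compound the estimated penalties (midpoints assumed, ballpark only):
ipc_clock_factor = 1 - 0.125   # ~10-15% slower clock-for-clock vs. 8600K/8700K
ram_factor       = 1 - 0.075   # ~5-10% from slower DDR3
bloat_factor     = 1 - 0.05    # ~5% from OS bloat, antivirus, background tasks

relative_perf = ipc_clock_factor * ram_factor * bloat_factor
print(f"Roughly {relative_perf:.0%} of a clean, current build")  # ~77%
```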

So there you have it.
 

Headfoot

Diamond Member
Feb 28, 2008
The 2600K holds up a lot better than the 3570K too, despite its relative age. Hyperthreading helps in the games that are CPU limited, for the most part. That GamersNexus video is a great example of that. When I switched from my 2500K @ 4.5 GHz with DDR3-1600 (same timings as yours, which puts it close to your CPU, yours being a bit faster) to my current 5820K @ 4.4 GHz with DDR4-3200, I gained a significant amount of FPS in certain games. In Fallout 4, for example, I nearly doubled my FPS with the same video card.
 

Le Québécois

Senior member
Dec 1, 1999
There's only one surefire way to test your hypothesis: run all your benchmarks, then replace the motherboard/CPU/RAM with something current and rerun them all. Outside of that, it's all just speculation about what is causing it. Like I mentioned in my first reply, the "legit" or traditional hardware review sites are not comparing an i5-8600K to an i5-3570K, or any other older CPU. They generally only test new CPUs against the previous 1-2 generations, and at this point those are all DDR4 builds. So we can only base our guesses on the roughly 8-10% average gain in performance each generation gives us.

Anyways, good luck with your quest.

I'm slightly confused by this reply.

This is the second time you've dismissed anything that supports my hypothesis as not coming from "legit" review sites. First it was HardwareCanucks, a review site that's been around for 12 years, and now Guru3D. Why is that? Is there a problem with their testing methodology? Are they known to be biased?

They're the only ones who tested with the settings and configuration I needed to see.

If AnandTech did an article on how old CPUs perform in games compared to modern architectures at WQHD resolution, you can be sure I would read it, but sadly for me, they haven't written such an article. Not a single link you have posted has a direct comparison at 2560x1440; they're all using 1920x1080. And they're not the only ones; they're all the same. That's a big problem because my hypothesis has nothing to do with 1920x1080.

You can be sure I'll do my tests again when the time comes to upgrade my CPU.

Anyhow, thanks again for trying to help, I appreciate it :).
 

UsandThem

Elite Member
May 4, 2000
I'm slightly confused by this reply.

This is the second time you've dismissed anything that supports my hypothesis as not coming from "legit" review sites. First it was HardwareCanucks, a review site that's been around for 12 years, and now Guru3D. Why is that? Is there a problem with their testing methodology? Are they known to be biased?

They're the only ones who tested with the settings and configuration I needed to see.

If AnandTech did an article on how old CPUs perform in games compared to modern architectures at WQHD resolution, you can be sure I would read it, but sadly for me, they haven't written such an article. Not a single link you have posted has a direct comparison at 2560x1440; they're all using 1920x1080. And they're not the only ones; they're all the same. That's a big problem because my hypothesis has nothing to do with 1920x1080.

You can be sure I'll do my tests again when the time comes to upgrade my CPU.

Anyhow, thanks again for trying to help, I appreciate it :).

I didn't dismiss anything. You asked a question, and I gave you my honest opinion. That's all I can do, since I don't have your hardware to compare against my builds at home. But as an example, I run a 3rd-gen i3 in my Folding@Home rig with a GTX 1080, and the CPU holds it back some; I don't get as many PPD (points per day) with it as I would if it were in my Skylake / Kaby Lake / Ryzen builds. It's not a huge difference, but probably in the neighborhood of 40k points lost per day.

As far as AnandTech doing a review like you mentioned, that isn't going to happen. When Skylake launched, they compared it to Sandy Bridge CPUs and offered their opinion on whether the new Skylake CPUs were compelling enough for people to finally upgrade their 2500K/2700K CPUs, and their answer was yes. Since bigger sites aren't interested in doing the comparison you want to see, all that's left is smaller tech sites and enthusiasts on their YouTube channels. Whether they are legit or not is up to you to decide, as I have no dog in this fight.

The only way to prove your hypothesis is to take your exact hardware right now, run all the benchmarks on it, and record your results, then run them again with a new CPU/motherboard/RAM to obtain the new results. At that point you can determine whether your 3570K build is holding you back or not.
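A minimal sketch of that before/after bookkeeping, assuming you jot each run down in a CSV by hand (the file name, config labels, and numbers shown are placeholders, not real results):

```python
#   benchmarks.csv (placeholder values):
#   game,config,avg_fps
#   Siege,old_3570k,95
#   Siege,new_build,110
import csv
from collections import defaultdict

results = defaultdict(dict)
with open("benchmarks.csv", newline="") as f:
    for row in csv.DictReader(f):
        results[row["game"]][row["config"]] = float(row["avg_fps"])

# Print the gain (or lack of one) per game once both configs have been run.
for game, configs in results.items():
    if "old_3570k" in configs and "new_build" in configs:
        old, new = configs["old_3570k"], configs["new_build"]
        print(f"{game}: {old:.0f} -> {new:.0f} FPS ({(new - old) / old:+.1%})")
```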

EDIT:

It looks like the AnandTech review of Skylake did include a 3770K, which is faster than what you have, and it seems to fall behind once some of the game details are turned up to ultra settings. Not a huge difference, but enough to drop performance to something close to what you are experiencing. Since they only tested at 1080p at the time, there are no results for 1440p or 4K.

https://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/16

[Benchmark charts from the linked AnandTech Skylake review]
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
GamersNexus did some good benchmarking about a year ago where they compared new CPUs to Sandy Bridge; you might find it interesting.

Another thing you could do is overclock your 3570K higher; it should go to 4.3-4.5 GHz at least, no? If your results improve, then you know the CPU/memory is part of the problem.

Yep, I read it a while back; it's one of the rare hardware sites to have done something like this. The problem is that, sadly, they're not pushing things beyond Full HD.

Overclocking was actually my first step when I bought the GTX 1080. It did make a huge difference when I was still playing at 1680x1050. I had no problem pushing the CPU up to 4.6 GHz; beyond that it was still stable, but I had to push the voltage higher than my comfort zone allowed.

When I got the new monitor, I retested and found no difference in the averages I was getting. It did, however, make a difference in those pesky minimum drops. In Siege, while getting the same average (give or take 1-2 FPS), the minimum went from 15 FPS at 3.6 GHz to 40 at 4.2 GHz. In the end, I decided to leave it at 4.0 GHz, which was the highest I could go without increasing the voltage at all. That got rid of those annoying very low drops. I have no problem playing with the occasional 30-40 FPS as long as the average is high.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
I didn't dismiss anything. You asked a question, and I gave you my honest opinion. That's all I can do, since I don't have your hardware to compare against my builds at home. But as an example, I run a 3rd-gen i3 in my Folding@Home rig with a GTX 1080, and the CPU holds it back some; I don't get as many PPD (points per day) with it as I would if it were in my Skylake / Kaby Lake / Ryzen builds. It's not a huge difference, but probably in the neighborhood of 40k points lost per day.

As far as AnandTech doing a review like you mentioned, that isn't going to happen. When Skylake launched, they compared it to Sandy Bridge CPUs and offered their opinion on whether the new Skylake CPUs were compelling enough for people to finally upgrade their 2500K/2700K CPUs, and their answer was yes. Since bigger sites aren't interested in doing the comparison you want to see, all that's left is smaller tech sites and enthusiasts on their YouTube channels. Whether they are legit or not is up to you to decide, as I have no dog in this fight.

The only way to prove your hypothesis is to take your exact hardware right now, run all the benchmarks on it, and record your results, then run them again with a new CPU/motherboard/RAM to obtain the new results. At that point you can determine whether your 3570K build is holding you back or not.

EDIT:

It looks like the AnandTech review of Skylake did include a 3770K, which is faster than what you have, and it seems to fall behind once some of the game details are turned up to ultra settings. Not a huge difference, but enough to drop performance to something close to what you are experiencing. Since they only tested at 1080p at the time, there are no results for 1440p or 4K.

https://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/16

[Benchmark charts from the linked AnandTech Skylake review]

Ah yes, this is actually the first article I read when I started doing my research. It's what gave me hope that at 1440, the difference would be even smaller.

Thanks for clarifying that "legit" review thing for me.

Don't worry, I'm not delusional; I know very well how old my CPU is and that it will hold me back in many things, but I was hoping, and so far it seems to be true, that for gaming at a high resolution with today's games, it wouldn't make much of a difference.

My plan has always been to upgrade my whole system, so I will eventually end up being able to test it with a faster CPU. I came here to see what others would think so I could look at things from a different point of view, and that's exactly what I got.
 

Le Québécois

Senior member
Dec 1, 1999
560
3
81
The 2600K holds up a lot better than the 3570K too, despite its relative age. Hyperthreading helps in the games that are CPU limited, for the most part. That GamersNexus video is a great example of that. When I switched from my 2500K @ 4.5 GHz with DDR3-1600 (same timings as yours, which puts it close to your CPU, yours being a bit faster) to my current 5820K @ 4.4 GHz with DDR4-3200, I gained a significant amount of FPS in certain games. In Fallout 4, for example, I nearly doubled my FPS with the same video card.

I wouldn't say a lot better, but yes, I agree that in CPU-intensive games, HT surely makes a difference. Lucky for me, at least so far, all of the games I do play of that type are slower-paced ones like, say, Civilization or Endless Legend, just to name two. With these, I don't really mind if it's slower. On the other end of the spectrum are the more action-oriented games, and so far those don't seem to be affected much by my old CPU.

As I said in another of my posts, I always planned to change the CPU too, just not right now. I usually change the CPU/mobo/RAM when someone else in my family needs a faster computer, and so far everyone seems happy with what they got. I'll probably upgrade this year when both AMD and Intel release their "next" gen, and my older system will probably end up with my mother. She's still using my very "old" Q6600, so I think it's about time she gets something faster :p. (Then again, a Q6600 seems to be OK for checking email and watching videos of her grandchildren.)
 

ZGR

Platinum Member
Oct 26, 2012
Yep, I read it a while back; it's one of the rare hardware sites to have done something like this. The problem is that, sadly, they're not pushing things beyond Full HD.

Overclocking was actually my first step when I bought the GTX 1080. It did make a huge difference when I was still playing at 1680x1050. I had no problem pushing the CPU up to 4.6 GHz; beyond that it was still stable, but I had to push the voltage higher than my comfort zone allowed.

When I got the new monitor, I retested and found no difference in the averages I was getting. It did, however, make a difference in those pesky minimum drops. In Siege, while getting the same average (give or take 1-2 FPS), the minimum went from 15 FPS at 3.6 GHz to 40 at 4.2 GHz. In the end, I decided to leave it at 4.0 GHz, which was the highest I could go without increasing the voltage at all. That got rid of those annoying very low drops. I have no problem playing with the occasional 30-40 FPS as long as the average is high.

Your FPS should not be that low in Siege.
 

Le Québécois

Senior member
Dec 1, 1999
Your FPS should not be that low in Siege.

Yes, sorry, I didn't write that properly. My average is around 95 with a mix of ultra and very high settings at 1440p. From what I'm reading, that's pretty close to what I should be getting. The 15 and 30-40 I mention are the minimums I would get before and after OCing the CPU. So while the average didn't change much, those pesky drops in frame rate you get from time to time went from 15 to 40 with the OC, or, if you prefer, from extremely bad to acceptable. Remember, that's an absolute minimum; I'm nowhere near it 99% of the time when I play. I hope this helps clarify what I meant.
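In case anyone wants to reproduce these numbers, here's a quick sketch for turning a frame-time log into average / 1% low / minimum FPS (the file name is a placeholder, and the log is assumed to be one frame time in milliseconds per line, not tied to any particular capture tool):

```python
# Read one frame time (in milliseconds) per line; file name is a placeholder.
with open("frametimes.txt") as f:
    frametimes_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)

# "1% low" = average FPS over the slowest 1% of frames.
slowest = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
low_1pct_fps = 1000 * len(slowest) / sum(slowest)

min_fps = 1000 / max(frametimes_ms)

print(f"avg: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS, min: {min_fps:.0f} FPS")
```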

EDIT: Siege is my worst-case scenario of all the games I've tested; usually the minimums (in Wildlands, for example) are only about 10% lower.
 

Alpha One Seven

Golden Member
Sep 11, 2017
You might want to try a hexa-core or octa-core CPU for more throughput.
The 20-series cards will be hitting shelves soon, and then the 10 series will see a drop in price.
You can find new, returned ones on the market now for $500.00. Fry's had a couple at that price last time I was there.