Merits of G3258 / H81 / Win10 64-bit overclocking, in 2016? (Best "Budget OC" kit?)


VirtualLarry

Just a note, for anyone coming across this thread in Google:

The Zalman CNPS5x 92mm tower heatsink is INCOMPATIBLE (Edit: Scratch that, I got it to WORK) with the ASRock B150 K4/Hyper ATX mobo.

The mounting ring for 115x sockets has protrusions on the sides (like an AMD socket retainer) that the heatsink clips onto.

Well, on two sides the mobo has a row of chokes extending out from under the VRM heatsinks, sticking up high enough that you CANNOT get the heatsink's clips down under the protrusions to snug them in. It just doesn't work.

So, a stock heatsink works, with pins. I haven't tried a Hyper 212-family cooler yet.

Edit: Well, mounting it at a 90-degree angle DOES NOT WORK. But I noticed that the mounting ring has a few off-axis protrusions. I didn't realize at first what they were for; I thought they were some sort of three-prong AMD mount. But... you CAN use those to mount the cooler off-axis, which is what I had to do.

I DID get it to work with the B150 K4/Hyper, and temps with my i5-6400 @ 4.51GHz (167.0 BCLK, 1.410V vcore) are hitting 84C under OCCT:CPU 64-bit load.

That's a lot better than with the stock copper-cored heatsink, which would hit 95C within 2 minutes and shut off the test. So far it has gone 13 minutes with no errors and no thermal trip (95C).

I think that with this heatsink I can now keep 4.51GHz as my 24/7 OC. Hope the 1.410V vcore isn't going to fry anything long-term.

Edit: OCCT errored out, with an error on core #2, after 23 minutes or so. So I guess my OC test was ending early due to temps before, and I never got to test it for a longer period.

I had a choice: boost vcore a tad (to 1.420V) or lower the BCLK, so I lowered the BCLK by one, to 166.0. That means I won't hit my magical 4.5GHz, but oh well. Stability is key!
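
(Side note for anyone checking the math: a minimal Python sketch of how those BCLK values map to core clocks, assuming the i5-6400 is held at its 27x base ratio with turbo disabled, which lines up with the clocks quoted above.)

# Assumption: the i5-6400 runs locked at its 27x base ratio with turbo off,
# which matches the clocks quoted in this post.
BASE_RATIO = 27

def core_clock_mhz(bclk_mhz, ratio=BASE_RATIO):
    # Effective core clock = BCLK x multiplier
    return bclk_mhz * ratio

for bclk in (167.0, 166.0, 165.0):
    print(f"{bclk:.1f} MHz BCLK x {BASE_RATIO} = {core_clock_mhz(bclk):.0f} MHz")

# Prints 4509, 4482, and 4455 MHz, i.e. roughly 4.51, 4.48, and 4.46 GHz.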

Edit: Hard freeze! (But the HDD light continued to blink? I tried to load GPU-Z, and the UI froze.) Anyways, now trying 165.0 BCLK with 1.400V vcore and 1.300V vDIMM.
 

cbn

Edit: The performance of the KBL Pentium G4620 (2C/4T), along with the advanced media-decoding capabilities (HEVC Main10, etc.), may just turn the tide against the venerable G3258 though. Unfortunately, that requires waiting until Jan. 2017 to find out, while supplies of the G3258 and suitable H81 OCing mobos dwindle.

Yes, OCing those KBL Pentiums would be nice (particularly if we see a $64 KBL Pentium with 2C/4T).

However, I think there needs to be a fairly cheap motherboard to go along with it, for at least two reasons:

1. Cheap CPU + relatively expensive board (for OC) might be a worse value compared to a relatively expensive CPU + cheap board (see the rough sketch after this list).

2. If Intel is able to disable BCLK overclocking, the system built should still end up a decent enough value with the Pentium 2C/4T at stock clocks.
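
To make point 1 concrete, here is a tiny Python sketch; every price and performance index in it is a hypothetical placeholder (none come from this thread). It only illustrates that "value" here means performance per dollar of the CPU + board pair:

# All numbers below are hypothetical, only to illustrate the perf-per-dollar
# comparison in point 1 above.
def perf_per_dollar(cpu_cost, board_cost, perf_index):
    return perf_index / (cpu_cost + board_cost)

oc_pentium_combo = perf_per_dollar(64, 140, 1.00)   # OC'd 2C/4T Pentium + OC-capable board
stock_i5_combo   = perf_per_dollar(190, 60, 1.30)   # stock locked i5 + cheap B/H board

print(f"OC Pentium combo: {oc_pentium_combo:.4f} perf/$")
print(f"Stock i5 combo:   {stock_i5_combo:.4f} perf/$")
# With these placeholder numbers the pricey-board combo comes out slightly
# behind; a cheaper OC-capable board flips the result.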

P.S. I also think it would be really good if the board used for BCLK overclocking a Pentium 2C/4T had a "one key overclocking" option (example here) to make set-up easier for beginners. Maybe even have the one-key overclock tuned to the copper-cored 95W Intel cooler (or some other type of reference cooler, perhaps even one that doesn't blow air on the VRMs)?
 

cbn

VirtualLarry, here is my humble vote for the best budget overclock kit from a case, PSU, and CPU cooler standpoint:

1. Fractal Design Core 1000 (normally goes on sale at Newegg for $29.99 with free shipping)
2. EVGA 600W (normally goes on sale at Best Buy for $29.99 with free shipping, sometimes $34.99)
3. Cryorig M9i CPU cooler (normal price at Newegg is $19.99 with free shipping)

Now if only we could get a relatively cheap 2C/4T OC board to go along with that... one with VRMs decent enough to get by on the airflow provided by that combination of parts.

P.S. I'll bet that with a high-enough-clocked Pentium (or i3-7350K), such a build could re-purpose a GTX 970, which is currently the most popular dGPU on the Steam hardware survey.

Steam hardware survey standings (share, monthly change):

NVIDIA GeForce GTX 970: 4.97% (-0.01%)
NVIDIA GeForce GTX 960: 3.68% (0.00%)
NVIDIA GeForce GTX 750 Ti: 3.18% (+0.02%)
Intel HD Graphics 4000: 3.08% (+0.06%)
Intel HD Graphics 3000: 2.10% (0.00%)
 

VirtualLarry

Yeah, that sounds good. I'm hoping that either someone (ASRock again? MSI?) comes out with a cheap BCLK-enabled board for Kaby Lake, OR... it's possible to shove a Kaby Lake into a Z170 or other BCLK-enabled board and get it to work. (Sub-optimally, due to needing a BIOS flash for full KBL compatibility.)
 

cbn

Regarding budget OC kits for re-purposing 28nm dGPUs....

Looking through the AnandTech FS/FT forum, I found a GTX 680 2GB for $50 shipped in the second listing:

https://forums.anandtech.com/thread...iphone-4s-and-a-samsung-galaxy-2-tab.2493062/

That seems like a decent enough deal considering GTX 680 2GB is faster than GTX 960:

https://www.techpowerup.com/reviews/MSI/GTX_960_Gaming/29.html

[TechPowerUp relative performance chart, 1920x1200]


Which is in turn a tiny bit faster than the GTX 1050 2GB:

https://www.techpowerup.com/reviews/MSI/GTX_1050_Gaming_X/27.html

[TechPowerUp relative performance chart, 1920x1080]


So perhaps the GTX 680 2GB is about 9% faster in measured frame rate than a GTX 1050 2GB? (i.e., in between a GTX 1050 2GB and a GTX 1050 Ti 4GB)
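
As a rough illustration of how that ~9% figure chains together (the two ratios below are placeholders standing in for the linked TechPowerUp charts, not exact chart values):

# Placeholder ratios, only to show how the ~9% estimate above is chained;
# read the real numbers off the linked TechPowerUp charts.
gtx680_vs_960  = 1.06   # hypothetical: GTX 680 ~6% faster than GTX 960
gtx960_vs_1050 = 1.03   # hypothetical: GTX 960 ~3% faster than GTX 1050 2GB

gtx680_vs_1050 = gtx680_vs_960 * gtx960_vs_1050
print(f"GTX 680 vs GTX 1050 2GB: ~{(gtx680_vs_1050 - 1) * 100:.0f}% faster")  # ~9%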

My 4.3GHz Pentium G3258 won't handle that level of GPU (it stutters with a GTX 660), but I imagine an OC'd KBL Pentium 2C/4T would be a good match.
 

cbn

VirtualLarry said:
I'm hoping that either someone (ASRock again? MSI?) comes out with a cheap BCLK-enabled board for Kaby Lake.

I like that idea.

I also wonder (for motherboards that are a bit higher end) whether having two PCIe x16 slots would be a good idea, for DX12 multi-adapter:

http://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview

[Mixed-GPU results chart from the linked AnandTech DX12 preview]


Think re-purposed older model dGPU + newer dGPU?

That said, I am not sure how soon we would see DX12 multi-adapter become widespread. It might be some time away.
 

NTMBK

cbn said:
That said, I am not sure how soon we would see DX12 multi-adapter become widespread. It might be some time away.

I don't think it will ever be particularly widespread. It must be a massive pain to get working properly (with the thousands of potential configurations you could make from any arbitrary combination of two GPUs), and only a handful of people would ever use it. I would rather see developers focus on using the integrated GPU for e.g. physics tasks, instead of trying to get two dGPUs working.
 

cbn

cbn said:
P.S. I also think it would be really good if the board used for BCLK overclocking a Pentium 2C/4T had a "one key overclocking" option (example here) to make set-up easier for beginners. Maybe even have the one-key overclock tuned to the copper-cored 95W Intel cooler (or some other type of reference cooler, perhaps even one that doesn't blow air on the VRMs)?

Found another example of one key overclocking here.



Maybe a way of implementing such a button in a DIY scenario would be a replacement 5.25" bay faceplate with the button attached (connected to a header on the motherboard)?

Or perhaps the overclock button could be on the I/O panel located at the back of the case?

Or maybe just have a software overclock button in Windows or Linux, showing whether the overclock is enabled (or needs to be enabled by pushing the button)?

Or perhaps a combination of the first and third (or second and third) options is best: pushing the overclock button on the 5.25" faceplate or on the rear I/O panel is what selects the OC profile, with the Windows (or Linux) software merely displaying whether or not the overclock is enabled.

EDIT: Here is a really fancy 5.25" bay overclocking panel by ASUS:

https://www.asus.com/Motherboard-Accessory/OC_Panel/overview/

[ASUS OC Panel, Normal Mode]
 

cbn

cbn said:
That said, I am not sure how soon we would see DX12 multi-adapter become widespread. It might be some time away.

NTMBK said:
I don't think it will ever be particularly widespread. It must be a massive pain to get working properly (with the thousands of potential configurations you could make from any arbitrary combination of two GPUs), and only a handful of people would ever use it. I would rather see developers focus on using the integrated GPU for e.g. physics tasks, instead of trying to get two dGPUs working.

If one or two very major titles get it... then I think it will eventually, slowly, spread.

That said, I think the biggest obstacle to adoption is a lack of PCIe lanes (or more accurately, the lack of an equal distribution of PCIe lanes to the PCIe x16 slots).

In contrast, I think the biggest obstacle for iGPU multi-adapter would be CPU throttling.

EDIT: Some data below on various PCIe slots bottlenecking older video cards.

https://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html

[TechPowerUp PCIe scaling chart]


Keeping in mind that many Z170 boards have PCIe 3.0 x4 for the second slot, that is actually not bad. (Though at the same time, consider that a mere GTX 1050 Ti is actually a bit faster than a GTX 680... see post #30.)
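
For reference, a quick Python sketch of the bandwidth behind that "x4 is not bad" point (the link speeds and encoding overheads are standard PCIe spec values):

# One-direction usable bandwidth per link: lanes x transfer rate x encoding
# efficiency, converted from bits to bytes.
def pcie_gbs(lanes, gt_per_s, enc_num, enc_den):
    return lanes * gt_per_s * (enc_num / enc_den) / 8

print(f"PCIe 3.0 x16: {pcie_gbs(16, 8, 128, 130):.2f} GB/s")  # ~15.75 GB/s
print(f"PCIe 3.0 x4:  {pcie_gbs(4, 8, 128, 130):.2f} GB/s")   # ~3.94 GB/s
print(f"PCIe 2.0 x8:  {pcie_gbs(8, 5, 8, 10):.2f} GB/s")      # ~4.00 GB/s
# So a PCIe 3.0 x4 second slot lands at roughly PCIe 2.0 x8 bandwidth, which
# the linked scaling chart suggests costs only a few percent on cards of this class.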
 

NTMBK

cbn said:
If one or two very major titles get it... then I think it will eventually, slowly, spread.

That said, I think the biggest obstacle to adoption is a lack of PCIe lanes (or more accurately, the lack of an equal distribution of PCIe lanes to the PCIe x16 slots).

In contrast, I think the biggest obstacle for iGPU multi-adapter would be CPU throttling.

EDIT: Some data below on various PCIe slots bottlenecking older video cards.

https://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/23.html

[TechPowerUp PCIe scaling chart]

Keeping in mind that many Z170 boards have PCIe 3.0 x4 for the second slot, that is actually not bad. (Though at the same time, consider that a mere GTX 1050 Ti is actually a bit faster than a GTX 680... see post #30.)

Older test data is based on older games. Modern games demand much more from the PCIe bus, as they have a lot more data to transfer. Look at the number of games that won't run on a 1GB card any more! Not to mention, these average-framerate metrics don't show the worst of it; they don't capture stuttering the way that frame-time variance does.
 

VirtualLarry

This thread is kind of languishing. Anyways, I bought another pair of G3258 CPUs to go with the two Biostar H81 boards I bought at the start of the thread. That will give me six G3258 OC rigs. (Two with Gigabyte H81 boards, four with Biostar H81 boards.)

Too bad the G3258 iGPU won't do 4K (I don't think); that would really make these nice HTPCs. (But I do have three GTX 950 2GB OC cards to throw in them, and those have HDMI 2.0 ports, as does the RX 460 4GB Nitro card in the i5-6400 box I'm currently using.)

I guess the G3258's time in the "budget gaming rig" spotlight is basically over. Heck, even 4C/4T Intel i5 CPUs are starting to see their twilight for gaming purposes, with games like Watch Dogs 2 and BF1. Certainly, the Kaby Lake unlocked i3-7350K is going to be the last hurrah for dual-cores. Shame it's so expensive. I could see it being popular at $100, prompting people to spring for a "Z" overclocking board just to support it, leaving open the possibility of upgrading to a 4C/8T i7 on that platform in the future. Too bad Intel doesn't seem to see that opportunity and is charging too much for the unlocked i3.
 
