Best value "Entry Level" gaming PC.


cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't really see the point of going to the expense of PC gaming and then trying to run it on marginal hardware, especially when so much more performance can be had for the cost of a game or two.

I don't think PC gaming necessarily has to be expensive.

There are actually good games that are free to play on both Steam (e.g., Team Fortress 2) and Origin. (I even got Battlefield 3 for free when Origin was giving it away for a limited time.)

...And there are all the sales that go on too. (e.g., I got Crysis 2 for one dollar in a Humble Bundle sale.)

EDIT: Then there are also freemium games like Warface (built with Crytek's engine: http://en.wikipedia.org/wiki/Warface), which I have played a number of times without feeling the need to spend any money to be competitive.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Madpacket, regarding your benches:

How much RAM did you allocate to iGPU?

Currently 1GB. Likely overkill for this GPU, but with 8GB of total system RAM, allocating 1/8th to the video card seemed fine.

I'm setting up the new H81I board tonight instead of running more benchmarks. This way I can run benchmarks on both systems in parallel.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
But I also don't think it is fair to project onto me that I don't think anyone should be gaming unless they have a high-end system, for that certainly is not the case. In fact, my system is far less powerful than an R9.

Fair enough. I should have written R7 instead of R9. (Although it is usually true that an overclocked G3258 is paired with at least a GTX 760 as the "powerful GPU" when determining whether it is worthy of playing games. GTX 760 > R9 270 and R9 270X, and probably R9 280 as well.)
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Fair enough. I should have written R7 instead of R9. (Although it is usually true that an overclocked G3258 is paired with at least a GTX 760 as the "powerful GPU" when determining whether it is worthy of playing games. GTX 760 > R9 270 and R9 270X, and probably R9 280 as well.)

The 760 is about equal to a 270X; it trades blows with a 280 in some games and loses by more than 30% in others.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
Currently 1GB. Likely overkill for this GPU, but with 8GB of total system RAM, allocating 1/8th to the video card seemed fine.

I'm setting up the new H81I board tonight instead of running more benchmarks. This way I can run benchmarks on both systems in parallel.

Don't go high on fixed memory for the Intel iGPU; it tends to cause issues, and it tends not to use the fixed allocation at all because it goes directly to DVMT.

Not sure if that changed on recent drivers.

About the Celeron: guys, I'm sure the memory is more of a bottleneck than the CPU itself. Can you do dual-channel DDR3-1333 vs. dual-channel DDR3-1600?
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
The thing is... it does not use the fixed allocation unless you disable DVMT. Not sure if that changed on new drivers; I realised that on an HD 3000. You can check the iGPU fixed/DVMT memory usage in GPU-Z. I ended up using about 32/64 MB fixed because of that. On the HD 3000, artifacts start at 1 GB fixed. Again, not sure if that has been fixed; it was a well-known issue.

Things like this were fairly common:

[attached screenshot]

And that's an HD 4600.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Comparing the Dirt 3 results (using the game's built-in benchmark) listed in posts #113 and #119:

Athlon 5350 (Madpacket's system with 8GB RAM and 1GB allocated to GPU):

Ultra Low pre-set (with Multisampling OFF):

1920x1080 = 30.15 FPS, 24.33 MIN
1366x768 = 49.67 FPS, 40.87 MIN
1024x768 = 61.00 FPS, 46.89 MIN
800x600 = 69.00 FPS, 50.69 MIN

Low pre-set (with Multisampling OFF):

1920x1080 = 22.99 FPS, 18.73 MIN
1366x768 = 37.87 FPS, 28.44 MIN
1024x768 = 45.76 FPS, 34.53 MIN
800x600 = 52.90 FPS, 38.21 MIN

Medium pre-set (with Multisampling OFF):

1280x720 = 29.21 FPS, 24.21 MIN

vs.

Pentium G3258 @ G3220 speeds (My system with 4GB RAM @ 1333, 512 MB allocated to iGPU):

Ultra Low pre-set (with Multisampling OFF):

1920x1080 = 39.44 FPS, 32.74 MIN
1366x768 = 67.1 FPS, 53.57 MIN
1024x768 = 82.41 FPS, 63.97 MIN
800x600 = 102.41 FPS, 76.03 MIN

Low pre-set (with Multisampling OFF):

1920x1080 = 30.37 FPS, 25.02 MIN
1366x768 = 50.67 FPS, 40.17 MIN
1024x768 = 60.04 FPS, 48.39 MIN
800x600 = 73.03 FPS, 55.33 MIN

Medium pre-set (with Multisampling OFF):

1280x720 = 31.99 FPS, 27.19 MIN

Overall, the Pentium G3258 @ G3220 speeds beat the Athlon 5350 at every data point tested. However, I think it is important to point out that the results were closest at the Medium pre-set (1280 x 720 resolution): the Pentium got 31.99 avg FPS and the Athlon 5350 got 29.21. (The Pentium's avg frame rate was only 9.5% higher than the Athlon 5350's at this resolution and preset.)
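For anyone who wants to recheck these deltas, here is a minimal Python sketch; the averages are copied straight from the two result lists above, and nothing else is assumed:

```python
# Average FPS from the two Dirt 3 result lists above: (preset, resolution) -> avg FPS.
athlon_5350 = {
    ("ultra low", "1920x1080"): 30.15, ("ultra low", "1366x768"): 49.67,
    ("ultra low", "1024x768"): 61.00,  ("ultra low", "800x600"): 69.00,
    ("low", "1920x1080"): 22.99,       ("low", "1366x768"): 37.87,
    ("low", "1024x768"): 45.76,        ("low", "800x600"): 52.90,
    ("medium", "1280x720"): 29.21,
}
pentium_g3258 = {
    ("ultra low", "1920x1080"): 39.44, ("ultra low", "1366x768"): 67.10,
    ("ultra low", "1024x768"): 82.41,  ("ultra low", "800x600"): 102.41,
    ("low", "1920x1080"): 30.37,       ("low", "1366x768"): 50.67,
    ("low", "1024x768"): 60.04,        ("low", "800x600"): 73.03,
    ("medium", "1280x720"): 31.99,
}

for key, athlon_fps in athlon_5350.items():
    delta = (pentium_g3258[key] / athlon_fps - 1) * 100
    print(f"{key[0]:>9} @ {key[1]:>9}: Pentium avg is {delta:5.1f}% higher")
# The medium @ 1280x720 line comes out to ~9.5%, the smallest gap in the set.
```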
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I'll try to have some benchmarks with the A4-4000 and Athlon 5150 this weekend. I can try the Celeron G1840 next week.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
The Pentium's avg frame rate was only 9.5% higher than the Athlon 5350's at this resolution and preset.

I think this shows the GPU on the 5350 has the fillrate necessary to hang with the GT1 in non-CPU-intensive scenarios. More testing is needed.

So I have my H81 board all set up with a fresh install of Windows 8.1, ready to go. Unfortunately, I somehow hosed the Windows install on my 5350 last night, so I'll have to do a clean install there as well. At least both systems will be on even ground.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
I'll try to have some benchmarks with the A4-4000 and Athlon 5150 this weekend. I can try the Celeron G1840 next week.

Perfect. We should have most of the relevant entry chips covered. Perhaps I'll pick up the new 7300 next week, unless someone here has a 6400K to test with?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I think this shows the GPU on the 5350 has the fillrate necessary to hang with the GT1 in non-CPU-intensive scenarios. More testing is needed.

I don't think this has to do with fillrate, because if we look at the following 1080p results:

Athlon 5350:

1920 x 1080
ultra low: 30.15 FPS, 24.33 MIN
low: 22.99 FPS, 18.73 MIN


Pentium G3258 @ G3220 speeds:

1920 x 1080
ultra low: 39.44 FPS, 32.74 MIN
low: 30.37 FPS, 25.02 MIN

At both the 1080p low and 1080p ultra low presets, the Pentium's average frame rate is ~30% higher.

But compare 1280 x 720 (medium) to 1080p (low), and wow, what a difference in the size of the delta:

Pentium G3258 @ G3220 speeds:
1920 x 1080 (low preset): Avg 30.37 FPS, 25.02 MIN
1280 x 720 (medium preset): Avg 31.99 FPS, 27.19 MIN

Athlon 5350:
1920 x 1080 (low preset): Avg 22.99 FPS, 18.73 MIN
1280 x 720 (medium preset): Avg 29.21 FPS, 24.21 MIN

Moving from 1080p (low) to 1280 x 720 (medium), the Pentium doesn't gain much in frame rate, going from 30.37 avg FPS to 31.99 avg FPS. However, the Athlon gains a lot making the same transition, going from 22.99 avg FPS (at 1080p low) to 29.21 avg FPS (at 1280 x 720 medium).

So I am thinking this difference is more about the AMD being better at shading/textures than the Intel, rather than about fillrate. In fact, I have to wonder what would happen if we were to lower the resolution to 1024 x 768 or 800 x 600 and increase the preset to high or ultra?
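To put some numbers on that scaling argument, here is a small sketch using the same figures as above, comparing the pixel-count drop from 1080p to 720p with how much each chip actually gained (keeping in mind that the medium preset is also heavier per pixel than low, so this is not a pure resolution comparison):

```python
# 1080p has 2.25x the pixels of 720p.
print(f"pixel ratio: {(1920 * 1080) / (1280 * 720):.2f}x")  # ~2.25x

# Avg FPS from the posts above: (1080p low, 720p medium).
results = {
    "Pentium G3258 @ G3220 speeds": (30.37, 31.99),
    "Athlon 5350": (22.99, 29.21),
}
for chip, (fps_1080p_low, fps_720p_medium) in results.items():
    gain = (fps_720p_medium / fps_1080p_low - 1) * 100
    print(f"{chip}: +{gain:.1f}% going from 1080p low to 720p medium")
# Pentium: ~+5.3%; Athlon: ~+27%. The Athlon benefits far more from the lower
# resolution, which is why the gap nearly closes at 720p medium.
```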
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Regarding how low resolutions look on monitors, I have noticed quite a difference in how various settings look depending on the monitor.

At the moment, I have three monitors:

1. Acer 21.5" LCD 1920 x 1080 with a 16:9 aspect ratio
2. Dell 17" LCD 1280 x 1024 with a 5:4 aspect ratio
3. Dell 21" CRT (made by Sony) with 4:3 aspect ratio

For example, I've noticed experimenting with Skyrim at 800 x 600 (lowest settings, no AA) that it actually looks surprisingly good full screen on the 21" Dell CRT.

But move that 800 x 600 over to the 16:9 Acer, and I noticed more jagged lines (using the same settings) when running full screen. Fortunately, Skyrim has the option to run lower resolutions in a window. This helps, but it results in a rather small window. (I think if my 1080p LCD were larger, maybe 24" or 27", I'd be much happier with the size of the 800 x 600 window.)

So if playing at lower resolutions, monitor selection is something to consider.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
So I am thinking this difference is more about the AMD being better at shading/textures than the Intel, rather than about fillrate. In fact, I have to wonder what would happen if we were to lower the resolution to 1024 x 768 or 800 x 600 and increase the preset to high or ultra?

Here are some more Dirt 3 results (with same hardware set-up) to add to the ones in post #119.

800 x 600:

medium (multisampling off): 51.99 avg FPS, 44.25 min FPS
high (multisampling off): 41.57 avg FPS, 34.23 min FPS
ultra (multisampling off): 17.29 avg FPS, 15.28 min FPS

1024 x 768:

medium (multisampling off): 37.02 avg FPS, 32.11 min FPS
high (multisampling off): 30.95 avg FPS, 25.92 min FPS

P.S. Regarding the visuals from the higher presets, I do think moving from ultra low to low is the most noticeable. Moving to medium and beyond (at these low resolutions), I did not notice much of a change.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Hey guys,

I didn't mean to let this thread die; I've just been busy with family and work stuff. I had some time to take two crappy new videos on the 5350 over the weekend but only uploaded them now. Please take a look.

Skyrim at 720p, medium details: http://youtu.be/sSqFnT_c7D8
Battlefield 4 single player: http://youtu.be/S-9ULHTGO50

I also did some testing on the G3258 with Skyrim at the same graphic settings and it was noticeably faster. Further testing will be done sometime this week, I plan on benching Titanfall and Crysis 3 on both systems.

Thanks again CBN for testing.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
For example, I've noticed experimenting with Skyrim at 800 x 600 (lowest settings, no AA) that it actually looks surprisingly good full screen on the 21" Dell CRT.

Having run some tests on my FW900 CRT, I concur that games at low resolutions look a ton better than on my LCD screen. I think this is mainly due to the lack of interpolation; LCDs have issues at non-native resolutions.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I plan on benching Titanfall and Crysis 3 on both systems.

I've tested Crysis 2 on my G3258 @ G3220 speeds, and I had a hard time with FPS (at 800 x 600 with everything at lowest, it stays just above 30 FPS, sometimes hitting 40 FPS during gameplay on the first level of the game).

So it will be interesting to see how both processors do on Crysis 3.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Were you able to get FPS to go appreciably higher with lower resolution and/or detail settings?

Yes, much higher FPS when lowering the details on both the 5350 and the 3258 with Skyrim.

So it will be interesting to see how both processors do on Crysis 3.

I've already played through the first mission on the G3258 with Crysis 3. It wasn't pretty, but it was still surprisingly playable at the absolute lowest settings.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Some hardware I am on the lookout for at sale prices:

MSI A55M-E33 FM2+: This is the FM2+ board most often seen in the sale-priced bundles at various retailers (Newegg, Tiger Direct, Fry's). It has also been listed as low as $25 by itself at Tiger Direct. I am hoping to pick one up as soon as it is cheap enough so I can have a representative sample for the bargain iGPU gamer category being investigated in this thread.



Athlon 5350/AM1 Mini-ITX: I would like to find a motherboard that will let me disable cores (which I hope all AM1 boards are capable of doing) so I can lower clocks and configure the Athlon 5350 as an "Athlon 5150" (by lowering the multiplier to 16), a "Sempron 3850" (by lowering the multiplier to 13 and lowering iGPU clocks from 600 MHz to 450 MHz), or a "Sempron 2650" (by disabling two cores, lowering the multiplier to 14.5, lowering iGPU clocks from 600 MHz to 400 MHz, and downclocking memory from 1600 MHz to 1333 MHz). For the "Sempron" configurations I will probably run the Half-Life 2 Lost Coast benchmark after making an attempt at running the Dirt 3 benchmark.
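For reference, here is a quick sketch of those simulated configurations. It assumes the stock 100 MHz AM1 reference clock (so CPU clock = multiplier x 100 MHz); the multipliers, iGPU clocks, and memory speeds are the ones listed above, and the stock 5350 row is included only for comparison:

```python
# Simulated AM1 configurations derived from an Athlon 5350, as described above.
# Assumes the stock 100 MHz reference clock, so CPU clock = multiplier x 100 MHz.
BASE_CLOCK_MHZ = 100

configs = {
    # name: (cores, CPU multiplier, iGPU MHz, memory MHz)
    "Athlon 5350 (stock)": (4, 20.5, 600, 1600),
    "Athlon 5150 (simulated)": (4, 16.0, 600, 1600),
    "Sempron 3850 (simulated)": (4, 13.0, 450, 1600),
    "Sempron 2650 (simulated)": (2, 14.5, 400, 1333),
}

for name, (cores, multiplier, igpu_mhz, mem_mhz) in configs.items():
    cpu_ghz = multiplier * BASE_CLOCK_MHZ / 1000
    print(f"{name}: {cores} cores @ {cpu_ghz:.2f} GHz, "
          f"iGPU {igpu_mhz} MHz, DDR3-{mem_mhz}")
```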



PowerColor Go! Green AX5450 1GB 64-bit DDR3: This is a passive HD 5450 that routinely goes for $10 after rebate with free shipping at Newegg. I will combine it with my old E2180 (Core 2 Duo) system for some low-end hardware benchmarks.

 

Eric1987

Senior member
Mar 22, 2012
748
22
76
I think the average frame rate for playable should be 30.

For first person shooters, the average frame rate should be 60.

(If the average frame rate falls short, so be it. But I think an attempt to reach 30 average FPS should at least be made by lowering resolution and/or detail settings)

Why does everyone say 60 FPS for a shooter? 30 FPS is just fine, without a doubt. Under 30, yeah, I can understand. In games like Watch Dogs you don't even need a constant 30 FPS to play just fine.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Well, since this is about entry-level gaming PCs, a 30 FPS average for every game should be OK.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Well, since this is about entry-level gaming PCs, a 30 FPS average for every game should be OK.

For the first person shooters, I think an attempt to hit 60 FPS should be made as a point of reference. Then let the individual user decide whether the frame rate gain is worth giving up resolution and/or detail settings.

I think this will be especially important for multiplayer where folks will want to play at 60 FPS if at all possible.

Of course, with that mentioned, how does one determine what settings will be sufficient for 60 FPS multiplayer in a first person shooter? There is no benchmark that simulates those loads. (i.e., even if a processor is tested to achieve 60 FPS at some resolution and detail setting in a single player benchmark, there is no guarantee that the same frame rate will hold up in multiplayer.)
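One rough way to look at the 30 vs. 60 FPS targets is as per-frame time budgets; a minimal sketch (frame_time_ms is just an illustrative helper, and the 31.99 FPS figure is the Dirt 3 720p medium average from earlier in the thread):

```python
# Convert the 30/60 FPS targets discussed above into per-frame time budgets.
def frame_time_ms(fps: float) -> float:
    """Average time available per frame at a given frame rate."""
    return 1000.0 / fps

for target_fps in (30, 60):
    print(f"{target_fps} FPS target -> {frame_time_ms(target_fps):.1f} ms per frame")

# Example: the Pentium averaged 31.99 FPS in the Dirt 3 720p medium run above.
avg_fps = 31.99
print(f"{avg_fps} FPS avg -> {frame_time_ms(avg_fps):.1f} ms per frame "
      f"(meets the 30 FPS target, well short of 60)")
# Caveat, as noted above: a single player benchmark average says nothing about
# how the frame rate will hold up under multiplayer load.
```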
 