Gikaseixas
Platinum Member
I suppose you could try a Gigabyte UD5 or better.
The UD5 is great, I have had no problems at all.
The UD7 is even beefier; they went all out.
Not even remotely close to a 2500K. A 2500K can be 50% faster in games than an 8320.
If you look at the Shogun 2 results in Shintai's first link, the 8350 is getting less than 30 fps average and less than 20 minimum, and the 8320 is clocked 10% lower still, so I think a CPU bottleneck, at least in that game, is entirely plausible.
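For rough context, here's the back-of-envelope version of that clock-scaling argument. The fps and clock figures come from the post above; the linear scaling is my own simplifying assumption (real games scale sub-linearly with clock):

```python
# Back-of-envelope: if the game is fully CPU-bound, fps scales roughly
# with clock speed. Linear scaling is a simplifying assumption.
fx8350_avg, fx8350_min = 30.0, 20.0  # "less than" these, per the post
clock_penalty = 0.10                 # FX-8320 clocked ~10% lower

print(f"est. FX-8320 average: ~{fx8350_avg * (1 - clock_penalty):.0f} fps")
print(f"est. FX-8320 minimum: ~{fx8350_min * (1 - clock_penalty):.0f} fps")
```

With those inputs you land under 27 fps average and 18 fps minimum, which is why the bottleneck argument is plausible.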
Folk on the internet always say that AMD CPUs are no good, then post some benchmarks. I actually believed it and went from an FX-8350 to an i7-3770K. I did not notice a huge increase in frames... in fact it was single digits.
Anyway, I recently built an i5 PC for a friend and compared it against the FX-8350.
Thief on the i5 was 51.1 fps (highest settings) and on the FX it was 43.9. That was the biggest difference.
Tomb Raider was 63.9 on the i5 and 60 on the FX.
Rubbish gaming CPU? I don't think so.
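Just running the percentage arithmetic on the numbers quoted above (nothing here is new data, only the math):

```python
# Percentage deltas for the fps figures quoted in the post above.
results = {
    "Thief":       (51.1, 43.9),  # (i5 fps, FX-8350 fps)
    "Tomb Raider": (63.9, 60.0),
}

for game, (i5, fx) in results.items():
    print(f"{game}: i5 ahead by {(i5 - fx) / fx * 100:.1f}%")
```

That works out to roughly 16% in Thief and 6.5% in Tomb Raider, so the gap runs from single digits to mid-teens depending on the game.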
Agreed. I picked up my i5-3570 (non-K) for $10 less than the average FX-8350 was listed for (Newegg) at the time, so the usual "Intel costs SO much more" fanboy claims went straight out the window. Even at stock 3.4GHz it regularly scored 5-30% higher fps in dozens of the 200-odd games I own, including 20%+ gains in Skyrim, Oblivion, StarCraft 2, Thief, etc. With a mild OC to 4GHz (the FX-8350's stock clock), performance was up to 45% higher whilst drawing less than half the power (measured).

Someone may find them usable in GPU-limited games, but not every game is GPU-limited, and the big problem with the FX-8320 and 8350 is that they are inconsistent. You can find games where they are easily outperformed by an i3, then another game where performance is acceptable. And in those games where performance is acceptable, you're usually looking at three-generation-old Sandy Bridge-level performance. The cases where they perform well are few and far between, whereas the Intel platform will give you consistency no matter what, and the cost really isn't that much greater.
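To illustrate the perf-per-watt angle, a rough sketch; the fps and wattage inputs below are placeholders made up for the example, not the measurements from the post:

```python
# Rough perf-per-watt comparison. All inputs are hypothetical
# placeholders, NOT measured numbers from the post above.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

i5_fps, i5_watts = 60.0, 75.0    # hypothetical i5-3570 @ 4GHz, game load
fx_fps, fx_watts = 41.0, 160.0   # hypothetical FX-8350, same load

ratio = perf_per_watt(i5_fps, i5_watts) / perf_per_watt(fx_fps, fx_watts)
print(f"i5 delivers {ratio:.1f}x the frames per watt")
```

Plug in ~45% more fps at less than half the draw and you get roughly 3x the frames per watt, which is the shape of the claim above.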
Not totally rubbish, but lower-performing than Intel with higher power consumption for the same money.
Wow. It is hard to believe that a shared FPU in itself could cause delays of that magnitude. Going from 500 on an i5 to over 12,000 on the AMD tells me there has to be some serious cache/memory thrashing, or something on that order of magnitude. This is exactly the kind of obvious bug that could be profiled out, if AMD had any money to spend.
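If anyone wanted to profile it, here's a minimal sketch of the idea: time the same floating-point workload on two cores that share a Bulldozer module versus two cores in different modules. Caveats: it's Linux-only (os.sched_setaffinity), the core pairing (0,1 same module / 0,2 different modules) is an assumption about the topology, and a pure-Python loop is interpreter-heavy, so a native workload would show the FPU effect more clearly; the pinning pattern is the point:

```python
import os
import time
from multiprocessing import Process

def fp_work(core: int, iters: int = 5_000_000) -> None:
    """Pin to one core and grind an FPU-bound loop."""
    os.sched_setaffinity(0, {core})
    x = 1.0001
    for _ in range(iters):
        x = x * 1.0000001 + 0.000001

def run_pair(cores) -> float:
    """Run the workload on both cores at once; return wall time."""
    procs = [Process(target=fp_work, args=(c,)) for c in cores]
    t0 = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - t0

if __name__ == "__main__":
    # Core numbering is an assumption; check your topology with lscpu.
    print(f"same module (0,1):       {run_pair((0, 1)):.2f}s")
    print(f"different modules (0,2): {run_pair((0, 2)):.2f}s")
```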
It's not the hardware that's inconsistent, it's the software. It's not like an FX-8320 or 8350 gets asthma or something running a game. The games that exhibit that performance boost are optimized for Hyper-Threading / compiled for Intel processors. For the actual performance of the hardware, PassMark is pretty much spot on that an FX-8350 is a hair slower than an i7-3770.
Now that a heck of a lot more games are being developed for AMD CPUs (thanks to the Xbox One and PS4 design wins), I doubt that lopsided game performance will be as noticeable in the future.
Depends on the task. Playing games? Sure.
Multi-threaded apps -- an FX 8350 is probably faster than an i5 at just about all of them.
I don't think it's a CPU issue... more like the motherboard chipset pausing to load data from the hard drive
I've said before that AMD's main problem is not the CPU; I believe the FX 8xxx and 9xxx series are actually very good CPUs. The issue is the ancient motherboard chipsets with second-rate SATA and USB performance.
In every activity that depends mainly on the CPU, my FX-8350 excels, but if I compare memory, SSD, and USB speeds against my Intel system, the difference is night and day.
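For anyone wanting to check the storage half of that claim themselves, a crude sequential-read timing sketch; the file path is a placeholder, and on Linux you'd want to drop the page cache first (sync && echo 3 | sudo tee /proc/sys/vm/drop_caches) for an honest number:

```python
import time

CHUNK = 4 * 1024 * 1024  # read in 4 MiB chunks

def read_throughput(path: str) -> float:
    """Sequentially read a file and return MB/s."""
    total = 0
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            total += len(chunk)
    return total / (time.perf_counter() - t0) / 1e6

# '/path/to/testfile' is a placeholder; point it at a large file
# on the drive you want to test.
print(f"{read_throughput('/path/to/testfile'):.0f} MB/s")
```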
So it's the software's fault that it doesn't run well on AMD, not AMD's fault that the CPU doesn't run software well.
BTW, the CPU in the game consoles isn't an FX chip, so optimizing for a console doesn't get the games optimized for a PC.