Impressed with FX-8350 and the new article at Anand

Page 12

galego

Golden Member
Apr 10, 2013
1,091
0
0
Fascinating how an up-to-10% increase from the scheduler hotfix has been magically reduced to 1% by a guy who calls others names...

As shown by Anandtech, the FX hotfixes can increase the framerate of some games by up to 10%. Several games gain about 5%.

43703.png


Using adequate memory speeds accounts for another 5%. Summing just those two effects already gives about a 10% increase for the same chip.

This 10% is about the average difference in performance, reported above, between the FX-8350 and the i7-3770k. It is roughly the improvement from going from one generation of chips to the next.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,293
146
I might buy an 8350 if they'd make a toaster attachment for the CPU cooler, or if they could incorporate one into the block heater on my diesel pickup.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
Fascinating how an up-to-10% increase from the scheduler hotfix has been magically reduced to 1% by a guy who calls others names...

As shown by Anandtech, the FX hotfixes can increase the framerate of some games by up to 10%. Several games gain about 5%.

43703.png


Using adequate memory speeds accounts for another 5%. Summing just those two effects already gives about a 10% increase for the same chip.

This 10% is about the average difference in performance, reported above, between the FX-8350 and the i7-3770k. It is roughly the improvement from going from one generation of chips to the next.

You still haven't answered Gusk about your machine specs :whistle:

Where's that 10%?
image003.png

I can't see it anywhere. Instead I see 25% better performance from the i5 2500 (not even IB) over the hotfixed FX.
Also, all major sites reported an ~1% all-around gain for the hotfix; you might wanna debate them and prove everyone wrong.
Above 1333 memory, gains are 1-2% in gaming, and that's being generous.
So, stop living in your dream world and be honest with yourself.

It's not the reviewers'/OSes'/mobo/BIOS/RAM/compilers' fault that the FX performs worse. It's the CPU. And even if it weren't, who cares? People want performance, and will buy whatever gives them the best. They don't care about your conspiracy theories.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
This 10% is about the average difference in performance, reported above, between the FX-8350 and the i7-3770k.

You got your image from this page, so you can see quite easily for yourself that not only was that not the average improvement, but not a single benchmark showed more than about half of it.

You're being deliberately dishonest.

I jotted down all the before-and-after numbers from the page where you got your graph. The average improvement was 3%. Furthermore, the preceding page in that same article shows that the improvement under light and heavy loads is 0%.

What's the average of 0% and 3% again? A lot closer to 1% than 10%.
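
That back-of-envelope averaging is easy to reproduce. A minimal sketch follows; the before/after numbers here are hypothetical placeholders, not the actual figures from the article:

```python
# Average per-benchmark gain from before/after scores.
# The (before, after) pairs below are hypothetical placeholders,
# NOT the actual numbers from the AnandTech page.
def pct_gain(before, after, lower_is_better=False):
    """Percentage improvement of `after` relative to `before`."""
    if lower_is_better:  # e.g. a compile time in minutes
        return (before - after) / before * 100
    return (after - before) / before * 100

fps_pairs = [(60.0, 61.8), (45.0, 46.8), (80.0, 81.6)]  # hypothetical FPS
gains = [pct_gain(b, a) for b, a in fps_pairs]
average = sum(gains) / len(gains)  # ~3% for these placeholder numbers
print(f"per-benchmark gains: {[round(g, 1) for g in gains]}")
print(f"average gain: {average:.1f}%")
```

The `lower_is_better` flag covers charts where a smaller number is better, such as a compile time in minutes, which flips the sign of the gain.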
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Uh, the compiler chart is time in minutes, so the hotfix is still an improvement...

Regardless, it's kind of silly to be talking about the original Bulldozer when the Piledriver FX x3xx series is what's currently available.

Edit1: Or did you know that about the compile graph and were being doubly deceptive? If so, that's not really constructive at all.

Edit2: Woah, instead of editing it out completely why not just type in a mea culpa? For those coming in late, post above me was countering what they believe is cherry picking with a factually incorrect statement regarding a graph from the same source.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
You still haven't answered Gusk about your machine specs :whistle:

Where's that 10%?
image003.png

I can't see it anywhere. Instead I see 25% better performance from the i5 2500 (not even IB) over the hotfixed FX.
Also, all major sites reported an ~1% all-around gain for the hotfix; you might wanna debate them and prove everyone wrong.
Above 1333 memory, gains are 1-2% in gaming, and that's being generous.
So, stop living in your dream world and be honest with yourself.

It's not the reviewers'/OSes'/mobo/BIOS/RAM/compilers' fault that the FX performs worse. It's the CPU. And even if it weren't, who cares? People want performance, and will buy whatever gives them the best. They don't care about your conspiracy theories.

You are the one mentioning conspiracy theories; the rest of us are analysing facts.

Nobody said 10% for DiRT 3. If you go and read my post, I said 5% just before linking the DiRT 3 benchmark. Anandtech writes:

DiRT 3 shows a 5% performance gain from the hotfixes.

Maybe some sites report a 1% or 2% average, but that average includes single-threaded applications for which the hotfix is useless.

1-2% in gaming for Intel CPUs due to RAM? Agreed. Not for AMD FX, nor of course for APUs.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
You got your image from this page, so you can see quite easily for yourself that not only was that not the average improvement, but not a single benchmark showed more than about half of it.

You're being deliberately dishonest.

I jotted down all the before-and-after numbers from the page where you got your graph. The average improvement was 3%. Furthermore, the preceding page in that same article shows that the improvement under light and heavy loads is 0%.

What's the average of 0% and 3% again? A lot closer to 1% than 10%.

This is a complete misunderstanding of my post.

If you read the parts of my post that you snipped, I wrote:

As shown by Anandtech, the FX hotfixes can increase the framerate of some games by up to 10%.
Up to 10% does not mean 10% average.

The next page

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested/4

has a table that gives a 10% increase in Left 4 Dead 2. However, I did not use that maximum value; I used the 5% measured for DiRT 3.

The part of my message that you are quoting now says something different. Reintroducing the snipped parts, I wrote:

Several games gain about 5% [the DiRT 3 figure]. Using adequate memory speeds accounts for another 5%. Summing just those two effects already gives about a 10% increase for the same chip.

This 10% is about the average difference in performance, reported above, between the FX-8350 and the i7-3770k. It is roughly the improvement from going from one generation of chips to the next.

That 10% mentioned in the first paragraph is the sum of the 5% due to the hotfix (e.g. DiRT 3) plus 5% due to RAM. This total 10% improvement on an FX Bulldozer chip is then compared to the 10% difference between the Piledriver FX and the i7, which is mentioned in many places, including one or two pages ago in this same thread when we were discussing Linux benchmarks of the FX-8350 and the i7-3770k.
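
For what it's worth, two independent speedups of about 5% each combine multiplicatively rather than by plain addition. A quick sketch, taking the quoted figures at face value:

```python
# Combining two independent ~5% speedups (hotfix + RAM, as claimed above).
hotfix = 1.05  # ~5% from the scheduler hotfix (the DiRT 3 figure)
ram = 1.05     # ~5% assumed from running the RAM at an adequate speed
combined = (hotfix * ram - 1) * 100
print(f"combined gain: {combined:.2f}%")  # ~10.25%, close to the quoted ~10%
```

The multiplicative combination assumes the two effects really are independent, which is of course part of what is in dispute here.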

Next time, before you call others dishonest, you should think twice; maybe there was some misreading, or some part was not properly explained. OK?
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
The next page

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested/4

has a table that gives a 10% increase in Left 4 Dead 2. However, I did not use that maximum value; I used the 5% measured for DiRT 3.

That table doesn't compare just the hotfix. It compares two different operating systems. Apples and oranges -- we don't know what impact the OS change would have on Intel CPUs, for example.

And people are not going to change operating systems to get a few percentage points of improvement on their CPUs.

And... even with all that, you're still cherry-picking and over-emphasizing outliers. You talk about the 10% outlier and not about the others which are in the 2-4% range.

This total 10% is then compared to the 10% difference between the FX and the i7, which is mentioned in many places, including one or two pages ago in this same thread when we were discussing linux benchmarks.

And again, you are cherry-picking small numbers of benchmarks that support your position and ignoring all of the others, in addition to exaggerating the average or overall differences. Others have said they don't see this 10%, and I don't see it either.

Finding one benchmark with a 5% difference and then deciding that you should add another 5% because of some RAM difference you decided was important does not add up to an actual 10% difference.

ETA: Another direct quote you made: "As shown by Anandtech, the FX hotfixes can increase the framerate of some games by up to 10%. Several games gain about 5%." Neither of those comments is accurate.

Most of us are here not because we love Intel and hate AMD or the other way around. We're here because we're interested in technology. When you distort data to push an agenda, you make meaningful exploration of performance issues impossible. If you're going to reference benchmarks, you should reference them fairly -- which means looking at ALL of them, not just the ones you like.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
To the poster using the name galego: first, post the specs of the computers you own, like I do. Do you own either a 3770K or an FX-8350?

Mr. Kozierok's post ("Most of us are here not because we love Intel and hate AMD or the other way around. We're here because we're interested in technology. When you distort data to push an agenda, you make meaningful exploration of performance issues impossible. If you're going to reference benchmarks, you should reference them fairly -- which means looking at ALL of them, not just the ones you like") is right on the money.

You see, galego, I OWN both a 3770K rig AND an FX-8350 rig, set forth below. In fact I also own an 8320 rig OC'd to 4.3 GHz and a 965BE rig, so I'm hardly partial to Intel.

In addition, I've compared the various benchmarks with my 3770K vs the 8350, and my findings bear out that the 3770K is faster. Again, have you actually tested the CPUs (3770K and FX-8350) against each other, or are your postings based upon something you read?
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
That table doesn't compare just the hotfix. It compares two different operating systems. Apples and oranges -- we don't know what impact the OS change would have on Intel CPUs, for example.

The improved scheduler is already included by default in W8. And if my memory serves, that table comes from the AMD blog discussing the scheduler hotfix.

And people are not going to change operating systems to get a few percentage points of improvement on their CPUs.

And... even with all that, you're still cherry-picking and over-emphasizing outliers. You talk about the 10% outlier and not about the others which are in the 2-4% range.

I mentioned the maximum value of 10%, but I did not use it. I used the 5% measured for DiRT 3.

And again, you are cherry-picking small numbers of benchmarks that support your position and ignoring all of the others, in addition to exaggerating the average or overall differences. Others have said they don't see this 10%, and I don't see it either.

The other person believed that I had said 10% for DiRT 3 when I was saying 5% for DiRT 3. I even added the DiRT 3 benchmark figure to my post!!!

Finding one benchmark with a 5% difference and then deciding that you should add another 5% because of some RAM difference you decided was important does not add up to an actual 10% difference.

ETA: Another direct quote you made: "As shown by Anandtech the FX hotfixes can increase up to a 10% the framerate of some games. Several games obtain about a 5%". Neither of those comments is accurate.

It is not only "one" benchmark. "Several games gain about 5%" seems completely reasonable to me. The table mentioned contains two games for which the improvement is 4%. Page 3 of the same review contains several benchmarks at about 5%:

DiRT 3 shows a 5% performance gain from the hotfixes.
Crysis Warhead mirrors the roughly 5% gain we saw in DiRT 3
Civilization V's CPU bound no render results show no gains, which is to be expected. But looking at the average frame rate during the simulation we see a 4.9% increase in performance.
And other non-gaming benchmarks on the same page also give about 5%:

Our Visual Studio 2008 compile test is heavily threaded for the most part, however the beginning of the build process uses a fraction of the total available cores. The hotfixes show a reasonable impact on performance here (~5%)
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
As you well know, if you bitcoin/litecoin mine, your budget for a $150-200 video card can (and should) be fairly easily allocated into a $380-410 7970 for dual use. This is also the most cost effective way to support AMD if you're into that.

Also, nice moving the goalposts........ again.

"You should get 8350 for games that aren't cpu intensive" is a terrible argument for using 8350 over 3570k in gaming.

Absolutely, I'd never game with less than a 7970 on a desktop that can support one, but I'm not exactly an average consumer. Tell the average mom or kid that the expensive $450 card is actually cheaper because it pays for itself through bitcoin mining and you will just get blank stares.

I don't think I ever even set any goalposts. My stake in this thread was simply disputing the absurd rhetoric that the FX-8350 was a bad deal because of power usage, and then the conversation sorta flowed into games, so I pointed out how the 8350 does fine at running the games in the AT bench. Then someone pulled out a cherry-picked result from another review... so how am I the one moving the goalposts?

I never said the 8350 was awesome at all games. I am simply saying that the 8350 is a poor match for a 7970- why bother saving $50 or $100 on your CPU just to go and spend $450 on a video card? It doesn't make a lot of sense. On the other hand, if you are building under a budget, and you are looking at $200 video cards, the 8350 is more than enough CPU for them.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,293
146
I never said the 8350 was awesome at all games. I am simply saying that the 8350 is a poor match for a 7970- why bother saving $50 or $100 on your CPU just to go and spend $450 on a video card? It doesn't make a lot of sense. On the other hand, if you are building under a budget, and you are looking at $200 video cards, the 8350 is more than enough CPU for them.

Actually, the article referenced in the OP draws the opposite conclusion, at least at 1440p: an 8350 is more than enough for a 7970 in GPU-limited situations.
 

jaqie

Platinum Member
Apr 6, 2008
2,471
1
0
Most of us are here not because we love Intel and hate AMD or the other way around. We're here because we're interested in technology. When you distort data to push an agenda, you make meaningful exploration of performance issues impossible. If you're going to reference benchmarks, you should reference them fairly -- which means looking at ALL of them, not just the ones you like.
Definitely and without a doubt this is something so many people seem to misunderstand or not comprehend at all.

I don't even pay much mind to the brand anymore, just the benchies and memory-experience of the particular cpu (and chipset and ram et al)...

Now that you bring it up, though, let's see. These are systems I use at least daily:
intel pm 1500 tablet laptop
intel core2duo 2333
amd phenom II x6 @3500

Systems I am building / rebuilding / are in hot 'ready' storage (built and good to plug in and go) usually used once a month to once a week
amd 386dx40
intel 486dx2/66
intel p166
intel p2 400
intel p3m-750 laptop
intel p3 coppermine 1000
intel dual p3 tualatin 1400
amd athlon and duron socket A (multiple, forgot all of them! :eek: )
amd c50 1000 netbook
amd athlon64 x2 1900 notebook
amd athlon64 x2 3800+ 2000
intel pentiumDualCore (early core2duo less L2 lower FSB) 1867 laptop
intel core2duo wolfdale e3200 @3500
intel core2quad 9650 3000
amd athlon II x4 3000
amd dual opteron 8354 (8 cores)
-----------------------------------------
I am not even going to bother trying to remember and list everything I have motherboards, CPUs, and various parts for but no built (or in-progress) system that uses them... I would run out of room or fall asleep typing...

Clearly I must either be a fangirl of amd or intel, based solely on a single opinion I post in a single thread somewhere... :rolleyes:
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
batman.gif


warhammer.gif


formula.gif


metro.gif


mafia.gif


There go the "magical scheduler improvements" down the toilet.

That leaves us with your "slow RAM" argument. Except that most reviews use identical RAM speeds between CPUs, so your "10%" claim becomes a nice round 0%. Oh, and if the difference between the FX and the i7-3770K were only 10%, AMD would be popping champagne right now. Feel free to keep inventing imaginary lipstick for that pig; however, answer guskline's question and stop pretending you're not seeing it :whistle:
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
The improved scheduler is already included by default in W8.

Irrelevant. You cannot compare the performance of one processor between two OS generations without also comparing the other. For all we know, W8 could speed up Intel chips even more. Probably not, but that's the point -- it has to be tested.

I even added the DIRT 3 benchmark figure to my post!!!

Your benchmark cherry-picking is relevant to exactly 0% of the population: the people who only use their computers to play "DIRT 3".

It is not only "one" benchmark.

It doesn't matter if it's one or two or three or four. You have to look at the benchmarks as a whole -- not just the ones you think make your football team look good.

Oh, and if the difference between the FX and the i7-3770K were only 10%, AMD would be popping champagne right now.

Yeah, AMD is selling the chip for 40% less out of the goodness of their hearts, because they don't really need that extra money. :)
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
http://www.techpowerup.com/158534/new-windows-7-bulldozer-patches-available.html

In initial testing of the upcoming Windows 8 operating system, we’ve seen performance improvements of up to 10% in some applications, when compared to Windows 7. This is because the system correctly recognizes the AMD FX processor architecture and cores. Thanks to close collaboration between Microsoft and AMD, Microsoft recently completed back-porting some of the Windows 8 scheduler code for AMD FX processors into a hotfix for Windows 7."

[...]

The best possible cases for improvement are applications that use ½ cores in your AMD FX processor. In our testing using the AMD FX-8150 processor, we found the best improvement in wPrime, Left 4 Dead 2, and Lost Planet.
The table here

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested/4

gives a 10% improvement for Left 4 Dead 2. Battlefield 3 and CoD Black Ops obtain 4% each.

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested/3

DiRT 3 shows a 5% performance gain from the hotfixes.
Crysis Warhead mirrors the roughly 5% gain we saw in DiRT 3
Civilization V's CPU bound no render results show no gains, which is to be expected. But looking at the average frame rate during the simulation we see a 4.9% increase in performance.
Other non-gaming benchmarks in the same page also give a 5%:

Our Visual Studio 2008 compile test is heavily threaded for the most part, however the beginning of the build process uses a fraction of the total available cores. The hotfixes show a reasonable impact on performance here (~5%)
The gain depends on the game/app selected, but this was already emphasized before.

http://www.behardware.com/news/12089/amd-fx-windows-patches-tested-and-available.html

In gaming the gains are quite variable. Crysis 2 went from 36.7 to 40.2 fps with the first patch. With both patches combined the fps count dropped back to 38.5. In Rise of Flight it’s the same story, dropping back to 20.6 fps compared to 20.8 fps with the first patch alone (20.5 fps without any patch). Performances in Starcraft 2 and Shogun 2 are unchanged while Arma II and Anno 1404, which already benefitted from the first patch, are now up to 27.7 fps (27.4 fps with one patch, 26.8 without) and 32.3 fps (31.5 fps with one patch, 29.9 without). Finally F1 2011, which didn’t benefit from the first patch, is now at 56.7 and 59.7 fps.

The overall 3D gaming index is up from 104.1 without a patch to 107.9 with both (107.3 with just the first).
The overall gain using their selection of games is about 4%.
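
The behardware index numbers quoted above work out as follows, using only the figures given in the quote:

```python
# Overall 3D gaming index from the behardware quote above.
no_patch = 104.1      # index without either patch
both_patches = 107.9  # index with both patches applied
gain = (both_patches / no_patch - 1) * 100
print(f"overall gaming gain: {gain:.1f}%")  # ~3.7%, i.e. "about 4%"
```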

Here is another game showing about a 5% gain:

132d.jpg



As shown above, Left 4 Dead 2 obtains a 10% improvement from the hotfixes; the improvement from avoiding underclocked RAM is small

l4d2.jpg


but summing both effects (hotfix + RAM) gives a real performance gain of about 12%, which is not to be ignored.

Other games are more sensitive to RAM speeds (this is for Bulldozer):

games.png


http://www.madshrimps.be/files/users/leeghoofd/Zambezi/dividers/games.png
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Irrelevant. You cannot compare the performance of one processor between two OS generations without also comparing the other. For all we know, W8 could speed up Intel chips even more. Probably not, but that's the point -- it has to be tested.

The links that I provided above claim that the improvements listed in that table are due to the scheduler hotfix, which is already included by default in W8. The 10% increase in games such as Left 4 Dead 2 is due to the scheduler hotfix:

In initial testing of the upcoming Windows 8 operating system, we’ve seen performance improvements of up to 10% in some applications, when compared to Windows 7. This is because the system correctly recognizes the AMD FX processor architecture and cores. Thanks to close collaboration between Microsoft and AMD, Microsoft recently completed back-porting some of the Windows 8 scheduler code for AMD FX processors into a hotfix for Windows 7."

[...]

The best possible cases for improvement are applications that use ½ cores in your AMD FX processor. In our testing using the AMD FX-8150 processor, we found the best improvement in wPrime, Left 4 Dead 2, and Lost Planet.
http://www.techpowerup.com/158534/new-windows-7-bulldozer-patches-available.html

This is the same reason why the table with Left 4 Dead 2 and the other games is included in the Anandtech review on the bulldozer patches:

http://www.anandtech.com/show/5448/the-bulldozer-scheduling-patch-tested/4

Your benchmark cherry-picking is relevant to exactly 0% of the population: the people who only use their computers to play "DIRT 3".

What? People at Anandtech, Tom's and dozens of other sites are benchmarking DIRT 3 because it is used by "0% of the population"? Are you kidding?

43703.png



It doesn't matter if it's one or two or three or four. You have to look at the benchmarks as a whole -- not just the ones you think make your football team look good.

As said before, I find enough games with about a 5% increase to consider that a fair value. The people here

http://www.behardware.com/news/12089/amd-fx-windows-patches-tested-and-available.html

obtained a gaming average of about 4%.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
gives a 10% improvement for Left 4 Dead 2. Battlefield 3 and CoD Black Ops obtain 4% each.

Yes, we already discussed that. I'm not sure how many more ways I can say that nobody cares about cherry-picked benchmarks.


And still repeating yourself with cherry-picked benchmarks.

The overall gain using their selection of games is about 4%.

I'm pretty sure 4% is substantially less than 10%.

Why did you mention only the gaming average and not the application average? Maybe because there the benefit was a whopping 0.7%?

This is why I object to your benchmark portrayals. You point to an article that has a number of benchmarks, then shout from the rooftops about the one or two that you like, and completely ignore the others. That's not the right way to evaluate performance.

The 10% increase in games such as Left 4 Dead 2 is due to the scheduler hotfix:

Which, as your own quote states, is the "best possible case".

Nobody cares about best possible cases. They care about average use.

What? People at Anandtech, Tom's and dozens of other sites are benchmarking DIRT 3 because it is used by "0% of the population"? Are you kidding?

I think English is not your native language, so I'll just explain that I was not saying 0% of the population plays DIRT 3. I was saying 0% of the population only plays DIRT 3, which is why your constant mention of that one game is mostly irrelevant.

As said before, I find enough games with about a 5% increase to consider that a fair value.

Yes, and you conveniently ignore all the ones where there is little or no advantage, and then act like your 5% is the "average", except when you're actually saying that the average is 10%.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Yes, and you conveniently ignore all the ones where there is little or no advantage, and then act like your 5% is the "average", except when you're actually saying that the average is 10%.

I have given a link where the average they obtained after testing different games was 4%. That is very close to the 5% I took as the average.

I said that 10% was the maximum improvement reported for some games (e.g. Left 4 Dead 2). My exact words were:

the FX hotfixes can increase up to a 10% the framerate of some games.
I explained this to you before. You continue ignoring what I said and now ignore further explanations. Why do you pretend that I said a 10% average due to the hotfix, when I took 5% as the hotfix average?

Why?
 

itsmydamnation

Diamond Member
Feb 6, 2011
3,076
3,908
136
40%? Did you account for the faster RAM in order to make the FX "shine"? :whistle:

Do you just like making crap up? :whistle: FX gets piss-poor scaling from memory after about 1333/1600.

You have the variables:

NB multi (includes L3)
core multi
HT (AMD's Bclock)
HTT multi (HyperTransport)
memory multi

Out of all of those, the only things that really matter are getting the HT as high as possible and the core multi as low as possible while still hitting your max stable clock. Unlike Deneb/Thuban, NB overclocking does almost nothing, and the same goes for HTT; memory sees very little as well.

Granted, I didn't use every benchmark under the sun, but I did use:
IBT
CB 11.5
ffmpeg with x264 transcoding on a 1080p x264 13 Mbps L5.1 source.

In all of those, memory speed had no noticeable impact. The biggest performance improvement was raw clock, followed by reducing the multi while maintaining said clock.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
I have given a link where the average they obtained after testing different games was 4%. That is very close to the 5% I took as the average.

I said that 10% was the maximum improvement reported for some games (e.g. Left 4 Dead 2). My exact words were:

I explained this to you before. You continue ignoring what I said and now ignore further explanations. Why do you pretend that I said a 10% average due to the hotfix, when I took 5% as the hotfix average?

Why?

And I have given a link where the difference is 0% across multiple games, not your cherry-picked ones, so do the math.
When the hotfix came out, everyone was disappointed by it, and the general gains were 1-2%. As usual, you're trying to prove everyone wrong by linking a certain 2-3 benchmarks and ignoring the other 100 that show 0-1% gains.
The 12% total improvement you're claiming including RAM speed (:awe:), as if everyone were using 2133 for IB and 1333 for FX, is so misleading I can't even keep a straight face while reading it.

You're on an AMD crusade on these boards, that's pretty obvious, but with your lie-spreading and conspiracy theories even AMD fans don't support you on this one.

Do us a favor and answer guskline's questions :rolleyes:
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
I have given a link where the average they obtained after testing different games was 4%. That is very close to the 5% I took as the average.

Even 5% is an exaggeration, because you are ignoring all the games and other applications where there was little or no improvement.

This has been pointed out numerous times. A whole bunch of sites tested these hotfixes thoroughly, and the overall conclusion was "meh".

If you don't want people taking issue with what you write, perhaps you could curtail the blanket declarations like this one: "For current games, the FX-8350 offers essentially the same gaming performance than i7-3770k but at one fraction of the cost!"


Do you just like making crap up? :whistle: FX gets piss-poor scaling from memory after about 1333/1600.

I think he was being sarcastic.