Question The FX 8350 revisited. Good time to talk about it because reasons.

Page 15

Hans Gruber

Golden Member
Dec 23, 2006
1,462
546
136
Just curious. How much RAM was involved? In my experience having "too little" memory can have pretty much the same effect.
It's not the amount of memory. Both systems had 16GB of system RAM; the 7700K was paired with a GTX 1080, which has 8GB of VRAM, and my 3570K with a GTX 970, which has 4GB of VRAM. When I switched to a 3600 the problems went away. The other problem was the frame rate: the FPS numbers looked good, but frames were being dropped and the game was choppy, and FPS counters do not account for that. It was a CPU issue in BF5. I should note BF1 did not hit older CPUs anywhere near as hard as BF5 did.

The guy with the 7700K went to a 5600X and said the problem went away completely using the GTX 1080 with it.
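
To show what I mean about FPS counters hiding it, here is a quick sketch with made-up frame times - purely illustrative numbers, not from my system:

```python
# Toy numbers showing why an average FPS counter can look fine while the game feels choppy.
# The frame times are invented for illustration, not measured.
frame_times_ms = [16.0] * 99 + [200.0]  # 99 smooth frames plus one big hitch

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg_ms

# "1% low" style metric: FPS implied by the slowest 1% of frames
n_worst = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms)[-n_worst:]
low_fps = 1000.0 / (sum(worst) / len(worst))

print(f"average FPS: {avg_fps:.0f}")  # ~56, looks okay on an overlay
print(f"1% low FPS:  {low_fps:.0f}")  # ~5, which is what you actually feel as stutter
```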
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
It's not the amount of memory. Both systems had 16GB of system RAM; the 7700K was paired with a GTX 1080, which has 8GB of VRAM, and my 3570K with a GTX 970, which has 4GB of VRAM. When I switched to a 3600 the problems went away. The other problem was the frame rate: the FPS numbers looked good, but frames were being dropped and the game was choppy, and FPS counters do not account for that. It was a CPU issue in BF5. I should note BF1 did not hit older CPUs anywhere near as hard as BF5 did.

The guy with the 7700K went to a 5600X and said the problem went away completely using the GTX 1080 with it.
Good stuff! Exactly what I have been ranting about for months and months. You have to play the games to understand. Testing games without playing them, for timely clicks, needs to stop being venerated and spammed in forums. None of those tests can tell you if the scene or NPCs were not rendering properly, if the audio was borked, or if the CPU could handle the game while you were talking with friends on Discord. Or if, in the middle of MP action, the FPS looked okay even though the experience wasn't.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
I have been using the FX 8350 @ 4.6GHz with 32GB of 1866MHz RAM and an RX 6600 for some 1440p gaming.

Which reminds me that one of the biggest talking points against it was the platform itself. Turns out all the pearl clutching and bashing over PCIe 2.0 was pointless. 4K results are also irrelevant, as the low-budget hardware scene has never targeted that resolution, and certainly not when this CPU was relevant.

TPU's testing reflects my experience, which is that you lose very little performance - https://www.techpowerup.com/review/amd-radeon-rx-6600-xt-pci-express-scaling/28.html

For games like GTA V, I do feel it pairs better with a GTX 1080.
 

burninatortech4

Senior member
Jan 29, 2014
485
189
116
I have been using the FX 8350 @ 4.6GHz with 32GB of 1866MHz RAM and an RX 6600 for some 1440p gaming.

Which reminds me that one of the biggest talking points against it was the platform itself. Turns out all the pearl clutching and bashing over PCIe 2.0 was pointless. 4K results are also irrelevant, as the low-budget hardware scene has never targeted that resolution, and certainly not when this CPU was relevant.

TPU's testing reflects my experience, which is that you lose very little performance - https://www.techpowerup.com/review/amd-radeon-rx-6600-xt-pci-express-scaling/28.html

For games like GTA V, I do feel it pairs better with a GTX 1080.
x16 PCIe 2.0 is equivalent in bandwidth to x4 PCIe 4.0. Navi 23 being a laptop chip to begin with, it makes sense that it wouldn't be bottlenecked by lanes. I agree the PCIe 2.0 bashing was unwarranted.
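
Rough numbers behind that equivalence, for anyone who wants to check the math (approximate usable per-lane throughput, ignoring protocol overhead):

```python
# Approximate usable bandwidth per PCIe lane, in GB/s
per_lane_gb_s = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

def link_bw(gen: str, lanes: int) -> float:
    """Total link bandwidth in GB/s for a given PCIe generation and lane count."""
    return per_lane_gb_s[gen] * lanes

print(f"x16 PCIe 2.0: {link_bw('2.0', 16):.1f} GB/s")  # ~8.0 GB/s
print(f"x4  PCIe 4.0: {link_bw('4.0', 4):.1f} GB/s")   # ~7.9 GB/s, essentially the same
print(f"x8  PCIe 2.0: {link_bw('2.0', 8):.1f} GB/s")   # ~4.0 GB/s - what an x8 card
                                                       # actually gets on an FX board
```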
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
x16 PCIe 2.0 is equivalent in bandwidth to x4 PCIe 4.0. Being a laptop chip to begin with, it makes sense that it wouldn't be bottlenecked by lanes. I agree the PCIe 2.0 bashing was unwarranted.
The 6600 series was also laptop-based, like the 6400 and the 6500? If so, to quote the immortal Johnny Carson - I did not know that.
 

LightningZ71

Golden Member
Mar 10, 2017
1,318
1,361
136
No, it's not "laptop-based". It was dual-use, yes, but it wasn't laptop-first.

As for the PCIe lanes, it has 8 lanes that are PCIe 4.0 capable. Those same 8 lanes are what's used when it's in a dual-GPU configuration. Everyone accepted a long time ago that 8 lanes are enough for anything but the absolute highest end, which the RX 6600 certainly is not.

It is also an 8GB card. What did we all learn from the 5500 XT? The larger-memory cards were MUCH less affected by the smaller PCIe link because they do FAR less texture swapping over it. That is the beef with the 6500 XT: it has a narrow PCIe link AND low VRAM (4GB), and it absolutely shows in benchmarks.

On top of all that, the RX 6600 is the lower-clocked, narrower of the two 6600s, meaning it is less able to saturate the PCIe bus than the XT.
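
As a crude back-of-the-envelope illustration - every number below is made up just to show the shape of the effect, not measured from any real game:

```python
# Crude toy model: whatever doesn't fit in VRAM has to be streamed back in over PCIe.
# All numbers are invented purely for illustration.
def frame_penalty_ms(working_set_gb: float, vram_gb: float, link_gb_s: float,
                     touched_per_frame: float = 0.1) -> float:
    """Extra frame time spent re-fetching the share of spilled data touched each frame."""
    spill_gb = max(0.0, working_set_gb - vram_gb)
    return spill_gb * touched_per_frame / link_gb_s * 1000.0

scene_gb = 5.0  # hypothetical working set for a texture-heavy scene

# 8GB card (RX 6600-like): nothing spills, so even a slow x8 PCIe 2.0 link (~4 GB/s) costs nothing
print(frame_penalty_ms(scene_gb, vram_gb=8.0, link_gb_s=4.0))   # 0.0 ms

# 4GB card on an x4 link (6500 XT-like): the spill hurts, and roughly doubles
# when the link halves (x4 4.0 ~7.9 GB/s vs x4 3.0 ~3.9 GB/s)
print(frame_penalty_ms(scene_gb, vram_gb=4.0, link_gb_s=7.9))   # ~13 ms per frame
print(frame_penalty_ms(scene_gb, vram_gb=4.0, link_gb_s=3.9))   # ~26 ms per frame
```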

I'm glad the GPU works for you.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
No, it's not "laptop-based". It was dual-use, yes, but it wasn't laptop-first.

As for the PCIe lanes, it has 8 lanes that are PCIe 4.0 capable. Those same 8 lanes are what's used when it's in a dual-GPU configuration. Everyone accepted a long time ago that 8 lanes are enough for anything but the absolute highest end, which the RX 6600 certainly is not.

It is also an 8GB card. What did we all learn from the 5500 XT? The larger-memory cards were MUCH less affected by the smaller PCIe link because they do FAR less texture swapping over it. That is the beef with the 6500 XT: it has a narrow PCIe link AND low VRAM (4GB), and it absolutely shows in benchmarks.

On top of all that, the RX 6600 is the lower-clocked, narrower of the two 6600s, meaning it is less able to saturate the PCIe bus than the XT.

I'm glad the GPU works for you.
Thanks, that's more in keeping with what I've seen and read.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
Yeah, the 6500 and 6400 are primarily intended for laptop use. I assumed the lane count of the 6600 meant it had laptop DNA. I guess I misspoke.
It's all good. I learned long ago that Cunningham's Law is a superior method of acquiring new knowledge in forum discussions. People's urge to tell you how wrong you are is irresistible.

Now back to me white knighting Vishera :p Having done testing with Witcher 3 lately, I was disappointed homie's FX board went belly up, and he could not test it here -


He changes both the i5 and the amount of RAM, and incorrectly concludes the RAM was responsible for the frame pacing issues. Having tested a better quad core, the 3200G, with 16GB of much faster DDR4, I'd say it was the i5 and the RAM both having nothing left to give. It got a chuckle out of me, because the i5 is pegged at 100% and homie is like "I guess it needs more RAM since it's maxed out." If he had tried the i5 with 8GB, a fast GPU, ultra settings, and crowd density maxed, frame pacing would have been ugly around the docks and market where he was testing.

My FX 8350 performs similarly to the 3770K he uses. But because I was using a GTX 1080 with everything maxed, including HairWorks, I could see brief 100% spikes in that same area, both riding Roach and sprinting around. The 3200G would suffer more in those spots.

I expressed my surprise about how intensive the game is for its age, and how many cores it can hit. I was stoked to see he had a similar reaction.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
Overclocking the RAM, swapping the cooler, and a 500-600MHz CPU overclock would give it a nice boost. Not overclocking FX is a PCMR felony offense. UK Steve is really a console gamer, so he gets off with a warning this time. ;) And that is a crap cooler.

I haven't played Elden Ring yet, not really into S&M. :p But I have read RAM speed is important too. I will test it with 2133MHz and 4.6GHz when I buy it. I will use a GTX 1080 though; not bothering with a weak GPU for a game like that. Hopefully I can get over 30 FPS that way; time will tell.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
24,225
7,217
146
OC? With that cooler? We don't want him to melt anything... important... ;)
I did cover that in my post. But I still like the joke. He has AMD-compatible coolers lying around. If it fits AM4 it will work with AM3+. I have used everything from a Wraith Prism to the current Noctua NH-D15 with FX.

I wasn't going to overclock mine since that has been done to death for the last decade. But I was like

 