It really depends -- the FX grows more competitive as resolution increases. If someone is gaming @ 4K, where every game is GPU bound, upgrading from an FX 8-core is pointless: the FX runs toe-to-toe with considerably more expensive CPUs.
If someone is planning to game at low resolutions, that's when Intel's superior single-threaded performance kicks in. But personally, I'll never game at a lowly 1080p ever again. I'm not even crazy about the way 1440p looks. I'm a 2160p guy.
And at 2160p -- there is absolutely no difference between my FX-8320 and my i7-4790K, because the GPU is holding me back at that resolution.
So if you are going to play @ 4K, all your money should go into the video card(s). As long as you've got 4+ threads, the CPU is a moot point.
http://www.technologyx.com/featured/amd-vs-intel-our-8-core-cpu-gaming-performance-showdown/4/
As for a Celeron or Pentium, I'd never recommend one. Sure, you've got a good upgrade path -- but dual cores are obsolete. I personally recommend an i3 at the bare minimum, and suggest stretching the budget to an i5 for friends doing Intel builds.
This is a fallacious argument. The FX doesn't grow more competitive as resolution increases; rather, it gets harder for the GPU to maintain a high framerate as resolution increases.
Let's take a hypothetical GPU-bound scenario:
Game @ max settings
4K - 30fps (average)
1440p - 45fps
1080p - 60fps
Medium-high settings
4K - 45fps
1440p - 60fps
1080p - 75fps
Medium settings
4K - 60fps
1440p - 75fps
1080p - 90fps
Now, let's assume you're in the majority and have a 60Hz monitor, so going above 60fps is pointless. Let's also assume that with an FX CPU, you can get a 45fps average in the above game, with ~35fps minimums, and with an Intel CPU (i5 or i7), a 70fps average with 50fps minimums.
^ I feel this is a scenario that's quite realistic and representative of more demanding games.
Would it be correct to conclude, then, that because you can only average 30fps at 4K/max with your GPU, there's no point in buying a CPU that will deliver more than 35fps minimums and a 50fps average, since that's still higher than the lowest framerate you can bring the game down to with in-game settings?
In the hypothetical scenario above, the FX CPU owner essentially has two choices: 4K max at 30fps, or lower settings with a 45fps average and dips into the mid-30s because of the CPU.
The Intel CPU owner has the options of 4K @ 30fps, slightly lower settings for a 45fps average with no drops due to CPU limitations, or lower settings still for a 60fps average with dips into the 50s due to CPU limitations.
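To make the logic explicit: the framerate you actually see is roughly the lower of the GPU-bound figure for your settings/resolution and the CPU-bound figure, because whichever limit you hit first is the one that caps you. Here's a minimal Python sketch of that idea, using only the hypothetical averages from this post -- the numbers and the min() model are illustrative assumptions, not benchmarks, and it ignores minimums/frametime variance:

```python
# Rough bottleneck model: effective average fps = min(GPU-bound fps, CPU-bound fps).
# All figures are the hypothetical averages from the scenario above, not measurements.

# GPU-bound averages by settings tier and resolution
gpu_fps = {
    ("max", "2160p"): 30,         ("max", "1440p"): 45,         ("max", "1080p"): 60,
    ("medium-high", "2160p"): 45, ("medium-high", "1440p"): 60, ("medium-high", "1080p"): 75,
    ("medium", "2160p"): 60,      ("medium", "1440p"): 75,      ("medium", "1080p"): 90,
}

# CPU-bound averages assumed above (these barely move with resolution or settings)
cpu_fps = {"FX": 45, "Intel i5/i7": 70}

def effective_fps(cpu, settings, resolution):
    """Average framerate you'd actually see: whichever limit you hit first wins."""
    return min(cpu_fps[cpu], gpu_fps[(settings, resolution)])

for cpu in cpu_fps:
    for settings in ("max", "medium-high", "medium"):
        print(f"{cpu:12s} @ 4K, {settings:12s}: ~{effective_fps(cpu, settings, '2160p')}fps average")
```

Running it reproduces the trade-off above: at 4K the FX owner tops out around a 45fps average no matter how far the settings are dropped, while the Intel owner keeps scaling up to the GPU limit (60fps at medium).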
Frankly, I think most people would rather drop their settings a little and get higher framerates than play their games on a $1,000 monitor and $800+ GPU(s) at a 30fps average. In fact, it doesn't matter how much you spend on your monitor or GPU: you'll want a CPU that can deliver fluid gameplay, if you can afford it, because there aren't many in-game settings you can change to relieve a CPU bottleneck, whereas any modern discrete GPU can deliver fluid framerates in any game, given the right combination of settings.
I'm not saying that FX CPUs don't deliver the better part of 60fps in most games today - but so would an i3. Both an FX and an i3 will be fine for more uses than not, and both will offer a compromised experience in some situations.
I generally don't advocate cheaping out on a CPU purchase in order to budget more for a GPU, because 1) CPU limits generally can't be fixed with in-game settings, so when you have one, you have to live with it, 2) GPUs become obsolete and depreciate far more quickly than CPUs, so it makes more sense to replace GPUs more regularly, and 3) GPUs are easier to replace/upgrade than CPUs, especially if you have to replace your entire platform, which is generally the case these days.