It's hard to say based on just one number, since what gets reported is almost invariably the *average* framerate (ignoring the minimum and how much it varies over time). Anything below 20-30FPS starts to bother me (and for online FPS gaming with a low ping, you ideally want a higher framerate, like 50-60FPS).
Now, "30FPS" would be fine for me if the minimum was 20 or 25FPS, and 90% of the time it stayed between 25 and 35. But if "30FPS" means it's running at 50FPS half the time and 10FPS the other half (which also averages out to 30), then that's obviously not going to be a pleasant gaming experience. Big swings like this are often worse than they seem, since the lowest framerates generally hit (at least in an FPS game) while you're in combat, which is a really bad time for your game to go all choppy.
A better measurement of how well a game runs (IMO!) is *minimum* framerate, which hardly anyone reports (although HardOCP's much-maligned testing does generally include this figure). And I've never seen anyone attempt to show FPS variance statistically in a benchmark, although it would be useful if done right. Part of the problem is that some benchmarking tools (especially built-in benchmarks) don't report anything but average framerates.
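To make the point concrete, here's a minimal sketch of the kind of variance-aware summary I mean, computed from per-frame render times. The function name and the "1% low" cutoff are my own choices for illustration, not any standard tool's output:

```python
from statistics import mean

def fps_stats(frame_times_ms):
    """Summarize a capture of per-frame render times (in milliseconds).

    Average FPS alone hides stutter; minimum FPS and "1% low" FPS
    show how bad the worst moments actually get.
    """
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    # Minimum FPS corresponds to the single slowest frame.
    min_fps = 1000.0 / max(frame_times_ms)
    # "1% low": average FPS over the slowest 1% of frames.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_1pct_fps = 1000.0 / mean(worst[:n])
    return avg_fps, min_fps, low_1pct_fps

# The "50FPS half the time, 10FPS the other half" case from above:
# 6 seconds of 20 ms frames (50 FPS) plus 6 seconds of 100 ms
# frames (10 FPS). The average comes out to 30 FPS, but the
# minimum and 1% low tell the real story.
times = [20.0] * 300 + [100.0] * 60
avg, lo, pct = fps_stats(times)
```

Running this on the example capture gives an average of 30 FPS with a minimum and 1% low of 10 FPS, which is exactly the gap a single averaged number hides.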