Well, it is an indication of a bad port. Letterboxed 16:9 works out to effective resolutions of 1920x816 or 1280x544. If it's still only 30fps at those, something's very wrong...
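For anyone curious where those figures come from, here's a minimal sketch. It assumes a ~2.35:1 "scope" picture letterboxed inside the 16:9 frame (the 2.35:1 ratio is my assumption; it reproduces the numbers above):

```python
# Effective rendered resolution when a ~2.35:1 picture is letterboxed
# inside a 16:9 frame. The 2.35:1 aspect ratio is an assumption here.

def letterboxed_height(frame_w: int, picture_aspect: float = 2.35) -> int:
    """Visible picture height, rounded down to an even row count."""
    h = int(frame_w / picture_aspect)
    return h - (h % 2)  # keep it even for chroma subsampling / codecs

for w in (1920, 1280):
    print(f"{w}x{letterboxed_height(w)}")
# -> 1920x816, 1280x544
```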
Edit: Not aimed at anyone here, but what amuses me in general are those funny Hollywood Artist Diva comments of "Oh but you simply MUST play this at 30fps, darling, due to the cinematic effect!". The sad truth is that the "superiority" of the "cinematic effect" is entirely placebo. 24fps arose in the 1920s as a de facto fixed standard (to standardize audio pitch after the variable-fps silent-movie era ended), settled as a rough mean of what cinemas were already equipped to project formerly silent films at (typically 22-26fps), plus measurement convenience (at 24fps, 35mm film travels through the projector at exactly 18.0 inches per second). Likewise 25Hz (PAL) and 30Hz (NTSC) were chosen to match the electrical grids: 25/30 fps frames interlaced = 50/60 fields per second, where early TVs used the mains AC frequency (240V @ 50Hz / 120V @ 60Hz) as the "timer".

None of these rates were selected for any "artistic" effect at all. Nor does any of them "look" better beyond the placebo of being conditioned into thinking a genre "belongs" to a certain frame rate simply because that's the way it has historically been filmed, purely out of backwards compatibility. And none of it has any relevance to modern PC / console games, which are rendered on the fly and aren't "filmed" at any static rate at all.
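The arithmetic behind both of those numbers is trivial to verify. A short sketch, assuming standard 4-perf 35mm film (which carries 16 frames per foot, the usual figure behind the 18 in/s number):

```python
# Sketch of the arithmetic above. Assumes standard 4-perf 35mm film,
# which carries 16 frames per foot of film.

FRAMES_PER_FOOT = 16  # 4-perf 35mm

def film_speed_inches_per_sec(fps: float) -> float:
    """Linear speed of the film through the projector gate."""
    feet_per_sec = fps / FRAMES_PER_FOOT
    return feet_per_sec * 12

print(film_speed_inches_per_sec(24))  # -> 18.0 inches per second

# Interlaced TV: each frame is split into two fields, so the field rate
# (which early sets locked to the mains frequency) is twice the frame rate.
for fps, region in ((25, "PAL / 50Hz mains"), (30, "NTSC / 60Hz mains")):
    print(f"{fps} fps x 2 fields = {fps * 2} fields/s ({region})")
```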
In reality, the so-called "glorious" 24p cinematic effect actually looks cr*p whenever motion is involved, if you play it back at that rate without motion blur to hide it (and motion blur itself looks highly unrealistic in PC & console games, because the human eye doesn't see that kind of blur when tracking a moving object in real life; the eyeball's effective "frame rate" is far higher). That's why you get the amusing scenario of game developers pretending to base their targets on "24-30 cinematics" (as an excuse to avoid admitting performance problems), whilst at the same time Blu-Ray forums are filled with people who set their players up to output "proper, native, as-it-was-filmed 24p" (vs deinterlaced 50-60Hz), soon complain about [judder, judder, judder] every time the camera pans, and then solve the problem by... wait for it... turning on their TV's frame-rate doubling / tripling / quadrupling interpolation feature! You just can't make it up...
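To put a rough number on why pans judder at 24p, here's a back-of-the-envelope sketch. The 5-second edge-to-edge pan across a 1920-wide frame is an arbitrary illustrative assumption, not from any standard:

```python
# Back-of-the-envelope: how far the image jumps between consecutive
# frames during a camera pan. The 5-second edge-to-edge pan across a
# 1920px-wide frame is an arbitrary illustrative assumption.

SCREEN_W_PX = 1920
PAN_DURATION_S = 5.0  # time for the pan to cross the full screen width

def px_jump_per_frame(fps: float) -> float:
    """Displacement of the whole image between consecutive frames."""
    return SCREEN_W_PX / (fps * PAN_DURATION_S)

for fps in (24, 60, 120):
    print(f"{fps:>3} fps: {px_jump_per_frame(fps):5.1f} px jump per frame")
# -> 24 fps jumps 16.0 px per frame; the same pan at 120 fps moves only
#    3.2 px per frame, which is why interpolation "solves" the judder.
```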