Or is this game much more optimized for PC than for Console?
It isn't a good test for either of them. Capcom hit the visuals they wanted on all of the platforms; it isn't pushing any of the systems. I really like the pseudo cel-shaded graphics, but they are not a good display of anyone's capabilities.
need proof. input lag is supposed to get pretty hideous once you start splitting up games to that level. i seriously doubt there are 6 cores at full utilization. if that problem were solved pc games would be maxing quad cores all the time by now...but they aren't.
If a PC game was maxing out an i7 it would
require an i7 to run, seriously think about that. What potential audience would you have at that point? That is one of the advantages fixed hardware has: you can push every bit of it to the edge of its capability. On the PC, if you do that, you end up with the overwhelming majority of people unable to play the game. Also, input lag has nothing to do with how many threads you are running unless something is horribly broken in the code. UC2 is pushing all 7 cores on Cell, and it doesn't have any issues at all.
id's record is sh*t. look at the number of game licenses that run under unreal, id hasn't been a factor since q3a. the scale of difference in acceptance of their engines is vast, and the engines id has come out with, for all their hype, have never been as cutting edge as they claim. they are at best equal with unreal, and trailing Crytek. carmack lost all interest in gaming, he became mr spaceman, and his indifference is showing.
If you honestly believe that, you don't know much about graphics engines. Rage will use the first multiplatform engine built by id, we'll see how badly it 'loses' to UE.
At 3.2 GHz, each SPE gives a theoretical 25.6 GFLOPS of single precision performance.
That is for each SPE. There are 6 SPEs and 1 PPE; just the SPEs give 153.6 GFLOPS, more than double the fastest i7 (and that is ignoring the PPE).
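The arithmetic behind those figures can be sketched as follows. This assumes the commonly cited peak of one 4-wide single-precision fused multiply-add per SPE per cycle, and 6 SPEs available to games on the PS3; it's a back-of-the-envelope theoretical peak, not measured throughput.

```python
# Theoretical single-precision peak of the PS3's Cell SPEs.
# Assumption: each SPE issues one 4-wide single-precision FMA per cycle
# (4 lanes x 2 FLOPs, multiply + add), which is the commonly quoted peak.
CLOCK_HZ = 3.2e9       # SPE clock speed
SIMD_LANES = 4         # 128-bit registers / 32-bit floats
FLOPS_PER_FMA = 2      # a multiply and an add per lane
SPES_AVAILABLE = 6     # SPEs exposed to games on the PS3

per_spe_gflops = CLOCK_HZ * SIMD_LANES * FLOPS_PER_FMA / 1e9
total_gflops = per_spe_gflops * SPES_AVAILABLE

print(f"{per_spe_gflops:.1f} GFLOPS per SPE")   # 25.6 GFLOPS per SPE
print(f"{total_gflops:.1f} GFLOPS total")       # 153.6 GFLOPS total
```

Note these are marketing-style peak numbers: real workloads rarely sustain one FMA per cycle on every core.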
Ben, I have yet to see any game that utilizes the potential power of the cell architecture. Not only that, but back when the ps3 was still fresh there was a chance it could matter. Currently, though, the GPU gap is too large between PC and consoles.
In theoretical terms you are absolutely right. But in reality, the most impressive game we have seen on PCs is still two years old. The raw technology is on the PC's side, there is no question there; the issue is the games aren't using it.
And the racers you mention - Forza3. As great as it looks, it's still average. Hell, NFS looks comparable if not better.
Seriously, you have absolutely no clue what you are talking about in the vaguest sense if you honestly believe that. Shift isn't in the league of Forza3 in terms of visuals, it is actually extremely poor- not to mention it is a vastly inferior game.
And where did you get the free AA? It's bullshit, don't read marketing slides, see for yourself.
Study up a bit on GPU architectures and how MSAA works, then make some comments. It's actually very, very easy to see in games; even if you were borderline legally blind you could do a very simple comparison, fire up something like Fallout3 on both the PS3 and 360 and toggle back and forth between the systems. Yes, the 360 still has easily visible aliasing, but it is certainly much better off than the PS3.
The Xbox runs AA? It runs most games at 720p and upscales them to 1080p (my HDTV's native res).
Something wrong with your TV? I can't think of a 360 game that runs at 1080p on my TV off the top of my head, they all run at 720p. A few PS3 games run at 1080p, but not too many. Some titles do scale, and it is more common on the 360 than the others, but it is certainly less common than you are making it out to be.
Ports are going to look better on the PC, there really isn't a question there. If it can run on the PC at all, the raw power of the GPU is going to allow it to be run at a higher resolution with higher AA/AF than its console counterpart.
Hey, I think that a 790GX can run xbox 360 games identically (minus AA) is a pretty impressive feat in itself. It also shows that consoles aren't using their gpus any better than PCs, a 790GX has almost identical specs to the 360's gpu, except in memory bandwidth. And hey, it's an IGP. Basically any real video card could run any console port at 1024x768 at 30 to 60 fps with AA.
I would agree that any remotely reasonable video card could handle any of the ports, with AA, at a lower resolution than the consoles operate at.
And i7s could run anything out on the consoles, even something highly optimized for the Cell. They've got nearly the raw flops of the Xbox 360 CPU, and should come a hell of a lot closer to reaching it.
No.
6 individually weaker cores....
Plus, i7 has hyperthreading, so it has 8 virtual cores. And each core in the i7 is of a high enough complexity that performance, in many apps, does scale as if it had 8 real cores once hyperthreading is turned on.
Cell has 7 cores; I was ignoring the PPE as it can be entirely dedicated to scalar ops, allowing the SPEs to focus on FP tasks. The fact that you can utilize as many (actually more) threads on the i7 isn't the problem, it is simply too slow in terms of raw FP performance to do some of the things Cell is capable of (in particular physics calculations). PCs can get around this using PhysX, but that seems to get the ire up of a lot of people when they do.
I don't think Renderware is even in that business anymore; EA folded them into its internal studios and only uses the engine for internal development now.
I was speaking in terms of UE being the most successful licensed engine. Renderware was massive last generation.