
Xbox One games at E3 were running on GeForce/Win7

We also don't know what hardware a PC needs to match the performance of a console. I would expect it would take higher-specced equipment in a PC because of the extra overhead.

At least 2x more powerful Hardware

https://twitter.com/ID_AA_Carmack/status/50277106856370176

http://www.eurogamer.net/articles/digitalfoundry-inside-metro-last-light

For a ~2 TFLOP console you need about a 4 TFLOP GPU. This is GTX 780 level.

The 8 Jaguar cores in the console are about half the performance of an i7.

Therefore the PC equivalent (2x) is about an i7 + GTX 780. That was Microsoft's choice at E3.
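
For anyone who wants to sanity-check that, here is a rough Python sketch of the same back-of-the-envelope math; the 2x overhead factor and the console TFLOPS figure are just the assumptions from this post, not benchmarks:

```python
# Back-of-the-envelope version of the "at least 2x" claim above.
# Both inputs are assumptions from the post, not measured numbers.
CONSOLE_GPU_TFLOPS = 2.0   # "~2 TFLOP console" (PS4 is 1.84, Xbox One ~1.31)
OVERHEAD_FACTOR = 2.0      # claimed PC API/OS overhead ("at least 2x")

pc_gpu_tflops_needed = CONSOLE_GPU_TFLOPS * OVERHEAD_FACTOR
print(f"PC GPU needed under this assumption: ~{pc_gpu_tflops_needed:.1f} TFLOPS")
# -> ~4.0 TFLOPS, i.e. roughly GTX 780 / 7970 GHz class on paper
```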
 
At least 2x more powerful Hardware

https://twitter.com/ID_AA_Carmack/status/50277106856370176

http://www.eurogamer.net/articles/digitalfoundry-inside-metro-last-light

For a ~2 TFLOP console you need about a 4 TFLOP GPU. This is GTX 780 level.

The 8 Jaguar cores in the console are about half the performance of an i7.

Therefore the PC equivalent (2x) is about an i7 + GTX 780. That was Microsoft's choice at E3.

You keep posting this again and never seem to get it. (Sometimes, like now, it's 2x; other times it's 2x + 2x, or two Titans; at least make up your mind.)

1. That assumes optimizations and such. ~2x is the final figure, and it also only comes later in the dev cycle.
If you look at console ports you can see that the Xbox 360 / PS3, in a straight comparison with modern GPUs, is around a 630M. Ports such as Skyrim, Dishonored, and ME3 are playable on those GPUs at the same or slightly better settings than the console versions.

2. 1 Nvidia GFLOP != 1 AMD GFLOP. A GTX 780 != a 7970 GHz despite having a similar number of GFLOPS (boost). That should tell you right now that GFLOP arguments should be taken with a grain of salt and comparisons should only be made within the same architecture whenever possible. (I don't know why you are always comparing to Nvidia instead of AMD when both consoles are GCN; comparing to AMD we get a ~7970 (non-GHz) level of graphics performance, which is significantly less than a GTX 780.)

3. We cannot compare current games to next-gen games. Current games are built on the same base as the consoles and dressed up for the PC. Scaling between visuals and the required power is anything but linear (often minimal difference between high and ultra despite fps tanking). Take BioShock Infinite for instance: high-res textures and fairly demanding, but because it's built on a console base you get things such as the following:

(BioShock Infinite high settings - no DX10/11 effects, everything else maxed)

[screenshot: ha2e.jpg]


Everything maxed

[screenshot: 2cne.jpg]


Basically no IQ gain for a near halving of fps. And look at that ugly planter of roses.
Hopefully next gen will fix this, but until then games are basically putting lipstick on a pig, costing tons of fps for almost no IQ gain. This is why comparisons should not be made between a PC game maxed out and a console game.

Here is a scene in which more can be seen

High

[screenshot: dh7p.jpg]


Ultra

[screenshot: z3f7.jpg]


(Basically no IQ difference: the trees on ultra are very slightly more detailed, for some reason the bell tower isn't as tall, and contrast is very slightly better on ultra in the background.) But is it worth the fps loss? No. In a game designed for better effects, the fps cost of high vs. ultra would be much less (maybe 4-5 fps vs. the 17 fps it costs here).

I have no doubt that next-gen consoles will have amazing graphics, but please don't try to compare them to maxed-out current-gen games, which are mostly glorified console ports.
 
Sony and Microsoft were not showing games on actual consoles. I don't know how this can be good for AMD. 😀

Sony and MS were not showing them. The game makers were showing early dev versions of the games. Some of them were on real hardware, some were not.

But why do I bother? You are one of the more annoying trolls here. No real point in replying.
 
Sony and MS were not showing them. The game makers were showing early dev versions of the games. Some of them were on real hardware, some were not.

But why do I bother? You are one of the more annoying trolls here. No real point in replying.

Only a completely brainwashed fanboy could believe this. 😀

Enough with the personal attacks. -Admin DrPizza
 
2. 1 Nvidia GFLOP != 1 AMD GFLOP. A GTX 780 != a 7970 GHz despite having a similar number of GFLOPS (boost). That should tell you right now that GFLOP arguments should be taken with a grain of salt and comparisons should only be made within the same architecture whenever possible. (I don't know why you are always comparing to Nvidia instead of AMD when both consoles are GCN; comparing to AMD we get a ~7970 (non-GHz) level of graphics performance, which is significantly less than a GTX 780.)

It is the other way around. If the PS4 has performance comparable to a GTX 780, and the GTX 780 has the same GFLOPS as a 7970 GHz but is at the same time 20% faster, then the PS4 is likewise 20% faster than a 7970 GHz. Then the PS4's real performance is about 1.20 x 4300 GFLOPS. That is a 7970 at ~1.2 GHz.
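
If you actually run that chain of assumptions, here is roughly what it implies. A rough Python sketch; the 20% lead and the "PS4 is comparable to a GTX 780" premise are assumptions from this argument, not benchmark results:

```python
# Working through the quoted claim. Both premises are assumptions, not benchmarks.
HD7970_GHZ_GFLOPS = 4300        # 2048 shaders * 2 ops/clock * 1.05 GHz
GTX780_LEAD = 1.20              # assumed real-world lead at similar GFLOPS

implied_ps4_gflops = GTX780_LEAD * HD7970_GHZ_GFLOPS
implied_7970_clock = implied_ps4_gflops / (2048 * 2)   # GFLOPS -> clock in GHz
print(f"Implied PS4 throughput: {implied_ps4_gflops:.0f} GFLOPS")
print(f"Equivalent 7970 clock: ~{implied_7970_clock:.2f} GHz")
# -> ~5160 GFLOPS, i.e. a 7970 at roughly 1.26 GHz (rounded to 1.2 GHz above)
```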
 
[quoting the full post above with the BioShock Infinite high vs. ultra screenshot comparison]

Where did you find these images? I'm getting very annoyed I can't find these types of IQ reviews. I wish more games were broken down like this to find optimal settings.
 
It is the other way around. If the PS4 has performance comparable to a GTX 780, and the GTX 780 has the same GFLOPS as a 7970 GHz but is at the same time 20% faster, then the PS4 is likewise 20% faster than a 7970 GHz. Then the PS4's real performance is about 1.20 x 4300 GFLOPS. That is a 7970 at ~1.2 GHz.

That's the whole problem right there (major assumption).

I'm simply saying don't compare GFLOPS across architectures.

2 x 1.84 TFLOPS = 3.68 TFLOPS (around a 7970 non-GHz).

(Don't compare to nvidia unless you absolutely have to).
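
For reference, here is a quick sketch of where those theoretical numbers come from. The shader counts and clocks are public spec figures; the 2x scaling factor is still just the assumption being argued about:

```python
# Theoretical single-precision throughput: shaders * 2 ops/clock * clock (GHz).
def gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz

ps4_gpu   = gflops(1152, 0.800)   # ~1843 GFLOPS (the "1.84 TFLOPS" above)
hd7970    = gflops(2048, 0.925)   # ~3789 GFLOPS (7970 non-GHz)
hd7970ghz = gflops(2048, 1.050)   # ~4301 GFLOPS (7970 GHz Edition)

print(f"2x PS4 GPU: {2 * ps4_gpu:.0f} GFLOPS vs 7970: {hd7970:.0f} GFLOPS")
# Same-architecture (GCN) comparison, which is the point being made here.
```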
 
I took them. 😀

Seriously, this needs to be done for more games. I don't want to have to fiddle with settings all day to know what they do. Nvidia did this a little bit on their site, but it's Nvidia; I can't trust them, as they want to sell their GFX cards. I wish other review sites did this.
 
You keep posting this again and never seem to get it. (Sometimes, like now, it's 2x; other times it's 2x + 2x, or two Titans; at least make up your mind.)

1. That assumes optimizations and such. ~2x is the final figure, and it also only comes later in the dev cycle.
If you look at console ports you can see that the Xbox 360 / PS3, in a straight comparison with modern GPUs, is around a 630M. Ports such as Skyrim, Dishonored, and ME3 are playable on those GPUs at the same or slightly better settings than the console versions.

2. 1 Nvidia GFLOP != 1 AMD GFLOP. A GTX 780 != a 7970 GHz despite having a similar number of GFLOPS (boost). That should tell you right now that GFLOP arguments should be taken with a grain of salt and comparisons should only be made within the same architecture whenever possible. (I don't know why you are always comparing to Nvidia instead of AMD when both consoles are GCN; comparing to AMD we get a ~7970 (non-GHz) level of graphics performance, which is significantly less than a GTX 780.)

The ~2x factor is due to API/OS overhead. 4x is the theoretical boost claimed by Microsoft for the cloud.

"at least 2x" does not mean that 2x "is the final figure".

I would like to see real benchmarks or at least some synthetic benchmark, but I know of none. We only know GFLOPS, thus I am using that.

Nobody said that GFLOPS is an accurate measure of performance. That is why I used the symbol ~ to denote approximation, and that is why I wrote the word "about", as in "about a 4 TFLOP GPU" or "about an i7 + GTX 780".

The average difference between a GTX 780 and a 7970 GHz is much less than the >2x factor discussed here. Moreover, with AMD in both the Xbox One and the PS4, future games will be better optimized for AMD GCN than for Nvidia, reducing the performance difference measured today on the PC.

I am comparing to Nvidia for a simple reason: the PCs used by Microsoft at E3 ran Nvidia.


Sony and Microsoft were not showing games on actual consoles. I don't know how this can be good for AMD. 😀

Both Sony and Microsoft showed games running on the consoles as well.
 
AMD is in trouble. Sony also showed games on PC. 😀

http://www.dsogaming.com/news/report-plenty-of-ps4-games-also-running-on-the-pc-during-e3-2013/



Indies ran on consoles, AAA titles on PC. 😀

Click the link below. It has comments from actual game developers saying that almost every E3 demo on PS4 was on a devkit and not a regular Windows PC. The article you link quotes Giant Bomb editors who may have a stake in which company the argument favors. I trust a developer with a track record more than some editor I've never heard of.

http://www.playstationlifestyle.net...n-pcs-with-nvidia-cards-bodes-ill-for-launch/
 
The ~2x factor is due to API/OS overhead. 4x is the theoretical boost claimed by Microsoft for the cloud.

"at least 2x" does not mean that 2x "is the final figure".

I would like to see real benchmarks or at least some synthetic benchmark, but I know of none. We only know GFLOPS, thus I am using that.

Nobody said that GFLOPS is an accurate measure of performance. That is why I used the symbol ~ to denote approximation, and that is why I wrote the word "about", as in "about a 4 TFLOP GPU" or "about an i7 + GTX 780".

The average difference between a GTX 780 and a 7970 GHz is much less than the >2x factor discussed here. Moreover, with AMD in both the Xbox One and the PS4, future games will be better optimized for AMD GCN than for Nvidia, reducing the performance difference measured today on the PC.

I am comparing to Nvidia for a simple reason: the PCs used by Microsoft at E3 ran Nvidia.

MS can claim anything. We already know what to make of their claims.

You were making ps4 to nvidia comparisons way before E3.

And again, 'about a 4 TFLOP GPU' compares to a 7970 GHz, NOT a GTX 780. Same architecture = better comparison.
 
Has MS explained/shown how the "cloud" will help? So far it just seems like marketing; I haven't seen it applied yet.
 
Has MS explained/shown how the "cloud" will help? So far it just seems like marketing; I haven't seen it applied yet.

Sony said somewhere that they will use the cloud to boost PS4 performance as well. I'm very interested in how they can use such limited bandwidth to enhance graphics/gameplay.
 
Sony said somewhere that they will use the cloud to boost PS4 performance as well. I'm very interested in how they can use such limited bandwidth to enhance graphics/gameplay.

MS actually said they purchased 300,000 servers for this, though. I haven't heard anything from Sony.
 
MS actually said they purchased 300,000 servers for this, though. I haven't heard anything from Sony.

They didn't "purchase" 300k servers. They have 300k servers for it. It will be using Azure, which has been up and running for a few years now. They did expand it for XBO, but it is not new.
 
You were making ps4 to nvidia comparisons way before E3.

Because Epic and others used a Nvidia dGPU in the PCs to run the demos.

Has MS explained/shown how the "cloud" will help? So far it just seems like marketing; I haven't seen it applied yet.

MS explained that initially the cloud will provide a theoretical 4x boost, but they plan to upgrade the servers in the future. They will be using the cloud for gaming tasks not affected by latency and bandwidth, and they showed several games that already use the cloud. Don't ask me the performance gain for those games; I don't know.
 
Yes, being in every single next gen console is big, big trouble. Educated posts are a little more helpful, but I know that's asking too much from you.

They have had console contracts for several years and yet are constantly losing money. The Wii U has been out already and they still posted another loss last quarter. Your opinion is not supported by facts.
 
Because Epic and others used a Nvidia dGPU in the PCs to run the demos.



MS explained that initially the cloud will provide a theoretical 4x boost, but they plan to upgrade the servers in the future. They will be using the cloud for gaming tasks not affected by latency and bandwidth, and they showed several games that already use the cloud. Don't ask me the performance gain for those games; I don't know.

What are gaming tasks not affected by latency and bandwidth though?

All I can think of is what we currently use the "cloud" for: dedicated server hosting, server-side damage, shared player/mob positioning, and loot generation for RPG-type games.

If the Lucid Hydra couldn't get two different cards to generate one image on a local motherboard, what else is really left for offsite to do?
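
To make the "latency-insensitive" idea concrete, here is a purely hypothetical Python sketch of how a game loop might hand a slow, non-frame-critical task to a server and keep rendering in the meantime. The task, names, and timings are made up for illustration; this does not reflect any actual Xbox Live Compute or PSN API:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def remote_ai_plan(world_state: dict) -> dict:
    # Stand-in for a cloud request: network + compute time that is far too
    # slow for a single frame, but fine if it lands within a second or two.
    time.sleep(0.5)
    return {"plan": f"flank via {world_state['choke_point']}"}

executor = ThreadPoolExecutor(max_workers=1)
pending = executor.submit(remote_ai_plan, {"choke_point": "bridge"})

for frame in range(90):
    # The local render/update loop never blocks waiting on the cloud result.
    if pending and pending.done():
        print(f"frame {frame}: applied cloud result {pending.result()}")
        pending = None
    time.sleep(1 / 60)  # simulate a 60 fps frame

executor.shutdown()
```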
 