
Xbox One games at E3 were running on Geforce/Win7

How disappointing; why are they still using 7? UI aside, you obviously want 8. That does indeed look like LGA 2011; that socket is rather large. Still, not surprising. The only good thing about either next-gen console is that they have 8GB of RAM and x86 architecture. Although checkpoints likely won't die. *grrrrr*.
 
ROFL!! I see an Intel platform with a Corsair H2O AIO watercooling unit.
And a BIG GTX card!!!

[Attached photo of the E3 demo system]

I guess the 7790 wasn't powerful enough for the Xbox One's E3 games eh? They needed a real manly card like the GTX 780, eh?

Now I would like to hear what the console people have to say.
You know why your demos look like they're running on ultra-high-end gaming PCs? ... Because your demos are being played on ultra-high-end gaming PCs!
 
Did you look at the picture?

It looks like an LGA 2011 platform with Corsair water cooling.

My statement was in response to Balla saying MS should have put Haswell and Maxwell in the Xbox One, not whatever they used for E3.

About as much sense as your price figures.


Xbox One already lost; a year later, with an OoO Atom and a $100 Maxwell, it would beat the PS4 in every metric.

$800 was a high estimate going on prices near the top of the line-up rather than low- to mid-tier products. My mistake.

Still, MS can play the waiting game into perpetuity. Next year, when Maxwell and the OoO Atom launch, it will only be a year away from the next greatest thing, and so on and so on.

Also, I'm thinking a $100 Maxwell will get beat by a 7850-class GPU. That's like comparing a GTX 650 to the Radeon 6850.

[Relative performance chart at 1920x1080]


I agree though that MS hosed the Xbox One with their pricing and used-game policy. PS4 looks better all around.
 
Also, I'm thinking a $100 Maxwell will get beat by a 7850-class GPU. That's like comparing a GTX 650 to the Radeon 6850.

I did a little comparison of Fermi chips vs. their equivalent Kepler chips (in terms of direct successors, i.e. GK104 was derived from GF114, etc.). GK107 replaced GF118 (equal die sizes), and comparing the GTX 650 (the fastest product using GK107) with the fastest GF118-based card (which I can't recall offhand, maybe the GT 440, but I'm not sure), performance slightly more than doubled (a 108% increase, to be exact). So if Nvidia more or less keeps up that same performance improvement on the small die, Maxwell's fastest card based off its smallest chip should be able to perform right at, or even slightly better than, HD 7850 levels. Hence, the truth is somewhere in between you and Balla.

Here are the comparisons I made with Kepler to Fermi and with GCN to Cayman and Evergreen. http://alienbabeltech.com/abt/viewtopic.php?f=6&t=30552
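For what it's worth, here is a rough back-of-the-envelope version of that projection as a small Python sketch. The relative-performance figures (GTX 650 as the baseline, HD 7850 at roughly 1.65x) are illustrative assumptions drawn from typical 1080p aggregate charts, not measured data, and the 2.08x factor is just the 108% jump quoted above.

```python
# Rough projection of a hypothetical small-die Maxwell card from the
# Kepler-over-Fermi jump described above. All numbers are illustrative
# assumptions, not measured benchmarks.

kepler_over_fermi_small_die = 2.08   # the "108% increase" on the smallest die

# Assumed relative performance at 1080p, normalized to GTX 650 = 1.00.
gtx_650 = 1.00
hd_7850 = 1.65                       # assumed, roughly where aggregate charts put it

# If Maxwell's smallest chip repeats the same generational jump over GK107:
maxwell_small_die = gtx_650 * kepler_over_fermi_small_die

print(f"Projected small-die Maxwell: {maxwell_small_die:.2f}x a GTX 650")
print(f"HD 7850 (assumed):           {hd_7850:.2f}x a GTX 650")
if maxwell_small_die >= hd_7850:
    print("Projection lands at or slightly above HD 7850 level.")
else:
    print("Projection falls short of HD 7850 level.")
```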
 
I did a little comparison of Fermi chips vs. their equivalent Kepler chips (in terms of direct successors, i.e. GK104 was derived from GF114, etc.). GK107 replaced GF118, and comparing the GTX 650 (the fastest product using GK107) with the fastest GF118-based card (which was the same die size as the GTX 650), performance slightly more than doubled (a 108% increase, to be exact). So if Nvidia more or less keeps up that same performance improvement on the low end, Maxwell's fastest card based off its smallest chip should be able to perform right at, or even slightly better than, HD 7850 levels.

Here are the comparisons I made with Kepler to Fermi and with GCN to Cayman and Evergreen. http://alienbabeltech.com/abt/viewtopic.php?f=6&t=30552

I'm guessing most of the reason is GPU Boost 1.0 and 2.0, which utilize a given TDP more aggressively than TDP was used before.

The other factor was that ATi/AMD didn't have a lot of clock headroom left in their 5xxx and 6xxx series, compared to the massive clock headroom in the 4xx and 5xx series.

The 4xx and 5xx series were also crippled by a bad ROP/IMC and could not clock their VRAM very high as a result, the same problem ATi/AMD is having in the 7xxx series (other than the GCN 2.0 7790).

A bad ROP/IMC also means higher voltage is required for the same clocks, solely because the ROP/IMC needs the higher voltage, not the rest of the hardware.

If the ROP/IMC were on a separate power plane from the rest of the core, you could probably have lower-voltage GCN parts, just like you have in GCN 2.0 with the 7790. The same could easily be applied to the 4xx and 5xx series, where you could clock higher at lower voltages if the ROP/IMC were on a separate power plane.
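To put rough numbers on why a separate ROP/IMC power plane would matter, here is a minimal sketch using the standard dynamic-power approximation P ≈ C·V²·f. The voltages and the 25% ROP/IMC share of switching capacitance are made-up illustrative values, not real GCN figures.

```python
# Minimal sketch of why splitting the ROP/IMC onto its own power plane
# could cut power: dynamic power scales roughly with V^2 * f.
# All voltages below are made-up illustrative numbers, not real GCN values.

def dynamic_power(voltage, clock, capacitance=1.0):
    """Classic approximation: P ~ C * V^2 * f (arbitrary units)."""
    return capacitance * voltage ** 2 * clock

clock_mhz = 1000

# Single power plane: the whole core runs at the voltage the ROP/IMC needs.
shared_voltage = 1.20
p_shared = dynamic_power(shared_voltage, clock_mhz)

# Separate planes: the shaders drop to the voltage they actually need;
# only the ROP/IMC (assumed ~25% of switching capacitance) stays high.
shader_voltage = 1.05
rop_imc_fraction = 0.25
p_split = (dynamic_power(shader_voltage, clock_mhz, 1 - rop_imc_fraction)
           + dynamic_power(shared_voltage, clock_mhz, rop_imc_fraction))

print(f"Shared plane: {p_shared:.0f} (arbitrary units)")
print(f"Split planes: {p_split:.0f}")
print(f"Savings:      {100 * (1 - p_split / p_shared):.0f}%")
```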
 

Nvidia made some huge freaking performance jumps with Kepler over Fermi, much larger jumps than Fermi made over the GT200b and G92 based cards. I'm curious to see if Nvidia can keep it up with Maxwell, or if Kepler represented an abnormally large performance increase because Fermi was just so inefficient.
 
You guess incorrectly, sir. The GTX 650 does not have GPU Boost. 😉

Read the rest of my post please 😀

I was merely adding an important element to the ones already present: hot clocks (double-pumped shaders), less basic graphics hardware, etc.

For ATi it was VLIW4 to GCN, 6xxx to 7xxx, but there was the penalty of the crap ROP/IMC crippling the whole thing in the way I described above.

A 250W TDP, 1150 MHz core, 1750 MHz GDDR5 (7000 MT/s) 7970 would not be hard with GCN 2.0 (only the ROP/IMC part, not any of the other improvements), purely because the improved ROP/IMC requires lower voltages for the same clocks.

The 7970 having slightly fewer basic graphics resources than the 680 also contributes.

This is in reference to why the relative performance differences are where they are.
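As a quick sanity check on the memory numbers in that hypothetical 7970 spec, here is a small sketch of the GDDR5 data-rate and bandwidth arithmetic. The 384-bit bus width is Tahiti's real width; the 1750 MHz command clock is the hypothetical figure from the post above.

```python
# Quick check of the memory numbers in that hypothetical 7970 spec.
# GDDR5's effective data rate is four transfers per command-clock cycle,
# so a 1750 MHz command clock works out to 7000 MT/s.

command_clock_mhz = 1750
transfers_per_clock = 4            # GDDR5 is effectively quad-pumped
bus_width_bits = 384               # Tahiti / HD 7970 memory bus

effective_mt_s = command_clock_mhz * transfers_per_clock
bandwidth_gb_s = effective_mt_s * 1e6 * bus_width_bits / 8 / 1e9

print(f"Effective data rate: {effective_mt_s} MT/s")      # 7000 MT/s
print(f"Peak bandwidth:      {bandwidth_gb_s:.0f} GB/s")  # ~336 GB/s
```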
 
Don't really see the big deal here. Things like this aren't unusual, and it isn't going to have any effect on how commercially successful the XBO will be. This episode appears more embarrassing for AMD than for MS. Why isn't AMD providing hardware to MS for development purposes if they are the ones supplying the hardware for the actual console? The only real concern for MS is why, this late in the ballgame, they don't have development kits closer to the actual hardware that they can use to demo their titles. It will be interesting to see if they can actually hit their announced launch date and what quality of software will be available.
 
Spin and deflect?
You seem to be the one trying to spin anything.
I didn't even mention anything about the use of an i7 or NV hardware, just the general ridiculousness of your post overall, and use of an utterly absurd analogy.

When someone calls you out on making a terrible post, the correct response isn't to say "spin and deflect". I didn't spin or deflect, I said your post was dumb, because it was.

Oh my 🙂 Please feel free to continue your swath of self ownage, it's quite entertaining.

The analogy was EXACTLY on point, as they LIED about what they were running the games on. What was under the hood? One holy CRAPTON more power.

Less than 6 months from D-day, there's NO excuse for not being on more similar hardware when you're telling people this is how Xbone is going to run. ZERO excuse.

Watercooled CPU (almost certainly an i7 given the ludicrous-looking mobo; looks like SB-E actually)

INTEL Cpu (lol)

$650 GPU with an exorbitantly higher performance level.

Simply put, it was a 'ringer', pure and simple.

Go ahead with your insults and tantrum, it's entertaining. Everyone sees right through you. Look at posts on the subject by the local CPU/GPU gods such as Russian and Aigo, you're fooling nobody. 🙂
 
Hopeful that SB-E was used because the new games are taking advantage of threads/cores. 😉

Have the PS4 games that have been shown been run on native hardware, or also on a PC?
 
Don't really see the big deal here. Things like this aren't unusual, and it isn't going to have any effect on how commercially successful the XBO will be. This episode appears more embarrassing for AMD than for MS. Why isn't AMD providing hardware to MS for development purposes if they are the ones supplying the hardware for the actual console? The only real concern for MS is why, this late in the ballgame, they don't have development kits closer to the actual hardware that they can use to demo their titles. It will be interesting to see if they can actually hit their announced launch date and what quality of software will be available.

The box in the picture looks nothing like a typical development box. A case with a window, really? You'd need much more evidence than this to be able to say it's the Xbox One dev kit box.
 
Please keep the discussion on topic and civil, discuss and debate constructively, and stop resorting to personal attacks.
 
Hopeful that SB-E was used because the new games are taking advantage of threads/cores. 😉

Have the PS4 games that have been shown been run on native hardware, or also on a PC?

The PS4 demos were apparently run on real hardware. Also, one of them crashed at one point and the crash screen showed a real console OS not Windows.

It seems like Sony is months ahead of MS, and MS developers are still trying to optimize enough to get acceptable performance on real hardware so MS needed to use high-end PC hardware to run the demos.

This could mean that for launch and near-launch titles the PS3/360 roles will be reversed -- much worse frame rates and image quality on the X1 titles as developers struggle to optimize enough to get around the less-capable GPU and (allegedly) slower-clocked CPU.

If (game X) has two teams (X1 and PS4) and the X1 team had to wait an extra 6 months to get accurate dev kits that matched the real hardware, they would have been working on the i7/Nvidia hardware instead, not optimizing for the AMD APU.
 
Therefore Epic used an i7 + GTX 680 in the early demo comparisons with the PS4, and now Microsoft is using an i7 + GTX 780 for the Xbox One and accidentally showed it.

Do you get that this is a message for Nvidia? You need a high-end PC to match the consoles.

Also, how do you know it was running Win7? Why not the next Windows, which is rumoured to reintroduce the Start button?
 
Therefore Epic used an i7 + GTX 680 in the early demo comparisons with the PS4, and now Microsoft is using an i7 + GTX 780 for the Xbox One and accidentally showed it.

Do you get that this is a message for Nvidia? You need a high-end PC to match the consoles.

Also, how do you know it was running Win7? Why not the next Windows, which is rumoured to reintroduce the Start button?


It's your logical leaps that get you to that message. It's a stretch.
I could stretch out a logical conclusion too: Microsoft felt the APU designed for them by AMD isn't up to the task right now and needed an over-abundance of computing power to look good for this expo. The developers might be having trouble optimizing for this version of the APU (unique characteristics). It's about future sales and reputation at this point. You don't go in stumbling; you go big or go home. And in this case Microsoft presented an illusion.
 
The box in the picture looks nothing like a typical development box. A case with a window, really? You'd need much more evidence than this to be able to say it's the Xbox One dev kit box.

It's not a dev box. The question is why MS doesn't yet have dev kits that perform relatively similarly to what the final hardware will do.

Simply put, it was a 'ringer', pure and simple.

Even beginning developers know that if you are trying to run unfinished games, with probably little optimization at this stage, on a computer that is emulating another system (even one with a relatively similar architecture), you're going to need some seriously fast hardware to get any sort of decent performance. Even though the XBO is assumed to be the slower of the two upcoming consoles, there are still probably areas where it will outperform the "ringer" system MS was using to demo its games.
 
I love the logic though...

PS4 more powerful needs 680...

XBox One not as powerful needs 780...



I can't even wrap my head around it, impressive.
 
It's not a dev box. The question is why MS doesn't yet have dev kits that perform relatively similarly to what the final hardware will do.

There is a rumor that Xbox One chips are having yield issues due to the large amount of SRAM.
 
It's your logical leaps that get you to that message. It's a stretch.
I could stretch out a logical conclusion too: Microsoft felt the APU designed for them by AMD isn't up to the task right now and needed an over-abundance of computing power to look good for this expo. The developers might be having trouble optimizing for this version of the APU (unique characteristics). It's about future sales and reputation at this point. You don't go in stumbling; you go big or go home. And in this case Microsoft presented an illusion.

But you miss this:

A woman working on behalf of Harmonix confirmed that the demonstration for Fantasia: Music Evolved was run entirely on Xbox hardware.

Now let us play the game: which demos were running on the Xbox One, and which on a PC with similar performance to the console?
 
The PS4 demos were apparently run on real hardware. Also, one of them crashed at one point and the crash screen showed a real console OS not Windows.

PS4 showoff hardware would never show Windows; it would show some Linux variant. The Xbox One, like the Xbox 360, uses Windows and DX.
 
Do you get that this is a message for Nvidia? You need a high-end PC to match the consoles.

No, it's just to hype up the consoles by showing something you will never see on a console with its weak hardware. It's a shame all these pre-launch shows never look like the games the consoles will actually run.

They simply show what you might expect from the PC port. Console users will have to settle for much less eye candy and performance.
 
And AMD already announced they will retire support to DirectX because its next GPUs will integrate "other technologies"

I'm still waiting on a source for this earth-shattering news you posted in the PS vs PC thread. You should have had time to find it by now. You can ignore it like you always do when proven wrong, or you could be a man and admit you made a mistake...

This issue has already been settled, and is off topic here. Do not derail the discussion.
-- stahlhart
 