Interview with Carmack on Developing ID Tech 5

PricklyPete

Lifer
Sep 17, 2002
14,582
162
106
here

Interesting that he's sticking with OpenGL on the PC even though he seems to really like DX10. He also states that the PS3 is the hardest of all the platforms to develop for (no news there), and that his biggest beef seems to be the 96MB out of the 256MB available to the system being taken up by "stuff" (OS, etc.). That's compared to the 32MB out of 512MB that the Xbox 360 reserves (although the 360's 512MB is shared between video and system...so all of it is never available exclusively to the system processor).
 

RaiderJ

Diamond Member
Apr 29, 2001
7,582
1
76
96 out of 256MB? Wow, that just seems crazy, especially with trying to support 1080p resolutions.
 

PricklyPete

Lifer
Sep 17, 2002
14,582
162
106
Originally posted by: RaiderJ
96 out of 256MB? Wow, that just seems crazy, especially with trying to support 1080p resolutions.

From the various articles I've read, developers seem unanimous that the MS design, with a unified memory architecture and much less memory overhead, is superior to the PS3's architecture. The 160MB left over for the game is still a lot of memory to play around with on a dedicated game system...but it certainly does not give you a lot of flexibility as a developer.

I deal with memory management issues in code every day. A lot of times it forces you to be much more diligent in your programming (a good thing), but every once in a while it forces you to cut back a feature because you don't have the memory needed to do the operation and keep performance in check.

Either way, it seems Carmack is in agreement that the systems are awfully close in realistic capability...the PS3 just being harder to tame.
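That kind of hard budget is easy to sketch. Here's a toy bump/arena allocator in C (purely illustrative; the budget constant and names are made up, not anything from a real console SDK) that returns NULL the moment a request would blow the pool, which is exactly the point where you trim a feature instead of paging:

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical hard budget: pretend the game gets this pool and nothing more.
 * 1 MB stands in for whatever the platform actually leaves you. */
#define GAME_BUDGET_BYTES ((size_t)1 << 20)

static uint8_t g_pool[(size_t)1 << 20];
static size_t g_used = 0;

/* Bump allocator: hands out 16-byte-aligned chunks from the fixed pool
 * and refuses (returns NULL) once the budget would be exceeded. */
void *arena_alloc(size_t n)
{
    n = (n + 15u) & ~(size_t)15u;       /* round up to 16-byte alignment */
    if (g_used + n > GAME_BUDGET_BYTES)
        return NULL;                    /* over budget: caller must degrade */
    void *p = g_pool + g_used;
    g_used += n;
    return p;
}

/* Throw everything away at once, e.g. on level load. */
void arena_reset(void) { g_used = 0; }
```

A NULL here isn't an error to retry; it's the signal that a feature has to shrink or go, which is the diligence the fixed budget forces on you.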
 

Pacemaker

Golden Member
Jul 13, 2001
1,184
2
0
Originally posted by: PricklyPete
here

Interesting that he's sticking with OpenGL on the PC even though he seems to really like DX10.

He's a big fan of open source, so it doesn't surprise me. Plus he once said he would never code in a proprietary format again. DX probably doesn't fit his original definition of proprietary, though; it was VQuake, the first 3D-accelerated version of Quake, which only worked on one brand of 3D card, that prompted him to make that statement.
 

Pacemaker

Golden Member
Jul 13, 2001
1,184
2
0
Originally posted by: RaiderJ
96 out of 256MB? Wow, that just seems crazy, especially with trying to support 1080p resolutions.

Carmack is crazy good at getting the most out of very little. Read the article about Quake on wikipedia (link). It's really interesting how he made Quake happen on systems that only had 50-75 MHz processors and no 3D cards.
 

dwell

pics?
Oct 9, 1999
5,185
2
0
Originally posted by: RaiderJ
96 out of 256MB? Wow, that just seems crazy, especially with trying to support 1080p resolutions.

It's 32MB out of 256, the rest comes out of system RAM. It was 96 total, it's down to 84MB now but nobody knows the exact breakdown as it's under NDA.

It's a forward-thinking design, where they have the liberty to add features to the OS later (like in-game XMB). The 360 is pretty much stuck at 32MB forever and they can't expand on it without breaking older games.
 

PricklyPete

Lifer
Sep 17, 2002
14,582
162
106
Originally posted by: Chris
Originally posted by: RaiderJ
96 out of 256MB? Wow, that just seems crazy, especially with trying to support 1080p resolutions.

It's 32MB out of 256, the rest comes out of system RAM. It was 96 total, it's down to 84MB now but nobody knows the exact breakdown as it's under NDA.

It's a forward-thinking design, where they have the liberty to add features to the OS later (like in-game XMB). The 360 is pretty much stuck at 32MB forever and they can't expand on it without breaking older games.

Sony is stuck as well at 96MB...I don't see what you're getting at. No console maker can shift the amount of memory used by system functions at random without breaking older games. The fact is MS has limited the space used and in most cases offers more features (the biggest exception being the web browser, which is a joke in my opinion).

If anything, MS's design is more "forward thinking" since it allows the developer to determine how much memory is used for in-game CPU operations and how much is used by the GPU at any given point. Everything on the PS3 is fixed: 96MB for OS/system functions, 160MB for in-game CPU functions, 256MB for graphics functions. MS just reserves 32MB for the OS/system, and the rest of the 512MB (480MB) can be used in any way the developer wants.
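A quick back-of-the-envelope sketch of the two layouts in toy C (the MB figures are just the ones thrown around in this thread, not official specs, and the function names are mine):

```c
/* All numbers in MB, per the figures quoted in this thread. */

/* PS3-style fixed split: two separate pools, each with its own cap. */
#define PS3_SYS_POOL   256
#define PS3_VRAM_POOL  256
#define PS3_OS_RESERVE  96   /* the contested overhead figure */

/* 360-style unified pool: one OS reserve, the rest splits any way you like. */
#define X360_TOTAL      512
#define X360_OS_RESERVE  32

int x360_game_pool(void) { return X360_TOTAL - X360_OS_RESERVE; }

/* Unified design: any cpu/gpu split that fits the single pool is legal. */
int x360_split_ok(int cpu_mb, int gpu_mb)
{
    return cpu_mb >= 0 && gpu_mb >= 0
        && cpu_mb + gpu_mb <= x360_game_pool();
}

/* Fixed design: each side is capped independently, no trading between them. */
int ps3_split_ok(int cpu_mb, int gpu_mb)
{
    return cpu_mb >= 0 && gpu_mb >= 0
        && cpu_mb <= PS3_SYS_POOL - PS3_OS_RESERVE   /* 160 */
        && gpu_mb <= PS3_VRAM_POOL;                  /* 256 */
}
```

For example, a physics-heavy split like 300MB CPU / 180MB GPU fits the unified pool but is impossible on the fixed layout, which is the whole flexibility argument in one line.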
 

Looney

Lifer
Jun 13, 2000
21,938
5
0
Originally posted by: Chris
Originally posted by: RaiderJ
96 out of 256MB? Wow, that just seems crazy, especially with trying to support 1080p resolutions.

It's 32MB out of 256, the rest comes out of system RAM. It was 96 total, it's down to 84MB now but nobody knows the exact breakdown as it's under NDA.

What do you mean nobody knows? If anybody knows, it's John Carmack, who's actually developing for the system.
 

dwell

pics?
Oct 9, 1999
5,185
2
0
Originally posted by: PricklyPete
Sony is stuck as well at 96mb...I don't see where you're getting at. No console maker can shift the amount of memory used by system functions at random without breaking older games. The fact is MS has limited the space used and offers in most cases more features (biggest exception is web browser which is a joke in my opinion).

Well then you better call Sony and tell them, because firmware 1.6 dropped the overhead down to 84MB. You can go down, you can't go up. If you decrease the overhead, older games will never know. If you eat even 1k more of overhead, older games may break.

If anything, MS's design is more "forward thinking" since it allows the developer to determine how much memory is used for in-game CPU operations and how much is used by the GPU at any given point. Everything on the PS3 is fixed: 96MB for OS/system functions, 160MB for in-game CPU functions, 256MB for graphics functions. MS just reserves 32MB for the OS/system, and the rest of the 512MB (480MB) can be used in any way the developer wants.

Are you making those numbers up? ~32MB is reserved out of the VRAM, ~50MB is reserved out of system RAM; the exact breakdowns have not been disclosed. MS has a pool of 480MB available, but it does not do a damn bit of good to allocate a ton to system RAM and starve the GPU, or vice-versa.
 

dwell

pics?
Oct 9, 1999
5,185
2
0
Originally posted by: Looney
What do you mean nobody knows? If anybody knows, it's John Carmack, who's actually developing for the system.

The fact that he's throwing around the outdated 96MB number proves he doesn't know. He's not doing the PS3 engine code; they hired one of the leading EDGE developers out of Naughty Dog to do it.
 

PricklyPete

Lifer
Sep 17, 2002
14,582
162
106
Originally posted by: Chris
Originally posted by: PricklyPete
Sony is stuck as well at 96MB...I don't see what you're getting at. No console maker can shift the amount of memory used by system functions at random without breaking older games. The fact is MS has limited the space used and in most cases offers more features (the biggest exception being the web browser, which is a joke in my opinion).

Well then you better call Sony and tell them, because firmware 1.6 dropped the overhead down to 84MB. You can go down, you can't go up. If you decrease the overhead, older games will never know. If you eat even 1k more of overhead, older games may break.

If anything, MS's design is more "forward thinking" since it allows the developer to determine how much memory is used for in-game CPU operations and how much is used by the GPU at any given point. Everything on the PS3 is fixed: 96MB for OS/system functions, 160MB for in-game CPU functions, 256MB for graphics functions. MS just reserves 32MB for the OS/system, and the rest of the 512MB (480MB) can be used in any way the developer wants.

Are you making those numbers up? ~32MB is reserved out of the VRAM, ~50MB is reserved out of system RAM; the exact breakdowns have not been disclosed. MS has a pool of 480MB available, but it does not do a damn bit of good to allocate a ton to system RAM and starve the GPU, or vice-versa.

You are correct that the number has gone down with the release of 1.60 of Sony's firmware. And I stand corrected: 32MB is in video memory and 52MB is in system RAM.

Lowering that number obviously gives future developers more to work with...but every time they lower it, they negate the exact reason you gave for it being a benefit (they can't take that space back to add additional features at a later date).

The ability to determine how you want to split your memory is by far a better option than hoping that in another year Sony will release another 5MB of space for me to play with. There are plenty of situations where you may want your game to be AI- or physics-intensive but are fine with limiting graphics processing, or vice-versa. The point is MS lets the developer make that call and doesn't impose hard limits. On top of that, they reserve a significantly smaller portion of memory for system-related functions while providing easily as much capability.

Your argument seems silly.

Edit: reference overviewing memory usage here.
 

dwell

pics?
Oct 9, 1999
5,185
2
0
Originally posted by: PricklyPete
You are correct that the number has gone down with the release of 1.60 of Sony's firmware. And I stand corrected: 32MB is in video memory and 52MB is in system RAM.

Lowering that number obviously gives future developers more to work with...but every time they lower it, they negate the exact reason you gave for it being a benefit (they can't take that space back to add additional features at a later date).

The ability to determine how you want to split your memory is by far a better option than hoping that in another year Sony will release another 5MB of space for me to play with. There are plenty of situations where you may want your game to be AI- or physics-intensive but are fine with limiting graphics processing, or vice-versa. The point is MS lets the developer make that call and doesn't impose hard limits. On top of that, they reserve a significantly smaller portion of memory for system-related functions while providing easily as much capability.

Your argument seems silly.

Who's arguing? It's two different approaches. One makes the developer's life easier, the other provides a richer user experience.
 

PricklyPete

Lifer
Sep 17, 2002
14,582
162
106
Originally posted by: Chris
Originally posted by: PricklyPete
You are correct that the number has gone down with the release of 1.60 of Sony's firmware. And I stand corrected: 32MB is in video memory and 52MB is in system RAM.

Lowering that number obviously gives future developers more to work with...but every time they lower it, they negate the exact reason you gave for it being a benefit (they can't take that space back to add additional features at a later date).

The ability to determine how you want to split your memory is by far a better option than hoping that in another year Sony will release another 5MB of space for me to play with. There are plenty of situations where you may want your game to be AI- or physics-intensive but are fine with limiting graphics processing, or vice-versa. The point is MS lets the developer make that call and doesn't impose hard limits. On top of that, they reserve a significantly smaller portion of memory for system-related functions while providing easily as much capability.

Your argument seems silly.

Who's arguing? It's two different approaches. One makes the developer's life easier, the other provides a richer user experience.

I would not consider the PS3 a richer user experience...actually, I think most would argue in the opposite direction. So MS is providing a richer, or at least as rich, environment with more flexibility for the developer. I'd still contend that MS is more forward-thinking on the memory architecture and software side of things. Sony's "Home" application may change my mind on that...but we'll have to wait until that is released to decide.