
Tim Sweeney, about GPGPU!

Originally posted by: ShawnD1
I think his computer monitor might be broken...
:laugh:

Originally posted by: ShawnD1
Of course this goes back to the previous paragraph where Tim is expecting hardware to be more powerful than Commander Data;
:laugh:


You're on a roll today!

 
In a game like Fallout 3 on PC, the draw distance goes on for miles and the game will draw objects as far as you can see if you happen to have the best video card money can buy. If you're limited to PS3 graphics, the draw distance is much shorter and objects can only be seen if they are close by.

Very, very poor example. You can point to aliasing issues, but I can only assume at this point that you have read about the game running on either platform and not actually seen it (based on these comments and those in other threads). FO3 has some issues on the PS3, but draw distance certainly isn't one of them (as is clearly evident in the video I linked).

On his comparison of Crysis being marginally better than the top console titles from this generation, he is absolutely right. Ten years from now, ask a 14 year old kid to point out which game looks better; he may come to the conclusion that Crysis looks a bit better, but it is without a doubt marginal. Compare Quake 1 to Doom 3, that is a big jump, or Unreal to Unreal 2 to make it more Epic-centric. Those are big improvements; Crysis versus the top games on the other platforms just isn't that far apart at all.

So, as a man of this industry, what do you think Sweeney is up to?

I'm an analyst, not really a man of the industry. That said, I think Sweeney is likely very interested in coming up with a software renderer, and he is likely trying to promote himself a bit proactively by somewhat scaring other developers off from trying to make their own engines in house (honestly, the industry as a whole is better off if we have a bit more engine licensing than we do now; too many resources are wasted on engine development for one-off games, IMO). His pointing to GoW and its 2 million lines of engine code versus less than 200K of game-specific code helps drive that point home, I think.

I think Sweeney is hedging his bets on a more software centric environment, but he has been known to make technology mistakes in the past. I liked his dig at Carmack in the presentation too, very subtle, but I think Tim knows that Jon is going to end this round on top in terms of visuals, even if Tim does take home the cash award.
 
Originally posted by: BenSkywalker
I liked his dig at Carmack in the presentation too, very subtle, but I think Tim knows that Jon is going to end this round on top in terms of visuals, even if Tim does take home the cash award.

I hope you are right!
I like Carmack's character more! (I mean from what I have seen of his interviews...
and the whole free "licensing situation" of the older engines...)

Did you see the new Rage trailer? (I suppose it's the XBOX360 version?)
While it was a little better overall than GoW2, the difference was not that big, IMO!

Anyway, I'll visit the Console forum to post later!


 
Originally posted by: BenSkywalker
In a game like Fallout 3 on PC, the draw distance goes on for miles and the game will draw objects as far as you can see if you happen to have the best video card money can buy. If you're limited to PS3 graphics, the draw distance is much shorter and objects can only be seen if they are close by.

Very, very poor example. You can point to aliasing issues, but I can only assume at this point that you have read about the game running on either platform and not actually seen it (based on these comments and those in other threads). FO3 has some issues on the PS3, but draw distance certainly isn't one of them (as is clearly evident in the video I linked).
That game is broken and you know it.

http://www.gamespot.com/features/6202552/index.html
A Washington, DC, teeming with automotive executives seeking government aid isn't nearly as dismal as the postapocalyptic DC setting in Fallout 3. Shadows and lighting change according to the game's day-and-night cycle, and we made sure to match timestamps for our comparison shots. In what will come as no surprise, the PC shames both consoles in the image-quality comparison. Everything from the textures to the antialiasing to the reflections looks better on the PC. Foliage, piping, and far-off buildings look far superior on the PC due to transparency antialiasing effects. Even draw distance is better on the PC, as the rocks and a fence near the burned-out bus aren't even visible on the consoles.

http://www.computerandvideogam...php?id=199997&site=psm
The PC version of Fallout 3 is gorgeous. The colours are vivid, the draw distance is endless, the textures are high-res and the lighting effects are beautifully subtle, especially when you're gazing over the Capital Wasteland at sunset. It's the best-looking of the three.

The Xbox 360 version's textures are noticeably rougher than on PC, and objects in the distance aren't quite as clear. It does, however, boast an impressively solid frame rate. The game is, otherwise, identical.

NOW, the PS3 version looks the same as on Xbox, but things in the distance are slightly jaggier/rougher, the textures seem 'muddier' up-close and the frame rate is choppier, especially during the last few story missions (which may be the same on Xbox, but we've not seen the equivalent scenes to comment)

http://asia.cnet.com/reviews/h...7627,62048782-2,00.htm
We made sure to match the in-game time for our comparison shots to keep the lighting consistent, but even then the Xbox 360 seemed to have better shadowed rocks.

We also noticed that the PS3 didn't have the bridge pillar water reflection that's visible on the Xbox 360. The Super-Duper building in the background of the first image set also has more jaggies in the PS3 version. Fallout 3 hides draw distance well by showing major landmarks in the distance while leaving out smaller details that pop into view as you get closer. Pop-in was more noticeable on the PS3--you can see a small stand to the left of the bus in the Xbox 360 shot that isn't visible in the PS3 image.

We need Tim Sweeney on the case. He'll make it look like Crysis!
 
Originally posted by: ShawnD1
Originally posted by: MODEL3

ShawnD1, said:

1.What Tim Sweeney said sounds like a load of bullshit!

2.Tim Sweeney meant that it costs millions of dollars to add PhysX to a game!

3.If it cost millions of dollars to add PhysX to a game, no game would have PhysX!

4.Of course it's not free to add this stuff, but come on!


And with my answer, I just meant the following:

1.No, Tim Sweeney didn't say that (that it costs millions of dollars to add PhysX to a game)!

2.Millions of dollars is the cost of a game!

I remember watching a speech John Carmack gave a few years ago where he was talking about the development of Rage and he mentioned something about the game costing upwards of 20 or 30 million dollars to create (that includes building the engine from the ground up). A few million is not the cost of the entire game; that would only cover the game's engine. If you're telling me that GPU code is going to cost 5x as much as multithreaded CPU code, that really does amount to potentially millions of dollars difference.

The other weird part of the article is that it seems to contradict itself here:
"In the next generation we'll write 100% of our rendering code in a real programming language - not DirectX, not OpenGL, but a language like C++ or CUDA. A real programming language unconstrained by weird API restrictions. Whether that runs on Nvidia hardware, Intel hardware or ATI hardware is really an independent question. You could potentially run it on any hardware that's capable of running general-purpose code efficiently."
He's complaining about the cost of development, but he doesn't want to use DirectX and OpenGL/OpenCL, which are specifically designed to make things easier to write. How does that make sense? If he doesn't want to use high-level programming languages that are inefficient but easy to write, why doesn't he start writing his games in assembly language? ASM is much more efficient than programming in C, is it not? Of course nobody does that, because programming in ASM is extremely expensive. Using a video card is no different; he's bitching about low-level GPU programming being very difficult, and then he dismisses DirectX because it's too slow, or he doesn't like the name, or some other asinine reason.
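To make that abstraction-level argument concrete, here's a minimal sketch of the gap being described. Vector addition stands in for real rendering work, and CUDA's Thrust library stands in for a high-level API like DirectX; none of this is from Sweeney's talk, it just illustrates who writes the plumbing at each level.

```cpp
#include <cuda_runtime.h>
#include <thrust/device_vector.h>
#include <thrust/functional.h>
#include <thrust/transform.h>

// Low-level route: you write the kernel, the indexing, and the launch
// configuration yourself, for every operation your engine needs.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    thrust::device_vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

    // High-level route: one library call; memory layout, indexing, and
    // launch parameters are someone else's problem, as with DirectX.
    thrust::transform(a.begin(), a.end(), b.begin(), c.begin(),
                      thrust::plus<float>());

    // The same work via the hand-written kernel.
    add<<<(n + 255) / 256, 256>>>(thrust::raw_pointer_cast(a.data()),
                                  thrust::raw_pointer_cast(b.data()),
                                  thrust::raw_pointer_cast(c.data()), n);
    cudaDeviceSynchronize();
    return 0;
}
```

Both routes execute on the GPU; the argument in this thread is only about who pays the engineering cost of the low-level version.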

By moving to a proper programming language like CUDA or C++, Sweeney will be able to save a ton of money by not having to do any cross-platform coding. It will also open up his market to include platforms like the Mac and Linux systems. It will be a painful transition at first, but will pay off enormously in the long run.

I don't understand your analogy to assembly. CUDA has developer tools, as does C++. Programming a game in assembly would be like flying blind and would take forever.

His statement is contradictory; however, he will have to accept what the hardware industry gives him, and the way things are going, CUDA and GPGPU are going to offer the most powerful solutions in the near future.
 
I liked his dig at Carmack in the presentation too, very subtle, but I think Tim knows that Jon is going to end this round on top in terms of visuals, even if Tim does take home the cash award.

What was his dig? I didn't see the presentation.
 
Originally posted by: Fox5
I liked his dig at Carmack in the presentation too, very subtle, but I think Tim knows that Jon is going to end this round on top in terms of visuals, even if Tim does take home the cash award.

What was his dig? I didn't see the presentation.

IDC gave the link, on the first page!

If I remember correctly, Sweeney said something about the limitations of texture sampling & the scalability of the megatexture tech!

 
Originally posted by: SickBeast
By moving to a proper programming language like CUDA or C++, Sweeney will be able to save a ton of money by not having to do any cross-platform coding. It will also open up his market to include platforms like the Mac and Linux systems. It will be a painful transition at first, but will pay off enormously in the long run.
I guess that somewhat makes sense, but I don't know if I agree. Programming language is not what kills it. Windows and Mac still use different libraries for OS-related stuff, so the game still needs to be rewritten a little tiny bit.

On a hardware level, Windows, Mac, and Xbox already work in a very similar way. Your Xbox has 3 symmetrical hardware cores, 6 logical cores (hyperthreading), ATI graphics, and DirectX 9. If you write a game for the Xbox, it can be moved to the PC in a reasonable amount of time.

The PS3 is the only platform that poses a real challenge if you want to port your game. If you try moving your game to the PS3, it doesn't matter how you wrote it, it's going to be hard as hell. Converting it from DirectX to OpenGL isn't the tricky part, but converting it from symmetrical Xbox cores to asymmetrical PS3 cores is a real bitch. You basically need to rewrite the game even though the game is already written in C and it's already multi-threaded.

http://playstation.about.com/b...fficult-to-program.htm
If your game starts on Xbox 360 you will have to re-engineer aspects of the game to run properly on PS3. This means additional effort. Some developers have been complaining about this, but I don't believe we can solve that. The 360 is a different machine with good, but lower powered hardware in a different architecture. Developers have to view them as two different machines not as a common platform.


Wikipedia has a good article about this.
http://en.wikipedia.org/wiki/Asymmetric_multiprocessing
Whereas a symmetric multiprocessor or SMP treats all of the processing elements in the system identically, an ASMP system assigns certain tasks only to certain processors. In particular, only one processor may be responsible for fielding all of the interrupts in the system or perhaps even performing all of the I/O in the system. This makes the design of the I/O system much simpler, although it tends to limit the ultimate performance of the system. Graphics cards, physics cards and cryptographic accelerators which are subordinate to a CPU in modern computers can be considered a form of asymmetric multiprocessing.[citation needed] SMP is extremely common in the modern computing world; when people refer to "multi core" or "multi processing" they are most commonly referring to SMP.
[....]
The Sony PS3 is an example of an extrapolated asymmetric multiprocessor. The cell processor has unique cores which compute only certain tasks, though it is a games console rather than a general-purpose computer.
Of course this is only Wikipedia, and there's that big "citation needed" next to the statement about physics and graphics cards, but you get the idea. A processor like my Phenom or your Xbox treats the cores as being equal. The PS3 is completely different, and the cores are not equal; this is why Tim estimates it as being considerably more expensive (time consuming) to program for. Using the same language is not going to fix this problem.
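Here's a toy C++ sketch of that difference (my illustration; the worker names are invented, not real PS3 APIs): with symmetric cores, identical threads can drain one shared queue, while with Cell-style asymmetric cores the programmer has to hand-partition the work and bind each slice to a dedicated worker.

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

static const int kJobs = 12;  // twelve equal-sized jobs standing in for frame work

// SMP (Xbox 360 / PC style): identical threads drain one shared queue;
// which core runs which job is the scheduler's problem, not yours.
void smp_demo() {
    std::atomic<int> next{0};
    auto worker = [&next] {
        for (int i; (i = next.fetch_add(1)) < kJobs; )
            std::printf("smp core ran job %d\n", i);
    };
    std::vector<std::thread> pool;
    for (int t = 0; t < 4; ++t) pool.emplace_back(worker);
    for (auto& t : pool) t.join();
}

// ASMP (Cell style): the programmer slices the work up front and ships
// each slice to one specific worker, the way SPE jobs must be packaged.
void asmp_demo() {
    std::vector<std::thread> spes;
    for (int s = 0; s < 4; ++s)
        spes.emplace_back([s] {
            for (int i = s * 3; i < s * 3 + 3; ++i)
                std::printf("spe %d ran job %d\n", s, i);
        });
    for (auto& t : spes) t.join();
}

int main() {
    smp_demo();
    asmp_demo();
}
```

In the SMP version, load balancing falls out for free; in the ASMP version it is, like everything else, the programmer's problem, which is where the extra cost described in the quotes above comes from.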

If Tim's estimation is to be believed, GPU programming takes that complexity to the point of being ridiculous and not worth doing. He's probably right when it comes to writing stuff the first time, but the desire to not write the same exact code a million times over is why we have DirectX.


I don't understand your analogy to assembly. CUDA has developer tools, as does C++. Programming a game in assembly would be like flying blind and would take forever.
You're right. Programming an Intel processor in assembly would be ridiculous. Doing GPU programming in C isn't much better. The whole reason for DirectX's existence is to simplify the process of GPU programming. Instead of writing 100 complicated math functions telling the video card what to do, you can use something like DirectX or OpenGL to simplify it to just 1 or 2 vague commands like "make a shadow of this object". Tim might not like having a limited number of options when it comes to programming the GPU, but that's the name of the game. If you don't want to use the DirectX or OpenGL libraries which are intended to take care of this complexity for you, you're left writing insanely complicated GPU code.

At the beginning of this thread I used PhysX as the example of adding GPGPU code to your game. The guys at Nvidia probably put a huge amount of time into writing that code, and it was probably every bit as complicated as Sweeney says it is. The problem is that Sweeney's position seems to come from the perspective that game developers should be writing this kind of code themselves, but in the future this code will most likely come in the form of DirectX or OpenCL libraries. You don't write the GPU physics or graphics code yourself; Microsoft adds the function to DirectX, so all you need to write is something like "gravity(object)" and suddenly that object has gravity acting on it.
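A hedged sketch of that division of labor follows. The gravity() call and the Body type are hypothetical, invented purely for illustration; no shipping DirectX or OpenCL interface looks like this. The point is only that the middleware author writes the painful kernel once, and the game programmer makes one call.

```cpp
#include <cuda_runtime.h>

struct Body { float y, vy; };  // hypothetical game-object state

// What the middleware/API author writes once: a kernel that steps
// gravity for every object in parallel, one GPU thread per body.
__global__ void gravity_kernel(Body* bodies, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        bodies[i].vy -= 9.81f * dt;         // accelerate downward
        bodies[i].y  += bodies[i].vy * dt;  // integrate position
    }
}

// What the game programmer sees: the "gravity(object)" one-liner.
void gravity(Body* d_bodies, int n, float dt) {
    gravity_kernel<<<(n + 255) / 256, 256>>>(d_bodies, n, dt);
}
```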
 
That game is broken and you know it.

Stop making yourself look foolish, man; play the game before you continue on with everything you are saying. I linked a video clip above; out of every screenshot you have shown in all of your links, only one object fails to appear at the same time. As for the actual IQ overall, take a look at the video running side by side in the link I posted. Lousy screen captures from consoles are the norm; check it out in motion and you will see just how 'bad' the port stacks up. It has considerably more aliasing, a few shader effects are MIA, and its texturing is very weak in certain areas, but watch the video and see for yourself what a, heh, 'huge' difference there is. You have made it exceptionally clear that you haven't actually seen the game played on the PS3.

By moving to a proper programming language like CUDA or C++, Sweeney will be able to save a ton of money by not having to do any cross-platform coding. It will also open up his market to include platforms like the Mac and Linux systems. It will be a painful transition at first, but will pay off enormously in the long run.

No, it will be a complete disaster. Not kind of, not somewhat; a total disaster if he tries to do it the way you are describing. If you use an identical code base for cross-platform usage, you absolutely assure that you are going to get blown out of the water by almost everyone. You can pick one platform and hit ideal, and be destroyed on the others, or you can go for equality across platforms and assure mediocrity. BTW - Mac and Linux? As of right now, Windows is having issues remaining competitive in the gaming market; Mac and Linux, as far as gaming goes, are dead.

What was his dig? I didn't see the presentation.

Talking about how Megatexturing was an idea that was behind the times. Nothing over the top, but certainly a dig.

On a hardware level, Windows, Mac, and Xbox already work in a very similar way. Your Xbox has 3 symmetrical hardware cores, 6 logical cores (hyperthreading), ATI graphics, and DirectX 9. If you write a game for the Xbox, it can be moved to the PC in a reasonable amount of time.

The 360 has an in-order PPC core. In terms of instruction set, the PS3 and 360 are much closer than the 360 and the PC. I'm not saying that the 360 doesn't end up being the easier machine to port to, but there are major differences that change development by a very large amount between the platforms.
 
Originally posted by: BenSkywalker
So, as a man of this industry, what do you think Sweeney is up to?

I'm an analyst, not really a man of the industry. That said, I think Sweeney is likely very interested in coming up with a software renderer, and he is likely trying to promote himself a bit proactively by somewhat scaring other developers off from trying to make their own engines in house (honestly, the industry as a whole is better off if we have a bit more engine licensing than we do now; too many resources are wasted on engine development for one-off games, IMO). His pointing to GoW and its 2 million lines of engine code versus less than 200K of game-specific code helps drive that point home, I think.

I think Sweeney is hedging his bets on a more software centric environment, but he has been known to make technology mistakes in the past. I liked his dig at Carmack in the presentation too, very subtle, but I think Tim knows that Jon is going to end this round on top in terms of visuals, even if Tim does take home the cash award.

Cool, makes complete sense to me. I figured you'd have the "big picture" nailed on this one; whether you have it correct or not, what you say makes complete sense to me, so I'll adopt it as my position on the topic until I hear something more compelling.
 
To me, the big revelation here is that Epic won't be making an engine to replace UE3 for some time. I imagine they'll have a halfway upgrade similar to UE2.5, but their next from-the-ground-up engine is going to be at least 5 years off if they're eschewing DirectX and OpenGL.
 
I think that the point of saying that new engines are going to be written in C++ and CUDA, instead of DirectX and OpenGL, is related to shader programming. Current GPUs are now Turing-complete, so they are general-purpose enough that it should be possible to code in C++ and map that code to shader instructions. As I understand it (from reading about it, not programming it), DirectX shader code is essentially written in a "DirectX shader assembly language", an idealized virtual machine model, which is then translated by the video card drivers at runtime into card-specific GPU shader code and executed on the actual physical hardware as needed.

The boost will come from coding shaders not in the DirectX and OpenGL shader languages, but in real programming languages like C++.
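As a toy illustration of what "shaders in a real language" could mean (my sketch, not Epic's or id's actual code): the per-pixel work a fragment shader does, written as an ordinary CUDA kernel that the compiler turns directly into GPU code, with no shader-bytecode and driver-translation step in the middle.

```cpp
struct Pixel { unsigned char r, g, b, a; };

// One thread per pixel, exactly the shape of a fragment shader, but in
// plain C++/CUDA instead of HLSL/GLSL: warm up the image by boosting red.
__global__ void tint(Pixel* fb, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < w && y < h) {
        Pixel& p = fb[y * w + x];
        p.r = (unsigned char)min(255, p.r + 40);
    }
}

// Launch over the framebuffer in 16x16 tiles:
//   tint<<<dim3((w + 15) / 16, (h + 15) / 16), dim3(16, 16)>>>(d_fb, w, h);
```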
 
Oh, and for some applications, when you consider the actual runtime, GPGPU code is worth it. Even if it costs 20x more to develop, it can run at 100x the performance of equivalent CPU code (a la Folding@home). So for certain applications, it's very much worth it.

Is it worth it for the games industry? If Sweeney's comments on budgets are accurate, then perhaps not, at least not at a direct, low level. If it could be developed in an engine and leveraged that way (such that individual game budgets wouldn't have to be spent on developing the engine code), then it might be worthwhile.
 
Sweeney has been talking about GPUs being an interim solution until CPUs become more powerful for at least 10 years.

Just go look at some of his comments around the time of the Voodoo 1.
 
Originally posted by: Phynaz
Sweeney has been talking about GPUs being an interim solution until CPUs become more powerful for at least 10 years.

Just go look at some of his comments around the time of the Voodoo 1.

So you're saying the man is consistent with his message, at least? Who knows, give him another 20 or 30 years and he may be balls-on accurate :laugh:
 