
Article NVIDIA Asteroids Demo Available Now, Showcases ‘Dramatically Improved’ Performance

It's nice that when Nvidia adds features they actually work.

Although I guess TrueAudio worked for the two titles that supported it.


As you are aware, trolling tech threads is not allowed.


AT Mod Usandthem
 
Last edited by a moderator:
I'm not sure I understand what the advancement is. We have equivalent IQ with present rasterisation + shaders, with less processing required.

I am all for the push to ray tracing, it IS the gold standard. But if ILM can't render Princess Leia or Grand Moff Tarkin to a degree of believability beyond the uncanny valley, then maybe we're a bit early for partial ray tracing.

If the image quality improvement were worth the lower FPS, then possibly. But one game with some fire reflections etc. is NOT the future, especially at the current prices.

If RTX can become a thing then great, but this really accentuates gameplay over visuals.

No games. (okay, one)

Frame rate hit.

Does it enhance the experience? I think G-Sync or FreeSync make more of a difference. I'd rather invest in that.

But for now, I STILL stick with my 2500K OC and GTX 680 for 1080p.
 
The proof is in the pudding IMO, and that demo looks incredible. Anything which improves perceived polygon density is absolutely a win. All the tricks in the rasterization book still need to be backed up by good-looking (not overly blocky) models. Anyone remember that brief period around 2006-2008 where the lighting got a lot better with deferred rendering but we still couldn't push enough polygons and it just looked weird? The overdone 'everything is shiny' factor didn't help either. Gears of War and all the other early UE3 games come to mind.

Here I think the trick will be applying this to landscapes and other environments where you're not in empty space with each mesh totally independent. I can see why it's the perfect demo environment to show off the tech.
 
Anyone remember that brief period around 2006-2008 where the lighting got a lot better with deferred rendering but we still couldn't push enough polygons and it just looked weird? The overdone 'everything is shiny' factor didn't help either. Gears of War and all the other early UE3 games

More the other way around: games came with vastly improved texturing, models, and polygon counts, but shader advancements were harder for most devs to take advantage of, resulting in "everything is shiny". Deferred rendering didn't become a thing until around 2008, I believe, and became far more mainstream in the decade after.
 
I might be remembering the dates wrong, but yeah, things got weird there for a little bit. The anemic PS360 CPU design target certainly didn't help either.
 
Am I right in saying this is essentially using the RT cores to do visibility checks for culling? That's actually a really smart idea. I guess cores designed for the vector work of ray tracing are probably really fast at visibility check calculations; it's kind of the same thing, right? This could be another big move in the right direction for lowering the cost to developers, by cutting development time spent on manual culling during the optimization stage.
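To make the idea concrete, here's a minimal CPU-side sketch of the kind of visibility check being described: cast a probe ray from the camera toward an object's bounding sphere and cull the object if some occluder is hit first. This is purely illustrative (names, structure, and the sphere-only scene are my own assumptions), not NVIDIA's actual implementation, which would run such queries on RT hardware against a full acceleration structure.

```python
# Hypothetical sketch of ray-based visibility culling. All names are
# illustrative; a real engine would batch these queries on RT cores
# against a BVH rather than loop over spheres in Python.
import math
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]
Sphere = Tuple[Vec3, float]  # (center, radius)

def ray_sphere_t(origin: Vec3, direction: Vec3,
                 center: Vec3, radius: float) -> Optional[float]:
    """Distance t to the nearest forward hit of a unit-direction ray
    with a sphere, or None if the ray misses it."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def normalize(v: Vec3) -> Vec3:
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def is_visible(camera: Vec3, target: Vec3, target_radius: float,
               occluders: List[Sphere]) -> bool:
    """One probe ray at the target's bounding sphere: the target is
    considered visible only if no occluder is hit closer along the ray."""
    direction = normalize(tuple(t - c for t, c in zip(target, camera)))
    t_target = ray_sphere_t(camera, direction, target, target_radius)
    if t_target is None:
        return False
    for center, radius in occluders:
        t_occ = ray_sphere_t(camera, direction, center, radius)
        if t_occ is not None and t_occ < t_target:
            return False  # blocked: skip drawing this object
    return True
```

With the camera at the origin and an asteroid at (0, 0, 10), a large rock at (0, 0, 5) occludes it, while one off to the side at (5, 0, 5) does not. A production version would fire many rays per object (a single probe ray falsely culls partially visible objects) and amortize the cost across frames.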
 