Article NVIDIA Asteroids Demo Available Now, Showcases ‘Dramatically Improved’ Performance

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
It's nice that when Nvidia adds features they actually work.

Although I guess TrueAudio worked for the two titles that supported it.


As you are aware, trolling tech threads is not allowed.


AT Mod Usandthem
 
Last edited by a moderator:

Sable

Golden Member
Jan 7, 2006
1,127
98
91
I'm not sure I understand what the advancement is. We have equivalent IQ with present rasterisation + shaders, at a lower processing cost.

I am all for the push to ray tracing; it IS the gold standard. But if ILM can't render Princess Leia or Grand Moff Tarkin to a degree of believability beyond the uncanny valley, then maybe we're a bit early for partial ray tracing.

If the image quality improvement were worth the lower FPS, then possibly. But one game with some fire reflections etc. is NOT the future, especially at the current prices.

If RTX can become a thing then great, but right now it really accentuates the gameplay-versus-visuals trade-off:

No games. (okay, one)

Frame rate hit.

Does it enhance the experience? I think G-Sync or FreeSync make more of a difference; I'd rather invest in that.

But for now, I STILL stick with my 2500K OC and GTX 680 for 1080p.
 
  • Like
Reactions: happy medium

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The proof is in the pudding IMO; that demo looks incredible. Anything that improves perceived polygon density is absolutely a win. All the tricks in the rasterization book still need to be backed up by good-looking (not overly blocky) models. Anyone remember that brief period around 2006-2008 where the lighting got a lot better with deferred rendering but we still couldn't push enough polygons and it just looked weird? The overdone 'everything is shiny' factor didn't help either. Gears of War and all the other early UE3 games.

Here I think the trick will be applying this to landscapes and other environments where you're not in empty space with each mesh totally independent. I can see why it's the perfect demo environment to show off the tech.
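Just to illustrate what I mean by perceived polygon density: the win comes from spending triangles only where they're actually perceived, e.g. picking a per-object LOD from how much screen space each asteroid covers. A toy CPU-side sketch of that idea (every name and threshold below is made up; it has nothing to do with how the demo actually does it):

Code:
// Toy sketch of screen-coverage-driven LOD selection: the smaller an object
// appears on screen, the coarser the mesh you submit, so triangles get spent
// where they are actually perceived. All names and thresholds are invented.
#include <cstdio>
#include <cmath>
#include <vector>

struct Object { float distance; float radius; };  // distance to camera, bounding-sphere radius

// Approximate fraction of the vertical field of view covered by the object.
float screen_coverage(const Object& o, float vertical_fov_radians) {
    float angular_size = 2.0f * std::atan2(o.radius, o.distance);
    return angular_size / vertical_fov_radians;
}

// Map coverage to a LOD index: 0 = full-detail mesh, higher = coarser meshes.
int pick_lod(float coverage) {
    if (coverage > 0.25f) return 0;
    if (coverage > 0.10f) return 1;
    if (coverage > 0.03f) return 2;
    return 3;
}

int main() {
    const float fov = 1.0f;  // ~57 degrees, in radians
    std::vector<Object> asteroids = { {5.0f, 2.0f}, {50.0f, 2.0f}, {500.0f, 2.0f} };
    for (const Object& a : asteroids) {
        float c = screen_coverage(a, fov);
        std::printf("distance %6.1f -> coverage %.3f -> LOD %d\n", a.distance, c, pick_lod(c));
    }
}

In a real engine something like this would run per object per frame (and on the GPU rather than the CPU), but the point is just that detail follows what the eye can actually resolve.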
 
  • Like
Reactions: ZGR

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
Headfoot said:
Anyone remember that brief period around 2006-2008 where the lighting got a lot better with deferred rendering but we still couldn't push enough polygons and it just looked weird? The overdone 'everything is shiny' factor didn't help either. Gears of War and all the other early UE3 games.

More the other way around: games came with vastly improved textures, models, and polygon counts, but shader advancements were harder for most devs to take advantage of, resulting in "everything is shiny". Deferred rendering didn't become a thing until around 2008, I believe, and got far more mainstream in the decade after.
 
  • Like
Reactions: Arachnotronic

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
Am I right in saying this is essentially using the RT cores to do visibility checks for culling? That's actually a really smart idea. I guess cores designed for the vector work of ray tracing are probably really fast at visibility check calculations; it's kinda the same thing, right? This could be another big move in the right direction for lowering the cost to developers, by reducing the time spent worrying about manual culling during the optimization stage.
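For what it's worth, here's roughly what I mean by "kinda the same thing": asking "is this object visible from the camera?" boils down to the same segment-versus-geometry intersection query that ray tracing exists to answer quickly. A toy CPU-side sketch (the structs and function names are purely illustrative; this is not how the demo or the RT cores actually work):

Code:
// Toy illustration: a "can the camera see this object?" check is just a
// segment/ray intersection query against the scene, i.e. the same primitive
// operation that ray tracing is built around. All names here are invented.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Aabb { Vec3 min, max; };  // axis-aligned bounding box of an occluder

// Classic slab test: does the segment from 'from' to 'to' hit the box?
bool segment_hits_box(const Vec3& from, const Vec3& to, const Aabb& b) {
    float t0 = 0.0f, t1 = 1.0f;
    const float o[3]    = { from.x, from.y, from.z };
    const float d[3]    = { to.x - from.x, to.y - from.y, to.z - from.z };
    const float bmin[3] = { b.min.x, b.min.y, b.min.z };
    const float bmax[3] = { b.max.x, b.max.y, b.max.z };
    for (int i = 0; i < 3; ++i) {
        if (d[i] == 0.0f) {  // segment parallel to this pair of slabs
            if (o[i] < bmin[i] || o[i] > bmax[i]) return false;
            continue;
        }
        float tA = (bmin[i] - o[i]) / d[i];
        float tB = (bmax[i] - o[i]) / d[i];
        if (tA > tB) std::swap(tA, tB);
        t0 = std::max(t0, tA);
        t1 = std::min(t1, tB);
        if (t0 > t1) return false;  // slab intervals don't overlap: miss
    }
    return true;
}

// "Visible" here just means the straight line from camera to target is not
// blocked by any occluder box; if it is blocked, the object can be culled.
bool is_visible(const Vec3& camera, const Vec3& target,
                const std::vector<Aabb>& occluders) {
    for (const Aabb& box : occluders)
        if (segment_hits_box(camera, target, box)) return false;
    return true;
}

int main() {
    Vec3 camera   {0.0f, 0.0f, 0.0f};
    Vec3 asteroid {0.0f, 0.0f, 10.0f};
    std::vector<Aabb> occluders = { {{-1, -1, 4}, {1, 1, 6}} };  // big rock in between

    std::printf("asteroid visible: %s\n",
                is_visible(camera, asteroid, occluders) ? "yes" : "no");
}

Real occlusion culling is a lot more involved than one segment per object, but it shows why hardware that's fast at ray/box and ray/triangle intersections would also be fast at this kind of check.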