You're probably both saying the same thing and talking past one another.
I think what confuses him is that I'm not on nVidia's side. The GeForce FX was an abomination of a DX9 architecture, as I have always said.
In theory they could do that, but it would be cheating.
Namely, as AMD themselves promote it, tessellation should be adaptive.
That means the level of tessellation is determined at runtime by parameters such as the distance or the projected size on screen.
This can (and generally will) change every frame, so trying to buffer the geometry is pretty useless... unless of course you're going to cheat and not actually do adaptive tessellation every frame, but only every X frames, re-using the buffered geometry for the remaining X-1 frames rather than generating a proper adaptive set of geometry.
I don't mean they would break compatibility with DX11 tessellation code. But I'm curious: if you use adaptive tessellation, does the geometry grow linearly with the inverse distance, or is it more like a staircase-shaped growth? If it's the latter, then buffering could still work, since we'd be rendering at approximately 60 frames per second. That's very little forward movement in a single frame. It could still wreck the minimum framerate, however, any time the camera moves too suddenly or shifts to a new position.
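To make the "linear vs. staircase" question concrete, here's a rough Python sketch of how a distance-based tessellation factor is typically computed. This is not real D3D11 hull-shader code; the mapping function and constants are invented for illustration. The answer actually depends on the partitioning mode: D3D11's "integer" partitioning snaps the factor to whole numbers (staircase growth), while "fractional_odd"/"fractional_even" blend between levels, so the geometry changes smoothly per frame.

```python
import math

# Hypothetical mapping from camera distance to a tessellation factor,
# as a typical adaptive scheme might do it: more triangles up close,
# fewer far away. The linear falloff and constants are assumptions.
def tess_factor(distance, near=1.0, far=100.0, max_factor=64.0):
    t = max(0.0, min(1.0, (far - distance) / (far - near)))
    return max(1.0, t * max_factor)

for d in (2.0, 10.0, 25.0, 50.0, 90.0):
    f = tess_factor(d)
    # "Integer" partitioning rounds the factor up, so triangle counts
    # jump in steps; fractional modes interpolate between levels.
    print(f"dist {d:5.1f}: fractional factor {f:6.2f}, integer factor {math.ceil(f)}")
```

So with integer partitioning you'd get exactly the staircase behavior described above, and with fractional partitioning the growth is (piecewise) continuous, meaning the generated geometry really can differ on every frame.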
Kitguru is predicting that Crysis 2 will feature "copious" amounts of tessellation courtesy of NV's $2MM investment in the game--enough to push even the GTX580 to the limit: http://www.kitguru.net/components/g...2-being-re-designed-for-gtx580-expect-delays/
Sounds wonderful!
If it actually makes the game look better and doesn't just hit diminishing returns, sure; but then, I would expect that from you.
Expect what from me? Like a game making use of a major component of DX11? We want DX11 games right?
Sounds good to me!
I'm glad somebody is pushing developers to increase graphics standards.
Didn't you get the memo?
"We" only want AMD DX11 level games :whistle:
You mean we only want games that no card can play with all the options on?
The fastest single GPU at the moment can only manage 50fps at 1920x1200 with 4xAA and most settings on level 3 of 4 (Gamer) in terms of quality.
You think adding lots of tessellation is going to enable ~any~ single GPU to give playable framerates?
If they do add an 'extreme' level, probably only dual GTX580s would be able to run it without chugging horribly.
What the hell does this even mean?
I think it means playable level.
Also apparently tessellation is the only DX11 feature in existence.
Not like Depth of Field in Metro 2033 kills framerates or anything (reducing by 1/3rd is nothing).
Which raises another point. It will be interesting to see whether the new architectures (going back to the original topic) bring anything to other DX11 features, beyond just tessellation.
When something like DOF can kill FPS by a lot, will they make improvements in other areas, or will that just come with a 'better'/bigger architecture/chip?
I believe that we don't need any form of Anti-Aliasing when we have Tessellation.
