[techeye] High-end Kepler -- 2013?


badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
The same way that the entire Fermi generation was a fail by being 6 months late: bringing working 3D gaming, unparalleled tessellation performance (about 2 years ahead of AMD), while achieving 60% market share in the discrete GPU space and giving us awesome price/performance in GTX460 and GTX560Ti cards, and at the same time putting price pressure on AMD, resulting in such gems as the HD6950 2GB at sub-$299? You think AMD would have focused on Tessellation performance improvements in HD6900 and HD7900 if NV wasn't dominant in this area before? Let's consider that competition can often improve the landscape for all of us, even if the competitor does arrive late.

In the current PC era characterized by console ports/game engines that lack realism, it would be amazing to see a GPU maker trying to add some unique features that may encourage game developers to produce more realistic games. Sure, thus far PhysX has not worked out, but that doesn't mean a company should just give up. I'd rather take more realistic physics effects (regardless of whether NV or AMD brings that advantage) than 80-120 FPS in DX9 game engines and/or 20-30 fps in DX11 games with extreme levels of Tessellation (arguably the only standout DX11 feature).

At the end of the day, the HD7900 series did not revolutionize much aside from ZeroCore Power technology. The new cards still cannot cope with heavy Tessellation in games like Metro 2033, struggle in demanding games like Dragon Age 2, and are still more or less unable to pull off Eyefinity on 3 monitors with a single card. That leaves us having to resort to inefficient deferred AA settings in modern game engines that more often than not result in a blur fest and/or massive performance hits in games such as BF3 to extract some value. Are we at a point of diminishing returns though? Fundamentally, games barely look better or feel any more realistic than Crysis 1 from 2007. Maybe I am getting older, but I am no longer impressed by 40% performance increases. I want more realistic PC games. Sure, having a card powerful enough to game on a 30-inch 2560x1600 monitor is a nice option, and being able to play on 3 screens is decent, but that's not improving the realism of gaming whatsoever.

If NV is 6 or even 9 months late but actually adds something that has a long-term potential to improve the gaming experience, it might be worth the investment for them and for all of us gamers (as it might encourage AMD to focus on physics for once). The unfortunate side effect of being late is the lack of pricing pressure on AMD.

However, personally, I don't care about Eyefinity due to bezels, or about 120 Hz PC gaming @ 120 fps. As such, it would be a breath of fresh air if any company actually incorporated some useful features that help improve the realism in games. After the incredible hype behind BF3's graphics/realism, the game is a laughing stock. 5 years after Crysis 1 and the PC gaming industry hasn't moved an inch, aside from more fluid character animations ripped off from EA Sports game engines. Sad, really.

Maybe it's time games started to focus on physics more and worry less about AA and high resolutions so much.

Softimage Lagoa ICE - Mousetrap 1080p HD

and

Real-Time Grass Rendering

and

Physically Guided Animation of Trees

GRAW with Ageia PhysX (not bad)


Or do people want 5 monitor gaming at super high resolutions and 128x AA filters with the same ragdoll and physics effects from 2008?
He lives!!!
 

gladiatorua

Member
Nov 21, 2011
145
0
0
AMD's pricing on 79xx is the biggest clue that Nv's new generation GPU will be late.
And the 79xx's are quite tame for GPU royalty in terms of power consumption and temps. The whole point of a top GPU is that it has the best performance, even if it runs hotter and hungrier than any other GPU.
As I see it, by the time Nv's GPU finally decides to show up, prices on AMD's 7xxx will be much lower and a more powerful refresh will be month(s) away.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
The same way that the entire Fermi generation was a fail by being 6 months late: bringing working 3D gaming
yay, pity it's a gimmick
unparalleled tessellation performance (about 2 years ahead of AMD),
pity tessellation is reduced to some rocks here and there
while achieving 60% market share in the discrete GPU space
they will keep gaining more, since low-end AMD GPUs migrated to the CPU and don't count as discrete anymore
and giving us awesome price/performance in GTX460 and GTX560Ti cards, and at the same time putting pricing pressure on AMD, resulting in such gems as the HD6950 2GB at sub $299?
or was it AMD's pressure that resulted in gems like the GTX460?
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
IF Nvidia would open PhysX to AMD and others and IF it would get traction, this could revolutionize gaming. RussianSensation has a point - realism has been mostly about graphics, but interactivity has not that much improved in the past 10 years. You can shoot at objects today and they don't move a bit. Water is still not 3D but rather shader-enhanced textures. Fog is mostly static and not interactive.

If this worked out, it would be great imo.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91

Nice to see that you're back. While I often disagree with you, your posts are usually thoughtful and interesting to read :)

That said, all these "demanding" games of this generation can be run on a single 6950/560Ti at max settings, if you disable a single setting that exists only to sap framerates and give enthusiasts a reason to get the latest and greatest, while providing next to no visible IQ increase: MSAA in BF3, DOF in Metro, subpixel tessellation in Crysis 2, etc.

It's not that hardware is falling behind, it's that software either goes for the lowest-hanging fruit (consoles) and/or is horribly unoptimised.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
IF Nvidia would open PhysX to AMD and others and IF it would get traction, this could revolutionize gaming. RussianSensation has a point - realism has been mostly about graphics, but interactivity has not that much improved in the past 10 years. You can shoot at objects today and they don't move a bit. Water is still not 3D but rather shader-enhanced textures. Fog is mostly static and not interactive.

If this worked out, it would be great imo.

I'm not sure if realism is such a desirable thing for games or at least all games, although higher interaction, realistic or not, is.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Well, we're not talking complete realism here like looking at a wounded soldier in BF8 and puking all over the keyboard. Baby steps.
Imagine a game like Skyrim where it rains and the rain actually consists of simulated water droplets that interact with objects, coalesce and form small streams that follow gravity. Or nebula/gas that is influenced by winds and particles (like a bullet) that are flying through - btw something that was promised for Unreal 2 almost 10 years ago.
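The droplet behavior described above can be caricatured in a few lines. This is a toy sketch, not any real engine's code; the `step` function, merge threshold, and constants are all made up for illustration:

```python
# Toy 1D rain-droplet sketch: particles fall under gravity and merge
# ("coalesce") when they come within a small distance of each other.
# Purely illustrative pseudophysics, not a real fluid simulation.
G = 9.81          # gravitational acceleration, m/s^2
DT = 0.016        # ~60 Hz timestep, seconds

def step(drops, dt=DT):
    """Advance droplets [(height_m, velocity_m_s, mass_kg), ...] one frame."""
    # Integrate position and velocity for every droplet.
    moved = [(h + v * dt, v - G * dt, m) for h, v, m in drops]
    moved.sort(key=lambda d: d[0])
    merged = []
    for h, v, m in moved:
        if merged and abs(merged[-1][0] - h) < 0.01:   # close enough: coalesce
            h2, v2, m2 = merged.pop()
            total = m + m2
            # Merge into one droplet, conserving momentum.
            merged.append((h2, (v * m + v2 * m2) / total, total))
        else:
            merged.append((h, v, m))
    # Drops that reach the ground vanish (a real sim would feed streams here).
    return [d for d in merged if d[0] > 0.0]

drops = [(2.0, 0.0, 1e-6), (2.005, 0.0, 1e-6), (5.0, 0.0, 1e-6)]
drops = step(drops)   # the two nearby droplets coalesce into one
```

The point of the sketch is the cost structure: even this crude version is O(n log n) per frame, and GPU-style per-particle parallelism is what makes millions of droplets feasible.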
 

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
IF Nvidia would open PhysX to AMD and others and IF it would get traction, this could revolutionize gaming.

Since speculation is rife in these threads and taking into account the rumour in the OP:

What if Nvidia has been focusing on PhysX, but rather than using general-purpose hardware has done something similar to what Intel did with Quick Sync and set aside some die space for a fixed-function PhysX unit? That might enable PhysX games without much of a performance hit.

THEN they might be willing to open PhysX to others, since they would have a major head start with their fixed-function units... Although AMD might be reluctant, since they'd have to take a huge performance hit to enable PhysX.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Well, we're not talking complete realism here like looking at a wounded soldier in BF8 and puking all over the keyboard. Baby steps.
Imagine a game like Skyrim where it rains and the rain actually consists of simulated water droplets that interact with objects, coalesce and form small streams that follow gravity. Or nebula/gas that is influenced by winds and particles (like a bullet) that are flying through - btw something that was promised for Unreal 2 almost 10 years ago.

And after a few seconds looking at that, you are back shooting the other guy and don't give a toss whether the physics engine is a complete simulation of reality, or just well-elaborated scripts, or a mix of scripts and simulation that is enough to deceive your senses. In the end it's just fluff, unless the game is about raindrops or nebulas.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
And after a few seconds looking at that, you are back shooting the other guy and don't give a toss whether the physics engine is a complete simulation of reality, or just well-elaborated scripts, or a mix of scripts and simulation that is enough to deceive your senses. In the end it's just fluff, unless the game is about raindrops or nebulas.

You know you could say that about almost every single graphical effect, right? Think further how that could impact gameplay. You can spot enemies by the movement they cause in fog. There could be water puzzles where you have to distribute a certain amount of water into cups to trigger some weight plates. Or having a tsunami wave that hits. Possibilities are manifold if you let your imagination fly.

And then there is also immersion: the ability to create not a realistic but a believable world, where you don't immediately think "oh man, this looks so ridiculous, well it's just a game".
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Maybe it's time games started to focus more on realistic physics and worry less about obscene levels of AA and high resolutions.

Softimage Lagoa ICE - Mousetrap 1080p HD

and

Real-Time Grass Rendering

and

Physically Guided Animation of Trees

GRAW with Ageia PhysX (not bad)

Or do people want 5 monitor gaming at super high resolutions and 128x AA filters with the same ragdoll and physics effects from 2008?


The problem with some of those videos:
http://www.youtube.com/watch?v=5gkbZldbjHw&list=UUp7zhrJwQv9bBp8UyQoqXtw&index=9&feature=plcp

18 hour simulation time. 6 hour render time
My gawd! 11 seconds of animation take 24 hours to produce before being saved as an AVI. Some of the others have taken days to do... DAYS. When it's possible to do it in realtime, maybe then it'll show up in a few games (as more and more people get hardware that can do it), but I think we're pretty far away from that stage atm.

In short... not possible to use in games (at these levels/graphics).



1 tree... most games have more than 1 tree in them.

This took 20 PCs to do in realtime, some of them $10,000 Mac Pro rendering machines.
i.e. the guy used 20 x $10,000 machines to do this short clip, with 1 tree.
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Hm, this is done on CPUs, right? 8-16 CPU cores vs. 2000 SIMDs would make quite a difference, wouldn't you say?
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
How much better are dedicated PPUs in comparison with GPUs when handling physics (i thought GPUs were pretty good at it)?

Nice to see you back russian
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
How much better are dedicated PPUs in comparison with GPUs when handling physics (i thought GPUs were pretty good at it)?

Nice to see you back russian
They weren't much better. In fact I hope no one is taking this article seriously, as there are so many bad technical points that it's obvious they're making stuff up.

Ageia's PPU was just a vector processor with cache coherency; there was nothing else special about it. In terms of features GPUs have long exceeded that, and there's no reason that I know of that you'd ever need "Ageia" hardware to do physics (particularly kinematics, which is what Ageia's hardware was built for). What is a GPU but a stupidly fast matrix multiplication engine?
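The "stupidly fast matrix multiplication engine" point can be made concrete. A hedged sketch (not Ageia's actual pipeline): batched kinematics for N particles is just two fused multiply-adds over big arrays, here with NumPy standing in for the GPU's SIMD lanes:

```python
import numpy as np

# Semi-implicit Euler for N particles at once. The whole per-frame update
# is elementwise array math -- exactly the workload wide SIMD hardware
# is built for. All values are illustrative.
def integrate(pos, vel, acc, dt):
    """pos, vel, acc: (N, 3) arrays. Returns updated (pos, vel)."""
    vel = vel + acc * dt          # v' = v + a*dt
    pos = pos + vel * dt          # x' = x + v'*dt  (semi-implicit Euler)
    return pos, vel

N = 4
pos = np.zeros((N, 3))
vel = np.zeros((N, 3))
acc = np.tile([0.0, -9.81, 0.0], (N, 1))   # gravity on every particle

pos, vel = integrate(pos, vel, acc, 1.0 / 60.0)
```

Scale N from 4 to a few million and nothing about the code changes, which is why there was never anything a dedicated PPU could do here that a GPU couldn't.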

Second, the idea that Kepler needed porting from 32nm is rubbish. Kepler was always a 28nm product. NVIDIA's 2 year cycle meant it wouldn't be out until late 2011 at the earliest, so why would they even begin designing it around 32nm? Never mind the fact that the 32nm process was canceled over 2 years ago, in November of 2009, and it was clear it was off the rails long before that.

I have no idea who these Techeye guys are, but they suck at making stuff up.
 
Last edited:

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Yeah, but at the same time, high-end Kepler is nothing more than a theoretical GPU design now. There's no working silicon. At this point it looks like a 2H hard launch.
 

96Firebird

Diamond Member
Nov 8, 2010
5,742
340
126
It amazes me how so many people know what is going on behind closed doors. I want to know all this super secret stuff too! :|
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
What you cannot see doesn't exist, according to some people. Which honestly doesn't make sense.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
If there were working engineering samples we would've known by now.

Or do you think NVIDIA is being completely silent about Kepler because they want to? No, it's because they have nothing to show.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
IF Nvidia would open PhysX to AMD and others and IF it would get traction, this could revolutionize gaming. RussianSensation has a point - realism has been mostly about graphics, but interactivity has not that much improved in the past 10 years. You can shoot at objects today and they don't move a bit. Water is still not 3D but rather shader-enhanced textures. Fog is mostly static and not interactive.

If this worked out, it would be great imo.

hmm, give me one PhysX game that has more interactivity than BF3.

I think the next step in realism is not PhysX but raytracing; that will revolutionize our gaming.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
If there were working engineering samples we would've known by now.

Or do you think NVIDIA is being completely silent about Kepler because they want to? No, it's because they have nothing to show.

It's hard to say either way. I know with Fermi, Nvidia was eager to talk even 6 months before release.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
If there were working engineering samples we would've known by now.

Or do you think NVIDIA is being completely silent about Kepler because they want to? No, it's because they have nothing to show.

You have no way of proving that.

hmm, give me one PhysX game that has more interactivity than BF3.

I think the next step in realism is not PhysX but raytracing; that will revolutionize our gaming.

PhysX has the potential. The problem right now is that it's proprietary and not getting traction. I'm talking about the future, not the past or present.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Unless you're the type to spend $600 on a video card, this year is going to suck without Kepler.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
hmm, give me one PhysX game that has more interactivity than BF3.

I think the next step in realism is not PhysX but raytracing; that will revolutionize our gaming.

We get maybe 1 game every 6 months that makes use of PhysX, and the performance hit is huge. At least on a single GTX 580, it is not usable in Metro 2033 or Batman: AC unless you enable SLI. In the Batman: AC in-game benchmark in particular, the framerate nearly halves in the scene towards the end with the glass breaking (unless you have SLI). So if you're at a baseline of 60 fps and PhysX chops it to 35-40, thanks but no thanks.
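The arithmetic behind that complaint is worth spelling out: FPS drops look dramatic, but the honest unit is added milliseconds per frame. A quick sketch using the numbers from the post above:

```python
# Convert the FPS figures above into per-frame cost. The takeaway: a
# 60 -> ~37 fps drop means PhysX is eating roughly 10 ms of every frame.
def frame_ms(fps):
    """Milliseconds spent on one frame at a given framerate."""
    return 1000.0 / fps

baseline = frame_ms(60)              # ~16.7 ms per frame without PhysX
with_physx = frame_ms(37)            # ~27.0 ms per frame (midpoint of 35-40)
physx_cost = with_physx - baseline   # ~10.4 ms added per frame by PhysX
```

That ~10 ms is more than half the entire 60 fps frame budget, which is why offloading it to fixed-function hardware would matter.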

So with that said, if Kepler adds on-die hardware to improve this situation, I welcome it. It appears NVDA is opening it up for *all vendors* to use, so it will work on all hardware and even multi-platform releases. Good news if true. The only bad thing is, if any of this is true -- Kepler will have significant delays.
 
Last edited: