
Charlie at Semiaccurate says: Physics hardware makes Kepler/GK104 fast

I was more rolling my eyes at the "The overclock is @1125 MHz core (something like every 7970 can do)" part. If "like every" 7970 could do it, AMD would have clocked the 7970 higher.

Or it means AMD is holding back a 7980 that will run at 1125 or more and compete better with whatever Nvidia releases. From what I've read, pretty much every 7970 can hit those clocks easily.
 
The 590s are hand-picked anyway, so a hand-picked 7970 may not be a bad comparison, lol

I think most have overclocked quite well, or we would have heard an outcry.
 
Only if it fit their TDP target, which it didn't. Again, even at stock clocks, the 7970 is substantially faster as long as one forces MSAA to override the in-game MSAA.

Why should one have to run a game differently than the developer intended just to beat a last-gen part? You should be able to run a game, set the in-game settings to max, and enjoy next-gen performance.

The fact that the GTX 580 is even in the same stratosphere says a lot about the 7970's subpar performance.
 
I wonder if one of the games where it smokes a 7970 is BF3? Throw out all the other benchmarks then, IMO. If buyers can get a card for $300 that smokes the 7970 in BF3, then this is a game changer. It wouldn't surprise me either, since BF3 favors nvidia cards. That's a big reason why I didn't even consider the 79XX series: its subpar performance in BF3.

The 7970 doesn't have subpar performance in BF3. While it isn't a game where the card shows its brute strength relative to where it should be, there is simply no other single-GPU card that can touch it, and it even breathes down the neck of the 590 when overclocked.

If you actually owned the card, or were even able to tinker with one firsthand, your tone would change from hater to believer. I realize the card doesn't do 3D Vision and you are invested in the tech (as am I), but that doesn't change the fact that the card performs well, ahem... very well, in most situations.

I have pitted *my* 7970 against *my* GTX 580, which has long since been sold, but I know exactly what it was capable of, as I had it for a year. The 7970 is hands down the better card in virtually all metrics. As it should be.

Comparing to Kepler is a non-issue right now, because there is way too much FUD out there to grasp onto anything as fact and claim "see, the 7970 blows, Kepler is going to pwn."

I remember all too well some people lighting up the forums over how Cypress wasn't all it was cracked up to be. The card wasn't double the performance of a GTX 285, not by a long shot. Certain members (not going to name names) were bashing the card because Fermi was going to annihilate everything. We all know how that turned out: http://www.anandtech.com/show/2977/...-gtx-470-6-months-late-was-it-worth-the-wait-.

IF ANYTHING, Kepler is likely to best the 7970 by 10-15%. Is that good? Of course; more performance is always good. The real question is: at what cost does that 10-15% more performance come? Timing, price, temps, and power consumption. Those are the factors, in the order that matters most to me.

When I want a card, I want it. I was bored with my 580, and the 7970 looked promising, so I bit. I am absolutely satisfied with my purchase. The card took less than 10 seconds to hit 1125/1575 clock speeds and has decimated my 580 in the titles I am/was playing.

Metro 2033 went from 44 fps average to 72. HUGE!!! My 580 was texture-thrashing in Skyrim due to my texture mods (which made the game beautiful versus the terrible stock textures); the 7970's huge frame buffer fixed that. I just played through Crysis Warhead again at 2560x1600 with 4xMSAA and Adaptive (foliage) AA turned on, something my 580 was *not* capable of.

IF one is looking for the highest-performing single-GPU card on the market (some of us dislike multi-GPU), there is no question what card to buy *NOW*.
 
Why should one have to run a game differently than the developer intended just to beat a last-gen part? [...]

troll 🙄


No personal attacks, please.

Either make a counterargument/refutation, or, if you have none to offer, let them be. Don't resort to personal attacks.

Moderator jvroig
 
The 7970 doesn't have subpar performance in BF3. [...]

You make a compelling argument. Does sound like a fun card to play with. 🙂

It's not so much that I'm invested in 3D as that I enjoy it. I wish 3D were an open standard, but it's not.
 
Why should one have to run a game differently than the developer intended just to beat a last-gen part? [...]

You are talking about 1 game! ONE! To top it off, as pathetic as it is, AMD hasn't even released a WHQL driver for the card aside from the drivers on the disc.

What about games with PhysX? You can't run those games with equal settings, can you? What does that say about the 7970? I guess I should just throw mine in the garbage.

I almost want Kepler to be a letdown. Not so it justifies my purchase; I couldn't give a crap. I'll just sell this 7970 and pick up one of them.

I just don't get why you say what you say. I feel like it is almost baseless.
 
You make a compelling argument. [...]

I too wish 3D was an open standard. I make something up every time the wifey comes up behind me and asks why I'm not using 3D (she bought it for me as a Christmas present last year).

I also wish PhysX was open, as I really want to play Arkham City with it enabled. However, it is just *ONE* game. I am going to get a GTX 285 though and use the PhysX hack to run it. The things we enthusiasts have to do to get the best of both worlds =)
 
I think I got everyone who responded to my comments – if not, my apologies 🙂

It's not bad. If the rumor is true, it's just not what most people expected.

Thinking it over now, I really think this would be a kickass card for $300. I just have to see which NV-specific games really pan out performance-wise when it ships. As it is now, if it's only faster in PhysX titles, that would be REALLY underwhelming.

Unless you only play Batman.


Why? Again, am I the only one who doesn't see anything wrong with a free performance boost in specific scenarios? Here is how I see it:

I have a $300 budget for a GeForce card for my girlfriend (and before anyone asks why: she is an nV fangirl, so don't suggest anything else).

She has a GTX 460 SC. In that ballpark I can buy her a GTX 570, OR get a GTX 660 that offers similar performance (guesstimate) to a GTX 570 (again, ignore the "oh, generation switch, power/perf ratios, blah blah", because if that is an issue now, the AMD hoorahs are all hypocritical) with lower temps, lower power consumption, and in specific games/scenarios WAY MORE PERFORMANCE.

For someone who is invested in that ecosystem this is, I'll say it again, a HUGE perk. I'm a red ant, and I'd say that's something that makes me jealous.



It's not WRONG, it's just a dead end. PhysX has no traction right now; we are averaging one PhysX title per... 6-10 months? It's kind of like back in the day with the Sound Blaster 16, when games had to have *specific* support for it. Games would ship with a list of which sound cards they supported, e.g., Sound Blaster 16, Gravis, etc. Seeing PC graphics degenerate into that wouldn't be cool.

Why not open it up? Since 99% of games are multi-platform, most developers will not develop for it unless it can benefit all platforms.


Again, in my example I said "performs like a $300 card 95% of the time and a $450 card 5% of the time" because I'm not an idiot; I know where PhysX stands and how poor its penetration is. However, I won't ignore that some games use it, that some of those games are fun, and that for those specific games an Nvidia user gets a nice perk.

Why does that bother people? Why not open it up? Wait, so it's okay for one company (AMD) to be pro-company (i.e., high prices on parts) and anti-consumer, but not the other? Are you kidding me?


Are you new to the PC (gaming) world? Driver compatibility has been an issue since the dawn of multi-vendor parts. There was a reason some of us spent extra money on the EAX cards or the name-brand products, and it wasn't because we liked to wag our e-peen. I still have driver issues with my FX Audigy Platinum "Your-Mother-Edition", but it's still worlds better than the issues I had with my ASRock Realtek integrated audio part.

Because the pricing of the $450 part and the expected 78xx is based on no competition. Unless AMD gets some unusual driver optimizations, the paper specs of the 78xx cards don't appear to be much better performance-wise (10-20%) than the 40nm cards in the same price bracket they will be replacing.

It is looking more and more like we consumers are going to experience price/performance stagnation at 28nm. I think this is one of the reasons AMD was in a hurry to get 28nm cards out the door: this way they have placed the onus on NVIDIA to adjust pricing. The GTX 580 remains obstinately in the $450-500 bracket.


Again, I already said to throw out this constant debate that follows prior trends. We already know things are changing (i.e., we aren't seeing the "usual" shifts in performance/price we've come to expect over the last two to three years). You guys flip-flop so much it's crazy.

Again, if this is a $300 part that performs like a $300 part 95% of the time and a $450 part 5% of the time (again, for anyone who missed why I'm using a $450 part: the article says Tahiti, but not whether it's Tahiti or Tahiti XT), that is a nice perk for nVidia users. A really nice perk. For those select few games, which they'll play anyway, to be given a little performance boost, free mind you, why the hell not say "that's freaking sweet"?
 
HahahahahHAHAHAHA OBR is calling out Charlie, specifically. The guy over there is saying Charlie copied his previous blogs about Kepler, and that in fact all (or most) of the information in both the original article and Charlie's is false. He even goes as far as to say who Charlie's source is.

First blog, where OBR accused Charlie of getting some of its info wrong: http://www.obr-hardware.com/2012/01/some-new-info-about-kepler-and-chiphell.html

New blog: http://www.obr-hardware.com/2012/02/semiaccurace-is-absolutely-wrong-about.html

I love controversy! Truth is, I think Charlie's latest article and explanation of how and what GK104 will do is too narrow and full of misinformation. I also think OBR is probably exaggerating GK104's performance claims. I can't wait to see this card in action. So much FUD flying everywhere and getting eaten up by anyone willing to believe anything.
 
I too wish 3D was an open standard. [...]




Unfortunately, if the rumors are true, it appears Nvidia is moving further toward their locked standards and attempting to push AMD out of the market with proprietary techniques.

I'd prefer open standards and hope for surprises, maybe even other players entering the market. If Intel ever got serious about graphics and pushed open standards, things would change and Nvidia couldn't do this.

With this rumor, a more tightly integrated GPU/PhysX, and perhaps even a GPU reliant on TWIMTBP, we're moving into a whole new world of vendor lock-in and division.
 
HahahahahHAHAHAHA OBR is calling out Charlie, specifically. [...]





I swear the graphics market is better than a soap opera.
 
HahahahahHAHAHAHA OBR is calling out Charlie, specifically. [...]

OBR is about as objective as Fox News; he has a strong, documented hate for AMD. His site has anti-AMD BS all over it. He did at least 20 articles in the past year stating that the 7970 would be no faster than the 6970.

On the other hand, Charlie hates nvidia as much as OBR hates AMD. Does their hate cancel each other out?

OBR also posted fake pictures and slides of GK104 a while back. I dunno. Charlie is sensational and hates nvidia, but over the past year most of his stuff has been true. Unless anyone can prove otherwise? Fun times!
 
Unfortunately, if the rumors are true, it appears Nvidia is moving further toward their locked standards and attempting to push AMD out of the market with proprietary techniques. [...]

I completely agree with this, but people thinking that the company with the most invested in proprietary systems is going to be the first to push for an open system are out of their minds.

AMD, and others, have to step it up. The only reason PhysX is even an issue is that AMD has yet to show anything we gamers can use. Woot, it's in some movies; that doesn't help me.
 
I should also add that the bit about on-die PhysX coprocessing was also uncovered by TechEye. Their reporting is largely accurate.

So it appears that this revelation about Kepler IS true.
 
I completely agree with this, but people thinking that the company with the most invested in proprietary systems is going to be the first to push for an open system are out of their minds. [...]

Pfft. Is this a joke? PhysX gets one game every 10 months and is largely a joke, UNLESS you only play Batman. Further, AMD does support all of the industry standards, such as OpenCL. NV is hell-bent on keeping PhysX as an NV value-added option, but it's really rubbish because developers aren't adopting it. And that will NOT change, because 99% of AAA titles are multi-platform.
 
Wow! This is no longer about AMD and Nvidia; it's about Charlie and OBR!
What a boring time for GPU enthusiasts to live in...
 
Pfft. Is this a joke? PhysX gets one game every 10 months and is largely a joke, UNLESS you only play Batman. [...]

/facepalm

Try reading my post again; let me just summarize my point once more:

It will perform like a $300 card 95% of the time and a $450 card 5% of the time. If AMD offered something similar, I’d be saying the exact same thing “that is a sweet perk.”

I don’t get how anyone can even spin this as a negative. TWIMTBP games will still exist, just as Gaming Evolved games will, and if AMD does something similar where Gaming Evolved games get magical pixie dust optimization guess who’ll be the first one to say “that’s freaking awesome.” Hint: look in the mirror. 😉
 
Why should one have to run a game differently than the developer intended just to beat a last-gen part? [...]

Because not all games are developed equally. If developers target one GPU over another, the other is going to end up with worse performance. In this case, the way MSAA is implemented favors nVidia over AMD. But that's not entirely AMD's fault, as they don't have control over developers.
 
Pfft. Is this a joke? PhysX gets one game every 10 months and is largely a joke, UNLESS you only play Batman. [...]

He is right, though. There IS an alternative; the problem is, nobody is using it. One out of 10 games is better than 0 out of 10 games, don't you think? But it is more convenient to bash the pioneers than to lift one's own butt and provide a competitive solution. As long as there is no competition, PhysX is fine the way it is.
 


/facepalm. Try reading my post again [...]

My bad, skimmed over the post
 
My bad, skimmed over the post

Don't get me wrong: that isn't enough for me to switch from my AMD setup to nVidia; I just think it's a really nice perk (EDIT: if true, of course).

The fragmentation we all fear already exists, and has for a while: DirectX vs OpenGL, Intel vs AMD, nVidia vs AMD, and, as you stated, the sound card issues.

We always brag about the options we PC gamers have, but it seems we also like to cry foul when we're on the losing side of a switch. But hey, it's like sports: we root for our fave teams!
 
Let's clear up a few things, because I believe even Charlie has messed up big time.

What is NVIDIA PhysX Technology?

NVIDIA® PhysX® is a powerful physics engine enabling real-time physics in leading edge PC games. PhysX software is widely adopted by over 150 games and is used by more than 10,000 developers. PhysX is optimized for hardware acceleration by massively parallel processors. GeForce GPUs with PhysX provide an exponential increase in physics processing power taking gaming physics to the next level.

Game Physics

Computer animation physics or game physics involves the introduction of the laws of physics into a simulation or game engine, particularly in 3D computer graphics, for the purpose of making the effects appear more real to the observer. Typically, simulation physics is only a close approximation to real physics, and computation is performed using discrete values.

In order to run NVIDIA PhysX in a game, you need an NVIDIA card and a game that supports PhysX, like Batman: AC.

All game engines have physics; some use their own in-engine physics and others use proprietary physics engines like NVIDIA PhysX.

Examples.

Batman: AC uses NVIDIA's proprietary PhysX engine to compute and simulate (through NVIDIA's CUDA architecture) real-time physics inside the game, like fog, fragment movement, etc.

The BF3 game engine (Frostbite 2) incorporates its own physics engine to calculate bullet ballistics, vehicle physics, gravity forces, etc. If I'm not wrong, it runs on the CPU.

The only difference is that NVIDIA PhysX is coded to run on CUDA-architecture hardware (from the GeForce 8800 onward), while other physics engines can run on any GPU or even on the CPU. (CUDA can run on the CPU too, with PGI CUDA-x86.)

So, when we talk about physics we are talking in general about the physics inside the game, and when we talk about PhysX we are talking about NVIDIA's physics engine.

Both do the same job, but PhysX needs to be implemented in the game and requires an NVIDIA card to run it.
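
To make the "discrete values" point above concrete, here is a toy sketch of the per-frame timestep that every physics engine computes, whether it runs on the CPU (like Frostbite 2's own engine) or on the GPU through CUDA (like PhysX). This is my own hypothetical illustration, not PhysX or Frostbite code:

```cpp
// Hypothetical sketch of a discrete physics timestep; not the PhysX API.
#include <cstdio>
#include <vector>

struct Body {
    float px, py, pz;  // position (m)
    float vx, vy, vz;  // velocity (m/s)
    float invMass;     // 1/mass; 0 marks static geometry
};

// Semi-implicit Euler: update velocity first, then position.
// dt is the discrete timestep, e.g. 1/60 s for a 60 fps game.
void stepSimulation(std::vector<Body>& bodies, float dt) {
    const float g = -9.81f;  // gravity (m/s^2)
    for (Body& b : bodies) {
        if (b.invMass == 0.0f) continue;  // static bodies never move
        b.vy += g * dt;                   // accumulate gravity
        b.px += b.vx * dt;                // advance position
        b.py += b.vy * dt;
        b.pz += b.vz * dt;
    }
    // A real engine would follow this with collision detection and a
    // constraint solver; the result is only a close approximation of
    // real physics, computed at discrete steps.
}

int main() {
    std::vector<Body> bodies = {{0, 100, 0, 5, 0, 0, 1.0f}};  // one falling body
    for (int frame = 0; frame < 60; ++frame)                  // simulate one second
        stepSimulation(bodies, 1.0f / 60.0f);
    printf("after 1 s: y = %.2f m\n", bodies[0].py);
    return 0;
}
```

Whether that loop runs on the CPU or as a CUDA kernel is an implementation detail; the difference with PhysX is simply which hardware is allowed to run it.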


Now,

From what I understand, I believe what Charlie is trying to communicate is that in games that need a lot of computational performance (DX11 tessellation, compute shaders, etc.) Kepler will shine, and when a game doesn't implement such features, Kepler's performance will fall.
 
Saw this over at OCN from a programmer involved with this stuff:

It wouldn't just be for any physics; it would only be for GPU physics. I guess this could be dedicated hardware for any GPU-based physics, but right now that would pretty much just be PhysX. Physics engines like Havok are purely CPU-based. GPU-based physics are alright, I guess, but they're like graphics: only important for visuals. GPU-based physics cannot affect gameplay objects like the player or things the player can interact with. I really don't see the point of this; it seems like an unnecessary waste of GPU die space, unless that hardware can be used for things other than GPU-based physics.

I guess my point is, yes this could be for "generic physics", but only if it's GPU-based effects, so that won't help in any situation except PhysX at the moment.

The net effect is that if the rumor is true, the entire thing is worthless. NV should aim for better across-the-board performance. This will not benefit anything except PhysX, and broad adoption is a pipe dream because all games are multi-platform now.
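
To see what the quoted programmer means by "GPU-based effects", here is a minimal hypothetical CUDA sketch (an illustration only, not actual PhysX code): the particle buffer lives on the GPU and is handed straight to the renderer, never read back by gameplay code, which is exactly why this class of physics can't affect anything the player interacts with.

```cpp
// Hypothetical effects-only GPU particle update in CUDA; not PhysX code.
#include <cstdio>
#include <cuda_runtime.h>

struct Particle { float x, y, z, vx, vy, vz; };

__global__ void updateParticles(Particle* p, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vy -= 9.81f * dt;    // gravity
    p[i].x  += p[i].vx * dt;  // integrate position
    p[i].y  += p[i].vy * dt;
    p[i].z  += p[i].vz * dt;
    if (p[i].y < 0.0f) {      // cheap bounce off the ground plane
        p[i].y  = 0.0f;
        p[i].vy *= -0.5f;
    }
}

int main() {
    const int n = 1 << 16;  // 65,536 debris/fog particles
    Particle* d_p = nullptr;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));
    // One 60 Hz frame; in a game this runs every frame and the buffer
    // goes straight to the renderer, never back to CPU-side game logic.
    updateParticles<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    cudaFree(d_p);
    printf("stepped %d particles on the GPU\n", n);
    return 0;
}
```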
 
Let's clear up a few things, because I believe even Charlie has messed up big time. [...]





I hope that is the case. We need a good $300 card that offers us something the previous generation didn't. We already saw something similar in Civ5, with the multithreaded driver nvidia uses vs. AMD's. So I'd be interested to see if these Kepler chips are doing something similar.
 