[Xbit] Nvidia: GeForce "Kepler" Graphics Processors Will Be Unbeatable.


skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I had to LOL at the tags at the bottom there, that is pretty freaking funny, especially "more fighting" and "nobody wins", cause isn't that the truth about threads like this one.

Nvidia really is kinda late to this party and it sucks. Me, I would have loved to SLI my old GTX 560, which was a fun card, but the freaking hassle of SLI isn't worth it, nor the heat. And their crown jewel, the GTX 580, is still a punishing $449+ at retail, right in 7950 territory, LOL. AMD came through first and conquered my business with a 7970, since Nvidia stubbornly couldn't at least drop the GTX 580's MSRP for the avid Nvidia fans who have to tough out a wait now.

Terrible market right now. I freaking hope it gets exciting quickly, cause the GK104 rumors don't really sit well with people who want to buy a 7000 series Radeon or may be considering one.
 
Last edited:

deadken

Diamond Member
Aug 8, 2004
3,199
6
81
Wow, an "Unbeatable" video card.

Do we finally have a future-proof PC component?!? It'd stand to reason that it can't ever be beaten. Therefore, buy one now, and never have to worry about changing it later.
 

atticus14

Member
Apr 11, 2010
174
1
81
As a performance-per-$ fan, I hope Nvidia does give AMD a good spanking in price/performance; prices adjust accordingly, and then I can pick out which card I want once the initial rush of premium pricing is over.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106


(574 - 160) x 2 = 828 watts



Let's ignore the fact that an 800 watt single GPU is pretty much impossible, but since the 7970 was mentioned, to the defense we shall go!

And before this blows up: clearly the 680 using double the power of a 7970 is a logical fallacy, but make sure you take my half-cocked, totally non-serious post as something it wasn't meant to be instead of addressing the post I was replying to, with your passionate desires to rout misinformation out of the internet.

"Passionate desires to *rout misinformation on the internet"


*A rout is commonly defined as a chaotic and disorderly retreat or withdrawal of troops from a battlefield, resulting in the victory of the opposing party, or following defeat, a collapse of discipline, or poor morale. A routed army often degenerates into a sense of "every man for himself" as the surviving combatants attempt to flee to safety. A disorganized rout often results in much higher casualties for the retreating force than an orderly withdrawal. On many occasions, more soldiers are killed in the rout than in the actual battle. Normally, though not always, routs either effectively end a battle, or provide the decisive victory the winner needs to gain the momentum with which to end a battle (or even campaign) in their favor. The opposite of a rout is a rally, in which a military unit that has been giving way and is on the verge of being routed suddenly gathers itself and turns back to the offensive.


LOL! What? Get a grip. Besides, what does that graph have to do with the graph you originally posted? Post what you mean and mean what you post.


Please avoid letting the discussion fall to a personal level ("you vs another member" thing). Stick to the issues, debate the topics. Do not debate personalities.

"LOL! What? Get a grip" falls into the category of "personal". It is clearly outside of the scope of the thread and needlessly personal, especially when you allot a big block of text for something that has no bearing on the thread in an effort to shame your sparring partner. You would have been better off just posting "what does that graph have to do with the graph you originally posted? Post what you mean and mean what you post"
all by itself

Please avoid turning the thread into a personal war of words in the future. Don't let stuff sink to a personal level - debate the topic, not the personalities.

Moderator jvroig
 
Last edited by a moderator:

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Bulldozer was going to be unbeatable. UNBEATABLE...

There was no reason to believe that Bulldozer would magically make AMD processors jump ahead 3 years to beat Intel. nV consistently has the fastest single chip when it is released, so there could be some truth to it.

Although the 7970x2 (or whatever it will be called) is going to be rather hard to beat as well.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
You getting the midrange kepler or enthusiast kepler?
It depends on how much faster the mid-range is than a GTX 580 (if at all), and how late the high-end is. But I usually aim for the fastest single GPU available.
 

kidsafe

Senior member
Jan 5, 2003
283
0
0
Although the 7970x2 (or whatever it will be called) is going to be rather hard to beat as well.
Going to assume performance will once again be limited by heat more than anything. Perhaps the HD 7990 and/or dual-GPU Kepler card reference designs will see triple-wide cooling solutions? Unlike 2x8-pin PEG connectors, that's actually within the PCIe specification.
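
For anyone wanting to sanity-check that spec claim, here is a minimal sketch of the PCIe CEM power-budget arithmetic. The per-connector figures (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin, with 300 W as the top board-power class) are from the spec; the little helper function is just illustrative:

```python
# Minimal sketch of PCIe CEM power-budget arithmetic (illustrative only).
# Spec figures: x16 slot 75 W, 6-pin plug 75 W, 8-pin plug 150 W,
# with 300 W as the highest official board-power class.

SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150
SPEC_LIMIT_W = 300

def board_budget(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Total power a card may draw from the slot plus auxiliary plugs."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# 6-pin + 8-pin (typical single-GPU flagship): 75 + 75 + 150 = 300 W, at the limit.
print(board_budget(six_pins=1, eight_pins=1))  # 300

# 2x 8-pin (HD 6990 / GTX 590 style): 75 + 2 * 150 = 375 W, over the 300 W class.
budget = board_budget(eight_pins=2)
print(budget, "over spec" if budget > SPEC_LIMIT_W else "within spec")  # 375 over spec
```

So a triple-wide cooler is a mechanical question the spec can accommodate, while 2x 8-pin already busts the 300 W electrical class on paper.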
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
6990 vs 590 was more of a tie; AMD had the advantage of lower AF quality, and of course who can forget AMD's "optimized" tessellation.

It most certainly was not a tie; GTX 590s needed a quickly released update to lower the clocks because the cards were burning up.

And no I'm not saying that all GTX 590s were burning up either.

AMD left the headroom because wattage increases at a very fast rate when overclocked with voltage. [H] reviewed one at 1300 MHz and it was drawing over 400 watts in actual gaming, and that isn't total system draw.

As much flak as Fermi got, it's clear why AMD decided not to push the clocks, as the stock wattage is already close to 220 W; an increase of around 50% in core clock plus a RAM OC results in a near 100% increase in power draw.
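
A rough model makes that scaling claim plausible. Here is a minimal sketch using the standard CMOS dynamic-power approximation (power scales roughly with f x V^2); the 925 MHz stock clock is the reference HD 7970 spec and 1300 MHz is the [H] overclock mentioned above, but both voltage figures are assumptions for illustration, not measurements:

```python
# Sketch: why a ~40-50% core overclock with a voltage bump can come close
# to doubling board power. Uses the CMOS dynamic-power approximation
# P ~ f * V^2. The voltages below are assumptions, not measured values.

def relative_power(f_ratio: float, v_ratio: float) -> float:
    """Dynamic power scales ~linearly with clock, ~quadratically with voltage."""
    return f_ratio * v_ratio ** 2

base_clock, oc_clock = 925, 1300   # MHz: reference HD 7970 vs the [H] overclock
base_volt, oc_volt = 1.175, 1.30   # volts: assumed stock and OC voltages

scale = relative_power(oc_clock / base_clock, oc_volt / base_volt)
print(f"~{scale:.2f}x dynamic power")  # ~1.72x

# Leakage also rises with voltage and temperature, so landing near 2x total
# board power (roughly 220 W stock -> 400+ W overclocked) is plausible.
```

The model ignores the memory overclock and leakage, which is why the measured jump can exceed what the dynamic-power term alone predicts.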

Here is the review you're talking about from [H].

No, the reason Fermi got flak from everyone is that it was hot and power hungry with minimal gain over AMD. In fact, the GTX 480 used more power at load than the HD 5970.

I don't mind a card being power hungry when it delivers on the performance.

That OC'd HD 7970 was running at 1.3 GHz when consuming a little over 400 watts. If the top-tier GTX does similar, I won't complain.

AMD left themselves wiggle room instead of hitting the 300 watt PCIe limit. They can drop another card if need be, and I hope they do to maintain a lead this round.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Even if AMD released a card 20% faster than the 7970, that would still only mean 40-45% faster than a GTX 580. Do you actually think Nvidia is going to be less than 40% faster than the GTX 580 with their fastest single-GPU card? I am guessing even the GTX 670 could be 40% faster than the GTX 580. Well, at least what was originally planned to be the GTX 670 certainly was.

If Nvidia does not roll out a new arch, I do not see them being 40% faster than the GTX 580. Understand that I'm also factoring in that Nvidia doesn't have the best track record with new process nodes.

As it stands now, the GTX 580 is only 10% - 15% faster than the HD 6970. I can see Nvidia rolling out a big chip that will beat the stock HD 7970 by a slim margin.
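
Since several relative-performance numbers get stacked in this exchange, note that they compound multiplicatively rather than add. A quick sketch, using the posters' own rough estimates rather than benchmark data (the ~17-21% band is the spread implied by the quote's 40-45% figure):

```python
# Sanity check of the compounded performance claims above. All input
# percentages are the posters' rough estimates, not benchmark results.

def compound(*gains: float) -> float:
    """Combine relative gains multiplicatively: +20% on +20% is 1.2 * 1.2 - 1 = 44%."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# The quoted math: a card 20% faster than the HD 7970, with the 7970 taken
# as ~17-21% faster than a GTX 580, lands at roughly 40-45% over the 580.
print(f"{compound(0.20, 0.17):.0%}")  # 40%
print(f"{compound(0.20, 0.21):.0%}")  # 45%

# The reply's counterpoint: if the GTX 580 leads the HD 6970 by only
# 10-15%, a 40%+ single-generation jump over the 580 is a big ask.
```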

Going to assume performance will once again be limited by heat more than anything. Perhaps the HD 7990 and/or dual-GPU Kepler card reference designs will see triple-wide cooling solutions? Unlike 2x8-pin PEG connectors, that's actually within the PCIe specification.

I doubt staying in spec will happen; AMD designed both the 5970 and 6990 to go over spec from the beginning.
 

Piano Man

Diamond Member
Feb 5, 2000
3,370
0
76
Usually the louder that nV talks, the worse their products will be. Remember when they released the 5xx series with like 1/10th the commotion of the 4xx series, yet it was the card everyone thought the 4xx should have been? Hopefully this isn't deja vu. nV has a chance to finally lead in price/performance, since AMD decided to jack up all of their pricing.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
Easy there hoss, don't get your panties in another bunch. This topic is not as vital and close to home for the rest of us.


Another reminder (and this is a good reminder for all, not just for Grooveriding): Please avoid letting the discussion fall to a personal level ("you vs another member" thing). Stick to the issues, debate the topics. Do not debate personalities.

"Don't get your panties in another bunch" falls into the category of "personal". It is clearly outside of the scope of the thread.

"This topic is not as vital and close to home for the rest of us" is also a personal attack since it practically overtly accuses another member of being too emotionally invested in the topic. Again, outside of the scope of the thread.

Please avoid turning the thread into a personal war of words in the future. Don't let stuff sink to a personal level - debate the topic, not the personalities.

Moderator jvroig

Grooveriding is one of the most honest/non-partisan members - he owns 3 480's and gets warned because he's objective. Wtf?


He got a warning because of the reasons stated in the mod edit, and it has nothing to do with whether he is honest or not, or if he owns nV or AMD.

"Don't get your panties in another bunch" is not being objective. It is being personal. It contributes nothing to the thread but noise.

"This topic is not as vital and close to home for the rest of us" is not being objective. It is being personal. It contributes nothing to the thread but noise.

Both are unacceptable, no matter if you think he is honest/non-partisan. I do not even care if the Pope or the Dalai Lama himself posted here - if either of them proceeded to post personal attacks and/or attempted to derail the thread, I would hand them the same treatment. Now that is being objective.

You pointing this out in public is also against the rules - comments/suggestions/criticisms about moderator actions should be done in the Moderator Discussions subforum so as not to disrupt technical threads with off-topic material. In the future, do not do this anymore.

Moderator jvroig
 
Last edited by a moderator:

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
I want competition, which hopefully this will bring. I hate Nvidia but will buy their products if they're genuinely better at the same price point. That hasn't been true for a while, but maybe this time. I don't like Intel much either, or Apple, but the same rules apply.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
If it is unbeatable I'm buying two, and I'll be future-proof until the Universe slowly collapses to another singularity or cools gently. Actually, why would I buy two? One is unbeatable! Hurrah for Nvidia, never thought this was possible, but let's see.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
If it is unbeatable I'm buying two, and I'll be future-proof until the Universe slowly collapses to another singularity or cools gently. Actually, why would I buy two? One is unbeatable! Hurrah for Nvidia, never thought this was possible, but let's see.

I seem to have lost your train of thought. But yes, I will also be buying 2x 7970's or 7950's after the Kepler launch.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Bulldozer was going to be unbeatable. UNBEATABLE...

Haha, nice, that gave me an LOL - especially reading it out loud in my head.

Was hoping for more GK104 info; hopefully the other thread has something. Not sure if it was this thread (or the other), but is March 2x really the release date?

Hopefully it's true and we get some leaks; I really want to see how that funky new power plug works.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I want to see how GK104 performs so I can start arguing over GK100/110.

Rumors are:

40-60% faster than HD 7970 and will cost $300.

But I think that got updated to:

40-60% faster than GTX 580 and will cost $300.

But now I'm hearing:

5-10% faster than HD 7970 and will cost $600.

But then again, I lost track so I too would love to find out how it performs. Why haven't those Asian sites leaked benches yet?
 

Kippa

Senior member
Dec 12, 2011
392
1
81
I think there is a schism between high-end gaming gfx cards and actual games. Even if the new Nvidia blows the 7970 out of the water, what game will actually make full use of it for you to really see any real difference?

One could argue that console ports do hold quite a few PC games back. In the not too distant future the next generation of consoles will be coming out, and hopefully they won't hold back PC gaming as they have done.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I think there is a schism between high-end gaming gfx cards and actual games. Even if the new Nvidia blows the 7970 out of the water, what game will actually make full use of it for you to really see any real difference?

One could argue that console ports do hold quite a few PC games back. In the not too distant future the next generation of consoles will be coming out, and hopefully they won't hold back PC gaming as they have done.
People say the same thing every time a new generation comes out, yet most of us eventually upgrade, don't we? There are some demanding games out there, especially if you're trying to run all settings maxed. Also, plenty of people are AA whores, and that can take a massive toll on framerate. And there are people playing at resolutions well above 1920x1080/1200, and that can be very demanding.