
GTX680, images from THG review leaked

nVidia is about to release a faster, cheaper, and more energy-efficient flagship card than AMD's...
 
Everything I've seen indicates $550 or thereabouts, and for those who think this card will have an appreciable effect on AMD prices for any GPU, dream on. As for prices migrating down to $400: call me, I have a bridge in Brooklyn listed for sale. There isn't a game out there that you can't play for less than half these prices, so before you complain about the prices, give that comment some thought. 😉
 
I love how everybody is taking these as factual. What makes this any different from any of the other "leaked" things we've seen over the last six months? It's not like it's hard to create the graphs shown.

I will withhold my judgment until actual reviews are posted on actual review sites.

We don't need another topic locked because of issues AMD fanboys may have.
 
Removed dirt3, added Crysis 2 in DX9:

Reaching.
DX9 matters in 2012 why?
NV is aiming this product at the bulk of the market when it comes to VRAM and GPU use. Why improve compute when you're aiming at frame-rate junkies on 1080p monitors?

This is a pretty big win for gamers and the industry overall: lower power consumption, less heat, and better performance, priced competitively and directly against the 7970.
 

Console ports.
 

Nvidia's scientific GPU market is huge.
 

let me see.... at 1080p a single 570 can't handle everything?
 

Not really sure where you're aiming...
Are you serious, or... BF3 with ultra settings and 4xAA, Metro 2033, etc., etc.

So, yeah. The 570 can handle everything if you're not a pristine pirate 😉
 
yawn... wake me when the <$250 cards are out.

Man, I should have bought a card like a year ago. Who knew 28nm would suck so bad?
 
Willing to bet gaming takes the cake.

That doesn't say anything to my previous post. The scientific GPU community is very large. Several professors use Nvidia GPUs in my department, and I know of a few clusters going up as well. NCSA at my university created a project cluster that was recently retired, with another in the works. Universities, labs, and industry buy GPUs in quantities of 100 or 1,000. The market is large and is continuing to grow very quickly.
 
Then isn't that where the 680 also belongs?

What I don't get is this: going by the vitriol AMD was getting from certain posters, about how the 7970 was overpriced and how the performance increase didn't justify it (considering the 580 was still ~$450-500 and the 7970 was ~20% faster), shouldn't the 680 also be getting that flak?

Let's assume for a moment that this is Nvidia's "mid" card. Why are they charging so much for it? Why are they gouging? They could probably sell this for $350 and still make a healthy profit.

Nvidia is gouging its customers by selling a mid-range card at a high-end price. That is, at least, according to months of discussions and bickering.

So, Nvidia gets a pass on its pricing because it's AMD's fault for setting that level in the first place? Nvidia is not responsible?

And I'm not even going to get into AMD and their gouging.

With a smaller die and smaller transistors, one expects lower prices. It was AMD that raised prices on a die shrink. We have yet to see NV's price. I would love to see JJ have the balls to price this at $399.
 
Why did nvidia only go with 2GB? I can't figure that one out..

Because they opted for a 256-bit bus, which rules out 3GB as an option, and 4GB would be flat-out overkill.

Also, 2GB is more than enough for any retail game now and for the foreseeable future on a single-monitor setup, where the 680 looks to be king. Its memory bandwidth would become a limitation with multiple monitors anyway, so it really doesn't make sense to go any higher than 2GB.
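The bus-width point above can be sketched with some back-of-envelope arithmetic. This is only an illustration, assuming GDDR5 chips with a 32-bit interface and 2 Gbit (256 MB) density, and "clamshell" mode putting two chips on each channel; none of these specifics come from the leaked slides.

```python
# Why a 256-bit GDDR5 bus naturally yields 2 GB or 4 GB, not 3 GB.
# Assumed (not from the thread): 32-bit chip interfaces, 2 Gbit chips.

def vram_options(bus_width_bits, chip_interface_bits=32, chip_mbytes=256):
    """Return (normal, clamshell) capacities in MB for a given bus width."""
    channels = bus_width_bits // chip_interface_bits
    normal = channels * chip_mbytes   # one chip per 32-bit channel
    clamshell = 2 * normal            # two chips share each channel
    return normal, clamshell

normal, clamshell = vram_options(256)
print(normal // 1024, "GB or", clamshell // 1024, "GB")  # 2 GB or 4 GB

# Twelve 2 Gbit chips (3 GB) don't map evenly onto eight 32-bit channels,
# which is why 3 GB is the natural fit for a 384-bit bus instead.
```

Under these assumptions, a 384-bit card gets 3GB the same way: twelve channels times 256 MB per chip.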
 

Eh. It actually does. What I was saying is that I'm willing to bet (an assumption on my part) that the gaming sector is still the bread and butter, and that there are more gamers than scientists and scientific organizations.

I have no argument about the scientific community growing and most likely needing GPGPU, because I don't have that data. Though that is a great thing.
 

Please cite numbers, because based on NVDA's financial statements it's clear that HPC is growing but still no more than a tiny sliver of their business. If you think HPC is large, then how ginormous do you suppose the console business is? NVDA just got shut out of all three next-generation consoles, from what I'm hearing. (NVDA powered the PS3 for this generation.) Is losing the PS4 smaller than the HPC market? I suggest you go back and do the math.

One day, though, HPC may become another NVDA cash cow. Just not anytime soon.
 
One thing no one has mentioned...

Where's the GTX670?

I don't know, but why are people believing these slides when at least one of them is demonstrably incorrect (price/perf chart)? I can't believe any of the slides are for real at this point and will wait for more info.
 

While console GPU sales are enormous in volume, the profit margins aren't anything like they are in HPC. Tesla cards cost thousands of dollars. Still, I suspect that HPC isn't a cash cow for Nvidia yet.
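The volume-versus-margin argument in these posts is just multiplication. A sketch with entirely made-up placeholder numbers (these are NOT Nvidia financials; only the arithmetic is the point):

```python
# Illustrative volume x margin trade-off: console GPU royalties vs HPC
# (Tesla) sales. Every figure below is a hypothetical placeholder.

def segment_profit(units, revenue_per_unit, margin):
    """Gross profit for a segment: units sold x revenue each x gross margin."""
    return units * revenue_per_unit * margin

# Hypothetical: 50M consoles at a $10 royalty vs 50k Tesla cards at $2,500.
console = segment_profit(50_000_000, 10, 0.30)  # high volume, thin take
hpc = segment_profit(50_000, 2_500, 0.60)       # low volume, fat margin

print(f"console: ${console / 1e6:.0f}M, hpc: ${hpc / 1e6:.0f}M")
```

With these invented inputs the console segment still comes out ahead despite the far fatter per-unit margin on HPC parts, which is the shape of the argument being made, not a claim about the real numbers.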
 

[H] is confirming that the editor-in-chief at THG said their charts were accessed and these leaks are real. The guys at [H] are also saying the GTX 680 will launch for around $500 MSRP.
 

Good question. I'm assuming that, like most other X70 parts, it's the same die with some hardware disabled. It's possible that nVidia's yields have been good enough with this part that there aren't enough binned dies to release the 670 yet. The 570 wasn't released until a month after the 580, so it's not as if we should expect them to be released at the same time.
 

Legit question here: do you have a link/source?
 
I'm suddenly not sure whether I should buy a second 6850 or just wait to see what the (hopefully) ensuing price wars bring . . . wow.
 