[WCCF] 980ti coming this summer with full GM200

Page 2 - AnandTech Forums

Head1985

Golden Member
Jul 8, 2014
1,863
685
136
Nvidia got the sweet deal of a lifetime with GM200.

Have the GTX Titan X alone on the market for 3-4 months. Slap on 12GB to make it feel like a premium card and charge big bucks for it, cashing in on big margins.
The R9 390X is announced in June and enters the market.
Release a 6GB GTX 980 Ti for $700 to not only match the 390X's price (which HBM drives up), but also to match or beat the 390X in performance.

I can't see any way AMD recoups much market share from Nvidia, even with the 390X, because of this tactic.

Sad to see Nvidia kicking AMD while it's down, given the dire R&D situation between the two companies. I really hope Samsung does buy AMD, starts bringing in big bucks for R&D, and hires more engineers (there are rumors about AMD engineers being overworked) to get the company back into shape.
100% agree
 

PullTheTricker

Junior Member
Dec 12, 2007
24
0
0
So the TDP of the 980 Ti is 250W? That's quite a bit higher than I expected.
Would my CM V750S Gold be able to power these babies in SLI?
According to the eXtreme PSU Calculator, with 2x 780 Ti SLI I end up with 599W minimum and 649W recommended. Even 100W of headroom would still make me hesitant. I'm hoping it will be slightly more power efficient once the true 980 Ti specs are revealed.
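For a rough sanity check of an SLI build, a back-of-the-envelope sketch like the one below works (not a substitute for a proper PSU calculator; the non-GPU component wattages are illustrative assumptions, and the 250W GPU figure is the rumored 980 Ti TDP):

```python
# Rough SLI power-budget check. Non-GPU draws below are assumed estimates.
GPU_TDP_W = 250           # rumored 980 Ti TDP per card
components = {
    "cpu": 140,            # assumed overclocked CPU draw
    "motherboard_ram": 60,
    "drives_fans_misc": 50,
}
psu_capacity_w = 750       # CM V750S rated capacity

total_draw = 2 * GPU_TDP_W + sum(components.values())
headroom = psu_capacity_w - total_draw
print(f"Estimated draw: {total_draw} W, headroom: {headroom} W")

# Running a PSU near its rated limit hurts efficiency and stability; many
# builders aim to keep sustained draw under ~80% of rated capacity.
print("Within 80% guideline:", total_draw <= 0.8 * psu_capacity_w)
```

With these assumed numbers the estimate lands right at the PSU's rating, which is exactly why the headroom concern above is reasonable.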

The current 970 and 980 lineup has us spoiled on power efficiency.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Pretty sure the GTX 980 Ti will have a higher TDP than the GTX Titan X.
The reason the GTX 780 Ti had the same TDP as the GTX Titan while being clocked higher was that Nvidia disabled many of the power-hungry FP64 units the Titan used for compute performance.

With the GTX Titan X and GTX 980 Ti it's a bit different, because the Titan X doesn't have the FP64 hardware the original GTX Titan had. So there is nothing to disable to "gain" a thermal envelope they could use to clock the 980 Ti higher while still staying at 250W.
 

PullTheTricker

Junior Member
Dec 12, 2007
24
0
0
Pretty sure the GTX 980 Ti will have a higher TDP than the GTX Titan X.
The reason the GTX 780 Ti had the same TDP as the GTX Titan while being clocked higher was that Nvidia disabled many of the power-hungry FP64 units the Titan used for compute performance.

With the GTX Titan X and GTX 980 Ti it's a bit different, because the Titan X doesn't have the FP64 hardware the original GTX Titan had. So there is nothing to disable to "gain" a thermal envelope they could use to clock the 980 Ti higher while still staying at 250W.
Well, either way, that's roughly 100W more than the regular 980. I'm not sure what degree of performance gain we'll see to compensate for that. But these specs are nothing final; let's wait and see. I guess I'll wait for the GTX 1070 and GTX 1080 Pascal chips for more power-efficient cards. The 980 Ti looks like an attractive purchase for SLI, but I'll have to wait and see whether it will run on my current PSU.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
http://wccftech.com/nvidia-geforce-gtx-980-ti-coming-summer/

Full 3072 cores and faster clocks than the Titan X, with the expected 6GB of VRAM

I'm skeptical. This would essentially make the Titan X completely redundant, especially if an AIB comes out with a version that increases the RAM back to 12GB.

Besides, the yields on such a gigantic chip can't be 100 percent, even on a process as mature as 28nm. As another commenter pointed out, it would make no sense to bill such a release as the "980 Ti", since this would leave no room for a cut-down version in the future. I'm betting the next GM200 release we see will have at least a couple of SMMs disabled, probably 3 (which would bring the CUDA core count down from 3072 to 2688). And I think it will be called the GTX 990. Why reserve the x90 subclass for dual-GPU cards when there hasn't been one there since the GTX 690? I think dual-GPU Nvidia cards will be reserved for the Titan brand from now on. Calling this the GTX 990 leaves open the possibility of using GTX 980 Ti for an even more cut-down GM200, or eventually releasing a fully enabled, hot-clocked version as the GTX 990 Ti.
 
Feb 19, 2009
10,457
10
76
You don't need any inside tip to know this was coming.

Yup, we've all seen this coming.

Plus, on custom PCBs with better power limits AND cooling, GM200 as a 980ti is going to demolish Titan X for LESS $.

o_O

6GB vram is the sweet spot.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
x90 cards are usually dual-GPU on a single PCB.

The full GM200 could be called a GTX 985, and a cut-down version could be the 980 Ti.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Yup, we've all seen this coming.

Plus, on custom PCBs with better power limits AND cooling, GM200 as a 980ti is going to demolish Titan X for LESS $.

o_O

6GB vram is the sweet spot.

So crazy to hear that. It seems like yesterday that people were arguing about 2GB being enough. Now 6GB is suddenly the sweet spot? 1440p is my next upgrade, and I feel crazy for hoping 6GB will be enough for a few years.
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
If they want a summer release, they will need to finalize it soon....

When's the 390X supposed to drop? Having a counter ready is something both camps have managed, sometimes anyway.

$799, NVidia.. that's what I'll pay to upgrade my 780 Ti. No more, and preferably less.. ;)
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
So crazy to hear that. It seems like yesterday that people were arguing about 2GB being enough. Now 6GB is suddenly the sweet spot? 1440p is my next upgrade, and I feel crazy for hoping 6GB will be enough for a few years.

Yeah, we are talking about the high end though. I've been on 1440p for years, and now I'm eyeing 3440x1440 as my next monitor, so 6GB isn't out of the question.
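Putting rough numbers on those resolutions helps (a quick sketch; raw pixel counts only, which is what drives per-frame rendering and framebuffer cost):

```python
# Pixel-count comparison of the gaming resolutions mentioned in the thread.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3440x1440": (3440, 1440),
    "4K": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

base = pixels["1440p"]
for name, count in pixels.items():
    print(f"{name}: {count:,} px ({count / base:.2f}x of 1440p)")
```

3440x1440 works out to about a third more pixels than 16:9 1440p, while 4K is roughly 2.25x, which is why the VRAM discussion changes so much between those targets.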
 
Feb 19, 2009
10,457
10
76
So crazy to hear that. It seems like yesterday that people were arguing about 2GB being enough. Now 6GB is suddenly the sweet spot? 1440p is my next upgrade, and I feel crazy for hoping 6GB will be enough for a few years.

Because people who believed low VRAM is enough lacked the foresight to see that consoles dictate cross-platform game development. With the current consoles now the major target of studios, developers can and will push their VRAM limits to the max. Typically 4GB would be the realistic target, but some extra for 4K textures in PC versions serves as a good buffer.

For quad-SLI gamers at 4K+ who run DSR/MSAA, the Titan X's 12GB of VRAM is the major advantage. So NV must restrict the 980 Ti to 6GB so they can still demand $999 for the 12GB Titan X from enthusiasts who demand more VRAM.
 

MiRai

Member
Dec 3, 2010
159
1
91
I'm skeptical. This would essentially make the Titan X completely redundant, especially if an AIB comes out with a version that increases the RAM back to 12GB.
nVidia didn't allow them to kick the 780 Ti up to 6GB (from 3GB) because they already knew they'd eventually release the Titan Black with 6GB. So I think it's safe to assume nVidia would do the same thing this time around if the 980 Ti is as good as the rumors/leaks make it out to be; they'd just lock down the VRAM spec.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So crazy to hear that. It seems like yesterday that people were arguing about 2GB being enough. Now 6GB is suddenly the sweet spot? 1440p is my next upgrade, and I feel crazy for hoping 6GB will be enough for a few years.

That's because they were trying to convince everyone that the 770 was a better card than the 280X and the 770 had 2gig of RAM. Now they're trying to convince us that 6gig is "the sweet spot". Like anyone needed to be a fortune teller to know that was coming. Keeping in mind that the 970's RAM is just fine also. ;)
 

Riceninja

Golden Member
May 21, 2008
1,841
3
81
That's because they were trying to convince everyone that the 770 was a better card than the 280X and the 770 had 2gig of RAM. Now they're trying to convince us that 6gig is "the sweet spot". Like anyone needed to be a fortune teller to know that was coming. Keeping in mind that the 970's RAM is just fine also. ;)

It's very obvious that the narrative changes based on what's convenient for Nvidia. When I bought my 7950, everyone said that 3GB wouldn't provide any extra benefit at 1080p compared to the 2GB GTX 670...
 
Feb 19, 2009
10,457
10
76
That's because they were trying to convince everyone that the 770 was a better card than the 280X and the 770 had 2gig of RAM. Now they're trying to convince us that 6gig is "the sweet spot". Like anyone needed to be a fortune teller to know that was coming. Keeping in mind that the 970's RAM is just fine also. ;)

Now now, I thought the 7970 would be more future-proof than the 680, due to VRAM as well as GCN/console similarities.

I didn't think the 970 3.5GB fiasco was fine at all.

But realistically, more games are already using >4GB of VRAM at 4K. With the passing of time and more titles developed with consoles as the focus, we'll see that become the norm (at high resolutions).

4GB may remain enough as long as you don't run MSAA at 4K. Generally even two top GPUs can't play recent titles at 4K with 4x MSAA anyway, so it doesn't apply to most users, only those with quad SLI/CrossFire.
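The MSAA cost at 4K is easy to put rough numbers on. Below is a framebuffer-only estimate, assuming standard RGBA8 color and D24S8 depth/stencil formats (4 bytes per sample each); textures, render targets, and geometry come on top of this, so real usage is far higher:

```python
# Framebuffer-only VRAM estimate at 4K with MSAA (rough sketch; real games
# add many more render targets, textures, and geometry on top of this).
width, height = 3840, 2160
bytes_color = 4      # RGBA8: 4 bytes per color sample (assumed format)
bytes_depth = 4      # D24S8: 4 bytes per depth/stencil sample (assumed format)

def framebuffer_mib(msaa_samples: int) -> float:
    """Color + depth storage for one MSAA render target, in MiB."""
    per_pixel = msaa_samples * (bytes_color + bytes_depth)
    return width * height * per_pixel / 2**20

for samples in (1, 4, 8):
    print(f"{samples}x MSAA: {framebuffer_mib(samples):.0f} MiB")
```

Even this simplified estimate shows MSAA multiplying the render-target footprint linearly with sample count, which is why high-resolution MSAA eats into a 4GB card so quickly.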
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Now now, I thought the 7970 would be more future-proof than the 680, due to VRAM as well as GCN/console similarities.

I didn't think the 970 3.5GB fiasco was fine at all.

But realistically, more games are already using >4GB of VRAM at 4K. With the passing of time and more titles developed with consoles as the focus, we'll see that become the norm (at high resolutions).

4GB may remain enough as long as you don't run MSAA at 4K. Generally even two top GPUs can't play recent titles at 4K with 4x MSAA anyway, so it doesn't apply to most users, only those with quad SLI/CrossFire.

That wasn't directed at anyone in particular. It was just the general narrative that nVidia marketing spins. (Although you have taken some peculiar positions lately. I think you are just a bit understandably pessimistic about AMD. It will likely pass when Fiji drops and the other blocks they've set in place come together. Nothing in this instance though was directed at you in particular. Although I can see why it appeared that way. Sorry. :))

I actually think something like a 390 (non-X) with 4GB of HBM will be the "sweet spot" up to 1440p, which is the typical single-GPU enthusiast resolution. Multi-GPU 4K gaming, I believe, will need more than 6GB in the foreseeable future. Add that the new APIs are virtually extensions of Mantle opened up to other IHVs' GPUs, and with the consoles being GCN, I actually feel optimistic for AMD. Oh, that and "The Bumbler" Rory Read being history.
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Also, this is lurking around, but it feels like an early April Fools' joke.

BFG trolls with the GeForce GTX Titan XXX with 24GB of RAM and 300W OC

I thought BFG declared bankruptcy, and closed its doors, a while ago?

http://www.overclockers.co.uk/showproduct.php?prodid=GX-098-BG&groupid=701&catid=1914&subcat=1576

It is an April Fool's joke, a tad early.

BFG is as dead as dead can be.

It's also an image of a box BFG used during the GTX 200 era, which is the last card series they manufactured. ;)

[Attached image: Box.jpg]


"Is this your bus type?" is what it says in the upper left-hand corner.
This was the era when PCI Express was still somewhat new; it was only the fourth generation of GPUs to be PCIe-exclusive, so there were still potential buyers out there with older motherboards. Now they never bother asking whether the buyer knows what a PCIe slot is. :D
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Pretty sure the GTX 980 Ti will have a higher TDP than the GTX Titan X.
The reason the GTX 780 Ti had the same TDP as the GTX Titan while being clocked higher was that Nvidia disabled many of the power-hungry FP64 units the Titan used for compute performance.

With the GTX Titan X and GTX 980 Ti it's a bit different, because the Titan X doesn't have the FP64 hardware the original GTX Titan had. So there is nothing to disable to "gain" a thermal envelope they could use to clock the 980 Ti higher while still staying at 250W.

I see this differently.

The 250W TDP is just about right for the 980 Ti. Subtract 6GB of VRAM from the Titan X, up the clocks on the 980 Ti, and 250W is a perfect target.

As others have stated, though, overclocked models will likely go higher (10-20%) and will probably use 2x 8-pin connectors to allow >300W power draw.

The reference 980 Ti will likely stick with 6+8-pin, just like the X.

All speculation, of course.
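The connector math behind that speculation is straightforward: per the PCI Express spec's nominal limits, the slot supplies up to 75 W, a 6-pin connector up to 75 W, and an 8-pin connector up to 150 W. A tiny sketch of those in-spec power ceilings:

```python
# Nominal PCIe power limits per the spec: slot 75 W, 6-pin 75 W, 8-pin 150 W.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Maximum in-spec board power for a given connector configuration."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print("6+8-pin:", board_power_limit(1, 1), "W")  # reference-style config
print("2x8-pin:", board_power_limit(0, 2), "W")  # typical custom-board config
```

A 6+8-pin reference card tops out at 300 W in spec, while a 2x 8-pin custom board raises the ceiling to 375 W, which is where the ">300W" headroom for overclocked models comes from.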
 

Majcric

Golden Member
May 3, 2011
1,369
37
91
Unfortunately it looks like we might not see this card until September. At least that's the word coming from SweClockers. I was ready for this thing by June, what with the likes of GTA V and The Witcher 3. I'd really hate to buy the 980, but September is a good ways off in the tech world.
 

PullTheTricker

Junior Member
Dec 12, 2007
24
0
0
Unfortunately it looks like we might not see this card until September. At least that's the word coming from SweClockers. I was ready for this thing by June, what with the likes of GTA V and The Witcher 3. I'd really hate to buy the 980, but September is a good ways off in the tech world.
It's all poker between AMD and NVIDIA. Don't be surprised if the 980 Ti ships right after the 390X is on the market. NVIDIA may not know when AMD will release their next lineup, but at least they can make claims and bluffs. But who knows; we'll see, I guess.