It's happening: GTX 980Ti already listed in China!!

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Morbus

Senior member
Apr 10, 2009
998
0
0
So far, but that seems like the obvious way for later-generation games to have the visual improvements we expect over the life of a console-huge textures.
VRAM isn't computing power. The GPUs of the consoles aren't powerful enough to handle more than 2GB. I doubt they can even reach that number, since they don't run at 1080p much, as far as I'm aware.

VRAM is like having a trailer for your car. It's not much good to have a huge trailer if your car isn't powerful enough.
 

Riceninja

Golden Member
May 21, 2008
1,841
2
0
He's saying it's equivalent to a seller on Amazon that isn't Amazon itself listing it, not that the website isn't big.
It's equivalent to Newegg or Adorama listing their products on eBay.

You have to own the website for the website to matter for your credibility.
In North America, yes, but e-tailers operate differently in Asia. Listing only on Taobao doesn't mean they're not credible.
 

dangerman1337

Senior member
Sep 16, 2010
314
0
76
If true, it's strange to think that GM200 yields/redundancies are good enough to supply only full 3072-SP chips without any cut-down versions.
Pretty much; that's why I also think there won't be a GM200 '980Ti', as the name doesn't make any sense. Maybe a 985/990 (Ti and non-Ti), rather than the '780Ti successor = 980Ti' assumption that WCCFTech keeps parroting.
 

2is

Diamond Member
Apr 8, 2012
4,294
129
106
VRAM isn't computing power. The GPUs of the consoles aren't powerful enough to handle more than 2GB. I doubt they can even reach that number, since they don't run at 1080p much, as far as I'm aware.

VRAM is like having a trailer for your car. It's not much good to have a huge trailer if your car isn't powerful enough.
So Sony equipped the PS4 with 8GB of expensive and power hungry GDDR5 because?????

Please link the technical brief that illustrates how the consoles are incapable of utilizing the memory that they are wastefully equipped with.
 

SimianR

Senior member
Mar 10, 2011
609
15
81
Again - this really doesn't make sense. Even if it's true, I'm not sure how people who bought a Titan X won't feel burned: they just bought the "flagship" for $1000, only for NVIDIA to turn around and sell the "real" flagship 3 months later for less. Yikes.
 

Morbus

Senior member
Apr 10, 2009
998
0
0
So Sony equipped the PS4 with 8GB of expensive and power hungry GDDR5 because?????
Because it's SHARED RAM.

They use it to store game assets and other stuff not related to graphical computing. My computer, for example, has 18GB of RAM, only 2GB of which are used by the GPU.

Please link the technical brief that illustrates how the consoles are incapable of utilizing the memory that they are wastefully equipped with.
I don't think you understood my meaning, and I don't feel you have any interest in doing so.
Have a nice day.
 

Saika

Junior Member
Apr 21, 2015
13
0
0
$799 means between 1050 and 1100+ Euro in Europe.
I'm new here. Why is it that new cards don't replace the current high-end cards price-wise?
A GTX 980 is already damn expensive here; no way I'm putting 650 Euro on a mid-range card with only 4GB of VRAM.
I was thinking about upgrading my [redacted] HD 5570, i5-650 and 768p setup for some 1440p and to get ready for the Vive, but the prices will be ridiculous.

Profanity isn't allowed in the technical forums.
-- stahlhart
 
Last edited by a moderator:

DooKey

Golden Member
Nov 9, 2005
1,555
238
106
Again - this really doesn't make sense. Even if it's true, I'm not sure how people who bought a Titan X won't feel burned: they just bought the "flagship" for $1000, only for NVIDIA to turn around and sell the "real" flagship 3 months later for less. Yikes.
Because most people willing to buy the best know it won't be the best for long. The early-adopter tax has been understood for quite a while when you're talking halo products.
 

crisium

Platinum Member
Aug 19, 2001
2,631
587
136
If true, it's strange to think that GM200 yields/redundancies are good enough to supply only full 3072-SP chips without any cut-down versions.
Indeed. I expected the 980Ti to be a 2560-shader part to fill the gap between the 980 and Titan X. That would also help preserve the legacy of the Titan brand, which was tarnished by losing DP. This leaked 980Ti would be equal to or slightly faster than the Titan X (higher clocks), except when VRAM usage is over 6GB. If this is true, the Titan X carries the biggest premium ever charged for double the VRAM.
 

2is

Diamond Member
Apr 8, 2012
4,294
129
106
Because it's SHARED RAM.

They use it to store game assets and other stuff not related to graphical computing. My computer, for example, has 18GB of RAM, only 2GB of which are used by the GPU.


I don't think you understood my meaning, and I don't feel you have any interest in doing so.
Have a nice day.
I know it's shared, and I also know that some of it is reserved for the OS, but unless you have something to back up the 2GB claim, it's just a made-up number based on little more than how powerful you think the GPU is. Do you think it's mere coincidence that >2GB of VRAM usage in PC games (even at 1080p) is suddenly happening more often now than before the PS4/Xbox One release?
 

Gloomy

Golden Member
Oct 12, 2010
1,462
0
0
Consoles can have more than 2GB dedicated to graphics.

Killzone: Shadow Fall, for example, used 3GB just for graphics data.

The total possible at the moment is roughly 6GB for the game (CPU + GPU), and it's up to the developer to partition that between GPU and CPU. In general, though, most of it is going to go towards graphics, because those assets take up the most space. The runner-up in size is sound data, which makes up the largest CPU-only chunk of memory, but that is much, much smaller in comparison. The rest of the game is a drop in the bucket.

Total system memory can't really be compared with PC memory at face value, because PC memory is segmented, so some duplication happens. If a graphics asset is being put into GPU memory, it also has to exist in CPU memory, at least while loading it in; basically, everything has to pass through system RAM at some point. On consoles, CPU memory is GPU memory, so no duplication happens.
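To see why graphics assets dominate the memory budget, a quick back-of-envelope calculation helps. This is a rough sketch assuming uncompressed RGBA8 textures; real engines use block compression (BCn/DXT), which cuts these figures by roughly 4-8x:

```python
# Rough, illustrative estimate of how much memory uncompressed
# textures consume. Real games use block compression, which
# shrinks these numbers by roughly 4-8x.

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Size of one RGBA8 texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One 4096x4096 RGBA8 texture with mipmaps:
single = texture_bytes(4096, 4096)
print(f"{single / 2**20:.1f} MiB per 4K texture")       # ~85.3 MiB
# A level with a few dozen such assets eats gigabytes fast:
print(f"{50 * single / 2**30:.1f} GiB for 50 textures")  # ~4.2 GiB
```

Even after compression, a few hundred unique assets per level adds up to gigabytes, which is why asset storage, not the frame buffer, dominates.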
 
Last edited:
Feb 19, 2009
10,458
5
76
Because it's SHARED RAM.

They use it to store game assets and other stuff not related to graphical computing. My computer, for example, has 18GB of RAM, only 2GB of which are used by the GPU.
The function of VRAM is not just "graphical computing". Its major function isn't just to store the frame buffer as the shaders and processors work on it; its major function is to store graphics ASSETS: everything the developer needs for a particular level/stage or, in open-world games, a buffer of things needed as the player moves around.

These assets are called in as required to generate any given frame. If the assets are not in VRAM because the VRAM is not large enough to hold them, they sit in system RAM. Guess what happens then? The frame is delayed because system RAM is very slow, so you get a major drop in fps. And if you don't have enough system RAM? It's pulled from the HDD... major pauses, with your disk thrashing like mad.

The reason cross-platform games of late have blown up in their VRAM requirements is that consoles do in fact have a massive amount of VRAM available to store more, and higher-quality, assets.
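The frame-delay point can be put in rough numbers. A back-of-envelope sketch, with illustrative (not measured) bandwidth figures of ~224 GB/s for GTX 980-class GDDR5 and ~16 GB/s for PCIe 3.0 x16:

```python
# Back-of-envelope: time to pull 512 MiB of assets from local VRAM
# versus fetching them over PCIe from system RAM. Bandwidth figures
# are illustrative assumptions, not measurements.

VRAM_BW = 224e9  # bytes/s, local GDDR5 (GTX 980 class)
PCIE_BW = 16e9   # bytes/s, system RAM over PCIe 3.0 x16

def fetch_ms(size_bytes, bandwidth_bytes_per_s):
    """Transfer time in milliseconds."""
    return size_bytes / bandwidth_bytes_per_s * 1000

assets = 512 * 2**20  # 512 MiB of texture/geometry data

print(f"from VRAM: {fetch_ms(assets, VRAM_BW):.1f} ms")  # ~2.4 ms
print(f"over PCIe: {fetch_ms(assets, PCIE_BW):.1f} ms")  # ~33.6 ms

# A 60 fps frame budget is ~16.7 ms, so spilling assets to system
# RAM blows straight through it, which shows up as stutter.
```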
 
Feb 19, 2009
10,458
5
76
Agreed, it doesn't make financial sense for NVIDIA to release a Ti version right now at all. They could simply name a cut-down GM200 the GTX 1080, sell it for $799, and people would buy it in droves. Then when the 390X finally lands in August/September after their June paper launch, we'll know what the specs are and so will NVIDIA. At that time, if they need to, they can release a water-cooled 1080 Ti that beats the 390X WCE by 10-20% and keep the crown.

Releasing a full GM200 for $200 cheaper than the Titan X with half the RAM right now just undercuts profitability and wastes full dies, which aren't cheap to produce despite the maturity of 28nm.

I also don't think 6GB will be the sweet spot for 4K gaming, as GTA V already reaches that limit with some MSAA added into the mix. I regularly reach 5.4GB at 1440p alone with that game. With these new-generation titles, we're looking at a shift where 6GB = 1080p (and maybe 1440p with no MSAA), 8GB+ for 1440p+DSR, MSAA, or 4K, and 12GB+ for surround.
Good points. It's a huge die, and yields would not be anywhere close to excellent, so it makes little sense to release yet another full-die SKU. Without pressure from AMD, there's zero need to undercut a well-selling GM200 in the Titan X.

The thing about VRAM requirements is twofold:

1) At 4K on a typical 28-32 inch monitor, it's roughly equivalent to 1440p with 4x MSAA in terms of pixel density and image quality. There is no need to enable 4x MSAA on top; doing so just wastes performance and VRAM for very little IQ gain.

2) D/VSR = super-sampling AA, the best form of AA. As such, it's actually very silly to run a DSR + MSAA combo.

So 1440p + DSR scaled down requires about as much VRAM as normal 4K without MSAA. A few games of late need more than 4GB of VRAM for both of these tasks, but they certainly don't push (need) more than 6GB. That will also likely be the upper limit, since developers are constrained by console specs.
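The 1440p-DSR-equals-4K point can be sanity-checked with simple arithmetic. A rough sketch (the 2.25x DSR factor and a single RGBA8 color buffer are my assumptions; real engines keep several render targets, multiplying these figures):

```python
# Sanity check on render-target footprints. A 2.25x DSR factor on a
# 1440p desktop renders internally at exactly 3840x2160 and then
# downscales, so its footprint matches native 4K.

def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Size of one RGBA8 color buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

native_4k = framebuffer_mib(3840, 2160)
dsr_1440p = framebuffer_mib(int(2560 * 1.5), int(1440 * 1.5))  # 2.25x DSR

print(f"native 4K:       {native_4k:.1f} MiB")  # ~31.6 MiB
print(f"1440p 2.25x DSR: {dsr_1440p:.1f} MiB")  # ~31.6 MiB, identical
```

The internal resolutions are the same, which is exactly why the two configurations land in the same VRAM ballpark.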


Titan X selling very well with massive margins at $999? I would leave it unthreatened and not release a SKU that's potentially faster for cheaper. Makes ZERO sense.

That didn't stop NVIDIA from releasing the 780Ti.
Remember when the 780Ti arrived, and why? NV was happy with the Titan-at-$1000 and 780-at-$600 scenario. The 780 was slower than the Titan, with less VRAM and no DP compute; it being cheaper is obvious. Then AMD released the R9 290/290X, both faster than the 780 and the Titan, so NV had to counter with the 780Ti to reclaim the performance crown. NV is not in the business of threatening their own margins. They only do so in response to fierce competition.

A Titan X at $999 that is selling very well, followed a few months later by a faster SKU for less, is the worst way to ensure maximum profits. If anything, they need a slower GM200 SKU that uses the harvested dies that didn't make the cut for the Titan X.

They need a 1080: a cut-down GM200 with 6GB, 10% slower than the Titan X, at $799. This way both the Titan X and the 1080 sell well, and the lower card competes less against the X. Economics 101.
 

RussianSensation

Elite Member
Sep 5, 2003
19,460
743
126
They need a 1080: a cut-down GM200 with 6GB, 10% slower than the Titan X, at $799. This way both the Titan X and the 1080 sell well, and the lower card competes less against the X. Economics 101.
Original Titan $1K => Feb 21, 2013
GTX 780 GHz $650 => May 23, 2013 (3 months)



7800GTX 512MB = $699 = Nov 14, 2005
7900GTX 512MB = $499 = March 9, 2006

NV has no problem releasing a 'flagship' card and then releasing an even faster, sometimes cheaper, card less than 6 months later. IMO, anyone buying a Titan X for $1000 should not care that in 6 months there will be a card for $500 that is 90% as fast. Last time, it took only 9 months before we had the original Titan's performance in a $400 R9 290. This is just the natural course of the GPU industry when there is strong competition.
 

xorbe

Senior member
Sep 7, 2011
368
0
76
Again - this really doesn't make sense. Even if it's true, I'm not sure how people who bought a Titan X won't feel burned: they just bought the "flagship" for $1000, only for NVIDIA to turn around and sell the "real" flagship 3 months later for less. Yikes.
Because we already know from the Titan / 780Ti days two years ago. :wub:
 
Feb 19, 2009
10,458
5
76
Original Titan $1K => Feb 21, 2013
GTX780 Ghz $650 => May 23, 2013 (3 months)
That's a factory-OC model (with a massive out-of-the-box boost to ~1.25GHz). The official 780 was ~10% slower than the Titan. Sure, you could OC it to match the Titan, but the Titan can be overclocked too. The gap remains because the 780's GK110 chip is cut down compared to the Titan's.

The situation now is that the rumor here proposes NV release a FULL GM200 with a higher boost clock and sell it for less.

That would have been akin to NV releasing the 780Ti right after the Titan, not the 780. Something that didn't happen, because it would have majorly threatened the Titan's $1000 pricing even with its VRAM/DP bonus, a bonus the Titan X lacks (no special DP prowess).
 
Last edited:

ddarko

Senior member
Jun 18, 2006
263
3
81
Again - this really doesn't make sense. Even if it's true, I'm not sure how people who bought a Titan X won't feel burned: they just bought the "flagship" for $1000, only for NVIDIA to turn around and sell the "real" flagship 3 months later for less. Yikes.
I think a new, lower-priced product is intended to attract new buyers, not placate people who already gave their money to Nvidia. I will try to muster sympathy for people who spent $1000 for a video card, only to have it superseded a little while later....

Nope, can't do it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,460
743
126
That's a factory-OC model (with a massive out-of-the-box boost to ~1.25GHz). The official 780 was ~10% slower than the Titan.
Just like NV stopped AIBs from launching custom-PCB Titan Xs with the MSI Lightning/EVGA Classified treatment, NV could have prevented AIBs from releasing non-reference, factory pre-overclocked 780s that beat the Titan. But they didn't.

The point here isn't that you could overclock the Titan to beat an OC'd 780. The point is that NV threw stock Titan performance under the bus 3 months later with a faster out-of-the-box, factory-warrantied 780 GHz for $650. There is no precedent that tells us NV will not release a faster and cheaper card in 3-5 months that beats its previous flagship.

If NV wants to capture all those gamers who are waiting for the R9 390 series, they could easily throw all those Titan X owners under the bus too and just say: well, you had 3 months of cutting-edge performance, and you knew that paying $1000 meant an early-adopter tax. So those Titan X owners wouldn't have any rebuttal whatsoever. As I said, NV also did this with the 7800GTX 512MB -> 7900GTX. The 7900GTX was faster and cheaper!
 
Last edited:
Feb 19, 2009
10,458
5
76
@RS
You have to factor in that Titan X is selling very well, better than expected based on early reports.

NV has no reason to undercut it with a faster part.
 

x3sphere

Senior member
Jul 22, 2009
723
24
81
www.exophase.com
@RS
You have to factor in that Titan X is selling very well, better than expected based on early reports.

NV has no reason to undercut it with a faster part.
It may be selling well, but relative to what? I assume the original Titan, but I'd imagine NV sells several times more GPUs slotted at the $700 mark than at $1000, maybe even 3-5x as many. Just a guess, based on how many more people had 780Tis on forums.

It's a no-brainer to try to capture those sales, especially if yields are good. Even though this is a big chip, the 28nm process is a lot more mature now. Also, the target consumer for the Titan just buys the best of the best regardless; I bet a lot of them feel the 12GB of VRAM is worth it and wouldn't settle for a 6GB card, despite it being overkill in most cases.
 
Feb 19, 2009
10,458
5
76
A cut-down GM200 as a 980Ti that's 10% slower than the Titan X would still sell very well at $799.

It slots right into the gap between the 980 and the Titan X.

NV doesn't need to undercut the Titan X with a faster part, absent competition, when it's selling beyond their expectations.
 

Shmee

Memory and Storage, Graphics Cards
Super Moderator
Sep 13, 2008
4,311
445
126
980 Ti may be interesting, but I would prefer a better SLI solution. Will these still use the old bridges instead of the PCIe lanes?
 
