GeForce Titan coming end of February

Status
Not open for further replies.

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Yes, I agree. But not everyone who reports the rumor adequately conveys that it is a rumor. Tweaktown, for instance, makes it clear, but wccftech refers to 'multiple independent sources' as if we have anything more than a single Swedish article.

Maybe they do. Are you the internet police, or just a lawyer?

Edit: look for stock gossip/news sites to pick up the same rumor next.
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
I'm pretty sure that Nvidia has some GK110s that didn't quite make the cut that they would love to sell as very high-priced consumer GPUs.

This has to be the case. This card is going to be like the GTX 480: fast as all get out, but using the chips that didn't quite make the cut for the workstation cards.

Maybe some day they will release the real deal GK110 as a desktop gaming card, but not while they can sell them for thousands of dollars.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
This has to be the case. This card is going to be like the GTX 480: fast as all get out, but using the chips that didn't quite make the cut for the workstation cards.
That's just how the semiconductor industry works, period. Intel saves their top-binned chips for companies like Google, then they dole them out to the server crowd, then they go out to OEMs and the rest of the market.

Same situation here. The top chips go to companies willing to pay extra for them, then they go into the professional lineup for that market, and then they work their way into consumer cards.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Not sure I really want a GK110 as a desktop gaming card, as its design is compromised for compute. I'd much rather have a GK104-like design with 50% more cores/memory bus/etc., as that would be smaller and more efficient for gaming.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Not sure I really want a GK110 as a desktop gaming card, as its design is compromised for compute. I'd much rather have a GK104-like design with 50% more cores/memory bus/etc., as that would be smaller and more efficient for gaming.

And you'll likely get something along those lines @ 20nm. No way would Nvidia make a 500mm² GK104-like chip just for the gaming market. The big chip must serve the high-margin markets first and foremost.
 

Ajay

Lifer
Jan 8, 2001
15,565
7,923
136
And you'll likely get something along those lines @ 20nm. No way would Nvidia make a 500mm² GK104-like chip just for the gaming market. The big chip must serve the high-margin markets first and foremost.

Totally depends on yields. If they are good, I see no reason why NV wouldn't sell them to consumers. The BOM costs would be lower (probably only 3GB of non-ECC DRAM) plus less QA, and NV might make a pretty penny off it.
 
Feb 19, 2009
10,457
10
76
Not sure I really want a GK110 as a desktop gaming card, as its design is compromised for compute. I'd much rather have a GK104-like design with 50% more cores/memory bus/etc., as that would be smaller and more efficient for gaming.

It's not compromised in terms of performance, just in efficiency. Most people who buy top-end stuff don't care about power consumption, so it matters diddly squat. I was hoping they'd release GK110 GPUs for around $500; I'd grab one at that price, but $800 is being too damn greedy for a single GPU. It would have to beat the GTX 680 by 80% to justify that price.
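For what it's worth, the break-even arithmetic behind that "80%" demand can be sketched in a few lines of Python (assumed figures: the GTX 680's $500 launch MSRP and the rumored $800 price tag):

```python
# Rough performance-per-dollar check behind the "80% faster" demand.
gtx680_price = 500.0   # GTX 680 launch MSRP in USD
rumored_price = 800.0  # rumored price of the new card in USD

# Speedup needed just to match the GTX 680's performance per dollar:
breakeven = rumored_price / gtx680_price - 1.0  # 0.6 -> 60% faster
print(f"Break-even speedup: {breakeven:.0%} faster than a GTX 680")

# The poster's 80% threshold therefore asks for strictly better
# perf/$ than the GTX 680, not mere parity:
demanded = 0.80
print(f"Margin demanded above parity: {demanded - breakeven:.0%}")
```

So an $800 card needs to be 60% faster just to break even on perf/$; the 80% figure builds in a margin on top of that.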
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Subjective justification aside, the GTX 690 commanded a $999 MSRP and did not offer an 80 percent overall increase over the GTX 680 at 1600p and 5760x1080 resolutions.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
I think about it the other way around: GPUs must become more and more powerful so we can have some more eye candy ;)

I would like that too, but even most enthusiasts have no idea about advanced methods like SGSSAA, AO, downsampling, 3D etc.
Most just go for multiple monitors and that's it.

Unfortunately, game developers won't give us incredible graphics out of the box, either because consoles are holding us back, because content creation is too expensive, or because they don't have the balls to tell their customers "you have to have SLI/CF to turn on ultra". If your average Joe cannot play at the highest settings with his $300 card, he goes on an "it's programmed badly" rant; see Crysis 1...
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
Just add more monitors :biggrin:

Never. I don't like the bezels; they kill immersion for me. I also look straight ahead when gaming, so I wouldn't really appreciate the peripheral monitors, as the visual field of attention is too small to take them all in at once. Some like it, I don't ;)
 

Pandora's Box

Senior member
Apr 26, 2011
428
151
116
Never. I don't like the bezels; they kill immersion for me. I also look straight ahead when gaming, so I wouldn't really appreciate the peripheral monitors, as the visual field of attention is too small to take them all in at once. Some like it, I don't ;)

portrait mode ftw.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Never. I don't like the bezels; they kill immersion for me. I also look straight ahead when gaming, so I wouldn't really appreciate the peripheral monitors, as the visual field of attention is too small to take them all in at once. Some like it, I don't ;)

That's the key: subjectivity! Some like it; some don't. I personally like dimensional innovation, going beyond the 2D plane of existence for gaming, and some don't.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I think about it the other way around: GPUs must become more and more powerful so we can have some more eye candy ;)

Indeed! The IHVs may need to find innovative ways to improve immersion, with features that raise the bar for gaming experiences.

I don't desire to just play games but experience them.
 

BababooeyHTJ

Senior member
Nov 25, 2009
283
0
0
But the Lightning has better PCB components, which allow for more overclocking headroom even on water :) Generally speaking, the reference 680 did not gain much in overclocking from water cooling. That definitely was not the case with the 680 Lightning: many users have achieved 1500MHz or higher, completely stable, on water. The quality of the components used was key to this.

I guess I'm a fan of the Lightning cards :(. I've used them for several gens now; they're the best of all the choices on the market. I hope Nvidia still allows custom boards.

That's if you can find a full-cover block.

I kind of have my doubts that an over-engineered PCB would make much of a difference for overclocking without the ability to adjust core voltage.

I would mostly be putting this monster on water for silence, although cooler temps might help with clocks on a large, hot die like this. I know that with my GTX 280, temps played a large role.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
That's the key: subjectivity! Some like it; some don't. I personally like dimensional innovation, going beyond the 2D plane of existence for gaming, and some don't.

Jesus, lay off the buds, SP, or at least share it :p
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Man, am I excited for this. Not because I want it (I don't), but because I just like seeing the bars get bigger and the numbers go up. Progress is awesome.
 

Pandora's Box

Senior member
Apr 26, 2011
428
151
116
To me it's just impressive what they are about to do: increasing single-card performance by around 170% over a GTX 680. Definitely interested in getting this card; I've never been a fan of SLI/CrossFire.
 