GeForce Titan coming end of February

Discussion in 'Video Cards and Graphics' started by Rikard, Jan 21, 2013.

Thread Status:
Not open for further replies.
  1. Jaydip

    Jaydip Diamond Member

    Joined:
    Mar 29, 2010
    Messages:
    3,659
    Likes Received:
    11
Box, I think Titan, like dual GPU cards, doesn't have to strictly follow the TDP scale set by ~$500 cards.
     
  2. boxleitnerb

    boxleitnerb Platinum Member

    Joined:
    Nov 1, 2011
    Messages:
    2,596
    Likes Received:
    1
    I like to look at all cards, small and large, from an engineering point of view, too. Efficiency is somewhat important to me, even with SLI. The lower the power consumption, the better for my wallet, my ears and my conscience :)
     
  3. Olikan

    Olikan Golden Member

    Joined:
    Sep 23, 2011
    Messages:
    1,829
    Likes Received:
    0
    yeah, and 50% more bandwidth :rolleyes:
     
  4. Keysplayr

    Keysplayr Elite Member

    Joined:
    Jan 16, 2003
    Messages:
    21,200
    Likes Received:
    48
    Might be all it needs. We'll see soon enough apparently.
     
  5. Keysplayr

    Keysplayr Elite Member

    Joined:
    Jan 16, 2003
    Messages:
    21,200
    Likes Received:
    48
    If you're really that concerned about efficiency, there are tons of options for you in the nv 6xx series and AMD 7xxx series. No real reason for you to be concerned about a few uber high end cards breaking the TDP "barrier".
     
  6. Jaydip

    Jaydip Diamond Member

    Joined:
    Mar 29, 2010
    Messages:
    3,659
    Likes Received:
    11
    You said the magic word :biggrin:
     
  7. RussianSensation

    RussianSensation Elite Member

    Joined:
    Sep 5, 2003
    Messages:
    19,458
    Likes Received:
    695
If Titan is trading blows with GTX680 SLI, that would automatically make it more efficient, since a single GTX680 uses 185W of power. So unless Titan uses 370W, it wins on efficiency too. :p If Titan is really that amazing, it makes no sense for anyone to own GTX680 SLI / GTX690 and put up with SLI. I am sure even if it comes in 10-15% behind, people will overclock it to match GTX680 SLI, or take the slight downgrade in FPS for smoother frame delivery. The used sale market is going to be flooded with those 680/690 GPUs. With GTX680 market prices so high, there hasn't been a better time to sell one!

Agreed. If NV puts on a very high quality cooler that dissipates 275-300W of power at a reasonable noise level (not like the first 6 months of GTX480 samples), then let her rip! GTX690 has a TDP of 300W and draws about 275W in games, and yet, thanks to its very high quality fan and cooler, it remains reasonably quiet for that level of performance.


    If NV manages to boost the Titan to 1050-1100mhz and real world power consumption is 275-280W, the target market for this card won't mind. I don't think many people who are spending $900 on a GPU care about another 25-30W of extra power over 250W TDP.
     
    #957 RussianSensation, Feb 13, 2013
    Last edited: Feb 13, 2013
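The efficiency argument in the post above boils down to simple performance-per-watt arithmetic. A minimal sketch, using the figures quoted in the post (185W per GTX 680, rumored 250W Titan TDP) and the rumored assumption that Titan matches GTX 680 SLI performance:

```python
# Back-of-the-envelope performance-per-watt check. The 185W and 250W figures
# come from the post above; "Titan matches 680 SLI" is a rumored assumption,
# not a measured fact.

def perf_per_watt(relative_perf, watts):
    """Relative performance divided by board power draw."""
    return relative_perf / watts

titan = perf_per_watt(relative_perf=1.0, watts=250)      # rumored 250W TDP
sli   = perf_per_watt(relative_perf=1.0, watts=2 * 185)  # GTX 680 SLI: 370W

# Titan wins on efficiency as long as it draws less than the SLI pair's 370W.
print(titan > sli)  # True
```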
  8. Crap Daddy

    Crap Daddy Senior member

    Joined:
    May 6, 2011
    Messages:
    610
    Likes Received:
    0
  9. nextJin

    nextJin Golden Member

    Joined:
    Apr 16, 2009
    Messages:
    1,848
    Likes Received:
    0
    No one paying a thousand dollars for a gpu gives a damn about that.
     
  10. PrincessFrosty

    PrincessFrosty Platinum Member

    Joined:
    Feb 13, 2008
    Messages:
    2,176
    Likes Received:
    19
Looks like a seriously immense piece of hardware, but with only 10k being made, the UK will see about 500 if we're lucky, and they'll be price-gouged up to about twice the USD RRP.

    Ugh...
     
  11. sontin

    sontin Diamond Member

    Joined:
    Sep 12, 2011
    Messages:
    3,024
    Likes Received:
    5
  12. Borealis7

    Borealis7 Platinum Member

    Joined:
    Oct 19, 2006
    Messages:
    2,412
    Likes Received:
    9
Cards like these should come with water blocks already fitted.
     
  13. n0x1ous

    n0x1ous Platinum Member

    Joined:
    Sep 9, 2010
    Messages:
    2,150
    Likes Received:
    17
  14. BallaTheFeared

    BallaTheFeared Diamond Member

    Joined:
    Nov 15, 2010
    Messages:
    8,128
    Likes Received:
    0
Some of the things he says are just plain wrong.

:whistle:
     
  15. SirPauly

    SirPauly Diamond Member

    Joined:
    Apr 28, 2009
    Messages:
    5,187
    Likes Received:
    0
    Absolutely!:)
     
  16. Hypertag

    Hypertag Member

    Joined:
    Oct 12, 2011
    Messages:
    148
    Likes Received:
    0
    Why are you finally acting like the GK110 card is good? You spent a year giving a hundred different reasons why it would barely be better than the 680.
     
  17. RussianSensation

    RussianSensation Elite Member

    Joined:
    Sep 5, 2003
    Messages:
    19,458
    Likes Received:
    695
BSN reports an 875mhz GPU clock. If true, most of you here owe tviceman and me beers! ;)

I wasn't aware I'm not allowed to be impressed, or to change my mind in light of additional leaked information. I never said I wouldn't be impressed by a 50-60% gain. What I said is that an 80-100% gain from a mythical 2880 SP, 1GHz, 240 TMU part was wishful thinking at the discussed 250W power level. Feel free to quote where I specifically said that if Titan was "only" 40-60% faster I wouldn't be impressed.

All my discussions regarding GK110 have always centered on the combination of its GPU clocks, die size, fully functional SMX units and real world power consumption/TDP. I even mentioned more than once that if NV wants to break 250W real world power consumption, they can go way beyond 50-60% over GTX680 performance, but I deemed that unlikely due to the issues they had with the GTX480 at 270W. Even in this very thread people continued to contest my opinion when I questioned the 1019mhz GPU clocks. I could be right, I could be wrong, but I'll throw my opinion in. That's what this forum is for. We share our viewpoints.

If you read any of my posts last year regarding GK110 claims, my biggest problems were with the fanboy wishful thinking of a 1GHz, 2880 SP, 240 TMU, 300W part, or doubling GTX680 performance inside a 250W power level, or even increasing GPU clocks by 80%. Based on the rumors, I repeatedly said I wouldn't have been surprised by a 40-60% performance increase inside a 235-250W power envelope, and from what we are hearing the card is coming in at 60-65% faster than a GTX680, not 80-100%. In case you need me to remind you again, a GTX690 or GTX680 SLI is not 2x faster than a GTX680.

I questioned the X7000+ 3DMark (Fire Strike Extreme) scores as well. "Preliminary results show that in 3DMark Fire Strike (Extreme), GTX Titan scores 4870. According to our own testing, this would put the part some 500 points (10%) behind two GeForce GTX 680s in SLI mode. Given that single GTX 680 achieves around 3000 points, almost 1900 points fewer than GTX Titan." 4900 points vs. 3000 points is a 63% increase, not 80-100%. This is not out of line with what I have been saying about GK110 to begin with (40-60%). Not sure what problem you have with my previous opinions.

Finally, I find it amusing that you have personally come out and questioned why I am suddenly impressed by GK110's specs. The whole point of discussing theoretical GPU specs is to throw around different ideas. That's what makes the forum fun. People here have different viewpoints regarding prospective GPU specs, so why have you specifically chosen to call me out over my opinion on GK110? I never claimed to be able to predict the specs of future GPUs with 99% certainty. Looking at the past, the forum has been wrong 3/3 times when everyone went gangbusters over GTX480/580/680 hype specs. You failed to mention that part. Did you proceed to tell everyone who guessed wrong on the GTX480/580/680's specs or performance estimates?
     
    #967 RussianSensation, Feb 13, 2013
    Last edited: Feb 13, 2013
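The percentage claim in the post above is straightforward to reproduce. A quick check using the rounded scores quoted from the BSN preview (rumored, not measured, numbers):

```python
# 3DMark Fire Strike Extreme scores quoted in the post above:
# rumored Titan ~4900, single GTX 680 ~3000.
titan_score = 4900
gtx680_score = 3000

# Relative gain of Titan over a single GTX 680, in percent.
gain_pct = (titan_score / gtx680_score - 1) * 100
print(round(gain_pct))  # 63
```

A ~63% gain sits just above the 40-60% range the poster had been predicting, which is the point being made.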
  18. BallaTheFeared

    BallaTheFeared Diamond Member

    Joined:
    Nov 15, 2010
    Messages:
    8,128
    Likes Received:
    0
    Base, or boost?
     
  19. AdamK47

    AdamK47 Lifer

    Joined:
    Oct 9, 1999
    Messages:
    12,111
    Likes Received:
    68
    I predict nice sales for slightly used GTX 690s will follow the release of this card.
     
  20. SirPauly

    SirPauly Diamond Member

    Joined:
    Apr 28, 2009
    Messages:
    5,187
    Likes Received:
    0
According to the same BSN article, Asus and EVGA have some custom clock flexibility; allegedly a 915MHz base clock was offered for the Asus SKU.
     
  21. RussianSensation

    RussianSensation Elite Member

    Joined:
    Sep 5, 2003
    Messages:
    19,458
    Likes Received:
    695
    What if they disabled Kepler boost but re-enabled full voltage control with a fixed GPU clock speed of 875-915mhz? Or maybe the secret sauce is a huge GPU boost with TDP exceeding 235W for power users?

    10,000 Titans worldwide sounds like a pretty limited edition card.
     
    #971 RussianSensation, Feb 13, 2013
    Last edited: Feb 13, 2013
  22. sontin

    sontin Diamond Member

    Joined:
    Sep 12, 2011
    Messages:
    3,024
    Likes Received:
    5
It would be the base clock. Tools have no way to read the boost clock.

Makes sense. That way they can give people what they want: voltage control. :lol:
     
  23. n0x1ous

    n0x1ous Platinum Member

    Joined:
    Sep 9, 2010
    Messages:
    2,150
    Likes Received:
    17
Can't wait for this thing! Feels like the 8800GTX reborn, and something worthy of replacing my 480s.
     
  24. RussianSensation

    RussianSensation Elite Member

    Joined:
    Sep 5, 2003
    Messages:
    19,458
    Likes Received:
    695
I find it hilarious this is launching 2 days before Sony's big announcement of a possible PS4.

NV's 1-finger salute to the PS4.

     
  25. wand3r3r

    wand3r3r Diamond Member

    Joined:
    May 16, 2008
    Messages:
    3,187
    Likes Received:
    0
The only problem is the assumption that they are going to put a ridiculous price on it.

You could have had better performance for a long time with a 690 (two midrange 560 Ti successors), and now this will only bring a huge price increase (according to rumors). Yeah, I know you don't have to deal with SLI, but regardless, if high end prices just went from $550/$600 to $800, then NV is just delaying the high end and inflating the price. Without a massive price increase this would be a lot more interesting. They are finally able to pass off major price increases and get away with it.
     
Thread Status:
Not open for further replies.