GTX 680 really a 660 Ti?

Discussion in 'Video Cards and Graphics' started by Smoblikat, Mar 22, 2012.

  1. Smoblikat

    Smoblikat Diamond Member

    Joined:
    Nov 19, 2011
    Messages:
    4,846
    Likes Received:
    36
If their high-end chip was originally going to be their mid-range chip... then yes, I WILL say that the 8900 should be the mid-range chip. That only makes sense
     
  2. BallaTheFeared

    BallaTheFeared Diamond Member

    Joined:
    Nov 15, 2010
    Messages:
    8,128
    Likes Received:
    0
Actually, from what I've seen looking back at GF114 vs GF110, there is a ~40% TDP increase for a ~30% performance increase.

GF110 isn't all that much less efficient than GF114.
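    Quick back-of-the-envelope on that, using the rough ~40% / ~30% figures above (my estimates, not measured numbers):

        # Relative perf/watt, GF110 (GTX 580) vs GF114 (GTX 560 Ti),
        # built from the ballpark figures above rather than measurements.
        gf114_perf, gf114_tdp = 1.00, 1.00   # GF114 as the baseline
        gf110_perf, gf110_tdp = 1.30, 1.40   # ~30% faster for ~40% more TDP

        ratio = (gf110_perf / gf110_tdp) / (gf114_perf / gf114_tdp)
        print(f"GF110 perf/watt vs GF114: {ratio:.2f}")  # ~0.93, only ~7% worse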

    Keep in mind Nvidia is shipping GK104 with 8+6+6 connectors on the reference PCB.

GK104 gave the so-called "enthusiast" websites like [H] a product that has a high-end name and marginally more performance than the turd AMD put out, while being considerably quieter and using less power to do it.

Neither the 7970 nor the 680 is what I'd call a next-gen enthusiast card. Nvidia pandered to the weakest link in our market: the people who buy an i7-2600K, see how far it can go on the stock cooler, then complain about how hot/loud it gets at 4GHz.


Still waiting for an upgrade / 680 OC'ing getting fixed / the price to go down. Hopefully we see a 685 or something built the way GF110 was - a 250W+ TDP chip that all the so-called "enthusiast" websites hate, but that I can put on water and get 50%+ more performance out of than this 680.
     
  3. Smoblikat

    Smoblikat Diamond Member

    Joined:
    Nov 19, 2011
    Messages:
    4,846
    Likes Received:
    36
Exactly what I want. I'll take the normal TDP of 250W+ for a REAL enthusiast's card. I'm thinking a GTX 685 will come.
     
  4. Lepton87

    Lepton87 Platinum Member

    Joined:
    Jul 28, 2009
    Messages:
    2,546
    Likes Received:
    7
I'm saying that the 460 could easily have had GTX 560 Ti performance if they had wanted. Only the thermals would have been different from the GTX 560 Ti.
     
  5. Qbah

    Qbah Diamond Member

    Joined:
    Oct 18, 2005
    Messages:
    3,706
    Likes Received:
    4
And also would:
    - have lower yields (= higher price), as fewer chips would hit the target clocks
    - need better cooling (= louder, or beefier = pricier)
    - need a better PSU (OEMs don't like that)
    - have far lower OCing capability (wouldn't be such a hit in the enthusiast segment)

So... it wouldn't really have been as popular as the GTX 460 was/is, and it wouldn't have sold as much. Instead, they put a product on the market which sold like hot cakes and made its buyers happy. As simple as that. There's no conspiracy here, really ;)
     
  6. BallaTheFeared

    BallaTheFeared Diamond Member

    Joined:
    Nov 15, 2010
    Messages:
    8,128
    Likes Received:
    0
I think GF100 had more of a say than the things you listed. Nvidia was behind with the 480, missing their target core count and probably clocks as well, which affected the 470 too.

    GF104 had to be clocked low enough not to outpace the 470, while still allowing the 470 to look decent considering it was $100 more.
     
  7. Lepton87

    Lepton87 Platinum Member

    Joined:
    Jul 28, 2009
    Messages:
    2,546
    Likes Received:
    7
But if GF100 had flopped any more than it did, they had a plan B. The GTX 680 seems like plan B all the way.
     
  8. rusina

    rusina Junior Member

    Joined:
    Mar 20, 2012
    Messages:
    24
    Likes Received:
    0
Die size is one thing indicating this. The poor double-precision performance and relatively low power consumption point the same way. It's easy to see that GK104 isn't the best possible GPU Nvidia could build for the consumer market.

Also, the claim was that GK104 originally wasn't meant as a high-end GPU. That doesn't necessarily mean that Nvidia couldn't sell it as one.

It would only be half as big if it were just a die shrink. For example, the GTX 460's GPU had 500 million more transistors than the GPU on the GTX 260, and it was still faster and smaller.
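    To put rough numbers on the shrink argument (ideal scaling, which real chips never quite hit), here's a quick sketch:

        # A pure optical shrink scales die area with the square of the
        # feature size, so 40nm -> 28nm should roughly halve the area.
        ideal_area_scale = (28 / 40) ** 2
        print(f"Ideal 40nm -> 28nm area scaling: {ideal_area_scale:.2f}x")  # ~0.49x

        # Same pattern last generation: the GTX 460's GF104 carried roughly
        # 0.5 billion more transistors than the GTX 260's GT200, yet came
        # out smaller and faster on the move down to 40nm.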
     
  9. Mopetar

    Mopetar Diamond Member

    Joined:
    Jan 31, 2011
    Messages:
    3,875
    Likes Received:
    170
I don't think it affected the 470 too much. GF100 certainly had problems - the most CUDA cores any of its products shipped with was 480 - but the 470 had 448 cores enabled whereas GF104 only had 336, leaving plenty of breathing room. The difference in die size alone was quite staggering, leaving the 470 enough room that it wouldn't have been necessary to artificially gimp the 460.
     
  10. Mopetar

    Mopetar Diamond Member

    Joined:
    Jan 31, 2011
    Messages:
    3,875
    Likes Received:
    170
    I'm guessing that nVidia just didn't want to repeat the whole thing a second time. Assuming that GK100 was anything like GF100, it would have been massive, probably hot (GK104 is a little hotter than expected, but that could just be the cooler design), and likely would need to have cores disabled just to yield decently. nVidia didn't want to come stumbling out of the gate again, especially if their wafers were even more limited this time around. It just wouldn't be cost effective, or allow them to get things done on time.

I don't know if they rolled the dice or had enough information about what AMD was doing to make a good judgement call, but it's worked out well enough for them. We don't have any real clue what the schedule was looking like on GK100, but apparently it was either behind by enough that nVidia needed to release something, or there were design hurdles that couldn't be fixed without a major overhaul, so it was scrapped. Without any better evidence, it's just as likely that nVidia chose to cancel a GK100 that might have eventually worked, albeit with lower yields and at a later date than hoped for, as that they had no choice but to cancel a GK100 that had no real hope of working as planned without major work and months of time.
     
  11. BallaTheFeared

    BallaTheFeared Diamond Member

    Joined:
    Nov 15, 2010
    Messages:
    8,128
    Likes Received:
    0
480 > 470 was only one less cluster; the only way to go from $350 to $500 when the per-clock performance difference was going to be around 5% was to set the 470's clock rate noticeably lower than the 480's, and the 480 was already gimped at 700MHz...

So you drop per-clock performance by 5%, and then you need another drop on top of that; starting from a 700MHz base, you're going to need a decent performance gap between the two, because 256MB more VRAM isn't going to be enough to warrant $150... 5% per clock + a 13% clock reduction = the $350 470.
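    If you multiply those two cuts out (my estimates, not anything official):

        # ~5% less work per clock, plus a ~13% clock reduction
        # (roughly the 607MHz vs 700MHz split between the 470 and 480)
        per_clock = 0.95
        clock = 0.87
        print(f"470 relative to 480: {per_clock * clock:.2f}")  # ~0.83, a ~17% gap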



Anyway, I'm just saying the 460's TDP is so low, and it overclocked well enough, that they obviously had headroom in both TDP and clocks, but they were limited more by GF100 than by the other factors listed.


GF114 is GF104, and that's the class GK104 is in. Will we ever see another GF100-style "BigK" card? I dunno - Nvidia took so much shit over GF100 and barely skated by with GF110, so what's the point?

Look at the reviews for the 680 - reviewers seem to love these cards. None of them care that it has awful overclocking potential; none of them care that it isn't much faster than the 580 they've had for over a year. They don't even care that it has no compute power.

All they care about is that it has better perf/watt than the 580/7970. It's barely any faster, it's noticeably quieter, and it's called the GTX 680. Personally, I've never felt so detached from the review sites in my life: they've seemingly killed BigK and welcomed with open arms a product that has no place in a high-end system, yet carries a high-end price tag.
     
  12. nenforcer

    nenforcer Golden Member

    Joined:
    Aug 26, 2008
    Messages:
    1,762
    Likes Received:
    0
[IMG]

I think this 660 talk only started after yesterday's release and the architecture comparison showing GK104 to be this generation's GF114.

The picture above, which surfaced several weeks ago, indicated it may have originally been targeted as a GTX 670 Ti, and possibly the second batch of silicon (what do they call it, A2?) let them clock it fast enough to be marketed as the GTX 680.

Or it's possible that the GPU above is completely unrelated and is now something like GK106.
     
    #87 nenforcer, Mar 23, 2012
    Last edited: Mar 23, 2012
  13. Ajay

    Ajay Diamond Member

    Joined:
    Jan 8, 2001
    Messages:
    3,256
    Likes Received:
    286
I think most of the 660 talk resulted from the GPU being called GK104. The 670 Ti talk started because there were some rumors of it being called that - but as you point out, it could have just been confusion after people saw 670 Ti-marked parts (which may have been mock-ups of the real 670 Ti that has yet to come). It's all pretty crazy actually :\
     
  14. Smoblikat

    Smoblikat Diamond Member

    Joined:
    Nov 19, 2011
    Messages:
    4,846
    Likes Received:
    36
  15. Rvenger

    Rvenger Elite Member <br> Super Moderator <br> Video Cards
    Super Moderator

    Joined:
    Apr 6, 2004
    Messages:
    6,293
    Likes Received:
    4
  16. Destiny

    Destiny Platinum Member

    Joined:
    Jul 6, 2010
    Messages:
    2,309
    Likes Received:
    0
I never doubted you... I had a suspicion, but just needed evidence... When I saw leaked test results for the GTX 670, it just didn't add up, because the performance was too close to the GTX 680 - similar to the GTX 560 Ti versus the GTX 560... Now they are selling the GTX 660 Ti (now the GTX 680) and the GTX 660 (now the GTX 670) as high-end cards, because the performance does place them in the high end... It may also explain the low supply, if they were caught off guard by the card's performance...

    Good Job btw! :biggrin:
     
  17. blastingcap

    blastingcap Diamond Member

    Joined:
    Sep 16, 2010
    Messages:
    6,656
    Likes Received:
    5
Please credit the proper source, which is VR-Zone, not some forum.

    http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html

This reminds me of the Intel Core scenario, where Intel messed up with NetBurst and had to dig deep into their Israeli "high efficiency/mobile/Pentium M" team, which was working on innovative energy-efficiency measures. They took the high-efficiency architecture lessons from the Pentium M and turned them into Core and Core 2 Duo.

    First there were energy guzzlers, but the heat/power/noise got so bad that performance hit a wall.

    Then they turned into high-efficiency architectures.

    Then they supercharged the high-efficiency architectures to make something both fast and relatively efficient.

    P.S. This is also somewhat old news. TPU has had this article up for ages: http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html
     
  18. guskline

    guskline Diamond Member

    Joined:
    Apr 17, 2006
    Messages:
    5,050
    Likes Received:
    246
You never own the fastest video card made for very long, because a faster one is just around the corner! That being said, last week I bought what was the fastest Nvidia card until the GTX 690 debuted. Did I pay too much? It's all relative. I was in the market for a modern single GPU to power 3 monitors at a combined 5760x1080 resolution. My realistic choices were an AMD 7970 vs an Nvidia GTX 680. Was I wrong? I don't think so, but that's only me. BTW, I have the card, and but for a failed SSD, the system runs much better and smoother in games than the 6970 did. For me it was worth it. I have no buyer's remorse as I'm flying a WWI plane over 3 monitors with incredible fps at high resolution. My fellow posters who own the 7970 can make the same claims, and I'm glad for them too.
     
    #93 guskline, May 8, 2012
    Last edited: May 8, 2012
  19. SolMiester

    SolMiester Diamond Member

    Joined:
    Dec 19, 2004
    Messages:
    5,317
    Likes Received:
    14
Don't think it was that simple. GK110 would have been the 680, but it probably had yield issues. GK104 didn't have its big brother's compute, so it clocked well enough to beat or equal the 7970 - so let's get it out the door to start the 28nm line-up... probably why they have the 690 out now rather than the smaller-GPU line-up...
     
  20. SolMiester

    SolMiester Diamond Member

    Joined:
    Dec 19, 2004
    Messages:
    5,317
    Likes Received:
    14
Interestingly enough, this is exactly what I would have done too... that would have killed any enthusiasm for the AMD line-up, with the thought of higher-end parts to come. I guess it's moot, as there is no further high end, but I guess they wouldn't be able to charge $500 for a mid-range part...
     
  21. Magic Carpet

    Magic Carpet Diamond Member

    Joined:
    Oct 2, 2011
    Messages:
    3,107
    Likes Received:
    4
If memory serves me right, the Pentium D (NetBurst) had also been developed by the same Israeli team.
     
  22. T_Yamamoto

    T_Yamamoto Lifer

    Joined:
    Jul 6, 2011
    Messages:
    13,860
    Likes Received:
    9
Still makes me laugh