Old 03-23-2012, 03:12 PM   #76
Smoblikat
Diamond Member
 
Join Date: Nov 2011
Posts: 3,794

Quote:
Originally Posted by Zebo View Post
It's bullshit. nV took their best, newest, most efficient design, the 560 Ti, and grew upon it. Just like AMD will take their best, newest, most efficient design, the 7870, and grow upon it. Are we going to say the 8900 is meant to be a mid-range chip when that happens?


Close.
If their high-end chip was originally going to be their mid-range chip... then yes, I WILL say that the 8900 should be the mid-range chip. That only makes sense.
__________________
3770K|ASrock Z77 Extreme11|4x8gb DDR3 1600|4xHD6970|1440P 120hz - Buzzard
X6 1055T|ASUS M4A89GTD-EVO USB3|2x4gb Gskill 1600|HD4870X2 + HD48701gb - Virgo
2xXeon L5639|EVGA SR2|6x8gb DDR3|4x2TB
Old 03-23-2012, 03:36 PM   #77
BallaTheFeared
Diamond Member
 
 
Join Date: Nov 2010
Posts: 8,128

Actually, from what I've seen looking back at GF114 vs. GF110, there's a 40% TDP increase for a ~30% performance increase.

GF110 isn't all that less efficient than GF114.
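
Quick back-of-the-envelope on that, taking the quoted deltas (+40% TDP, ~+30% performance) at face value rather than as measured numbers - a rough sketch in Python:

Code:
# Rough sketch: relative perf/watt of GF110 vs GF114, using the
# quoted deltas above as inputs (+40% TDP, ~30% more performance).
gf114_perf, gf114_tdp = 1.00, 1.00  # GF114 as the baseline
gf110_perf = gf114_perf * 1.30      # ~30% faster
gf110_tdp = gf114_tdp * 1.40        # ~40% higher TDP

ratio = (gf110_perf / gf110_tdp) / (gf114_perf / gf114_tdp)
print(f"GF110 perf/watt relative to GF114: {ratio:.2f}x")  # ~0.93x

That works out to GF110 keeping roughly 93% of GF114's perf/watt, which is the point.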

Keep in mind Nvidia is shipping GK104 with 8+6+6 connectors on the reference PCB.

GK104 gave the so-called "enthusiast" websites like [H] a product that has a high-end name and marginally more performance than the turd AMD put out, while being considerably quieter and using less power to do it.

Neither the 7970 nor the 680 is what I'd call a next-gen enthusiast card. Nvidia pandered to the weakest link in our market: the people who buy an i7-2600K, see how far it can go on the stock cooler, then complain about how hot/loud it gets at 4GHz.

Still waiting for an upgrade / 680 OC'ing getting fixed / the price to go down. Hopefully we see a 685 or something in the GF110 mold with a 250W+ TDP that all the so-called "enthusiast" websites will hate, but that I can put on water and get 50%+ more performance out of than this 680.
Old 03-23-2012, 05:55 PM   #78
Smoblikat
Diamond Member
 
Join Date: Nov 2011
Posts: 3,794

Quote:
Originally Posted by BallaTheFeared View Post
Actually, from what I've seen looking back at GF114 vs. GF110, there's a 40% TDP increase for a ~30% performance increase.

GF110 isn't all that less efficient than GF114.

Keep in mind Nvidia is shipping GK104 with 8+6+6 connectors on the reference PCB.

GK104 gave the so-called "enthusiast" websites like [H] a product that has a high-end name and marginally more performance than the turd AMD put out, while being considerably quieter and using less power to do it.

Neither the 7970 nor the 680 is what I'd call a next-gen enthusiast card. Nvidia pandered to the weakest link in our market: the people who buy an i7-2600K, see how far it can go on the stock cooler, then complain about how hot/loud it gets at 4GHz.

Still waiting for an upgrade / 680 OC'ing getting fixed / the price to go down. Hopefully we see a 685 or something in the GF110 mold with a 250W+ TDP that all the so-called "enthusiast" websites will hate, but that I can put on water and get 50%+ more performance out of than this 680.
Exactly what I want; I'll take the normal TDP of 250W+ for a REAL enthusiast's card. I'm thinking a GTX 685 will come.
__________________
3770K|ASrock Z77 Extreme11|4x8gb DDR3 1600|4xHD6970|1440P 120hz - Buzzard
X6 1055T|ASUS M4A89GTD-EVO USB3|2x4gb Gskill 1600|HD4870X2 + HD48701gb - Virgo
2xXeon L5639|EVGA SR2|6x8gb DDR3|4x2TB
Old 03-23-2012, 06:08 PM   #79
Lepton87
Golden Member
 
 
Join Date: Jul 2009
Location: Poland(EU)
Posts: 1,952

Quote:
Originally Posted by Qbah View Post
So they did a better refresh on the GTX460 than they did on the GTX280 - how does that prove anything related to a generation leap?

AMD/ATi:
X700 vs 9800XT - nope
X1600 vs X850XT - nope
HD2600XT vs X1950XTX - nope
HD4850 vs HD3870 - yes... here's the only one for AMD/ATi (HD48xx was a killer product)
HD5770 vs HD4890 - nope (no refresh here), same for HD5830
HD7850 vs HD6970 - nope - jacked up pricing too on the 7-series

nVidia:
6600GT vs FX5900Ultra - yes... here's the only one for nVidia (FX series was a total failure)
7600GT vs 6800Ultra - nope (no refresh here)
8600GT vs 7900GTX - nope
GTS250 vs 9800GTX - those are the same cards...
GTX460 vs GTX285 - nope
GTX660Ti(?) vs 580GTX - can't say yet - wouldn't hold my breath though!
I'm saying that the 460 could easily have had GTX 560 Ti performance if they'd wanted. Only the thermals would be different from the GTX 560 Ti.
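
For a rough sense of why "only the thermals" change: dynamic power scales roughly with frequency times voltage squared (P ~ f*V^2). A minimal sketch - the clocks are the reference core clocks, but the voltages are illustrative assumptions, not measured values:

Code:
# Rough sketch: dynamic power scales roughly as frequency * voltage^2.
# Reference core clocks; voltages are assumed for illustration only.
f_460, v_460 = 675.0, 1.00      # GTX 460 (MHz, assumed volts)
f_560ti, v_560ti = 822.0, 1.05  # GTX 560 Ti (MHz, assumed volts)

scale = (f_560ti * v_560ti**2) / (f_460 * v_460**2)
print(f"Estimated dynamic-power increase: {scale:.2f}x")  # ~1.34x

So a 460 pushed to 560 Ti clocks would plausibly draw about a third more power, all else equal.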
__________________
5820K @4.25GHz cache ratio 1:1 (4250MHz) 34x125 1.275V,ASUS ROG Rampage V, 16GB DDR4 G.Skill 2667@2750MHz CL 15 4 channels,Noctua NH-D14(CUSTOM WC incoming), Gigabyte GTX Titan SLI, 2x Corsair MX100 256 in Raid 0, 2xSeagate 3TB 7200RPM in RAID 0, Sandforce 2 120GB, Sandforce 64GB, 2TB WD Caviar Green, Seagate 1TB 7200RPM,Seagate 500GB, Seagate 1TB USB 3.0, BE Quiet 1200W, dell u2711
Old 03-23-2012, 06:52 PM   #80
Qbah
Diamond Member
 
 
Join Date: Oct 2005
Location: Copenhagen, DK
Posts: 3,466

Quote:
Originally Posted by Lepton87 View Post
I'm saying that the 460 could easily have had GTX 560 Ti performance if they'd wanted. Only the thermals would be different from the GTX 560 Ti.
And also would:
- have lower yields (=higher price), as fewer chips would hit target clocks
- need better cooling (=louder or beefier=pricier)
- need better PSU (OEMs don't like it)
- have far lower OCing capability (wouldn't be such a hit in the enthusiast segment)

So... it really wouldn't be as popular as the GTX 460 was/is, and it wouldn't sell as much. Instead, they put a product on the market which sold like hotcakes and made its buyers happy. As simple as that. There's no conspiracy here, really.
__________________
Q9450 | GA-EP35-DS3R | 8GB RAM | X25-M | P182 | HX520W | Win8Pro 64bit | GTX670 | 40" 1080p HDTV | Z-2300
Old 03-23-2012, 07:16 PM   #81
BallaTheFeared
Diamond Member
 
 
Join Date: Nov 2010
Posts: 8,128

Quote:
Originally Posted by Qbah View Post
And also would:
- have lower yields (=higher price), as fewer chips would hit target clocks
- need better cooling (=louder or beefier=pricier)
- need better PSU (OEMs don't like it)
- have far lower OCing capability (wouldn't be such a hit in the enthusiast segment)

So... it really wouldn't be as popular as the GTX 460 was/is, and it wouldn't sell as much. Instead, they put a product on the market which sold like hotcakes and made its buyers happy. As simple as that. There's no conspiracy here, really.
I think GF100 had more of a say than the things you listed. Nvidia was behind with the 480, missing their target core count and probably clocks as well, which affected the 470 too.

GF104 had to be clocked low enough not to outpace the 470, and still allow the 470 to look decent considering it was $100 more.
Old 03-23-2012, 07:58 PM   #82
Lepton87
Golden Member
 
 
Join Date: Jul 2009
Location: Poland(EU)
Posts: 1,952

Quote:
Originally Posted by Qbah View Post
And also would:
- have lower yields (=higher price), as fewer chips would hit target clocks
- need better cooling (=louder or beefier=pricier)
- need better PSU (OEMs don't like it)
- have far lower OCing capability (wouldn't be such a hit in the enthusiast segment)

So... it really wouldn't be as popular as the GTX 460 was/is, and it wouldn't sell as much. Instead, they put a product on the market which sold like hotcakes and made its buyers happy. As simple as that. There's no conspiracy here, really.
But if GF100 had flopped any more than it did, they had a plan B. The GTX 680 seems like plan B all the way.
__________________
5820K @4.25GHz cache ratio 1:1 (4250MHz) 34x125 1.275V,ASUS ROG Rampage V, 16GB DDR4 G.Skill 2667@2750MHz CL 15 4 channels,Noctua NH-D14(CUSTOM WC incoming), Gigabyte GTX Titan SLI, 2x Corsair MX100 256 in Raid 0, 2xSeagate 3TB 7200RPM in RAID 0, Sandforce 2 120GB, Sandforce 64GB, 2TB WD Caviar Green, Seagate 1TB 7200RPM,Seagate 500GB, Seagate 1TB USB 3.0, BE Quiet 1200W, dell u2711
Old 03-23-2012, 08:13 PM   #83
rusina
Junior Member
 
Join Date: Mar 2012
Posts: 24

Quote:
Originally Posted by OCGuy View Post
Huh? Since when is "high end" determined by die size? It is the fastest GPU available; that alone makes it "high end".

I think people are grasping at straws here. Probably the same people that complained about Fermi 1.0 being big, loud, and hot.
Die size is one thing indicating this. There's the poor double-precision performance and the relatively low power consumption as well. It's easy to see that this GK104 isn't the best possible GPU Nvidia could build for the consumer market.

Also, the claim was that GK104 wasn't originally meant as a high-end GPU. That doesn't necessarily mean that Nvidia couldn't sell it as one.

Quote:
Originally Posted by evilspoons View Post
Well... we're now seeing a 28 nm GPU vs a 40 nm GPU. If the design were exactly the same as the 560 ti, it would still be significantly smaller due to feature size change... on the order of 50%. So... you've got a chip that SHOULD be half as big but is only 20% smaller, meaning there's a whole bunch of extra stuff.
It would only be half as big if it were just a die shrink. For example, the GTX 460's GPU had 500 million more transistors than the GTX 260's, and it was still faster and smaller.
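
The "half as big" expectation is just ideal area scaling - a minimal sketch, assuming a perfect shrink, which real processes never quite deliver:

Code:
# Rough sketch: ideal die-area scaling for a 40nm -> 28nm shrink.
# Treat the result as an upper bound; real shrinks scale less cleanly.
old_node, new_node = 40.0, 28.0
area_ratio = (new_node / old_node) ** 2
print(f"Ideal shrunk area: {area_ratio:.0%} of the original die")  # ~49%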
Old 03-23-2012, 08:15 PM   #84
Mopetar
Diamond Member
 
Join Date: Jan 2011
Posts: 3,018

Quote:
Originally Posted by BallaTheFeared View Post
I think GF100 had more of a say than the things you listed. Nvidia was behind with the 480, missing their target core count and probably clocks as well, which affected the 470 too.

GF104 had to be clocked low enough not to outpace the 470, and still allow the 470 to look decent considering it was $100 more.
I don't think it affected the 470 too much. GF100 certainly had problems, as the most CUDA cores any of its products shipped with was 480, but the 470 had 448 cores enabled whereas GF104 only had 336, leaving plenty of breathing room. The differences in die size alone were staggering, leaving the 470 enough room that it wouldn't have been necessary to artificially gimp the 460.
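
That breathing room is easy to put in numbers - a quick sketch using the reference shader counts of the shipped parts:

Code:
# Rough sketch: CUDA-core gap between the GTX 470 and GTX 460,
# using the reference shader counts of the shipped parts.
gtx470_cores = 448  # GF100-based GTX 470 (14 of 16 SMs enabled)
gtx460_cores = 336  # GF104-based GTX 460 (7 of 8 SMs enabled)

print(f"GTX 470 vs GTX 460 core count: {gtx470_cores / gtx460_cores:.2f}x")  # ~1.33x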
Old 03-23-2012, 08:26 PM   #85
Mopetar
Diamond Member
 
Join Date: Jan 2011
Posts: 3,018

Quote:
Originally Posted by Lepton87 View Post
But if GF100 had flopped any more than it did, they had a plan B. The GTX 680 seems like plan B all the way.
I'm guessing that nVidia just didn't want to repeat the whole thing a second time. Assuming that GK100 was anything like GF100, it would have been massive, probably hot (GK104 is a little hotter than expected, but that could just be the cooler design), and likely would need to have cores disabled just to yield decently. nVidia didn't want to come stumbling out of the gate again, especially if their wafers were even more limited this time around. It just wouldn't be cost effective, or allow them to get things done on time.

I don't know if they rolled the dice or had enough information about what AMD was doing to make a good judgement call, but it's worked out well enough for them. We don't have any real clue what the schedule was looking like on GK100, but it was apparently either behind by enough that nVidia needed to release something or there were some design hurdles that couldn't be fixed without a major overhaul so that it was scrapped. Without any better evidence it's just as likely that nVidia made a decision to cancel a GK100 that might have eventually worked, albeit with lower yields and at a later date than hoped for, or they had no choice but to cancel a GK100 that had no real hope of working as planned without major work and months of time.
Old 03-23-2012, 09:21 PM   #86
BallaTheFeared
Diamond Member
 
 
Join Date: Nov 2010
Posts: 8,128

Quote:
Originally Posted by Mopetar View Post
I don't think it affected the 470 too much. GF100 certainly had problems, as the most CUDA cores any of its products shipped with was 480, but the 470 had 448 cores enabled whereas GF104 only had 336, leaving plenty of breathing room. The differences in die size alone were staggering, leaving the 470 enough room that it wouldn't have been necessary to artificially gimp the 460.
480 > 470 was only one fewer cluster. The only way to justify going from $350 to $500, when the per-clock performance difference was going to be around 5%, was to set the 470's clock rate noticeably lower than the 480's - and the 480 was already gimped at 700MHz...

So you drop per-clock performance by 5%, but you still need another drop; starting from a 700MHz base you're going to need a decent performance gap between the two, and 256MB of VRAM isn't going to be enough to warrant $150... 5% per clock + a 13% clock reduction = the $350 470.
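
Stacking those two numbers gives the intended gap - a quick sketch using the figures from the post above:

Code:
# Rough sketch: combine the ~5% per-clock deficit with the ~13%
# clock reduction cited above to estimate the 470's relative speed.
per_clock = 0.95        # ~5% slower per clock than the 480
clock_ratio = 1 - 0.13  # ~13% lower core clock

print(f"Estimated GTX 470 perf vs GTX 480: {per_clock * clock_ratio:.0%}")  # ~83%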



Anyways, I'm just saying the 460's TDP was so low, and it overclocked well enough, that they obviously had room both in TDP and in clocks, but were limited more by GF100 than by the other factors listed.


GF114 is essentially GF104, and GK104 is the same kind of chip. Will we ever see another GF100-class "BigK" card? I dunno; Nvidia took so much shit over GF100 and barely skated by with GF110, so what's the point?

Look at the reviews for the 680; reviewers seem to love these cards. None of them care that it has awful overclocking potential, none of them care that it isn't much faster than the 580 they've had for over a year. They don't even care that it has no compute power.

All they care about is that it has better perf/watt than the 580/7970: it's barely any faster, noticeably quieter, and it's called the GTX 680. Personally, I've never felt so detached from the review sites in my life; they've seemingly killed BigK and welcomed with open arms a product that has no place in a high-end system, yet carries a high-end price tag.
Old 03-23-2012, 09:48 PM   #87
nenforcer
Golden Member
 
 
Join Date: Aug 2008
Posts: 1,500

Quote:
Originally Posted by Puppies04 View Post
I do hope that everyone complaining that it should be called a 660 Ti (and priced around the 560 Ti mark) actually realises that if nVidia did do this, it would basically kill AMD's high-end graphics division.


I think this 660 talk started only after yesterday's release and the architectural comparison showing that GK104 follows GF114.

The picture above, which surfaced several weeks ago, indicated it may have originally been targeted as a GTX 670 Ti, and possibly the second batch of silicon (what do they call it, A2?) allowed them to clock it fast enough to be marketed as the GTX 680.

Or it's possible that the GPU above is completely unrelated and is now something like GK106.
__________________
nForcer 2
======
AMD Sempron 3300+ @ 2.2GHz Barton Sock A 512Kb L2
ASUS A7N8X-E Deluxe nForce 2 MB
Seagate 7200.10 7200 RPM 250GB IDE w/ 8MB Cache Perpendicular
EVGA Geforce 7800GS 256MB AGP 8X
BFGTech Ageia Physx 128MB PPU PCI
1GB (512MBx2) Crucial Ballistix DDR400 4-4-4 Dual Channel
nVidia Soundstorm Dolby Digital Coaxial
Sony CPD-E540 21" CRT Monitor 1600x1200 85Hz VSYNC Off
Windows XP SP3

Last edited by nenforcer; 03-23-2012 at 09:51 PM.
Old 03-23-2012, 11:23 PM   #88
Ajay
Platinum Member
 
 
Join Date: Jan 2001
Location: NH, USA
Posts: 2,166

Quote:
Originally Posted by nenforcer View Post

I think this 660 talk started only after yesterday's release and the architectural comparison showing that GK104 follows GF114.

The picture above, which surfaced several weeks ago, indicated it may have originally been targeted as a GTX 670 Ti, and possibly the second batch of silicon (what do they call it, A2?) allowed them to clock it fast enough to be marketed as the GTX 680.

Or it's possible that the GPU above is completely unrelated and is now something like GK106.
I think most of the 660 talk resulted from the GPU being called GK104. The 670 Ti talk started because there were some rumors of it being called that - but, as you point out, that could just have been confusion after people saw 670 Ti-marked parts (which may have been mock-ups of the real 670 Ti that has yet to come). It's all pretty crazy, actually.
__________________
Asus P6T V2 Deluxe Ci7 970 @ 4.2GHz w/HT, Corsair H100i, 2x240GB SanDisk Extreme RAID0, 2x WD VR 300GB RAID0, MSI GTX 680 PE @ 1110MHz, 12GB G.Skill Ripjaws DDR3 1600, Corair 850HX, Corsair 800D case. Win7 x64 Ultimate. Dell U2412M.
Heatware
Old 05-08-2012, 02:59 PM   #89
Smoblikat
Diamond Member
 
Join Date: Nov 2011
Posts: 3,794

http://www.overclock.net/t/1254426/v...vidia-reshaped

Ya, well to all of you who doubted/flamed me...guess what? I WAS RIGHT!
__________________
3770K|ASrock Z77 Extreme11|4x8gb DDR3 1600|4xHD6970|1440P 120hz - Buzzard
X6 1055T|ASUS M4A89GTD-EVO USB3|2x4gb Gskill 1600|HD4870X2 + HD48701gb - Virgo
2xXeon L5639|EVGA SR2|6x8gb DDR3|4x2TB
Old 05-08-2012, 03:20 PM   #90
Rvenger
VC&G Moderator
 
 
Join Date: Apr 2004
Location: Pittsburgh, PA
Posts: 5,251

Quote:
Originally Posted by Smoblikat View Post
http://www.overclock.net/t/1254426/v...vidia-reshaped

Ya, well to all of you who doubted/flamed me...guess what? I WAS RIGHT!

You know that's a good indication that Nvidia doesn't want to see AMD fall flat on its face, right?

Kind of like Ford supporting GM when the bankruptcy crisis occurred.
__________________
i7-5820k - ASRock X99 Extreme 3 - 16gb Crucial DDR4 2133 - Nvidia Reference GTX 970- 2 x 240gb PNY Optima SSDs in Raid 0 - 3tb Seagate 7200.14 HDD - Rosewill HIVE 850 Modular PSU - Phanteks Enthoo Pro
Old 05-08-2012, 03:27 PM   #91
Destiny
Platinum Member
 
Join Date: Jul 2010
Location: Los Angeles, CA
Posts: 2,268

Quote:
Originally Posted by Smoblikat View Post
http://www.overclock.net/t/1254426/v...vidia-reshaped

Ya, well to all of you who doubted/flamed me...guess what? I WAS RIGHT!
I never doubted you... I had a suspicion, but just needed evidence... When I saw the leaked test results for the GTX 670, it just didn't add up, because performance was too close to the GTX 680 - which is similar to the GTX 560 Ti and GTX 560... Now they are selling the GTX 660 Ti (now the GTX 680) and GTX 660 (now the GTX 670) as high-end cards, because the performance does designate them as high-performance... It also may explain the low supply, because they were caught off guard by the performance of the card...

Good Job btw!
__________________
I joined AnAndTech to become informed so I won't get suckered by Tech Companies...

Old 05-08-2012, 03:37 PM   #92
blastingcap
Diamond Member
 
 
Join Date: Sep 2010
Posts: 5,884

Quote:
Originally Posted by Smoblikat View Post
http://www.overclock.net/t/1254426/v...vidia-reshaped

Ya, well to all of you who doubted/flamed me...guess what? I WAS RIGHT!
Please credit the proper source, which is VR-Zone, not some forum.

http://vr-zone.com/articles/how-the-...ed-/15786.html

This reminds me of the Intel Core scenario where Intel messed up with Netburst after a while and had to dig deep into their Israeli "high efficiency/mobile/Pentium M" team which was working on innovative energy efficiency measures. They took the high-efficiency architecture lessons from Pentium M and turned those lessons into Core and Core 2 Duo.

First there were energy guzzlers, but the heat/power/noise got so bad that performance hit a wall.

Then they turned into high-efficiency architectures.

Then they supercharged the high-efficiency architectures to make something both fast and relatively efficient.

P.S. This is also somewhat old news. TPU has had this article up for ages: http://www.techpowerup.com/162901/Di...X-670-Ti-.html
__________________
Quote:
Originally Posted by BoFox View Post
We had to suffer polygonal boobs for a decade because of selfish corporate reasons.
Main: 3570K + R9 290 + 16GB 1866 + AsRock Extreme4 Z77 + Eyefinity 5760x1080 eIPS
Old 05-08-2012, 04:44 PM   #93
guskline
Diamond Member
 
Join Date: Apr 2006
Location: Lebanon, PA
Posts: 3,094

You never own the fastest video card made for very long, because a faster one is just around the corner! That being said, last week I bought what was the fastest Nvidia card until the GTX 690 debuted. Did I pay too much? It's all relative. I was in the market for a modern single GPU to power 3 monitors at a combined 5760x1080 resolution. My realistic choices were an AMD 7970 vs. an Nvidia GTX 680. Was I wrong? I don't think so, but that's only me. BTW, I have the card, and but for a failed SSD, the system runs much better and smoother in games than the 6970 did. For me it was worth it. I have no buyer's remorse, as I'm flying a WWI plane across 3 monitors with incredible fps at high resolution. My fellow posters who own the 7970 can make the same claims, and I'm glad for them also.
__________________
3930k @ 4.6 -Asus SbTh X79 - Custom WC - MO-RA3 Pro-420 + RX 360+XSPC-2 Sapphire Tri X R9-290s-CF with EK blocks/bridge -16G (4x4 quad)DDR3-2133 - Intel 530 SSD - Win 8.1 - PC P&C 1200W Silencer Mk III - BenQ BL3200PT

Last edited by guskline; 05-08-2012 at 04:47 PM.
Old 05-08-2012, 05:04 PM   #94
SolMiester
Diamond Member
 
 
Join Date: Dec 2004
Location: Napier, New Zealand
Posts: 4,821

Quote:
Originally Posted by Smoblikat View Post
So I heard several rumors about this and found several people posting about it. Is it true that nVidia released their GTX 660 Ti (or other lower-end card) as a 680 because they didn't consider ATI to offer enough competition?
Don't think it was that simple; GK110 would have been the 680, but it probably had yield issues. GK104 didn't have its big brother's compute, but clocked well enough to beat or equal the 7970, so: get it out the door to start the 28nm lineup... probably why they have the 690 out now rather than the smaller GPU lineup...
__________________
HOME-LianLi PC-9F,ASRock P67Pro3, i5 2500k @4Ghz, 8Gb HyperX, ASUS DC GTX660OC, Corsair Force 120 SSD, HP ML110G6 HOSTING-Plex/W8/MINT/NAS..
My Super 6 Calais
Old 05-08-2012, 05:17 PM   #95
SolMiester
Diamond Member
 
 
Join Date: Dec 2004
Location: Napier, New Zealand
Posts: 4,821

Quote:
Originally Posted by grkM3 View Post
We agree with everything you are saying... all we are trying to say is that this card was not supposed to be nV's top-end card.

What the heck are they going to call the real 680 when it comes out? If I was nV, I would have called it a 660 Ti just to bust AMD's balls and to let them know they have a monster waiting.

The OP asked if it's a relabeled GPU, and it is; it's 100% not a real 680, just called one because it beats AMD's top-end GPU.

Don't forget it's on a 256-bit bus also.
Interestingly enough, this is exactly what I would have done too... that would have killed any enthusiasm for the AMD lineup, with the thought of higher-end parts to come. I guess it's moot, as there is no further high end, but then they wouldn't have been able to charge $500 for a mid-range part...
__________________
HOME-LianLi PC-9F,ASRock P67Pro3, i5 2500k @4Ghz, 8Gb HyperX, ASUS DC GTX660OC, Corsair Force 120 SSD, HP ML110G6 HOSTING-Plex/W8/MINT/NAS..
My Super 6 Calais
Old 05-09-2012, 04:01 AM   #96
Magic Carpet
Golden Member
 
 
Join Date: Oct 2011
Posts: 1,837

Quote:
Originally Posted by blastingcap View Post
This reminds me of the Intel Core scenario where Intel messed up with Netburst after a while and had to dig deep into their Israeli "high efficiency/mobile/Pentium M" team which was working on innovative energy efficiency measures. They took the high-efficiency architecture lessons from Pentium M and turned those lessons into Core and Core 2 Duo.
If memory serves me right, Pentium D (Netburst) had also been developed by the same Israeli team.
Old 05-09-2012, 06:14 AM   #97
T_Yamamoto
Lifer
 
 
Join Date: Jul 2011
Location: North Carolina State University
Posts: 13,779

Quote:
Originally Posted by Tempered81 View Post
Still makes me laugh
__________________
Intel i5 4430, MSI B85M-P33, Gigabyte Windforce R9 270x, 6GB ram, Intel 530 128gb SSD, WD 160gb HDD, Toshiba Canvio 1tb External HDD, Windows 7, NZXT Vulcan, Dell E2414H, ROCCAT Kova+ Max, TG3 BL82
Mid 2014 13" Macbook Pro, Intel Core i5-4278U 2.6Ghz, Intel Iris 5100 GPU, 8GB ram, 128gb SSD
iPhone 6 16gb CDMA
Heatware | Steam