Rumor: Price Cuts on GTX660Ti series coming next week


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You are spinning the data.

AnandTech shows the system with the GTX670 drawing 36 fewer watts.
Hardware Canucks shows the GTX670 drawing 16 fewer watts.
TechPowerUp shows it drawing 18 watts MORE. If all else were equal, the gap between AnandTech and TechPowerUp would be 54 watts when comparing the same two video cards (34 watts against Hardware Canucks).

And I agree, it's not enough to fret over. I'm not trying to reach any conclusion about which card or vendor has the most efficient architecture. I'm pointing out the big difference between TechPowerUp's method of measuring power draw and other websites', and why I think the other sites' power draw numbers collectively show there is more going on than isolating a GPU's power draw will reveal.

You are missing the point entirely. They all use different games when benching. They all use different setups. You can't compare them unless they are doing everything the same.

As far as TPU goes, it depends on which figure you use: average (7950 18W less), peak (7950 8W less), or maximum (7950 17W more). They are all different. Read them all and decide for yourself what the difference means to you.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Are you suggesting that when these reviewers change only the GPU for power consumption testing, their PSU's efficiency is somehow affected, going up or down?

It can, depending on how good the PSU is! How much a system draws does affect your PSU's efficiency, because PSUs do not have a flat efficiency curve.

"When you open the Power Consumption section in a hardware review, you will usually find power draw data as measured from the wall outlet. However, we use it only when we need to quickly estimate the power consumption of a computer or some other device (a consumer wattmeter is most handy then because it needs no kind of preparation) but not for serious tests. The fact is, although this measurement method is simple, it provides an impractical result....." See full article for explanations.
PC Power Consumption: How Many Watts Do We Need?

How much your system draws at the wall is affected by the efficiency of the PSU. Therefore, what a system uses in the real world in your house depends on how efficient your PSU is at the specific load those components place on it (and on its particular efficiency curve). You also cannot isolate GPU power consumption by simply subtracting total system numbers from each other, without accounting for PSU inefficiency.

This is pure math:

Case 1
250W GPU1 + 150W system @ 80% efficient PSU = 500W at the wall
150W GPU2 + 150W system @ 80% efficient PSU = 375W at the wall

What's the difference in power consumption between 2 GPUs? 125W? Not quite! It's "125W, while using an 80% efficient PSU"

Case 2
250W GPU1 + 150W system @ 92% efficient PSU = 434W at the wall
150W GPU2 + 150W system @ 92% efficient PSU = 326W at the wall

What's the difference in power consumption between the 2 GPUs? 108W? Not quite! It's "108W, while using a 92% efficient PSU"

If you just looked at Case 1, you'd conclude that GPU1 draws 125W more power than GPU2. That's 100% wrong. GPU1 draws 125W more power only when the system uses a PSU with 80% efficiency. It tells us nothing about what the power difference would be for my system that uses a Silver, Gold or a 92% Platinum rated PSU.
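If anyone wants to plug in their own numbers, here is the same arithmetic as a quick Python sketch (it assumes a single flat efficiency figure for simplicity; a real PSU's efficiency varies with load):

```python
def wall_draw(component_watts, psu_efficiency):
    # Power at the wall = DC power the components draw / PSU efficiency.
    # Flat efficiency is a simplification; real PSUs have an efficiency curve.
    return component_watts / psu_efficiency

# Case 1: 80% efficient PSU
print(wall_draw(250 + 150, 0.80))  # 500.0W
print(wall_draw(150 + 150, 0.80))  # 375.0W -> 125W delta at the wall

# Case 2: same components, 92% efficient PSU
print(wall_draw(250 + 150, 0.92))  # ~434.8W
print(wall_draw(150 + 150, 0.92))  # ~326.1W -> ~108.7W delta at the wall
```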

Thus, adjusting for PSU efficiency is critical since we all have different PSUs. We need the real component-level power consumption (unless your PSU is exactly the same as the reviewer's, in which case this doesn't matter to you), or you have to do the math yourself on a napkin. If your country uses 240V input voltage rather than 120V, your PSU will have a different efficiency as well; you have to account for this when looking at at-the-wall numbers from a North American review.

***Side-note*** Look at the at-the-wall numbers in Case 2. By swapping the PSU from an 80 Plus to a Platinum, you just shaved 66W of power off at the wall. Welcome to the world of Platinum PSUs, and why they may be worth the cost for enthusiasts who care about power consumption on a heavily overclocked 500-700W gaming system!

The most conclusive and accurate way to compare the real power consumption is at the wall. This is the true power consumption and it is absolute.

"True" power consumption is what the components actually use at the PSU level. Real world power consumption is what those components use at the wall using your particular PSU. Since we have different PSUs and the PSU's power rating also impacts its efficiency curve, if you just measure the power consumption at the wall, it does not apply to another PSU, unless you adjust for the for the efficiency of your PSU. Without this, the review tells me nothing about how much those same components would draw in my own system. If I have a Platinum PSU and you have an 80Plus, you cannot draw any results from the total power consumption numbers of a review that uses a Silver PSU. There is 0 accuracy for your personal needs (unless you make the mathematical adjustments) since your PSU's efficiency is different than that of the PSU used in the review unless you have the same power supply and the same electricity voltage in the country in which you live.

Therefore, you either have to back out the real-world power consumption by applying the reviewer's PSU efficiency and then re-applying your own PSU's efficiency rating, or the reviewer has to do this for you from the start: take the at-the-wall numbers, apply the PSU's efficiency, and present the actual power consumption of the components, say the videocard, the GPU, or the total system.
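That two-step adjustment in sketch form (the function name and the example numbers are just illustrative):

```python
def rescale_wall_draw(review_wall_watts, review_psu_eff, your_psu_eff):
    # Back out the reviewer's PSU to get component (DC) power,
    # then re-apply your own PSU's efficiency.
    component_watts = review_wall_watts * review_psu_eff
    return component_watts / your_psu_eff

# A review system pulling 400W at the wall on an 80% PSU would pull
# roughly 348W at the wall on your 92% Platinum unit:
print(round(rescale_wall_draw(400, 0.80, 0.92)))  # 348
```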

The total power consumption at the wall, without any adjustment for PSU inefficiency, only measures what that particular system uses at load with that particular power supply. It says nothing about the accuracy or real-world use of the same components in another system with a different PSU and a different efficiency rating. If you understand this when looking at total power consumption numbers, then there is no problem with showing power at the wall: you can make the adjustments for your own PSU.

If you use a straight-up subtraction like you and tviceman are doing, that's 100% wrong, because a person using a less efficient PSU at home than the reviewer's would use more power, while a person using a more efficient PSU would use less. In other words, the more efficient your PSU is versus the one used in the review, the SMALLER the delta between the 2 systems would be in your own home, while the less efficient your PSU is, the LARGER the delta would be for you.

I am with tviceman on this, and it appears most reviewers would also agree. It seems like only a tiny few do not use the total-system-draw-at-the-wall method.

No, they would not agree. The main reason they measure at the wall and stop there is that they are too lazy (or don't have sufficient review time) to make the proper adjustments. The ones who do have actually done a better job.

This explains why power consumption at the wall is impractical (see the link above) and pretty much useless unless you make your own adjustments.

You need to understand the context of what "at-the-wall power consumption" means if you are going to present it to everyone on this forum as the universal truth (as you claim, the most accurate real power consumption).

No one disagrees that you can use the total power consumption of a system at the wall (even though it's not preferable, since it means we users have to go through extra steps), as long as you adjust those #s using the efficiency of the PSU used in the review to arrive at the actual real-world power consumption of the components. This is because the total system power consumption at the wall changes depending on your PSU's efficiency (i.e., whether you have an 80% PSU or a 92% one, etc.)

Now I am going to show you why the 18-36W of power you guys keep talking about is hot air, unless you already have a Platinum PSU! :)

80% PSU Scenario

System 1 draws 350W of power at the wall (means 280W actual @ PSU level)
System 2 draws 400W of power at the wall (means 320W actual @ PSU level)

Looks like a gamer here has to put up with 50W of extra power consumption at the wall. System 1 looks great!

Now replace that 80% PSU with a 92% PSU:

92% PSU Scenario

Since System 1 only used 280W at the PSU level, its at-the-wall power consumption now becomes just 304W (or 46W less than System 1 used at the wall with an 80% PSU).
Since System 2 only used 320W at the PSU level, its at-the-wall power consumption is now only 347W (or 53W less than it used at the wall with an 80% PSU).

And now look at those at-the-wall numbers and the punchline:

A PC gaming system which in aggregate draws 280W of actual power at the power supply level, equipped with an 80% efficient PSU, uses more power in the real world at the wall than another system that draws 320W at the power supply level but uses a 92% efficient PSU.
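You can check that punchline in two lines (same simplified flat-efficiency math as above):

```python
print(280 / 0.80)  # 350.0W at the wall on the 80% PSU
print(320 / 0.92)  # ~347.8W at the wall on the 92% PSU: the "hungrier" system wins
```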

I just eliminated the entire 50W real-world at-the-wall power penalty of the power-hungry 400W System #2 against what at first appears to be a more efficient 350W System #1, simply by swapping an average 80 Plus PSU for a more efficient modern Platinum PSU. This means that if people are going to throw their arms in the air over differences as minute as 18-36W of total system power consumption on an enthusiast gaming rig that pulls 340-350W at the wall (or more), every single person claiming that 18-36W is a material difference needs to immediately go out and buy a Platinum-rated PSU. Is everyone here running Platinum PSUs? Didn't think so. So why are we blowing hot air over 18-36W?!

This proves that if you compare total power consumption in a review without accounting for PSU inefficiency, those #s are meaningless to anyone who uses a PSU with a different efficiency at home.

So really, unless everyone here has a Platinum PSU, if you are complaining about an 18-36W power delta on a system that draws around 280-300W at the PSU level, you are just blowing hot air.

If your system has an overclocked Core i5/i7/Phenom II X6, etc. plus an overclocked GPU that uses even more power, and you don't have a Platinum PSU but are still discussing 50-60W power consumption differences as relevant, then please upgrade your PSU to a Platinum one; then I'll take seriously the claim that you really care about power consumption!
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
And I agree, it's not enough to fret over. I'm not trying to reach any conclusion about which card or vendor has the most efficient architecture. I'm pointing out the big difference between TechPowerUp's method of measuring power draw and other websites', and why I think the other sites' power draw numbers collectively show there is more going on than isolating a GPU's power draw will reveal.
Argh. I'm not arguing about which card is more efficient. I don't really care. I am arguing about the best way to show how efficient something is: by isolating it and ignoring other factors that may skew results when looking at the entire system, or by NOT isolating it and just looking at the entire system. My argument is from an end-user point of view: comparing identical systems with different graphics cards is better than isolating the graphics cards, because different GPUs place different loads on the CPU and/or other system components.

None of the sites that test power consumption at the wall are using the best method for determining efficiency. The best method is to take an average reading; it's quite silly to take the average framerate and put it in a ratio with an instantaneous peak power reading. That doesn't tell you anything, because video cards are variable and will draw different amounts of power and produce different levels of performance depending on what they are trying to render. And all of these sites that test power at the wall take only instantaneous readings.

In a perfect world, reviewers would log power consumption during the entire benchmark run and give us data per game. We would get to see the average performance and the average power consumption per game, and that would tell us the efficiency of the video card. That's the best way to show it.
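Something like this sketch, where the per-second logs and all the sample numbers are entirely made up, just to show the idea:

```python
def perf_per_watt(fps_log, watt_log):
    # Efficiency over a whole run: average framerate / average wall power.
    avg_fps = sum(fps_log) / len(fps_log)
    avg_watts = sum(watt_log) / len(watt_log)
    return avg_fps / avg_watts

# One sample per second over a (very short) hypothetical run:
fps_log = [58, 62, 71, 45, 66]
watt_log = [310, 335, 360, 280, 340]
print(perf_per_watt(fps_log, watt_log))  # ~0.186 fps per watt
```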

If sites were to do this, I think the results would be quite interesting. Nvidia and AMD would both have their wins and losses in per-game power consumption, just as they do in performance.

Now, isolating the video card is valid for a few reasons. One, it's simply interesting to compare the tech of one card against the tech of another with no other variables. Beyond being technically interesting, reducing the variables is good, since different boards of the same model already introduce so many variables, especially with turbo boost. Two, for people pairing these cards with weaker CPUs that are going to "hit a wall", so to speak, the power consumption difference between systems is going to be smaller or nonexistent, because the CPU and the rest of the system are already drawing most of the power. All of these reviewers use the best of the best processors, and most even overclock them.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Running benchmarks should always be like running a controlled experiment. When I ran my benches for the 8800GTS 640 and the 2900XT, I used one computer, but with two identical cloned hard drives that differed only in the graphics driver, eliminating any and all arguable discrepancies.
With the first hard drive connected, I had the 2900XT installed and ran the benches. Then I shut down, disconnected that hard drive and connected the other, removed the 2900XT, installed the 8800GTS 640, and ran the benches again.
This is the way all reviewers should run their benchmarks. And in this fashion, you CAN use tviceman's method of a Kill-A-Watt meter at the wall. The efficiency of the PSU would not matter, because you are using the same PSU for both GPUs and the whole system.
A controlled experiment: only two things differ between the test systems, the graphics card and its driver.
And yes, I'm fully aware that my power draw results may differ from somebody else's who uses this same exact method but has a power supply with a different efficiency. But then you wouldn't compare the actual numbers, only the differences.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The Sapphire 7950 Vapor-X (which is a 7950 Boost model) runs at 0.97V @ 950MHz and overclocks to 1135/1635 with no voltage modification or changes to the stock fan profile.

Nobody's disagreeing that the modified BIOS AMD sent to reviewers set the voltage at 1.25V. It was silly of them, because Nvidia fanboys will jump all over anything they can and run with it. In actuality, it's very hard to buy a card with that voltage specification. There's only one reference 7950B listed on Newegg, and it's out of stock.

Again, nobody's disagreeing with what you are saying. Just that it's irrelevant.

It was silly? Or maybe AMD knows what they're doing.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Can you please not quote such large posts?

AMD raised default clocks and voltage on 7950B cards. How are you ignoring that AIBs sell 10+ SKUs of non-7950B cards that come undervolted with 880-950MHz GPU clocks? Are you that ignorant? Among those, at least 4-5 versions overclock to 1050-1125MHz at 1.175V or less. You tend to ignore the cards that enthusiasts actually buy and focus on a reference card, because taking those other 7950s into consideration undermines your entire argument.

This is the most pointless argument for enthusiasts, since not a single overclocker would buy a reference 7950B card in hopes of getting 1100MHz out of it. I just told you that people looking to buy an overclocked 7950 will buy other models. Then I linked you performance data where an MSI TF3 @ 1025MHz > GTX670, and that card used 167W. You still ignored that. Glorious. OK then, I guess after-market videocards do not count. Next stop: the reference HD7970 GE card is hot and loud, therefore all HD7970 GE cards are hot and loud and run at 1.25V. You and notty22 have been repeating non-stop that we should only look at reference cards. I am sorry, I didn't know I was forbidden to buy non-reference GPUs. I am sorry that you cannot buy a reference 7970 GE card on the market... but it keeps being used over and over as evidence of how hot and loud the series is.

I find it amusing you won't admit that HD7950 OC = GTX670 OC > GTX660Ti, and that there are at least 5 after-market 7950s which do not use 1.25V unless they have the GPU Boost BIOS.

Curious, what videocards did you buy from Fall 2009 to March 2012? Were any of them by NV? The reason I ask is that NV hasn't made a single card more power-efficient than the competition's in the last 3 years, so you must have gone with the AMD solution then, right? Or was Fermi's power consumption worth putting up with because of PhysX? I love it: power consumption didn't matter for 3 years, and now 30-40W extra is a major deal-breaker on a 350-400W enthusiast rig with an overclocked i5/i7 and overclocked GPUs. Keep this up. I need a laugh on a Sunday night. :thumbsup:

It's pretty funny how you and notty22 keep using one particular reference design of the 7950/7970/7970 GE to represent the entire HD7900 series. I was not aware that after-market 7900 series cards do not exist while after-market 660Ti cards do, especially when the latter are GPU-boosting to 1200MHz in launch reviews against the worst reference 7950 on the market.

For example, this VisionTek 7970 GE card that gets lower power consumption than a reference 680 apparently does NOT exist! LMAO!

How about this: you go out and buy an MSI TF3 7950 6+8-pin card from Newegg, since you are so confident 7950s have 1.25V BIOSes. If GPU-Z / HWiNFO64 shows 1.25V at 99% GPU load, I'll send you the money for the card. If you lose, you donate the card to someone from this thread who needs a GPU upgrade.

For the last 2-3 NV generations, GPGPU compute mattered, overclocking mattered (GTX460/470), performance/$ mattered (8800GT, GTX460), and power consumption didn't matter (all GTX400/500 series). This round, compute is worthless, performance/watt is the most important metric, and overclocking is blasphemy that enthusiasts should forget about lest they raise power consumption to nuclear reactor levels.

At least if you are going to pretend to be objective, be consistent in your message.

Not ignoring anything -- just not downplaying that AMD, when raising default clocks, had to raise voltages on the HD 7970 and HD 7950.
 
May 13, 2009
12,333
612
126
I'm liking my new skip-a-generation strategy. The new cards just don't add much to the $/performance equation, so I'm skipping a gen. I've been busy anyway. I'll just come back to BF3 and Crysis sometime next year and hopefully run them at max settings on a $250 card. Or maybe I'll be sporting a PS4, since Nvidia and AMD have lost their minds.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I'm liking my new skip-a-generation strategy. The new cards just don't add much to the $/performance equation, so I'm skipping a gen. I've been busy anyway. I'll just come back to BF3 and Crysis sometime next year and hopefully run them at max settings on a $250 card. Or maybe I'll be sporting a PS4, since Nvidia and AMD have lost their minds.

By skipping a generation, do you mean skipping a node or something? Or is GTX 5xx considered a refresh to you instead of a full generation?
 
May 13, 2009
12,333
612
126
By skipping a generation, do you mean skipping a node or something? Or is GTX 5xx considered a refresh to you instead of a full generation?

I just kinda lump the 4xx and 5xx series together. I also owned a 480 and a 580 for a good amount of time before I sold them, so I feel I got a good feel for that gen of cards. I briefly owned a GTX 670, then returned it. The 670 is a nice card, but I just couldn't swallow the $400 price tag. My $400-$500 card days are over, and what's available for $200 (what I'm willing to spend) isn't much of an upgrade over my 460, which I bought for $80.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Oh dang. Hey man, don't be so mad. How many watts does everyone's overclocked CPU suck up? Why aren't you crying about that? You obviously biased people are sickening. Arguing over power draw less than a freakin light bulb gets you going? :rolleyes:

Are you seriously this daft? Did you not read ANY of the last several posts I have made? For probably the third time, I don't give a sh!t which GPU is the most efficient. I don't give a sh!t which vendor has the most efficient architecture. I don't give a sh!t about overclocking and how much more power is consumed. YOU ARE BRINGING UP NOTHING RELEVANT and are instead trying to read between lines that don't exist to stir the pot.

What I am discussing and trying to argue is that isolating the GPU is not the best way to measure overall efficiency, since other components are involved and are apparently relied upon to deliver different kinds of performance.

Now what exactly am I being biased about, troll?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com

Russian, you're a good guy without an agenda, and I appreciate that. But your constant massive walls of text make it hard to continue discussions. You're bringing up all kinds of stuff that has nothing to do with the point I'm making. I never once brought up the HD7970 GE or the HD7970. In fact, I'm not talking about any particular video card. Most of your post had nothing to do with what I am specifically talking about: measuring power consumption. I was merely using examples: looking at day-1 reviews across several websites, TechPowerUp's power draw conclusions are slanted significantly differently from most other websites'. Most websites find that systems with a GTX670 pull about the same amount of power as, or noticeably (>20 watts) less than, systems with the original HD7950. TechPowerUp, however, finds that both peak and average power consumption are lower with the HD7950 (with average power draw showing the larger gap, which makes it even more unusual).

It doesn't add up, is what I am saying. Even when taking PSU inefficiency into account and factoring 10-20% off the watts drawn at the outlet, most review sites still show a ~30 watt discrepancy between their results and TechPowerUp's. 30 watts isn't really anything to cry about, no, but what I am saying is that if all the results we are seeing are accurate, including TechPowerUp's, then either Crysis 2 (which TechPowerUp uses for power draw measurements) leans absurdly harder on Kepler than on GCN (very unlikely), OR there are certain GPUs that rely more on the CPU and other system components in some way that forces them to draw more power.

If you have card X and you're drawing 250 watts from the wall when playing Rainbow Land, and you then replace it with card Y, which TechPowerUp shows is 20 watts more efficient than card X, but you're now drawing 255 watts at the wall in the same game, then despite the card being more efficient, the way it ventilates heat or draws on other system components completely negates its efficiency. Hence my point: it's best to measure at the outlet, because that is what the entire system draws and needs to keep going.

Again, I'm not trying to argue about a specific architecture, or which one is more efficient, or red or green or yellow. It's neat and all for Nvidia or AMD to claim they have the most efficient GPU, but we can't just look at the GPU alone. Every other component in the system has to be working, or at least be powered up. And apparently it makes a difference.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Russian, you're a good guy without an agenda, and I appreciate that. But your constant massive walls of text make it hard to continue discussions. You're bringing up all kinds of stuff that has nothing to do with the point I'm making. I never once brought up the HD7970 GE or the HD7970. In fact, I'm not talking about any particular video card. I was merely using examples: looking at day-1 reviews across several websites, TechPowerUp's power draw conclusions are slanted significantly differently from most other websites'. Most websites find that systems with a GTX670 pull about the same amount of power as, or noticeably (>20 watts) less than, systems with the original HD7950. TechPowerUp, however, finds that both peak and average power consumption are lower with the HD7950 (with average power draw showing the larger gap, which makes it even more unusual).

It doesn't add up, is what I am saying. Even when taking PSU inefficiency into account and factoring 10-20% off the watts drawn at the outlet, most review sites still show a ~30 watt discrepancy between their results and TechPowerUp's. 30 watts isn't really anything to cry about, no, but what I am saying is that if all the results we are seeing are accurate, including TechPowerUp's, then either Crysis 2 (which TechPowerUp uses for power draw measurements) leans absurdly harder on Kepler than on GCN (very unlikely), OR there are certain GPUs that rely more on the CPU and other system components in some way that forces them to draw more power.

If you have card X and you're drawing 250 watts from the wall when playing Rainbow Land, and you then replace it with card Y, which TechPowerUp shows is 20 watts more efficient than card X, but you're now drawing 255 watts at the wall in the same game, then despite the card being more efficient, the way it ventilates heat or draws on other system components completely negates its efficiency. Hence my point: it's best to measure at the outlet, because that is what the entire system draws and needs to keep going.

Again, I'm not trying to argue about a specific architecture, or which one is more efficient, or red or green or yellow. It's neat and all for Nvidia or AMD to claim they have the most efficient GPU, but we can't just look at the GPU alone. Every other component in the system has to be working, or at least be powered up. And apparently it makes a difference.


I was wondering when someone else was going to notice that. Talk about trolling in a thread: you get back a wall of text about AIBs with 800 pictures and 300 links, none of which have anything to do with what is being replied to, etc. ;)
 

Crap Daddy

Senior member
May 6, 2011
610
0
0
I was wondering when someone else was going to notice that. Talk about trolling in a thread: you get back a wall of text about AIBs with 800 pictures and 300 links, none of which have anything to do with what is being replied to, etc.

If anybody has the patience to read the aforementioned walls and walls of text, one red line crosses them all: Nvidia has no card worth buying this time around (maybe the 670, but hmm, no). Wait, they've had no card worth buying over AMD since 2009.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I just kinda lump the 4xx and 5xx series together. I also owned a 480 and a 580 for a good amount of time before I sold them, so I feel I got a good feel for that gen of cards. I briefly owned a GTX 670, then returned it. The 670 is a nice card, but I just couldn't swallow the $400 price tag. My $400-$500 card days are over, and what's available for $200 (what I'm willing to spend) isn't much of an upgrade over my 460, which I bought for $80.

Wtf, $80 for a GTX 460 is a good deal even by today's standards. I can see why you're happy staying on that for a while longer.

Btw, speaking of the 460, I remember tussling with at least one forumer here about that card vs. a 6850, since they were almost tied in performance and price, leaving operating costs (mostly power bills) and NV-specific stuff (mostly CUDA, PhysX) as the distinguishers. People crucified me for using power draw as a tie-breaker when the cards were within 10% of each other even at load, but think about it: people make a lot out of a 10% performance differential, or a 10% difference in initial price. So why not power draw? In particular IDLE power draw, not LOAD, because most people idle far more than they run their systems at load.

IIRC, there was up to a $50 difference in power costs between the 6850 and the GTX 460, assuming 3 years of 24/7 operation with light-to-modest gaming loads for a portion of that time.

If you do a similar analysis (assume 21 hours idle, 2 hours gaming, and 1 hour of flash/youtube/movies every day on average), multiply over 3 years for your card of choice at common voltages (remember that some cards no longer have a single stock voltage, since they let it fluctuate with binning), and the answer is, say, $50, would that change anything? Perhaps yes. Even for high-end cards, $50 is a substantial share of the total 3-year cost. So folks arguing power draw are on solid footing, though the focus should be on idle power draw, not load, or at least include idle in the discussion even if you don't leave your PC on 24/7. Even when my PC is on, I browse the web or whatever for more hours than I game.
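A rough sketch of that math; every wattage and the $/kWh rate here are placeholders, not measurements:

```python
def three_year_cost(idle_w, video_w, gaming_w, dollars_per_kwh=0.12):
    # 21h idle + 1h video + 2h gaming per day, summed over 3 years.
    daily_kwh = (idle_w * 21 + video_w * 1 + gaming_w * 2) / 1000
    return daily_kwh * 365 * 3 * dollars_per_kwh

# Two hypothetical cards differing ~10W at idle and ~25W under load:
card_a = three_year_cost(idle_w=15, video_w=25, gaming_w=150)
card_b = three_year_cost(idle_w=25, video_w=40, gaming_w=175)
print(round(card_b - card_a, 2))  # ~$36 over 3 years
```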

Also, it's kind of dubious to say "hey, just get a (more expensive) Platinum PSU and resell your old PSU or whatever, and you can cover the extra wattage of an HD7xxx versus a GTX 6xx." That's an additional cost over and above that of the HD7xxx. And besides, you can pair a Platinum PSU with a GTX 6xx as well; it's not like you can only pair Platinum PSUs with AMD cards.
 

AgentUnknown

Golden Member
Apr 10, 2003
1,527
5
81
Anyone upgrading from a 560 Ti to a 660 Ti? Is it worth it? I play BF3 and old Steam games mostly (TF2, CS:GO, SF4).
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Anyone upgrading from a 560 Ti to a 660 Ti? Is it worth it? I play BF3 and old Steam games mostly (TF2, CS:GO, SF4).

For the Valve Source games, no, it's not worth it. For BF3, yes, you will get a substantial performance boost. However, I'd wait a few days (until Sept. 13th) to see if and by how much Nvidia drops the price on the GTX660Ti, and whether it's the best deal for what you're looking to spend (assuming you don't care who designed your card's GPU).
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
It doesn't add up, is what I am saying. Even when taking PSU inefficiency into account and factoring 10-20% off the watts drawn at the outlet, most review sites still show a ~30 watt discrepancy between their results and TechPowerUp's. 30 watts isn't really anything to cry about, no, but what I am saying is that if all the results we are seeing are accurate, including TechPowerUp's, then either Crysis 2 (which TechPowerUp uses for power draw measurements) leans absurdly harder on Kepler than on GCN (very unlikely), OR there are certain GPUs that rely more on the CPU and other system components in some way that forces them to draw more power.

http://techreport.com/review/23150/amd-radeon-hd-7970-ghz-edition/10

Interesting. These results are pretty different from what we saw when we used Skyrim to generate the load. Really didn't expect to see the stock 7970 drawing less power than the GeForce GTX 680. We may have to use multiple games next time around, if time permits.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com

Thank you for that. So it appears that different games can cause vastly different power draws depending on the architecture (I say vastly because a 10-15% swing in power draw is, I think, a vast difference when we're talking about the EXACT SAME CHIP). And this brings up another point, expanding on my argument: HardOCP uses several games to measure power draw, and it measures the entire system. I think THIS is the best way to go about measuring power draw. Just as it would be complete crap to review a GPU with one benchmark, the same can be said for measuring power draw.

I know it's probably a PITA for reviewers, but for those who want to argue perf/watt until they're blue in the face, I think having a diverse set of results is the best and most accurate scenario. And I still think measuring from the wall is what matters most, in the end.
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
I know it's probably a PITA for reviewers, but for those who want to argue perf/watt until they're blue in the face, I think having a diverse set of results is the best and most accurate scenario. And I still think measuring from the wall is what matters most, in the end.

It shouldn't be much more effort, assuming they have the system plugged into the power meter before starting any benchmark runs: just record power use during the runs, which they have to do anyway for the rest of the review.