There are a couple of things wrong with that.
1. You're comparing 560 Ti's Furmark power consumption to GTX 960's Metro: LL power consumption:
NVIDIA 560 Ti (TechPowerUp)
Asus 960 Strix (TechPowerUp)
2. You're comparing a stock 560 Ti vs a factory-overclocked GTX 960. Always compare stock vs stock or OC vs OC if you want to shed light on generational differences. Here's the Asus 560 Ti DC II review, which shows slightly higher readings than the stock NVIDIA card and is therefore a better point of comparison to the Asus 960 Strix.
3. Furmark creates an artificially high load that doesn't represent reality in any way. Newer generations of graphics cards also have stricter and smarter TDP limits in place, which restrict their power use in Furmark-type tests. This makes Furmark useless for comparing actual power consumption, as it makes older cards look much worse than they actually are.
TechPowerUp's Maximum number is the single highest reading during Furmark, while the Peak number is the single highest reading during a game or gaming benchmark.
Peak vs peak:
Asus 560 Ti (3Dmark03) - 157W
Asus 960 (Metro:LL) - 129W
->
28W difference
Maximum vs maximum (Furmark):
Asus 560 Ti - 228W
Asus 960 - 147W
->
81W difference, a +189% error relative to the real-world 28W gap ((81 - 28) / 28 ≈ 1.89)
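Spelled out, the error figure comes from comparing the two deltas. A quick sketch using the TechPowerUp readings quoted above:

```python
# Peak readings during a real game/benchmark (TechPowerUp)
peak_560ti = 157  # Asus 560 Ti, 3DMark03
peak_960 = 129    # Asus 960, Metro: LL
real_diff = peak_560ti - peak_960  # 28 W

# Maximum readings during Furmark
max_560ti = 228
max_960 = 147
furmark_diff = max_560ti - max_960  # 81 W

# How much the Furmark gap overstates the real-world gap
error_pct = (furmark_diff - real_diff) / real_diff * 100
print(f"Real-world difference: {real_diff} W")
print(f"Furmark difference: {furmark_diff} W (+{error_pct:.0f}% error)")
```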
4. Usually we're not interested in peak wattage but in average load wattage when talking about from-the-wall power consumption. AC watts matter for running costs, not for reliability. So, to find out how much more power a 560 Ti draws from the wall compared to a GTX 960 with a CX430 V2 as the power supply, let's use the Average chart in TechPowerUp's reviews:
Asus 560 Ti (3Dmark03) - 137W
Asus 960 (Metro:LL) - 114W
The CX430 V2's efficiency at these loads is about 83%.
137W / 0.83 = 165W
114W / 0.83 = 137W
-> 28W difference from the wall.
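The wall-draw arithmetic above can be sketched like this (the 83% figure is the CX430 V2 efficiency estimate used in the text, not a measured constant):

```python
def wall_draw(dc_watts, efficiency=0.83):
    """Convert DC load on the PSU to AC draw at the wall."""
    return dc_watts / efficiency

avg_560ti = 137  # Asus 560 Ti average, 3DMark03 (W)
avg_960 = 114    # Asus 960 average, Metro: LL (W)

ac_560ti = wall_draw(avg_560ti)  # ~165 W
ac_960 = wall_draw(avg_960)      # ~137 W
print(f"Difference at the wall: {ac_560ti - ac_960:.0f} W")
```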
Of course, none of this translates directly into advice for Charlie98, since he's using a 560 Ti 448 Cores, which is closer to GTX 570-level power consumption.