GTX 960 is expected to launch next month.


exar333

Diamond Member
Feb 7, 2004
8,518
8
91

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
Could our beloved mods rename this thread to 750 Ti vs R9 270?

thanks
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Could our beloved mods rename this thread to 750 Ti vs R9 270?

thanks

LOL, right?

Anyways, the 960/960Ti will definitely be a popular SKU for dropping into a lot of OEM builds. While it probably won't sip power quite like the 750, it should work great with pretty much any 400W-or-greater PSU. The 950 will probably follow soon after as a refresh of the 750...

Lots of people forget that 'upgrading' a PSU can often be difficult or near impossible on an OEM build. They don't always use standard PSU pin connectors... For the youngster with limited funds, and a computer they cannot just go out and rebuild themselves, the best GPU for their build really matters. For us in the enthusiast crowd, there's a lot more latitude. :)
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
Absolutely.

An i3 or lower-end i5 will only pull about 100W total system power. Add 60W for the 750 and you are still absurdly low...

Look at this review that even got the 750 Ti working fine on a 250W PSU from a low-end OEM machine...

http://www.pcper.com/reviews/Graphics-Cards/Upgrade-Story-Can-GTX-750-Ti-Convert-OEMs-PCs-Gaming-PCs

Ugh, the low-end machines most people buy in the store are typically AMD machines when they're on a bare-bones budget, which use more power and still have the same crappy power supplies in them. They can't handle a 750 Ti. I mean, it will boot up fine and play a non-strenuous game for a while, but the computer will crash after a bit of use. The budget Celeron systems people buy are just as bad, because instead of a 300-watt power supply they usually get a 200-watt unit that is built just as badly most of the time.

You don't realize how much these pre-built system makers skimp on power supplies on purpose. They skimp on them, as well as on the motherboard, to push more costs onto the consumer down the road, either in upgrades or in service costs for replacements. They are designed to fail. They are designed to go belly up. It keeps people coming back to buy more, because they aren't smart enough to realize they are being screwed. And since ALL brands do it, there isn't one brand of budget machine 'better' than the others at all. This isn't a new concept; it has been going on for decades.

So an average budget Celeron system is going to draw around 55 watts for the CPU plus another 50-70 watts for the rest of the system while running near max usage, and playing a decent game will more than likely hit that max. So 110-130 watts will be drawn by the average budget Intel system and about 150-160 watts by the average budget AMD system. With those budget PSUs barely able to handle 180 watts if you're lucky, both systems can squeeze by. Add a 750 Ti and the Intel system 'might' be alright for a while, but it won't stay stable for long. The caps in those PSUs are going to stress, heat up, and start popping. The ripple coming out of them is going to be bad and will cause system crashes with the 750 Ti in either budget system.
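
Rough sketch of that arithmetic in Python, using the ballpark wattages above (my estimates, not measurements):

# Ballpark power tally for a budget box, with and without a GTX 750 Ti,
# against what a cheap "300W" PSU can realistically sustain (~180W).
PSU_REAL_LIMIT_W = 180      # optimistic ceiling for a bargain-bin unit
GTX_750_TI_W = 60           # rough 750 Ti draw under load

systems_w = {
    "budget Intel (Celeron)": 55 + 65,  # ~55W CPU + ~50-70W for the rest (midpoint)
    "budget AMD": 155,                  # ~150-160W estimate for the whole box
}

for name, base_w in systems_w.items():
    total_w = base_w + GTX_750_TI_W
    margin_w = PSU_REAL_LIMIT_W - total_w
    print(f"{name}: {base_w}W stock, {total_w}W with a 750 Ti "
          f"({margin_w:+d}W against the {PSU_REAL_LIMIT_W}W ceiling)")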

Yes, the average Joe with a budget Intel system and one of those craptastic PSUs might get by on a 750 Ti without realizing the harm it is causing, but the system will fail harder and faster. It could also fail in glorious fashion when that PSU goes catastrophic. A friend set his couch on fire once when his PSU caught fire, fan going 100%, shooting flames about six feet out the back. That was a sight to see.

A halfway decent 300-watt power supply will be able to do much more than 300 watts. Hell, the old magic Fortron/Sparkle 300W units would easily do 450 watts and were cheap buggers back in the day to boot. That was my most recommended and most-used power supply in the late '90s and early '00s.


And no, an R9 270 doesn't require a 500-watt power supply at all. I can run one easily enough on a decent 300W power supply. The reason for the 500-watt recommendation is to account for the shoddily rated power supplies out there that can't even handle 500 watts without blowing up, because they really are 300-watt supplies with a higher number slapped on the label so the maker can charge more to the average uninformed consumer.
 

B-Riz

Golden Member
Feb 15, 2011
1,595
762
136
Ugh, the low-end machines most people buy in the store are typically AMD machines when they're on a bare-bones budget, which use more power and still have the same crappy power supplies in them. They can't handle a 750 Ti. [...]

Also, quite a few new Intel boxes use an external AC/DC brick now, like a laptop, with no PCIe slot anyway. Sad days.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Ugh, the low-end machines most people buy in the store are typically AMD machines when they're on a bare-bones budget, which use more power and still have the same crappy power supplies in them. They can't handle a 750 Ti. [...]

Give me a break. If this were even close to true, you would see computers failing everywhere for this very reason.

I am NOT saying a cheap 300-350W PSU is an ideal option, but it will serve just fine for a cheap GPU like the 750. Heck, many of those OEM systems you claim will 'catch fire' can probably be configured out of the box to a similar wattage with a higher-spec CPU, more RAM, and extra HDDs.

Exaggerate much?
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
Give me a break. If this were even close to true, you would see computers failing everywhere for this very reason. [...] Exaggerate much?

Yes, you do. Again, I worked at these places. Without these computers failing all the time, there would be no need for the 'Geek Squad' at Best Buy in the first place. Those budget PCs are DESIGNED to fail after about six months to a year's use, on purpose, without even adding something like a video card. That is the average lifespan for those budget PCs, by design.

I'm not exaggerating, because this is what I did for years at Best Buy, CompUSA, and Circuit City, and I worked in PC repair shops as well in my time. Many friends worked building computers for HP and Dell, and I know all the reasons for all the design decisions made for various PCs and price points.

Let me put it to you a different way.

The old Fortron/Sparkle miracle 300-watt power supply of the '90s and early '00s was about $1.50 more expensive, from an OEM ordering perspective, than the equivalent Allied/Deer 300-watter that went into 99% of the PCs of that era that used a 300-watter. I don't think the average consumer would have minded paying an additional $1.50 for a very reliable PSU that would not cause any system instability and would pose no problems for future upgrades. Can you guess why Dell, HP, Gateway, and the other brands decided not to use the better PSU? On top of that, Allied/Deer and the other cheap brands would make 'proprietary' versions of the PSU for each computer brand, tied to a specific motherboard. This meant that Dell PSUs at one point would only work with Dell motherboards, and so on. Such a setup was actually more costly to Dell, by a few bucks per machine, than the 'standard' form factor sold on shelves. Care to guess why it was done this way for YEARS?

Money on the back end. PSUs that were designed to blow out and could only be replaced by Dell. Such a PSU would normally cost Dell something like $10 a unit, and they would resell the replacement to the consumer, along with a 'free install,' for $150 a pop. I saw this EVERY DAY at computer repair shops for years. All the major pre-built computer brands did it this way for a long time, and they still do it for budget systems today. Things have not changed, because 99% of consumers are still too stupid to realize they are being fleeced and don't care that they are.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Anyways, the 960/960Ti will definitely be a popular SKU for dropping into a lot of OEM builds. [...] Lots of people forget that 'upgrading' a PSU can often be difficult or near impossible on an OEM build. [...]

That's actually why I was reading this thread in the 1st place.

The 960 reportedly has ~100W dissipation, which means its rec PSU will likely be 400-450W. That means it will work in a lot of the higher end off the shelf boxes (like the XPS 8700 in my sig w/460W PSU).

I can't help but think that is part of Nvidia's strategy. They're targeting the 95% of folks who get off the shelf systems, which is smart IMO. People buy these high-power cards and they don't work in their machines, then they return them, and that's a problem for Nvidia and AMD. Nvidia seems to have a solution - at least in the mid range, keep the power req down. I bet they have far fewer returns than AMD.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
[...] The 960 reportedly has ~100W dissipation, which means its rec PSU will likely be 400-450W. That means it will work in a lot of the higher end off the shelf boxes (like the XPS 8700 in my sig w/460W PSU). [...]

Exactly.

Obviously the better solution is to throw that XPS in the trash and build a new PC with an R9 270 (according to this thread), but hey!... ;)
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
The 960 reportedly has ~100W dissipation, which means its rec PSU will likely be 400-450W. That means it will work in a lot of the higher end off the shelf boxes (like the XPS 8700 in my sig w/460W PSU).

I would expect the 960 to consume more power than that, if it indeed uses the GM204 GPU and is 256b/4GB. Maybe 120W-130W, and maybe only needing 1 6-pin PCI-E.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I would expect the 960 to consume more power than that, if it indeed uses the GM204 GPU and is 256b/4GB. Maybe 120W-130W, and maybe only needing 1 6-pin PCI-E.

That would make sense. I don't see it using less than the 75W that the slot alone can provide. A single 6-pin would make sense... 75W + 75W would provide 150W, which would definitely be more than needed, but still gives ~15-20% headroom for OCing or higher-spec'd SKUs.
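
Quick sanity check on that headroom in Python (the 120-130W card draw is just the rumored figure from the post above, nothing official):

# PCIe slot (75W) plus a single 6-pin (75W) versus a rumored 120-130W card.
SLOT_W = 75
SIX_PIN_W = 75
available_w = SLOT_W + SIX_PIN_W        # 150W total budget

for card_w in (120, 130):               # rumored draw for a GM204-based 960
    spare_w = available_w - card_w
    print(f"{card_w}W card: {spare_w}W spare (~{spare_w / card_w:.0%} headroom)")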
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,738
7,348
136
TLDR: So is the 960 going to be a good upgrade from say a 660?

Depends how much RAM it's going to ship with. If the 2GB rumors are true then I think the R9 290, R9 290x, and GTX 970 make the most sense, as no one wants to spend $250 on a card and have to run a game with medium textures on day 1 (e.g., Shadow of Mordor, which recommends 3GB for high textures). The rumors and projections have been all over the map for the 960, so who knows what this thing is going to be like. Maybe CES will shed some real light on it next month.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Yes, you do. Again, I worked at these places. Without these computers failing all the time, there would be no need for the 'Geek Squad' at Best Buy in the first place. [...]
Deaf ears, bro. Time for you to give up ^_^
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
Perhaps he tries so hard explaining because he will never understand why a 960 can just slip into prebuilt systems but a 270 needs 500 watts. :)

Uhh, a 960 can slip in? Where can you find such an item? Are you buying them next to the large end cap of unicorns?

And a 270 needs 500 watts? Really? Because I'm pretty sure I can run one just fine on a good 300-watter :)
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
That would make sense. I don't see it using less than the 75W that the slot alone can provide. A single 6-pin would make sense... 75W + 75W would provide 150W, which would definitely be more than needed, but still gives ~15-20% headroom for OCing or higher-spec'd SKUs.

TweakTown notes it's supposed to land in the 100-120W range for the reference card, and I've seen some references to it being as low as 90W. Any of that is possible for a *non* OC/SC card, I think, given the 750/750 Ti draw 55W/60W and the 970 reference is rated at 145W.


"GIGABYTE and others that will ramp things up with custom coolers and increased power requirements, but the reference GTX 960 will sip between 100-120W with a total TDP of 150W."

http://www.tweaktown.com/news/42142...ce-gtx-960-january-150w-power-draw/index.html


I suspect it will look a lot like the GTX 970M, only with slightly higher clock speed and memory speed:

[Image from the TweakTown article on the rumored GTX 960 launch and 150W power draw]
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Uhh, a 960 can slip in? Where can you find such an item? Are you buying them next to the large end cap of unicorns?

And a 270 needs 500 watts? Really? Because I'm pretty sure I can run one just fine on a good 300-watter :)

I was just having fun ;) One of the kids had a 7850 at 1GHz+ running on a fanless 300W unit for two years - an 8-year-old PSU, btw. Then he got a 7970 at 1GHz, and at that point I swapped the PSU for a 400W one. All good quality, of course. And yes, there are crappy PSUs out there with only a fantasy wattage stamped on them.

The point was that NV marketing makes people say the most stupid things. Like a 70W card needing 200 watts less power supply capacity than a 135W card. It's incredible what a brand can do to logic.

But it's nice to see a new single-pin champ coming.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Feels like I'm talking with Rollo all over again :)


I'm really not sure how much simpler this can get, so here's a picture.

The reference 750 Ti is circled at 58W, and so is the reference R9 270 at 151W. That's a 93W difference.

Since we seem to be mathematically challenged: with a 70% efficient PSU, you need a unit rated about 133W higher to run an R9 270 than you need for a GTX 750 Ti.

The math: (151 - 58) / 0.70 = X, where X is the increase in the PSU requirement.

That is the difference between needing a 300W PSU, as with the GTX 750 Ti, and a 450W one (the R9 270 specs actually call for a 500W...).

Those are two entirely different worlds in terms of power supply requirements; many, many off-the-shelf systems have >= 300W PSUs, but very few have 450W+ units.

If the 960 comes in at 100W, that's 42W more than the 750 Ti.

That means the 960 would need a PSU rated 60W higher than one needed by a 750 Ti, and since the 750 Ti specs for 300W+, that implies the 960's requirement will fall into the sub-400W range (though it will probably spec for 400W+). Many of the higher-end off-the-shelf systems have sufficient PSUs for that.
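
The same calculation written out in Python (same card numbers, same 70% efficiency assumption; a rule of thumb, not a measurement):

# Extra PSU rating needed when swapping card A for card B, per the method above:
# divide the increase in card power draw by an assumed 70% PSU efficiency.
EFFICIENCY = 0.70

def extra_psu_rating_w(card_a_w, card_b_w, efficiency=EFFICIENCY):
    return (card_b_w - card_a_w) / efficiency

print(round(extra_psu_rating_w(58, 151)))   # GTX 750 Ti -> R9 270: ~133W higher rating
print(round(extra_psu_rating_w(58, 100)))   # GTX 750 Ti -> a ~100W GTX 960: ~60W higher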



[Image: power consumption chart with the reference GTX 750 Ti (58W) and reference R9 270 (151W) circled]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^For the 10th time, an after-market R9 270 does not use 150-151W of power. Your entire claim rests on some dubious/erroneous information from Tom's when dozens of far more reputable sites disprove the data. You realize that an i7 3770K system with an after-market 270 uses LESS than 200W of power? Even R9 270X/7870 and an i7 3770/4770K combo uses barely more than 210W:

http://techreport.com/review/25642/amd-radeon-r9-270-graphics-card-reviewed/8

But I guess we should ignore real-world data and believe claims that a system with an R9 270 needs a 450-500W PSU, despite users running overclocked i7 920s/860s with max-overclocked R9 290/7970GHz cards on 500-550W units for years. I personally ran my i7 860 @ 3.9GHz with a Fermi 470 @ 750MHz, both maxed out, for at least a year on a Corsair 520W, doing distributed computing projects that loaded the CPU and GPU to 99%.

Let me guess: your recommendation for a system with a single 290/7970GHz is a 750W PSU, and for 290X CF a 1200W PSU?

Since when did it become gospel that AIBs' conservative guidance on PSU recommendations is some hard and fast rule?

I mean, do you even bother reading professional reviews of how much a modern i7 with a high-end card actually uses? You really should start before polluting our forum with uninformed data that misleads potential system builders. I may sound harsh, but it has to be said to set the facts straight.

I'll even get you started on just how wrong your claims of needing a 450-500W PSU for an R9 270-class card are:

http://www.techspot.com/article/928-five-generations-nvidia-geforce-graphics-compared/page8.html

For crying out loud, even a system with the crazy-inefficient reference R9 290/290X just cracks 400W at max load:

http://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed/12

Meanwhile, mid-range cards like the 770 and 280X paired with a modern i5/i7 come in well below 325W:
http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards/10

You also need to understand that power usage at the wall is not the load the PSU actually experiences. If the total system uses 350W at the wall and the PSU is 88% efficient, your load at the PSU level is actually 308W.
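
In Python, for that 350W / 88% example:

# Wall draw versus the DC load the PSU actually delivers, at a given efficiency.
wall_draw_w = 350
efficiency = 0.88
psu_load_w = wall_draw_w * efficiency
print(f"{wall_draw_w}W at the wall x {efficiency:.0%} efficiency = {psu_load_w:.0f}W of PSU load")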

No, my replies in this thread aren't about 750 Ti vs. 270/270X. They're about getting people informed about PSU ratings, overall system power usage while gaming, and the true ratings of quality PSUs. Unfortunately, this FUD keeps getting spread on forums like ours, and poor new system builders keep buying low-end cards because they are misled into thinking their 350-400W Corsair, Antec, Sparkle, or XFX PSU isn't sufficient.

Exactly.

Obviously the better solution is to throw that XPS in the trash and build a new PC with an R9 270 (according to this thread), but hey!... ;)

So what you are saying is that it's not possible to upgrade an OEM system's power supply, ever? I don't know if you guys are trolling or what.

The XPS 8700 easily runs an i7 4770 with a GTX 660:
http://www.pcmag.com/article2/0,2817,2424864,00.asp?fullsite=true

The difference in peak load between a 660 and 270X is 3W:
http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/25.html

I guess by the end of 2014, facts start getting in the way of using perf/watt and TDP marketing to gauge power usage, like the absolutely worthless TDP claims for the 970 being depicted as its actual real-world power usage.

http://www.techpowerup.com/mobile/reviews/MSI/GTX_970_Gaming/25.html
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
Yes, you do. Again, I worked at these places. Without these computers failing all the time, there would be no need for the 'Geek Squad' at Best Buy in the first place. [...]

1) I do agree that OEMs skimp on the PSU and motherboard.
2) I don't think that is quite enough to back up some of the claims you have made in this paragraph; going to need some citations on these facts you have presented, please. I suspect it is just hyperbole and opinion in the guise of fact, but if your statements and statistics are true, I would like a source.
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
1) I do agree that OEMs skimp on the PSU and motherboard.
2) I don't think that is quite enough to back up some of the claims you have made in this paragraph; going to need some citations on these facts you have presented, please. I suspect it is just hyperbole and opinion in the guise of fact, but if your statements and statistics are true, I would like a source.

Just go read all the old PSU threads here, on hardcore, and on a few other forums across the net. Read a bunch of the old jonnyguru and larvae stuff. This is experience I'm speaking from, from years of working on this stuff; I'm not aware of any major study that collected this data for your easy consumption, though. It's basically an industry joke. Go talk to most good PC repair shop people, and I'm not talking young Geek Squad kids but the old farts like me who have been doing this for decades.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
Uhh, a 960 can slip in? Where can you find such an item? Are you buying them next to the large end cap of unicorns?

And a 270 needs 500 watts? Really? Because I'm pretty sure I can run one just fine on a good 300-watter :)

A good 300W, definitely - especially with a modern Intel processor... the AMD chips use so much more juice.

That's exactly what I did: an i5 3470, one HDD, two sticks of RAM, a 300W PSU of moderate-at-best quality, and a Radeon R9 270. Worked great all day long.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
"Strange" phenomenon is happening

We have a 180W-rated GPU that physically demands a 6-pin plus an 8-pin connection.
We have a 6-pin GPU "needing" a 500W power supply.

lol. When I heard the 970's power rating I thought: great, that's for 6-pins and a fit for one of the small media PCs in the house with a 300W PSU (the old fanless 300W). But man, even dual 6-pin is a stretch for that dude. The old PSU would have exploded from the crazy peaks the 970 puts out. Buying a PSU for the new NV GPU generation is not as simple as it used to be. Now I look forward to the 960 to see if it fits.