http://www.newegg.com/Product/...x?Item=N82E16814103067
It keeps getting more enticing to those whose case won't hold a 4870X2.
Originally posted by: WT
*copy*
450 Watt or greater power supply with 75 Watt 6-pin PCI Express power connector recommended
*paste*
But it states "Power Connector - 2 x 6 Pin"
The PSU requirement I pasted mentions only one 6-pin connector, which is slightly confusing, but since the card itself lists two, I'll assume you need BOTH 6-pin PCIe power plugs plugged in.
Originally posted by: WT
Thanks for the clarification, apoppin. I was 90% sure that's what it needed (they don't put them on there unless you NEED it), but I was remembering that possibly the older Radeon line could function on one power connector, yet needed both plugged in to do any OC'ing of the card.
I was sure I was going with the 4870 1gb cards until the huge GTX260 price cuts, so now I am again undecided. I have an eVGA 750FTW board, so my gut tells me to go Nv and SLI on this one, even tho I prefer to support ATI for a change.
Originally posted by: apoppin
Originally posted by: WT
Thanks for the clarification, apoppin. I was 90% sure that's what it needed (they don't put them on there unless you NEED it), but I was remembering that possibly the older Radeon line could function on one power connector, yet needed both plugged in to do any OC'ing of the card.
I was sure I was going with the 4870 1gb cards until the huge GTX260 price cuts, so now I am again undecided. I have an eVGA 750FTW board, so my gut tells me to go Nv and SLI on this one, even tho I prefer to support ATI for a change.
See .. Nvidia IS smart
their marketing and pricing strategy is working
you have a very difficult choice .. the 260 is very appealing for $80 less
- and the 280 is also a choice for some for $80 more
i told you they would figure it out
- what do you think they will do when they transition to 55nm? Become less aggressive? i doubt it; they are in no danger of losing further market share unless AMD can transition to their own smaller process very quickly
it appears to me the price war is mostly over until the new GPUs arrive ... but the prices are also really good for us
Originally posted by: taltamir
ah, but nvidia is using GDDR3, which is much cheaper.
nVidia made the decision that it would be cheaper for them to make a huge die and use inexpensive GDDR3, instead of a tiny die and very expensive GDDR5. Unless you can tell me how much nvidia pays for their ram, and how much AMD does, you cannot assume that AMD is making more money per chip sold.
At the end of the day, nVidia's cards are priced very aggressively.
Originally posted by: Extelleron
Originally posted by: taltamir
ah, but nvidia is using GDDR3, which is much cheaper.
nVidia made the decision that it would be cheaper for them to make a huge die and use inexpensive GDDR3, instead of a tiny die and very expensive GDDR5. Unless you can tell me how much nvidia pays for their ram, and how much AMD does, you cannot assume that AMD is making more money per chip sold.
At the end of the day, nVidia's cards are priced very aggressively.
No way is the price of GDDR3 vs. GDDR5 going to make a difference that would make the HD 4870 anywhere near as expensive as the GTX 260.
Remember the GTX 260 has 896MB of GDDR3, the HD 4870 has 512MB GDDR5. I would expect that 512MB of GDDR5 costs about the same amount as 896MB GDDR3.
nVidia did not design GT200 to be a competitor to RV770, they did not design it to be a cost effective chip. They designed it to be a chip that sold in cards priced at $450-650. Now they are selling those same cards for $250-400. Not very good for business, not just for nVidia but their partners as well. AMD designed the HD 4870 to be sold at the $300 price point, and they are doing just that. Same thing with the HD 4670; it was designed to sell for $80, the 9600GSO/8800GS was designed to sell more in the $120-150 price range.
nVidia doesn't have a choice about big die vs. small die at this point, the G80 architecture is not efficient in using die space so until they release a new architecture they are stuck. Going from G92 -> GT200 doesn't quite double the chip's resources and it is 78% larger. Going from RV670 -> RV770 provides 2.5x the shading/texture resources and improved pixel performance, yet is 33% larger in size. RV770 is a bit of a special case because AMD also removed the ring bus architecture which freed up a good amount of space, but regardless AMD's architecture is more efficient/mm^2.
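The scaling comparison above can be sanity-checked with a quick back-of-envelope calculation. This sketch uses commonly cited die sizes and shader counts for these chips (they are assumptions, not figures from this thread), and reproduces the "78% larger" and "33% larger" numbers:

```python
# Rough die-efficiency check for the G92 -> GT200 and RV670 -> RV770 claims.
# Die areas (mm^2) and shader counts below are assumed from commonly cited
# figures; note ATI/nVidia shaders are not directly comparable across vendors,
# so only same-vendor ratios are meaningful here.
dies = {
    "G92":   {"area": 324, "shaders": 128},
    "GT200": {"area": 576, "shaders": 240},
    "RV670": {"area": 192, "shaders": 320},
    "RV770": {"area": 256, "shaders": 800},
}

def growth(old, new):
    """Return (shader scaling, die-area scaling) between two chips."""
    return (dies[new]["shaders"] / dies[old]["shaders"],
            dies[new]["area"] / dies[old]["area"])

shader_x, area_x = growth("G92", "GT200")
print(f"G92 -> GT200: {shader_x:.2f}x shaders for {area_x - 1:.0%} more area")

shader_x, area_x = growth("RV670", "RV770")
print(f"RV670 -> RV770: {shader_x:.2f}x shaders for {area_x - 1:.0%} more area")
```

With these numbers, GT200 gets 1.88x the shaders of G92 for a 78% larger die, while RV770 gets 2.5x the shaders of RV670 for only 33% more area, which is the per-mm² efficiency gap being argued above.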