GTX650Ti Review Leak

Status
Not open for further replies.

RussianSensation

Elite Member
Sep 5, 2003
Surprised? Talking about Romania.

You said GTX660 costs as much as HD7850 2GB. I can find HD7850 2GB for cheaper.

Also, the 7870 you linked is a 1100MHz version, and that's substantially faster than a GTX660. At 1150MHz, the HD7870 is 18% faster at 1080p than a GTX660. That still makes the 10% higher-priced 7870 no worse in price/performance.
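To sanity-check that arithmetic, here's a minimal sketch (not from the thread; it just plugs in the 18%-faster / 10%-pricier figures quoted above):

```python
# Relative price/performance check using the figures from the post:
# HD7870 assumed 18% faster and 10% pricier than a GTX660 baseline.
perf_7870, price_7870 = 1.18, 1.10   # GTX660 = 1.00 for both

ratio_660 = 1.00 / 1.00              # performance per unit of money, normalized
ratio_7870 = perf_7870 / price_7870

print(round(ratio_7870, 3))          # -> 1.073, i.e. ~7% better value
```

So under those assumed figures, the 7870's extra performance more than covers its extra cost.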

What about on the whole? I see the MSI TF GTX680 at 2308 vs. 2095 for the MSI Lightning 7970. The latter ships with top-of-the-line premium components, 1070MHz clocks, and huge overclocking headroom. HD7950 Gigabyte Windforce 3X for 1329 vs. Gigabyte Windforce GTX660Ti OC for 1346. The former is faster out of the box and puts up solid overclocks. It doesn't look like NV has cheaper prices in Romania either. The high prices of the 7850 in your country probably don't yet reflect the price drops that have taken place worldwide, but they will soon.

Your other point is that prices differ around the world, and that is true. However, a quick glance at prices across Europe, North America, and Oceania shows that in 8/10 cases AMD provides superior price/performance worldwide at the moment.

Also, I've heard many people in Europe import their GPUs from neighbouring countries where the cost is lower, and import fees can be as low as 10-15 euros.
 

RussianSensation

Elite Member
Sep 5, 2003
RS your scan.co.uk GTX 660/7870 listed prices are not so innocent too ^_^

What do you mean? I picked the cheapest after-market 7870/660 I could find. In fact, Scan.co.uk only has one GTX660 under £200. I am not going to sit there and compare prices across 200 shops in the UK. This website does the job well at a quick glance. There are six HD7870 cards under £200 and only one GTX660 below that price.

The website even shows three 7870s under £190. Here is an MSI TF 7870 for £183.34.

That's not even the best part. Not only is the HD7870 cheaper in most places, it is also a faster GPU than the GTX660 to begin with, as the 660's performance with MSAA is poor. Even if the GTX660 magically cost the same in Europe, it would still offer worse price/performance since it's slower by about 6-7% overall. Of course, none of this matters since NV customers will continue to pay premiums for NV, and as long as this continues, NV has no incentive to lower prices. How well NV cards sell does not determine how good their performance is for the price.

What I find interesting is that when AMD launched at high prices, many of us criticized its price/performance. Yet when NV has offered inferior price/performance since at least June, the company is defended left, right, and center because it has what, PhysX? It's always the same story here. When AMD cards are overpriced, we call them on it, but when NV cards are overpriced, NV-specific features are brought into the discussion as if AMD has no features to speak of. Based on price/performance, NV cards continue to be overpriced and have been since June 2012.

It still has not been answered why anyone in the US should buy a $149 GTX650Ti over a $159-165 HD7850 1GB. The population of the US is more than 300 million, nearly 15x the size of Romania's. If NV considers the GTX650Ti an HD7770 competitor, as Keysplayer keeps pointing out, then it can't be $149, as the HD7770's official MSRP is $119. Maybe the GTX650Ti will launch at $129-139, but $149 for this level of performance is disappointing against the 7850 1GB. Not only that, it's horrendous against old products such as the HD6950 2GB, HD5850 OC, and GTX470 OC/GTX560Ti. At least the HD7850 has huge overclocking headroom that lets it reach GTX580-level performance. What is the GTX650Ti bringing to the table against the 7850 at $149, other than a $10-15 lower price?
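One way to frame that last question: given the price gap, the break-even point is easy to compute (a sketch using only the prices mentioned above; actual performance figures are deliberately left out):

```python
# At $149 (GTX650Ti) vs. $159 (cheapest HD7850 1GB mentioned), how much
# faster would the 7850 need to be just to MATCH the 650Ti on price/perf?
price_650ti, price_7850 = 149.0, 159.0
break_even = price_7850 / price_650ti - 1
print(f"{break_even:.1%}")  # -> 6.7%; any larger performance lead and
                            # the 7850 is the better value per dollar
```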
 

3DVagabond

Lifer
Aug 10, 2009
"I could be wrong" part?

Remember HD5000 and AMDs race for market share back then?
What good was it, if apparently NV can still do pretty much as they please?

And you're saying that at THIS point in time, Mr. Rory Read & Co. are willing to trade financial result for market share? Well I guess anything is possible :p

"I could be wrong" was not in the original post I responded to. You were speaking with utter authority like you had inside info on AMD stock levels and were sitting in on board meetings.

Again you are doing it with the 5000 series. You are assuming the reason for the 5000 series' pricing was solely to garner market share, when you don't know that. There were, and always are, many more forces acting on the market. If it were as simple as you are making it, we could all be CEOs.

You are also very wrongly assuming that market share isn't important. It's very, very important, especially in a tech industry where you rely on people making products to drive your business. If you are insignificant, no one will concern themselves with tailoring their games for your arch. Any idea what BF3 benches at on a Matrox card? :)
 

blastingcap

Diamond Member
Sep 16, 2010
And possibility of having even a single GTX650 Ti listed for less than MSRP is a leap of faith :rolleyes:

This is a good point, and I also think we need to have more info about things like typical oc headroom. I would wait for official reviews from reputable sources to roll in and for overclocking headroom numbers to roll in before casting judgment.
 

f1sherman

Platinum Member
Apr 5, 2011
"
Again you are doing it with the 5000 series. You are assuming the reason for the 5000's pricing was solely to garner market share when you don't know that. There were, and always are, many more forces acting on the market.

Well ofc there's a flow and a balance between the goals, and you don't just follow a single target no-matter-what.
They would be giving the HD5000 away for free otherwise, no?

But Northern Islands offered a bigger boost over GTX200 than HD7000 does over GTX500, and yet back then AMD asked less money.

So in which of the two cases was AMD pursuing market share MORE?
 

3DVagabond

Lifer
Aug 10, 2009
Well ofc there's a flow and a balance between the goals, and you don't just follow a single target no-matter-what.
They would be giving the HD5000 away for free otherwise, no?

But Northern Islands offered a bigger boost over GTX200 than HD7000 does over GTX500, and yet back then AMD asked less money.

So in which of the two cases was AMD pursuing market share MORE?

I don't think they ever pursue market share less. They are always trying to "sell as much as they can for as much as they can".

Evergreen (not NI) launched on a much more mature process. If you look at the performance of SI now, the 7970 is trading blows with the 6990, just like the 5870 vs. the 4870x2. I don't think the % of performance was much of a determining factor for pricing. Besides, it's generally agreed that the 5870 was launched at too low of a price. It sold out very quickly and contrary to what nVidia were saying, the 480 was nowhere in sight. If the 480 was actually going to be launched in a few weeks like everyone thought, and had 512cc @ whatever the full clocks were supposed to be (I honestly don't remember now), performing at ~50% faster than the 5870, that $380 (IIRC) launch price wouldn't have seemed so low looking back.

Also, let's not forget that AMD has new management. It makes it impossible to draw too many direct comparisons. Still, people in those positions looking at the same situation will often do similar things.
 

f1sherman

Platinum Member
Apr 5, 2011
I don't think they ever pursue market share less. They are always trying to "sell as much as they can for as much as they can".

So market share gains/losses are pretty much accidental byproducts of the short-term pursuit of profit?
That was my take on the AMD pricing issue, to which you offered:

A wants to gain market share. A increases production and reduces pricing to meet the desired goal.

Which would be weird timing imho, although WTH... since GPU prices won't make or break AMD, why the hell not :)
 

3DVagabond

Lifer
Aug 10, 2009
So market share gains/losses are pretty much accidental byproducts of the short-term pursuit of profit?
That was my take on the AMD pricing issue, to which you offered:

A wants to gain market share. A increases production and reduces pricing to meet the desired goal.

Which would be weird timing imho, although WTH... since GPU prices won't make or break AMD, why the hell not :)

You are purposely ignoring 90% of my posts' content to obscure their point. I'm trying to have a discussion, not play games with you.
 

f1sherman

Platinum Member
Apr 5, 2011
Like the part where you accuse me of assuming that market share isn't important

Sorry if I don't respond to all the issues you've raised, but that does not mean I'm playing games... :(
 

3DVagabond

Lifer
Aug 10, 2009
Like the part where you accuse me of assuming that market share isn't important

Sorry if I don't respond to all the issues you've raised, but that does not mean I'm playing games... :(

As time goes by, the manufacturing BoM cost goes down. Assuming your manufacturing can handle increased demand, prices go down and you sell more. nVidia has said they are supply constrained. That would stop them from responding and offering similar perf/$.
 

Keysplayr

Elite Member
Jan 16, 2003
"A non overclocked GeForce GTX 660 Ti video card will need to play with PhysX set to Medium or else the performance will drop under 35 FPS for extended periods of time during the combat intense areas of the game." HardOCP

We've already established in the BL2 thread that even a GTX680/GTX680 SLI can drop below 30fps with PhysX High. PhysX on the GTX650Ti is an oxymoron, as the card is way too slow to use it.

I was hoping somebody would present this argument, but I'm a bit disappointed it was you, Russian. Because I've been playing BL2 at 1920x1200, all settings high and farthest, PhysX high, on my................ wait for it................... GTX280, and maintaining playable framerates. In single-player mode, framerates stay above 25fps and hover around 32-37 on average. That is a GTX280, dude. And you sit there and have the you-know-what to type out that a 660Ti will need to be set to medium PhysX?

Look man, it ain't so. I've been trying all different combinations of single cards, and primary cards with a dedicated PhysX card as well. Call me crazy, but a GTX660Ti is just a "scooch" quicker than a GTX280. I am so tired of the BS around here.
 

3DVagabond

Lifer
Aug 10, 2009
[HardOCP benchmark chart]

As we mentioned on the previous page, the NVIDIA GeForce GTX 660 standard video card was not playable with PhysX set to High at 1080p. At these settings it averaged 42.1 FPS. When we increased PhysX from Medium up to High, the more demanding areas of the run-through had completely different experiences. The framerate dropped from between 35 and 40 FPS to between 30 and 35 FPS. This 5 FPS performance loss really hurt in the demanding areas of the game and created an unplayable environment. Any users playing at a lower resolution than 1920x1080 will experience a performance boost identical to the GTX 680, 670, and 660 Ti moving from 2560x1600 to 1080p.

A GTX 660 is close to 2x as fast as a GTX 280.
 

KingFatty

Diamond Member
Dec 29, 2010
That is a GTX280 dude. And you sit there and have the you know what to type out that a 660Ti will need to be set to medium PhysX?

Well, to be more specific, are you calling out the cite that RS provided to back up his claim? It's pretty notable that he actually went to the trouble of citing support for his statement.

And it is kind of weird that you counter his specific citation with anecdotal evidence from a completely different video card. It's perhaps the logical fallacy known as a non sequitur: http://en.wikipedia.org/wiki/Non_sequitur_(logic)
 

Keromyaou

Member
Sep 14, 2012
I think PhysX is a major marketing ploy for Nvidia. PhysX is not actually a big deal. However, if a game lacks PhysX, you feel you're missing something. There are not many PhysX games, but Nvidia put PhysX into several popular ones, such as Batman AA & AC and Borderlands 2. Frankly, outside of these three games, a lack of PhysX is not a biggie at all. But if you have a choice between a card with PhysX and one without, you would probably choose the one with PhysX as long as the other conditions are the same. Nvidia knows this very well. Although PhysX is a gimmick, its power as a marketing ploy can't be overstated, I guess.

Another issue is about so-called loyal customers. Those who defend either Nvidia or ATI no matter what need to understand that, in reality, companies don't give a shit about their fanboys. On the manufacturer side, there is an Nvidia vs. ATI issue because they are competitors. However, on the consumer side, the real issue should not be Nvidia vs. ATI, but manufacturers vs. consumers. If you look at what Nvidia or ATI actually does to customers, you'll realize they do very sneaky things. When consumers stop buying overpriced goods, manufacturers will try to sell better goods at better prices. Seriously, this fanboyism needs to go away.
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
I was hoping somebody would present this argument, but I'm a bit disappointed it was you, Russian. Because I've been playing BL2 at 1920x1200, all settings high and farthest, PhysX high, on my................ wait for it................... GTX280, and maintaining playable framerates. In single-player mode, framerates stay above 25fps and hover around 32-37 on average. That is a GTX280, dude. And you sit there and have the you-know-what to type out that a 660Ti will need to be set to medium PhysX?

Look man, it ain't so. I've been trying all different combinations of single cards, and primary cards with a dedicated PhysX card as well. Call me crazy, but a GTX660Ti is just a "scooch" quicker than a GTX280. I am so tired of the BS around here.


Hmm, doesn't the GTX 280 have a 512-bit memory bus vs. a 192-bit memory bus?? BL2 is a DX9 game, for crying out loud. Performance is going to be all over the board with the GTX660Ti. You are comparing a go-kart to an RV.
 

SPBHM

Diamond Member
Sep 12, 2012
Hmm, doesn't the GTX 280 have a 512bit memory bus vs. a 192 bit memory bus?? BL2 is a DX9 game for crying out loud. Performance is going to be all over the board with the GTX660ti. You are comparing a go kart to an RV.

I don't think this would be relevant for PhysX; at the end of the day, it's only 240 "CUDA cores" of a much older gen...

Also, it's 512-bit at 2200MHz vs. 192-bit at 6000MHz; the memory bandwidth is close.
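For anyone checking that claim: peak bandwidth is just bus width in bytes times the effective data rate. A quick sketch with the numbers from the post (rates assumed to be effective MT/s):

```python
# Peak memory bandwidth = (bus width / 8 bits per byte) * effective rate.
# GTX280: 512-bit @ 2200 effective; GTX660Ti-class: 192-bit @ 6000 effective.
def bandwidth_gbps(bus_bits: int, effective_mtps: int) -> float:
    return bus_bits / 8 * effective_mtps / 1000  # GB/s

print(bandwidth_gbps(512, 2200))  # -> 140.8 GB/s
print(bandwidth_gbps(192, 6000))  # -> 144.0 GB/s, indeed very close
```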
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
I don't think that this would be relevant for physx, at the end of the day, it's only 240 "cuda cores" of a much older gen...

also it's 512bit 2200MHz vs 192 6000MHz memory bandwidth is close.

Remember that the 600 series relies on its CUDA core count to compensate for the missing shader clock.
 

SPBHM

Diamond Member
Sep 12, 2012
Remember that the 600 series relies on cuda cores to compensate for the missing shader clock.

I think it's not really relevant in this case... it's 240 cores from the 2008 gen at 1.3GHz vs. 1344 at near 1GHz... no way...
 

blastingcap

Diamond Member
Sep 16, 2010
Guys, let's just relax. The 650 Ti is supposed to launch Oct. 9, right? That's very soon now. We'll have a LOT more data to go on shortly.
 

Keysplayr

Elite Member
Jan 16, 2003
Well to be more specific, are you calling out the cite that RS provided to back up his claim? I mean that's pretty odd that he actually went through the trouble of citing support for his statement, going through the trouble of citing it.

And it is kind of weird that you counter his specific citation with some anecdotal evidence, using a completely different video card. It's perhaps a logical fallacy known as http://en.wikipedia.org/wiki/Non_sequitur_(logic)

Or you could have mannered up and asked, "Keys, could you perhaps snap some video of you playing BL2 with your 280 and the settings you used?"

My reply: "Sure, KingFatty, no problem at all. Just give me till after work today and I'll have it sometime early this evening."

:colbert:
 

Zanovar

Diamond Member
Jan 21, 2011
Or you could have mannered up and asked, "Keys, could you perhaps snap some video of you playing BL2 with your 280 and the settings you used?"

My reply: "Sure, KingFatty, no problem at all. Just give me till after work today and I'll have it sometime early this evening."

nvm
 