[TPU] Nvidia prepares "price cuts across its entire lineup"


Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
[Image: 58681.png]


I still find it funny that despite posting on Anandtech so frequently, you never use their benchmarks... I wonder why? :sneaky:

A 13.8% performance increase comparing a factory-OC'd GTX 770 to a reference-clocked XFX 280X, which shrinks to 5.8% when comparing the two factory-overclocked cards. Worth the $90-$100 difference? And how much does the factory-overclocked GTX 770 used in this review cost? It's also a 2GB card.

For a $310 card, the Asus 280X makes the GTX 780, which is only 16% faster in this game, look like a horrible buy at $649...
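Rough math behind the value comparison above, as a sketch: the $310 price and the 13.8% figure are from the thread, while the ~$400 street price for the factory-OC'd GTX 770 is an assumption (the post only says it costs $90-$100 more).

```python
# Relative performance per dollar, using the figures quoted in this post.
# Baseline: reference-clocked R9 280X at $310. The ~$400 price for a
# factory-OC'd GTX 770 is an assumption, not a quoted figure.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Performance per dollar in arbitrary units (baseline card = 1.0)."""
    return relative_perf / price

r9_280x = perf_per_dollar(1.000, 310)  # baseline
gtx_770 = perf_per_dollar(1.138, 400)  # +13.8% performance, ~$90 more

print(f"{gtx_770 / r9_280x:.2f}")  # ~0.88: roughly 12% worse perf/$ despite the fps lead
```

In other words, on these numbers the fps lead does not cover the price gap, which is the point being argued.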
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
The Tech Report review used the slowest reference GTX 770 model, clocked at 1045MHz, and tested it against aftermarket specimens from Asus and XFX.

Using an aftermarket GTX 770 with higher base and boost clocks would have produced different results. With that said:

[Image: 58681.png]


I still find it funny that despite posting on Anandtech so frequently, you never use their benchmarks... I wonder why? :sneaky:
[Image: Crysis3_2560x1440_OFPS.png]

[Image: Crysis3_2560x1440_PER.png]

Le cherrypick no :sneaky:
 

ams23

Senior member
Feb 18, 2013
907
0
0
A 13.8% performance increase comparing a factory-OC'd GTX 770 to a reference-clocked XFX 280X.

Nope, Anandtech appears to have used a stock-clocked reference GTX 770. In fact, if you look at Anand's original GTX 770 review from more than four months ago (!), the fps difference between then and now is 0.1fps. So a Superclocked GTX 770 should be a bit faster than reference clocks, which would extend the lead in this one Anandtech benchmark (note that the SC editions are only 3% faster than stock at 25x16 in Crysis 3, probably due to being bandwidth limited). That doesn't make the GTX 770 an amazing value, of course, but for anyone who plans to purchase Batman: Arkham Origins, that adds about $50 of extra value to the GTX card that is built into the price at this moment. A $349 GTX 770 would clearly look a lot nicer, though, and I'm sure it will get there in the coming months.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136

So just because you don't like it, it's a gimmick? You can hate on PhysX all you want, bro, but the fact is, PhysX ain't going anywhere. It's the most versatile and powerful physics middleware available today, as seen by the fact that it's integrated into Unreal Engine 4.

2. Bigger gimmick

Right, and how many games use AMD's HD3D?

*crickets*

3. I think the 7990 works pretty well with up-to-date drivers, bro, what are you talking about? Nvidia has and had SLI hiccups too, stop bullshitting.

Compared to the rough, pothole-filled road that is Crossfire, SLI is like the Autobahn.

4. So forcing ambient occlusion and some AA are determining factors when purchasing a GPU, bro? Grasping at straws there, aren't we?

They aren't determining factors, but they make the gameplay experience better than stock.

5. Eyefinity

Eyefinity is awesome, I will admit, but good luck playing at those ultra-high resolutions on the stuttery mess that is Crossfire. AMD's frame pacing driver doesn't even work at resolutions above 1600p, if I'm not mistaken.

6. Oh the "better driver" line...

How long does it take AMD to fix a problem or bug with a game?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
A 13.8% performance increase comparing a factory-OC'd GTX 770 to a reference-clocked XFX 280X, which shrinks to 5.8% when comparing the two factory-overclocked cards. Worth the $90-$100 difference? And how much does the factory-overclocked GTX 770 used in this review cost? It's also a 2GB card.

The GTX 770 in the Anandtech review was apparently a reference model as well. Factory-overclocked versions like the Gigabyte OC, MSI Lightning, etc. would have had an even larger lead.

For a $310 card, the Asus 280X makes the GTX 780, which is only 16% faster in this game, look like a horrible buy at $649...

I agree wholeheartedly. The GTX 780 is a horrible buy, but then again, the only reason it costs that much is because AMD was late to the party. Had the 290X been around, it would have been priced much more reasonably.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Le cherrypick no :sneaky:

Those graphs look pretty even to me, with a slight lead for the 280X, which can be explained by the fact that it has 1GB of extra VRAM.

Crysis 3 will easily use over 2GB if it's available. That's one of the things I like about AMD: they usually tend to be more generous with the amount of VRAM on their cards.

But it's also impressive that the GTX 770 can keep up with the 280X when it has not only less VRAM, but a lot less memory bandwidth as well.

The Kepler architecture is extremely efficient...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So just because you don't like it, it's a gimmick? You can hate on PhysX all you want, bro, but the fact is, PhysX ain't going anywhere. It's the most versatile and powerful physics middleware available today, as seen by the fact that it's integrated into Unreal Engine 4.

It's pretty hard to talk about price/performance and general value when you spent $450 x 2 on 770 4GB cards, plus say $100 on that slave PhysX card. In total, your GPU setup was $1,000 USD, and yet the HD 7970 GE/R9 280X offers 95% of the performance for $600-620.

How can anyone make a list of features/advantages of AMD cards when you paid $400 USD more, or 61% more, and yet your cards still cannot really outperform HD 7970 GE/R9 280X CF in performance?
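The percentages in this comparison line up with the quoted prices; a quick sketch, using only the dollar figures stated above:

```python
# Total cost of each setup, using the prices quoted in this post.
dual_770_setup = 450 * 2 + 100  # two GTX 770 4GB cards + dedicated PhysX card
cf_280x_setup = 620             # HD 7970 GE / R9 280X CF, upper end of $600-620

extra_cost = dual_770_setup / cf_280x_setup - 1
print(f"${dual_770_setup} vs ${cf_280x_setup}: {extra_cost:.0%} more")  # -> 61% more
```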

The reason I didn't link AT's scores is that they don't have an average graph at the end. Also, what's the point of linking BF3 when BF4 is coming out next month? Even with the BF3 win you linked, there is no getting around that BF4 recommends 3GB of VRAM, and we have this:

MSI Gaming 770 4GB = $465
MSI Gaming R9 280X = $310

A bit better performance in some NV-friendly games (such as Bioshock Infinite, Battlefield 3, Batman Arkham City and Origins, Crysis 3, Assassin's Creed 3, Borderlands 2, Call of Duty: Black Ops 2, etc.).

We are just going in circles. There is a list of AMD-friendly games just as long, including Tomb Raider, Sleeping Dogs, Hitman: Absolution, Dirt 3, GRID 2, Company of Heroes 2, Rome Total War 2, Far Cry 3: Blood Dragon, the Metro games, etc.

AC3 performance has been fixed by AMD:
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/6.html

Performance in Black Ops 2 flies on all modern cards. Even a $199 R9 270X crushes this title:
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/9.html

Crysis 3 is hardly a win for the 770, as at 1080p none of these cards has the power to max it out:
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/12.html

A significant win for 770 is SC:Blacklist:
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/19.html

I agree with Blackened that while some small price premium may be assigned to the 770, it should be much less than $90-150. A GTX 770 2GB at $329 and a GTX 770 4GB at $349 would be a lot more reasonable.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's pretty hard to talk about price/performance and general value when you spent $450 x 2 on 770 4GB cards, plus say $100 on that slave PhysX card. In total, your GPU setup was $1,000 USD, and yet the HD 7970 GE/R9 280X offers 95% of the performance for $600-620.

$1,000 is better than the $1,300 I would have spent if I had gone GTX 780 SLI.

My GTX 770s have 1GB more VRAM than the GTX 780, and overclocked to 1280MHz core/8GHz memory, they are nearly as fast as stock GTX 780s, which are blisteringly fast.

And until I see the Crossfire quality of the R9 series, I can't say whether they are worth purchasing.

The 7900 series had excellent performance in Crossfire judging by FRAPS, but when you looked at FCAT, it was another story.

How can anyone make a list of features/advantages of AMD cards when you paid $400 USD more, or 61% more, and yet your cards still cannot really outperform HD 7970 GE/R9 280X CF in performance?

Performance is one thing, quality is another. What use are the high frame rates that multi-GPU provides if it's a stutter or lag fest?

Crossfire still hasn't proven itself the equal of SLI, so they aren't directly comparable, imo.

The reason I didn't link AT's scores is that they don't have an average graph at the end. Also, what's the point of linking BF3 when BF4 is coming out next month? Even with the BF3 win you linked, there is no getting around that BF4 recommends 3GB of VRAM, and we have this:

MSI Gaming 770 4GB = $465
MSI Gaming R9 280X = $310

Someone in this thread suggested that Nvidia should discontinue the 2GB versions of the GTX 770 and sell the 4GB exclusively. I agree completely, as 2GB is really not enough for smooth, stutter-free performance in the upcoming generation of games.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
The GTX 770 is a bad value compared to AMD's 280X and 7970 GHz. I think only a few people here are defending Nvidia's outrageous pricing. Nvidia needs a price cut.
 

Shakabutt

Member
Sep 6, 2012
122
0
71
wowtrainer.net
So just because you don't like it, it's a gimmick? You can hate on PhysX all you want, bro, but the fact is, PhysX ain't going anywhere. It's the most versatile and powerful physics middleware available today, as seen by the fact that it's integrated into Unreal Engine 4.



Right, and how many games use AMD's HD3D?

*crickets*



Compared to the rough, pothole-filled road that is Crossfire, SLI is like the Autobahn.



They aren't determining factors, but they make the gameplay experience better than stock.



Eyefinity is awesome, I will admit, but good luck playing at those ultra-high resolutions on the stuttery mess that is Crossfire. AMD's frame pacing driver doesn't even work at resolutions above 1600p, if I'm not mistaken.



How long does it take AMD to fix a problem or bug with a game?

I can count on one finger the number of dudes I've seen talking about 3D Vision or the games that take advantage of the tech.

Just like in the movie industry, 3D is fizzling out; thing is, in the PC gaming space nobody gave two shits about it.

PhysX has been chugging along on Nvidia's moneyhatting efforts for 10 years now; all you get is one, maybe two games a year that incorporate it, and even then it's a joke: exaggerated debris a la Borderlands 2 that just looks silly, or some fancy smoke in the Batman games. That's not game-enhancing stuff, bud, it's just a gimmick.

The stuttering has been fixed with the frame pacing stuff from what I know, so the 7990 looks like a mighty fine buy compared to the 690 right about now (lol at that $1,000 price tag).

Edit: yep, looks like I'm right

[Image: Crysis3-FOT.png]

[Image: TombRaider-FOT.png]

[Image: FarCry3-FOT.png]


Look at that, look at that Nvidia value: for just 400 bucks more you get... HBAO and shiny AA.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
AMD's looking really good at dominating consoles with an inside track to PC gaming (due to consolification, Mantle). I don't know how NV pulls a rabbit out of the hat from here unless they are true believers in cloud-based gaming, and even then, that would probably hurt AMD and NV's profit margins, not that I believe cloud-based gaming is going to take off anytime soon for hardcore games.

AMD's CPU side, on the other hand... ugh.

Anyway, look at BF4 and other heavy hitters. Nobody cares if you beat the competition in a lightweight game where pretty much any GPU sticks to 60fps like glue. The heavy hitter games are the ones that demand higher performance. So beware of simply taking averages unless you are averaging the top 6 heavy games or something.
 

Ibra

Member
Oct 17, 2012
184
0
0
I can count on one finger the number of dudes I've seen talking about 3D Vision or the games that take advantage of the tech.

Just like in the movie industry, 3D is fizzling out; thing is, in the PC gaming space nobody gave two shits about it.

PhysX has been chugging along on Nvidia's moneyhatting efforts for 10 years now; all you get is one, maybe two games a year that incorporate it, and even then it's a joke: exaggerated debris a la Borderlands 2 that just looks silly, or some fancy smoke in the Batman games. That's not game-enhancing stuff, bud, it's just a gimmick.

The stuttering has been fixed with the frame pacing stuff from what I know, so the 7990 looks like a mighty fine buy compared to the 690 right about now (lol at that $1,000 price tag).

Edit: yep, looks like I'm right

[Image: Crysis3-FOT.png]

[Image: TombRaider-FOT.png]

[Image: FarCry3-FOT.png]


Look at that, look at that Nvidia value: for just 400 bucks more you get... HBAO and shiny AA.

AMD is gone from high-end CPU market = high-end CPU market is dead.
AMD is gone from 3D market = 3D market is dead.

Classic. :thumbsup:

After reading the Gravity (2013) reviews, it's the best 3D movie ever made. :cool:

AMD's looking really good at dominating consoles with an inside track to PC gaming (due to consolification, Mantle). I don't know how NV pulls a rabbit out of the hat from here unless they are true believers in cloud-based gaming, and even then, that would probably hurt AMD and NV's profit margins, not that I believe cloud-based gaming is going to take off anytime soon for hardcore games.

AMD's CPU side, on the other hand... ugh.

Anyway, look at BF4 and other heavy hitters. Nobody cares if you beat the competition in a lightweight game where pretty much any GPU sticks to 60fps like glue. The heavy hitter games are the ones that demand higher performance. So beware of simply taking averages unless you are averaging the top 6 heavy games or something.

CoD: Ghosts, Watch Dogs, and AC: Black Flag system requirements are crazy. Nvidia won the heavy hitters. No?
 

Shakabutt

Member
Sep 6, 2012
122
0
71
wowtrainer.net
AMD is gone from high-end CPU market = high-end CPU market is dead.
AMD is gone from 3D market = 3D market is dead.

Classic. :thumbsup:

After reading the Gravity (2013) reviews, it's the best 3D movie ever made. :cool:



CoD: Ghosts, Watch Dogs, and AC: Black Flag system requirements are crazy. Nvidia won the heavy hitters. No?

Game Changer man. :rolleyes:

And what did Nvidia win by moneyhatting Ubi and Activision?

Better performance? We'll see about that; knowing these games in particular are turning points in sys reqs, and that the specs they put out look more and more like what the consoles are running, I wouldn't jump the gun and praise NV just yet.

If you are referring to the gimmicks I've just posted (PhysX, 3D)... then whatever. (Guess 5 mil bucks doesn't buy you much dev support these days, haha.)
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
1. Gimmick
2. Bigger gimmick
3. I think the 7990 works pretty well with up-to-date drivers, bro, what are you talking about? Nvidia has and had SLI hiccups too, stop bullshitting.
4. So forcing ambient occlusion and some AA are determining factors when purchasing a GPU, bro? Grasping at straws there, aren't we?
5. Eyefinity
6. Oh, the "better driver" line...
Lol, pretty much. I think NVIDIA is still ahead in the multi-GPU department, but the rest of the inane arguments fanboys try to put forward as "facts" are ridiculous. I'll be interested to see how Mantle develops, just to see how Nvidia fanboys will try to spin it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Lol, pretty much. I think NVIDIA is still ahead in the multi-GPU department, but the rest of the inane arguments fanboys try to put forward as "facts" are ridiculous. I'll be interested to see how Mantle develops, just to see how Nvidia fanboys will try to spin it.

Based on the current performance of the GTX 770 in BF4, the R9 290X should beat the GTX 780 without Mantle. If the R9 290 is $499-519, the $450 GTX 770 4GB would need an immediate $100 price drop.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Based on the current performance of the GTX 770 in BF4, the R9 290X should beat the GTX 780 without Mantle. If the R9 290 is $499-519, the $450 GTX 770 4GB would need an immediate $100 price drop.
If the R9-290X is actually as fast as the Titan, I agree. The bigger benefit the 780 has is the slew of aftermarket parts available that offer more value for a very modest price increase. Just a hypothesis: given the stagnation of tech, if AMD really does have a home run with Mantle, they may price the R9-290X higher (as expected) to increase their margins by capturing the early adopters as well as any new buyers once Mantle comes to fruition. Anyway, I'm still keeping my eye out for R9-290X performance on water.