Are you buying a GTX 1070/1080?

Page 12 - AnandTech Forums

Are you buying either the GTX 1070 or the GTX 1080?

  • Yes, a GTX 1080 for me.

  • Yes, a GTX 1070 for me.

  • No, and I'm not in the market for a new GPU anytime soon.

  • No, I plan to buy something else instead (please elaborate in thread).


Results are only viewable after voting.

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
What has me feeling frustrated is the release cadence and the performance increase each year. They make it almost not matter which one you get. You can spend $700 for a 30% increase on a mid-range card, or wait another year and spend the same for 30% more than that on a big-die card. They know it doesn't benefit someone very much to wait for a big-die chip, given these insane prices for even the mid-range, so they bank on you buying both. If I wait for big Pascal, a year later mid-range Volta comes along with 30% more performance for the same price. Suddenly my big Pascal looks like a mid-range card. You see the frustration?
It used to be that you spent top dollar and you got top performance, or close to it, for roughly two years. They literally crapped on my sandwich here with their new release schedule. I will stay on the big-die schedule, won't buy overpriced mid-range cards, and certainly won't give them the satisfaction of buying twice in a generation.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Easy. There is a Pascal chip ABOVE it and BELOW it. What is the word for something in between the top and the bottom? Middle. Mid-range. This is irrespective of die size or anything else; it's purely a product-positioning call.

Oh, please tell us what this is. A future Titan maybe released in six months? A Ti version a year from now? Do those exist, and will they also be available in a week or two? I think we all know the answer to that...
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
How about the P100 they literally demonstrated first? Really? Did you not watch that? I'd say the fact that they're already selling them pretty well confirms that there is a big Pascal chip. Last I checked, 3584 was bigger than 2560 and 16GB was bigger than 8GB.

If you want to play games and act like nvidia isn't going to ever sell a consumer Big Pascal, you're entitled to put your head in whatever sand you want. But back here in reality, GP104 is the MIDDLE chip as has been every xx4 chip before it and as indicated by the CONFIRMED existence of a chip above it. GP106 isn't confirmed yet but since I don't stick my head in the sand I will count on it coming out below GP104. Like every xx6 named chip before it.

But if you want to give nVidia $700 for the 16nm equivalent of a 560 Ti, be my guest. Not my money.
 
Last edited:

MagickMan

Diamond Member
Aug 11, 2008
7,460
3
76
From this new information, we know MagickMan's source is within 5 miles of his location.
GOGOGO!
/sarcasm

Reading comprehension is hard? It would be easy for NV to track down who I am. You guys? Not so much. Here's a pro tip: wait 'til tomorrow and see if I'm right. :eek:
 

maddie

Diamond Member
Jul 18, 2010
5,152
5,540
136
Oh, please tell us what this is. A future Titan maybe released in six months? A Ti version a year from now? Do those exist, and will they also be available in a week or two? I think we all know the answer to that...
Doesn't this depend on whether AMD is competitive?

If Vega is early and performs well, do you honestly think Nvidia will allow them to remain at the top unchallenged? They will respond, even if they have to swallow lower margins to do it.

That's why I don't begin to understand the rabid anti-AMD posters. Talk about cutting off your nose to spite your face. So dumb.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
I want a strong AMD. If there were an AMD announcement to parallel the Nvidia products, the 1080 wouldn't be $700. It probably wouldn't even be released yet. We might be looking at a release a few months from now in the form of big Pascal at $650 if AMD came out swinging with a fierce big-die card. Then three or six months later we'd get the benefit of the mid-range series at $250-$400. But with Nvidia standing alone, they seem to have an odd desire to price damn near everyone out of the market, LOL. Weird.
 

Timmah!

Golden Member
Jul 24, 2010
1,571
934
136
What makes you think the 1080 Ti, whatever that GPU will actually look like, will come out at the same price as the GTX 1080? Was the GTX 980 at $549 a "bad deal" because the GTX 980 Ti came out at $650 nine months later?

Yes.

And this coming from someone who actually wants to buy 1080.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
How about the P100 they literally demonstrated first? Really? Did you not watch that? I'd say the fact that they're already selling them pretty well confirms that there is a big Pascal chip. Last I checked, 3584 was bigger than 2560 and 16GB was bigger than 8GB.

If you want to play games and act like nvidia isn't going to ever sell a consumer Big Pascal, you're entitled to put your head in whatever sand you want. But back here in reality, GP104 is the MIDDLE chip as has been every xx4 chip before it and as indicated by the CONFIRMED existence of a chip above it. GP106 isn't confirmed yet but since I don't stick my head in the sand I will count on it coming out below GP104. Like every xx6 named chip before it.

But if you want to give nVidia $700 for the 16nm equivalent of a 560 Ti, be my guest. Not my money.

:rolleyes:

You mean a professional product that would cost multiple thousands of dollars? Then you would be complaining that NV didn't sell it for $300.

I guess no one should buy any Intel CPUs because there are 20-core Xeons out there and 4C/8T parts are all low-end?

Please, go ahead and wait for the consumer 20-core part while the rest of us enjoy the available products. Same with Pascal. If it made sense to release a $2500 300W consumer part, I'm sure NV would figure it out... They have a business to run and will probably sell all their viable die parts for a lot more than a GeForce would fetch.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
I want a strong AMD. If there was an AMD announcement to parallel the Nvidia products then 1080 wouldn't be $700. It probably wouldn't even be released yet. We might be looking at a release a few months from now in the form of big pascal @ $650 if AMD came out swinging with a fierce big die card.
Yeah, lack of competition, that's what it is :/

They have a business to run and will probably sell all their viable die parts for a lot more than a GeForce would fetch.
Yeah. Makes sense.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Just curious if you're planning to buy a GTX 1070 and/or 1080. If not, what will you buy instead?

I'll wait for Polaris 10, and if the price/performance is good enough I'll pick one up; otherwise I'll just wait for Vega. I'm still somewhat disgusted with Nvidia's way of doing business, and the way they dropped support for older cards has left a bad taste in my mouth.

Hard to support a company like this.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I just completely ignore the mid-range products as far as making a purchase decision is concerned. I won't play that overpriced mid-range game. I don't buy mid-range cards, so I just wait for a real card. I used to buy mid-range cards, if the performance and price were right. There were some real winners in the past, such as the GeForce 8800GT. That card was about $300 and was nearly as fast as its then-current high-end counterpart. Now that's an exciting mid-range product. Not one that costs MORE than a high-end product, lol.
The good thing about mid-range cards coming out first, though, is that they give us a potential indication of how potent the architecture is, and I can at least try to guesstimate how well the big chip will perform. In this case the clock speeds are pretty damn exciting. I bet the real high-end card will perform blistering fast and be a worthy candidate for an upgrade, possibly as a single-GPU solution if it OCs anything like the 1080 appears to.

It was mid-range only in MSRP, and perhaps in cooling, which was obnoxious on most cards to the point that a lot of them overheated; the rest was pretty much high-end. 112 shaders out of 128 was very good. It's like the 980Ti compared to the Titan X. What mid-range cards perform within 10%-15% of the high end?

I got the high-end card back then, but only because I bought it at launch for less than the price-gouged, undercooled 8800GTs were going for. I mean the 8800GTS 512, which was about as fast as the 8800GTX: sometimes slower, other times faster, but on average pretty much the same. There was the 8800 Ultra, but that card was a rip-off.

I also try to ignore mid-range cards and go straight to the high end, but right now that won't be easy because I burned out my 980Ti and would have to rely on that antiquated Titan. It's a shame it aged so fast. From that purchase I learned one thing: also skip the ultra high end. When I bought it, I thought it would be the only consumer card with GK110 for at least 10 months, and it turned out that the 780, with practically the same performance, was released right after it.
 
Feb 19, 2009
10,457
10
76
Someone at NV HQ decided to put their card in the center of that price range.

For sure, IHVs will have full knowledge of the market.

The 1070, like the 970, is priced to sell, typically 10:1 compared to the 1080 and 980. The 1080's price being jacked up even more is no coincidence; it's beyond reach for most of the market as it approaches the enthusiast pricing category.

Likewise, the reference 1070 at $449 is $1 away from tripping the "nawww, too expensive" circuits in consumers' brains. Within that $350-$450 sweet spot for mid-range, $449 is at the upper end but still somehow deemed "affordable".

Marketing and focus-group folks have got this covered.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
What has me feeling frustrated is the release cadence and the performance increase each year. They make it almost not matter which one you get. You can spend $700 for a 30% increase on a mid-range card, or wait another year and spend the same for 30% more than that on a big-die card. They know it doesn't benefit someone very much to wait for a big-die chip, given these insane prices for even the mid-range, so they bank on you buying both. If I wait for big Pascal, a year later mid-range Volta comes along with 30% more performance for the same price. Suddenly my big Pascal looks like a mid-range card. You see the frustration?
It used to be that you spent top dollar and you got top performance, or close to it, for roughly two years. They literally crapped on my sandwich here with their new release schedule. I will stay on the big-die schedule, won't buy overpriced mid-range cards, and certainly won't give them the satisfaction of buying twice in a generation.

I thought the same, only I wasn't even tempted to buy the mid-range cards. If I still had my 980Ti at 1500MHz, I wouldn't even be considering being ripped off on cards that will perform like crap in a year. How could they not implement async compute? Probably intentionally, because it suits their planned-obsolescence business scheme. After the premature aging of Kepler... I remember when The Witcher 3 came out and I had Titan SLI and I couldn't even enjoy the game. That was $2000 worth of graphics hardware, for crap's sake. I will probably accept a slightly slower card, like choosing a Fury instead of a 980Ti, and forget about NV altogether. I've had enough of their crap. I only wish I hadn't bought a G-SYNC monitor. There are some people afraid that if they stay with Maxwell they won't be able to enjoy games in the near future. I don't get how they can feel that and still buy NV... The Fury X is 97% of 980Ti performance at stock; after OC it is about 20% slower than the 980Ti, but 20% isn't that noticeable, and at least Fury owners aren't afraid to hold off on upgrading. It's preposterous, really. The 1080 still doesn't have async compute; if Maxwell falters in DX12, the 1080 won't be much better either. The 1080 is not future-proof by any stretch of the imagination.
 
Last edited:
Feb 19, 2009
10,457
10
76
I thought the same, only I wasn't even tempted to buy the mid-range cards. If I still had my 980Ti at 1500MHz, I wouldn't even be considering being ripped off on cards that will perform like crap in a year. How could they not implement async compute? Probably intentionally, because it suits their planned-obsolescence business scheme. After the premature aging of Kepler... I remember when The Witcher 3 came out and I had Titan SLI and I couldn't even enjoy the game. That was $2000 worth of graphics hardware, for crap's sake. I will probably accept a slightly slower card, like choosing a Fury instead of a 980Ti, and forget about NV altogether. I've had enough of their crap. I only wish I hadn't bought a G-SYNC monitor.

Expect to see real Async Compute in Volta, with a return of the hardware scheduler that will make it a compute beast.

And I was expecting NV to make more noise about VR since Pascal finally has functional preemption.

The funny thing is that NV's Pascal reviewer guide asks folks to talk about preemption and how great it is for the VR experience... Well, I wonder how many tech sites will call out NV for lying about Maxwell and preemption. They didn't have it, yet still talked about it when Maxwell debuted in 2014. In fact, none of the VR features NV claimed for Maxwell materialized in any VR game.

For Pascal? It's not Simultaneous Multi-Projection; that's useless because it also won't materialize in many VR games (for reasons which are obvious if you apply some basic logic). The biggest thing is that they can finally get Asynchronous Timewarp working. This is already in the Oculus SDK, and it's an automatic built-in feature. At least this is real.

The biggest takeaway is that NV knows how to game the market. Prior to the recent Rift SDK launch, Async Timewarp was NOT a thing outside of developer circles. Reviewers couldn't test it; they couldn't say whether AMD or NV produces lower latency, since it's not a metric that can be easily measured.

Now that it's a functional feature, Pascal brings support. It's going to result in VR reviewers saying things like "Oh! It's so much smoother on the 1080 vs the 980/Ti!" or "Now I can play for much longer without feeling ill!"... These things sell. AMD's advantage from having fine-grained preemption in GCN all these years? Nada. It was way too far ahead of its time, never utilized outside of AMD demos.

Likewise, I feel the same about async compute. This is only the start; as devs get better at it, expect 20-30% performance gains to be routine by 2018. Guess what? Volta will arrive with fully capable hardware async compute, just in time for a market ripe for the picking.
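The Asynchronous Timewarp idea discussed above can be shown with a toy frame-loop model: when the app misses the vsync deadline, the compositor re-projects the last finished frame using the newest head pose instead of repeating a stale image. This is purely an illustrative sketch; the function and threshold here are made up for the example and are not any real SDK's API.

```python
# Toy model of asynchronous timewarp (ATW): per frame, either the app's
# fresh render makes the vsync deadline, or the compositor falls back to
# re-projecting the previous frame with the latest head pose.

def run_frames(render_times_ms, vsync_ms=11.1):
    """Return, per frame, whether a fresh render or a timewarped
    reprojection was displayed. 11.1 ms ~= one frame at 90 Hz."""
    shown = []
    for t in render_times_ms:
        if t <= vsync_ms:
            shown.append("fresh")      # app made the deadline
        else:
            shown.append("timewarp")   # compositor reprojects last frame
    return shown

# Frames 3 and 4 miss the 90 Hz budget, so ATW covers them.
print(run_frames([8.0, 10.5, 14.2, 12.0, 9.3]))
```

The point of the model is the user-facing effect the post describes: a missed frame shows up as a reprojection (slight pose error) rather than a judder-inducing repeat, which is why reviewers perceive it as "smoother".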
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
I won't be supporting a company that sells me a midrange chip at $600/$700. If AMD completely drops the ball and delivers crap in regards to Vega, I'll get something NV secondhand.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I won't be supporting a company that sells me a midrange chip at $600/$700. If AMD completely drops the ball and delivers crap in regards to Vega, I'll get something NV secondhand.

Tahiti was 365mm2 (14% larger than GP104) when it came out over four years ago in January 2012, with 5GB less of significantly slower memory, and with inflation would cost about $575 today. The dollar-per-mm2 value proposition wasn't amazingly better than GP104 at $599. Did you complain then?

Vega 11 is probably going to have a die size in the range of GP104's, maybe slightly larger; let's say 350mm2. Let's say it comes out in December, it's 10% faster than GP104, and AMD prices it at $599. Would you rain down the same criticism, and/or would you consider buying it?
 
Last edited:

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Well, let's see: maybe I wasn't buying hardware back then, so I couldn't have? My join date is 2013, after all. Possible.

Regardless, the last time AMD released a chip at $550, it was 438mm^2, so I don't really get your point.

And I'm not in the market for smaller Vega, but yes, I would criticize it the same way if it turns out like that. However, since Vega isn't out and we know little to nothing about it, I don't know what relevance it has to the discussion.

This is the thing I see from NV/Intel fans all the time, in pretty much any discussion: "AMD would do the same thing in this situation." But the thing is, 90% of the time, they aren't, and they don't. It's a complete red herring.
 
Feb 19, 2009
10,457
10
76
The thing we can be sure about with Vega: it ain't gonna be cheap, not with HBM2. If you want value, that's what Polaris and the 1070 are for.
 

renderstate

Senior member
Apr 23, 2016
237
0
0
For Pascal? It's not the SM-Projection, that's useless because it also won't eventuate into many VR games (for reasons which are obvious if you have some basic logic).

You already decided it's useless without even knowing how it works... wow :) Care to elaborate some more and tell us why it won't be used?
 

renderstate

Senior member
Apr 23, 2016
237
0
0
If Vega's die size is similar to GP104's, it won't use HBM2, unless AMD has made no significant progress on the compression front. HBM2 is cool and everything, but it's a margins killer.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
The thing we can be sure about with Vega, it ain't gonna be cheap, not with HBM2. If you want value, that's what Polaris and the 1070 is for.

That is fine. As long as the performance is kickass, it is OK for it to be expensive. I want it at least 40% faster than the 1080, max OC vs max OC. Heck, it could be $1k if it's 2x faster than the 1080.