G84 and G86 to hit in Q1 07


videopho

Diamond Member
Apr 8, 2005
4,185
29
91
I've changed my mind; even as a step-up, an 8800GTX is out, simply due to its hefty price tag of $400. I'm looking at the alternative: an Xbox 360 with the HD-DVD player, which can be had for less than $450.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: josh6079
I don't want this to be another price war thread where people complain about the current G80 pricing; instead I'd rather we reflect on the note that the G80s may be getting replaced by the G84/86s this early.

Would Nvidia really release a better-performing card than the 8800GTX by Q1 '07? Could they? That would be about 3 or 4 models to compete against the R600 and its immediate SKUs upon its expected arrival.

The G80 won't be going anywhere because of these cards (though I fully expect that around the same time these new mid/low range chips launch G80 itself will undergo a die shrink to reduce manufacturing costs).

G84 and G86 are what will replace the 7600 & 7300 (G73/G74) mid-range lineup. It's no secret that most consumers purchase mainly lower mid-range when it comes to graphics cards (we on forums like AnandTech are the exception for the most part). G80 will continue to be available (though, as I said above, it may well have undergone a die shrink and be known as G81).

The G7x range is what will become unavailable.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
G80 will continue to be available (though as I said above it may well have undergone a die shrink and be known as G81).
But do people think that that would be possible in this time period? I'm not saying that G80 will cease to be available, just that Nvidia is probably looking to replace it as fast as possible with a refresh--one that I'm doubtful will be here in ~3 months.

These mid-range to low-end G80 derivatives will be nice, but how much longer do you think we have until a solid 8800GTX replacement? ~3 months? ~6 months? ~9 months?
 

wanderer27

Platinum Member
Aug 6, 2005
2,173
15
81
Good points Insidious.

But by the time I'm ready (if ever) for Vista this System will probably be getting pretty long in the tooth.

I'm also hoping to skip the C2D (whatever socket it is) and AM2 and pick up the next generation.

I'm not really familiar with Vista other than some bits I've seen on the Web. It sounds like with DX10 (which really isn't out yet?) and all enabled, it's going to be a resource hog.

If RAM weren't so darn expensive these days, that would make a decent dent in the pricing.

Having to get a good 500W-600W PSU is not something I really want to do either, but it really looks like it's going to be needed with these ever more power-hungry cards.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: josh6079
G80 will continue to be available (though as I said above it may well have undergone a die shrink and be known as G81).
But do people think that that would be possible in this time period? I'm not saying that G80 will cease to be available, just that Nvidia is probably looking to replace it as fast as possible with a refresh--one that I'm doubtful will be here in ~3 months.

These mid-range to low-end G80 derivatives will be nice, but how much longer do you think we have until a solid 8800GTX replacement? ~3 months? ~6 months? ~9 months?

I don't know if it will actually happen, but it is quite possible. There is nothing preventing two chips from taping out at the same time (especially when they utilize different process nodes). The conventional wisdom so far has been that TSMC is still refining its 80nm process compared to its 90nm process.

In light of that, Nvidia likely decided to launch with the safer 90nm version, keeping the 80nm version as emergency firepower. If it turns out that the "emergency firepower" isn't really required, then perhaps Nvidia will launch a more heavily modified G81 later on (April/May timeframe). Given Jen-Hsun Huang's love of 40%-plus margins, though, you can't rule out a straight die shrink as soon as possible.

As for how long G80 should stay in the market: a chip normally gets around six to nine months of production (budget chips are an exception), though this isn't a rule written in stone, as NV30 proved.
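The margin point above can be made concrete with a back-of-envelope sketch (the formula and numbers are my own illustration, not from the thread): an ideal optical shrink scales die area with the square of the node ratio, so a 90nm-to-80nm shrink fits noticeably more candidate dies on the same wafer.

```python
# Back-of-envelope only: assumes an ideal optical shrink, ignoring
# yield, pad-limited area, and other real-world effects.
def area_scale(old_nm: float, new_nm: float) -> float:
    """Ideal die-area scaling factor when moving between process nodes."""
    return (new_nm / old_nm) ** 2

scale = area_scale(90, 80)
print(f"relative die area after 90nm -> 80nm: {scale:.2f}")   # ~0.79
print(f"extra candidate dies per wafer: {1 / scale - 1:.0%}")  # ~27%
```

A rough 20%-plus increase in dies per wafer is exactly the kind of cost lever that makes a straight shrink attractive for margins.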
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Dethfrumbelo
I think the economy overall is in decline. The housing market is down significantly from several years ago and there's less credit available.

Bringing that type of reality into our world of fantasy games is unacceptable. :beer:

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
the cancellation of the R560 core from ATi...

Since when was the RV560 cancelled? Link?

Link

As mentioned, the X1650 XT is built on a new graphics core from ATI codenamed RV560. It's an 80nm part and measures about 16.7mm by 15mm. It is, in fact, the exact same size as the X1950 Pro's RV570 core and, as far as logic goes, would suggest that the RV560 has the same 330 million transistors as the RV570 does (Oct. 30 update: just got confirmation that this is, indeed, the case. The RV560 is essentially the same chip as the RV570 but with lower specs, of course). It has 24 pixel shader units with 8 ROPs, thus following ATI's traditional 3:1 pixel shader to ROP ratio, with 8 vertex shaders and a 128-bit bus attached to 256MB of GDDR3. The reference X1650 XT we are testing today comes with a stock core clock frequency of 575MHz and a memory frequency of 675MHz (or 1.35GHz effective).

Something went boo-boo and the original RV560 design got scrapped altogether. The current RV560 is nothing but an RV570 with one third of it switched off. It could be possible to unlock, who knows.
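The quoted memory specs can be sanity-checked with a quick calculation (the bus width and effective clock are from the quote above; the resulting bandwidth figure is my own arithmetic):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# X1650 XT per the quote: 128-bit bus, 675MHz GDDR3 (1.35GHz effective).
print(f"{mem_bandwidth_gbs(128, 1350):.1f} GB/s")  # 21.6 GB/s
```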
 

JBT

Lifer
Nov 28, 2001
12,095
1
81
$650 for a video card is just too much. You can buy an entire console for less. Not that I'm into consoles, but I'd be more inclined right now to buy a brand new PS3 rather than a new video card. The console will last me 3-4 years; the video card MIGHT last me 1 year.
Also, my last video card purchase was $430 including the price of a 3rd-party cooler, and personally the increase in performance just isn't there for me, as the X1900XT can play CS:S and BF2 just fine.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Dethfrumbelo
I think the economy overall is in decline. The housing market is down significantly from several years ago and there's less credit available.

Either that, or everyone decided to blow their $3000 wads on PS3s.

Actually, the economy is currently in a state of expansion, expanding at an annual rate of 3%.

Sorry, just got outta econ class and couldn't help myself.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: josh6079
G80 will continue to be available (though as I said above it may well have undergone a die shrink and be known as G81).
But do people think that that would be possible in this time period? I'm not saying that G80 will cease to be available, just that Nvidia is probably looking to replace it as fast as possible with a refresh--one that I'm doubtful will be here in ~3 months.

These mid-range to low-end G80 derivatives will be nice, but how much longer do you think we have until a solid 8800GTX replacement? ~3 months? ~6 months? ~9 months?

That depends on how far Nv is able to shrink it. If it only switches from 90nm to 80nm, then I don't expect any significant changes in specs; it'll be something like the X1900GT -> X1950 Pro transition, but cheaper to manufacture. If they shrink it to 65nm, then we're probably looking at a nice clock speed boost.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Seriously, what were they expecting :( huge sales when there are no games to take real advantage of these GPUs? Anyway, R600 is supposed to whip G80's ass, so they are just preparing to whip R600's ass with G81 :? ...

AMD has also gotten angry that ATI hasn't released anything yet and doesn't want any more delays, or management will start getting blamed and some axing will be involved. Demand for the GPU will go up when Alan Wake, Bioshock, UT2007, Quake Wars, and the other next wave of DX9c/DX10 games start to come out. Also, Nvidia needs to release its DX10-compatible driver, since Microsoft is almost ready to release DX10.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: tuteja1986
Seriously, what were they expecting :( huge sales when there are no games to take real advantage of these GPUs? Anyway, R600 is supposed to whip G80's ass, so they are just preparing to whip R600's ass with G81 :? ...

AMD has also gotten angry that ATI hasn't released anything yet and doesn't want any more delays, or management will start getting blamed and some axing will be involved. Demand for the GPU will go up when Alan Wake, Bioshock, UT2007, Quake Wars, and the other next wave of DX9c/DX10 games start to come out. Also, Nvidia needs to release its DX10-compatible driver, since Microsoft is almost ready to release DX10.

They will. The thing is that nVIDIA hasn't delivered the performance drivers yet, not to mention some of the G80 shader capabilities, such as the MUL, are not even activated (as they are "missing").

GF3 vs 8500 anyone? ;)
 

Golgatha

Lifer
Jul 18, 2003
12,650
1,512
126
Originally posted by: sodcha0s
Well, let's see... most of us on these forums have either a 7900 or an X1900 card, which cost on average in the neighborhood of $400. These cards are still plenty powerful enough for today's games, and while having an 8800 card now would be great, it's kind of hard to justify spending another $450-$650 on another video card in less than a year. The others that are still on X800 or 6-series cards may be waiting for R600 to see what it offers, or figuring prices will drop on 8800 cards when it's released.

These things are great cards and will sell, but I think the price will have to drop quite a bit first, especially on the GTX. As for me, I'm gonna have to stick with my X1900XTX until I just can't play any new games at all... I spent way too much money on it, and I'm not going to spend that kind of money on a single video card just to play games again.


Caught the Dell deal today for a GTS at ~$400. I'm going to sell the X1900XTX while it still has some value. I figure ~$125 out of pocket isn't bad to get my game on at the maximum settings available and to have DX10 compliant hardware.
 

Greenman

Lifer
Oct 15, 1999
20,358
5,112
136
Did Nvidia really not know that most people won't spend 6 bills on a video card? Think about it: someone walks into BB, the salesman says you can have this cool video card for $600, or you can get 2 computers or a pretty good laptop for the same price. It's a no-brainer for most people.
With all that said, I'm going to get one, when the price hits $200.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Greenman
Did Nvidia really not know that most people won't spend 6 bills on a video card? Think about it: someone walks into BB, the salesman says you can have this cool video card for $600, or you can get 2 computers or a pretty good laptop for the same price. It's a no-brainer for most people.
With all that said, I'm going to get one, when the price hits $200.

200 bucks will get you a severely used GTS near the end of its life. So enjoy.

 

VERTIGGO

Senior member
Apr 29, 2005
826
0
76
Originally posted by: keysplayr2003
It all can be blamed on Vista, the only OS one can run DX10 on. So not only do people have to buy a G80/R600, but they also have to shell out for a minimum of a good-quality 400W PSU, and THEN spend hundreds more on Vista to utilize DX10, when there are no freaking DX10 games out, or even near out, to justify all the costs of getting ready for when they do arrive. I think everyone (well, most everyone) is taking the sit-back-and-relax approach, waiting for all the prices to come down and for the first Service Pack for Vista, because in most of our minds, XP and DX9 hardware are cutting it just fine right about now.

EDIT: Oh yeah, let's not forget the recommended 2GB of memory to comfortably run Vista. Yet more of an expense.

If you aren't using 2 gigs of RAM yet, you probably aren't someone who needs Vista anyway.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
DailyTech says the cards are going to be entry-level and mid-range video cards

http://www.dailytech.com/article.aspx?newsid=5011
Digitimes reports NVIDIA is expected to unveil G80-derived graphics processors in Q1 2007. The reported G80 derivatives are currently known as G84 and G86. Not much detail is available on the upcoming G84 and G86 at the moment, except that they will be entry-level and mid-range offerings. In the NVIDIA chain of products, it would appear G84 and G86 will replace the current GeForce 7600 and 7300-series products.

I paid $650 for my eVGA GeForce 8800GTX, and ya know what, given the same choice to make, I'd do it again in a heartbeat. I'm extremely happy with the card and the ability to throw anything at it without reaching an unplayable level. Also, everything looks b.e.a.-utiful with forced 16x QAA and 16x AF :D.

EDIT: Made the link clickable.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: Greenman
Did Nvidia really not know that most people won't spend 6 bills on a video card? Think about it: someone walks into BB, the salesman says you can have this cool video card for $600, or you can get 2 computers or a pretty good laptop for the same price. It's a no-brainer for most people.
With all that said, I'm going to get one, when the price hits $200.

200 bucks will get you a severely used GTS near the end of it's life. So enjoy.

very true ... for nvidia GPUs

otoh, you WILL get a high end R600 for $200 ... after a year or so . ;)
[8500-128MB/9800xt/x850xt ... x1900xt all selling for ~$200 within a year or so ...
... x2900xt, doubt anything will change quickly]

maybe that's why i always 'end up' with ATi cards. :p
:Q
[i'm cheap but still like good performance]
 

enz660hp

Senior member
Jun 19, 2006
242
0
0
Originally posted by: schneiderguy
Originally posted by: Ibiza

Anyone else think the PC games scene is getting boring just lately?

a lot of the "big" games that were supposed to release this year got delayed :( UT2k7, Crysis, Alan Wake, Spore, HL2 Ep2, etc. all come out next year :)

yeah, major bummer for me and other PC gamers....

To do any gaming on Vista, 2 gigs is recommended, but for a person who doesn't game, 1 gig is just fine. I ran Vista Beta 2 on my A64 with 1 gig and it wasn't slow at all.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Next gen console demand has probably been a big factor in the G80 selling so poorly.

Personally I'm trying to get a Wii to play Zelda.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I don't understand why everyone is whining about the price tag of G80. Nothing has really changed.

A lot of people here on this board bought an X1900XTX for $650 when it launched and it only offered half the performance improvement that G80 is offering.

Although I do agree with jiffy that consoles are severely cutting into G80 sales. Very poor timing on Nvidia's part, releasing G80 the week before the PS3 and Wii. Who has $650 to spend on a video card when they just dropped $600 on a PS3 (or plan to)?

Bottom line is that when people stop paying $650 for a video card, Nvidia and ATI will stop charging $650 or face taking losses. Since there will always be early adopters, we're just going to have to live with it.
 
Dec 27, 2001
11,272
1
0
Well, I needed a video card since I'm currently without one. But I never pay full price for anything, and when yesterday's Dell deal came along, I jumped on it; after cash back and CC rewards I'll end up having paid less than $380 for an 8800GTS that will OC to GTX levels.
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
I'd love to get a GTS, but I just spent $500 on an X1900XT not too long ago... I need recovery time. Though if Vista and DX10 games had arrived, then I might jump. There's really not a lot of reason for me to get a G80 right now.
 
Jun 14, 2003
10,442
0
0
Well, I'm not surprised; the amount of dosh you've got to spend on a good gaming setup is just daft now, and a year after you've bought it, another wedge of cash will be shovelled into the boiler to keep it going. £400+ on a GPU? No thanks, I don't care if it wipes my ass or cooks me egg sarnies in the morning.

A PS3 or 360 and a decent HDTV/LCD might be a similar outlay initially (£300 for a 360 + £500 for a Samsung 26-inch HDTV, or £370 for a Samsung 21-inch LCD)... but it probably won't need upgrading for the next 4-5 years. (Yeah, there are HD-DVD add-ons etc., but they won't stop you playing games if you don't buy them.) That's my reasoning for chucking PC gaming and becoming a console gamer. So far I've had a zillion times more fun on my 360 than I have with the PC.

A decent C2D Dell will do you fine for computing needs, and if you get a good deal you may even get a good 20-inch LCD in with it.
 
Jun 14, 2003
10,442
0
0
Originally posted by: wanderer27
You know, I've been sitting here reading this, and I've come to the conclusion that Vista is going to be a really expensive upgrade:

2-4GB RAM - $300-$500

CPU - $$$

MB - $$$

PSU - $100-$200

Grapics Card (DX10) - $400-$600

Vista - $100-$300


Not counting CPU & MB, a lot of us are looking at least at $900-$1600+


These are really rough numbers, but right now I'm looking at close to $2000 just to upgrade from my current platform to the next gen hardware/software.

That's a bit steep, especially when my current System is doing just fine.

This one's going to wait for a bit . . . .

A 360 and a Mac is my move.

Screw waiting for Vista and then upgrading for it... I'll take a good OS that's already got a lot of the features Vista brings, plus a hardware/software combo that should go longer than an MS-based system before it needs an upgrade.
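For reference, the component ranges in wanderer27's quoted tally really do add up to the stated $900-$1600 (excluding CPU and motherboard, as the quote does); a quick check, my own arithmetic:

```python
# Component price ranges are taken from the quoted post; CPU and
# motherboard are excluded, matching the original tally.
costs = {
    "RAM (2-4GB)": (300, 500),
    "PSU": (100, 200),
    "DX10 graphics card": (400, 600),
    "Vista": (100, 300),
}
low = sum(lo for lo, _ in costs.values())
high = sum(hi for _, hi in costs.values())
print(f"${low}-${high}")  # $900-$1600
```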