NVIDIA to Launch GeForce 8600 Series on April 17th


Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The reduced memory interface is already balanced out by a high memory clock. You don't need such a big memory interface if you can match it with a high memory clock. Samsung's latest GDDR4 really allows you to do this, especially since games aren't severely bandwidth limited unless you're thinking of 2560x1600 with AA/AF.

A 7600GT has 22.4GB/s of bandwidth. Now if the rumoured 8600GTS has a memory clock of 2000MHz, that's 32GB/s. But likewise, as coldpower said, the performance increase from a wider memory interface doesn't outweigh the overall increase in the cost of the midrange GPU, the die size of the midrange GPU, and the much more complex PCB required to accommodate the memory and its interface. Not to mention the power consumption increases. What matters is the efficiency of the architecture. Just take a look at the 7600GT (128-bit) vs the 6800 Ultra (256-bit). Or better, the X850XT PE (256-bit) vs the X1650XT (128-bit).

Anyway, why do you think the RV560, a.k.a. the X1650XT, with 21GB/s of bandwidth outperforms the 7600GT, which has 22.4GB/s, at higher res with AA/AF?
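For reference, peak memory bandwidth is just bus width times effective memory clock. A quick sketch of the numbers above (the 8600GTS clock is the rumoured figure, not a confirmed spec):

```python
def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    # bytes per transfer (bus width / 8) times transfers per second, in GB/s
    return bus_width_bits / 8 * effective_clock_mhz / 1000

# 7600GT: 128-bit bus, 1400MHz effective GDDR3
print(round(bandwidth_gbps(128, 1400), 1))  # 22.4
# rumoured 8600GTS: 128-bit bus, 2000MHz effective GDDR4
print(round(bandwidth_gbps(128, 2000), 1))  # 32.0
```

Which is why a 128-bit card on fast GDDR4 can keep pace with an older 256-bit card on slower memory.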

 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: Cookie Monster
The reduced memory interface is already balanced out by a high memory clock. You don't need such a big memory interface if you can match it with a high memory clock. Samsung's latest GDDR4 really allows you to do this, especially since games aren't severely bandwidth limited unless you're thinking of 2560x1600 with AA/AF.

A 7600GT has 22.4GB/s of bandwidth. Now if the rumoured 8600GTS has a memory clock of 2000MHz, that's 32GB/s. But likewise, as coldpower said, the performance increase from a wider memory interface doesn't outweigh the overall increase in the cost of the midrange GPU, the die size of the midrange GPU, and the much more complex PCB required to accommodate the memory and its interface. Not to mention the power consumption increases. What matters is the efficiency of the architecture. Just take a look at the 7600GT (128-bit) vs the 6800 Ultra (256-bit). Or better, the X850XT PE (256-bit) vs the X1650XT (128-bit).

Anyway, why do you think the RV560, a.k.a. the X1650XT, with 21GB/s of bandwidth outperforms the 7600GT, which has 22.4GB/s, at higher res with AA/AF?

The obvious answer is ATI > NVIDIA.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: ahmurat
Have you tried playing the "high bitrate" 720p QuickTime videos, as in from apple.com/trailers/hd?
My system can't play those; however, I could play a DivX 1280x720 no problem.

Most likely your CPU, not your GPU.

Oh, and as for these... I may pick up an 8600GT to replace my old 6800GT in my other PC... look, a nifty swap in the numbers :D.
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Originally posted by: ahmurat

Have you tried playing the "high bitrate" 720p QuickTime videos, as in from apple.com/trailers/hd?
My system can't play those; however, I could play a DivX 1280x720 no problem.

Those aren't "high bitrate" unless the quotes are intended to designate irony. Indeed, even the 1080p clips are half the bitrate of broadcast, which in turn is half of what's possible with current disc formats.


Originally posted by: Mem
So the 8600 Ultra is now the 8600GTS?

The 8600 Ultra never was. That info was erroneous, along with the claimed 256-bit memory bus, which should have been obvious given past naming conventions. Here are the models listed in the latest driver:

8800 GTX
8800 GTS
8800 Ultra
8600 GTS
8600 GT
8500 GT
8400 GS
8300 GS
 
Dec 21, 2006
169
0
0
The 8600 Ultra never was. That info was erroneous, along with the claimed 256-bit memory bus, which should have been obvious given past naming conventions. Here are the models listed in the latest driver:

Strange... I seem to remember an article on AT or DT about NVIDIA resurrecting the GTS and Ultra naming conventions... GTS is a deviation in itself. However, I could just be wrong on that one.
I am quite interested in whether we'll see some third variant of the 8800, as in previous generations (i.e. 8800 GT).
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: shadowofthesun
The 8600 Ultra never was. That info was erroneous, along with the claimed 256-bit memory bus, which should have been obvious given past naming conventions. Here are the models listed in the latest driver:

Strange... I seem to remember an article on AT or DT about NVIDIA resurrecting the GTS and Ultra naming conventions... GTS is a deviation in itself. However, I could just be wrong on that one.
I am quite interested in whether we'll see some third variant of the 8800, as in previous generations (i.e. 8800 GT).

Not to mention that Ultra was a midrange name as well as a high-end name in the FX series. It is possible, just unlikely to happen.

I'd have liked to have seen 192-bit at least, though.
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
The difference being that the Ultra suffix designated higher clocks, not a different memory bus on the same numerical model. Interestingly, there is no 8600 GS listed to match the rumoured AGP part, but hopefully that will turn out true (and indeed a GT too).
 
Jun 14, 2003
10,442
0
0
Originally posted by: DeathReborn
http://www.vr-zone.com/?i=4723

NVIDIA is set to launch the mainstream 8600GTS (G84-400) and 8600GT (G84-300), as well as the 8500GT (G86-300), on the 17th of April. The GeForce 8600GTS and 8600GT will have 256MB of GDDR3 memory onboard and sport a 128-bit memory interface, but no HDMI yet. The GeForce 8600GTS is meant to replace the 7950GT and 7900GS, while the 8600GT replaces the 7600GT and the 8500GT replaces the 7600GS.

The 8600GTS will be clocked at 700MHz core / 2GHz memory and comes with dual D-DVI, HDTV and HDCP, but requires external power. It will be priced between US$199-$249. Another mainstream model, the 8600GT, will be clocked at 600MHz core / 1.4GHz memory and has 2 variants: one with HDCP (G84-300) and the other without (G84-305). This model doesn't require external power. The 8600GT will be priced between US$149-$169.

The last model, meant for the budget segment, is actually a G84 core downgraded to meet the value segment pricing structure. The 8500GT will be clocked at 450MHz core / 800MHz memory (256MB DDR2) and comes in 2 variants: one with HDCP (G86-300) and the other without (G86-305). It will be priced between US$79 and US$99. Towards the end of April, we can expect NVIDIA to release the GeForce 8300GS for the budget market, replacing the 7300 series.

The NVIDIA 80nm G84 and G86 line-up will meet head on with ATI's DX10 65nm offerings, with the mainstream RV630 slated to arrive in May and the value RV610 earlier, in April.

700/2000 looks like a pretty nice spec for midrange, even if it is 128-bit.

That's honestly a disappointment. With the move to 320- and 384-bit memory buses (512-bit for ATI when it turns up), I was sincerely hoping the midrange would get a boost... even if it was just to 192-bit or something; 256-bit would have been nice too, and still comes in under the 8800s. But it seems like they've decided not to move on at all, which is a poor game IMO.
256MB of memory is less weak, but again I would have liked to see more.

2GHz memory, you say? So what. It's nothing special; it's still gonna be a relatively bandwidth-starved card.

I'll wait for the reviews, but for now: poor show NVIDIA, poor show. I bet the 8800GTS 320MB will only be a couple of £ more on release too, so the 8600GTS won't even be worth it.

 
Jun 14, 2003
10,442
0
0
I figure the 8600GTS will have 32GB/s of memory bandwidth.

And may I just point out:

I'm not troubled by the numbers so much as the lack of progress. My 9500 Pro has a 128-bit memory interface and that's ancient. Previous cards had 128-bit memory buses too.

That's a long time with the same bus size; are they even trying to progress or what? It must be dirt cheap to make a 128-bit bus by now, it's that fricken old.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: Auric
The difference being that the Ultra suffix designated higher clocks, not a different memory bus on the same numerical model. Interestingly, there is no 8600 GS listed to match the rumoured AGP part, but hopefully that will turn out true (and indeed a GT too).

NVIDIA doesn't have any plans for DX10 AGP cards at all. It was a rumour that got shot down.


Originally posted by: otispunkmeyer
I figure the 8600GTS will have 32GB/s of memory bandwidth.

And may I just point out:

I'm not troubled by the numbers so much as the lack of progress. My 9500 Pro has a 128-bit memory interface and that's ancient. Previous cards had 128-bit memory buses too.

That's a long time with the same bus size; are they even trying to progress or what? It must be dirt cheap to make a 128-bit bus by now, it's that fricken old.

Then you must be even more peeved with ATI's decision to put the X2600 on a 128-bit bus too. It's a worse drop coming down from a 512-bit flagship than from a 384-bit one.

As for the 256MB you mentioned above, they are leaving it up to EVGA, XFX etc. to decide if they want to put 512MB on the cards. I myself won't be purchasing a card with less than 512MB at any point in the future.
 

R3MF

Senior member
Oct 19, 2004
656
0
0
I think it is a screwup.

The mid-range chips should have a 256-bit bus, with the intention of providing over 50GB/s of bandwidth (at 1600MHz memory speed).

That is not unreasonable in an era where a high-end card will have over 128GB/s of bandwidth.

32GB/s sucks!
 

XMan

Lifer
Oct 9, 1999
12,513
49
91
Will this release drop the prices of the high-end cards any, do you think?

I'm eyeing a 640MB GTS, but I'm hesitant to jump now if I can save $50 by waiting a few weeks.
 

Conky

Lifer
May 9, 2001
10,709
0
0
Originally posted by: XMan
Will this release drop the prices of the high-end cards any, do you think?

I'm eyeing a 640MB GTS, but I'm hesitant to jump now if I can save $50 by waiting a few weeks.
Wait, the prices will definitely fall. They ALWAYS do.

I have a lowly 7900GS and I am hanging onto it until I actually need to upgrade. I was tempted to get the 8800GTS 320MB now that it's well under $300, but then I realized that there is nothing my 7900GS can't do for me right now, and that I should probably upgrade my 1280x1024 LCD monitor first.

 

rmed64

Senior member
Feb 4, 2005
237
0
0
Originally posted by: otispunkmeyer
Originally posted by: DeathReborn
http://www.vr-zone.com/?i=4723

NVIDIA is set to launch the mainstream 8600GTS (G84-400) and 8600GT (G84-300), as well as the 8500GT (G86-300), on the 17th of April. The GeForce 8600GTS and 8600GT will have 256MB of GDDR3 memory onboard and sport a 128-bit memory interface, but no HDMI yet. The GeForce 8600GTS is meant to replace the 7950GT and 7900GS, while the 8600GT replaces the 7600GT and the 8500GT replaces the 7600GS.

The 8600GTS will be clocked at 700MHz core / 2GHz memory and comes with dual D-DVI, HDTV and HDCP, but requires external power. It will be priced between US$199-$249. Another mainstream model, the 8600GT, will be clocked at 600MHz core / 1.4GHz memory and has 2 variants: one with HDCP (G84-300) and the other without (G84-305). This model doesn't require external power. The 8600GT will be priced between US$149-$169.

The last model, meant for the budget segment, is actually a G84 core downgraded to meet the value segment pricing structure. The 8500GT will be clocked at 450MHz core / 800MHz memory (256MB DDR2) and comes in 2 variants: one with HDCP (G86-300) and the other without (G86-305). It will be priced between US$79 and US$99. Towards the end of April, we can expect NVIDIA to release the GeForce 8300GS for the budget market, replacing the 7300 series.

The NVIDIA 80nm G84 and G86 line-up will meet head on with ATI's DX10 65nm offerings, with the mainstream RV630 slated to arrive in May and the value RV610 earlier, in April.

700/2000 looks like a pretty nice spec for midrange, even if it is 128-bit.

That's honestly a disappointment. With the move to 320- and 384-bit memory buses (512-bit for ATI when it turns up), I was sincerely hoping the midrange would get a boost... even if it was just to 192-bit or something; 256-bit would have been nice too, and still comes in under the 8800s. But it seems like they've decided not to move on at all, which is a poor game IMO.
256MB of memory is less weak, but again I would have liked to see more.

2GHz memory, you say? So what. It's nothing special; it's still gonna be a relatively bandwidth-starved card.

I'll wait for the reviews, but for now: poor show NVIDIA, poor show. I bet the 8800GTS 320MB will only be a couple of £ more on release too, so the 8600GTS won't even be worth it.

You have to remember, these cards aren't for the "elite" resolution crowd that plays above 1280x1024.

64 shaders at 700 core / 2000 memory is pretty good for a midrange 8 series... but I do agree it is time to move on from 128-bit.

 
Jun 14, 2003
10,442
0
0
Originally posted by: rmed64
Originally posted by: otispunkmeyer
Originally posted by: DeathReborn
http://www.vr-zone.com/?i=4723

NVIDIA is set to launch the mainstream 8600GTS (G84-400) and 8600GT (G84-300), as well as the 8500GT (G86-300), on the 17th of April. The GeForce 8600GTS and 8600GT will have 256MB of GDDR3 memory onboard and sport a 128-bit memory interface, but no HDMI yet. The GeForce 8600GTS is meant to replace the 7950GT and 7900GS, while the 8600GT replaces the 7600GT and the 8500GT replaces the 7600GS.

The 8600GTS will be clocked at 700MHz core / 2GHz memory and comes with dual D-DVI, HDTV and HDCP, but requires external power. It will be priced between US$199-$249. Another mainstream model, the 8600GT, will be clocked at 600MHz core / 1.4GHz memory and has 2 variants: one with HDCP (G84-300) and the other without (G84-305). This model doesn't require external power. The 8600GT will be priced between US$149-$169.

The last model, meant for the budget segment, is actually a G84 core downgraded to meet the value segment pricing structure. The 8500GT will be clocked at 450MHz core / 800MHz memory (256MB DDR2) and comes in 2 variants: one with HDCP (G86-300) and the other without (G86-305). It will be priced between US$79 and US$99. Towards the end of April, we can expect NVIDIA to release the GeForce 8300GS for the budget market, replacing the 7300 series.

The NVIDIA 80nm G84 and G86 line-up will meet head on with ATI's DX10 65nm offerings, with the mainstream RV630 slated to arrive in May and the value RV610 earlier, in April.

700/2000 looks like a pretty nice spec for midrange, even if it is 128-bit.

That's honestly a disappointment. With the move to 320- and 384-bit memory buses (512-bit for ATI when it turns up), I was sincerely hoping the midrange would get a boost... even if it was just to 192-bit or something; 256-bit would have been nice too, and still comes in under the 8800s. But it seems like they've decided not to move on at all, which is a poor game IMO.
256MB of memory is less weak, but again I would have liked to see more.

2GHz memory, you say? So what. It's nothing special; it's still gonna be a relatively bandwidth-starved card.

I'll wait for the reviews, but for now: poor show NVIDIA, poor show. I bet the 8800GTS 320MB will only be a couple of £ more on release too, so the 8600GTS won't even be worth it.

You have to remember, these cards aren't for the "elite" resolution crowd that plays above 1280x1024.

64 shaders at 700 core / 2000 memory is pretty good for a midrange 8 series... but I do agree it is time to move on from 128-bit.

If an 8800GTS 640 can work in that Shuttle with the 400W PSU then I won't care, but if it doesn't, I'll need this, and it'll need to play 1680x1050 at good settings.
 

XMan

Lifer
Oct 9, 1999
12,513
49
91
Originally posted by: Conky
Originally posted by: XMan
Will this release drop the prices of the high-end cards any, you think?

I'm eyeing a 640MB GTS but I'm hesitant to jump now if I can save $50 bucks by waiting a few weeks.
Wait, the prices will definitely fall. They ALWAYS do.

I have a lowly 7900GS and I am hanging onto that until I actually need to upgrade. I was tempted to get the 8800GTS 320mb now that it's well under $300 but then I realized that there is nothing my 7900GS can't do right now for me and that I should probably upgrade my 1280x1024 LCD monitor first.

I know, I know. Even a 7900GS would be an upgrade for me at the moment - I've got a vanilla 6600 256MB. Although it's not too shabby, really. I just can't play most games at native res (1600x1200).
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: otispunkmeyer
I figure the 8600GTS will have 32GB/s of memory bandwidth.

And may I just point out:

I'm not troubled by the numbers so much as the lack of progress. My 9500 Pro has a 128-bit memory interface and that's ancient. Previous cards had 128-bit memory buses too.

That's a long time with the same bus size; are they even trying to progress or what? It must be dirt cheap to make a 128-bit bus by now, it's that fricken old.

There isn't much they can do, as a 256-bit interface has minimum die size requirements, unless you want to really up the complexity of the PCB, which already happens on 256-bit interface cards anyway.

There haven't been any significant advancements that have made production of a 128-bit bus much cheaper. Unless it somehow became easy to route 4 memory paths of 64-bit each on a relatively simple PCB, the only viable cheap way to increase bandwidth on the midrange is to increase the memory clock; memory does get faster over time, and cheaper as well at each bin.
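The tradeoff described above can be sketched quickly: at the same peak bandwidth, a faster clock on a narrow bus substitutes for a wider bus, without the PCB and die-size cost. Illustrative numbers only, not specs of any shipping card:

```python
def peak_bw_gbps(bus_bits, eff_clock_mhz):
    # bytes per transfer (bus width / 8) times transfers per second, in GB/s
    return bus_bits / 8 * eff_clock_mhz / 1000

# 256-bit bus at 1000MHz effective: complex PCB, big die perimeter for memory pads
print(peak_bw_gbps(256, 1000))  # 32.0
# 128-bit bus at 2000MHz effective: same peak bandwidth on a simple PCB
print(peak_bw_gbps(128, 2000))  # 32.0
```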

 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: R3MF
I think it is a screwup.

The mid-range chips should have a 256-bit bus, with the intention of providing over 50GB/s of bandwidth (at 1600MHz memory speed).

That is not unreasonable in an era where a high-end card will have over 128GB/s of bandwidth.

32GB/s sucks!

Why? Memory bandwidth doesn't help very much once a certain threshold is reached, and ever higher amounts are only needed at the upper resolutions when you apply HDR + AA. For the mainstream, increases in shader power, which help across the board, are far more important.

Considering their 384-bit memory interface, NVIDIA still has a ways to go to break 128GB/s of bandwidth, given they are at 86.4GB/s now.

32GB/s will probably be a little less than 1/3 of what the flagship has at that point in time for NVIDIA, but the performance of this card will likely be closer to the 45% range, as long as you stick to the resolutions this card was designed for, i.e. 12x10 and 10x7.
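That fraction checks out against the shipping 8800GTX (the 8600GTS figures are still only the rumoured ones; against a faster refresh flagship the ratio would fall below a third):

```python
# 8800GTX: 384-bit bus, 900MHz GDDR3 (1800MHz effective data rate)
gtx_bw = 384 / 8 * 1800 / 1000
# rumoured 8600GTS: 128-bit bus, 2000MHz effective
mid_bw = 128 / 8 * 2000 / 1000

print(gtx_bw)                     # 86.4 GB/s
print(round(mid_bw / gtx_bw, 2))  # 0.37 -- a bit over a third of today's flagship
```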
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: rmed64
You have to remember, these cards arent for the "elite" resolution crowd that plays at more than 1280x1024 resolutions

64 shaders, 700 core/2000 memory is pretty good for midrange 8 series....but I do agree it is time to move on from 128-bit

I am sure NVIDIA and ATI would like to, but there hasn't been any significant advancement to make manufacturing a 256-bit PCB practical at the mainstream level, along with the associated ~200mm2 die size requirement. The only reason we're getting wider memory interfaces now is that the die sizes of the G80 and R600 can support them with ease. G80 could probably have supported 512-bit if NVIDIA wanted; they chose not to, both to leave it for a possible refresh and to save costs. NVIDIA and ATI can also just jack the high-end price upwards if they need to pass the cost on to someone. At the mainstream US$199 price point they have no such room; it simply eats into their profits, with no wiggle room to spare.

Even ATI isn't going to have 512-bit across the entire high-end X2K line; more like the top 2 or 3 models only, then back to 256-bit.

Look at older comparisons where one card had a 128-bit interface and the competing card a 256-bit one, like the 6600 GT vs the X800 GT, or the 7600 GT vs the X1800 GTO: lots of bandwidth is worthless unless shader power is up to snuff.

 

hans007

Lifer
Feb 1, 2000
20,212
18
81
I am hoping that one of these, with say 64 shaders, will be there for me. I am currently running a 6200TC! A 6200TC, and barely playing World of Warcraft. I'm not much of a gamer; hell, if the 8300GT is cheap I could even just get that, and I figure it'd be a giant leap in performance.
 

R3MF

Senior member
Oct 19, 2004
656
0
0
Originally posted by: coldpower27
Originally posted by: R3MF
I think it is a screwup.

The mid-range chips should have a 256-bit bus, with the intention of providing over 50GB/s of bandwidth (at 1600MHz memory speed).

That is not unreasonable in an era where a high-end card will have over 128GB/s of bandwidth.

32GB/s sucks!

Why? Memory bandwidth doesn't help very much once a certain threshold is reached, and ever higher amounts are only needed at the upper resolutions when you apply HDR + AA. For the mainstream, increases in shader power, which help across the board, are far more important.

Considering their 384-bit memory interface, NVIDIA still has a ways to go to break 128GB/s of bandwidth, given they are at 86.4GB/s now.

32GB/s will probably be a little less than 1/3 of what the flagship has at that point in time for NVIDIA, but the performance of this card will likely be closer to the 45% range, as long as you stick to the resolutions this card was designed for, i.e. 12x10 and 10x7.

A brand new midrange card from NV usually retails at £160 vs about £320 for a GTX version.

Given that the new top-end cards are coming out with as much as 128GB/s+ of bandwidth, I expect more than one quarter of the bandwidth for half the money.

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
The GTX only has 86.4GB/s. The GTS has 64GB/s.

People don't seem to understand that bandwidth isn't a major factor in performance, especially when the card (the 8600 series) is aimed at consumers/gamers who play at 10x7 or 12x10 res.

NVIDIA won't disappoint, because for the past two generations they have released very good midrange cards, namely the 6600GT and the 7600GT. I'm willing to bet the 8600GT is as fast as the X1950 Pro/7900GT, with more features and better IQ. Or else I'll eat a shoe. :D
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: Cookie Monster
The GTX only has 86.4GB/s. The GTS has 64GB/s.

People don't seem to understand that bandwidth isn't a major factor in performance, especially when the card (the 8600 series) is aimed at consumers/gamers who play at 10x7 or 12x10 res.

NVIDIA won't disappoint, because for the past two generations they have released very good midrange cards, namely the 6600GT and the 7600GT. I'm willing to bet the 8600GT is as fast as the X1950 Pro/7900GT, with more features and better IQ. Or else I'll eat a shoe. :D

Exactly..
Even on the high end you have to wonder (and I have stated this numerous times in the past) how this excess memory bandwidth will be used in R600, taking two major considerations into account: 24 ROPs vs 16, and especially knowing that R600 will feature the same multisampling AA sample patterns. Makes you wonder what they have in mind..