ConstipatedVigilante
64 SPs seems a bit low for a high-end part (I mean, half the GTX?) unless the 65nm clocks are quite high indeed. I'd guess something more in the 70s or 80s. Hopefully, at least.
Originally posted by: Shaq
Originally posted by: munky
The new GTS is still the same 90nm G80 core, only this time with 112 shaders enabled. But even so, at 650MHz the new GTS would be a real competitor to a stock GTX, unless the extra memory and bandwidth on the GTX make a significant difference.
That's unfortunate. That wouldn't be worth upgrading for current 640 owners. Why can't Nvidia take the same 8800 GT GPU, make it dual-slot, up the voltage, put 1GB of GDDR4 memory on it, and sell it for $499? It seems that would be no problem for them. It would be faster than a GTX, so they could discontinue the GTX and save money thanks to the smaller process. They would sell extremely well since it's Xmas time and Crysis, UT3, Gears, and several other AAA games are coming out.
Why do they want to drop the ball and give DAAMIT time to catch up? The last 2-3 years they have been good at staying one step ahead of ATI.
According to this link from FiringSquad, the current 640 is about 10% slower overall when OC'd compared to a stock GTX, so the new one should be almost the same with the extra shader processors: http://www.firingsquad.com/har..._gtx_gts_overclocking/
Originally posted by: munky
80 shaders seems more likely, because with 64 shaders I don't see how the card would threaten an 8800 GTS in performance. Plus, with the 320MB GTS selling for under $300 and the negative reviews the 8600 GTS received for underwhelming performance, I don't think Nvidia wants to drop the ball again on this one.
Originally posted by: munky
Of course for current gts owners it's not worth it. But for me, seeing how I'm still using a 1900xt, it would be exactly what I've been waiting for. $350 right now seems too much to pay for a card that debuted at $400 a year ago.
Originally posted by: lopri
Sad. In the past we would have seen this part as a 9900 GT, and there would be a 9900 GTX/Ultra performing at more or less 150% of the 8800 GTX. How long will the 8800 GTX stay at the top? 2 years?
Don't get me wrong. This part is indeed like a hypothetical 9900 GT, and I do like the return of the single-slot card. If this card has 8600-like (or even better) video processing capability for HD content, it's a near-ideal solution for the mid to mid-high market. I just lament the lack (or delay) of ultra-high-end updates.
Originally posted by: chizow
Originally posted by: coldpower27
My guess is that the new card has 80 stream processors and 20 TMUs; at 600MHz that means 48K shader cycles, equal to the old 8800 GTS. Hence the new 8800 GTS 640 at 112 stream processors for 57.6K shader cycles and 28 TMUs, which would put some distance between them. Remember, if 112 is possible (that's 7/8 of the shaders), there's no reason why 5/8 is impossible.
Absent any other available information, I'm going to assume the 10.7K 3DMark06 score was derived using a Core 2 Extreme QX6850. Anyone have a guess at what a reference 8800 GTS 640 gets with that processor?
10k 3DMark06 is about the same as a 640MB GTS with a C2D, so ya, the 8800GT is very close in performance.
I also think the number of shaders on the GT would have to be closer to the 96 on the GTS, based on clock speeds, in order to be competitive, so 80 is definitely a possibility. HOWEVER, I don't think they're simply disabling shader quads (or more accurately, octets) on the GT like they did with the GTS. The GT is on a 65nm process and geared for the mainstream; disabling shaders wouldn't make sense, as there's no higher-end part on this process yet, and it would eat much of the cost-saving benefit of the move to a smaller, cooler part.
This will definitely be a part for OC'ers to keep an eye on if it has closer to 96 shaders on a smaller process, as much higher clock speeds should be possible with cooler temps and less power draw. I wouldn't worry too much about the 256-bit memory interface either, as I think the 320-bit and 384-bit buses on the GTS/GTX are still overkill at current clock speeds (quick numbers in the sketch below).
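To put rough numbers behind the two posts above, here's a minimal sketch (Python) of coldpower27's "shader cycles" metric (stream processors x core clock) next to raw memory bandwidth. The GTX and GTS 640 rows use shipping specs; the new-GTS and 8800 GT rows are this thread's rumors, so treat them as assumptions (the small gap between 56K here and the 57.6K quoted above comes down to the assumed clock).

```python
# Back-of-the-envelope comparison: coldpower27's "shader cycles"
# (stream processors x core clock) and chizow's memory-bandwidth point.
# GTX/GTS 640 rows are shipping specs; the other rows are this thread's
# rumored figures, so treat them as guesses, not confirmed specs.

cards = {
    # name: (stream processors, core MHz, bus width in bits, effective mem MT/s)
    "8800 GTX":          (128, 575, 384, 1800),
    "8800 GTS 640":      ( 96, 500, 320, 1600),
    "new GTS (rumored)": (112, 500, 320, 1600),  # 112-SP G80 respin
    "8800 GT @ 80 SPs":  ( 80, 600, 256, 1800),  # coldpower27's guess
    "8800 GT @ 64 SPs":  ( 64, 600, 256, 1800),  # the low-ball rumor
}

for name, (sps, core, bus, mem) in cards.items():
    shader_cycles = sps * core / 1000   # "K shader cycles", as in the post
    bandwidth = bus / 8 * mem / 1000    # GB/s = (bytes per transfer) x MT/s
    print(f"{name:18s}  {shader_cycles:5.1f}K cycles  {bandwidth:5.1f} GB/s")
```

Which backs up the eyeball take: an 80-SP GT at 600MHz matches a stock GTS 640 on shader throughput (48K), a 64-SP part falls well short (38.4K), and the 256-bit bus only gives up about 10% of the GTS 640's bandwidth (57.6 vs 64 GB/s).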
Originally posted by: n7
Nothing better than the 8800 Ultra anytime soon then it seems :frown:
My 2560x1600 needs feeding...c'mon nV/AMD...one of you step up...
Originally posted by: pcslookout
Originally posted by: n7
Nothing better than the 8800 Ultra anytime soon then it seems :frown:
My 2560x1600 needs feeding...c'mon nV/AMD...one of you step up...
There is always SLI'd GeForce 8800 GTXs if you must have 2560x1600 in demanding games like Crysis. I knew people were going to regret getting such a high native resolution on a monitor. SLI can barely drive 2560x1600, but it works well enough. Bare minimum.
Originally posted by: Canterwood
Hmmm, still nothing that's going to be able to perform well in DX10.
Looks like I'll be missing this round of cards as well, and won't be upgrading until well into 2008.
And I can see retailers price-gouging customers who want to be sure they get the 112-shader GTS, a bit like the G0-stepping Q6600 debacle.
The video card market is in a terrible state for consumers atm.
Originally posted by: pcslookout
Look on the bright side! All the folks who bought GeForce 8800 series video cards can now rest easy knowing their purchase was well worth it! Even the ones who bought them when they first came out!
Originally posted by: Azn
Innovation? These corps just want to milk you.
Originally posted by: lopri
Get ready for a rude awakening. NV is getting all worked up to wave the SLI flag in full force once again. They just couldn't do it before because their drivers were just... not ready. :laugh: But there are many forward-looking signs (for SLI) in current and upcoming NV hardware. NV has already experimented with separating I/O from processing (NVIO), and I expect their upcoming high-end platform will have this video I/O chip built in. With the die-shrink of G80, the introduction of PCIe 2.0, and the integration of video I/O processing into the platform, they will attempt to rectify the well-known shortcomings of SLI.
If I dare predict, this attempt will drag many early adopters into seemingly eternal misery for a good year at least, if their past is any guide. I'm personally determined never to be their beta tester again, but admittedly there was a period when I enjoyed SLI. Thankfully, by that time the first-generation SLI was as mature as it could be (7900 GTX SLI in mid-2006), and I enjoyed quite a few games at the quite-high-back-then resolution of 1920x1200 for about 6 months.
So yeah, if you visit NV's website, you will see an inordinate amount of SLI propaganda. A taste of things to come.
Originally posted by: coldpower27
Oh sorry, I was just saying that a number like 80 shaders is possible given the 112 shaders on the new G80 SKU of the 8800 GTS 640. I didn't mean to imply that G92 will have disabled shader "octets"; I think it's going to be something like the 7600 GT and natively have only 5 blocks of 16, for 80 shader units.
All I was saying is that Nvidia isn't restricted to using numbers like 32/64/96/128 etc., since the LCD is 16 in this case.
Ya I agree, just wanted to clarify for others, but the LCD of 16 was a good point worth emphasizing (see the sketch below). Also wanted to emphasize that whatever # of shaders G92 comes with will most likely be native, in the absence of a high-end part on the same process.
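To make the "LCD of 16" point concrete, here's a trivial sketch enumerating the shader counts a block-of-16 design allows. The block size of 16 is this thread's working assumption about G8x; which counts actually ship is speculation.

```python
# coldpower27's point: if G8x groups shaders in blocks of 16, G92 isn't
# limited to "round" counts like 32/64/96/128. Any multiple of 16 works.

BLOCK = 16  # shader units per block, the "LCD" discussed above (assumed)

for blocks in range(1, 9):
    sps = blocks * BLOCK
    note = {
        80:  "  <- 5 native blocks, the 7600 GT-style guess for G92",
        96:  "  <- 6 blocks, current 8800 GTS",
        112: "  <- 7 blocks, the new 112-SP GTS rumor",
        128: "  <- 8 blocks, 8800 GTX",
    }.get(sps, "")
    print(f"{blocks} x {BLOCK} = {sps:3d}{note}")
```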
Originally posted by: Canterwood
Originally posted by: Azn
Innovation? These corps just want to milk you.
Yeah, and ATI's lack of competition hasn't helped.
Hopefully, though, that's about to change.
Originally posted by: Azn
Originally posted by: Canterwood
Originally posted by: Azn
Innovation? These corps just want to milk you.
Yeah, and ATI's lack of competition hasn't helped.
Hopefully, though, that's about to change.
We all knew ATI wasn't going to lower prices on their brand-new 2900 XT any time soon. Nvidia had their card out 6 months prior and could have lowered the price on their high-end part some if they'd wanted to. Yup, Nvidia milked PC gamers for a good 6 months for lack of competition.
Things are about to change, though, in the mid to high-end range.