True DX10 cards may be coming as early as March


imported_WhatTheHell

Junior Member
Jan 27, 2008
6
0
0
Originally posted by: ArchAngel777
AMD has already released samples of the 55nm RV770 GPU to its partners for certification and cards are expected to launch in late second quarter, added the paper.

March is in the first quarter anyway. Revise the thread title summary. We are looking at a release around June, I reckon.
So the 8800 GTX stays the top-performing single-chip card for a year and a half? That really sucks.
 

Demoth

Senior member
Apr 1, 2005
228
0
0
Originally posted by: ArchAngel777
AMD has already released samples of the 55nm RV770 GPU to its partners for certification and cards are expected to launch in late second quarter, added the paper.

March is in the first quarter anyway. Revise the thread title summary. We are looking at a release around June, I reckon.

The title only says it is possible to see a card released as early as March, and I stand by that. Nvidia has production ready to go and has already pushed its schedule up quite a bit at a moment's notice. They may well want to upstage ATI, who seem to have a good contender this round.

Both of the dual-GPU high-end cards coming now are more about giving their designers hands-on experience for the future. Expect both cards to have the availability and lifespan of the HD 2950 Pro. Still good cards in their own right, if you can live with the extra noise and power requirements.

As far as True DX10 cards go...

I would consider a card truly DX10 if it shows enough power to handle games for at least the next two years on good settings. No matter what the coding problems in Crytek's engine are, performance in Crysis is indicative of what is going to be needed for other top-end games down the road.

Granted, I am still talking about the occasional big title. The majority of good games released should still play well on 8800 GTs and the like.

All I am saying is that if you're thinking of pulling the trigger on a $400+ card, you might want to be aware that the release cycle for the next generation seems to be accelerating. Hopefully we'll catch up to where things should be. We have seen this in the past, when things were stagnant for a while and then suddenly got a few new releases in a very short period.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Demoth
Originally posted by: ArchAngel777
AMD has already released samples of the 55nm RV770 GPU to its partners for certification and cards are expected to launch in late second quarter, added the paper.

March is in the first quarter anyway. Revise the thread title summary. We are looking at a release around June, I reckon.

The title only says it is possible to see a card released as early as March, and I stand by that. Nvidia has production ready to go and has already pushed its schedule up quite a bit at a moment's notice. They may well want to upstage ATI, who seem to have a good contender this round.

Both of the dual-GPU high-end cards coming now are more about giving their designers hands-on experience for the future. Expect both cards to have the availability and lifespan of the HD 2950 Pro. Still good cards in their own right, if you can live with the extra noise and power requirements.

As far as True DX10 cards go...

I would consider a card truly DX10 if it shows enough power to handle games for at least the next two years on good settings. No matter what the coding problems in Crytek's engine are, performance in Crysis is indicative of what is going to be needed for other top-end games down the road.

Granted, I am still talking about the occasional big title. The majority of good games released should still play well on 8800 GTs and the like.

All I am saying is that if you're thinking of pulling the trigger on a $400+ card, you might want to be aware that the release cycle for the next generation seems to be accelerating. Hopefully we'll catch up to where things should be. We have seen this in the past, when things were stagnant for a while and then suddenly got a few new releases in a very short period.

I cannot knock you for a prediction or viewpoint, but I don't see how that should go in the title when the item you linked to clearly said 'late second quarter' for ATI and nothing about Nvidia's release at this point. Now, if you were talking about the GX2 or the 3870 X2, then I would say that is quite possible. But it is clear from your post that you are talking about the next architecture update.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
I think the guy who originally said this meant that all new cards will come out in X2 versions.
That's simply not possible, as only the high-end market will tolerate such cards.

As for single cores, they must also get faster; otherwise X2 cards would never get faster either.

History teaches us that some games require multi-card solutions to run well, especially at high resolution.

Look at Crysis, arguably the most popular game out there.

http://www.techspot.com/articl...ce-multigpu/page5.html

A lot better to play at a 33 fps average than a 23 fps average, wouldn't you say?
Huh? The benchmarks you quoted are running at 1680x1050, while you're extolling the virtues of tri-SLI because you have a 2560x1600 display.

So tell me, is Crysis playable at 2560x1600 on your tri-SLI system?
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
I don't want ATI or Nvidia to be dependent on SLI/CF or two cards stuck together; I just want a single rapid card that can chew through every game that's thrown at it.

SLI/CF is just ridiculous: you have two cards but only get up to 1.5x the performance. I'm sorry, but if I pay double I want double. :)
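
To put that gripe in numbers, here is a minimal Python sketch of performance per dollar for one card versus two in SLI/CF. The $400 price and the 1.5x scaling factor are illustrative assumptions drawn from this thread, not measured figures.

```python
# Back-of-envelope: performance per dollar, one card vs. two in SLI/CF.
# The card price and scaling factor are assumptions for illustration only.

CARD_PRICE = 400.0   # assumed price of one high-end card, USD
SLI_SCALING = 1.5    # assumed speedup of two cards vs. one (ideal: 2.0)

single_value = 1.0 / CARD_PRICE               # perf per dollar, one card
sli_value = SLI_SCALING / (2 * CARD_PRICE)    # perf per dollar, two cards

print(f"Two cards deliver {sli_value / single_value:.0%} of the "
      "per-dollar value of one card")
# -> paying 2x the money for 1.5x the speed leaves 75% of the value
```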
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
At some point, multi-GPU solutions will have PCs bumping into residential wiring limits, at least in the US. There's a reason hairdryers, space heaters, etc. are limited to 1200 watts: 'normal' rooms are wired with 10-amp fuses on the circuit, and the wiring wasn't designed to carry more than 1200 watts. UL has a serious problem with certifying any consumer device designed to plug into a generic socket and pull more than 1200 watts.

I can almost see the PSU solutions designed to plug into multiple rooms and/or oven and dryer circuits now...
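
For reference, the wiring arithmetic is just watts = volts x amps. Here is a quick sketch, assuming 120 V US mains and the two breaker ratings mentioned in this thread; the 80% continuous-load derating is the usual NEC rule of thumb for devices that run for hours.

```python
# Wall-circuit headroom: watts = volts * amps.
# Assumes 120 V US mains; the 0.8 factor is the common NEC derating
# for continuous loads such as a gaming PC running for hours.

VOLTS = 120.0

for amps in (10, 15):
    peak_w = VOLTS * amps
    continuous_w = 0.8 * peak_w
    print(f"{amps:>2} A circuit: {peak_w:.0f} W peak, "
          f"~{continuous_w:.0f} W continuous")
# 10 A circuit: 1200 W peak, ~960 W continuous
# 15 A circuit: 1800 W peak, ~1440 W continuous
```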

 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Originally posted by: v8envy
At some point, multi-GPU solutions will have PCs bumping into residential wiring limits, at least in the US. There's a reason hairdryers, space heaters, etc. are limited to 1200 watts: 'normal' rooms are wired with 10-amp fuses on the circuit, and the wiring wasn't designed to carry more than 1200 watts. UL has a serious problem with certifying any consumer device designed to plug into a generic socket and pull more than 1200 watts.

I can almost see the PSU solutions designed to plug into multiple rooms and/or oven and dryer circuits now...

The normal fuse in my area is 15 amps. The only system that could pull 1000 watts from the wall is a Skulltrail system with dual quad-core CPUs, four overclocked 8800 GTXs in SLI, and 10 hard drives. I don't see it happening.
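
As a sanity check on that claim, here is a rough tally of such a worst-case build. Every wattage figure below is an assumed ballpark for parts of this era, not a measurement, and the PSU efficiency is likewise an assumption.

```python
# Rough estimate of wall draw for an extreme Skulltrail-style build.
# All per-component wattages are assumed ballparks, not measurements.

parts_dc_watts = {
    "2x quad-core CPU":        2 * 120,
    "4x 8800 GTX in SLI":      4 * 130,
    "10x hard drive":          10 * 8,
    "motherboard, RAM, fans":  60,
}

dc_total = sum(parts_dc_watts.values())   # DC power the PSU must supply
PSU_EFFICIENCY = 0.8                      # typical for the era (assumed)
wall_draw = dc_total / PSU_EFFICIENCY     # AC power pulled from the outlet

print(f"DC load: {dc_total} W, wall draw: ~{wall_draw:.0f} W")
# -> DC load: 900 W, wall draw: ~1125 W; only a build this extreme
#    gets past the 1000 W mark
```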

 

Arkitech

Diamond Member
Apr 13, 2000
8,356
4
76
So have there been any rumours on pricing for the upcoming cards, or is it still too early for that info?
 

Demoth

Senior member
Apr 1, 2005
228
0
0
Originally posted by: Demoth
Both of the dual-GPU high-end cards coming now are more about giving their designers hands-on experience for the future. Expect both cards to have the availability and lifespan of the HD 2950 Pro. Still good cards in their own right, if you can live with the extra noise and power requirements.

Looks like I was proven wrong on this point already. The 3870 X2 seems to have shipped in volume, judging by the number of retailers selling numerous brands close to MSRP, or right at it in Newegg's case. Hopefully I won't be too far off on the time frame for the next-gen release. If the dual-GPU cards do hit it big, then yes, the true next gen will be delayed. That could well happen, with the overclocked 3870 X2 going for an MSRP of $449. If stock holds and Nvidia uses a similar pricing scheme, we could see a price war. There are a lot of gamers who would buy a new PSU to run something this powerful in the sub-$400 range, provided driver support for most games is well maintained.
 

Arkitech

Diamond Member
Apr 13, 2000
8,356
4
76
Originally posted by: Demoth
There are a lot of gamers who would buy a new PSU to run something this powerful in the sub-$400 range, provided driver support for most games is well maintained.

I don't know if there are that many gamers who would be willing to shell out 400+ bucks for a card and then an additional 50-150 for a PSU. I mean, honestly, 200-300 bucks for a GPU is a lot to pay every year or two to keep current with games. Now we're looking at a future where it's gonna cost nearly 500 bucks for a card that plays games at a decent frame rate. I don't have an issue with companies making a profit, but at some point reality needs to set in with these guys. I'm in the market right now for a card, and I'm reluctantly going to spend about 300 dollars; I can't imagine paying more than that to play games. But I guess some people don't mind the expense. I just wish pricing was a little more competitive.
 

Demoth

Senior member
Apr 1, 2005
228
0
0
Originally posted by: Arkitech
Originally posted by: Demoth
There are a lot of gamers who would buy a new PSU to run something this powerful in the sub-$400 range, provided driver support for most games is well maintained.

I don't know if there are that many gamers who would be willing to shell out 400+ bucks for a card and then an additional 50-150 for a PSU. I mean, honestly, 200-300 bucks for a GPU is a lot to pay every year or two to keep current with games. Now we're looking at a future where it's gonna cost nearly 500 bucks for a card that plays games at a decent frame rate. I don't have an issue with companies making a profit, but at some point reality needs to set in with these guys. I'm in the market right now for a card, and I'm reluctantly going to spend about 300 dollars; I can't imagine paying more than that to play games. But I guess some people don't mind the expense. I just wish pricing was a little more competitive.

I understand what you're saying, but history has shown there is a large portion of hardcore gamers who will buy near the top end in the $550+ range. Many of the gamers I know did, in fact, pick up an 8800 GTX at release. They also made out well despite the price paid, considering the card's lifespan at the top end (or the lack of innovation lately).

I agree with you, though. I am still holding on to my 7900 GT. Before that I was using an overclocked 6600 for my transition to PCIe, coming from a 9700 Pro, and before that I used a Ti 4200. I have gotten at least two good years from every card I've owned except the $40 AGP-to-PCIe transition card, and the only one I paid over $200 for was the 9700 Pro, which was good enough at the time to justify $350. I expect the same life from my next card and am willing to pay up to $500 for something that will last me while maintaining good FPS at decent settings. However, I also know that, given how much processing power DX10 applications require, that future card will need at least double the performance of the 8800 Ultra.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Originally posted by: Arkitech
Originally posted by: Demoth
There are a lot of gamers who would buy a new PSU to run something this powerful in the sub-$400 range, provided driver support for most games is well maintained.

I don't know if there are that many gamers who would be willing to shell out 400+ bucks for a card and then an additional 50-150 for a PSU. I mean, honestly, 200-300 bucks for a GPU is a lot to pay every year or two to keep current with games. Now we're looking at a future where it's gonna cost nearly 500 bucks for a card that plays games at a decent frame rate. I don't have an issue with companies making a profit, but at some point reality needs to set in with these guys. I'm in the market right now for a card, and I'm reluctantly going to spend about 300 dollars; I can't imagine paying more than that to play games. But I guess some people don't mind the expense. I just wish pricing was a little more competitive.

Doing my new build, the video card was the most expensive component, and I don't see that changing anytime soon. Frankly, I'm deeply reluctant to spend even $300, but then I'm not a serious gamer; I want bang for the buck. At least ATI/AMD priced the 3870 X2 in the ballpark of a single 3870 times 1.5. In most situations the X2 doesn't appear to scale at a 2x level, but it does scale at about 1.5x in many games, so the price/performance delta is roughly the same. At this point, the 3870 X2 roughly matches the 8800 GTX, based on the reviews I've read. (This might imply that the 8800 GTX is in line for a long-overdue price cut, though I won't be holding my breath.)
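
A quick worked version of that parity argument, with the single 3870 normalized to 1.0; both ratios are the post's estimates, not benchmark results.

```python
# Price/performance parity: if the X2 costs ~1.5x a single 3870 and
# delivers ~1.5x the performance, perf per dollar is unchanged.
# Both ratios are estimates from the post above, not benchmarks.

PRICE_RATIO = 1.5   # X2 price vs. a single 3870 (assumed)
PERF_RATIO = 1.5    # X2 scaling in many games (assumed)

x2_relative_value = PERF_RATIO / PRICE_RATIO
print(f"X2 perf/$ relative to a single 3870: {x2_relative_value:.2f}x")  # 1.00x
```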

I think the reason for the longevity of the 8800 GTX is that it was such a breakthrough product for its time. It so thoroughly spanked everything else out there that it's taken this long for the ATI camp to come close to it. So, as others have noted, until now Nvidia really has had the high end to itself for quite a while.

The good thing is, ATI has finally gotten off the mat and is a competitive force in the enthusiast and power-user communities again, apparently requiring Nvidia to start pulling out the ol' can of whoop-ass once more.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
by "not true DX10 cards' they mean that if you run them with DX10 features high enough to look BETTER then in DX9 (ie, have DX10 features come into play), your frame rate will be unplayable. Sure its running in DX10 mode, but for most people that just results in lower framerate on the exact same imagine that looks like DX9 (unless they are running it as a sideshow).
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Originally posted by: taltamir
by "not true DX10 cards' they mean that if you run them with DX10 features high enough to look BETTER then in DX9 (ie, have DX10 features come into play), your frame rate will be unplayable. Sure its running in DX10 mode, but for most people that just results in lower framerate on the exact same imagine that looks like DX9 (unless they are running it as a sideshow).

Yeah, I guess this is one of those extremely rare times when the software guys are pushing the hardware guys to strengthen their product. Usually it's the other way around. DX10 might not be a failure or a waste; until we get cards that run the unique DX10 capabilities gracefully we probably won't know for sure. I don't pretend to know what NVidia has up its sleeve, but maybe DX10 requires a major re-architecting of GPU technology.

OTOH, DX10 could just be an incredibly badly-written spec. A lot of things about Vista have pleasantly surprised me; DX10 is not one of them.

I just hope we'll soon see major improvement in this area from the hardware guys.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Dadofamunky
Originally posted by: taltamir
by "not true DX10 cards' they mean that if you run them with DX10 features high enough to look BETTER then in DX9 (ie, have DX10 features come into play), your frame rate will be unplayable. Sure its running in DX10 mode, but for most people that just results in lower framerate on the exact same imagine that looks like DX9 (unless they are running it as a sideshow).

Yeah, I guess this is one of those extremely rare times when the software guys are pushing the hardware guys to strengthen their product. Usually it's the other way around. DX10 might not be a failure or a waste; until we get cards that run the unique DX10 capabilities gracefully we probably won't know for sure. I don't pretend to know what NVidia has up its sleeve, but maybe DX10 requires a major re-architecting of GPU technology.

OTOH, DX10 could just be an incredibly badly-written spec. A lot of things about Vista have pleasantly surprised me; DX10 is not one of them.

I just hope we'll soon see major improvement in this area from the hardware guys.

Hellgate: London surprised the hell out of me... I mean, aside from ultimately being a huge disappointment, it DOES run and look better in DX10 than in DX9, at least after the last patch.
:Q

There is hope.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Originally posted by: apoppin

Hellgate: London surprised the hell out of me... I mean, aside from ultimately being a huge disappointment <snip>

Lol. I hate it when that happens.