
No more super fast high end cards?

Originally posted by: Azn
Not exactly. Four 3850s have roughly half the power of 2 GTXs. Frame-wise, four 3850s with 256MB of RAM would win over a single GTX. Maybe not GTX SLI.

Theoretically speaking... But if dual-card Crossfire sometimes actually sees a performance drop over a single HD3850, is that a safe assumption to make about four-card Crossfire X? Even if each HD3850 is only $180, you'd still be dropping $720 for four of them. Wouldn't you want to be almost certain that your $720 investment in four HD3850s can beat a $500 8800GTX in almost every situation?

The fact of the matter is that in AT's review of the HD38xx cards, they got roughly GTX performance out of dual HD3850s, but only in two of the four games they tried. The other two actually saw a drop in performance when Crossfire was enabled.

http://anandtech.com/video/showdoc.aspx?i=3151&p=10
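The price math in the post above works out like this. A quick Python sketch; the card prices are the thread's figures, and the helper name is mine:

```python
# Cost comparison from the thread: four HD 3850s at ~$180 each
# vs. one 8800 GTX at ~$500. No performance data, just outlay.
def total_cost(card_price, count):
    """Total outlay for a multi-card setup."""
    return card_price * count

quad_3850 = total_cost(180, 4)   # four HD 3850s
single_gtx = total_cost(500, 1)  # one 8800 GTX

# -> 720 500 220: the quad setup costs $220 more than the GTX
print(quad_3850, single_gtx, quad_3850 - single_gtx)
```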

 
Now now. I never said anything about being logical and getting 4 256MB GPUs. I just said it would still beat a single GTX.
 
Originally posted by: Azn
Now now. I never said anything about being logical and getting 4 256MB GPUs. I just said it would still beat a single GTX.

..my point is that it might, or it might not...
 
Well, they do have a 3870 X2 card on the horizon (basically 2 3870 GPUs on one card). That doesn't run through any Crossfire software AFAIK (please correct this if it is incorrect). Thus you are getting the performance of 2 3870 GPUs without the Crossfire limitations of bad software and no support.

Again if this is wrong please correct me.
 
Originally posted by: cmdrdredd
Well, they do have a 3870 X2 card on the horizon (basically 2 3870 GPUs on one card). That doesn't run through any Crossfire software AFAIK (please correct this if it is incorrect). Thus you are getting the performance of 2 3870 GPUs without the Crossfire limitations of bad software and no support.

Again if this is wrong please correct me.

I hope you're not wrong! 😛
I'd be much more interested in a dual-GPU card if it relied on some more efficient and advanced method than Crossfire or SLI.
 
Originally posted by: munky
Originally posted by: cmdrdredd
Well, they do have a 3870 X2 card on the horizon (basically 2 3870 GPUs on one card). That doesn't run through any Crossfire software AFAIK (please correct this if it is incorrect). Thus you are getting the performance of 2 3870 GPUs without the Crossfire limitations of bad software and no support.

Again if this is wrong please correct me.

I hope you're not wrong! 😛
I'd be much more interested in a dual-GPU card if it relied on some more efficient and advanced method than Crossfire or SLI.

I only know what I've read thus far, and I saw pics and reports of two 3870 X2 cards running in Crossfire. I think that one card doesn't rely on Crossfire to get performance. I don't know, and I don't think anyone outside of AMD really knows. I would bet that it has some internal hardware to handle the dual GPUs.
 
I think Nvidia are just waiting till the next gen of Xbox (Xbox720?) and Playstation4 are around the corner. They will then release an architecture for PC that is similar to the capabilities of those new consoles.
Fact is: No one, not the game developers nor the card manufacturers, wants to build technology that doesn't exist in the console market, aka where the real money is made. And Xbox360 titles will continue to use DX9 for the Xbox360's lifespan, which is at least another year. This whole "DX10 is good for PC gaming" propaganda was just M$ marketing BS.
 
Originally posted by: munky
The day a single card cannot play my games well will be the day I quit PC gaming. Until 2 cards can guarantee me close to a 100% performance improvement in every game, I will not buy into this marketing gimmick and primitive performance solution.


Famous last words - we will hold you to this. 😀


Besides, what card that is twice as expensive as its little brother is twice as fast at every game?
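The "100% performance improvement" munky is holding out for is just a doubling of frame rate. A small sketch with made-up frame rates (the function name and numbers are mine, not benchmark data):

```python
# Scaling efficiency of a second card, as percent improvement
# over a single card. 100% means the frame rate doubled.
def scaling_pct(fps_single, fps_dual):
    """Percent FPS improvement from adding a second card."""
    return (fps_dual / fps_single - 1) * 100

# Perfect scaling: 40 fps -> 80 fps is a 100% improvement.
print(scaling_pct(40, 80))  # 100.0
# Typical real-world dual-GPU results often land well short:
print(scaling_pct(40, 60))  # 50.0
```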
 
Originally posted by: frythecpuofbender
I think Nvidia are just waiting till the next gen of Xbox (Xbox720?) and Playstation4 are around the corner. They will then release an architecture for PC that is similar to the capabilities of those new consoles.
Fact is: No one, not the game developers nor the card manufacturers, wants to build technology that doesn't exist in the console market, aka where the real money is made. And Xbox360 titles will continue to use DX9 for the Xbox360's lifespan, which is at least another year. This whole "DX10 is good for PC gaming" propaganda was just M$ marketing BS.

Those consoles are not "around the corner"; if they were, the Playstation brand would die... horribly. They haven't got any of the games that they said were going to be so good and only on the PS3.

The rest of what you said may be somewhat correct. However, I think that Nvidia and AMD are simply taking advantage of a market that keeps buying cards that are only marginally faster than those released a year ago.
 
Well, the fact is that since most PC games now are ported from consoles, a new high end would not sell as well since it isn't needed. The few people that game at 1920 and up will just be forced to purchase an Nvidia SLI chipset and another video card.

Until the next console generation is out, we will be lucky to get more than one true (non-GX2) new high end card. I don't think the next console will be out until a quad core @ 3.0GHz, 2 gigs of RAM, and the next high end video card become cheap enough and cool enough to justify it... probably Xmas 2009. Microsoft and Sony can then actually make a small profit on the consoles if they choose to. The next gen consoles will be more evolutionary than revolutionary; the biggest benefit will be the extra RAM.
 
Originally posted by: Shaq
Well, the fact is that since most PC games now are ported from consoles, a new high end would not sell as well since it isn't needed. The few people that game at 1920 and up will just be forced to purchase an Nvidia SLI chipset and another video card.

Until the next console generation is out, we will be lucky to get more than one true (non-GX2) new high end card. I don't think the next console will be out until a quad core @ 3.0GHz, 2 gigs of RAM, and the next high end video card become cheap enough and cool enough to justify it... probably Xmas 2009. Microsoft and Sony can then actually make a small profit on the consoles if they choose to. The next gen consoles will be more evolutionary than revolutionary; the biggest benefit will be the extra RAM.

Well, effectively then Nvidia and AMD are helping PC gaming die. Very few people can justify 2 cards for 1 game. Many more people would be happy to buy 1 video card and be able to play their games.
 
Well, I am tempted to buy one video card for one game. I have been delaying paying Valve for The Orange Box, and really The Black Box looks like it has what I want. Problem is, I have to buy a card to get it.
 
Originally posted by: cmdrdredd
Originally posted by: Shaq
Well, the fact is that since most PC games now are ported from consoles, a new high end would not sell as well since it isn't needed. The few people that game at 1920 and up will just be forced to purchase an Nvidia SLI chipset and another video card.

Until the next console generation is out, we will be lucky to get more than one true (non-GX2) new high end card. I don't think the next console will be out until a quad core @ 3.0GHz, 2 gigs of RAM, and the next high end video card become cheap enough and cool enough to justify it... probably Xmas 2009. Microsoft and Sony can then actually make a small profit on the consoles if they choose to. The next gen consoles will be more evolutionary than revolutionary; the biggest benefit will be the extra RAM.

Well, effectively then Nvidia and AMD are helping PC gaming die. Very few people can justify 2 cards for 1 game. Many more people would be happy to buy 1 video card and be able to play their games.

Most of the money is in consoles and that is where the developers are going. If PC gaming had lots of developers like Crytek pushing the technology envelope then Nvidia/ATI would have more of a market for high end cards.

I think it's the developers' fault for the decreasing growth of PC gaming, not the video card industry's. We need more developers that are into technological innovation instead of the bland console games we get every year, because they would have to develop on the PC to push the technological envelope.

 
Originally posted by: Lonyo
Nvidia Corp., the world's largest supplier of graphics processors, said that it has no immediate plans to release graphics cards that offer higher speed than current top-of-the-range GeForce 8800 Ultra, but said that customers seeking for extreme performance will soon be able to install three graphics cards into one system to get incredible graphics rendering horsepower.
http://www.xbitlabs.com/news/v...s_Chief_Executive.html
ATI, graphics product group of Advanced Micro Devices, also said earlier this year that it would focus on development of multi-GPU solutions for the high-end market instead of creating large graphics chips with roughly a billion transistors, which are hard to produce and develop.

This also ties in with AMD/ATi talking about quad-GPU systems with Crossfire-X

ATI CrossfireX, previously dubbed Quad Crossfire, will also finally make its debut. In short, users will be able to connect up to four HD 3800 cards through AMD's Crossfire. The technology will support up to 8 monitors, and will also allow overclocking.
http://www.dailytech.com/AMD+R...pon+Us/article9613.htm
http://www.dailytech.com/AMD+P...ssFire/article7986.htm

Looks like the move that makes sense (given how parallel graphics processing is) is coming, especially now that PCIe 2.0 is here, which offers twice the bandwidth, so triple/quad card setups won't be too bandwidth-starved.

Of course, for the consumer it means more heat, more power, more space and more expensive components, but for the manufacturers it means it's easier to give us more power, since they can sell us multiple less powerful cards rather than having to invest a lot in a single high-powered, complex GPU.
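For reference, the "twice as much bandwidth" claim is just the per-lane rate doubling from PCIe 1.x to 2.0. A quick sketch (constant and function names are mine):

```python
# Per-lane, per-direction usable throughput in MB/s, after the
# 8b/10b encoding overhead both generations use.
PCIE1_PER_LANE = 250  # PCIe 1.x: 2.5 GT/s signaling -> 250 MB/s
PCIE2_PER_LANE = 500  # PCIe 2.0: 5.0 GT/s signaling -> 500 MB/s

def slot_bandwidth_gbs(per_lane_mbs, lanes=16):
    """Usable one-direction bandwidth of a slot in GB/s."""
    return per_lane_mbs * lanes / 1000

print(slot_bandwidth_gbs(PCIE1_PER_LANE))  # x16 PCIe 1.x -> 4.0
print(slot_bandwidth_gbs(PCIE2_PER_LANE))  # x16 PCIe 2.0 -> 8.0
```

So a card sharing an x16 slot's worth of bandwidth across multiple GPUs has twice the headroom on a PCIe 2.0 board.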

well ...like ... yeah ... if anyone has been reading my posts, i have been saying as much for quite some time - that my 2900xt would "do" as a 2nd card in a Xfire setup until Spring
-nvidia and AMD are "milking" us and getting us used to multi-GPU [hot/noisy/power-hungry]

THEN they will simultaneously release a single card that beats the best multi-GPU set-up and you will upgrade again ... and perhaps buy another high-end GPU

they are leading us and conditioning us like Pavlov's dogs 😛
:Q
 
Originally posted by: Shaq
Originally posted by: cmdrdredd
Originally posted by: Shaq
Well, the fact is that since most PC games now are ported from consoles, a new high end would not sell as well since it isn't needed. The few people that game at 1920 and up will just be forced to purchase an Nvidia SLI chipset and another video card.

Until the next console generation is out, we will be lucky to get more than one true (non-GX2) new high end card. I don't think the next console will be out until a quad core @ 3.0GHz, 2 gigs of RAM, and the next high end video card become cheap enough and cool enough to justify it... probably Xmas 2009. Microsoft and Sony can then actually make a small profit on the consoles if they choose to. The next gen consoles will be more evolutionary than revolutionary; the biggest benefit will be the extra RAM.

Well, effectively then Nvidia and AMD are helping PC gaming die. Very few people can justify 2 cards for 1 game. Many more people would be happy to buy 1 video card and be able to play their games.

Most of the money is in consoles and that is where the developers are going. If PC gaming had lots of developers like Crytek pushing the technology envelope then Nvidia/ATI would have more of a market for high end cards.

I think it's the developers' fault for the decreasing growth of PC gaming, not the video card industry's. We need more developers that are into technological innovation instead of the bland console games we get every year, because they would have to develop on the PC to push the technological envelope.

Most console games don't even touch the power that consoles have to put out. ALL developers, regardless of platform, are lazy. End of story here lol
 
woof! apoppin, you never answered me. how low before you buy 3870? I caved at $225 since my 1950xt is already sold and I'm limping around on a 6600gt right now.
 
Originally posted by: bryanW1995
woof! apoppin, you never answered me. how low before you buy 3870? I caved at $225 since my 1950xt is already sold and I'm limping around on a 6600gt right now.

i probably didn't actually realize that you were asking me a question 😱

My attention - over 100 hours since it was released - 2 weeks ago! - has been taken up with battling hordes of demons and attempting to shut the London Hellgate
:Q

😀


... oh ... when . ... . when it goes on sale 😛
-i don't *need* one yet .... i am NOT getting Crysis ... not till after Xfire/Penryn
 
Originally posted by: firewolfsm
How about dual-GPU cards? Even that's better than SLI.

As long as they don't need specific drivers to perform. Current offerings (SLI and Crossfire) need a driver tweak for specific games to take advantage of the performance boost.
 