R680 naked

Page 2

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
$300 is not mainstream, not even close. For a "hardcore gamer", perhaps. The Steam survey shows that cards $150 and under are more popular. The 8800 series holds the top spot, followed by the 7600, 6600, 8600, 5200, 9600, and 7300 cards, all well below $300. Even a lot of the 8800s are below $300. The fact is, most people do not spend $300 on a video card.

gmofftarki, the 9800 Pro was not $180-$250 like you claim. It was $400 when it first came out. And it wasn't 8 years ago like you claim, more like 4.5 years ago if memory serves.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
mmm... so mainstream gamers buy a Dell?

I think they would, actually... if you want to corner the mainstream market you should focus on system builders. The average user doesn't build a computer; he gets one laden with spam and adware from some big retailer.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Yes, a lot of gamers buy a Dell, high-end and mainstream. That doesn't refute the fact that most people do not spend $300+ on a video card. People who do are in the vast minority.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Ackmed
$300 is not mainstream, not even close. For a "hardcore gamer", perhaps. The Steam survey shows that cards $150 and under are more popular. The 8800 series holds the top spot, and then followed by the 7600, 6600, 8600, 5200, 9600, and 7300 cards. All well below $300. Even a lot of the 8800's are below $300. The fact is, most people do not spend $300 on a video card.

That's a completely flawed perspective. If everyone in the Steam survey bought a card when the 8800 owners bought their cards you'd have a point. The survey doesn't show much on its own, and basing a mainstream price on today's prices for old parts certainly isn't accurate. A more accurate analysis would involve finding the % of cards that retailed above $300 when NEW and drawing a conclusion from that.

 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Zstream
Originally posted by: ViRGE
Originally posted by: chizow
Also we can notice two digital PWN power modules.
Haha proof this card is going to pwn!

Interesting part though, definitely hope it makes it to market. The PLX bridge chip piqued my interest, but it doesn't seem to be much more than a PCI-E bridge for the 2nd GPU, or for both GPUs to access the PCI-E slot. Essentially it'll just turn a single x16 into 2 x8 for the two GPUs, unless I'm misinterpreting some of the info on PLX's website.

PLX website
No, you're interpreting it correctly. It's just like the 7950GX2: they need a bridge chip to split the PCIe lanes for each GPU, which is what the PLX chip does. This looks exactly like what the rumors have been stating: it's a pair of RV670s on a single card, running in Crossfire. It's ATI's 7950GX2/Rage Fury Maxx.

Performance-wise, I would be shocked if it performed significantly differently from a pair of 3870s in Crossfire today. That's going to be a problem for ATI; Crossfire doesn't scale perfectly across all games, as we're well past the days where rendering was simple enough to split easily.

The reason is that the 7950GX2 was split into two separate PCBs. Also the Nvidia chip did not use PLX technology. I will wait and see before passing judgement, but for now it looks like a decent card.

Also it is shorter than the Ultra, so it is not that long.
PLX isn't a technology, it's a company. They make PCIe bridges and switches, presumably that's one of their switches on the card we're seeing.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,072
575
136
Originally posted by: taltamir
oh it's a pcie splitter? then that sucks... hardcore suckage. Forget it then, i aint buying.

anyways, i forgot to make my joke... "If UFOs taught me anything, it's that if the pictures are blurry then it doesn't exist"...

Why would that be a bad thing?
 

Demoth

Senior member
Apr 1, 2005
228
0
0
The majority of gamers buy pre-built systems with either onboard video or a very low end vid card. When they find their system can't handle any of the games they try and play, some helpful person usually tells them they need a 256 or 512 meg vid card. They soon hop down to Best Buy and are steered towards a 512 meg 8500GT that's 'turbo charged!' by one of the crack geek squad experts.

http://www.bestbuy.com/site/ol...oduct&id=1188560188489


Generally they are a happy lot, owning a 512 meg card of the latest generation and only paying $99. The average gamer usually learns to tweak down game settings or play in constant lag.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: chizow
Originally posted by: Ackmed
$300 is not mainstream, not even close. For a "hardcore gamer", perhaps. The Steam survey shows that cards $150 and under are more popular. The 8800 series holds the top spot, and then followed by the 7600, 6600, 8600, 5200, 9600, and 7300 cards. All well below $300. Even a lot of the 8800's are below $300. The fact is, most people do not spend $300 on a video card.

That's a completely flawed perspective. If everyone in the steam survey bought a card when the 8800 owners bought their cards you'd have a point. Survey doesn't show anything really, but basing mainstream price using today's prices on old parts isn't accurate for sure. A more accurate analysis would involve finding the % of cards that retailed above $300 when NEW and drawing a conclusion from that.

It is when the cards I listed were never above $300. The 7600, 6600, 8600, 5200, 9600, and 7300 cards were all budget cards from the get-go. These are the top six out of seven series of cards. And even some 8800 cards were/are under $300.

Saying that a 7800GTX is a budget card when listed in that survey wouldn't be accurate, because as you pointed out, it may have been bought at its $500 MSRP and is now worth much less than that used.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Ackmed
Originally posted by: chizow
Originally posted by: Ackmed
$300 is not mainstream, not even close. For a "hardcore gamer", perhaps. The Steam survey shows that cards $150 and under are more popular. The 8800 series holds the top spot, and then followed by the 7600, 6600, 8600, 5200, 9600, and 7300 cards. All well below $300. Even a lot of the 8800's are below $300. The fact is, most people do not spend $300 on a video card.

That's a completely flawed perspective. If everyone in the steam survey bought a card when the 8800 owners bought their cards you'd have a point. Survey doesn't show anything really, but basing mainstream price using today's prices on old parts isn't accurate for sure. A more accurate analysis would involve finding the % of cards that retailed above $300 when NEW and drawing a conclusion from that.

It is when the cards I listed, were never above $300. The 7600, 6600, 8600, 5200, 9600, and 7300 cards were all budget cards from the get-go. These are the top six out of seven series of cards. And even some 8800's cards were/are under $300.

Saying that a 7800GTX is a budget card, when listed in that survey wouldnt be accurate. Because as you pointed out, it may have been bought at its $500 MSRP, and is now much less than that used.

Again, the survey means nothing without total %s. Just the mere fact that the 8800 is #1 should indicate the market for $300+ cards is higher than you might think when it comes to people who actually game. Again, the top 7 spots make up what %? If those 7 make up 50% and 25-30% of the rest are all former high-end cards that originally cost $300 or more, then that's still a significant portion of total gamers who spent $300 or more at one point or another. I don't think that's too unrealistic considering there's no 6800, 7800, 7900, 9800, x800, x1800, x1900, x2900 or HD3800 in the top 7.

It's certainly possible that "hardcore" gamers who previously owned high-end cards migrated to 8800s in the past year, but that only shows that hardcore gamers tend to upgrade more often and spend more on hardware than the people buying or holding onto budget cards.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: sisq0kidd
Flashbacks of Voodoo5... everything is lining up... the stars, the pcb, the chips... uh oh.

ATI had their own Voodoo 5 at the same time as the Voodoo 5: it was called the RAGE Fury MAXX, with dual Rage 128 Pro chips... no need to bring up 3DFX, ATI has already been down this road all on their own when they needed to whip up something to compete with the GeForce 256.

Of course, with both sides fully into multi-GPU solutions and dual-GPU support already available from both 1st and 3rd parties, the R680 isn't quite as foreboding as back then. Heck, if ATI could pull through from where they were at the end of the Rage series, their situation now doesn't seem so dire - nVidia was on such a roll back then.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: waffleironhead
Originally posted by: taltamir
oh its a pcie splitter? then that sucks... hardcore suckage. Forget it than, i aint buying.

anyways, i forgot to make my joke... "If UFOs taught me anything, its that if the pictures are blurry than it doesn't exist"...

Why would that be a bad thing?

half the ram is wasted, the drivers have to be specially optimized, and many games don't work with it or get suboptimal results... it's a software workaround for something that should have been done at a hardware level. And it sucks.
on average, on a NEW game (one that actually needs multiple GPUs) you can expect 0-25% benefit over a single card (or in this case, a single chip). By the time you've already finished playing with it, that will rise to about 50% benefit. If it is a super popular game they might focus on it some more and get it up to 70% or so...

A proper hardware implementation should have close to doubled the speed though. (think RAID0 as an example of how to properly do something like that)
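As a rough sketch of the arithmetic behind these claims, the numbers below are simply the figures quoted in the post (0-25%, ~50%, ~70% scaling, mirrored memory), not measurements; the function names and the 30 fps / 512 MB starting points are made up for illustration:

```python
# Back-of-the-envelope sketch of the multi-GPU behaviour described above.
# Scaling fractions are the rough figures from the post, not benchmarks.

def usable_vram(per_gpu_mb: int, num_gpus: int) -> int:
    """Each GPU keeps its own mirrored copy of textures and geometry,
    so usable memory stays at one GPU's worth regardless of GPU count."""
    return per_gpu_mb

def effective_fps(single_gpu_fps: float, num_gpus: int, scaling: float) -> float:
    """'scaling' is the fraction of each extra GPU's power the driver
    profile actually extracts (0.0 = no profile, 1.0 = perfect scaling)."""
    return single_gpu_fps * (1 + (num_gpus - 1) * scaling)

if __name__ == "__main__":
    cases = [("brand-new game, no profile", 0.00),
             ("early driver profile",       0.25),
             ("mature profile",             0.50),
             ("popular, heavily tuned",     0.70),
             ("ideal hardware-level split", 1.00)]
    for label, s in cases:
        print(f"{label:28s} {effective_fps(30, 2, s):5.1f} fps, "
              f"{usable_vram(512, 2)} MB usable")
```

The point of the sketch is that adding a second chip never adds usable memory, and the frame-rate gain depends entirely on the per-game scaling factor the driver manages to reach.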
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Originally posted by: taltamir
half the ram is wasted, the drivers have to be specially optimized, and many games don't work with it or get sub optimal results... its a software workaround for something that should have been done or a hardware level. And it sucks.
on average on a NEW game (one that actually needs multiple GPUs) you can expect 0-25% benefit over a single card (or in this case, a single chip). By the time you already finished playing with it will rise to about 50% benefit. If it is a super popular game they might focus on it some more and get it up to 70% or so...

A proper hardware implementation should have close to doubled the speed though. (think RAID0 as an example of how to properly do something like that)

This is exactly what SLI and Crossfire have been from the beginning :confused: And what the 7950GX2 was as well (ie. you get all the drawbacks of multi-GPU)

What do you mean by "half the ram is wasted"?

If they had everything working as one GPU it would be great...but they're not there yet.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yea, but why would I pay to buy a work in progress?

And I am fully aware that those have been the issues with SLI and CF from the beginning. I am just saying this hasn't SOLVED them.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,072
575
136
Originally posted by: taltamir
Originally posted by: waffleironhead
Originally posted by: taltamir
oh its a pcie splitter? then that sucks... hardcore suckage. Forget it than, i aint buying.

anyways, i forgot to make my joke... "If UFOs taught me anything, its that if the pictures are blurry than it doesn't exist"...

Why would that be a bad thing?

half the ram is wasted, the drivers have to be specially optimized, and many games don't work with it or get sub optimal results... its a software workaround for something that should have been done or a hardware level. And it sucks.
on average on a NEW game (one that actually needs multiple GPUs) you can expect 0-25% benefit over a single card (or in this case, a single chip). By the time you already finished playing with it will rise to about 50% benefit. If it is a super popular game they might focus on it some more and get it up to 70% or so...

A proper hardware implementation should have close to doubled the speed though. (think RAID0 as an example of how to properly do something like that)

So you are saying that all of these things will happen because the card will split its connection to the pcie slot into x8/x8? That is what I was asking by quoting your post. We all (ok, maybe not all) know the benefits and drawbacks of sli/crossfire. But I was wondering why you thought that the card having a chip on it to split the connection to the pcie slot equally at x8/x8 would be a bad thing. Is this worse than 2 separate cards on full x16 lanes? Do the 3870 chips actually saturate the entire bandwidth of the x16 lane?
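On that last question, here is a rough sketch of the raw link bandwidth involved. The per-lane rates are the standard PCIe figures (250 MB/s per lane per direction for 1.x, 500 MB/s for 2.0); whether an RV670 actually comes close to saturating them is exactly the open question, and the snippet does not try to answer it:

```python
# Raw PCIe link bandwidth per direction, ignoring protocol overhead.
PER_LANE_MB_S = {"PCIe 1.x": 250, "PCIe 2.0": 500}

for gen, per_lane in PER_LANE_MB_S.items():
    for lanes in (16, 8):
        print(f"{gen} x{lanes:<2}: {per_lane * lanes / 1000:.1f} GB/s per direction")

# A bridge chip splitting one x16 slot into two x8 links halves each GPU's
# share of that bandwidth; whether that matters depends on how much traffic
# actually crosses the bus during a game.
```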
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: chizow
Originally posted by: Ackmed
Originally posted by: chizow
Originally posted by: Ackmed
$300 is not mainstream, not even close. For a "hardcore gamer", perhaps. The Steam survey shows that cards $150 and under are more popular. The 8800 series holds the top spot, and then followed by the 7600, 6600, 8600, 5200, 9600, and 7300 cards. All well below $300. Even a lot of the 8800's are below $300. The fact is, most people do not spend $300 on a video card.

That's a completely flawed perspective. If everyone in the steam survey bought a card when the 8800 owners bought their cards you'd have a point. Survey doesn't show anything really, but basing mainstream price using today's prices on old parts isn't accurate for sure. A more accurate analysis would involve finding the % of cards that retailed above $300 when NEW and drawing a conclusion from that.

It is when the cards I listed, were never above $300. The 7600, 6600, 8600, 5200, 9600, and 7300 cards were all budget cards from the get-go. These are the top six out of seven series of cards. And even some 8800's cards were/are under $300.

Saying that a 7800GTX is a budget card, when listed in that survey wouldnt be accurate. Because as you pointed out, it may have been bought at its $500 MSRP, and is now much less than that used.

Again, survey means nothing without total %s. Just the mere fact the 8800 is #1 should indicate the market for $300+ cards is higher than you might think when it comes to people who actually game. Again, Top 7 spots making up what %? If those 7 make up 50% and 25-30% of the rest are all former high-end cards that originally cost $300 or more than that's still a significant portion of total gamers who spent $300 or more at one point or another. I don't think that's too unrealistic considering there's no 6800, 7800, 7900, 9800, x800, x1800, x1900, x2900 or HD3800 in the top 7.

Its certainly possible that "hardcore" gamers who previously owned high-end migrated to 8800s in the past year, but that only shows that hardcore gamers tend to upgrade more often and spend more on hardware than the people buying or holding onto budget cards.

You still don't get it.

For one thing, a lot of 8800 cards are under $300. Secondly, the other six cards have over 30% of the share, while the 8800 cards have just under 10%. Not to mention all the other cards that were never over $300. And the biggest share (15%) is "unknown", which means they are really on the low end. Feel free to add up all the cards that were ever over $300 and compare them to the cards under $300, and you will see that the vast majority of the cards in the survey are budget cards that were never over $300.

The simple fact is, most PC gamers do not have a $300+ card in their system. I don't know why you just can't acknowledge this and move on. It's actually pretty common knowledge... and I'm done talking about it. You either get it, or you don't.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: waffleironhead
Originally posted by: taltamir
Originally posted by: waffleironhead
Originally posted by: taltamir
oh its a pcie splitter? then that sucks... hardcore suckage. Forget it than, i aint buying.

anyways, i forgot to make my joke... "If UFOs taught me anything, its that if the pictures are blurry than it doesn't exist"...

Why would that be a bad thing?

half the ram is wasted, the drivers have to be specially optimized, and many games don't work with it or get sub optimal results... its a software workaround for something that should have been done or a hardware level. And it sucks.
on average on a NEW game (one that actually needs multiple GPUs) you can expect 0-25% benefit over a single card (or in this case, a single chip). By the time you already finished playing with it will rise to about 50% benefit. If it is a super popular game they might focus on it some more and get it up to 70% or so...

A proper hardware implementation should have close to doubled the speed though. (think RAID0 as an example of how to properly do something like that)

So you are saying that all of these things will happan because the card will split its output on the pcie slot to x8/x8? That is what I was asking by quoteing your post. We all(ok maybe not all) know the benifits and drawbacks of sli/crossfire. But I was wondering why you thought that the card having a chip on it to split the output on the pcie slot equally at 8x/8x would be a bad thing. Is this worse than 2 seperate cards on full x16 lanes? Do the 3870 chips actually saturate the entire bandwidth of the x16 lane?

I think he is saying that it is just a Crossfire on a single card solution, and not a single card solution like the Voodoo 5500. You get all of the benefits but also all of the drawbacks of crossfire with this solution. I think he was hoping that the crossfire would be done internally on the card, like it was on the Voodoo 5500, so the drawbacks of crossfire wouldn't be an issue.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
It would be amazing if they could do what they did with the Voodoo 5500, but things are a lot more complex nowadays (the actual image being processed is a lot more complex, and each GPU would require access to the other's memory, wouldn't it?... correct me if I'm wrong), so I doubt that kind of thing will happen anytime soon.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Martimus
Originally posted by: waffleironhead
Originally posted by: taltamir
Originally posted by: waffleironhead
Originally posted by: taltamir
oh its a pcie splitter? then that sucks... hardcore suckage. Forget it than, i aint buying.

anyways, i forgot to make my joke... "If UFOs taught me anything, its that if the pictures are blurry than it doesn't exist"...

Why would that be a bad thing?

half the ram is wasted, the drivers have to be specially optimized, and many games don't work with it or get sub optimal results... its a software workaround for something that should have been done or a hardware level. And it sucks.
on average on a NEW game (one that actually needs multiple GPUs) you can expect 0-25% benefit over a single card (or in this case, a single chip). By the time you already finished playing with it will rise to about 50% benefit. If it is a super popular game they might focus on it some more and get it up to 70% or so...

A proper hardware implementation should have close to doubled the speed though. (think RAID0 as an example of how to properly do something like that)

So you are saying that all of these things will happan because the card will split its output on the pcie slot to x8/x8? That is what I was asking by quoteing your post. We all(ok maybe not all) know the benifits and drawbacks of sli/crossfire. But I was wondering why you thought that the card having a chip on it to split the output on the pcie slot equally at 8x/8x would be a bad thing. Is this worse than 2 seperate cards on full x16 lanes? Do the 3870 chips actually saturate the entire bandwidth of the x16 lane?

I think he is saying that it is just a Crossfire on a single card solution, and not a single card solution like the Voodoo 5500. You get all of the benefits but also all of the drawbacks of crossfire with this solution. I think he was hoping that the crossfire would be done internally on the card, like it was on the Voodoo 5500, so the drawbacks of crossfire wouldn't be an issue.

Exactly.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Ackmed
Originally posted by: chizow
Originally posted by: Ackmed
Originally posted by: chizow
Originally posted by: Ackmed
$300 is not mainstream, not even close. For a "hardcore gamer", perhaps. The Steam survey shows that cards $150 and under are more popular. The 8800 series holds the top spot, and then followed by the 7600, 6600, 8600, 5200, 9600, and 7300 cards. All well below $300. Even a lot of the 8800's are below $300. The fact is, most people do not spend $300 on a video card.

That's a completely flawed perspective. If everyone in the steam survey bought a card when the 8800 owners bought their cards you'd have a point. Survey doesn't show anything really, but basing mainstream price using today's prices on old parts isn't accurate for sure. A more accurate analysis would involve finding the % of cards that retailed above $300 when NEW and drawing a conclusion from that.

It is when the cards I listed, were never above $300. The 7600, 6600, 8600, 5200, 9600, and 7300 cards were all budget cards from the get-go. These are the top six out of seven series of cards. And even some 8800's cards were/are under $300.

Saying that a 7800GTX is a budget card, when listed in that survey wouldnt be accurate. Because as you pointed out, it may have been bought at its $500 MSRP, and is now much less than that used.

Again, survey means nothing without total %s. Just the mere fact the 8800 is #1 should indicate the market for $300+ cards is higher than you might think when it comes to people who actually game. Again, Top 7 spots making up what %? If those 7 make up 50% and 25-30% of the rest are all former high-end cards that originally cost $300 or more than that's still a significant portion of total gamers who spent $300 or more at one point or another. I don't think that's too unrealistic considering there's no 6800, 7800, 7900, 9800, x800, x1800, x1900, x2900 or HD3800 in the top 7.

Its certainly possible that "hardcore" gamers who previously owned high-end migrated to 8800s in the past year, but that only shows that hardcore gamers tend to upgrade more often and spend more on hardware than the people buying or holding onto budget cards.

You still dont get it.

For one thing, a lot of 8800 cards, are under $300. Secondly, the other six cards have over 30% of the share. While the 8800 cards have just under 10%. Not to mention all the other cards that were never over $300. And the biggest share (15%) is "unknown". Which means they are really on the low end. Feel free to add up all the cards that were ever over $300, and compare them to the cards under $300, and you will see that the vast majority of the cards on the survey, are budget cards, that were never over $300.

The simple fact is, most PC gamers do not have a $300+ card in their system. I dont know why you just cant acknowledge this, and move on. Its actually pretty common knowledge.. and Im done talking about it. You either get it, or you dont.

1 NVIDIA GeForce 8800 69,356 9.25%
8 NVIDIA GeForce 7900 24,234 3.23%
9 NVIDIA GeForce 6800 22,079 2.95%
10 ATI Radeon X1950 21,951 2.93%
11 NVIDIA GeForce 7800 20,145 2.69%
13 ATI Radeon X800 19,074 2.54%
17 NVIDIA GeForce 7950 12,509 1.67%
Total----> 25.26%

2 NVIDIA GeForce 7600 49,804 6.64%
3 NVIDIA GeForce 6600 39,218 5.23%
4 NVIDIA GeForce 8600 32,544 4.34%
5 NVIDIA GeForce FX 5200 28,034 3.74%
6 ATI Radeon 9600 25,381 3.39%
7 NVIDIA GeForce 7300 24,436 3.26%

Total-----> 26.60%

The 320MB 8800GTS MSRP'd for $299, with many OC'd versions retailing for more than that. Unless you're referring to the 8800GTs that certain people claim you can't get, and certainly not for less than $300 if you could even find one. /sarcasm.

But anyways, back to the numbers, which once again show that using the top 7 slots as an indication of what gamers actually buy is clearly flawed methodology, especially when the difference between 6th and 17th is 1-2%. This isn't an argument about whether ATI/NV care more about the mainstream and OEM boxes vs. the high-end/add-in card markets; this is about what people who actually consider themselves gamers buy for their rigs. And it clearly shows that there are just as many out there willing to spend on "high-end" cards as "mainstream" ones.

And no I don't think $300 is mainstream/mid-range, I still consider that upper mid-range with $200 being closer to mid-range.
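For what it's worth, the two totals quoted above can be re-checked directly from the listed shares. The grouping below simply follows the way the post splits the cards into "former high-end" and "budget from launch"; the percentages are the posted survey figures, nothing more:

```python
# Re-adding the Steam survey shares listed above, grouped as in the post.
former_high_end = {   # the post's "originally $300 or more" group
    "GeForce 8800": 9.25, "GeForce 7900": 3.23, "GeForce 6800": 2.95,
    "Radeon X1950": 2.93, "GeForce 7800": 2.69, "Radeon X800": 2.54,
    "GeForce 7950": 1.67,
}
budget_from_launch = {   # the post's "always under $300" group
    "GeForce 7600": 6.64, "GeForce 6600": 5.23, "GeForce 8600": 4.34,
    "GeForce FX 5200": 3.74, "Radeon 9600": 3.39, "GeForce 7300": 3.26,
}

print(f"former high-end:    {sum(former_high_end.values()):.2f}%")    # -> 25.26%
print(f"budget from launch: {sum(budget_from_launch.values()):.2f}%")  # -> 26.60%
```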
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
The Voodoo 5500 had 64MB of memory, but only 32MB per chip. The major difference is that it did all of the splitting on the card, and didn't require drivers and software to control how each chip would subdivide each task. I used the card for about 5 years, and it did well throughout, even though it lost driver support in less than one year (3dFX went out of business). I am surprised that AMD isn't making this a hardware solution, as this rumored one is more of a jury-rigged approach.
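To make the contrast concrete, here is a toy sketch of the two work-splitting schemes being compared in this thread. It is purely illustrative (the function names and sizes are made up): the Voodoo's on-card scheme interleaved scan lines of the same frame in hardware, while CrossFire/SLI drivers typically hand out whole frames (AFR) under per-game profiles:

```python
# Toy illustration only: how work gets divided in the two approaches.

def scanline_interleave(frame_height: int, num_chips: int) -> dict:
    """Voodoo-style on-card splitting: chip i renders every num_chips-th
    scan line of the SAME frame, invisibly to the game."""
    return {chip: list(range(chip, frame_height, num_chips))
            for chip in range(num_chips)}

def alternate_frame_rendering(num_frames: int, num_gpus: int) -> dict:
    """Driver-managed AFR (the usual CrossFire/SLI mode): GPU i renders
    every num_gpus-th frame in its entirety."""
    return {gpu: list(range(gpu, num_frames, num_gpus))
            for gpu in range(num_gpus)}

print(scanline_interleave(8, 2))        # {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
print(alternate_frame_rendering(8, 2))  # {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
```

The index lists come out the same; the difference is what they index: rows within one frame (no per-game driver work needed) versus whole frames (which is where the profile-dependent scaling discussed earlier comes from).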
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
This is just ATI's 7950GX2 card. I'd be surprised if it used anything more advanced than what Crossfire does now. ATI may be on their way to implementing a HW-based multi-GPU solution, but this isn't it.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I would buy ATI if they made hybrid Crossfire work with high-end cards...

Having your watt-hungry video card off (and using the onboard video) except when running a game sounds peachy to me.

Since this is their absolute last priority with hybrid CF and will take a while to implement, I would probably get a GF9.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Martimus
The Voodoo 5500 had 64MB of memory, but only 32MB per chip. The major difference is that it did all of the splitting on the card, and didn't require drivers and software to control how each chip would subdivide each task. I used the card for about 5 years, and it did well throughout, even though it lost driver support in less than one year (3dFX went out of business). I am surprised that AMD isn't making this a hardware solution, as this rumored solution is more of a jury rigged solution.
AMD can't make a hardware solution. We've been over this a couple of times now, but basically modern rendering is too complex to do things the old SLI way. You can't just subdivide scenes and get perfect performance scaling; there's too much other stuff going on that doesn't easily subdivide or can't be subdivided at all.