Originally posted by: BenSkywalker
Your point isn't being understood because it makes no sense whatsoever - what if the basis of capitalism one day falls simply due to the will of nVidia or ATi? Not likely.
My point was that price/performance, for the non-enthusiast gaming market, was as important as, or more important than, raw performance alone. I don't know what the heck you are rambling about with the fall of capitalism due to ATI or NV.
Originally posted by: BenSkywalker
Uhm, the buyers of graphics hardware, based on the required demands of the software developers (game developers).
What game today won't run on a Ti4200? You need to be able to list those first to see if that line of discussion has any merit whatsoever.
Anything that requires DX9-level support in hardware won't run. Of course, most commercial games today also implement a DX8 path, just so that they won't exclude too many people from their potential customer base, but that won't always be the case. Whether the developer chooses to implement alternate code paths can largely depend on developer resources, the similarity between different API levels, and the difficulty, expense, and relative sizes of the installed base of each generation of hardware. (For example, how many games coded today make the assumption that gaming cards support hardware T&L? I would assume all of them. Soon, that will be true for other features as well, and eventually prev-gen cards, no matter what their performance level, become unsupported by current games because of their lack of certain features.)
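To make the code-path point concrete, here's a minimal sketch of the kind of startup check involved - hypothetical code on my part, not taken from any actual game, though GetDeviceCaps and the caps bits are the real Direct3D 9 mechanism:

```cpp
// Purely illustrative sketch - the function name and return values are
// my own invention; the caps queries are standard Direct3D 9.
#include <d3d9.h>

// Decide which render path the installed card gets, based on its
// reported feature level rather than its raw speed.
int ChooseRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return -1; // no usable hardware device at all

    // Games of this era simply assume hardware T&L is present...
    if (!(caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT))
        return -1; // ...so a card without it is already unsupported

    // DX9-class path: needs at least SM2.0 pixel shaders
    // (Radeon 9500+, GeForce FX/6x00).
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return 2;

    // DX8-class fallback: SM1.1 (GeForce3/4 Ti, Radeon 8500).
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return 1;

    return 0; // fixed-function only (GF2MX class) - the path that gets
              // dropped once the developer decides it isn't worth maintaining
}
```

Every extra branch in a check like that is a separate code path the developer has to write, test, and feed with art assets - which is exactly why the fallbacks disappear over time.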
Doom3 was a well-documented exception to the rule - it was coded so that it would even run on a GF2MX, although doing so made the engine design far more CPU-heavy than other engines that put more of the performance "weight" onto the video card itself (FarCry, for one example). It can also affect what the artists are allowed to create with the art assets - if the game is designed to be playable on the least-common-denominator feature-set in terms of hardware, then the artists have to keep that in mind, and cannot be as free with their usage of high-end features. (At least not without expanding the art asset budget and creating two sets of assets. So the issue of card feature levels can figure into more than just the technical side of a game's development costs.)
Originally posted by: BenSkywalker
Again, this reinforces that you didn't even get my original point - it was centered on the price/performance curve argument,
Which is precisely my end of the discussion. GPU price/performance factors are significantly in favor of the consumer when compared to any other element of PC technology.
Well, that comparison with every other area of general PC hardware is somewhat outside the context of this discussion, but I did bring up the question of CPU performance relative to the demands of the software running on the system, so that much at least is fair game.
But I personally find it difficult to believe that a multi-GPU or multi-card solution could possibly be more price/performance efficient than a single-GPU/single-card solution with the same level of capability.
I don't deny that "the market will eventually find the more efficient solution" - in fact, I do indeed feel that was one of the major strikes that caused 3Dfx's demise in the market. And NV is now falling into the same trap. The "curse of 3Dfx" idea might well have merit after all.
Originally posted by: BenSkywalker
SLI makes this even more pronounced, giving a price/performance level at the top end that is completely out of the question in any other area of PC hardware.
Again, you keep holding on to your "top end" argument, when I was arguing along the price/performance curve. We're not even discussing the same curve here, never mind the same area along it! (Which is why I suggested that you seemed to fail to understand my original statement referencing price/performance.)
Originally posted by: BenSkywalker
And you admit that SLI systems double up NV's profit margins - how are you so certain that NV doesn't plan for SLI to be a new, permanent, and required feature?
Because capitalism works.
So you are suggesting that the market will indeed eliminate SLI solutions eventually? (Which logically follows, since you are apparently arguing against their becoming a "permanent and required feature".) I guess that would mean you're agreeing with me then, or at least hopeful of the same outcome - that SLI isn't a useful long-term solution, at least for the mid-range market.
Originally posted by: BenSkywalker
We've already witnessed that the vast majority of current graphics-card developments, in terms of products being sold on the market, are almost entirely at the high end, with a few token low-end offerings.
This is some of the most ignorant drivel I have seen out of you. If you honestly think that, then you don't know anything at all about the graphics card market. The high-end parts still constitute a minuscule portion of the total market, Intel is still the largest player in the graphics market, and nV's x200-series products (5200, 6200) significantly outsell both their 6600 and 6800 series of parts combined. Honestly, if you think the high-end parts sell remotely close to their low-end counterparts, you don't have a clue about how the graphics industry works. We aren't talking a little bit here either; these are orders-of-magnitude issues.
Obviously you have difficulty following the context of the discussion here. In terms of sheer sales volume, of course the low-end/budget integrated graphics solutions sell in far greater numbers. I was talking about the rate of development, meaning the number of various models being introduced. I didn't even bring up sales volumes, which had nothing to do with my point.
(I will give you a point though, Ben - you are a master of the "straw man" argument, aren't you? Introducing something that I never brought up, and then seemingly shooting it down, as if I were the one that mentioned it. No dice for you this time, though.)
Additionally, the context of the discussion here was in terms of gaming cards. Things like FX 5200s and Intel integrated graphics are not even remotely usable as gaming cards, and so were not part of the context of the discussion contrasting low-end gaming cards with the mid- and higher-end ones. The variety of different models is clearly greatest near the high end, almost non-existent currently at the mid-range, and then there are the low-end cards - still not a lot of variety there.
Originally posted by: BenSkywalker
There are almost *no* good new mid-range gaming card offerings. Hence the continued longevity and usefulness of GF4 Ti 4x00 and Radeon 9500/9700/9800 cards. Where are the mid-range cards, at mid-range prices, based on newer chipsets? Where are they? The GF 6600 is about the only one out there.
The same evil company that brought SLI is the one making the current worthwhile mid-range solution. I honestly don't get what you are talking about. The 6600GT launched at $200, 50% less than the 9700/9800, and was reasonably faster than the prior gen's highest-end part. How is this not a good thing on the high end? The 6200 is by far the best part we have seen for less than $100, again easily destroying the previous baseline offerings. How is this not significant improvement?
Note the highlighted word from your statement - "the". The simple fact is that there is indeed a current lack of choices among gaming cards in the mid-range market. In a supposedly highly-competitive market, why is this? And again, you bring up the high end; I'm talking about the mid-range (primarily), and price/performance comparisons vis-à-vis SLI systems and single-card/GPU solutions.
The 6200 (TurboCache) is an interesting development, and I don't yet know enough about it to comment. What market segment is it aimed at - low-end/non-gaming (desktop use), low-end gaming, mid-range gaming? (I know it's nowhere near high-end gaming.) Considering that it seems primarily designed to cut costs by reducing the amount of memory needed onboard, I would probably rate it low-end. Does it give acceptable frame-rates and features on today's games? Is it faster than a Ti 4200/4400 card, or a R9700/9800? If not, then it must be a low-end card, since those prev-gen high-end cards only offer what I would consider mid-range gaming performance today. Which still leaves the 6600GT sitting alone as the primary viable mid-range gaming card being sold today with an up-to-date feature-set.
Originally posted by: BenSkywalker
The fact that the "best" mid-range cards today are actually the high-end cards of yesteryear should tell you something about the rate of recent development in that market segment.
They aren't. The 6600GT is the best mid-range card (assuming you are talking enthusiast). Regardless, up until quite recently, mid-range cards had always been the previous gen's high-end offerings with essentially no changes.
That was part of my point, and the reason that I put "best" in quotes - the mfgs have essentially stopped developing chips aimed at the mid-range gaming market, instead seemingly focusing only on the high end and letting prev-gen cards trickle down to the mid-range. That might be an acceptable market solution purely in terms of performance, but when new APIs get released (SM3.0 being a big one), and new games are developed to support, or more likely require, those features, that renders those prev-gen high-end cards (now considered mid-range for gaming performance) effectively obsolete and useless. Thus the need for cards specifically designed for the mid-range segment in terms of performance, but including the newest features, in order to make them viable for playing the newest games.
The fact that the 6600GT is essentially the "only" de-facto choice for a mid-range gaming card with the newest (SM3.0) feature-set, in a market as competitive as the GPU market, should tell you something right there about the seeming lack of development going on in the mid-range gaming segment.
I guess what I'm trying to say is that prev-gen high-end cards may be performance-competitive as suitable mid-range gaming cards, but they aren't necessarily going to be feature-competitive, especially at those points where the major APIs change. Thus the need for continued development of a current-gen (in terms of features), mid-range (in terms of performance) gaming card. (Unless the mfgs want to cut their R&D costs and effectively eliminate the mid-range gaming market segment.)
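As a sketch of that performance-vs-features distinction (again hypothetical code on my part; only the caps fields and version macros are real Direct3D 9), a hard SM3.0 requirement reduces to a check like this, which no amount of raw performance can pass:

```cpp
#include <d3d9.h>

// Hypothetical hard gate of the kind described above: once a game
// *requires* SM3.0, a fast-but-older SM2.0 card (9700/9800) fails this
// check no matter how many frames per second it could otherwise push.
bool MeetsMinimumFeatureLevel(const D3DCAPS9& caps)
{
    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0)
        && caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}
```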
Originally posted by: BenSkywalker
I was talking about the time when there effectively were no consumer-level 3D cards available, and the only (and, relatively speaking, expensive) hardware-accelerated solution was a workstation-class graphics card.
There were no hardware-accelerated games at that time.
Although I admit my example of OpenGL gaming on workstation-class cards was a bit arbitrary, I believe that you are wrong there: SGI workstations had several games available to play, although I don't believe any of them were sold commercially as separate software. I'll have to dig up some more on that; I'm certain that I've seen something about it. I believe that Spectre and NetTrek evolved from those. I'll concede that my example was ill-conceived though, in relation to the current commercial PC games market.
Originally posted by: BenSkywalker
Some of that work involves deals cut between those hardware mfgs and the game developers, with an eye towards increasing the market share of those mfgs. Or did you think that the "The Way It's Meant To Be Played" logos/animations were just added to games as a running gag?
Like TR:AoD? Please, just because they work on promotional deals in no way whatsoever means that technical limitations are going to be artificially put in place (well, maybe TwinkieMan).
Who the hell is "TwinkieMan"? Are you referring to Gabe Newell? (Interestingly, I have a friend that looks just like him - I had to do a double-take when I was reading some of the HL2 review material, to make sure that my friend wasn't actually the author. Scary.)
But you are somehow steadfastly suggesting that game devs don't often specifically optimize some features of their games for a particular mfg's hardware? LOL.
Originally posted by: BenSkywalker
There's nothing inherent technically that prevents multiple GPUs from being put onto a card that has an AGP system interface. (Whether it is cost-effective to develop the necessary bridge silicon, compared to the volume of multi-GPU cards that would utilize it, is another question entirely, and is largely the reason that such solutions have been rare. In fact, weren't the 3Dfx Voodoo5 (VSA-100) boards based on just such a configuration?)
Yes, there are technical limitations, which is why the bridge chip is needed. Even then, no multi-rasterizer chip ever worked properly in an AGP slot using an AGP mode under WinNT-based OSes.
Ahh, now the caveat appears. That's more a limitation of NT's driver architecture than anything else, certainly not any inherent limitation on the hardware end of things. IIRC, ATI's official excuse for why NT drivers for their Rage Fury MAXX card were never released was that NT's architecture somehow prohibited them from implementing them, but personally I think that was just an excuse. If you recall, ATI's NT driver offerings were very immature across their entire product line at the time (AIW cards only functioning as video cards, no multimedia features, etc.). So ATI probably didn't feel it would be cost-effective to put their driver engineering resources into producing an NT driver for a card that would have been obsolete by the time the drivers were done anyway. (Since they were hard at work on the Radeon in-house at the time.)
I'm pretty certain that there were high-end workstation AGP solutions with multiple GPU chips at the time. The existence of such things (for the workstation market) is what motivated Intel to design the AGP Pro specification, to support the power and cooling requirements estimated for such cards. They just largely didn't exist in the consumer market, because of the inherently poor price/performance ratio of those sorts of solutions.