Originally posted by: BDawg
My point was all of these people doubting the NV30 were likely the same ones building up the R300 before any specs were published. If I had to pick a side, I'd say history goes with nVidia.
Also, I fixed that quote I had attributed to you. I wholeheartedly apologize for that.

While I understand your point that history does side with nVidia, you can't just assume that because nVidia says the NV30 will be faster, it actually will be. For example, nVidia was hyping up its Detonator 40 drivers not long ago (in the original beta stage), claiming they would increase performance by nearly 30%, when they actually only gave about a 5%-10% increase.
Originally posted by: Taz4158
It took nVidia 6 months extra and a huge core increase before they could even begin to offer a product faster than R300, and we still haven't seen benches. I knew the 9700 was "all that" because I was beta testing it. Until there are benches and an actual product, it's all speculation, be you a fan of ATI or a fan of nVidia, and I frankly couldn't care less since I'll have an NV30 to go along with my 9700, so I haven't a clue why you got your panties in a knot.

First off, since you apparently were omniscient and knew that the V5 6k wasn't going to work, I can say that I too knew what to expect of a certain video product before it hit the market. "All that"? I don't think the R300 surprised anyone, and it certainly didn't surprise me. ATI had to release a card with that kind of performance to prove anything to the market. They couldn't release a card that merely matched the performance of the Ti4600; they had to beat it considerably to make their product worth paying for over nVidia's. Otherwise, they would have had to go back to the drawing board and try to redesign it.

New cards are always sold at a premium to cover R&D costs, so if your product only slightly beat what was already out, you'd have to either offer it at a better price (lowering margins to dangerous levels) or scrap it in the hopes of quickly improving on its design (lest you go the way of 3dfx). I expect every product launch to show roughly a 20%-30% increase over the current fastest card on the market; that's what has been happening for a few years now. The R300 handily beat the GF4 in aniso performance, but that wasn't unexpected either. I didn't have to beta test the product to know that.
Originally posted by: Taz4158
I also wouldn't count on the 4X speed increase until you have seen a LOT of benchmarks; a remark like that can quickly come back to haunt you.

In response to both remarks about waiting to see benchmarks: I think my post (had you read the whole thing) made it clear that I don't accept anything a manufacturer says as fact until I see benches. However, you were basing your original criticisms on the 50% number in nVidia's press release, so I got my 4x number from the same place. That said, I don't think nVidia would claim what it does without at least some benchmark to back it up. I know how marketers work, and they won't outright lie (for fear of lawsuits). Instead, they'll find some obscure benchmark in some obscure game that proves their point. My guess is that the 4x figure will be with aniso and AA enabled, since nVidia seems to have made a concerted effort to improve its card's performance in those two areas.

I'm not sure, exactly, how quoting a press release will come back to haunt me, but I'll take that chance.
Originally posted by: HendrixFan
I have never heard that you are "supposed to" have your first slot open for airflow. In fact, I don't know of any one person personally who has had such a configuration.

Actually, I have heard this before, though whether nVidia, ATI, or any other manufacturer actually recommends it, I don't know. I can also tell you that the airflow you would lose wouldn't make your card overheat, but it would make it run a little hotter than usual, which carries all the bad side effects of any other warm computer component.
More importantly, I never put a card into the first PCI slot in any system I build. The first PCI slot was notorious for causing problems with the AGP slot only a couple of years ago. Even though most of those problems (mostly IRQ sharing) have since been fixed, why not just avoid them entirely? With a network card integrated into almost every motherboard now, and decent integrated sound on many others, there is little reason why losing access to one PCI slot would affect anyone. I used to have a DVD decoder card, a sound card, a network card, and a video card plugged into my computer. The only other thing I could see anyone using would be a modem (I have broadband). Take away the DVD decoder card (decoding DVDs is a trivial job for any modern CPU) and the network card (for the reason already stated), and there are only three things anyone should ever need to plug into his or her motherboard. There are more than enough slots on any ATX motherboard to support the important cards, and I don't think nVidia is too concerned about its top-of-the-line card not fitting into micro-ATX or Flex-ATX systems; it would still fit in a Shuttle system anyway.
So who is left out of the market for these new two-slot AGP cards? Enthusiasts? Nope. Any enthusiast would be smart enough to get a board with at least 4 PCI slots (almost always 5), 1 AGP slot, and an integrated 10/100 NIC. That leaves room for your video card, your sound card, your modem (only a very small portion of the enthusiast market uses modems, I'd reckon), and maybe even a RAID card. Anyone needing two RAID cards should have been smart enough to get at least 5 PCI slots anyway. Home users? Most won't buy this kind of card, but even if they do, they won't need RAID cards and will probably have integrated sound as well. The only person I could see being let down would be a LAN gamer with a Shuttle box who uses a separate sound card, and most of them (I believe) would be willing to settle for a lesser sound solution in order to get bleeding-edge video performance.