Video Cards' Exorbitant Prices

Biggs

Diamond Member
Dec 18, 2000
D'y'all think Nvidia knew about 3dfx's imminent demise and realized they could raise their products' prices to a ludicrous degree, cuz no one else could even touch them in the realm of speed? This is just my theory, which should be taken with a grain of salt.
:)
 

pm

Elite Member Mobile Devices
Jan 25, 2000
Exorbitant is in the eye of the consumer. If people will buy $300 video cards, then nVidia, Matrox, and ATI will try to sell $300 cards. Personally, my threshold is about $100-$125, so I'm more of a cheapie than the rest of this BBS. I can't picture spending $200 on a card simply to play games. I'm willing to play them at a lower resolution until the card drops below $100.

There's still plenty of competition. S3 leaving was a bit of a concern, and 3dfx dissolving is worrying, but if ATI and Matrox also fold, then we can start really worrying about a lack of competition in the video card space.

If prices really start to skyrocket and technology starts to stagnate due to a lack of competition, then someone else will step in. High margins create a "vacuum," and into that vacuum someone else will step. It may take a year or two for them to show up, but they will. The nice thing about video chipsets is that the threshold of entry is not ridiculously high (unlike, say, CPUs and hard drives). You can gather venture capital, license a bunch of software from Synopsys and Cadence, put together a team of engineers by promising an IPO, create a video chipset from scratch in a couple of years, and then have someone in Taiwan fab it for you.

3dfx and nVidia were nothing 5 years ago. S3 ruled the world. Before that it was Trident. Before that, Hercules. Companies fold and new companies take their place, bringing new innovation. This, to me, is the sign of a vibrant industry.
 

dszd0g

Golden Member
Jun 14, 2000
I don't think Trident ever really ruled. You're probably right that they had most of the market share, but they never really had a superior product. During that period I personally went with ATI, since I believed they had a better product than Trident.

I also believe that nVidia, with their very complex GPU, is raising the barrier to entry in this industry. Creating a chip that competes with the NV20 isn't something that's likely to happen in someone's garage. :)
 

Erasmus-X

Platinum Member
Oct 11, 1999
I've never really spent more than $200 on a video card. No need to. In fact, my next major upgrade (Omega II, under construction, under my rigs) will be equipped with a GeForce2 MX. Buying a top-end video card is like buying an exotic car: you never get a good price/performance ratio. Sure, it's lightning-fast, but you don't actually lose a significant amount of power going with the significantly cheaper MX. To top it off, video cards become obsolete every 6 months. So, unless you're loaded (which I'm certainly not), it's sort of pointless to spend $1k on just video every year.
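
A rough back-of-the-envelope sketch of that price/performance argument, in Python. The dollar figures and frame rates are ballpark assumptions for early-2001 cards, not measured benchmarks:

# Illustrative price/performance comparison. The prices and frame rates
# below are assumed ballpark figures, not benchmark results.
cards = {
    "GeForce2 Ultra": {"price_usd": 500, "fps": 120},  # assumed top-end card
    "GeForce2 MX": {"price_usd": 120, "fps": 65},       # assumed budget card
}

for name, card in cards.items():
    # Dollars spent per frame-per-second delivered
    cost_per_fps = card["price_usd"] / card["fps"]
    print(f"{name}: ${card['price_usd']} at ~{card['fps']} fps -> ${cost_per_fps:.2f} per fps")

With numbers anywhere in that neighborhood, the top-end card ends up costing roughly twice as much per frame per second as the MX - the exotic-car effect described above.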
 

pm

Elite Member Mobile Devices
Jan 25, 2000


<< I don't think Trident ever really ruled. You're probably right that they had most of the market share, but they never really had a superior product. During that period I personally went with ATI, since I believed they had a better product than Trident. >>

For a period around 1990, Trident was the graphics company with the highest revenue and the largest share of shipped product. "Ruled" may not be the right term, but they were certainly the market-share leader. It may not have been the best product, but it was the most common one. Same with S3 around 1995.

<< I also believe that nVidia, with their very complex GPU, is raising the barrier to entry in this industry. Creating a chip that competes with the NV20 isn't something that's likely to happen in someone's garage. >>

I didn't say it was easy - just that it's not as hard as it is for other devices in the industry. Take Flash, for example: no one is going to build a company from the ground up that makes Flash chips. They are difficult to manufacture, they require vast amounts of capital investment, etc. Flash is the kind of industry a start-up could never really break into.

nVidia uses CAD software to create their GPUs - they've issued a couple of press releases on the subject in the past - and so does ATI. Yeah, it's a huge advantage to have a team of experienced engineers who have done this before, but it really doesn't take a massive team to create a GPU with CAD software. It's not like a CPU, where the bulk of the transistors are hand-placed. 20 million transistors or not, the CAD software created most (if not nearly all) of them for you. You write RTL code, floorplan the design, and essentially hit the "create chip" button; once you've killed all of the errors the tools report, you ship the design off to your Taiwanese foundry. OK, it's not quite that easy, but the basic idea is there.

Of course, this whole thing hinges on having some new, innovative idea that lets you blow away the competition (similar to 3dfx's original idea, which enabled them to briefly beat well-entrenched companies like S3, ATI, and Matrox). But assuming you have a new way to do GPUs - which would be the reason you created the start-up in the first place - there's no reason a small start-up couldn't create a competitive GPU.
 

dszd0g

Golden Member
Jun 14, 2000
Yeah, I agreed that Trident had the market share; I just didn't think "ruled" was the right term for it - not the same way Hercules, S3, 3dfx, or nVidia have ruled. During each of those periods, almost everyone I knew was using a card based on that company's chip. Almost every long-time computer geek I know has owned a Hercules, an S3, a 3dfx, and an nVidia. The only Trident chip I've ever used has been in my laptop. During the period you referred to as Trident ruling, almost all pre-built systems used Trident chips, but almost all self-built systems used something else, because the Trident chips sucked for games: they were slow and didn't have enough memory, or memory that was fast enough.

I just think they gained market share through marketing and contracts with PC vendors rather than by having a superior product, which has been the case for the other companies you mentioned. I think at least ATI and Matrox had superior products during that period, possibly others I am forgetting.

I thought I should clarify for those who read these threads and either don't remember that period or entered the PC market after that time.
 

weinir

Member
Jan 1, 2001
It's doubtful that 3dfx's demise will really kill the market. Prices are going down, and always will. Plus, ATI has finally stepped up to the plate and decided to challenge nVidia with the Radeon, and Matrox is always looming somewhere in the background. The doomsday scenario of ATI and Matrox BOTH folding is all but impossible. Matrox, maybe - their future looks shaky if they don't roll the G800 out soon. ATI... naw. They've got a good hold on the mobile GPU market, and the onboard video market is pretty much theirs too. As long as there are people out there who are unwilling to pay $150 or more for a vid card (like me), there will always be competition.

And the prices are not really exorbitant. If you remember, a couple of years ago a Voodoo2 cost the same as a GeForce2 Ultra does now. Top-of-the-line cards always cost that much, because they are the best available when released. Simple economics.