OK, this explains a lot. I was wondering yesterday, "Where in the heck did we come up with a Matrox fanboy?" Not fanboy, interested party. 😛
Originally posted by: xMax
OK. Fair enough. But as an engineer at Matrox...
Originally posted by: fisher
3dfx signed their own death certificate when they bought diamond and began making their own cards.
Originally posted by: Trente
Diamond was purchased by S3; It was STB that 3dfx had bought
Originally posted by: xMax
OK. Fair enough. But as an engineer at Matrox, and one who is off duty and having fun on a forum, which card would you recommend I get, the Parhelia APVe or the P650e 128?
You can scroll through this thread to see what I need from the cards.
I'm asking because logic alone isn't letting me figure out which card is the right one. They both have the same features that I need, but I'm thinking that maybe one of them may have better image quality, even in DVI mode. I suspect this because I saw a test showing analog signal quality on many cards, not just Matrox, and a Millennium P550 had a better signal than the Parhelia 512. And so, if the same applies to DVI signal quality, then maybe I should get the P650e 128.
What do you say, Santa?
Originally posted by: Wolfshanze
The repeated statements that "Matrox ruled back in the day" are totally dependent on your definition of "back in the day".
I've been around for a long while, and computing and gaming since the C64 days.
For a while, Matrox did rule video... pre-3dfx/pre-Voodoo 1.
In other words, Matrox hasn't ruled since roughly 1994 or so (give or take a year).
Ever since Rendition, 3dfx in particular, and the rise of NVidia (and, much later, a quality ATI product [I remember when ATI = junk]), Matrox has always been an "also-ran" video card maker, and that lasts to this day.
Yes, Matrox ruled in the 2D-only era, and they still pride themselves on 2D output, but they never were, and still aren't, competing in the 3D market with the big boys (3dfx, NVidia & ATI). 3dfx folded mostly from poor capital investment and partly from market strategy. The decision to buy STB and produce their own cards, combined with a HUGE "money is no object" ad campaign, both backfired and sent 3dfx to its grave.
Matrox has been smarter with its money, but hasn't really been putting out any cards that compete (outside the 2D-only niche) since the early 90s.
I am an engineer, but I don't work in graphics, so I'm not familiar with our graphics products.
Originally posted by: xMax
Santa,
So what if you work in an office; you said you were an engineer.
But it doesn't matter anyway. I'm getting the APVe.
Originally posted by: xMax
And why did they not make the decision to go public, when there is so much glory and glamour and fame and money that comes out of going public and becoming one of the leading companies? And I know Matrox had the experience and talent to become a leader, if not 'the leader'.
Originally posted by: SickBeast
Correct me if I'm wrong, but didn't the Matrox G400 "rule" in its day? It was faster than anything available at the time PLUS it had full workstation drivers PLUS it had the best 2D output on the market. That happened not all that long ago, and definitely well after 1994.
I can't remember if the G400 was king in the TNT2 era or the GeForce 2 era.
I dunno about Canada, but going public involves huge expense in the US, not to mention that you now have to put up with tons of minority shareholders who will sue you at the drop of a hat.
The G400 came after the TNT2, and then came the GeForce (I don't remember about 3D though; there weren't a lot of 3D games at the time).
Originally posted by: SickBeast
Correct me if I'm wrong, but didn't the Matrox G400 "rule" in its day? It was faster than anything available at the time PLUS it had full workstation drivers PLUS it had the best 2D output on the market. That happened not all that long ago, and definitely well after 1994.
I can't remember if the G400 was king in the TNT2 era or the GeForce 2 era.
Originally posted by: xMax
It's a sad world we live in. So brutal.
So what's going to happen to all these other companies as ATI and Nvidia just keep rolling out so many up-to-date, highly compatible, high-performance products? Is it just inevitable doom, or what? I mean, once ATI/Nvidia become larger and larger, they will have enough resources to excel at what these leftover companies still excel in, like Matrox and their 2D image quality. Although of all those companies, I still think Matrox will be able to hold its ground with multi-display solutions. But that's just a guess on my part.
And here's the thing: has there ever been a two-company monopoly? Because if one of these companies, either ATI or Nvidia, starts to take over the other, then we get a monopoly as the other companies just get the bread crumbs. And if that happens, then whichever company gets the monopoly, whether ATI or Nvidia, will eventually be dismantled.
In my view, I think ATI and Nvidia will probably make sure they both stay on top. I honestly believe that's the way of modern capitalism. Because if there are two companies, then they can't be broken apart.
I'm almost certain that Intel made sure AMD caught up to it so that it didn't become a monopoly. Of course I could be wrong, and I don't think anybody could prove this theory right or wrong. But it could make sense.
Poor Matrox. But hey, it's all about survival of the fittest, and that's nature's way, and nobody can argue with that.
Most reviews of the G400 were done with very early drivers that performed poorly. A few months after it came out, Matrox brought out a driver that made it the fastest card out there until the GeForce was released.
Originally posted by: BenSkywalker
Correct me if I'm wrong, but didn't the Matrox G400 "rule" in its day? It was faster than anything available at the time PLUS it had full workstation drivers PLUS it had the best 2D output on the market.
The G400 bested the TNT2 overall, but in terms of the workstation drivers, it took Matrox years to get what I would consider tolerable OpenGL drivers functioning on the G400. That is the reason why nVidia is still widely considered to have had the superior product: Matrox performed very poorly, in relative terms, in Q2- and Q3-powered games.
The G200's OpenGL ICD ended up being too little, too late, and Matrox may run into the same problem with the G400's TurboGL. The next generation of cards from NVIDIA, 3dfx, and S3 are due within the next few months and all promise more features and raw power. Thus, if you can, it is probably a good idea to wait a little while and see how things pan out in the next generation. At the very least, the price of the current generation will probably drop significantly when the new cards are finally available.
However, for anyone who needs to buy a card now, the TurboGL definitely brings the G400 back into contention as a great all-around 3D accelerator. Just remember that the TurboGL only works with Pentium III and Athlon CPUs. Without the TurboGL, the G400 is severely crippled, making a TNT2 or Voodoo3 a better choice for Pentium II, Celeron, or K6 users. Unfortunately for Matrox, that takes out a huge chunk of the market.
Originally posted by: xMax
Matrox used to rule back in the day. But now they have been cornered into a niche market that seems to be shrinking with each passing day. It seems almost inevitable that they will be out of business within the next 5 to 10 years. Of course I could be wrong. I hope I'm wrong. I really liked Matrox, and they are based out of Montreal, the city I live in.
So what happened? Clearly it's the 3D market that has crushed them. But they were making 3D cards, so why didn't they battle it out with ATI and Nvidia?
I'm assuming there are four reasons why they lost the battle:
1- They didn't have the money, and thus the resources.
2- They were too deep into the 2D market and couldn't battle it out in the 3D market.
3- 2D rendering was no longer a big deal for modern computers to handle.
4- DVI emerged and made all 2D analog image quality, their specialty, somewhat obsolete.
Maybe I don't know what I'm talking about, but I am interested in knowing why they went down.