Reactor Critical thinks it has the specs down for the new Matrox Parhelia. OMG!!!

Piano Man

Diamond Member
Feb 5, 2000
3,370
0
76
Here are the specs. This is sweet. They also say May 14th is when the NDA expires.

- 4 Pipelines
- 4 TMUs per pipeline
- Partial DX9 Support
- Pixel/Vertex Shader 2.0
- Displacement Mapping
- 40-bit Color
- 4 Vertex Shader Pipelines
- 512-bit 2D Graphics core
- 256-bit bus
- 700 MHz DDR Memory (~ 23 GB/sec)
- Triple-Head
- AGP 8x support
- 80-90 Million transistors
- 350 MHz core clock

Link to Reactor Critical
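For anyone checking the math, the "~23 GB/sec" figure in the list above follows directly from the bus width and memory clock. A quick back-of-the-envelope sketch (assuming "700 MHz DDR" refers to the effective, post-DDR transfer rate):

```python
# Back-of-the-envelope peak memory bandwidth from the rumored specs:
# a 256-bit bus moves 32 bytes per transfer; "700 MHz DDR" is taken
# here to mean the effective transfer rate, not the base clock.
bus_width_bits = 256
effective_mhz = 700

bytes_per_transfer = bus_width_bits // 8               # 32 bytes
bandwidth_gb_s = bytes_per_transfer * effective_mhz * 1e6 / 1e9

print(f"{bandwidth_gb_s:.1f} GB/sec")  # lands right at the "~23 GB/sec" rumor
```

That works out to 22.4 GB/sec, which matches the rounded figure in the spec list.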
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Pretty nice, reminds me of the GF4 though, GF4 on crack is more like it.

This will be my next card ;)
 

Pocatello

Diamond Member
Oct 11, 1999
9,754
2
76
The one from 3Dlabs has partial DX9 support as well; which one will come out first, and at what price? ;)
Anand believes the 3Dlabs video card will be ready soon (less than 2 months?), though I'm not sure about a consumer version.
Nvidia should announce a new videocard for DX9 in August.
Still waiting for ATi to announce a GeForce 4 killer ;)
 

vss1980

Platinum Member
Feb 29, 2000
2,944
0
76
Hmmmm, 80-90 million transistors...... Thermaltake better ramp up its Crystal Orb production line.
Also, a 256-bit bus (I am assuming they mean from chip to memory) is gonna make the card's PCB a bit harder to make than others (for example, the GF4 Ti4200 is a 6-layer PCB, with the 4400/4600 boards being 8-layer boards - that puts them ahead of some motherboards in terms of PCB layers). Of course, it's Matrox, so they could just be building a 128-bit DDR bus; we'll see if and when the card is released.
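To put that 128-bit fallback scenario in numbers (same assumption as above, that "700 MHz DDR" is the effective transfer rate), halving the bus width halves the peak bandwidth:

```python
# Peak bandwidth for a given bus width and effective DDR transfer rate.
def peak_bandwidth_gb_s(bus_width_bits, effective_mhz):
    return (bus_width_bits // 8) * effective_mhz * 1e6 / 1e9

# The rumored 256-bit bus versus the cheaper-to-route 128-bit option:
for bits in (256, 128):
    print(f"{bits}-bit bus: {peak_bandwidth_gb_s(bits, 700):.1f} GB/sec")
```

So the difference between the rumored spec and a conservative 128-bit design is 22.4 vs 11.2 GB/sec - a big gap, which is why the bus width matters so much here.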
 

Athlon4all

Diamond Member
Jun 18, 2001
5,416
0
76
Those specs are impressive, but what I am more interested in is the May 14th NDA expiring; we should know whether Parhelia ROCKS or NOT on May 14. *Marks May 14 as a big day*
 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
Triple head. Why? The overlap between people who need more than two monitors and those who need awesome 3D is minuscule. Does that also mean the chip is going to have 3 integrated RAMDACs?

512-bit 2D graphics core. Why? What possible benefits could that bring?

If this is genuine, it's looking like a workstation only card in the £1000+ range so unless you have money to burn it's a big no-no.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,373
8,497
126


<< If this is genuine, it's looking like a workstation only card in the £1000+ range so unless you have money to burn it's a big no-no. >>

That's what 3Dlabs is targeting; Matrox is rumored to be targeting the high-end gamer with relatively inexpensive cards.
 

Windogg

Lifer
Oct 9, 1999
10,241
0
0
Damn, can't wait. My G400s and G450s are getting a little tired. Time for some new Matrox lovin.

There is a valid reason for triple head. Do not forget Matrox's target market, which is the business and workstation sector. Anyone who has worked in a financial institution that trades in stocks and bonds knows a trader can have 2, 3, or more monitors on their desk. Dual head was a revolution as it cut down the need for more hardware. The dual-DVI models were great as many companies moved toward LCDs to save space, power, and heat. Tri-head will be great for such businesses and set Matrox apart from other companies.

Windogg
 

robg1701

Senior member
Feb 12, 2000
560
0
0
Well, I could sure use a proper tri-head card to run my 3 monitors; I'm using 3 separate cards just now... and I ain't no financial guy, I'm just a guy who likes my desktop real estate ;)

Either way, I just hope they make something affordable, since it's more than likely a Ti4200 or Ti4400 will be my next card... though if Matrox released something in the same price range at only 85% of the speed, I'd still change my mind easily enough...
 

DeathByDuke

Member
Mar 30, 2002
141
0
0
This looks like what the G400 was to the GeForce 256: almost as fast, but with better quality and hardware support WELL ahead of anything then - Vibrant Color, Environment-Mapped Bump Mapping, etc. None of those got supported until the GeForce3/Radeon 256.
Three cheers to Matrox for innovation in the past and now the future (but nVidia won't make the same mistake twice, nor will ATi).
 

Czar

Lifer
Oct 9, 1999
28,510
0
0
not sure if this is true or not

http://forums.matroxusers.com/showthread.php?s=5d352b66f44f3515fb41f1b75c31b5eb&threadid=32725



<< The new card from Matrox has arrived at the company I work for, since we have to write software for the new card.

I don't know the card's name, and neither did my superiors - I heard Matrox hasn't named it yet.

The first thing I saw on the card was its HSF, which was similar in size to a CPU's, and we had to remove the card in the PCI slot immediately next to it to install the card properly. The card uses 128 MB of DDR SDRAM.

The card is tri-headed, and the company is actually developing software to enable the feature.

Yes, although the card is finished, the driver is not - we are getting driver updates from Matrox every day. So the card will have to wait to hit the market, at least until we are finished with the software.

And on the 3D speed... maybe I should find an opportunity to test it out secretly, but since the driver isn't really finished, the results would never be accurate.
>>

 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,373
8,497
126
<sorority girl>OMG OMG OMG OMG OMG</sorority girl>
 

Valinos

Banned
Jun 6, 2001
784
0
0
Guys, this is beautiful. 2002 is going to be the beginning of a new graphics war. :D Lots of innovation, lots of variety, and lots of price cuts

We got four contenders now. No longer is it just ATI and Nvidia. I really hope that Matrox focuses more on the gaming market and that 3DLabs brings out an affordable, and powerful 3d card for consumers by August. I want to upgrade my graphics card before Quakecon...and I don't want to spend $400.
 

rahvin

Elite Member
Oct 10, 1999
8,475
1
0
Nvidia is going to eat them and 3dlabs for lunch. NV30 is the first card to combine 3dfx and nvidia technology. All the Sage and Rampage research is finally going to be put to use combined with nvidia research along with nvidia management to bring it to market.
 

duragezic

Lifer
Oct 11, 1999
11,234
4
81
Well this always happens, ESPECIALLY with Matrox cards. The last one was supposed to be all awesome, but then it ended up not even being a gaming card. People were so pumped about the Rampage too a while ago. "It'll kill EVERYTHING"... Well, we know what happened to 3dfx, but even if they did release it, how do we know if it's really what those first specs said?

So I'm not gonna say anything like Matrox is gonna crush Nvidia yadda yadda until it actually comes out!
 

Athlon4all

Diamond Member
Jun 18, 2001
5,416
0
76


<< Well this always happens, ESPECIALLY with Matrox cards. The last one was supposed to be all awesome, but then it ended up not even being a gaming card. People were so pumped about the Rampage too a while ago. "It'll kill EVERYTHING"... Well, we know what happened to 3dfx, but even if they did release it, how do we know if it's really what those first specs said?

So I'm not gonna say anything like Matrox is gonna crush Nvidia yadda yadda until it actually comes out
>>

Exactly. I'm holding my breath, but I will definitely be excited if this super product is for real, and I'm hoping. I will continue to look forward to the unveiling of Parhelia.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
NVIDIA's NV30 is going to be EXTREMELY powerful. I think NVIDIA will lose the performance crown over this summer, but the eclipse will be the card to have this fall; it's gonna be a killer.
 

Caveman

Platinum Member
Nov 18, 1999
2,536
34
91
So can someone translate the specs into something useful for those who are a bit "behind"? How about a qualitative SWAG like "it should be about 3 times faster than a GF3 Ti 500" or something along those lines...

Anyone?
 

rahvin

Elite Member
Oct 10, 1999
8,475
1
0


<< People were so pumped about the Rampage too a while ago. "It'll kill EVERYTHING"... Well, we know what happened to 3dfx, but even if they did release it, how do we know if it's really what those first specs said? >>



"Also on Monday, Nvidia raised its financial outlook for the just-ended quarter, and Huang said he sees continued market share gains this year leading to more growth. Some of that will come from a new graphics chip slated to arrive in August.

The new chip will be manufactured on Taiwan Semiconductor Manufacturing Co.'s latest 0.13-micron manufacturing process, Huang said. Huang did not reveal the name or specific features of the chip, but did say it was a fundamentally new architecture from the GeForce 4 Titanium introduced earlier this year.

"It is the most important contribution we've made to the graphics industry since the founding of this company," Huang said, speaking at the Merrill Lynch Hardware Heaven Technology Conference. "

http://news.com.com/2100-1040-896850.html

"Did Nvidia's philosophy change with the purchase of 3dfx Interactive?
Not too much. We still want to be profitable and we still want to stay in business, so they haven't influenced us in that. What we did, though, was to mix the development teams up completely. I didn't want 3dfx people versus Nvidia people. I wanted to have us all learn from each other and make different products.

Both companies had products in development at the time, and we could have just picked up 3dfx's products and developed those. But instead I took the two teams and shuffled them around. I got the Nvidia people to argue for 3dfx products and the 3dfx people to argue for Nvidia, so they all had to learn the advantages of the competing products. We ended up changing the projects so much that they really weren't recognizable from before, and that was the goal; we wanted the best from both sides. Plus, they are all Nvidia people now."

http://news.com.com/2008-1082-894330.html

I'm not trying to turn this into an Nvidia vs. <whoever> thread, but this was big news, guys. These are Nvidia senior management: the CEO and the head scientist. In particular, Huang, the CEO, believes the NV30 will be the biggest contribution to the graphics industry ever made by Nvidia.