GeForce2 MX/MX 400 with a WEGA 32"

DeadSeaSquirrels

Senior member
Jul 30, 2001
515
0
0
So I just got a Dell with a GeForce2 MX (400?). The video card is actually made specifically for Dells by Nvidia, but it is supposed to be basically the MX 400. I am connecting the video card to my 32" WEGA, and I was just wondering what the max resolution to the TV should be. I can only get it to work at 800x600; I was wondering if I can get a higher resolution or if that is the limitation of the television.


Thanks for the help everybody.

Let me also mention that at 800x600, there is definitely something wrong with the output, because the picture is all fuzzy and messed up. In fact, the output to the television is even worse than the output I had with my old Diamond Viper II video card. Is this just a driver thing, and where do I get a good driver? I mean, this computer is brand new and presumably (by me) has the newest drivers available.
 

mee987

Senior member
Jan 23, 2002
773
0
0
Your max resolution for TV-out on that card is 800x600.

Does the card look like it is running in 256-color mode? If so, there is nothing you can do. I have a GF2 GTS and mine looks crappy like that, and I have tried dozens of different things. I have posted on these forums three times about it, and I always get like 15 responses, but nothing ever fixes it.
 

DeadSeaSquirrels

Senior member
Jul 30, 2001
515
0
0
No, it is at 32-bit and the colors look fine, but the graphics are not smooth. I mean, it is not smooth, worse than how it looked with my Diamond Viper II (from 6 years ago). I don't know how people do it. I mean, there is a TV-out (S-Video) output on the video card; why did they put it there if the image going into the TV is going to be unbearably bad? Can anybody make sense out of this... people do this, right?
 

Viper GTS

Lifer
Oct 13, 1999
38,107
433
136


<< No, it is at 32-bit and the colors look fine, but the graphics are not smooth. I mean, it is not smooth, worse than how it looked with my Diamond Viper II (from 6 years ago). I don't know how people do it. I mean, there is a TV-out (S-Video) output on the video card; why did they put it there if the image going into the TV is going to be unbearably bad? Can anybody make sense out of this... people do this, right? >>



It's essentially useless for anything but gaming, and even then only at low resolutions.

Viper GTS
 

MasterHoss

Platinum Member
Apr 25, 2001
2,323
0
0
I'm not sure if this is wise or not, but why not run it at the TV's resolution of 640x480?
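For context, a rough back-of-the-envelope sketch of why 640x480 tends to look cleanest on a standard-definition set (assuming an NTSC TV; the nominal visible line count is about 480, though actual overscan varies per set):

```python
# Illustration only: compare desktop resolutions against the nominal
# visible scanline count of an NTSC television (~480 lines).
NTSC_VISIBLE_LINES = 480

for width, height in [(640, 480), (800, 600)]:
    scale = NTSC_VISIBLE_LINES / height
    print(f"{width}x{height}: vertical scale factor {scale:.2f}")

# 640x480 gives a scale factor of 1.00 (one desktop line per scanline),
# while 800x600 gives 0.80 (every 5 desktop lines squeezed into 4
# scanlines), which is one reason the 800x600 desktop looks fuzzy.
```

So at 640x480 the card can map desktop lines to scanlines roughly 1:1, while at 800x600 it has to downscale and filter the image before encoding it to S-Video.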
 

mee987

Senior member
Jan 23, 2002
773
0
0


<< No, it is at 32-bit and the colors look fine, but the graphics are not smooth >>



What do you mean by "not smooth"? Are there rough transitions between colors where there should be more of a smooth gradient? This is how my card is; I have it set to 32-bit color, but there are these rough edges where the colors should blend more smoothly.

If the picture is just blurry, that's how TVs are, and you can't do anything about it.
 

DeadSeaSquirrels

Senior member
Jul 30, 2001
515
0
0
When I say the pictures are blurry, I mean they are pixelated, so not blurry exactly. The reason I think there must be some sort of solution is that when I run the computer off of a Viper II (which is a crappy old video card), the picture comes out decent enough that I would be satisfied. But now, with this state-of-the-art (as of 2 years ago) Nvidia card, the graphics are so pixelated that I think something must be wrong.