
Does the GF3 Ti200 Support nVIEW or Dual Monitors?

apexi350z
Member · Joined Mar 14, 2002 · 131 posts
I have a PINE GeForce3 Ti200 video card with 64MB of RAM. It has an analog and a DVI connector. Can I use a DVI-to-analog converter and get dual monitor support with this card? Does the GF3 Ti200 have this feature? Thanks.
 

rbV5
Lifer · Joined Dec 10, 2000 · 12,632 posts
It has an analog and a DVI connector. Can I use a DVI-to-analog converter and get dual monitor support with this card?

Nope, it's a single-head card with connectors for VGA or DFP: one or the other, but not both. There are no dual-head GF3 cards.
 

AnAndAustin
Platinum Member · Joined Apr 15, 2002 · 2,112 posts
;) Just to second rbV5: GF3 cards don't have that ability, so you'd need to either get a cheap PCI card ($20ish) to use alongside your GF3 (but avoid another nVidia card), or else get a card with dual-display functionality. ATI Radeon 8500 cards ($90ish) have it and will give a small performance improvement over your GF3; a GF4 Ti4200 ($130-150) will give you a bigger boost and also keep the kind of fast AA and great-quality aniso you're already used to. If you do upgrade, it would be very wise to get a 128MB card. If you go Radeon, do double-check that the card does dual display, since simply having the ports doesn't mean a card can do it, especially if you don't buy a board made by ATI themselves; one way to double-check what Windows sees is sketched below.
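As an illustration only (my own sketch, not a vendor utility; plain Win32 C using the standard EnumDisplayDevices call), this lists the display devices and flags the ones actually attached to the desktop, which is what you'd want to see two of on a working dual-display setup:

/* List display devices and count the ones attached to the desktop.
   Two attached devices = dual display is actually active, not just
   two ports on the bracket. */
#define WINVER 0x0500           /* EnumDisplayDevices needs Win98/2000+ */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i, attached = 0;

    for (i = 0; ; i++) {
        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        if (!EnumDisplayDevicesA(NULL, i, &dd, 0))
            break;              /* no more adapters */
        printf("%lu: %s (%s)%s\n", (unsigned long)i,
               dd.DeviceName, dd.DeviceString,
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                   ? " [attached to desktop]" : "");
        if (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
            attached++;
    }
    printf("%lu device(s) attached to the desktop\n",
           (unsigned long)attached);
    return 0;
}

Compile with any Win32 compiler and link against user32.lib.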
 

jabongga
Member · Joined Aug 19, 2001 · 52 posts
I've asked the same question before. In short, you can't have an "extended desktop"; however, you can "mirror" your desktop.
 

rbV5
Lifer · Joined Dec 10, 2000 · 12,632 posts
however, you can "mirror" your desktop

I'm not so sure about that. I do recall someone saying they were somehow able to get it to clone the screen, but my understanding is that it isn't possible; that would be the only time I've heard of anyone doing that with any GF3 variant. IIRC, one of the biggest complaints about the TV-out implementation, other than poor IQ, is that you cannot clone the primary monitor to the TV: one or the other, not both. So my understanding is that the monitor outputs work the same way; when one is enabled, the other is disabled.

But is that really correct? Maybe someone with a GF3 (any model) could verify it for sure, one way or another? A programmatic check is sketched below.
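For what it's worth, a rough way to check from code (my own sketch, untested on a GF3; standard Win32 calls) is to count the monitors Windows itself reports. One caveat: this only settles the extended-desktop half of the question, since clone mode still looks like a single monitor to Windows, with both outputs carrying the same desktop.

/* Count the monitors Windows considers active and print their desktop
   coordinates. Extended desktop reports 2; clone mode still reports 1. */
#define WINVER 0x0500           /* EnumDisplayMonitors needs Win98/2000+ */
#include <windows.h>
#include <stdio.h>

static BOOL CALLBACK MonProc(HMONITOR mon, HDC hdc, LPRECT rc, LPARAM lp)
{
    printf("monitor at (%ld,%ld)-(%ld,%ld)\n",
           rc->left, rc->top, rc->right, rc->bottom);
    return TRUE;                /* keep enumerating */
}

int main(void)
{
    printf("active monitors: %d\n", GetSystemMetrics(SM_CMONITORS));
    EnumDisplayMonitors(NULL, NULL, MonProc, 0);
    return 0;
}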
 

AnAndAustin
Platinum Member · Joined Apr 15, 2002 · 2,112 posts
;) That's certainly my experience and understanding, rbV5. One reason I bypassed the GF3 and went directly to the GF4 Ti was the GF3's poor image quality, TV-out and dual display, and the matter of upgrading was not helped by ATI, due to the over-priced Radeons here in the UK. Having to choose between the TV and the monitor, and put up with quirky refresh rates, was simply not worth the cost, even for the great 3D performance, as I run a 10m (about 33') lead from my card to my large-screen TV. Don't get me wrong, GF3s are fine cards, but for me this was an important factor and I'm glad I waited it out.
 

rbV5
Lifer · Joined Dec 10, 2000 · 12,632 posts
Hey, cool, someone using a GF4 with TV-out... could you give me your general impression of the GF4's video out with regard to IQ and functionality? I use my TV-out quite extensively, and it's a prime consideration when I make a card purchase.

I use a 20' high-density S-Video cable with mine, and I see no real difference versus my 6' cable. I understand that quality is subjective... but we've all seen TV.

For my testing it's easy: I switch back and forth between the different inputs on my TV. I'll put the digital cable box and the Radeon tuner on the same channel, then simply switch back and forth. (You have to timeshift with the Radeon cards to display TV via TV-out, as the Theater Mode overlay doesn't support live video streams. It works very well with Catalyst; perhaps with the next generation that will be moot, using the shaders rather than the overlay to display video streams, I hope.) Using my family as subjects, we are about evenly split on preference. The Radeon gives a slightly less sharp picture, IMHO, but it's pretty close.

My other test involves the DVD and VHS versions of The Matrix. In order of highest to lowest quality:
standalone rented DVD player > Radeon DVD output via ATI DVD 7.7 > PS2 DVD player > VHS tape via Philips VCR.

The Radeon and the standalone are the closest, with the Radeon losing out on sharpness, comparatively speaking. The PS2's IQ and functionality are a bit disappointing to me: it's OK, but certainly not great, and the functionality is poor at best... fine for the kids. The VHS is a nice copy without much play on it (I got it as a gift after I already had the DVD).

Using Theater Mode with overscan enabled, I get a full picture, which is great for video. For gaming, though, I clone the primary with no overscan at 1024x768, so I get a little bar at the bottom of the set (overscan loses too much of the play screen), but the IQ is great after I crank up the color saturation; the NTSC conversion gives a bit of a poor man's FSAA. (A scriptable version of that mode switch is sketched below.)
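As an aside, that 1024x768 switch for the clone session can be scripted rather than clicked through the control panel; here's a minimal Win32 sketch (my own illustration, and the resolution is just the figure quoted above):

/* Flip the primary display to 1024x768, testing the mode first.
   Passing 0 as the flags applies it for this session only. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = 1024;
    dm.dmPelsHeight = 768;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    if (ChangeDisplaySettingsA(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
        printf("1024x768 is not supported by this driver/monitor\n");
        return 1;
    }
    ChangeDisplaySettingsA(&dm, 0);   /* apply, not saved to registry */
    return 0;
}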

Sorry about being wordy and a bit off-topic, but I haven't heard much about the GF4's TV-out.

Thanks
 

AnAndAustin
Platinum Member · Joined Apr 15, 2002 · 2,112 posts
;) No problem, rbV5.

:) Wow, that's a lot of info and I won't pretend I took it all in!

:D Well, the GF4 cards still rely on the same third-party TV-encoder chips as the GF3 cards, but at least there are advances in image quality and multi-display. The black border is there and pretty noticeable on my Philips 29" 4:3 PAL TV set, but you do forget about it quite quickly. Without tweaking, the picture is not sharp (but we are talking 1024x768 on a standard, if decently sized, TV), while the brightness, contrast and colours are very good, although I don't have another TV-out card to compare it with. When playing a high-quality DivX movie, it is noticeably better quality than a VHS recording, even a pre-recorded shop-bought tape... and that's before cleaning, tracking, and wear and tear are taken into account. Other than the border, the result is excellent, and using the TV-Tool v6.5 15-min demo there are loads of tweaking options; it also allows two overscan modes, which help a lot. Also bear in mind that I'm running a 10m+ standard RF lead down to my living room (lounge). I am impressed with the results, but am annoyed nVidia haven't tweaked the drivers to remove the border, or better yet standardised the TV-out.
 

rbV5
Lifer · Joined Dec 10, 2000 · 12,632 posts
Thanks. Yes, it seems it wouldn't be that hard to implement some user options as far as TV-out is concerned. It would be nice to have better control over the overscan, for instance; with the Radeon it's all or nothing, and I'd like some control over the amount. Overall it's very good, but it could be much better with a little effort.
 

AnAndAustin
Platinum Member · Joined Apr 15, 2002 · 2,112 posts
:eek: Yup, it's a shame these big manufacturers get so fixated on little more than pure 3D speed; they only seem to tweak their drivers for that, rather than thinking about TV-out etc. At least nVidia, with the GF4 cards, have caught up with ATI on image quality and dual display. Perhaps TV-out will be addressed with nVidia's new cards!