GF2 display not good on 19"

Dec 30, 2000
91
0
0
I have a Leadtek GF2 64mb. I just purchased a Samsung 900IFT 19" monitor. It seems the card has really poor image quality on the 19"--on my old 15" it was fine. The left side of the screen seems out of focus and the text is blurry.

I know it's not the monitor because I plugged in my old 3D card, an Intense 3D 100, and the display was beautiful on the 19".

Does this qualify for replacement from Leadtek?
 

toph99

Diamond Member
Aug 25, 2000
5,505
0
0
I noticed that too. I have a GF2MX in this computer on my Samsung 900NF, and the text isn't very sharp. As soon as I drop my Voodoo3 2000 in here, the display is really amazing :)
 

CromNogger

Senior member
Jan 26, 2001
849
0
0
My friend has a 32MB GF2 GTS and a 19" LG monitor. I noticed the text wasn't really sharp even at medium res, and the image looked washed out. I never noticed things like this on computers with other video cards, like some from 3dfx and Matrox.

But that same friend's computer gets 80-110fps in 1600x1200x32 with everything maxed in UT, so I guess there are disadvantages and advantages.

I would prefer to have sweet image quality in all circumstances, and get 100fps in only 1280x1024 instead of 1600x1200. :p
 

duragezic

Lifer
Oct 11, 1999
11,234
4
81
I know the GF2 2d quality is pretty bad but I didn't think it would look that bad.

I thought the Leadteks had decent 2D quality. You could possibly return it and, if you wanted to stick with a GF2, get a VisionTek or maybe an ELSA. Or ditch it and snag a Radeon :)
 

Taz4158

Banned
Oct 16, 2000
4,501
0
0
That's the nature of the GTS. When I switched to the Radeon it was hard to believe the difference.
 

Sharkmeat

Senior member
Sep 15, 2000
467
0
0
Maybe Nvidia will use 3dfx's 2D graphics engine in their next card, since they own 3dfx's property now. Maybe in the NV25, since the NV20 is already about done. I sure hope so; 3dfx is a barn burner in 2D when playing Red Alert 2 compared to the GF2. The Voodoo3 3000 even beats out the GF2 Ultra in 2D using the latest WinTune program. If you don't believe me, you can check it out yourself over at winmag.com.
 

Engineer

Elite Member
Oct 9, 1999
39,230
701
126
There is a known problem with the reference design of the NVidia-based boards: the filters located on the card (capacitors and inductors). These filters apparently aren't designed to provide enough bandwidth for the video signal from the card to the monitor, so the signal quality is degraded, resulting in poor video quality (especially in 2D). The fix, at this point, is to remove the capacitors and bypass the inductors on the card (may not be FCC compliant then :() which has been reported to GREATLY sharpen the video of the NVidia cards. I'll post the link of the guide when I find it.....takes a brave soul to do this though....:)


Good Luck

Link from Video forum from AdamK47 - 3DS....Good Luck :)
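To put rough numbers on the bandwidth point: the pixel clock needed for a high-resolution mode can easily sit above the cutoff of a simple low-pass filter built from small shunt capacitors. The sketch below is back-of-the-envelope only - the 75-ohm load, 22 pF capacitor, and 35% blanking overhead are hypothetical illustrative values, not measurements from any actual board.

```python
import math

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.35):
    """Rough pixel clock estimate: visible pixels x refresh rate,
    padded ~35% for horizontal/vertical blanking intervals."""
    return width * height * refresh_hz * blanking / 1e6

def rc_cutoff_mhz(r_ohms, c_farads):
    """-3 dB cutoff of a first-order RC low-pass filter."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads) / 1e6

# Hypothetical values: 75-ohm video load, 22 pF shunt capacitor.
print(round(pixel_clock_mhz(1600, 1200, 85)))  # prints 220 (MHz, roughly)
print(round(rc_cutoff_mhz(75, 22e-12)))        # prints 96 (MHz cutoff)
```

If the filter's -3 dB point sits well below the pixel clock, the sharpest pixel-to-pixel transitions (i.e. crisp text edges) get smeared, which matches the blurry-text symptom described in this thread.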
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
Having done the complete filter mod (remove 9 capacitors, short 6 inductors) on an Elsa Gladiac GTS, I can tell you it makes a BIG difference in 2D. There's also a noticeable difference in 3D as well. Makes the GeForce2 just as good as a Radeon. Too bad that 1) they don't design the filter circuit properly, and 2) Canopus doesn't sell GeForces in the U.S. anymore.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
My Leadtek doesn't have any 2D problems, but I only run 1024x768. I do admit that at high res the GF2 really blows, but since I only run 1024 I don't see a difference.
 

DaBoogieman

Member
Jan 6, 2001
53
0
0
Is it really necessary to remove the capacitors when doing this? Correct me if I'm wrong, but doesn't shorting the inductors provide an unfiltered path for the current to flow? Being the path of least resistance, it would seem all that needs to be installed is some jumper wires. What ya think?
 

dougjnn

Senior member
Dec 31, 2000
474
0
0
trippy --

<<I would prefer to have sweet image quality in all circumstances, and get 100fps in only 1280x1024 instead of 1600x1200.>>

Haven't you just described the Radeon 32?

Actually, I'm thinking maybe the Radeon 64 vivo ...

Does anyone know how the Radeon 64 at max stable overclock compares to the GeForce2 GTS 32 at max stable overclock? They are about the same price if both have TV out.

 

Engineer

Elite Member
Oct 9, 1999
39,230
701
126
DaBoogieman,

No, just jumpering over the inductors would not work. If the monitor had 0 impedance, then possibly, but the capacitors will act as miniature resistors for the AC portion of the signal (especially at certain frequencies) and will effectively short parts of the signal out, jumper wire or not. If any of the capacitors touch any part of the live circuit to the monitor, it won't help just to jumper over the inductors. You can, however, remove the caps and just jump (or solder) over the inductors. The electricity will take the solder/jump path over the inductor because it has the least impedance (resistance to AC).

Good Luck
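To illustrate the point about capacitors acting like frequency-dependent resistors: a capacitor's impedance magnitude is |Z| = 1/(2*pi*f*C), so a part that barely loads a low-frequency signal looks close to a short at video frequencies. A quick sketch - the 22 pF value is an assumed illustration, not taken from any real card:

```python
import math

def cap_impedance_ohms(f_hz, c_farads):
    """Magnitude of a capacitor's impedance: |Z| = 1 / (2*pi*f*C).
    The higher the frequency, the closer the cap gets to a short."""
    return 1.0 / (2 * math.pi * f_hz * c_farads)

# Hypothetical 22 pF shunt capacitor (illustrative value only).
for f_mhz in (1, 50, 200):
    z = cap_impedance_ohms(f_mhz * 1e6, 22e-12)
    print(f"{f_mhz:>4} MHz: {z:8.0f} ohms")
```

Against a nominal 75-ohm video line, an impedance of roughly 36 ohms to ground at 200 MHz shunts away much of the high-frequency signal content - which is why the caps have to come off rather than the inductors merely being bridged.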
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
Like Engineer said ... ;)

Practically speaking, I tried shorting the inductors without removing the capacitors - little difference. Also tried jumpering over the inductors, with little difference. The guy who first posted the mod gave several of these partial-mod alternatives, but the best results come from removing all of the capacitors and shorting the inductors. The mod isn't that difficult, maybe 10-15 minutes, since I did it with a regular Radio Shack desoldering iron.

I had both a Radeon and a GeForce2, and even at 1024x768 on a FD Trinitron the Radeon was noticeably better. After the mod, the GeForce2 was easily the equivalent of the Radeon. The best of both worlds - speed and quality. Then again, that was when the Radeon and GeForce2 were the same price (I had paid ~$130 for each of them). Now that the Radeon 32DDR is less than $100 and the GeForce2 is ~$170, that might change the dynamic a bit. If you really like the GeForce and you're running a 19" monitor and/or high refresh rates, the mod is a must.
 

dougjnn

Senior member
Dec 31, 2000
474
0
0
Yeah, well the other part of the dynamic is that the Radeon 64 meg is now $158 at buy.com, with Video In/Out. And hardware DVD support.

Unlike the GF2, the Radeon really benefits from the 64 meg version. Which also has faster memory and is more overclockable.

It's looking like that's how I'm gonna go when I pick up the last pieces of my system next week. Now if only AMD will drop the T-Bird 1.2GHz price to keep a gap below the PIII 1GHz ...

 

mindiris

Senior member
Oct 23, 1999
483
0
0
Just as an aside, I still have a Riva128 in another computer. If you think a GeForce looks bad at high resolutions or on a big monitor, you really don't have a good perspective on this issue. With a Riva128 at high resolutions, you'd think you'd gone blind if you didn't look away from the monitor! :Q

 

Fierysonic

Senior member
Apr 30, 2000
298
0
0
Sometimes video cards don't mix with certain monitors. When I received my GF2, it looked awful on one of my monitors; however, my V3 3000 and G400 Max looked fine on it. Then, I connected the GF to my other monitor, and it looks great.

 

John

Moderator Emeritus, Elite Member
Oct 9, 1999
33,944
4
81
I run a Hercules Prophet GTS Pro 64MB on a Dell P1110 21" FD Trinitron and the image quality is extremely crisp in 2D/3D. Sounds like you should try that card on another computer, or simply return it for a Herc or Elsa.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
86plymouthcolt

The process takes about 10-15 minutes. You have to remove 9 capacitors and short 6 inductors. Check Engineer's first post for the thread with a link to the mod and a discussion.

John

I have an Elsa Gladiac GTS (supposed to be one of the best), which is the board on which I did the filter mod. It wasn't bad, but when compared to both a V5500 and Radeon attached to a 17" FD Trinitron the Elsa was noticeably lacking in 2D, and refresh rates above 75Hz were even worse on the Elsa. All I can tell you is that the filter mod made a HUGE difference. However, if you're comparing between nVidia-based cards, the Elsas are definitely better, and Hercs seem to be a bit uneven, with some better than average and others average at best, depending on the model. I had also tried a Creative Annihilator MX, and the 2D was noticeably worse than the un-modded Elsa.
 

dpopiz

Diamond Member
Jan 28, 2001
4,454
0
0
Hello.

I got an old VisionTek 32MB SDR GeForce coming from a Matrox G400-TV, both hooked up to an el-cheapo Futura 17".

I have to say, the Matrox quality is so much nicer. First there's the slight pixel jitter if you look really close on the GeForce; that's not a problem. But what really blows is that the GF is sooo much darker and less vibrant, forcing me to turn up the crappy monitor's brightness significantly. The driver color adjustment is horrible. That Matrox card was so nice even on my worthless monitor. But at least now I get decent 3D performance.
 

dougjnn

Senior member
Dec 31, 2000
474
0
0
Wetwilly --

The thing that makes me leery of that hack is that it doesn't make much sense that the manufacturers wouldn't have shipped the card with those mods themselves, if it improves image quality without sacrificing anything.

This is different from hacks which enable a disabled level of performance on cards that are all produced with essentially the same chipset and components (because it's cheaper to manufacture the variety that way), but are modified/crippled to conform to the price they are marketed at. E.g., the hack to turn GF2 cards into Quadro professional graphics cards.

This is different. What purpose is there in having their whole line of 3D cards look less good in 2D than they could, if they just had to move a few wires around on the components they supply anyway???

I mean, it makes me worry what the downside would be.
 

WetWilly

Golden Member
Oct 13, 1999
1,126
0
0
dougjnn,

<<The thing that makes me leery of that hack is that it doesn't make much sense that the manufacturers wouldn't have shipped the card with those mods themselves, if it improves image quality without sacrificing anything.>>

Well, the mod does sacrifice something - FCC compliance. But as people who've actually tested it have found, they can't find any real difference (e.g. interference with the display or radios or TVs in the vicinity of the video card). I personally haven't found any downside at all, except possibly for the fact I voided my warranty (and if you're real good at soldering SMDs, you could probably put the capacitors back so no one would know they'd been removed).

The reason the manufacturers don't leave the components off is because they don't want to expend the time (and money) to redesign the circuit - it's much easier to copy the reference design. The other problem, which may be related to how variable nVidia-based cards are, is the quality of the components. Some manufacturers like Canopus have totally redesigned the filter circuit and don't have the problem. You also pay a premium for Canopus products - to the point that they've left the US market because most US consumers won't pay the premium, which IMHO is rather unfortunate. I've said in other threads discussing nVidia's 2D quality that it's ultimately nVidia's problem because it's their reputation that's affected.

<<This is different. What purpose is there in having their whole line of 3D cards look less good in 2D than they could, if they just had to move a few wires around on the components they supply anyway???>>

Probably because, as I've found, many people aren't horribly picky about 2D. If as many people were picky about 2D as are picky about the Quake3 compressed sky issue, nVidia would have been pressed to take care of this a long time ago. I've used Matrox cards a lot, so I'm REAL picky about 2D. If the G800 had come out when it was supposed to, I wouldn't be in this thread. This 2D issue is far from new to nVidia, because 2D has pretty much sucked on their boards all the way back to the Riva 128 (and I know this personally). To be fair to nVidia, they don't have a whole lot of control over what OEMs do with their chips and, as the mod shows, the issue isn't with the GeForce2 chip (which nVidia controls), but the filter circuit on the board (which nVidia doesn't control). Matrox, ATI, and 3dfx (RIP) all manufacture their own boards or control manufacture.

The other thing that really doesn't get discussed is how the mod affects 3D. If you did a blind test, you'd be VERY hard pressed to tell the difference between a modded GeForce2 (with appropriate gamma adjustments) and a Radeon. In fact, after I modded the GeForce, a friend came over and thought I still had the Radeon in the computer. Before the mod, you could easily tell the difference. I've been playing Gunman Chronicles recently, and with the GeForce2, it looked pretty good. Then I installed the Radeon and immediately noticed that the Radeon's colors were better saturated, and in places like the weapon selection bar in particular, the image was sharper. Modded the GeForce2, reduced the gamma a bit, and every point I had checked critically in the game was noticeably sharper. In 3DMark2000, the details in the Adventure test and the Demo in particular are also much sharper.

I understand how people would be a bit hesitant about the mod, but my theoretical perfect choice was a card with the GeForce's performance and the Radeon's image quality. I wasn't terribly enthused about either card individually, and if I hadn't accidentally killed my Voodoo3 :(, I would have returned both the Radeon and GeForce2 and waited. Since I didn't have that option, I gambled, did the mod, and (luckily) ended up with the best of both worlds. I should add the disclaimer here that YMMV, since I've heard of one or two people for whom the mod did very little. I don't know if they did the complete mod, since it can be done in stages (I did the complete mod). I also haven't heard of anyone for whom the mod made their display worse.