Will a GF4 4200 push 1680x1050 through DVI?

housecat

Banned
Oct 20, 2004
1,426
0
0
I realize I won't be able to push that resolution very well in games (if at all), but I hope my current card (Leadtek GeForce4 4200 128MB 8X AGP with DVI) can do that resolution in Windows for me until I upgrade.

I know this card has certain limitations for analog monitors (like not being able to do 2048x1536@85Hz).. but I don't really understand DVI technology and how the RAMDAC affects resolutions on a flat panel.

I'm waiting for the Asus SLI board (or similar) to be released for my upgrade.. and will go with either a 6600GT or 6800GT and probably an A64 3500+.
 

Hikari

Senior member
Jan 8, 2002
530
0
0
It may not; a lot of boards only supported 1280x1024 over DVI when the GF4 series first came out. I know my Chaintech 4600 manual says it does 1280x1024 via DVI.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
I think mine is only 1280x1024 too. When did they start supporting 1680x1050? I want the cheapest hold-me-over until I build a new system.. or would the analog connection work just fine for now?
 

Hikari

Senior member
Jan 8, 2002
530
0
0
I ended up getting a 6800 so I could run that resolution. You can always use the VGA output, but I don't find that signal nearly as clear.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Originally posted by: Viper96720
Maybe but most cards' DVI support is up to 1600x1200

You say that as if that matters.

1680x1050 has fewer pixels than 1600x1200, so a card that supports 1600x1200 should also support 1680x1050; it requires less bandwidth.

housecat,
If you need a stopgap, I'm 90% sure a plain Radeon 9500 will support 1600x1200 via DVI, and should also support your resolution. Those can be found for a few bucks more than a Ti 4200.
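
If you want to check the arithmetic on the pixel counts, here's a quick back-of-the-envelope comparison (my own throwaway numbers, active pixels only, ignoring blanking):

```python
# Pixels per frame at each resolution (active pixels only, no blanking)
widescreen = 1680 * 1050   # 1,764,000 pixels
standard   = 1600 * 1200   # 1,920,000 pixels

print(f"1680x1050: {widescreen:,} pixels per frame")
print(f"1600x1200: {standard:,} pixels per frame")
print(f"1680x1050 is {1 - widescreen / standard:.0%} smaller")  # ~8%
```

So 1680x1050 pushes roughly 8% fewer pixels per frame than 1600x1200; any DVI link with headroom for the latter has headroom for the former.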

 

housecat

Banned
Oct 20, 2004
1,426
0
0
I'm going to test it to be 100% sure, but I found this information googling..

"A 10-bit TMDS link operates at up to 165 MHz and offers 1.65Gbps of bandwidth. This is enough to operate a digital flat panel display at 1920 x 1080 resolution refreshed at 60 Hz. This is virtually doubled with a dual link TMDS. Dual offers 2Gbps of bandwidth but must be operated at 100 MHz to match the second link with the primary link. Its possible to get a resolution of 2048 x 1536 with a dual link TMDS. This ability to achieve high bandwidth and larger resolutions has pushed DVI into the forefront of graphics technology."

source

I'm hoping it has a 10-bit TMDS link.. then it should do 1680x1050.
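
To put rough numbers on that quote: the 165 MHz figure is a pixel clock limit, and the clock has to cover blanking intervals as well as active pixels. Here's a little sketch; the ~1.38x blanking overhead factor is my own ballpark for standard (non-reduced-blanking) timings, not anything from the DVI spec:

```python
# Rough check: which 60 Hz modes fit on a single 165 MHz TMDS link?
# The blanking overhead factor is a rough guess, not a spec value.
SINGLE_LINK_LIMIT_MHZ = 165

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.38):
    """Approximate the pixel clock in MHz, including blanking overhead."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for width, height in [(1680, 1050), (1600, 1200), (1920, 1200)]:
    clock = approx_pixel_clock_mhz(width, height, 60)
    verdict = "fits" if clock <= SINGLE_LINK_LIMIT_MHZ else "too high"
    print(f"{width}x{height}@60Hz: ~{clock:.0f} MHz -> {verdict}")
```

By that estimate 1680x1050@60Hz needs only ~146 MHz, comfortably under the single-link limit, so the card's TMDS transmitter should be the deciding factor rather than the link itself.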

I'm not able to readily find this information for various video cards.. but I am glad I have DVI at all for the time being, even on this old GF4.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Also found this little bit of hope-

Gainward GF4 4400 supports 1600x1200

"Support LCD output with DVI connect, the resolution up to 1600x1200."


I'm wondering if a 10-bit TMDS link is the standard and you just need a driver update.. or if there's something I'm missing here.

I remember another poster here remarking how he got his video card, even older than the GF4, to display 1680x1050. I am going to search and find out what card it was.. I'm thinking it was a GeForce3 Ti 200.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
You'll probably just have to check "reduced blanking interval" somewhere in the drivers, but you should be good to go.

If you do get a new card and plan to game with it at that res, I'd aim for a 6600GT.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Thanks a lot, Pete. That helped me a lot. I did a quick Google search and found two great sources of further information for anyone interested in this topic of DVI and resolutions.

link1
link2
 

housecat

Banned
Oct 20, 2004
1,426
0
0
It appears all current (GF4/Radeon 8500 and newer) DVI outputs are created equal, with only the Quadro-class cards having dual-link DVI output.

Reduced blanking, according to those links, is only needed at 1920x1200@60Hz or higher.

So I should be OK even without that feature enabled.. we'll see.
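
The published CVT pixel clocks seem to bear that out; a quick comparison (I'm quoting these clock figures from memory, so double-check them against the VESA tables):

```python
# Standard vs. reduced-blanking CVT pixel clocks against the 165 MHz
# single-link DVI limit. Clock figures quoted from memory; approximate.
SINGLE_LINK_LIMIT_MHZ = 165

modes = {
    # mode: (standard CVT clock in MHz, reduced-blanking clock in MHz)
    "1680x1050@60Hz": (146.25, 119.00),
    "1920x1200@60Hz": (193.25, 154.00),
}

for name, (standard, reduced) in modes.items():
    for label, clock in (("standard", standard), ("reduced", reduced)):
        status = "OK" if clock <= SINGLE_LINK_LIMIT_MHZ else "over the limit"
        print(f"{name}, {label} blanking: {clock} MHz -> {status}")
```

1680x1050@60Hz fits on a single link either way; it's 1920x1200@60Hz that only squeezes under the limit once reduced blanking is enabled, which matches what those links say.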

When an AGP 6600GT hits an even $200 I might pick one up.. hopefully they'll get the SLI boards out soon (I've been waiting a very long time for an upgrade, and I think that's the best time to buy for longevity).
 

housecat

Banned
Oct 20, 2004
1,426
0
0
I have not received my 2005FPW yet. I will post here on this issue when I do.
I'm just getting prepared right now; I want to be able to run it at full res.

It doesn't ship out till the 10th, though..
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Originally posted by: housecat
I remember another poster here remarking how he got his video card, even older than the GF4, to display 1680x1050. I am going to search and find out what card it was.. I'm thinking it was a GeForce3 Ti 200.

Search for the post about the BFG 6800GT not supporting 1600x1200 via DVI. Ignore most of that heavily flamed thread, but I remember a link from there to a site that had waveforms from various cards' DVI outputs and showed which cards were clean and which had issues and might not be as clear at high resolutions.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Originally posted by: Concillian
Originally posted by: housecat
I remember another poster here remarking how he got his video card, even older than the GF4, to display 1680x1050. I am going to search and find out what card it was.. I'm thinking it was a GeForce3 Ti 200.

Search for the post about the BFG 6800GT not supporting 1600x1200 via DVI. Ignore most of that heavily flamed thread, but I remember a link from there to a site that had waveforms from various cards' DVI outputs and showed which cards were clean and which had issues and might not be as clear at high resolutions.

The article you are referring to showed Nvidia's DVI in a bad light, but in real-world use I've never heard anyone complain.. and it's been said the differences between ATI and NV DVI performance aren't worth discerning (besides on bar charts).
I'm willing to put up with it anyway, as I won't use ATI.

That 6800 must've used an on-chip TMDS transmitter instead of a separate one. I would suspect most of the older cards with DVI support had a Silicon Image chip, because the on-chip transmitters hadn't been created yet.

I'd like to try the BFG myself though, as it makes one wonder if it was user error.
 

CraigRT

Lifer
Jun 16, 2000
31,440
5
0
Before entering this thread I would have said "of course," but I have never used DVI, so I guess it's a good thing you guys got to this before I did :p
 

GabeyD

Member
May 8, 2001
50
0
0

I'm using my GF4 4200 through DVI on my Dell 2000FP at 1600x1200 with no problems. I don't remember which brand the card is, though.

 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: Concillian
Originally posted by: Viper96720
Maybe but most cards' DVI support is up to 1600x1200

You say that as if that matters.

1680x1050 has fewer pixels than 1600x1200, so a card that supports 1600x1200 should also support 1680x1050; it requires less bandwidth.

housecat,
If you need a stopgap, I'm 90% sure a plain Radeon 9500 will support 1600x1200 via DVI, and should also support your resolution. Those can be found for a few bucks more than a Ti 4200.



You get fewer frames at 1680x1050 vs. 1600x1200, because more of the game is shown, not less.
 

housecat

Banned
Oct 20, 2004
1,426
0
0
Ackmed, he's referring to the bandwidth used across the DVI link.. not gaming performance.

But 1680x1050 is about equal to 1600x1200 because it renders fewer pixels, even though it shows more onscreen due to the aspect ratio. Someone pointed this out to me in another recent thread.
It comes out about a wash FPS-wise; you yourself said that before.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
I said it should, and that frame rates felt about the same. I took a few pics, which I've posted; 1680x1050 gets fewer frames in HL2 in two out of the three shots.
 

ianj

Junior Member
Dec 6, 2004
1
0
0
Not sure how helpful this is, but I recently purchased a Conqueror GeForceFX 5200 Plus. I am also considering purchasing this Dell monitor. I called Mad Dog Multimedia to ask if the video card would support widescreen 1680x1050. The tech on the phone said it would not. However, first he asked if I was calling because I was having problems with such a monitor (I wasn't), so I'm not confident I got an accurate answer.

So, I emailed Nvidia pre-purchase support. I asked which of their cards support 1680x1050 over DVI, particularly the 5200 and 6200 series. They replied, "All GeforceFX and Geforce 6 family of GPU's will support this resolution over DVI."

So... there's that. I did find it frustrating that no graphics card manufacturer makes this information easily available on their website, particularly with all the widescreen monitors now coming out.
 

mickyb

Junior Member
Aug 29, 2004
4
0
0
I am in the same boat. I am trying to figure out which card is the best for his HDTV. He already has a 9800SE, but it is one of the very few without a DVI port.

To make matters worse, his TV only has one DVI port, which is connected to his HD satellite receiver. So I have to go RGB. The problem is that if you get an OEM model (read: cheaper), it does not come with any extra cables or dongles. You have to spend an extra $30 to $40 to get that dongle. A 6' DVI cable is no cheap item either.

I have also looked around for a VGA-to-RGB converter cable; it is difficult to find. There are some that output RGB, but not in the format most projectors use. I noticed that the ATI 8500's VGA out must support outputting a different signal than YPbPr. ATI makes a simple adapter for that card, but I can't tell if you can use it on all cards. The converter box that I found is http://www.smarthome.com/77706.html.

Another problem I am finding is that the cheap DX9 cards are all coming out PCIe-only. This makes it difficult to find that middle price ground to pull the trigger. So now I am thinking a GeForce 6800 LE is going to do the trick.

Any advice would sure help.