Why better FPS on HDTV than on monitor?

x1222

Member
Jun 24, 2010
39
0
0
I plugged my comp into my 46" 1080p TV today. I was expecting Crysis to lag badly, since on my 22" at 1680 x 1050 my fps can drop pretty low. But to my surprise, at 1080p I was getting about the same frame rates. If I changed it to 1680 x 1050 on the TV I would get even better fps, so I don't think it's a CPU issue. I did some googling and it seems I'm not the only one, but people on the other forums couldn't figure out why this occurs. My card is a 5850.
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
Long shot here, but are you using the same connection, or are you using HDMI for the TV?
 

HeXen

Diamond Member
Dec 13, 2009
7,832
37
91
Maybe it has something to do with "game mode"? My TV has this, and it's what my PC is hooked to.

5. HDTV With Game Mode Is Better For Gaming
More recently, some HDTV makers have recognised that consumers also want to use their high definition televisions to play fast-action online and console games. This has led to the development of a "game mode" to give you an instant reaction with the game controller. The principle behind the game mode is to optimize the reaction time and the picture quality of the TV to be closely matched with the higher resolution 1080p capable video game consoles such as the PS3 to give the gamer more realism.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
It could be CPU-related. Or really, sound related. When connected to your monitor you're probably using onboard sound, right? When connected to the TV, you're using the ATI sound chip via HDMI, right? The motherboard sound requires more CPU overhead, plus I've heard that if you disable the ATI sound when you're not using it you get an FPS bump. Just one possible theory. Maybe there is something else to it, but the display itself shouldn't make a difference.

Game mode for TVs basically concerns the circuitry in the TV itself, not how it interacts with the devices connected to it. I think game mode basically disables any color correction and other picture-altering processing that TVs apply when displaying the picture.
 
Last edited:
Mar 11, 2004
23,444
5,847
146
Is the game changing settings? Seems like many games have an auto-select for the graphics settings based on resolution, so it might have taken off AA and/or lowered some settings.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
So let me get this straight, you're getting " about the same " performance on a screen that has " about the same " resolution??

:rolleyes:
 

x1222

Member
Jun 24, 2010
39
0
0
I was saying my performance at 1920 x 1080 on my TV is the same as at 1680 x 1050 on my monitor. But 1680 x 1050 on my TV is about 5-6 frames better. The game was Crysis. The settings should have been the same. I must say AA is very good on the 5850; it doesn't seem to impact performance much, unlike my previous 4670 card.

I'm not really looking for a solution, I'm just curious as to why this happens. I don't think there is a solution since it's a TV and monitor thing. Could be wrong though.

I didn't test different connections, but according to http://forums.techpowerup.com/showthread.php?t=48164 the HDMI/DVI connection doesn't matter. That was my first guess too. It seems to be an ATI HD series card thing. They think it's a resolution thing, but the TV handled the same resolution better. I'm disappointed Crysis actually looked better on my monitor than on my TV.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
This doesn't sound right.

The computer doesn't render frames faster based on what display it's connected to.

The sound theory sounds like a long shot. Crysis is GPU limited, not CPU limited, so I wouldn't think the few cycles of processing time spent on sound would change frame rates between HDMI audio and onboard.

If it's better though, then that's a great thing for your setup. I never noticed an improvement, ATI or nVidia, when going from computer monitor to HDTV.
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
Well, 1680 x 1050 will look worse on a 1920 x 1080 display, and on a 46" screen the pixel density is going to be pretty bad unless you're sitting several feet away.
Some other possible theories.
1. Pure chance. i.e. the test run with your tv coincidentally was a few frames faster but if you repeated between the two a few times they'd even out.
2. Driver issue with monitor detection. Maybe something is going on with communication between the chips in the monitor and the GPU that identify the display. TVs tend to use better components, so perhaps a cheap component in the monitor makes the GPU re-detect the display every now and then.
3. Heat. Possibly the computer was moved to a cooler location near the TV and never throttles there, while in your monitor's location the GPU overheats and clocks down to cool off.
4. Other. Possibly related to chance. Another program or something is running in the background when you have your monitor hooked up, but not your TV. Are you using wifi? Ethernet? Are all the connections the same for both locations?
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Is the 5 fps difference anecdotal from playing the game, or did you actually run a benchmark flythrough? The sound is a long shot but a possible explanation. More than likely you're just noticing something that isn't there. You could even hook up your monitor and TV at the same time and set them to clone mode to compare.
 

magomago

Lifer
Sep 28, 2002
10,973
14
76
I don't see how game mode affects anything except input lag - game mode skips the TV's typical filters in order to reduce processing time, which reduces input lag.

If you are serious, run a series of benchmarks (doesn't Crysis have a benchmark tool?) 3 times on each display and then look at whether there is a real difference:

your PC display @ 1680x1050, 3x
TV display @ 1680x1050, 3x
TV display @ 1920x1080, 3x
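If anyone wants to crunch the runs afterwards, here's a quick Python sketch for averaging them. The FPS numbers below are made up for illustration; plug in whatever your benchmark tool reports for each of the three runs per config.

```python
# Average repeated benchmark runs per display/resolution combo.
# Numbers here are placeholders, not real results.
from statistics import mean

runs = {
    "monitor @ 1680x1050": [42.1, 41.8, 42.5],
    "tv @ 1680x1050":      [47.9, 48.3, 47.6],
    "tv @ 1920x1080":      [41.5, 42.0, 41.7],
}

for config, fps in runs.items():
    avg = mean(fps)                 # mean FPS over the 3 runs
    spread = max(fps) - min(fps)    # run-to-run variation
    print(f"{config}: avg {avg:.1f} fps (spread {spread:.1f})")
```

If the spread within one config is as big as the gap between configs, the "difference" is probably just noise.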
 

Vdubchaos

Lifer
Nov 11, 2009
10,408
10
0
Game mode doesn't mean anything.

Although I like to play games on my 32" and 40" TVs, it doesn't compare to a monitor quality-wise. The picture simply doesn't look as good/sharp due to the size of the pixels.

PS. Never had any response/lag issues though. Both of my TVs are 60Hz.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
maybe it has something to do with "game mode" my tv has this and is what my PC is hooked to.

This is talking about input lag, nothing more.

1920x1080p is 1920x1080p no matter how you are connected. Maybe the big screen is fooling your eyes? Bench them with vsync off, then report your findings. I'm very confident that if the settings remain the same, you will see the same performance on both displays.
 

HeXen

Diamond Member
Dec 13, 2009
7,832
37
91
How do you check that? All I did was plug it in.

Mine is a setting; I can choose 120Hz or game mode.
Game mode is supposed to be a faster response for input.

I just mentioned it because maybe the onscreen response didn't match your mouse/kb response and the hardware had to compensate? I dunno, but it doesn't look like anyone else does either.
Maybe try playing with vsync, use Fraps to benchmark the differences between displays, and post them.
 

pw38

Senior member
Apr 21, 2010
294
0
0
It's entirely possible it's a perception issue based on the difference in screen sizes, seating distance, pixel density, etc. Like noted earlier, unless stringent tests are run on each display it's hard to quantify beyond subjective impressions. The difference in pixel count between 1680x1050 and 1920x1080 isn't really that much (1680x1050 has about 85% of the pixels of 1920x1080), so maybe that explains "about the same". Who knows; I wouldn't really worry about it, although it is an interesting thought exercise. Keep us updated if you find anything out.
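For anyone doubting the 85% figure, the arithmetic is easy to check:

```python
# Total pixels the GPU has to render per frame at each resolution.
monitor = 1680 * 1050   # 1,764,000 pixels
tv      = 1920 * 1080   # 2,073,600 pixels

ratio = monitor / tv
print(f"{ratio:.3f}")   # ~0.851, so 1080p is ~18% more pixels per frame
```

An ~18% bump in pixels per frame usually costs a few fps at fillrate-limited settings, which lines up with the "about the same" observation.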
 

ahurtt

Diamond Member
Feb 1, 2001
4,283
0
0
What? That didn't really explain much.

Game mode exists on some HDTVs to reduce the input lag introduced by the extra signal-processing circuitry that TVs have and PC monitors typically don't. Game mode bypasses or turns off this image-processing circuitry, and on some 120Hz TVs it will drop the refresh rate to 60Hz when game mode is turned on.