Yeah, but can it run Half-Life 2? - A Look Ten Years Back..

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
My daughter's a huge cosplay/Valve fan. She has costumes complete with gravity guns and portal guns. She vividly remembers being terrified of headcrabs while watching me play Half-Life 2. :)

All her talk got me playing the Half-Life 2 series again. With all settings maxed, I was curious where Boost would clock my 780 Ti to run the game. Surprised to see the clocks never leave idle state:

Someone suggested GPU usage may have been higher without vsync enabled. Needless to say, I got a steady 60 fps.



I also tested it with my HD 4600 iGPU at 1080p, max settings. A headcrab rocket had just landed, causing the dip in framerate:


Frames: 4502
Time: 87110ms
Avg: 51.682
Min: 32 - Max: 87
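For reference, FRAPS's "Avg" is just total frames divided by elapsed time in seconds; a quick sanity check of the numbers above:

```python
# FRAPS-style average fps: total frames over elapsed time in seconds.
frames = 4502
time_ms = 87110

avg_fps = frames / (time_ms / 1000)
print(round(avg_fps, 3))  # 51.682, matching the FRAPS report
```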

Our sport has a history, folks; pretty amazing when you think about it. Half-Life 2 released ten years ago next month, and AnandTech was there covering the launch..

Half-Life 2 Release Date:
November 16, 2004

AnandTech HL2 GPU Roundup:
November 17, 2004


The Slowest Level in the Game
For our fifth and final demo we turn to one of the last levels in the game – d3_c17_12. This city level takes place mostly outdoors and gave us the lowest average frame rates out of any level we played in during our testing of Half Life 2.


Price Point
At the $400 price point, the X800 Pro and the GeForce 6800GT are basically equal performers in all of the resolutions we tested (regardless of whether or not AA/anisotropic filtering was enabled). So the recommendation here goes either way; look at the performance of the cards in some of the other games you play to determine which one is right for you.

http://www.anandtech.com/show/1546
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
This game worked well with low-end VGAs at the time. Most people still had CRTs, so 800x600 was not too bad. But yes... 10 years ago... interesting.

Performance is not 100% comparable with the original game because of the updates.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Would be fun to see how my HD 4000 and my GTX 650 come in on that game. The HD 4000 dominates 1024x768 in UT2003 without AA, leaving the CPU as the bottleneck, but back in the day that title apparently murdered anything outside of an SLI config. :)

Never played Half-Life 2; does it still hold up? Any chance of an active multiplayer scene? Or is the single-player what makes it good?
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
All her talk got me playing the Half-Life 2 series again. Great news, my 780 Ti can run it. Pretty well in fact, as it never ramps up from its idle clock state.
This is good progress. Maybe you can even run some of the newer games in that performance state. How much power does it consume?
 

jpiniero

Lifer
Oct 1, 2010
16,818
7,258
136
Never played Half-Life 2; does it still hold up? Any chance of an active multiplayer scene? Or is the single-player what makes it good?

I actually played through it and both episodes not too long ago. It hasn't aged well; it's pretty boring. I did like parts of Episode 2.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
Interesting to compare the then vs. now numbers.

Shaders:
16 in the GeForce 6800 Ultra, 2048 in the GTX 980

Core Clock:
400 MHz in the 6800 Ultra, 1200+ MHz in the GTX 980

Memory Bandwidth:
35.2 GB/s in the 6800 Ultra, 224 GB/s in the GTX 980

Core Transistor Count:
222 million in the 6800 Ultra, 5.2 billion in the GTX 980

EDIT: It is funny to see now that your idle-clocked card is still operating at a similar frequency to flagship cards from when HL2 launched.
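The memory-bandwidth figures above fall straight out of bus width times effective memory clock. A minimal sketch, assuming the cards' published effective memory rates (1.1 GHz GDDR3 on the 6800 Ultra, 7 GHz GDDR5 on the GTX 980):

```python
# Memory bandwidth in GB/s = (bus width in bits / 8 bits per byte)
# * effective memory clock in GHz.
def mem_bandwidth_gbs(bus_bits, effective_clock_ghz):
    return bus_bits / 8 * effective_clock_ghz

# Both cards use a 256-bit bus; only the memory clock changed.
print(mem_bandwidth_gbs(256, 1.1))  # 35.2  (GeForce 6800 Ultra)
print(mem_bandwidth_gbs(256, 7.0))  # 224.0 (GTX 980)
```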
 
Last edited:

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Would be fun to see how my hd4000 and my gtx650 comes in on that game, hd4000 dominates 1024x768 in UT2003 without aa leaving the cpu as a bottleneck but back in the day that title apparently murdered anything outside of a sli config. :)

Never played Half-Life 2, does it still hold up? Any chance of a active multi-player?Single player perhaps just what makes it good?

It's a little slow during storyline, but still very good. Ravenholm is a creepy undead city that remains one of my favorite levels of all time. The episode packs are fun and showcase newer lighting, smoke, and fire effects.
 
Last edited:

conlan

Diamond Member
Jan 27, 2001
3,395
0
76
Interesting to compare the then vs. now numbers.

Shaders:
16 in the GeForce 6800 Ultra, 2048 in the GTX 980

Core Clock:
400 MHz in the 6800 Ultra, 1200+ MHz in the GTX 980

Memory Bandwidth:
35.2 GB/s in the 6800 Ultra, 224 GB/s in the GTX 980

Core Transistor Count:
222 million in the 6800 Ultra, 5.2 billion in the GTX 980

EDIT: It is funny to see now that your idle-clocked card is still operating at a similar frequency to flagship cards from when HL2 launched.

Remember the 6800NU > 6800U unlock hack? ...awesome
 

Mushkins

Golden Member
Feb 11, 2013
1,631
0
0
Reminds me of the old Unreal Tournament days. I remember getting a brand new Dell to replace an ancient HP and putting a GeForce FX 5200 in it. I could spawn *hundreds* of bots into a map and it kept chugging along. 200v200 on a map designed for 12v12 was some good times :p
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Anyone remember how cards prior to the 9xxx and 6xxx series wouldn't even have the same water effects due to not having 24-bit pipelines?
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Anyone remember how cards prior to the 9xxx and 6xxx series wouldn't even have the same water effects due to not having 24-bit pipelines?

Are you sure it was tied to pipelines and not the DX feature set? Also, a 9800 Pro had 8 pipelines, nowhere near 24.
 
Last edited:

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Are you sure it was tied to pipelines and not the DX feature set? Also, a 9800 Pro had 8 pipelines, nowhere near 24.

I think it was actually shaders; the hardware couldn't handle 24-bit shader precision, which was part of the DX9 feature set as well as part of the pipeline.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
I think it was actually shaders; the hardware couldn't handle 24-bit shader precision, which was part of the DX9 feature set as well as part of the pipeline.

As far as I know, R300 did 24-bit FP Pixel Shader 2.0, and HL2 worked perfectly on it with good performance. NV30 was problematic, and Valve forced it to run in DX8 mode.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
lol, I played the game back then on highest DX7 settings at 1024x768 with a dinky MX440.
 

zir_blazer

Golden Member
Jun 6, 2013
1,259
573
136
10 years ago the question was "Can it run Doom 3?". Half-Life 2 was easier to run than id Software's beast.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
10 years ago the question was "Can it run Doom 3?". Half-Life 2 was easier to run than id Software's beast.

Agreed, HL2 was a breeze to run in comparison. Valve designed their games to run well across all platforms; well optimized. I played through the series once with developer commentary on, pretty smart dudes. They really paid attention to test players' actions and adjusted accordingly.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I built a 6800GT and A64 3500+ rig for this game and Doom 3. Also got an NEC DiamondTron 22" CRT for the system. That was my first high end monitor. I wouldn't mind having one of those again to try out.

nec22.jpg

 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
Doom 3 was the Crysis of its time. This nostalgia makes me want to cry.


The Radeon X800XT made me a computer enthusiast and an IT student back then.