Yea but can it run Half-Life 2? - A look Ten Years Back..

schmuckley

Platinum Member
Aug 18, 2011
2,335
1
0
Where's Half-Life 3?
They may as well forget it at this point.
Lest it go the way of Duke Nukem.
 

HeXen

Diamond Member
Dec 13, 2009
7,837
38
91
When it released, I was far from impressed with this game. To me it was just another on-rails shooter with scripted AI, mediocre sound, and an unimaginative story lifted from a B-grade '80s movie. To this day I still don't see why anyone thinks it's so great. The facial animations were good for their time, and the physics and water effects were decent, but that's about it for me.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I thought it was shader 3, which the Radeon didn't have?

It was Shader Model 2. Even though some of the FX range supported it, it wasn't implemented to DirectX specifications for performance reasons, so those cards had to revert to the DX8.1 path. There was a way to force it on them, but it was slow and didn't render entirely properly, especially underwater.
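
For anyone curious, the way to force it (if memory serves, so treat this as approximate) was the Source engine's -dxlevel launch option, set under the game's properties in Steam:

Code:
  -dxlevel 81   (the DirectX 8.1 path Valve defaulted GeForce FX cards to)
  -dxlevel 90   (force the full DirectX 9 / Shader Model 2 path; slow on FX)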
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
It was Shader Model 2. Even though some of the FX range supported it, it wasn't implemented to DirectX specifications for performance reasons, so those cards had to revert to the DX8.1 path. There was a way to force it on them, but it was slow and didn't render entirely properly, especially underwater.

Ummm no, it was shader 3 with HDRR
Complex shader effects began their days with the release of Shader Model 1.0 with DirectX 8. Shader Model 1.0 illuminated 3D worlds with what is called standard lighting. Standard lighting, however, had two problems:

Lighting precision was confined to 8-bit integers, which limited the contrast ratio to 256:1. Using the HSV color model, the value (V), or brightness of a color, has a range of 0–255. This means the brightest white (a value of 255) is only 255 levels brighter than the darkest shade above pure black (i.e. a value of 0).
Lighting calculations were integer based, which didn't offer as much accuracy because the real world is not confined to whole numbers.
On December 24, 2002, Microsoft released a new version of DirectX. DirectX 9.0 introduced Shader Model 2.0, which offered one of the necessary components to enable rendering of high-dynamic-range images: lighting precision was not limited to just 8 bits. Although 8 bits was the minimum in applications, programmers could choose up to a maximum of 24 bits for lighting precision. However, all calculations were still integer-based. One of the first graphics cards to support DirectX 9.0 natively was ATI's Radeon 9700, though the effect wasn't programmed into games for years afterwards. On August 23, 2003, Microsoft updated DirectX to DirectX 9.0b, which enabled the Pixel Shader 2.x (Extended) profile for ATI's Radeon X series and NVIDIA's GeForce FX series of graphics processing units.

On August 9, 2004, Microsoft updated DirectX once more to DirectX 9.0c. This also exposed the Shader Model 3.0 profile for the high-level shader language (HLSL). Shader Model 3.0's lighting precision has a minimum of 32 bits, as opposed to 2.0's 8-bit minimum, and all lighting-precision calculations are now floating-point based. NVIDIA states that contrast ratios using Shader Model 3.0 can be as high as 65535:1 using 32-bit lighting precision. At first, HDRR was only possible on video cards capable of Shader Model 3.0 effects, but software developers soon added compatibility for Shader Model 2.0. As a side note, when referred to as Shader Model 3.0 HDR, HDRR is really done by FP16 blending. FP16 blending is not part of Shader Model 3.0, but is supported mostly by cards also capable of Shader Model 3.0 (exceptions include the GeForce 6200 series). FP16 blending can be used as a faster way to render HDR in video games.
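
To put rough numbers on that precision difference, here's a quick Python sketch (my own back-of-the-envelope illustration, not from the article above) comparing the representable range of 8-bit integer lighting with FP16, the format used for the HDR blending mentioned above:

Code:
  import numpy as np

  # 8-bit integer lighting: the brightest value (255) over the smallest
  # non-zero step (1) gives the roughly 256:1 contrast ceiling described above.
  int8_ratio = 255 / 1

  # FP16 ("half") stores light as floating point; the largest finite half
  # value is 65504 and the smallest positive normal value is about 6.1e-05,
  # so the representable ratio dwarfs anything 8-bit integers can express.
  fp16_max = float(np.finfo(np.float16).max)    # 65504.0
  fp16_tiny = float(np.finfo(np.float16).tiny)  # ~6.10e-05

  print("8-bit integer contrast ratio: %d:1" % int8_ratio)
  print("FP16 representable ratio:     %.3g:1" % (fp16_max / fp16_tiny))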
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Ummm no, it was shader 3 with HDRR

HDR didn't come in until after Half-Life 2 was released. It was first demoed with the Lost Coast tech demo and then integrated into Half-Life 2 around the time Episode One was released.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Far Cry was the Crysis of its time. Didn't all three of those games (Far Cry, Doom 3, HL2) come out around the same time, with Far Cry being the first of the three?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
HDR didn't come in until after Half-Life 2 was released. It was first demoed with the Lost Coast tech demo and then integrated into Half-Life 2 around the time Episode One was released.

There was also an unofficial patch for Far Cry that enabled HDR. I'm not sure if that patch came out before or after HL2, but Far Cry itself released in early 2004 and HL2 in the fall of that same year.
 

CP5670

Diamond Member
Jun 24, 2004
5,665
765
126
I built a 6800GT and A64 3500+ rig for this game and Doom 3. Also got an NEC DiamondTron 22" CRT for the system. That was my first high end monitor. I wouldn't mind having one of those again to try out.


I have one of those in storage, the Mitsubishi version. It still does some things very well, and I only took it out of my system fairly recently, when I discovered that my 120Hz LCD did backlight strobing (LightBoost).
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,949
7,661
136
I have been playing the original Half Life the last couple of weeks, and I cannot believe how well it still holds up 16 years later. How the hell does a developer put out a masterpiece like Half Life 1 as its first release?
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I have been playing the original Half Life the last couple of weeks, and I cannot believe how well it still holds up 16 years later. How the hell does a developer put out a masterpiece like Half Life 1 as its first release?

You should play the Source remake of it.
 

conlan

Diamond Member
Jan 27, 2001
3,395
0
76
I have been playing the original Half Life the last couple of weeks, and I cannot believe how well it still holds up 16 years later. How the hell does a developer put out a masterpiece like Half Life 1 as its first release?

Still the best game ever.
The Black Mesa mod is awesome too
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Far Cry was the Crysis of its time. Didn't all three of those games (Far Cry, Doom 3, HL2) come out around the same time, with Far Cry being the first of the three?

Ugh, yeah, I forgot about Far Cry. Very demanding on high settings.
 
Mar 10, 2006
11,715
2,012
126
Ugh, yeah, I forgot about Far Cry. Very demanding on high settings.

HL2, DOOM 3, and Far Cry were all unique in their own right. HL2 had the best artwork (IMO) and nailed down realism. DOOM 3 had by far the most dramatic/moody lighting and the best shadowing system in a game yet. Far Cry piled on the special FX, which really killed graphics cards at the time.

Loved them all. The only bad memory I have of the three were those darn trigens in FC! Ack!
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
HL2, DOOM 3, and Far Cry were all unique in their own right. HL2 had the best artwork (IMO) and nailed down realism. DOOM 3 had by far the most dramatic/moody lighting and the best shadowing system in a game yet. Far Cry piled on the special FX, which really killed graphics cards at the time.

Loved them all. The only bad memory I have of the three were those darn trigens in FC! Ack!
You could still run Far Cry on even an MX440 using low settings, though.