NV30 to "unlock" advanced graphic features in UT2k3?

Xernex

Senior member
Jul 15, 2002
304
0
0
Sorry if this is a repost but I thought this rumour sounded quite interesting.

Word around the Internet is that Epic's latest FPS creature, Unreal Tournament 2003, is going to take special advantage of nVidia's upcoming GPU powerhouse, currently known only as the NV30.

The exact phrase that's being tossed around the Internet is "Ultra High Detail" and that this ultra high detail will be exposed with the NV30. This would make sense, considering the logo/nVidia advertising at the beginning of Unreal Tournament 2003. Hopefully more info will come out as the announcement of the NV30 gets closer.

The NV30 is set to be officially announced at Comdex. (November 16th - 22nd)

Source = Hardware Connect

Rumour? Yes. Perfect sense? Yes.

I think there could be some truth in this. Think about it: the nVidia logo at the start of UT2k3? nVidia looking for a way to beat ATI? I can see this leading to nVidia signing up other companies to do exclusive graphics options that only work with their cards. It would be a smart move by nVidia, and it would explain their motives behind this new marketing campaign (where they put their logo at the start of games; UT2k3 is just the first).

Guess we will just have to wait and see.
 

NOX

Diamond Member
Oct 11, 1999
4,077
0
0
Well, there is also an nVidia logo at the beginning of AA, which can be changed by renaming another graphics file to "splash". So I don't know about that.

We'll just have to wait and see if this is true.
 

XBoxLPU

Diamond Member
Aug 21, 2001
4,249
1
0
You can always edit your INI file and change to the ultra settings:

http://forums.anandtech.com/messageview.cfm?catid=28&threadid=872213&highlight_key=y&keyword1=UT2003

[WinDrv.WindowsClient]
ScreenFlashes=True
NoLighting=False
MinDesiredFrameRate=35.000000
Decals=True
Coronas=True
DecoLayers=True
Projectors=True
NoDynamicLights=False
ReportDynamicUploads=False
TextureDetailInterface=UltraHigh
TextureDetailTerrain=UltraHigh
TextureDetailWeaponSkin=UltraHigh
TextureDetailPlayerSkin=UltraHigh
TextureDetailWorld=UltraHigh
TextureDetailRenderMap=UltraHigh
TextureDetailLightmap=UltraHigh
TextureMaxLOD=0
TextureMinLOD=0
NoFractalAnim=False
ScaleHUDX=0.000000

[D3DDrv.D3DRenderDevice]
DetailTextures=True
HighDetailActors=True
SuperHighDetailActors=True
UsePrecaching=True
UseTrilinear=True
AdapterNumber=-1
ReduceMouseLag=True
UseTripleBuffering=False
UseHardwareTL=True
UseHardwareVS=True
UseCubemaps=True
DesiredRefreshRate=100
UseCompressedLightmaps=True
UseStencil=False
Use16bit=False
Use16bitTextures=False
MaxPixelShaderVersion=255
UseVSync=False
LevelOfAnisotropy=1 <----increase this for a higher level of Anisotropic filtering
DetailTexMipBias=0.800000
DefaultTexMipBias=-0.500000
UseNPatches=False <-------for the ATI guys
TesselationFactor=1.000000
CheckForOverflow=False
DecompressTextures=False
UseXBoxFSAA=False
DescFlags=0
Description=
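If you'd rather not hand-edit every line, a quick script can bump all the TextureDetail keys at once. This is just a sketch that operates on the ini text as a string (the key names come from the listing above); reading and writing your actual UT2003.ini file is left to you.

```python
import re

def bump_texture_detail(ini_text: str, level: str = "UltraHigh") -> str:
    """Set every TextureDetail* key in a UT2003-style ini to the given level."""
    return re.sub(r"(?m)^(TextureDetail\w+)=\w+$", rf"\1={level}", ini_text)

# Example against a small fragment of the ini shown above:
fragment = "TextureDetailWorld=Normal\nTextureDetailPlayerSkin=High\nUseTrilinear=True"
print(bump_texture_detail(fragment))
```

Keys that aren't texture-detail settings (like UseTrilinear) pass through untouched.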
 

Adul

Elite Member
Oct 9, 1999
32,999
44
91
danny.tangtam.com
If you go to the Rage3D forums, there's a comment by one of the developers of UT. He even gives all the settings you can use on any graphics card; there is no special NV30 setting. There is a setting that will require about 180 MB of texture memory, though :Q
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
Originally posted by: BFG10K
LOL! 128MB cards didn't last long!
No surprises there, yet I've seen some people harping on about how 32 MB VRAM is still good enough.

If you're doing 2D on one monitor, 8 megs is fine.
 

Crusty

Lifer
Sep 30, 2001
12,684
2
81
If you're doing 2D on one monitor, 8 megs is fine.

I'm running 2D on my second monitor with a Matrox Millennium PCI; if I remember correctly it's a 4 MB card. It looks great. But then again, it is Matrox.....
 

draggoon01

Senior member
May 9, 2001
858
0
0
I'd worry if it really does turn out to be nVidia-specific features. Although developers will always support the lowest common denominator in setups, they might start to fragment over advanced features, and computer games could end up no different from console games, where hardware determines what software you can choose from.
 

ndee

Lifer
Jul 18, 2000
12,680
1
0
Are there any screenshots of these Ultra-High settings compared to the "normal" settings?
 

EdipisReks

Platinum Member
Sep 30, 2000
2,722
0
0
If you go to the HardOCP news posts from a couple of days ago, you will see that there are no nVidia-specific settings in UT2k3.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Speculation time.

What if nVidia implemented a (transparent to applications) texture compressor (based on S3's pro-level texture compression or 3dfx's compressed formats)? All of a sudden the NV30 is able to fit more textures into 128 MB than other cards (which would be a reason why the NV30 could do this and the 9700 could not).

In addition I think it is likely we will see "free" antialiasing on NV30. The NV30 is expected to arrive with 8 pipelines, each with two texturing units. If an application can't use all the pipelines/texture units there will be spare bandwidth on chip which could be used to antialias for free (remember how 3dfx implemented their FSAA - it only cost fillrate, if you have free fillrate/bandwidth you can FSAA for free).

It's also possible NV30 could perform all z buffering on die, which would further free up texture memory, while speeding up occlusion detection.

Greg
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
In addition I think it is likely we will see "free" antialiasing on NV30. The NV30 is expected to arrive with 8 pipelines, each with two texturing units. If an application can't use all the pipelines/texture units there will be spare bandwidth on chip which could be used to antialias for free (remember how 3dfx implemented their FSAA - it only cost fillrate, if you have free fillrate/bandwidth you can FSAA for free).
Those are the exact same specifications as the Radeon 9700's... Why would the performance be "free" on nVidia's card and not on ATi's?
 

Woodchuck2000

Golden Member
Jan 20, 2002
1,632
1
0
Originally posted by: jliechty
I guess it's time to break out the 3DLabs cards with 416 MB of memory. :D
Yeah, but I bet it wouldn't run UT well at all...
About the compression thing - I thought that all DX7 and above cards support DirectX Texture Compression? Is that 180Mb of compressed textures?
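For a sense of what DXTC buys: DXT1 stores a 4x4 pixel block in 8 bytes (4 bits per pixel) versus 4 bytes per pixel for uncompressed 32-bit RGBA, an 8:1 ratio, and a full mip chain adds roughly a third on top of the base level. A back-of-the-envelope sketch (the texture size below is illustrative, not one of UT2k3's actual assets):

```python
def texture_bytes(width, height, bytes_per_pixel, mipmaps=True):
    """Approximate size of a texture; a full mip chain adds about 1/3."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# A 1024x1024 texture: uncompressed RGBA8 (4 B/px) vs DXT1 (0.5 B/px)
uncompressed = texture_bytes(1024, 1024, 4)    # ~5.6 MB with mips
dxt1 = texture_bytes(1024, 1024, 0.5)          # ~0.7 MB with mips
print(uncompressed, dxt1)
```

So yes, 180 MB of already-compressed textures would be a huge amount of uncompressed data.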
 

GTaudiophile

Lifer
Oct 24, 2000
29,767
33
81
Even if certain games had certain special features for nVidia cards, how hard would it be for ATi to figure out how to support those same features? I mean, from what I've seen the NV30 and R9700 are fairly similar, but the NV30 will have a lot more horsepower. Don't forget the R9700 can run nVidia's own demos without a problem given a software hack.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Bovinicus
In addition I think it is likely we will see "free" antialiasing on NV30. The NV30 is expected to arrive with 8 pipelines, each with two texturing units. If an application can't use all the pipelines/texture units there will be spare bandwidth on chip which could be used to antialias for free (remember how 3dfx implemented their FSAA - it only cost fillrate, if you have free fillrate/bandwidth you can FSAA for free).
Those are the exact same specifications as the Radeon 9700's... Why would the performance be "free" on nVidia's card and not on ATi's?

No, they are not the same specifications as the ATI 9700.

The 9700 has eight pipelines, but only one texturing unit per pipeline, not two. Also, you have to design the card to be capable of taking advantage of the free bandwidth when available.

Anyhow it would seem (from information posted on another board) that texture compression is indeed the key to the riddle:

The reference NV30 only has 128 MB of memory. The NVS5 64 bit compressed texture support is what makes the difference.
http://66.96.254.245/cgi-bin/ultimatebb.cgi?ubb=get_topic;f=1;t=025005

Greg
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
I can assure you that we did NOT lock out anything from customers that is on the CD.

We pick (conservative) default settings based on detected HW and the code currently in place will only pick the highest texture detail on 256 MByte cards by default though you can easily change this either via the menu or by modifying the ini file.

At highest texture detail some levels might use up to 180 MByte of data (textures + geometry), and if you have a lot of players this number might be even higher.

Here's how the detail settings work:

FWIW, we didn't ship with any textures that require a detail setting above High to be shown at full detail. The texture detail is basically a bias against the LOD level defined in the texture package. So e.g. large textures might have a "NormalLOD" of 2, which means at normal texture LOD ("Normal") the engine will skip the first two mip levels. At "Lower" it will skip 3 and at "Higher" it will skip only 1. To the best of my knowledge the highest NormalLOD used in UT2k3 is 2, which means that by setting your texture detail to "High" (ini and menus use a different notation as texture detail ended up being too fine-grained, and I'm referring to ini options) you'll end up with the full-quality textures. We also do some clamping so small textures won't lose mipmaps at low texture detail settings.

Below are the ini options and their bias level.

-4 UltraHigh
-3 VeryHigh
-2 High
-1 Higher
0 Normal
+1 Lower
+2 Low
+3 VeryLow
+4 UltraLow

As this is too fine-grained for the regular user, we mapped the words differently, so XYZ in the menus doesn't necessarily map to XYZ in the ini; this might have caused some confusion.

-- Daniel, Epic Games Inc.
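The mapping Daniel describes — each detail setting is a bias applied against a texture's NormalLOD — can be sketched like this (the bias table is taken from his post; the clamp to zero skipped mips is an assumption based on his note about clamping small textures):

```python
# ini detail setting -> bias against a texture's NormalLOD (from Daniel's table)
BIAS = {"UltraHigh": -4, "VeryHigh": -3, "High": -2, "Higher": -1,
        "Normal": 0, "Lower": 1, "Low": 2, "VeryLow": 3, "UltraLow": 4}

def mips_skipped(normal_lod: int, detail: str) -> int:
    """How many mip levels the engine skips for a texture at a detail setting."""
    return max(0, normal_lod + BIAS[detail])

# A large texture with NormalLOD=2, per the example in the post:
print(mips_skipped(2, "Normal"))  # skips 2 mips
print(mips_skipped(2, "Higher"))  # skips 1
print(mips_skipped(2, "High"))    # skips 0: full-quality textures
```

Which is why, as he says, "High" already gives full-quality textures for everything shipped on the CD.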


It's not on the CD, and it's not the texture setting. It will obviously be in a patch.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: Woodchuck2000
Originally posted by: jliechty
I guess it's time to break out the 3DLabs cards with 416 MB of memory. :D
Yeah, but I bet it wouldn't run UT well at all...
About the compression thing - I thought that all DX7 and above cards support DirectX Texture Compression? Is that 180Mb of compressed textures?
Yeah, I know, I was just joking about the 3DLabs card - it would totally annihilate any nVidia or ATi card in CAD stuff, but would have its booty kicked to the maximum in UT2003. :(
 

codehack2

Golden Member
Oct 11, 1999
1,325
0
76
Originally posted by: Gstanfor
Speculation time.

What if nVidia implemented a (transparent to applications) texture compressor (based on S3's pro level texture compression or 3dfx's compressed formats)? All of a sudden NV30 is able to fit more textures into 128mb than other cards (which would be a reason why NV30 could do this and 9700 could not).

In addition I think it is likely we will see "free" antialiasing on NV30. The NV30 is expected to arrive with 8 pipelines, each with two texturing units. If an application can't use all the pipelines/texture units there will be spare bandwidth on chip which could be used to antialias for free (remember how 3dfx implemented their FSAA - it only cost fillrate, if you have free fillrate/bandwidth you can FSAA for free).

It's also possible NV30 could perform all z buffering on die, which would further free up texture memory, while speeding up occlusion detection.

Greg

Umm... err. Let's start at the beginning...

1)" What if nVidia implemented a (transparent to applications) texture compressor (based on S3's pro level texture compression or 3dfx's compressed formats)? All of a sudden NV30 is able to fit more textures into 128mb than other cards (which would be a reason why NV30 could do this and 9700 could not)."

-You can't compress textures twice... Texture compression can already be turned on and used in UT2k3... how would the NV30's so-called transparent compressor be any more efficient than the current implementation, which is based on S3's spec, mind you?

2) "In addition I think it is likely we will see "free" antialiasing on NV30. The NV30 is expected to arrive with 8 pipelines, each with two texturing units. If an application can't use all the pipelines/texture units there will be spare bandwidth on chip which could be used to antialias for free (remember how 3dfx implemented their FSAA - it only cost fillrate, if you have free fillrate/bandwidth you can FSAA for free)."

-Close, but no cigar. Texel fill rates do not matter when doing AA; it's all dependent on raw pixel fill rate. So even if the NV30 has 8 pipes, it will be no better off than the 9700 Pro.

3)"It's also possible NV30 could perform all z buffering on die, which would further free up texture memory, while speeding up occlusion detection."

-Don't think so... the only way this is going to happen is if the NV30 has embedded DRAM on die, which I have heard no talk of at all. On top of that, reliable sources have quoted the NV30 to be in the 120 to 130 million transistor count. Embedding enough DRAM to do on-board Z-buffering would push this chip up into the 180 to 200 million transistor count. I don't see it happening.
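The transistor-budget argument roughly checks out. Assuming a 1600x1200 framebuffer with 32-bit Z/stencil and roughly one transistor per DRAM bit (both figures are illustrative assumptions, not NV30 specs):

```python
# Rough cost of keeping the Z-buffer on die, assuming a 1600x1200 target
# with 32-bit Z and ~1 transistor per embedded DRAM bit (illustrative only).
width, height, z_bits = 1600, 1200, 32
z_buffer_bits = width * height * z_bits
transistors_millions = z_buffer_bits / 1e6
print(f"Z-buffer: {z_buffer_bits // (8 * 1024 * 1024)} MiB, "
      f"~{transistors_millions:.0f}M extra transistors")
```

Sixty-odd million extra transistors on top of a ~130M budget lands right in that 180-200M range, which is why on-die Z storage looks implausible without eDRAM.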

CH2
 

BD2003

Lifer
Oct 9, 1999
16,815
1
81
Let me get this straight: the Radeon has 8 pipes, 1 tex unit; the NV30 has 8 pipes, 2 tex units. The game obviously uses multitexturing, so wouldn't the NV30 theoretically need only half of its pipelines to match the 9700's speed? Therefore, AA wouldn't be free, but it'd certainly be a hell of a lot faster...
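The arithmetic behind that: multitexturing (texel) fill rate is clock x pipelines x texture units per pipeline, while pixel fill rate ignores the extra texture unit. The 8x2 layout and the NV30 clock below are assumptions, not announced specs (325 MHz is the 9700 Pro's actual clock):

```python
def fillrates(clock_mhz, pipes, tex_units_per_pipe):
    """Return (pixel, texel) fill rates in millions per second."""
    pixel = clock_mhz * pipes
    return pixel, pixel * tex_units_per_pipe

# 9700 Pro-style 8x1 at 325 MHz vs a hypothetical 8x2 part at 400 MHz
r300_pixel, r300_texel = fillrates(325, 8, 1)   # 2600 Mpix/s, 2600 Mtex/s
nv30_pixel, nv30_texel = fillrates(400, 8, 2)   # 3200 Mpix/s, 6400 Mtex/s
```

Note that the second texture unit doubles the texel rate, not the pixel rate, which is exactly the distinction codehack2 draws above about AA.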
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: BD2003
Let me get this straight: the Radeon has 8 pipes, 1 tex unit; the NV30 has 8 pipes, 2 tex units. The game obviously uses multitexturing, so wouldn't the NV30 theoretically need only half of its pipelines to match the 9700's speed? Therefore, AA wouldn't be free, but it'd certainly be a hell of a lot faster...
NV30 will not only have two texture units per pipeline (whether or not that will help I'm not competent enough to say), but it will also be on a lower micron process than R300, and thus (probably) be able to hit higher clockspeeds. I'm sure the higher clockspeeds will help it.