So if there is no H/W yet to run Crysis at Ultra High, how was it created?


TheVrolok

Lifer
Dec 11, 2000
24,254
4,092
136
Originally posted by: ja1484
Originally posted by: GundamSonicZeroX
Crytek's game is too much for today's hardware. I don't think Joe-user has a GeForce 9800 GTX SLI configuration (or three 8800 GTX's with good drivers) :p


I'm starting to get really tired of this statement. Crysis is very playable on high/ultra high at resolutions of 12x10 or lower if you have an 8800GTS or better.

No, you can't currently run it maxed in DX10 mode at 19x12 with full AA and AF. That doesn't mean it's an unplayable title.

As for development hardware, many companies develop on *either* consumer cards or work-station cards, but almost all titles are QA'd with consumer level video solutions, because that's what's going to be running them in the wild. Who's using what exactly depends on the developer.

QFT. I didn't realize playable is now defined as "max resolution, max settings, 60 FPS." I played it on a combination of high/vhigh (kept water lower though, seemed to help out) with some AA and AF at 1680x1050 with some tweaks and it was very playable (35+ FPS). I think I only had to drop the settings a bit toward the end when FPS dipped into the 20's.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: GundamSonicZeroX
Originally posted by: SunnyD
Wrong. Game devs generally use current-generation consumer cards, OR, if they're a large company like Epic that usually showcases next-gen tech, they'll sometimes use undisclosed next-gen hardware when ATI or NVIDIA make it available.
Linky please?

Why do you need a link? What do you think they're using?

Honestly - do you think there is something magical about the workstation cards? You do realize that the Quadro and FireGL cards have the exact same GPUs as the consumer cards, right? In fact, I remember BIOS-flashing GeForce cards into Quadros, and a simple resistor mod that let the drivers recognize a variety of GeForce cards as Quadros. If you need links that badly, hop on Google. The ONLY difference between a consumer and workstation card from NVIDIA or ATI is the driver - enabling better precision and image reproduction, usually at the expense of performance.

Originally posted by: GundamSonicZeroX
Originally posted by: SunnyD
Why would game devs target "professional" rendering if their actual target is joe-user?
Crytek's game is too much for today's hardware. I don't think Joe-user has a GeForce 9800 GTX SLI configuration (or three 8800 GTX's with good drivers) :p

While that may be so, nothing says NVIDIA didn't give alpha silicon to Crytek to develop on, though I highly doubt it. The reality of it is, as a developer, it is EASY to program past the hardware. All you do is use enough textures and polygons to exceed the fill rates of any consumer card out there, and write shaders complicated enough to bring a card's shader units to their knees. Just because the hardware can't do it at 60 FPS doesn't mean you can't program for it.
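To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python; every figure in it (pixel count, overdraw, shader length, the ~350 GFLOPS card) is an assumption for illustration, not anything from Crytek or NVIDIA:

Code:
# Back-of-the-envelope only: every figure below is an assumption for
# illustration, not a measured spec and not anything from Crytek.
PIXELS     = 1920 * 1200   # frame-buffer size
OVERDRAW   = 3             # average number of times each pixel gets shaded
OPS_PER_PX = 2000          # arithmetic ops in a long, many-light shader (assumed)
GPU_GFLOPS = 350           # rough shader throughput of a 2007 high-end card
TARGET_FPS = 60

ops_per_frame = PIXELS * OVERDRAW * OPS_PER_PX    # work the content demands
budget = GPU_GFLOPS * 1e9 / TARGET_FPS            # work available per frame at 60 FPS

print(f"work needed  : {ops_per_frame / 1e9:.1f} G ops per frame")
print(f"60 FPS budget: {budget / 1e9:.1f} G ops per frame")
print(f"best-case FPS: {GPU_GFLOPS * 1e9 / ops_per_frame:.0f}")

With these made-up numbers the content demands over twice the per-frame budget, so the "best case" lands in the mid-20s FPS - content authored past the hardware, yet still in playable territory.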

And because Joe-user doesn't have triple-SLI 9800 GTXs, Crytek included the LOWER quality modes: they want their game to run on older machines too.
 
theprodigalrebel

Oct 4, 2004
10,515
6
81
How do you think Hollywood visual effects gurus create all those CGI effects? We obviously can't render something like Ratatouille in real-time (I wonder how long it took to render a single frame in that movie). Similar principle in effect here.

This Crysis thing is nothing new. ES4: Oblivion couldn't be maxed out for a long time.
 

andrei3333

Senior member
Jan 31, 2008
449
0
0
Well, I guess this is where the "The Way It's Meant to Be Played" slogan comes into play... NVIDIA provides next-gen H/W to a developer and in return they get to put their stamp on a game... makes perfect sense to me.

So ATI and NVIDIA have to fight it out with developers for more of these stamps, to get gamers to think that this or that card is better for their favorite games. So if ATI can't create a super next-gen card for developers to build a game on, then NVIDIA gives their cards to those same developers and cashes in big time. In the end there may still be some good old-fashioned competition left out there!!!
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Originally posted by: ja1484
Originally posted by: GundamSonicZeroX
Crytek's game is too much for today's hardware. I don't think Joe-user has a GeForce 9800 GTX SLI configuration (or three 8800 GTX's with good drivers) :p


I'm starting to get really tired of this statement.
Came from Crytek, not me.
 

ja1484

Platinum Member
Dec 31, 2007
2,438
2
0
Originally posted by: GundamSonicZeroX
Originally posted by: ja1484
Originally posted by: GundamSonicZeroX
Crytek's game is too much for today's hardware. I don't think Joe-user has a GeForce 9800 GTX SLI configuration (or three 8800 GTX's with good drivers) :p


I'm starting to get really tired of this statement.
Came from Crytek, not me.


No it didn't. Crytek noted that you wouldn't be able to max everything out at absurd resolutions. That's entirely different from the game being unplayable. Doom 3 was in the same boat when it first came out.

It's not hard to create content that current GPUs have trouble with...if you slap a 4096x4096 texture on everything in the game with associated specular and parallax maps and then don't watch your polygon count, sure, it can look like real life...at several seconds per frame.
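A quick, purely illustrative sketch of the texture side of that argument; the map counts, material count and VRAM figure are assumptions, not numbers from any shipped game:

Code:
# Rough illustration only: assumed figures, not taken from any shipped game.
def texture_mib(side, bytes_per_texel=4, mipmapped=True):
    """Uncompressed size of one square texture in MiB (mip chain adds ~1/3)."""
    base = side * side * bytes_per_texel
    return base * (4 / 3 if mipmapped else 1) / 2**20

MAPS_PER_MATERIAL = 4     # diffuse + normal + specular + height (for parallax)
UNIQUE_MATERIALS  = 40    # assumed count of distinct materials in view
VRAM_MIB          = 640   # e.g. a GeForce 8800 GTS 640MB

per_material = MAPS_PER_MATERIAL * texture_mib(4096)
scene_total  = UNIQUE_MATERIALS * per_material

print(f"one 4096x4096 material set: {per_material:.0f} MiB")
print(f"{UNIQUE_MATERIALS} such materials on screen: {scene_total:.0f} MiB, vs {VRAM_MIB} MiB of VRAM")
# Real engines lean on DXT compression (4-8x smaller) and texture streaming,
# which is exactly the budgeting work the post describes skipping.

With those assumptions a single uncompressed material set already runs a few hundred MiB, and a few dozen of them dwarf the VRAM on a 2007-era card.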
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Originally posted by: ja1484
Originally posted by: GundamSonicZeroX
Originally posted by: ja1484
Originally posted by: GundamSonicZeroX
Crytek's game is too much for today's hardware. I don't think Joe-user has a GeForce 9800 GTX SLI configuration (or three 8800 GTX's with good drivers) :p


I'm starting to get really tired of this statement.
Came from Crytek, not me.


No it didn't. Crytek noted that you wouldn't be able to max everything out at absurd resolutions. That's entirely different from the game being unplayable. Doom 3 was in the same boat when it first came out.

It's not hard to create content that current GPUs have trouble with...if you slap a 4096x4096 texture on everything in the game with associated specular and parallax maps and then don't watch your polygon count, sure, it can look like real life...at several seconds per frame.

The OP was talking about max settings, so that's what I was talking about. Sorry for not making this clearer.
 

Zenoth

Diamond Member
Jan 29, 2005
5,202
216
106
Well, for some people a "playable" game means that it can run at 300 FPS at whatever resolution they throw at it. I play Crysis at 1280x960, no AA, 8x AF, in DX9, with the "Ultra High" tweaks, leaving Shadows at "High" (un-tweaked), and I run it perfectly well at between 25 and 35 FPS (depending on the level and the amount of action on screen). That for me is perfectly playable.
 

andrei3333

Senior member
Jan 31, 2008
449
0
0

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Originally posted by: andrei3333
So is this better than getting two 500-dollar cards in SLI? Same price, one would think so, right?

I remember a member here saying that he got two 7800GTXs (back when they were new) and put them in SLI in place of a workstation video card. He said he regretted it.
 
shadowofthesun

Dec 21, 2006
169
0
0
Two fast consumer cards in SLI will smoke a workstation card in gaming. Heck, even one equivalent gaming card will do it.

Ultra-high precision and extreme levels of filtering and AA aren't a requirement for most games. If you can live without them (i.e., if you don't do 3D rendering), then get the consumer card.

OTOH, if quality and precision in every single frame mean that much to you (note that during gaming this will hardly be noticeable), then workstation cards are the way to go.

It's like saying that a Xeon is inherently better than a comparable Conroe. No, it's not - you actually get more performance for general apps out of the Conroe (due to pipeline "tricks" directed towards consumers). And if the choice is between a higher-clocked Conroe - or two equivalent ones, for the sake of the metaphor - and a single Xeon, the Conroe will win in almost every instance.
 

angry hampster

Diamond Member
Dec 15, 2007
4,232
0
0
www.lexaphoto.com
Originally posted by: shadowofthesun

Ultra-high precision and extreme levels of filtering and AA aren't a requirement for most games. If you can live without them (i.e., if you don't do 3D rendering), then get the consumer card.


Agreed. Frankly, though, I find that anti-aliasing and high resolutions more or less trade off against each other in the visual quality of 3D games, simply because as the pixels get smaller, jaggies on curves become less noticeable.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Since Crysis is based on DX9, they could have created 99% of it on an X1900 or a 7900 GTX. All of the game's graphics features port directly between DX9 and DX10; there is no particular API required for the advanced effects. Like many others have said, they can develop and play at lower detail, lower resolution, and with smaller polygon counts, and the high-resolution textures and models can then be incorporated into the game. I'm amazed at things like depth of field, motion blur, day/night cycles, god rays and such being developed on DX9.

The "gotta have DX10" and "gotta have quad-core" talk was FUD from start to finish, possibly with the aid of MS and Intel; why wouldn't they want you to upgrade your CPU and OS? (Come to find out there's no difference between two cores and four, and it's actually slower in DX10. Hahaha, I'll never forget that one.) NVIDIA spamming their logo on it had its price as well.

 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: theprodigalrebel
How do you think Hollywood visual effects gurus create all those CGI effects? We obviously can't render something like Ratatouille in real-time (I wonder how long it took to render a single frame in that movie). Similar principle in effect here.

This Crysis thing is nothing new. ES4: Oblivion couldn't be maxed out for a long time.

To render a single frame: about 18 minutes.
That's 24 frames x 18 minutes = 432 minutes of render time for each second of movie.
A 90-minute movie is 5,400 seconds, so roughly 2.3 million minutes of processing in total - call it several weeks (my rough figure was 27 days) of non-stop processing, even on a render farm with 96 nodes, each node having 16 processors and 4GB of RAM.
That's just a rough estimate from having used RenderMan for several years.
Render time goes down on some frames and up on others, depending on lighting and complexity.

We are nowhere near being able to do that kind of rendering in real time.
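For what it's worth, here is a minimal sketch of that arithmetic, using only the figures quoted above and assuming every node stays perfectly busy (which a real farm never quite does):

Code:
# Sanity check on the estimate above, using only the figures quoted there and
# assuming perfect parallelism across the farm.
MIN_PER_FRAME = 18
FPS           = 24
FILM_MINUTES  = 90
NODES         = 96

frames       = FILM_MINUTES * 60 * FPS           # 129,600 frames
node_minutes = frames * MIN_PER_FRAME            # ~2.3 million minutes of compute
wall_days    = node_minutes / NODES / 60 / 24    # spread across all 96 nodes

print(f"{frames:,} frames, {node_minutes / 1e6:.1f} million node-minutes")
print(f"~{wall_days:.0f} days wall-clock in the ideal case")
# ~17 days ideal; scheduling overhead, re-renders, and frames that take far
# longer than the 18-minute average push it toward the few-week figure above.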

 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: Sumguy
Game developers have computers that make most gamers' computers cry from inferiority.

I had to laugh at reading that comment.

I have talked with developers who are working on PCs that are two years old.
It's not that they can't get the latest hardware; it's that if the project starts now, they usually keep that same hardware until the project is finished.

So what may be state of the art now may be old tech by the time the project is finished.
Usually, after key points of the game are done, it's sent over to quality control, and they run it through a few PCs that represent the average machine. But the developer is often stuck using something not quite so new.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: GundamSonicZeroX
Originally posted by: andrei3333
So is this better than getting two 500-dollar cards in SLI? Same price, one would think so, right?

I remember a member here saying that he got two 7800GTXs (back when they were new) and put them in SLI in place of a workstation video card. He said he regretted it.

Most professional applications are OpenGL-based and do not support SLI with anything but Quadro-based cards.

 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: andrei3333
Right, I was talking about maxing out everything... surely the creators must have tested the game for smooth operation at such resolutions and quality.

I think I see what you guys mean about workstation cards... something like this from NVIDIA, right?
http://www.tigerdirect.ca/appl...pNo=3618435&CatId=1826
and this from ATI:
http://www.tigerdirect.ca/appl...pNo=3501398&CatId=1826

So is this better than getting two 500-dollar cards in SLI? Same price, one would think so, right?

The main difference between the professional and gamer cards comes down to two things.

1.) Driver - The driver is usually optimized for OpenGL. It also provides access to things that the average game isn't going to need, like the ARB options.

Can you convert a gamer card to a professional card?
Some of them, yes.

So if you can take a $400 gamer card and make it perform like a $1,000 workstation card, why don't people do it?

2.) Support - If you buy a Quadro card, NVIDIA guarantees that they have tested the card and drivers thoroughly with the application that you are using. There will be no display problems, crashes, or anything else with that combination of card, driver, and application.

If there is a problem, all you have to do is email NVIDIA support and they usually respond the same day. If the application is causing problems and you contact the developer, the first thing they will ask is whether you are using a Quadro card and which model.

Developers provide no support for gaming cards used with professional applications.


 

invidia

Platinum Member
Oct 8, 2006
2,151
1
0
They tested them on the four alpha-stage GeForce 11 cards, running on 32nm octo-core Intel chips overclocked to 7GHz and cooled by liquid nitrogen.
 

slag

Lifer
Dec 14, 2000
10,473
81
101
Originally posted by: EvilComputer92
What acceptable means varies a lot from person to person.

Crysis runs on Ultra High on an 8800 GT at 1280x1024 and gets about 20 FPS.

I run it on my system at 1024x768, very high settings, no AA, and I have had no stuttering so far. I'm only about 30 minutes into the game, though. I'm going to try 1280x1024 or higher tonight and see what it does.

E4300 @ 3.22GHz
MSI 8800GT OC
4GB A-Data RAM
Vista 64
Intel Bad Axe 2 motherboard
 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
Originally posted by: andrei3333
Hmm, maybe I should install it and see what happens... I've got the game, but did not want to spoil it for myself with crappy visuals. I have an overclocked 7800GT, but I can run COD4, GOW and UTS at max settings with a variety of AA and AF on or off at my 17" LCD's 1280x1024 resolution.

How would Crysis fare? I really don't want to spoil the awesome visuals.

BTW, the only bottleneck is the video card, because I've got:
e8400
ga-p35-ds3l
2x1gb Corsair xms2 4-4-4-12
7200.11 32mb cache HDD 500GB
600w OCZ stealthxstream

Yep, a better video card will definitely shoot the framerates WAY up. Hell.. buy two 9600GT's and go to TOWNNNNN!

 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Originally posted by: shadowofthesun
Two fast consumer cards in SLI will smoke a workstation card in gaming.
I think he meant for 3D applications, i.e., what Pixar would do with one. But that is true: a single high-end consumer video card will beat the living crap out of a workstation video card in games.

 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Hmm, you've missed the answer entirely, IMO. This isn't a new concept with Crysis. Most games with adjustable resolutions work this way; users are just demanding too much lately.

If the game runs at 1920x1080 with low detail,
and the game runs at 800x600 with high detail,
then the code will work just fine with both maxed out.

No supercomputer testing required.
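A tiny toy model of that point; the detail-cost multipliers are made up purely for illustration:

Code:
# Toy model of the point above; the detail multipliers are assumed.
def relative_cost(width, height, detail):
    """Frame cost relative to 800x600 at low detail (detail = 1.0)."""
    return (width * height) / (800 * 600) * detail

LOW, HIGH = 1.0, 4.0   # assumed per-pixel cost of low vs. high detail

print("1920x1080, low detail :", round(relative_cost(1920, 1080, LOW), 1))   # ~4.3x baseline
print("800x600,   high detail:", round(relative_cost(800, 600, HIGH), 1))    # 4.0x baseline
print("1920x1080, high detail:", round(relative_cost(1920, 1080, HIGH), 1))  # ~17.3x: same code, just slower

If each axis works on its own, maxing both just multiplies the cost; nothing new breaks, it only runs slower.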