64bit gaming = huge?

ponyo

Lifer
Feb 14, 2002
19,688
2,811
126
Looks like a con job to me. A person would be better off buying a faster graphics card.

I don't buy the article or the pics.
 

Mullzy

Senior member
Jan 2, 2002
352
0
0
There really isn't much proof of anything in this article except that a game is choosing what textures and lighting to show based on 32 vs 64 bit mode detection. Maybe I'm missing something.

It also seems to me I have seen similar and better lighting/textures in games that were developed for a 32-bit CPU. I'm sure 64-bit will be better (and perhaps it is WAY better in this game)... but the article is missing some key points - like the framerate difference between 32 and 64-bit mode. It's not too impressive to have really great lighting/textures and a framerate of 30.

Hopefully 64 bit mode is actually faster as well as better looking. That would be great - the article just doesn't seem to mention anything.
 

Wingznut

Elite Member
Dec 28, 1999
16,968
2
0
So.... 32-bit CPUs cannot allow the videocard to display lighting effects and textures of posters on the wall???
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Wingznut
So.... 32-bit CPUs cannot allow the videocard to display lighting effects and textures of posters on the wall???

Exactly what I was thinking Wingznut.

Apparently the 64-bit CPU's can feed the GPU faster. Details here and here.


Honestly, what conjecture. Why would the option to run the game (slower) on 32-bit not be present at all?

I find it a little too convenient that AMDzone is displaying how much better 64-bit gaming will be, how rocks will be on the floor while they won't be there on the 32-bit version, etc. Good, maybe I will play it on my IA-64 equipped Pentium then, lol.


I definitely think that once the ball gets rolling, 64-bit computing will be faster than 32-bit computing, and having a 64-bit processor will show a performance increase. But stuff like video details being only displayable on a 64-bit processor... I need a bit more proof than this.

Otherwise, I guess we'll have to settle for 'half the vines and waterfalls'.
 

Drift3r

Guest
Jun 3, 2003
3,572
0
0
This article and game screenshot are one giant ad for AMD. Don't get me wrong, though, I like their CPUs, but this article is BS.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
An old link comparing 32-bit to 64-bit.

To be honest, there's a big difference to me between the 2 pictures. I claim to know pretty much nothing about the benefits of 64-bit over 32-bit, but I'm happy if that's the difference with minimal performance impact.
 

Vee

Senior member
Jun 18, 2004
689
0
0
The big problem facing 32-bit gaming is the virtual address range.
A 32-bit game needs to fit all the code and all the data models in about 1.5GB. The problem is even bigger for the development tools and editors than for the game itself. 32-bit editions of games are going to be cut-down versions. Just bite off, chew and swallow.

There are two major differences between running 32-bit and 64-bit code:

1: The linear virtual address space is (currently) 256TB for 64-bit code, and only 2GB for 32-bit code. Since the virtual space becomes fully fragmented, you also can't use all of it. This is the big 32-bit killer. This is why 32-bit is not viable anymore. Considering the low prices on basic 754 64-bit CPUs and MBs, I wouldn't dream of wasting any money on any 32-bit PC today. Remember the 8086 and the 640KB barrier?

2: The x86-64 ISA specifies twice as many registers available for 64-bit code. Some registers are more flexible in use as well. This should give good opportunities for some good compiler optimizations. Also, since the K8's scheduling window and register renaming are dimensioned for twice as many registers as are in use in 32-bit mode, we are not using the Athlon64s to their full potential. My guess is that we can get as much as a 50% performance increase.

These two things are the biggies. It's tempting to see 32 vs 64 in terms of some width paradigm, "twice as wide is twice as fast". But that's a very minor thing. Mostly, only the pointers will be wider and benefit from wider processing. For the rest, things that benefit from width are already done wide. 32-bit or 64-bit is not necessarily the width of the CPU's data processing. It's only the width of the integer registers, and the width of the address field of the new 64-bit instructions.
The fact that it's not a "width" issue does not mean 64-bit is not a big issue. It is an even bigger issue, because what will be done in 64-bit code cannot be done at all with 32-bit code, not even slowly. Bite off, chew and swallow.
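A quick back-of-the-envelope sketch of point 1, assuming the 48-bit canonical addressing that current x86-64 implementations expose and the default 2GB user/kernel split of a 32-bit Windows process:

```python
# Virtual-address-space gap between 32-bit and 64-bit x86 code.
# Assumptions: Windows' default 2GB user/kernel split for 32-bit
# processes, and the 48 implemented address bits of current x86-64.

GB = 1024 ** 3
TB = 1024 ** 4

# 32-bit process: 4GB of addresses, half reserved for the kernel.
user_space_32 = 2 ** 32 // 2       # 2GB usable by the application

# x86-64: 48-bit canonical addresses, 256TB of linear range total.
full_range_64 = 2 ** 48

print(user_space_32 // GB)             # 2   (GB for a 32-bit game)
print(full_range_64 // TB)             # 256 (TB of 64-bit linear range)
print(full_range_64 // user_space_32)  # 131072x more address range
```

Physical RAM is a separate question; the point is that the 32-bit figure caps the total of code, heaps, mapped files and stacks no matter how much memory is installed.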
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Call me crazy, but I thought it was pretty clear 64-bit allows for more on screen w/o a performance hit. Yes, the developer even says so:

Mr. Wright informed me that the advantages with 64 bit are purely object oriented. The 64 bit executable allows more polygons to be rendered each cycle allowing more objects to be placed into the game while still maintaining optimum performance.

Then the reviewer goes on to show so.
 

Mullzy

Senior member
Jan 2, 2002
352
0
0
I'm not disagreeing with the concept of 64 > 32 for future games. I was just stating that the article is simply making a couple statements and showing a couple pretty pictures whose quality really isn't anything special.

Now show us that the 64 bit version never drops below 80 frames per second at 1600 x 1200 with 8xAA and 16xAF with 64 guys running around in a large open area killing an army of robots and we'll be impressed. :p There just isn't any actual evidence of anything in the article.

I'm sure in the near future some games and benchmarks will start being released showing nice improvements from a 64-bit architecture. Perhaps we'll see statements from respected developers saying the game they're currently working on is going to look amazing and couldn't be replicated in a 32-bit environment.

I'm really looking forward to that... I just don't believe this article has offered anything significant.
 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
Shadow Ops: Red Mercury

wow ~ did a 12yo come up with that highly original name for that game?
 

ruiner5000

Member
Oct 18, 2004
82
0
0
you guys are something. i don't really like you slandering the work that denver did on this article. in fact it is complete bs for you to do it. none of you have tested the 64 bit version have you? have any of you talked to zombie? are any of you even gamers? it is pretty weak for aholes like jiffy to say crap like that. we don't make anything up. ever. and we have been around almost as long as anandtech. you want to say crap like this, say it to my face, not in some forum. guys like you are a dime a dozen on the net. all talk, no facts, nothing to back up what you are saying at all. i hope you enjoy being a p_rick.
 

ruiner5000

Member
Oct 18, 2004
82
0
0
and also, guess what kiddos, there is no time demo feature. also the brainiac at pcper copied amdzone, and never contacted zombie for his article. it was a hack job based on what some idiot at atari marketing did, and was not a serious look at the game.
 

Vee

Senior member
Jun 18, 2004
689
0
0

Another of those lackey BS articles designed to keep people complacent about investing in Intel 32-bit. :frown:


"Those small items that are exclusive to the 64-bit version are not an exception as we saw the same quality enhancements as detail settings rose. "

- Smoke in your eyes!


"that does not mean that AMD processors directly increase the visual quality of your games; that is a leap of conclusions that we have to hold back"

- Utter BS!


 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Hmmmmm....sounds like something we all heard earlier.

Wow! Look at SM3.0 compared to SM2.0! Looks way better! Cool!

Hey wait! That doesn't even look like SM2.0!
Ummmmm.......that isn't SM2.0.....ooops! Sorry about that...... :D

I think we'll just have to wait and see about 64bitness, and by then we'll have the Intel - AMD war going all over again in 64bit mode. We'll have new video cards to argue about as well, most likely.

 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
I must add that I find the article fishy. Leaving out whole buildings instead of lowering the detail a bit seems odd to me.

This may just be choices the developer made to make it look like you need 64bit for their game.

What about those folks running Semprons?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Vee


"that does not mean that AMD processors directly increase the visual quality of your games; that is a leap of conclusions that we have to hold back"

- Utter BS!

So you are honestly saying that you believe that 64-bit processors can display more detailed terrain (ie higher visual quality objects) over 32-bit processors, despite the fact that everything you see is calculated by the GPU and is related to the GPU's colour precision etc. A 64-bit CPU does not mean 64-bit colour! It's all done by the GPU!

We must be confusing the term "increased visual quality" here because a CPU cannot give you higher video quality/detail over another CPU. What it can do is, given a faster CPU (and with proper coding, 64-bit vs 32-bit will be 'faster'), it can render more objects onscreen simultaneously. But it doesn't mean "better looking terrain," just more stuff to look at.

Originally posted by: Vee

1: The linear virtual address space is (currently) 256TB for 64-bit code, and only 2GB for 32-bit code. Since the virtual space becomes fully fragmented, you also can't use all of it. This is the big 32-bit killer. This is why 32-bit is not viable anymore. Considering the low prices on basic 754 64-bit CPUs and MBs, I wouldn't dream of wasting any money on any 32-bit PC today. Remember the 8086 and the 640KB barrier?

When people start running 2GB of memory I will agree with you. When the standard for geekdom, as it stands, tops out at 1GB (very few enthusiasts currently run 2GB in their systems), there's still plenty of wiggle room with 32-bit's 'puny' ability to only address 2GB virtually.

2: The x86-64 ISA specifies twice as many registers available for 64-bit code. Some registers are more flexible in use as well. This should give good opportunities for some good compiler optimizations. Also, since the K8's scheduling window and register renaming are dimensioned for twice as many registers as are in use in 32-bit mode, we are not using the Athlon64s to their full potential. My guess is that we can get as much as a 50% performance increase.

I agree 100% with the bolded statement. 50% performance boost is just conjecture; we don't really know yet (although I bet it is over 25% when a game is designed from the ground up to be 64-bit). However, 64-bit games are not out yet and it appears to be a couple years off before a mainstream game becomes 64-bit optimized (not some whacked out mod).


Originally posted by: LTC8K6
I must add that I find the article fishy. Leaving out whole buildings instead of lowering the detail a bit seems odd to me.

This may just be choices the developer made to make it look like you need 64bit for their game.

No doubt; it's a showpiece for AMD. Nobody can provide a rational explanation for why 32-bit processors can't provide graphical objects as detailed as 64-bit, because it is not possible.

Originally posted by: ruiner5000
you guys are something. i don't really like you slandering the work that denver did on this article.

Nobody slandered anybody here unless they expressed their views in a speech or another spoken presentation.

However, the differences between the screenshots in Denver's article are night and day; 64-bit has way more fine details than 32-bit. And yet, everything we see in the 64-bit version we have seen before in 32-bit games: the light rays, details on the ground, craggier surfaces, etc.

In order for this to be truly convincing of 64-bit's merit, there would have to be a performance comparison between 32-bit and 64-bit running the same detailed terrain (ie looking like the 64-bit screens), and saying that the 32-bit processor is "not capable" is not an option (that's a cop out). Show us precisely *why* 32-bit is incapable of rendering those screenshots at playable fps, because we've seen those details before, and they are dependent on fast pixel shader 2.0+ performance, not 64-bit (yet).
 

assemblage

Senior member
May 21, 2003
508
0
0
Wow ruiner5000 seems mad! I'm with you fella. I don't even have to read the article to know that "32x2=64" which means "fast x2 = Twice as fast."
 
Jun 14, 2003
10,442
0
0
Originally posted by: LTC8K6
I must add that I find the article fishy. Leaving out whole buildings instead of lowering the detail a bit seems odd to me.

This may just be choices the developer made to make it look like you need 64bit for their game.

What about those folks running Semprons?


unlock the 64bit capabilities!!! hahaha no seriously
 

Vee

Senior member
Jun 18, 2004
689
0
0
Originally posted by: jiffylube1024
Originally posted by: Vee


"that does not mean that AMD processors directly increase the visual quality of your games; that is a leap of conclusions that we have to hold back"

- Utter BS!
So you are honestly saying that you believe that 64-bit processors can display more detailed terrain (ie higher visual quality objects) over 32-bit processors, despite the fact that everything you see is calculated by the GPU and is related to the GPU's colour precision etc. A 64-bit CPU does not mean 64-bit colour! It's all done by the GPU!

We must be confusing the term "increased visual quality" here because a CPU cannot give you higher video quality/detail over another CPU. What it can do is, given a faster CPU (and with proper coding, 64-bit vs 32-bit will be 'faster'), it can render more objects onscreen simultaneously. But it doesn't mean "better looking terrain," just more stuff to look at.
Exactly. That is one reason why 64-bit will give us better looking 3D worlds. Another is that the removed limit on virtual range will make larger, more realistic and detailed 3D worlds possible.
So certainly, I'm "honestly saying that" I "believe that 64-bit processors can display more detailed terrain (ie higher visual quality objects) over 32-bit processors". I don't know anything about 'Shadow Ops' or why it does things as it does. But a 64-bit game can do so because the program's virtual space can hold more, and more detailed, objects.
So I stand by that the article statement is BS.
The reason I like the phrase BS in this context is that I'm brimful enough of pent-up irritation over the 2GB limit to be somewhat lacking in patience with people who I consider part of the problem. That would be all people suggesting 32-bit is "enough". ;) Just bear with me please, I'll try to be nicer.

But they need to get on the program and acquire 64-bit equipment ;)
Of course you can claim that aviation does not bring any benefits to long distance travelers, as long as they are still insisting on traveling by horse cart. (The sad part is that it is true as long as we don't have 64-bit software. And we won't get 64-bit software as long as we are running 32-bit hardware.)

Originally posted by: Vee

1: The linear virtual address space is (currently) 256TB for 64-bit code, and only 2GB for 32-bit code. Since the virtual space becomes fully fragmented, you also can't use all of it. This is the big 32-bit killer. This is why 32-bit is not viable anymore. Considering the low prices on basic 754 64-bit CPUs and MBs, I wouldn't dream of wasting any money on any 32-bit PC today. Remember the 8086 and the 640KB barrier?

When people start running 2GB of memory I will agree with you. When the standard for geekdom, as it stands, tops out at 1GB (very few enthusiasts currently run 2GB in their systems), there's still plenty of wiggle room with 32-bit's 'puny' ability to only address 2GB virtually.

When average people are running 2GB, it's way too late to start agreeing with me. First of all, this doesn't have terribly much to do with available physical memory. I have been professionally banging my head against the 2GB roof for what seems to be a long time now. And when I started to do that, I only had 512MB of RAM in a 1.5GHz P4.

I don't feel the "standard for geekdom" (I don't have terribly much contact with young male gaming geekdom, but I occasionally have some) "tops out" at 1GB. Rather, I think 1GB is the current "normal" for WindowsXP. And 1.5GB is quite reasonable, isn't it? Or am I totally off base?

Finally there's not "plenty of wiggle room".
The 2GB virtual barrier has been a real de facto limitation for games for quite a while.
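The fragmentation effect is easy to sketch. In this toy model (the reserved layout below is invented for illustration, standing in for code, heaps, memory-mapped files, thread stacks and DLLs), most of a 2GB space is free, yet no allocation larger than the biggest contiguous hole can succeed:

```python
# Toy model of a fragmented 2GB virtual address space.
# Sizes are in MB for readability; the reserved layout is invented.

TOTAL_MB = 2048  # a 2GB space

# (start, size) regions already reserved, in MB offsets:
reserved = [(0, 64), (512, 128), (1024, 32), (1536, 96), (1900, 100)]

def free_space(regions, total=TOTAL_MB):
    """Return (total_free_mb, largest_contiguous_hole_mb)."""
    free = biggest = cursor = 0
    for start, size in sorted(regions):
        hole = start - cursor          # gap before this region
        free += hole
        biggest = max(biggest, hole)
        cursor = start + size
    tail = total - cursor              # gap after the last region
    return free + tail, max(biggest, tail)

total_free, biggest_hole = free_space(reserved)
print(total_free, biggest_hole)  # 1628 480: 1628MB free overall,
                                 # but no hole larger than 480MB
```

So a request for, say, a 600MB contiguous block fails here even though 1628MB of address space is technically free, which is the sense in which a fragmented 2GB space can't all be used.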

You might find this comment (from as early as February 2002) by Unreal developer and Epic head Tim Sweeney enlightening:

http://slashdot.org/comments.p...=54835&cid=5371889
---------------------------------------------
Intel's claims are wholly out of touch with reality.

On a daily basis we're running into the Windows 2GB barrier with our next-generation content development and preprocessing tools.

If cost-effective, backwards-compatible 64-bit CPU's were available today, we'd buy them today. We need them today. It looks like we'll get them in April.

Any claim that "4GB is enough" or that address windowing extensions are a viable solution are just plain nuts. Do people really think programmers will re-adopt early 1990's bank-swapping technology?

Many of these upcoming Opteron motherboards have 16 DIMM slots; you can fill them with 8GB of RAM for $800 at today's pricewatch.com prices. This platform is going to be a godsend for anybody running serious workstation apps. It will beat other 64-bit workstation platforms (SPARC/PA-RISC/Itanium) in price/performance by a factor of 4X or more. The days of $4000 workstation and server CPU's are over, and those of $1000 CPU's are numbered.

Regarding this "far off" application compatibility, we've been running the 64-bit SuSE Linux distribution on Hammer for over 3 months. We're going to ship the 64-bit version of UT2003 at or before the consumer Athlon64 launch. And our next-generation engine won't just support 64-bit, but will basically REQUIRE it on the content-authoring side.

We tell Intel this all the time, begging and pleading for a cost-effective 64-bit desktop solution. Intel should be listening to customers and taking the leadership role on the 64-bit desktop transition, not making these ridiculous "end of the decade" statements to the press.

If the aim of this PR strategy is to protect the non-existant market for $4000 Itaniums from the soon-to-be massive market for cost-effective desktop 64-bit, it will fail very quickly.

-Tim Sweeney, Epic Games
-----------------------------------------------------



2: The x86-64 ISA specifies twice as many registers available for 64-bit code. Some registers are more flexible in use as well. This should give good opportunities for some good compiler optimizations. Also, since the K8's scheduling window and register renaming are dimensioned for twice as many registers as are in use in 32-bit mode, we are not using the Athlon64s to their full potential. My guess is that we can get as much as a 50% performance increase.

I agree 100% with the bolded statement. 50% performance boost is just conjecture; we don't really know yet (although I bet it is over 25% when a game is designed from the ground up to be 64-bit). However, 64-bit games are not out yet and it appears to be a couple years off before a mainstream game becomes 64-bit optimized (not some whacked out mod).

I agree that we don't really know quite yet, and I admit that I wasn't specifically considering the game type of application. But it's not just conjecture. Some early Windows64 application ports show 37%-57% performance improvement. And special back-end apps, like some encoding and encryption, show 100% and even more than 100% improvement, as you probably already know.

 

Gannon

Senior member
Jul 29, 2004
527
0
0
LOL. What a crock. It will take years before masses of people get rigs decked out with gigs of RAM. Game developers shoot for the lowest common denominator unless it's somebody like Epic or id Software, or any other company that knows what the machine specs of its target audience are like. This is one of the reasons Counter-Strike is still so pervasive: it runs on ancient hardware and most anyone can play. Shooting too high with specs only reduces sales, because no one has the hardware to run it acceptably, if at all.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Vee
Originally posted by: jiffylube1024

So you are honestly saying that you believe that 64-bit processors can display more detailed terrain (ie higher visual quality objects) over 32-bit processors, despite the fact that everything you see is calculated by the GPU and is related to the GPU's colour precision etc. A 64-bit CPU does not mean 64-bit colour! It's all done by the GPU!

We must be confusing the term "increased visual quality" here because a CPU cannot give you higher video quality/detail over another CPU. What it can do is, given a faster CPU (and with proper coding, 64-bit vs 32-bit will be 'faster'), it can render more objects onscreen simultaneously. But it doesn't mean "better looking terrain," just more stuff to look at.
Exactly. That is one reason why 64-bit will give us better looking 3D worlds. Another is that the removed limit on virtual range will make larger, more realistic and detailed 3D worlds possible.
So certainly, I'm "honestly saying that" I "believe that 64-bit processors can display more detailed terrain (ie higher visual quality objects) over 32-bit processors". I don't know anything about 'Shadow Ops' or why it does things as it does. But a 64-bit game can do so because the program's virtual space can hold more, and more detailed, objects.

"Will" is the keyword. We are a few years off before 64-bit gaming gives any improvement over 32-bit. This Shadow Ops game is doing nothing spectacularly different from any other games, and it could easily run those details on a 32-bit processor at a slower speed. And I'm willing to bet the difference would be marginal. We just aren't at the "need" for 64-bit quite yet.

Originally posted by: Vee

1: The linear virtual address space is (currently) 256TB for 64-bit code, and only 2GB for 32-bit code. Since the virtual space becomes fully fragmented, you also can't use all of it. This is the big 32-bit killer. This is why 32-bit is not viable anymore. Considering the low prices on basic 754 64-bit CPUs and MBs, I wouldn't dream of wasting any money on any 32-bit PC today. Remember the 8086 and the 640KB barrier?

Doom3, considered probably the most advanced 3D game released to date, runs just fine with 32-bit. When id releases their next engine, it will probably need 64-bit. Until then, many/most future games will be using id's Doom3 engine, Epic's UT engine and Valve's Source engine. All three of those are 32-bit (although, yes, I am aware UT has a 64-bit version already; however, it does not need 64-bit yet).

When people start running 2GB of memory I will agree with you. When the standard for geekdom, as it stands, tops out at 1GB (very few enthusiasts currently run 2GB in their systems), there's still plenty of wiggle room with 32-bit's 'puny' ability to only address 2GB virtually.

When average people are running 2GB, it's way too late to start agreeing with me. First of all, this doesn't have terribly much to do with available physical memory. I have been professionally banging my head against the 2GB roof for what seems to be a long time now. And when I started to do that, I only had 512MB of RAM in a 1.5GHz P4.

I don't feel the "standard for geekdom" (I don't have terribly much contact with young male gaming geekdom, but I occasionally have some) "tops out" at 1GB. Rather, I think 1GB is the current "normal" for WindowsXP. And 1.5GB is quite reasonable, isn't it? Or am I totally off base?

1GB, 1.5GB, whatever. 1.5GB is tons today and 2GB is, essentially, overkill. More than 2GB is not necessary at the moment. By all current trends, 2GB will be the standard a couple years from now. However, right now is not 'a couple years from now'.

If cost-effective, backwards-compatible 64-bit CPU's were available today, we'd buy them today. We need them today. It looks like we'll get them in April.

Any claim that "4GB is enough" or that address windowing extensions are a viable solution are just plain nuts. Do people really think programmers will re-adopt early 1990's bank-swapping technology?

Many of these upcoming Opteron motherboards have 16 DIMM slots; you can fill them with 8GB of RAM for $800 at today's pricewatch.com prices. This platform is going to be a godsend for anybody running serious workstation apps. It will beat other 64-bit workstation platforms (SPARC/PA-RISC/Itanium) in price/performance by a factor of 4X or more. The days of $4000 workstation and server CPU's are over, and those of $1000 CPU's are numbered.

Regarding this "far off" application compatibility, we've been running the 64-bit SuSE Linux distribution on Hammer for over 3 months. We're going to ship the 64-bit version of UT2003 at or before the consumer Athlon64 launch. And our next-generation engine won't just support 64-bit, but will basically REQUIRE it on the content-authoring side.

We tell Intel this all the time, begging and pleading for a cost-effective 64-bit desktop solution. Intel should be listening to customers and taking the leadership role on the 64-bit desktop transition, not making these ridiculous "end of the decade" statements to the press.

If the aim of this PR strategy is to protect the non-existant market for $4000 Itaniums from the soon-to-be massive market for cost-effective desktop 64-bit, it will fail very quickly.

-Tim Sweeney, Epic Games
-----------------------------------------------------

Opteron boards with 16 RAM slots have no relevance to the casual/hardcore PC gamer.

Sweeney puts it best. Their next Unreal game will "basically REQUIRE 64-bit". That would be the Unreal 3 engine. Release date: 2006/2007. That's about 2 major computer overhauls and about 4 minor overhauls for myself and probably most of the people on this board. Ergo, 64-bit will become very important. In 3 years. When we're all running on Nforce5 or whatever Intel has out. For the here and now, 64-bit is, for most of us, a checkbox feature.