Computer oxymorons I'm tired of dealing with


hanoverphist

Diamond Member
Dec 7, 2006
Maybe when they made the GTA games they wanted the look and feel to be the same as on the console. HL2 was a crowning achievement in dynamics and texturing and a showcase of Valve's new engine. I'd bet the polygon counts used in that game aren't the same as the ones typically used in a console game. I know when I did my first project I went way overboard on poly count and it showed in the finished animation. The quality was great; the operability wasn't. Too much stuff to recalculate for it to move fluidly. A lot of the definition in console/low-memory games is achieved by good texturing and bump mapping, not by actual model definition. Sure, this is from a long time ago (I graduated in 2000), but I'd bet it's still relevant now.
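The geometry-versus-texture trade-off being described can be sketched with some back-of-envelope numbers. All figures here are illustrative assumptions (32 bytes per vertex, an uncompressed RGB normal map), not measurements from any actual game:

```python
# Back-of-envelope comparison: storing surface detail as geometry
# versus faking it with a normal map. All numbers are assumptions.

BYTES_PER_VERTEX = 32  # position + normal + UV coordinates, roughly

def mesh_bytes(vertex_count):
    """Approximate memory for raw vertex data."""
    return vertex_count * BYTES_PER_VERTEX

def normal_map_bytes(width, height, bytes_per_texel=3):
    """Approximate memory for an uncompressed RGB normal map."""
    return width * height * bytes_per_texel

high_poly = mesh_bytes(1_000_000)                              # detail modeled as geometry
low_poly = mesh_bytes(10_000) + normal_map_bytes(1024, 1024)   # detail faked in a texture

print(f"high-poly mesh:        {high_poly / 2**20:.1f} MiB")
print(f"low-poly + normal map: {low_poly / 2**20:.1f} MiB")
```

Under these assumptions the low-poly model plus normal map is roughly a tenth the size of the high-poly mesh, which is exactly why memory-starved consoles lean on texturing tricks.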

Why are the Wii's graphics so poor when the hardware could clearly support and display way better? Because that's the look and feel they wanted for their games.
 

So

Lifer
Jul 2, 2001
I can assure you that a PC from 1999/2000 is going to have a hell of a lot more processing power than the PS2, and a computer from 2003 is sure as shit going to have several times that of a PS2. So don't give me that bullshit about the resolution being higher on the PC when you damn well know there is more than enough video memory, system memory, and processing capability on the PC to play the same damn game not only at 3X the resolution, but with more polygons, higher-quality textures, more physics, etc., something you do not see in the PC versions of these games. The texture quality of these games on the PC is on par with the PS2 versions, despite available capacity that is monstrous compared to the PS2's hardware.
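For what it's worth, the "3X the resolution" figure can be sanity-checked with simple pixel arithmetic. Assuming a typical PS2-era NTSC output of 640x448 (an approximation; actual PS2 output modes varied) against the PC resolutions mentioned in this thread:

```python
# Rough pixel-count arithmetic behind the "3X the resolution" claim.
# 640x448 is an assumed typical PS2 NTSC output; real modes varied.

def pixels(width, height):
    return width * height

ps2 = pixels(640, 448)
for w, h in [(800, 600), (1024, 768), (1280, 1024)]:
    ratio = pixels(w, h) / ps2
    print(f"{w}x{h} pushes {ratio:.1f}x the pixels of 640x448")
```

By this estimate 1024x768 is already close to 3X the pixel count, and 1280x1024 is over 4.5X.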

I used Half-Life 2 as an example of a well-programmed game, and yes, I'm quite aware that playing HL2 on a Rage 128 Pro will make it look like HL1, but that says to me that they programmed the game to be as efficient as possible, while the same cannot be said of GTA on the PC.


I'm not going to weigh in on the rest of this discussion, but you must factor in that a PC has a whole hell of a lot more software overhead than a console. Not only is the OS doing all sorts of stuff in the background totally unrelated to your game, but the varied nature of the hardware means that devs have to write to programming interfaces that are several times slower than the bare-silicon access console devs get. If you wrote HL2 as a boot-from-CD game that ran only on one specific Dell Inspiron, you could probably get an order-of-magnitude improvement over the existing implementation.
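The layering cost being described can be illustrated with a toy sketch. This is emphatically not a graphics benchmark, just a demonstration that routing the same work through extra forwarding layers adds per-call overhead:

```python
# Toy illustration of API layering: identical work, with and without
# extra indirection. NOT a graphics benchmark, just a sketch of why
# "closer to the metal" can mean fewer cycles per call.
import timeit

def work(x):
    return x * 2 + 1

# Simulate an API stack: each layer merely forwards the call.
def layer3(x): return work(x)
def layer2(x): return layer3(x)
def layer1(x): return layer2(x)

direct = timeit.timeit(lambda: work(42), number=200_000)
layered = timeit.timeit(lambda: layer1(42), number=200_000)

print(f"direct call:  {direct:.4f}s")
print(f"via 3 layers: {layered:.4f}s")  # typically slower per call
```

The results are identical either way; only the cost per call differs, which is the whole argument about thick driver stacks versus bare metal in miniature.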
 

Cogman

Lifer
Sep 19, 2000
Problem is, you make these counter-statements and then can't back them up. How do I know you haven't disabled half the features that come with a default install of the OS? Don't make me do two installs of XP and pull up Task Manager to show you what I'm talking about.
[screenshot: sShot.png]


Sorry, but WinXP right out of the box doesn't run much leaner than that.

That's not true at all. That's like saying Windows Vista is just Windows 3.1 with a bunch of patches. That's bullshit and you know it!
Not really. It is completely possible to patch out code until the original isn't there anymore. That said, it really wouldn't surprise me to find Windows 3.1 code in Vista; heck, it wouldn't surprise me to find DOS code in Vista.

With something as complex as an OS, pretty much nobody starts from scratch nowadays, especially not companies that have been around for the last 30 years.

The fundamentals of NT and Windows 1/2/3/95/98/ME were indeed different. Notice how I didn't make comparisons between 95 and 2000 in terms of memory footprint, or complain about how 2000 got "bloated" over 95. Windows NT was great in comparison to 3.x/9X, and you know why? Because they built the OS from the ground up; they reprogrammed it to fix serious security loopholes, and that was thanks to the development of OS/2. You can't tell me that through a series of patches they could have made Windows 9X into 2000/XP. There comes a point where you've got to revamp things, something Vista was "supposed" to do but instead became this pig of an OS that nobody liked.
Tell me, how different is the WinNT API from the Win9X API? Answer: it isn't. Are things done differently in the background? Yep; the kernel, IIRC, is quite a bit different. However, because they used the same API, you are going to find bits of Win95 in NT, there's no question about it.

They didn't reprogram it from the ground up.

Windows 7 is that same pig, Vista, after losing 15 lbs and putting on some lipstick. Same pig, different day. Yes, it's true that a lot of software is just a patched version of its predecessors, but there is supposed to come a point in a piece of software's life cycle where they actually reprogram from the ground up...

Who says? That's terrible programming. If you have to rebuild something from the ground up, you have written some really crappy code. Good programming means that you reuse code. Why reinvent the wheel?

Yeah, and guess what? Intel tried to do something about this ancient design with its Itanium processors, and while they failed at that, the idea of revamping their processors DID make sense. Alpha processors were better than Intel's at one point because they were very optimized and could be clocked at high speeds despite being built on processes like 0.5nm.
Stop pulling numbers out of your ass. Nobody is anywhere close to a 0.5nm process today. Also, if you think Intel just threw away everything they'd done with the 8088 and its successors when they started Itanium, you're more clueless than people give you credit for. Engineering in general is about tweaking yesterday's concepts to fit today's problems. Why would you throw away 10+ years of work just to spend another 10+ years making a product? It's expensive, and nobody works like that.

Don't even get me STARTED on drivers. Nothing has ballooned more in size than those things. How did a driver that used to fit on a floppy, whose only job is to let my computer talk to my printer, become this monstrous 400MB install? WTF is up with that? Even if I want the slimmed-down driver HP now provides for its printers, it's still 50MB!
HP has bloated drivers, there is no question about that. Not every company does though.

Yup, you keep saying that, and while you're at it, why not just say Windows 7 is a series of patches over DOS... Maybe at Adobe that's true (I'd believe it!), but good software companies usually revamp their software at some point.
No, they don't. This is really telling about how little you actually know about the software industry. The only products made from the ground up are brand-new products. The only time a company will do a total revamp is when the original code was so terrible that they simply can't reuse any of it and maintenance is impossible.

The only other time is when the code is written in a defunct language (COBOL, say) that nobody really programs in anymore. Even then, they usually base the new code off of the old code to try and ensure good interoperability.

No, you're not getting it. The PlayStation 2 has a 4MB video card with the processing power of a video card from 1998.
Ah, no. First off, the graphics chip was a specialty part, which makes it hard to compare to any PC video card from any era. Second, the PS2 used shared memory. And finally, the PS2's processor isn't an x86; comparing just about any number from it to a PC is pretty worthless.

Why is it that I need a GeForce 3 to play GTA III on the PC at 800x600, and an X800XT to play GTA: San Andreas at a similar to slightly higher resolution? I'm quite aware of what the box says, but what the box says and what you actually need are quite different. I already know the answer: console ports always suck. They do a shit job of porting things over, and since their only constraint was the stupid five-year rule, they effectively took a game that should have worked on a computer from 1999 and made a buggy, slow game that required a computer from 2003/4/5, which is utter bullshit. I'm quite aware of what resolution the PS2 is capable of outputting its games at, but I can assure you that a PC from 1999/2000 is going to have a hell of a lot more processing power than the PS2
Bullshit. Utter and total bullshit. Do yourself a favor: go try to run a PS2 emulator. You'll find very quickly that even today's computers have a tough time emulating what the PS2 does.

They are different architectures; comparisons between them are completely and utterly worthless. Saying that a PC of the era has loads more processing power than a console of the same era is wrong on so many levels.

and a computer from 2003 is sure as shit going to have several times that of a PS2. So don't give me that bullshit about the resolution being higher on the PC when you damn well know there is more than enough video memory, system memory, and processing capability on the PC to play the same damn game not only at 3X the resolution, but with more polygons, higher-quality textures, more physics, etc., something you do not see in the PC versions of these games. The texture quality of these games on the PC is on par with the PS2 versions, despite available capacity that is monstrous compared to the PS2's hardware.
[screenshots: grandtheftauto3_790screen018.jpg, gta3_790screen008.jpg]


Take a guess at which one is which. Here's a hint: the one with the square cars, reduced detail, lower resolution, and jaggy edges isn't the PC.
 

grrl

Diamond Member
Jun 21, 2001
Maybe you should just stop; he's not listening. You're going to wind up bashing your skull on the keyboard.

Good advice. fleabag is like the Energizer village idiot and, like the commercials used to say, "nothing outlasts the Energizer."
 

fleabag

Banned
Oct 1, 2007
I'm not going to weigh in on the rest of this discussion, but you must factor in that a PC has a whole hell of a lot more software overhead than a console. Not only is the OS doing all sorts of stuff in the background totally unrelated to your game, but the varied nature of the hardware means that devs have to write to programming interfaces that are several times slower than the bare-silicon access console devs get. If you wrote HL2 as a boot-from-CD game that ran only on one specific Dell Inspiron, you could probably get an order-of-magnitude improvement over the existing implementation.

There isn't that much overhead compared to how sparse the resources are on the PS2. Just remember, the PS2 has 32MB of memory; in 1999 a computer with 128MB was extremely common. If 128MB was enough for a comfortable computing experience, that would imply no more than 64MB of RAM is being wasted by the OS. I know we could get into specifics and argue about whether the PS2 had more processing power than the PCs of that time, which is why I'm going to point this out: why is it that you need a PIII 650 with 256MB of RAM and a GeForce 3 Ti 500 to play GTA III at a decent framerate? That's according to PC Gamer magazine; when I tried it myself, with my P4 1.7GHz, R9800 Pro 128MB, and 512MB of RAM, I still wasn't able to have that great an experience playing GTA III, and that was on a 1280x1024 monitor!

The Xbox is faster than the PS2 and had better graphics, yet all it had was a Celeron 733 and some strange Nvidia video card that resembled a GeForce. Explain this one: why do the GTA games on the PC look more like the PS2 versions instead of the Xbox versions? I remember playing GTA at my cousin's house and couldn't believe how much better the graphics were compared to the PC version or the PS2 version. It went Xbox -> PC -> PS2 in terms of graphics quality. Again, you people fail to address the issue of how a series of games available on the PS2 can have ever-increasing requirements on the PC AND not look any better.
[screenshot: sShot.png]


Sorry, but WinXP right out of the box doesn't run much leaner than that.
Yeah, it does actually, but I don't think you'd know, because you probably weren't around when XP RTM was released. IIRC I had an install with the ATI video drivers fully installed (as opposed to using Windows' built-in driver), with XP SP1 and all its updates, that had a commit charge of 96MB of RAM.
Not really. It is completely possible to patch out code until the original isn't there anymore. That said, it really wouldn't surprise me to find Windows 3.1 code in Vista; heck, it wouldn't surprise me to find DOS code in Vista.

With something as complex as an OS, pretty much nobody starts from scratch nowadays, especially not companies that have been around for the last 30 years.
There isn't much of any 3.x code in Vista, as they removed most of the useful compatibility. I'd say the last OS with the most legacy code, besides the 9X OSes, would be Windows XP. Also, I'm mostly talking about the kernel, which IS a patched version of what came out in 1993 with Windows NT 3.1 but is NOT a patched version of the 9X kernel. If you look at Microsoft's foray into the OS market, they have in fact built an OS (at least its core fundamentals) from the ground up, because they needed to solve issues that simply couldn't be patched out of the DOS-based Windows OSes, i.e. 3.x/9X...

Tell me, how different is the WinNT API from the Win9X API? Answer: it isn't. Are things done differently in the background? Yep; the kernel, IIRC, is quite a bit different. However, because they used the same API, you are going to find bits of Win95 in NT, there's no question about it.

They didn't reprogram it from the ground up.



Who says? That's terrible programming. If you have to rebuild something from the ground up, you have written some really crappy code. Good programming means that you reuse code. Why reinvent the wheel?
It's not a wheel; they've turned it into a hexagon, and consequently it rides about as smoothly as you'd imagine. Yes, they did build it from the ground up, a hell of a lot more so than from XP to Vista. NT is far closer to OS/2 than it is to 3.x or 9X. There's a reason Windows 9X never got a "task manager" like the one in Windows NT: no CPU utilization, memory utilization, kernel utilization, etc. Your speaking in the abstract is getting annoying and is just a red herring...

Stop pulling numbers out of your ass. Nobody is anywhere close to a 0.5nm process today. Also, if you think Intel just threw away everything they'd done with the 8088 and its successors when they started Itanium, you're more clueless than people give you credit for. Engineering in general is about tweaking yesterday's concepts to fit today's problems. Why would you throw away 10+ years of work just to spend another 10+ years making a product? It's expensive, and nobody works like that.

Uh, I think you completely missed my point...
At the time of its announcement, Alpha was heralded as an architecture for the next 25 years. While this was not to be, Alpha has nevertheless had a reasonably long life. The first version, the Alpha 21064 (otherwise known as the EV4) was introduced in November 1992 running at up to 192 MHz; a slight shrink of the die (the EV4S, shrunk from 0.75 µm to 0.675 µm) ran at 200 MHz a few months later.
http://en.wikipedia.org/wiki/DEC_Alpha#Implementations
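The shrink figures in that quoted passage work out to about a 10% linear reduction, which can be checked with two lines of arithmetic:

```python
# Quick check of the die-shrink figures quoted from the DEC Alpha
# article: a linear shrink from 0.75 um to 0.675 um (EV4 -> EV4S).
old, new = 0.75, 0.675

linear = new / old    # linear scaling factor
area = linear ** 2    # die area scales with the square of the linear factor

print(f"linear shrink: {linear:.2f} (a 10% reduction)")
print(f"area shrink:   {area:.3f} (~19% smaller die)")
```

Note these are micrometre-scale processes, three orders of magnitude coarser than nanometres, which is the whole point of the "0.5nm" correction above.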

HP has bloated drivers, there is no question about that. Not every company does though.
Yeah, well, that seems to be the trend. Installing drivers for various pieces of hardware, with or without the "utilities" they come with, can bring an install's footprint from 192MB (your case) to something in the arena of 400MB of memory utilization. I've seen it first hand: I tried to "clean up" a system only to discover that a lot of the memory was being taken up by device drivers; closing their utilities did nothing, as kernel memory utilization was well over 250MB.

No, they don't. This is really telling about how little you actually know about the software industry. The only products made from the ground up are brand-new products. The only time a company will do a total revamp is when the original code was so terrible that they simply can't reuse any of it and maintenance is impossible.
That's how I feel, but that's clearly not how Microsoft feels. They know they can get away with the status quo because of people like you going out of your way to defend them and their behavior. I was so appalled by the resource requirements of Windows Vista that I completely swore off the operating system indefinitely. I couldn't believe something so fundamental and basic could require so many resources to operate... I'm waiting for the day when game developers start releasing their games for Linux-based machines.


Ah, no. First off, the graphics chip was a specialty part, which makes it hard to compare to any PC video card from any era. Second, the PS2 used shared memory. And finally, the PS2's processor isn't an x86; comparing just about any number from it to a PC is pretty worthless.

Bullshit. Utter and total bullshit. Do yourself a favor: go try to run a PS2 emulator. You'll find very quickly that even today's computers have a tough time emulating what the PS2 does.
OMG, I can't believe you actually used this in your argument... I feared some moron would go to this length, but to actually see it... ugh. To put it simply, emulating something and actually being faster than it are two completely different things.
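The emulation-versus-raw-speed distinction can be sketched with a toy interpreter. This is purely illustrative (a real PS2 emulator translating MIPS and a custom GPU is vastly more complex), but it shows how a machine interpreting another machine's instructions pays a dispatch cost on every step:

```python
# Toy illustration: the same sum computed natively versus through a
# tiny hand-rolled "virtual machine". The made-up 3-instruction ISA
# here is an assumption for demonstration only.

def native_sum(n):
    return sum(range(n))

def emulated_sum(n):
    """Interpret a made-up 3-instruction loop, one dispatch per step."""
    regs = {"acc": 0, "i": 0}
    program = [("add", "acc", "i"), ("inc", "i"), ("jlt", "i", n)]
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op[0] == "add":          # acc += i
            regs[op[1]] += regs[op[2]]
            pc += 1
        elif op[0] == "inc":        # i += 1
            regs[op[1]] += 1
            pc += 1
        elif op[0] == "jlt":        # loop back while i < n
            pc = 0 if regs[op[1]] < op[2] else pc + 1
    return regs["acc"]

print(native_sum(1000), emulated_sum(1000))  # same answer, very different cost
```

The host machine here is obviously far faster than the toy VM it emulates, yet the emulated path runs many times slower, which is why struggling to emulate a PS2 says nothing about raw processing power.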

They are different architectures; comparisons between them are completely and utterly worthless. Saying that a PC of the era has loads more processing power than a console of the same era is wrong on so many levels.
No, it's not, because the tasks they're meant to accomplish are quite similar if not the same. People used to compare Apple computers to PCs all the time, whether those Macs had Motorola processors or IBM's.

[screenshots: grandtheftauto3_790screen018.jpg, gta3_790screen008.jpg]


Take a guess at which one is which. Here's a hint: the one with the square cars, reduced detail, lower resolution, and jaggy edges isn't the PC.
I can't tell; they look basically the same to me. One looks a little sharper, but that's a given when the PS2 outputs at a much lower resolution. That's not what I'm talking about, though...