I'm glad to see some sensibility. You're right, it's not really necessary, and stopped being so, oh, like 20 years ago.
When you're distributing 200 MBy or 4 GBy (typical video game) sized *applications*, why would the computer owner / administrator POSSIBLY care that you just 'saved' them 20 MBy of disk space by pooling a few DLLs all over their partitions?
One can buy a terabyte disk drive for less than half the cost of a typical piece of application software like Photoshop; it's simply irrelevant to think of saving disk space by congregating DLLs or fonts or icons or whatever in naive and insufficiently administerable ways.
On the other hand it IS a big deal that I'd want my application installations to be isolated, easily administerable, easily installable / deletable / relocatable, and easily verifiable. If ALL the files needed to run a given application were under ONE directory tree root, it'd be trivial for me to say: OK, back these specific things up, and I could be 100% confident of having a good, complete backup of the application and its configuration. If they're spread across 200 directories and two or more disk partitions, I no longer have even basic backup / restore capability for that application.

I can restore the whole DISK, but not a given application if it alone gets corrupted. Restoring the whole C drive isn't a solution, because of course I may have other applications with NEWER valid data than the date of the last full / incremental backup, even if that was just hours ago, and I'd be foolish to LOSE data belonging to OTHER applications when all I wanted was to restore / repair ONE application.
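That backup scenario is exactly where a single-rooted install pays off. A minimal sketch (the application and path names here are hypothetical): when everything the app needs lives under one root, one archive operation is a complete, independently restorable backup.

```python
import os
import tarfile

def backup_app(app_root, archive_path):
    """Archive one self-contained application tree.

    Because every file the app needs lives under app_root,
    this single call yields a complete backup of that one app.
    """
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(app_root, arcname=os.path.basename(app_root))

def restore_app(archive_path, dest_dir):
    """Restore just that one application, touching nothing else
    on the system -- no other app's newer data is rolled back."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(dest_dir)
```

Contrast that with files strewn across 200 directories: no single `tar.add` call can capture the app, and no restore can avoid touching unrelated data.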
Furthermore if I find malware in something like c:\winnt\system\vbrun6.dll, what application do I blame for putting it there, and what applications do I worry about being corrupted / compromised if that same file is nominally installed by, shared by, used by 20 different programs? If that file does get corrupted now I have the great benefit of *20* applications not working rather than *one*... that's progress?
And besides, having distinct COPIES of 'shared files' in N different locations doesn't mean I'm wasting SPACE to do it. That's what hard links are for, in UNIX terminology, or hard links / junction points on NTFS. Just tell the system you'd like 20 copies of one file in this set of locations and voila: each location "really" contains the file, but they all share the same disk space, conserving resources while distributing the file everywhere it's needed.
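A sketch of that hard-link idea with UNIX semantics (the file and directory names are made up): two directory entries in two separate install trees, one set of disk blocks underneath.

```python
import os
import tempfile

# Two "copies" of a shared library in two separate app trees,
# backed by a single file on disk via a hard link.
base = tempfile.mkdtemp()
app_a = os.path.join(base, "AppA"); os.makedirs(app_a)
app_b = os.path.join(base, "AppB"); os.makedirs(app_b)

original = os.path.join(app_a, "shared.dll")
with open(original, "w") as f:
    f.write("library bytes")

linked = os.path.join(app_b, "shared.dll")
os.link(original, linked)  # second directory entry, same inode

# Both paths resolve to the same underlying file and storage:
assert os.stat(original).st_ino == os.stat(linked).st_ino
assert os.stat(original).st_nlink == 2
```

Each application sees "its own" copy under its own root, yet the data is stored once.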
Further, this is the age of databases. Now if they would do something like put all the application 'files' in a database and then let the database sort out merging / compressing duplicate data for efficiency, great.
Just so long as I can as a sysadmin do:
SELECT * FROM APPLICATION_FILES WHERE PROGRAM_NAME = 'Photoshop'
and then back up that table / set of files / whatever, or verify them, or virus scan them, or move them somewhere else, or whatever I need to do.
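A toy version of that query, using SQLite as a stand-in (the table contents and paths are hypothetical; no real installer database works this way today):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE application_files (
                 program_name TEXT,
                 path         TEXT,
                 sha256       TEXT)""")
con.executemany(
    "INSERT INTO application_files VALUES (?, ?, ?)",
    [("Photoshop", r"C:\Apps\Photoshop\photoshop.exe",     "ab12"),
     ("Photoshop", r"C:\Apps\Photoshop\plugins\lens.8bf",  "cd34"),
     ("Notepad",   r"C:\Apps\Notepad\notepad.exe",         "ef56")])

# The sysadmin query from above: every file belonging to one
# application, ready to back up, verify, or scan as a unit.
rows = con.execute(
    "SELECT path FROM application_files WHERE program_name = ?",
    ("Photoshop",)).fetchall()
```

With a stored hash per file, verification is one more query away; that's the administerability being argued for.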
And, indeed, on LINUX what you have is effectively a DATABASE of shared library names, versions, and disk locations: the dynamic linker's cache. They do NOT necessarily or usually cram them all into one directory. If an application has a library it wants to 'share' with other programs on the system, it just installs it in a registered location and runs a command (ldconfig) to update the linker's records.
So although some say spamming files all over your disk(s) is done for a "good reason", I'm QUITE aware of what they THINK are good reasons, and I reject the reasoning entirely. It's archaic, obsolete, half-baked, and a large part of the time it doesn't serve the needs of end users / system administrators or developers well. There are trivial solutions that'd be far better than the status quo.
It's like saying we should all use GW BASIC and make all variables global because having everything shared is just so convenient. Hello, welcome to the 21st century: we want things to be object oriented, isolated, secure against application <-> application or application <-> OS malware, etc. The mark of good OO design is how LITTLE of a subsystem's 'internal data' you expose outside that subsystem.
Frankly, from a security and administerability perspective I'd be just as happy if all major applications weren't only self-contained in their own private directory trees, but went so far as to run in private VMs almost totally isolated from the rest of the applications / storage on the system. We'd have a lot fewer crashes and a lot less malware that way, and installation / maintenance would sure be a lot simpler.
Originally posted by: nordloewelabs
Originally posted by: Nothinman
I'd bet money that it had nothing to do with Adobe trying to make their apps more portable and that it was just a holdover from the fact that most of their apps started out on Mac OS.
i also guessed that this capability had to do with their origins on Macs. btw, i've only tested this portability on Photoshop 6. it might not work on Photoshop 7 or 8 (the first version named CS).
No, in most cases it's done for real, good reasons like shared libraries or services.
is this really necessary? i mean, are such libraries and services big enough to justify spreading files all over the system? i see the point of such design in a small 5-20Mb app, but the last Photoshop releases have been pretty big (CS > 150Mb; CS2 > 300Mb). i'm sure sharing resources has a role in those decisions, but i doubt they are the only reasons behind them. piracy is probably a bigger concern.
on current Macs, can apps still be installed by simply copying their installation folders? in Linux lots of progs are self-contained, right? don't they need to share resources?