Sharing programs between operating systems

crustacean1

Junior Member
Nov 15, 2007
6
0
0
I was just wondering if it is possible to share a program between windows XP and Vista on a dual boot system.
For example, if I wanted to be able to use Adobe CS2 in both operating systems, could I do this with one install, or would I need to install CS twice, once in XP and once in Vista?
Obviously it would be beneficial to have only one installation as it would save considerable amounts of space.
If anyone can tell me whether this is possible, and if so how to do it, that would be great.

Thanks very much.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Since XP and Vista can both read NTFS partitions without a problem, I would think that you could just install the app to a 3rd partition and have both OSs be able to access it. Not 100% sure how CS2 works though.
 

nordloewelabs

Senior member
Mar 18, 2005
542
0
0
With small, self-contained programs, that is possible. However, for most apps it isn't, because the installation routine usually copies files to the WINDOWS, SYSTEM32 or USER folders. Furthermore, data is also added to the Registry of the OS where the installation was performed.

Photoshop 6 can run from a single folder anywhere on your hard drive and be accessed by all of your installed OSs -- just as you want. However, writing to the Registry of each OS is still desirable, as you'll want Photoshop associated with PSD files. That can be done manually from inside each OS, though.
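For what it's worth, that manual association step could look roughly like the .reg fragment below, imported once in each OS. The ProgID and install path here are made up for illustration; point it at wherever your shared Photoshop folder actually lives.

```
Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\.psd]
@="Photoshop.Image"

[HKEY_CLASSES_ROOT\Photoshop.Image\shell\open\command]
@="\"D:\\Apps\\Photoshop6\\Photoshop.exe\" \"%1\""
```

Because file associations live in each OS's own Registry, you'd import this under XP and again under Vista, while the program files themselves stay in the one shared folder.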
 

QuixoticOne

Golden Member
Nov 4, 2005
1,855
0
0
Wow *one* thing Adobe does that I actually like... they had the brains to write a mostly self-contained application.

USUALLY, because of the unexpected / unwanted / unspecified behavior of spamming files, configurations, and registry entries to random spots on C: (the user never SAID to install anything to C:), it is NOT possible to have the same application work if you dual-boot or otherwise change / replace C:, even when the user explicitly chose another disk / partition for the install.

In virtually ALL cases this is just STUPID of the application authors.

In several cases, frankly, I think it's tantamount to grounds for a lawsuit, e.g.
Microsoft Office uses its 'OGA' DRM scheme to make sure it's 'genuine' and will stop working if that check fails, and the check will fail if you change the operating system on C: even if Office is installed on D:.

However, there's nothing inappropriate about wanting to use your licensed office software on the SAME computer, just according to whether you happen to boot into XP one day or Vista the next. If you install it to a commonly accessible partition, there's no reason it SHOULDN'T work, but due to the whole OGA/DRM thing, it won't. And if you just duplicate the installation so the 2nd OS on the same PC can also use it properly, you'll probably lose your license because of the stupid OGA DRM. So basically you can't use what you paid for, even though I'm sure there are all sorts of anti-lock-in laws saying it's inappropriate for them to tie one of their products (Office) to another (say, XP) and NOT let you keep the benefit of Office if you change your mind on any given day and run something else instead of XP (say Vista, Linux, Mac OS, whatever is possibly compatible).

 

toadeater

Senior member
Jul 16, 2007
488
0
0
You can create portable (no installation or registry access required) versions of just about any application with Thinstall. It's kind of like running a virtual machine, only it doesn't need all the extra RAM.

http://www.thinstall.com/
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Wow *one* thing Adobe does that I actually like... they had the brains to write a mostly self-contained application.

I'd bet money that it had nothing to do with Adobe trying to make their apps more portable and that it was just a holdover from the fact that most of their apps started out on Mac OS.

In virtually ALL cases this is just STUPID of the application authors.

No, in most cases it's done for real, good reasons like shared libraries or services. There are conventions, if you don't like them that's fine but calling people stupid for following them only reflects poorly on you.

In several cases, frankly, I think it's tantamount to grounds for a lawsuit, e.g.

Yea, good luck with that.
 

nordloewelabs

Senior member
Mar 18, 2005
542
0
0
Originally posted by: Nothinman
I'd bet money that it had nothing to do with Adobe trying to make their apps more portable and that it was just a holdover from the fact that most of their apps started out on Mac OS.
I also guessed that this capability had to do with their origins on Macs. BTW, I've only tested this portability on Photoshop 6. It might not work on Photoshop 7 or 8 (the first version named CS).

No, in most cases it's done for real, good reasons like shared libraries or services.
Is this really necessary? I mean, are such libraries and services big enough to justify spreading files all over the system? I see the point of such a design in a small 5-20 MB app, but the last Photoshop releases have been pretty big (CS > 150 MB; CS2 > 300 MB). I'm sure sharing resources plays a role in those decisions, but I doubt it's the only reason behind them. Piracy is probably a bigger concern.

On current Macs, can apps still be installed by simply copying their installation folders? In Linux lots of programs are self-contained, right? Don't they need to share resources?

 

QuixoticOne

Golden Member
Nov 4, 2005
1,855
0
0
I'm glad to see some sensibility. You're right, it's not really necessary, and stopped being so, oh, like 20 years ago.

When you're distributing 200 MB or 4 GB (typical video game) sized *applications*, why would it POSSIBLY be of interest to the computer owner / administrator that you just 'saved' them 20 MB of disk space by pooling a few DLLs all over their partitions?

One can buy a terabyte disk drive for less than half the cost of a typical piece of application software like Photoshop; it's just irrelevant to try to save disk space by congregating DLLs or fonts or icons or whatever in naive and insufficiently administrable ways.

On the other hand it IS a big deal that I'd want my application installations to be isolated, easily administrable, easily installable / deletable / relocatable, easily verifiable. If ALL the files needed to run a given application were under ONE directory tree root, it'd be pretty trivial for me to say: OK, back these specific things up, and I can be 100% confident of having a good, complete backup of the application and its configuration. If they're spread over 200 directories and two or more disk partitions, I no longer have even basic backup / restore capability for that application. I can restore the whole DISC, but not restore a given application if it alone gets corrupted. Restoring the whole C: drive isn't a solution either, because I may have other applications with NEWER valid data than the last full/incremental backup, even if that backup was made just hours ago, and I'd be foolish to LOSE that data belonging to OTHER applications when all I wanted was to restore / repair ONE application.

Furthermore if I find malware in something like c:\winnt\system\vbrun6.dll, what application do I blame for putting it there, and what applications do I worry about being corrupted / compromised if that same file is nominally installed by, shared by, used by 20 different programs? If that file does get corrupted now I have the great benefit of *20* applications not working rather than *one*... that's progress?

And besides, having distinct COPIES of 'shared files' in N different locations doesn't mean I'm wasting SPACE to do it. That's what hard links are for in UNIX terminology, or something like junction points on NTFS. Just tell the system you'd like 20 copies of one file in this set of locations and voila: they're each "really" copies of that file in different locations, but they share the same disk space, conserving resources while still distributing the file wherever it's needed.
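A quick sketch of that idea using Python's `os.link` in a throwaway temp directory (the file names here are invented for the demo; on NTFS the rough equivalent is `fsutil hardlink create` or `mklink /H`):

```python
import os
import tempfile

# Two directory entries, one set of data blocks: the "N copies that
# share the same disk space" idea. File names are made up for the demo.
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "shared_library.dll")
linked = os.path.join(workdir, "app2_copy_of_library.dll")

with open(original, "wb") as f:
    f.write(b"\x00" * 1024)  # stand-in for the library's contents

os.link(original, linked)  # hard link: a second name, not a second copy

# Both names resolve to the same inode, and the link count shows it.
same_inode = os.stat(original).st_ino == os.stat(linked).st_ino
link_count = os.stat(original).st_nlink
print(same_inode, link_count)  # → True 2
```

Deleting one name leaves the other intact; the data blocks are only freed when the last link goes away, which is exactly the "20 locations, one file's worth of space" behavior described above.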

Further, this is the age of databases. Now if they would do something like put all the application 'files' in a database and then let the database sort out merging / compressing duplicate data for efficiency, great.
Just so long as I, as a sysadmin, can do:
SELECT * FROM APPLICATION_FILES WHERE PROGRAM_NAME = 'Photoshop'
and then back up that table / set of files / whatever, or verify them, or virus-scan them, or move them elsewhere, or whatever I need to do.
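That admin query is easy to sketch with an in-memory SQLite database; the schema, paths, and hash values below are all invented for illustration:

```python
import sqlite3

# Hypothetical "application files as database records" catalog.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE application_files (
        program_name TEXT,
        path         TEXT,
        sha1         TEXT
    )
""")
db.executemany(
    "INSERT INTO application_files VALUES (?, ?, ?)",
    [
        ("Photoshop", r"D:\Apps\Photoshop\photoshop.exe",  "ab12..."),
        ("Photoshop", r"D:\Apps\Photoshop\plugins\psd.dll", "cd34..."),
        ("Office",    r"D:\Apps\Office\winword.exe",        "ef56..."),
    ],
)

# The admin query from the post: everything belonging to one program,
# ready to be backed up, scanned, or relocated as a unit.
rows = db.execute(
    "SELECT path FROM application_files WHERE program_name = ?",
    ("Photoshop",),
).fetchall()
print(len(rows))  # → 2
```

The point isn't the particular schema; it's that once file ownership is a queryable record instead of folder-scatter convention, per-application backup and verification become one SELECT statement.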

And, indeed, on Linux what you have IS a DATABASE of shared library names, versions, and disk locations. They are NOT necessarily or usually all crammed into one directory. If an application has a library it wants to 'share' with other programs on the system, it just adds a record and runs a command to update the linker's database.
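You can poke at the lookup side of that machinery from Python with `ctypes`, which asks the dynamic linker to resolve a short library name (on glibc systems, via the cache that `ldconfig` maintains). The `libm.so.6` fallback soname is an assumption about a glibc-based system:

```python
import ctypes
import ctypes.util

# Ask the dynamic linker's lookup machinery where the C math library
# lives; find_library may return None on minimal systems, so fall back
# to the conventional glibc soname (a platform assumption).
name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(name)  # load the one shared copy every app uses

libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # → 1.0
```

Every program that links against libm gets this same on-disk copy, which is the sharing model being argued about in this thread.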

So although some say spamming files all over your disk(s) is done for a "good reason", I'm QUITE aware of what they THINK the good reasons are, and I reject the reasoning entirely. It's archaic, obsolete, half-baked, and much of the time doesn't serve the needs of end users, system administrators, or developers. There are trivial solutions that'd be far better than the status quo.

It's like saying we should all use GW-BASIC and make all variables global because having everything shared is just so convenient. Hello, welcome to the 21st century: we want things to be object oriented, isolated, secure against application <-> application or application <-> OS malware, etc. The mark of a good OO design pattern is how LITTLE 'internal data' about something you POSSIBLY SHARE outside of your little subsystem.

Frankly, from a security and administrability perspective, I'd be just as happy if all major applications weren't only self-contained in their own private directory trees, but went so far as to run in private VMs almost totally isolated from the rest of the applications / storage on the system. We'd have far fewer crashes and a lot less malware that way, and installation / maintenance would sure be a lot simpler.



 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Is this really necessary? I mean, are such libraries and services big enough to justify spreading files all over the system? I see the point of such a design in a small 5-20 MB app, but the last Photoshop releases have been pretty big (CS > 150 MB; CS2 > 300 MB). I'm sure sharing resources plays a role in those decisions, but I doubt it's the only reason behind them. Piracy is probably a bigger concern.

The size of the files or the app is completely unrelated to whether they are or can be shared. And how does piracy even come into a decision about where to place files in the filesystem?

On current Macs, can apps still be installed by simply copying their installation folders? In Linux lots of programs are self-contained, right? Don't they need to share resources?

And I think that decision for Mac apps is a really stupid one. The only ways to make an app completely portable are to rewrite a lot of stuff yourself, embed the shared libraries in your bundle, or statically link them into the binary itself, all of which are bad solutions because now you've got one more copy of that library on your machine and in memory. So you end up wasting disk space and memory, and if a security hole is found in that library you've got to update every copy instead of just one shared copy.

Most Linux apps are distributed as packages that properly depend on the shared libraries they need, so no, they're not self-contained.

On the other hand it IS a big deal that I'd want my application installations to be isolated, easily administrable, easily installable / deletable / relocatable, easily verifiable.

So having to update 10 copies of a library is more "administrable" to you than just having to update one?

Furthermore if I find malware in something like c:\winnt\system\vbrun6.dll, what application do I blame for putting it there, and what applications do I worry about being corrupted / compromised if that same file is nominally installed by, shared by, used by 20 different programs? If that file does get corrupted now I have the great benefit of *20* applications not working rather than *one*... that's progress?

Yes, it is progress. Because it also means that when a security hole is found in c:\winnt\system\vbrun6.dll, you can update that one file and have all 20 applications fixed, instead of having to hunt around your hard drive looking for copies and figuring out which ones are vulnerable and need updating.

And, indeed, on Linux what you have IS a DATABASE of shared library names, versions, and disk locations. They are NOT necessarily or usually all crammed into one directory. If an application has a library it wants to 'share' with other programs on the system, it just adds a record and runs a command to update the linker's database.

Most distributions of Linux have a package database saying what's installed and what's not, but that's all it is. On Linux, shared libraries are almost always put into one of two places: /usr/lib or /lib. Some packages will keep files in a strange place like /usr/share/packagename/lib, but that's usually only for non-binary things like icons.

So although some say spamming files all over your disk(s) is done for a "good reason", I'm QUITE aware of what they THINK the good reasons are, and I reject the reasoning entirely. It's archaic, obsolete, half-baked, and much of the time doesn't serve the needs of end users, system administrators, or developers. There are trivial solutions that'd be far better than the status quo.

Then go ahead and start making a system that works how you like it; if it really is better, it'll take off.

It's like saying we should all use GW-BASIC and make all variables global because having everything shared is just so convenient. Hello, welcome to the 21st century: we want things to be object oriented, isolated, secure against application <-> application or application <-> OS malware, etc. The mark of a good OO design pattern is how LITTLE 'internal data' about something you POSSIBLY SHARE outside of your little subsystem.

But you still need the objects to be accessible; if everything were restricted, your app wouldn't be able to do much. That's just how shared libraries work: they only export a certain set of functions, and the internal details don't matter to the linking application.

Frankly, from a security and administrability perspective, I'd be just as happy if all major applications weren't only self-contained in their own private directory trees, but went so far as to run in private VMs almost totally isolated from the rest of the applications / storage on the system. We'd have far fewer crashes and a lot less malware that way, and installation / maintenance would sure be a lot simpler.

If you want, you can do that; take a look at UML (User-Mode Linux). But installation/maintenance would probably be worse, and using the system would be a helluva lot worse. Running every app in a totally isolated VM would pretty much kill copy and paste; do you really want to pass data between programs manually with temporary files?
 

crustacean1

Junior Member
Nov 15, 2007
6
0
0
OK, thanks guys, I think I kind of understand.
Though I must admit it did start getting a little technical (and even political!) for me!
Thanks for the replies though. No harm in giving it a shot, and if it doesn't work I'll just install twice; nothing a 1TB drive can't handle!