Are executables on NFS (or other types of) shared filesystems run over the network or locally?

Cooky

Golden Member
Apr 2, 2002
We've got several file servers. Most are MS Windows, and some are Linux NFS.
My question is: when an executable is run, does it
1. get copied to the client PC first, and then run locally
or
2. run on the server and transmit the output over network to the client PC?

I'd like to know how the shared filesystem will impact our benchmark tests.

Any feedback is appreciated.
 

Smilin

Diamond Member
Mar 4, 2002
In some form or another it's copied down first. This may simply mean it's copied down to local memory but the important point is the file is pulled down all at once instead of a bit at a time.
 

Cooky

Golden Member
Apr 2, 2002
What did you mean by saying "the file is pulled down all at once instead of a bit at a time"??
 

Smilin

Diamond Member
Mar 4, 2002
Originally posted by: Cooky
What did you mean by saying "the file is pulled down all at once instead of a bit at a time"??

bit was probably a bad word...I didn't mean 1/8 of a byte :p

I meant the whole file is pulled down (to either disk or memory) and then the code is executed or processed as necessary. A data file can be sequentially or randomly accessed at the remote location. Executables are completely transferred from the remote location before access.
 

Nothinman

Elite Member
Sep 14, 2001
Originally posted by: Smilin
In some form or another it's copied down first. This may simply mean it's copied down to local memory but the important point is the file is pulled down all at once instead of a bit at a time.

No it's not; it's paged in as necessary. I'm sure there's some read-ahead done so that it's not actually pulled down in 4K (or 8K on some 64-bit systems) chunks, but in general the whole binary won't be read unless it's needed.
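
If you want to see the paging behavior for yourself, here's a rough Linux-only sketch (not from anything in this thread, just an illustration; point it at any binary you like): it maps a file the same way the loader maps an executable's text, touches a single byte, then asks mincore() how many pages actually came in. Over NFS you could watch the read traffic with nfsstat while it runs.

    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <file>\n", argv[0]);
            return 1;
        }

        int fd = open(argv[1], O_RDONLY);
        if (fd < 0) {
            perror("open");
            return 1;
        }

        struct stat st;
        if (fstat(fd, &st) < 0) {
            perror("fstat");
            return 1;
        }

        long pagesz = sysconf(_SC_PAGESIZE);
        size_t npages = (st.st_size + pagesz - 1) / pagesz;

        /* Same kind of read-only, file-backed mapping the kernel sets up
         * for an executable's text segment.  Nothing is read from the
         * file yet at this point. */
        char *map = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (map == MAP_FAILED) {
            perror("mmap");
            return 1;
        }

        volatile char c = map[0];   /* fault in just the first page */
        (void)c;

        /* mincore() reports which pages of the mapping are resident. */
        unsigned char *vec = malloc(npages);
        if (vec && mincore(map, st.st_size, vec) == 0) {
            size_t resident = 0;
            for (size_t i = 0; i < npages; i++)
                resident += vec[i] & 1;
            printf("%zu of %zu pages resident after touching one byte\n",
                   resident, npages);
        }

        free(vec);
        munmap(map, st.st_size);
        close(fd);
        return 0;
    }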
 

volrath

Senior member
Feb 26, 2004
Originally posted by: Nothinman
In some form or another it's copied down first. This may simply mean it's copied down to local memory but the important point is the file is pulled down all at once instead of a bit at a time.

No it's not; it's paged in as necessary. I'm sure there's some read-ahead done so that it's not actually pulled down in 4K (or 8K on some 64-bit systems) chunks, but in general the whole binary won't be read unless it's needed.

That seems like a Bad Idea when running an executable over the network.
 

spidey07

No Lifer
Aug 4, 2000
That's why you don't run executables over the network or from a share.

It's considered bad mojo.

But as far as moving files goes, the whole file is copied into memory. The difference is database applications, where a query is sent to the server and only the result comes back.

Or multicasting and streaming media/VoIP, where the data is a constant stream that moves in and out of memory.
 

Nothinman

Elite Member
Sep 14, 2001
Originally posted by: volrath
That seems like a Bad Idea when running an executable over the network.

So? Why special-case NFS for no real gain? To me it seems like a worse idea to read in several MB of executable and shared libraries if you're only going to end up needing a few KB of them. In most OSes the filesystem is irrelevant: the same semantics apply whether the file is on a local disk, a remote mount, or even an in-memory filesystem like tmpfs. There is a special case called XIP (execute in place) that allows an executable to be executed directly out of the device's memory if the hardware permits it, so you can run something from flash memory or tmpfs without making an extra copy for no reason. But that doesn't apply to network filesystems, since you still have to have a local copy of the pages you want to run.
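
For what it's worth, you can watch this from userspace. The little sketch below (Linux-specific, purely illustrative) just dumps its own /proc/self/maps: the executable and every shared library show up as file-backed mappings with their paths, and the kernel pages those in on demand whether the path lives on a local disk, an NFS mount, or tmpfs.

    #include <stdio.h>

    int main(void)
    {
        /* Each file-backed mapping in the output ends with the path of
         * the file it was mapped from -- the binary itself, libc, and
         * so on. */
        FILE *f = fopen("/proc/self/maps", "r");
        if (!f) {
            perror("fopen");
            return 1;
        }

        char line[512];
        while (fgets(line, sizeof line, f))
            fputs(line, stdout);

        fclose(f);
        return 0;
    }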

Originally posted by: spidey07
That's why you don't run executables over the network or from a share.

Maybe in the Windows world, but for unix it's standard procedure in a lot of situations.

Originally posted by: spidey07
But as far as moving files goes, the whole file is copied into memory

The network is irrelevant in that case: even if the source and destination are both local, the file has to be read and written in its entirety, unless they're on the same filesystem, in which case you just rename the file.
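
Here's a tiny sketch of that last point (the paths are made up, it's just to show the mechanics): rename() within one filesystem is a cheap metadata update, while across filesystems it fails with EXDEV and the caller has to fall back to a full read-and-write copy, which is what mv does behind the scenes.

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        /* Hypothetical example paths -- substitute your own. */
        const char *src = "/tmp/somefile";
        const char *dst = "/mnt/nfs/somefile";

        if (rename(src, dst) == 0) {
            printf("same filesystem: rename was just a metadata update\n");
        } else if (errno == EXDEV) {
            printf("different filesystems: a full read-and-write copy is needed\n");
        } else {
            printf("rename failed: %s\n", strerror(errno));
        }
        return 0;
    }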