Einstein@Home is starting the rollout of the S5 data with a new app. All the juicy details can be found here.
I'll update this as generation and testing of our first S5 workunits proceed.
The names of these workunits are of the form h1_XXXX.X_S5R1x_* and l1_XXXX.X_S5R1x_*.
The application running these is called einstein_S5R1. This application incorporates many of the speedups and other code changes suggested by Akos. Note: please do NOT replace these stock executables with custom versions of 'albert'. They are not compatible.
There are two types of workunits: short and long. The short workunits have XXXX.X less than or equal to 0400.0.
There are also two types of data files: short and long. The short data files (l1_XXXX.X) are from the LIGO Livingston Observatory, and are about 4.5MB in size. The long data files (h1_XXXX.X) are from LIGO Hanford and are about 16MB in size. Note: once your computer downloads one of these data files, it should be able to do many workunits for that same file.
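To make the naming scheme concrete, here is a minimal sketch (a hypothetical helper, not part of the project's actual code) that parses a workunit name of the form above and reports the detector and, using the XXXX.X <= 0400.0 rule, whether the workunit is short or long:

```python
import re

# Workunit names look like h1_XXXX.X_S5R1x_* or l1_XXXX.X_S5R1x_*.
# The prefix gives the detector (h1 = Hanford, l1 = Livingston);
# per the rule above, XXXX.X <= 0400.0 marks a short workunit.
WU_NAME = re.compile(r"^(h1|l1)_(\d{4}\.\d)_S5R1x_")

def classify_workunit(name):
    """Return (detector, 'short' or 'long') for an S5R1 workunit name."""
    m = WU_NAME.match(name)
    if m is None:
        raise ValueError("not an S5R1 workunit name: " + name)
    detector = {"h1": "Hanford", "l1": "Livingston"}[m.group(1)]
    kind = "short" if float(m.group(2)) <= 400.0 else "long"
    return detector, kind

print(classify_workunit("l1_0350.0_S5R1x_0"))   # a short Livingston workunit
print(classify_workunit("h1_0812.5_S5R1x_42"))  # a long Hanford workunit
```

The example names (`l1_0350.0_S5R1x_0`, `h1_0812.5_S5R1x_42`) are made up for illustration; only the pattern itself comes from the post.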
We are switching to a new uniform system for awarding credits. All users on all platforms will claim the same credit for each workunit, with an amount of credit proportional to the length of the workunit: 'equal credit for equal work'.
To try to increase the total amount of computing power available to the project, we have changed the target number of results and the minimum quorum from 3 to 2. Additional work will be generated only if the first two results from different hosts/users do not agree.
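The quorum rule above can be sketched in a few lines. This is an illustrative toy, not the actual BOINC validator (which uses a project-specific fuzzy comparison of results rather than exact equality):

```python
# Minimum quorum is now 2: a workunit is settled once two results
# from different hosts agree; otherwise extra copies are issued.
MIN_QUORUM = 2

def needs_more_results(results):
    """Given the results returned so far (each from a different host),
    decide whether additional copies of the workunit must be sent out.
    Results are compared with '==' here purely for illustration."""
    if len(results) < MIN_QUORUM:
        return True   # quorum not yet reached
    # Quorum reached: consensus requires the first two results to agree.
    return results[0] != results[1]

print(needs_more_results([3.14]))         # only one result back so far
print(needs_more_results([3.14, 3.14]))   # first two agree: done
print(needs_more_results([3.14, 2.71]))   # mismatch: generate extra work
```

The point of the change is visible in the middle case: when the first two results agree, no third copy is ever computed, which was the common case under the old quorum of 3.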
Please be patient with us if we have to sort out last minute problems or other issues. We have been testing this privately for some time, so we are fairly confident that there are no significant issues that remain. Nevertheless, several of us on 'the cutting edge' are rather short on sleep!
Bruce
Added 14/06/2006: please feel free to post questions here, but keep in mind that it may be a while before we can answer everything.
Added 15/06/2006: we have now generated about 1500 workunits. So far everything is looking good. There is a fairly large number of errors in downloading data files, but these appear to be mostly due to some problems we had yesterday replicating data to our mirror sites. There are also interesting errors on *some* Mac OS X PPC systems. I'm sure we'll sort these out quickly.