While Synchronize works quite well, it is not flexible enough for what I had in mind. To integrate Amazon S3 into a real-life backup and DRP policy, I want a set of atomic functions that let me simply transfer particular data objects between S3 and the server.
Using the rich set of S3 classes Jets3t provides, along with the samples supplied by its author, I found it quite easy to write additional proof-of-concept tools.
I now have CreateBucket, UploadSingleFile, and DownLoadSingleFile programs. These are all bare-bones tools; use them at your own risk.
All three use the iUtils helper, which provides central management of parameters and credentials. In fact, iUtils reads exactly the same configuration file that Synchronize does.
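To give an idea of what that looks like in practice, here is a minimal sketch of the credential-handling part. It is not the actual iUtils code, and the property names (accesskey, secretkey) and file path are assumptions; adjust them to match your own configuration file.

import java.io.FileInputStream;
import java.util.Properties;

import org.jets3t.service.S3Service;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.security.AWSCredentials;

public class S3Session {
    /* Build an authenticated S3 service from the same properties file
       the Synchronize tool reads. The property names and the path are
       assumptions - change them to suit your setup. */
    public static S3Service connect(String propsPath) throws Exception {
        Properties props = new Properties();
        props.load(new FileInputStream(propsPath));

        AWSCredentials credentials = new AWSCredentials(
            props.getProperty("accesskey"),
            props.getProperty("secretkey"));

        return new RestS3Service(credentials);
    }
}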
I timed the upload and download tools with a 500MB save file; on my network the transfer took less than 30 minutes in either direction.
The usage is quite simple:
- CreateBucket requires a single parameter: the new bucket name.
- UploadSingleFile requires two parameters: the target bucket name and the name of the file to upload.
- DownLoadSingleFile requires two parameters: the source bucket name and the name of the file to download.
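For the curious, the core of each tool boils down to a couple of Jets3t calls. The sketch below shows roughly what those calls look like; it is not the actual code of the tools, the credentials, bucket name, and file names are placeholders (in the real tools they come from the shared configuration file through iUtils), and error handling is omitted.

import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.jets3t.service.S3Service;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.model.S3Bucket;
import org.jets3t.service.model.S3Object;
import org.jets3t.service.security.AWSCredentials;

public class S3PocSketch {
    public static void main(String[] args) throws Exception {
        AWSCredentials credentials =
            new AWSCredentials("MY-ACCESS-KEY", "MY-SECRET-KEY");
        S3Service s3 = new RestS3Service(credentials);

        /* CreateBucket: one call, the new bucket name is the only input */
        S3Bucket bucket = s3.createBucket("mybackup");

        /* UploadSingleFile: wrap the local file in an S3Object and put it */
        S3Object toUpload = new S3Object(new File("/backups/s3wk200901.zip"));
        s3.putObject(bucket, toUpload);

        /* DownLoadSingleFile: fetch the object and stream it back to disk */
        S3Object fetched = s3.getObject(bucket, "s3wk200901.zip");
        InputStream in = fetched.getDataInputStream();
        OutputStream out = new FileOutputStream("/backups/s3wk200901.restored.zip");
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        out.close();
        in.close();
    }
}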
Using Amazon S3 in an iSeries backup/recovery scenario
S3 can be used as an offsite backup for some, maybe even all, of your data. A library can easily be saved to a save file, and the save file uploaded to S3, where it sits until needed, at which time it can be downloaded and restored to your iSeries.
For example, look at the following set of commands, which saves changed objects to a save file, zips it, and sends it to S3. The extra zip step is needed because the data compression built into the iSeries save commands is not very efficient, and because I have not yet made compression an integral part of the upload tool, although the functionality exists in the jets3t package (a sketch of what that could look like follows the command listing).
/* create save file */
crtsavf mylib/s3wk200901
/* backup objects changed since December 31 to save file */
savchgobj obj(*all) lib(mylib) dev(*savf) savf(mylib/s3wk200901) +
refdate('31/12/2008') clear(*all) dtacpr(*high)
/* copy save file to IFS */
cpytostmf frommbr('/qsys.lib/mylib.lib/s3wk200901.file') +
tostmf('/backups/s3wk200901.savf') stmfopt(*replace)
/* further compress save file */
qsh cmd('jar cMf /backups/s3wk200901.zip /backups/s3wk200901.savf')
/* upload compressed save file to S3 */
qsh cmd('cpytos3.sh mybackup /backups/s3wk200901.zip')
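As for folding the compression into the upload tool itself, one way would be to zip the save file with plain java.util.zip before handing it to Jets3t (the jets3t package has its own compression helpers, but I have not used them yet). The sketch below is only an illustration of the idea, with placeholder credentials and an already existing bucket.

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import org.jets3t.service.S3Service;
import org.jets3t.service.impl.rest.httpclient.RestS3Service;
import org.jets3t.service.model.S3Bucket;
import org.jets3t.service.model.S3Object;
import org.jets3t.service.security.AWSCredentials;

public class ZipAndUpload {
    public static void main(String[] args) throws Exception {
        String savf = "/backups/s3wk200901.savf";
        String zip  = "/backups/s3wk200901.zip";

        /* Zip the save file first, replacing the separate 'jar cMf' step */
        ZipOutputStream zout = new ZipOutputStream(new FileOutputStream(zip));
        zout.putNextEntry(new ZipEntry(new File(savf).getName()));
        InputStream in = new FileInputStream(savf);
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            zout.write(buffer, 0, read);
        }
        in.close();
        zout.closeEntry();
        zout.close();

        /* Then upload the compressed file to an existing bucket, as before */
        AWSCredentials credentials =
            new AWSCredentials("MY-ACCESS-KEY", "MY-SECRET-KEY");
        S3Service s3 = new RestS3Service(credentials);
        S3Bucket bucket = new S3Bucket("mybackup");
        s3.putObject(bucket, new S3Object(new File(zip)));
    }
}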