Cedeus DB backups
>> return to Cedeus_IDE
How to set up Backups
notifications
To get notified about the backups via email, the shell script may send emails via "mailx", i.e. Nail => see http://klenwell.com/press/2009/03/ubuntu-email-with-nail/
Postfix may work as well.
=> ToDo: install a mail program
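A minimal sketch of such a notification, assuming an Ubuntu system; the package name, subject and recipient address are assumptions, not settings from this wiki:
# install a mailx implementation (package name may differ per release, e.g. bsd-mailx)
sudo apt-get install heirloom-mailx
# send the last lines of the backup log as a notification mail
tail -16 /home/ssteinig/geonode_db_backups/pgsql.log | mailx -s "GeoNode DB backup" admin@example.com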
dump of the GeoNode DB - on CedeusDB
- create a shell script that contains the pg_dump instructions - see /home/ssteinig/pgdbbackup.sh on CedeusDB
- test if the script and its execution actually work. A simple script for testing may be this (/home/ssteinig/touchy.sh):
#!/bin/bash
touch /home/ssteinig/ftw.text
- create a crontab entry for user ssteinig with "crontab -e"
- then add an entry such as "00 01 * * * sh /home/ssteinig/geonodegisdb93backup.sh" to run the dump script daily at 1am
- when using the user "postgres" to do the db dump:
  - check if the postgres user has a password assigned already (use ALTER ... to do so: http://wiki.geosteiniger.cl/mediawiki-1.22.7/index.php/Setting_up_geonode#Some_PostgreSQL_commands )
  - create a .pgpass file to provide the password: http://wiki.postgresql.org/wiki/Pgpass (see the sketch after this list)
- check also if cron is running with "sudo service cron status", otherwise start it
- to see what the crontab contains, use "crontab -l"
File transfer
To transfer files I decided, for safety reasons, to create a new cedeus backup user on the receiving computer (20xxb...p).
A file transfer can be accomplished using scp or, better, rsync, e.g.:
- "
scp /home/ssteinig/ftw.txt user@example.com:/home/backup_user/dbbackups/
"
- However, an SSH key should be generated first so no password needs to be provided. A detailed description can be found at: http://troy.jdmz.net/rsync/index.html
- in short, do "ssh-keygen -t rsa -b 2048 -f /home/thisuser/cron/thishost-rsync-key". But do not provide a passphrase when generating it, otherwise it will always be asked for when establishing a connection.
- Then copy the key to the .ssh folder of the backup user on the other server (using scp), and add it to authorized_keys. (Note: the .ssh folder should be chmod 700 and authorized_keys chmod 600.)
- Then we would use "scp -i /home/ssteinig/cron/thishost-rsync-key /home/ssteinig/ftw.txt user@example.com:/home/backup_user/dbbackups/" - note that it is probably necessary to initialize a server connection once (with whatever file), so that the host's ECDSA key fingerprint gets accepted.
- "
- having my ssh keys setup, the code for syncing the cedeusdb directory with rsync would be
- "
...ToDo...
"
- "
Example geonodegisdb93backup.sh
#!/bin/bash logfile="/home/ssteinig/geonode_db_backups/pgsql.log" backup_dir="/home/ssteinig/geonode_db_backups" touch $logfile echo "Starting backup of databases " >> $logfile dateinfo=`date '+%Y-%m-%d %H:%M:%S'` timeslot=`date '+%Y%m%d-%H%M'` /usr/bin/vacuumdb -z -h localhost -U postgres geonodegisdb93 >/dev/null 2>&1 /usr/bin/pg_dump -U postgres -i -F c -b geonodegisdb93 -h 127.0.0.1 -f $backup_dir/geonodegisdb93-backup-$timeslot.backup echo "Backup and Vacuum complete on $dateinfo for database: geonodegisdb93 " >> $logfile echo "Done backup of databases " >> $logfile # sstein: email notification not used at the moment # tail -16 /home/ssteinig/geonode_db_backups/pgsql.log | mailx blabla@blub.cl
This example is based on the shell script posted here: http://stackoverflow.com/questions/854200/how-do-i-backup-my-postgresql-database-with-cron
For a better Postgres dump script it may be worth looking here: https://wiki.postgresql.org/wiki/Automated_Backup_on_Linux
Result Summary of GeoNode Data DB Backup
- server: CedeusDB
- cron job running nightly at 1:00am
- using the script geonodegisdb93backup.sh
- copies the PG dump file to CedeusGeoNode into the folder /home/cedeusdbbackupuser/geonodedbbackups/ (see the sketch after this list)
- ToDo: perhaps change this and copy it to cedeusgis1 for straight backup on a drive
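A hedged sketch of how that copy step could look when appended at the end of geonodegisdb93backup.sh, so that $backup_dir and $timeslot are still in scope; the host name cedeusgeonode and the key path are assumptions, the user and target folder come from the summary above:
# copy the freshly created dump to the backup user on CedeusGeoNode
scp -i /home/ssteinig/cron/thishost-rsync-key $backup_dir/geonodegisdb93-backup-$timeslot.backup cedeusdbbackupuser@cedeusgeonode:/home/cedeusdbbackupuser/geonodedbbackups/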
dump of the GeoNode user db - on CedeusGeonode VM
blabla
tar/zip of the (uploaded) GeoNode file data and docs - on CedeusGeonode VM
blabla
MySQL dump for Elgg miCiudad - on CedeusGeonode VM
blabla
tar/zip of the (uploaded) Elgg miCiudad files - on CedeusGeonode VM
blabla
MySQL dump for Mediawiki(s) - on CedeusGeonode VM
blabla