Take a backup, install updates automatically on a local machine, and commit changes to the remote server

anilbhatt

I would like to discuss my problem with you folks.

I want the following tasks to be automated on my local machine with Drush or a shell script:

1) Take an SVN/Git checkout of my remote server onto my local machine.
2) Take a backup of the database and codebase.
3) Install the Drupal core and contributed-module updates on the local machine.
4) If #3 is successful, commit the changes to Git or SVN.
5) Generate a SQL file listing the queries that were run during #3 on the local machine.

I can do all of this manually except the last point, but I want to set it up as a cron job.
Is this possible with Drush and shell scripting?
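The first four steps above can be sketched as a single shell script driven from cron. This is only a minimal illustration under assumed conditions: the paths, remote name, and branch below are placeholders, and it assumes Git and Drush are on the PATH and the local copy is already a working Drupal site.

```shell
#!/bin/sh
# Hypothetical sketch of the cron workflow; paths, remote, and branch
# are assumptions, not taken from this thread.
set -e

SITE=${SITE:-/var/www/site}            # local working copy of the site
BACKUPS=${BACKUPS:-/var/backups/site}  # destination for dumps and tarballs

# Build a timestamped dump filename, e.g. /var/backups/site/db-20240101.sql
backup_name() {
    printf '%s/db-%s.sql' "$1" "$2"
}

run_update() {
    stamp=$(date +%Y%m%d-%H%M%S)
    cd "$SITE"
    git pull origin master                                 # 1) refresh checkout
    drush sql-dump > "$(backup_name "$BACKUPS" "$stamp")"  # 2a) database backup
    tar czf "$BACKUPS/code-$stamp.tar.gz" .                # 2b) codebase backup
    drush pm-update --yes                                  # 3) update core + contrib
    git add -A
    git commit -m "Automated update $stamp" || true        # 4) commit only if changed
    git push origin master
}

# Invoke from cron with an argument, e.g.:
#   30 3 * * * /usr/local/bin/site-update.sh run
if [ "${1:-}" = run ]; then
    run_update
fi
```

Step 5 (logging the individual SQL queries) is the hard part, as discussed in the replies below; it is not covered by this sketch.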

Thanks in advance.

Comments

greg.1.anderson

See https://github.com/greg-1-anderson/utiliscripts/blob/master/drupal-backu...

Only Git is supported, and I will also warn that this is a work in progress; there may be some minor bugs. The Drush Extras project is required; you must install it before using this script.

The process for this script is similar to what you describe above, but slightly different. The sql-hash command is used to determine whether anything changed in the node or users table since the last backup; if not, the backup is skipped.
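A comparable change check can be approximated without Drush Extras by checksumming the watched tables directly. This is a sketch, not the script's actual mechanism: it assumes a MySQL backend (for `CHECKSUM TABLE`) and a hypothetical state-file path.

```shell
#!/bin/sh
# Sketch: skip the backup when node/users are unchanged since the last run.
# The state-file path and the use of MySQL's CHECKSUM TABLE are assumptions.
STATE=${STATE:-/var/backups/site/.last-checksum}

# Ask the site's database for a checksum of the watched tables.
current_checksum() {
    drush sql-query "CHECKSUM TABLE node, users"
}

# Compare a freshly computed checksum against the stored one.
# Returns 0 (true) when they differ, i.e. a backup is needed.
tables_changed() {
    new=$1
    old=$(cat "$STATE" 2>/dev/null || echo none)
    [ "$new" != "$old" ]
}
```

Usage would look like: `sum=$(current_checksum); if tables_changed "$sum"; then drush sql-dump > ...; echo "$sum" > "$STATE"; fi`.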

When a backup is done, the backup site is brought live on the local machine, so you can ensure at any time that your backed-up database is in a working state just by visiting the backup site. Also, the Git hash for the version of the code associated with the backed-up database is written into the database; this makes it possible to recover an old version of the site from an old backup of the database. Presently, only the most recent SQL dump is saved, and there is no script to restore from older versions of the site, but that is planned for the future. I also plan to set up point-in-time recovery for my databases, so that I can recover to any snapshot point in the past using the binary logs.

The script also checks to see if updates are needed for the site; if any are needed, a second live copy of the site is made, and pm-update is executed on it. The updated code is not automatically committed to the repository. A critical step in the upgrade process is to test your site to make sure that none of the introduced code changes break anything. My plan is to write an administrative devops block, enabled only on the update site, so that the admin can push a button on the homepage to commit the code and push it to the live site once testing is complete. I haven't even started this yet, though. In any event, the list of updated module versions is cached so that the update is not re-run if the admin does not test the site right away.

As for logging the SQL queries executed to copy and upgrade the site, I have no plans to do this, as I do not think that this information is interesting or useful. I do plan on ensuring that the output from the pm-update command, including the output from all of the update hooks that are executed, is logged and emailed to the admin. This is not done yet. Once I have the output from this script cleaned up, so that important info is emailed and mundane progress information is logged, I plan on running this script daily from cron. Presently I run it manually, though.

This script is unsupported, but I will answer questions about it here. If it does not do things exactly the way you would like, it might still serve as an interesting starting point for you to use in putting together your own backup and update scripts.

You might also consider investigating the backup and migrate module, although it does not automate the pm-update step.

Need installation steps for your Git script

anilbhatt

Thanks a lot for your detailed reply.

I would like to add a few more things here:

You are checking only the node and users tables, but if changes occur in other tables, such as taxonomy tables or tables belonging to contributed or custom modules, your script will skip the backup. That means we are relying only on the node and users tables to decide whether to back up. Also, why not use the Drush make command to capture the codebase instead of checking the sql-hash?

Your script is really helpful for me, thanks a lot. It would be even more helpful if you added some installation steps, for example noting that the script requires ssh, rsync, and Drush to be set up on the local machine before running it.

Thanks
Anil Bhatt

Yes, the script sort of

greg.1.anderson

Yes, the script sort of assumes that if the node / users table are not changing, then the modifications to the site are probably minor and can be skipped. This is a design decision that was basically a judgement call based on the usage pattern of the sites I am backing up. You could add more tables here if you wished, or skip the sql-hash and back up every day.

Drush make is for code, whereas sql-hash checks for database changes, so the two are not interchangeable. Currently, I check my entire Drupal codebase into Git, a process I have never been super happy with, although it does work. If you'd like to try checking in just a makefile, and use that to restore the code for your site, that might be a viable strategy. See http://drupal.org/node/1368242.
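If you do go the makefile route, a minimal drush make file is only a few lines. This is a hypothetical example: the project list is illustrative, and pinning exact versions (rather than taking the latest recommended releases, as below) would be needed for reproducible restores.

```ini
; sketch.make - hypothetical makefile; the project list is an example only.
core = 7.x
api = 2

; With no version keys, drush make fetches the latest recommended releases.
projects[] = drupal
projects[] = views
projects[] = ctools
```

Rebuilding with `drush make sketch.make sitedir` restores only the code; the database still needs its own backup, which is why the two mechanisms are complementary rather than interchangeable.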

Query log with drush updatedb

anilbhatt

Does anyone know why Drush is not printing the query log, even though Devel's SQL query logging is enabled? I am running the following command:

$ drush -v updatedb

Is there any other way to print the SQL query log with Drush?
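One workaround, since Devel's query log is rendered into themed page output and so never appears in a CLI run, is to capture the queries on the database side instead. A sketch using MySQL's general query log, assuming a MySQL backend, the SUPER privilege, and a hypothetical log path:

```shell
#!/bin/sh
# Sketch: capture every SQL statement MySQL executes while updatedb runs.
# Requires the SUPER privilege; the log-file path is an assumption.

# Build a "SET GLOBAL name = 'value'" statement for drush sql-query.
set_stmt() {
    printf "SET GLOBAL %s = '%s'" "$1" "$2"
}

capture_update_sql() {
    drush sql-query "$(set_stmt general_log_file /tmp/updatedb.sql.log)"
    drush sql-query "$(set_stmt general_log ON)"
    drush -v updatedb --yes
    drush sql-query "$(set_stmt general_log OFF)"
}
```

After running `capture_update_sql`, every statement issued during the update (along with any other concurrent traffic, so run it on a quiet copy of the site) is in /tmp/updatedb.sql.log.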

Thanks in advance,
