Sync Staging (From Staging to Production as Simply as Possible)

philbar's picture


To me a staging server is one where I try new modules, make settings changes, and otherwise screw around with the data structures. A production server holds the content, interacts with my users, etc.

My problem is that when I create a staging server (by dumping the production server's database), it immediately goes out of sync with the production server.

Attempted Solutions

  • Deployment - The Deployment framework is a series of modules designed to let developers easily stage Drupal data from one site to another. Unfortunately, all modifications have to be made manually, so on a large production site it can be very difficult to keep the staging site up to date with user content.
  • Staging - The description is vague since it is a newcomer to this arena. It appears to be designed to send staging configuration to the production site, and could potentially overwrite user-created content.
  • AutoPilot - AutoPilot is a complete build system that captures code and configuration changes from entire teams of developers, merges and synchronizes them, and provides a framework for easy, consistent, and optionally scheduled builds. Not developed for Drupal 6.x.
  • Stage - The Stage module taps into Drupal's revision system to allow host-based staging of content stored in the database. No release is available, and the project appears to be abandoned.

Proposed Solution

What I would like to propose is a module which automatically syncs all modifications made on the production site back to the staging site. This includes new comments, node revisions, new user registrations, etc.

The way it would work: when an event happens on the production site, it is also triggered on the staging site.

If a change being made on the staging site might conflict with this sync, permissions should be modified to prevent the conflict.

For example, if you are working on switching from the Image module to ImageField, you can temporarily turn off permission to upload images for all users. This way, production events don't mess up the work you're doing on the staging site. It also guarantees nothing falls through the cracks when all the images are migrated.

This will allow site builders to keep their staging site up to date with the production site. When the staging site is stable and it comes time to make the switch, you can put both sites in maintenance mode and replace the production database with the staging database. When you disable maintenance mode, the new features and user content should be available with no headaches.
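The cutover described above can be sketched as a short script. Everything here -- the paths, the database names, and the use of drush with the Drupal 6 `site_offline` variable -- is an assumption for illustration; the steps are collected into a plan and printed rather than executed, so the sequence can be reviewed first.

```shell
# Hypothetical maintenance-window cutover; paths and db names are placeholders.
PLAN=""
step() {
  PLAN="$PLAN$*
"
}

step "drush --root=/var/www/production vset site_offline 1"  # maintenance mode on
step "drush --root=/var/www/staging vset site_offline 1"
step "mysqldump staging_db > /tmp/staging.sql"               # export the staging db
step "mysql production_db < /tmp/staging.sql"                # replace production's db
step "drush --root=/var/www/production vset site_offline 0"  # maintenance mode off

printf '%s' "$PLAN"
```

Printing the plan first is deliberate: a step like replacing the production database is worth eyeballing before it runs.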



RobLoach's picture

Another solution out there is Features. I was talking with Ken at DrupalCon and he also brought up the idea of using Domain Access to mimic a staging website, but have special settings and stuff for the staging site.

I have this exact problem as

JayKayAu's picture

I have this exact problem as well. I would have thought that this problem is not only common, but would appear in pretty much every instance of Drupal out there?

In any case, thank you for taking the time to investigate these other solutions.

I've heard from a colleague who works in a Joomla shop that there's a Joomla module which keeps track of any changes users make to a site and lets you roll back any change (useful when they're busy breaking the site by digging around in the admin area). I'd love to get my hands on something like this as well!

beanluc's picture

Under the model which the Deployment module serves, "staging" is where changes to managed content are made. Such managed content is never entered directly on production. The Deploy module sends approved content to production after editing, proofing, etc. is done at the staging site.

Things like comments are un-managed content: they're never staged, they're only ever entered in production. If you did want to manage "un-managed" content, production is where you'd do it, because such content isn't staged at all. No Deploy module anywhere in this picture.

So, how about getting a Development site which mirrors Production?
You could have a clone of your site, with the clone's MySQL database set up as a slave of the live site's MySQL database. When replication is enabled and running, all DB changes in PRO are replicated to the slave database in real time. If you then change things in DEV, your DB gets out of sync to the point where replication could fail. That's no big deal: it won't crash your DEV site; it will just stop receiving replicated updates/inserts/whatever from PRO.

Identify which specific tables you need replicated, OR which ones you DON'T want replicated.
For example: Maybe all you want replicated are nodes, comments, etc., so only replicate those tables.
OR maybe you know you'll be updating (let's say) menus and modules in DEV, so replicate all tables except those.

The idea here is that you don't expect to need to receive replicated changes to menus, modules, etc. because you don't expect any changes to those will occur in PRO, for one thing, and for another thing, those are the things you'll be changing in DEV. So by ONLY replicating the tables where you DO expect updates in PRO, you can change other things without introducing replication conflicts.
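In MySQL terms, "ONLY replicating the tables where you DO expect updates in PRO" maps onto slave-side replication filters. A minimal sketch of the slave's my.cnf, assuming a database named `drupal` and default Drupal 6 table names (adjust both to your install):

```ini
[mysqld]
server-id = 2

# Replicate only the content tables you expect PRO to change...
replicate-do-table = drupal.node
replicate-do-table = drupal.node_revisions
replicate-do-table = drupal.comments

# ...or invert it: replicate everything EXCEPT what you edit in DEV.
# replicate-ignore-table = drupal.system
# replicate-ignore-table = drupal.menu_links
```

`replicate-do-table` and `replicate-ignore-table` are evaluated on the slave, so PRO needs no changes beyond having its binary log enabled.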

Google the subject of replication. This is ONE way you could slave a test site from a production site and do experiments on the slave. Under this model, once you have your new configuration all tested on the DEV/slave site, you could wholesale migrate it into PRO, or repeat the config steps in PRO as if it were a stand-alone site, or stop replication long enough to use the Deploy module from DEV to PRO (maybe).

Just some ideas. But I do think that DB replication is going to be part of an ideal solution for you, with what's available today.

Dev to Prod

marynz's picture

This is exactly what I need. I've been looking for a relatively easy-to-implement solution for this (noob here).

Does anyone have any further solutions? It seems like a lot of folks need this, but there is no easy answer... yet (hopefully).

i think that the solution can

Wassim Ghannoum's picture

I think the solution could be a new module based on Backup and Migrate.

Backup and Migrate makes a backup file of the site's database every xxx minutes (specified in the module's configuration) and stores these files in a specific folder on the server...

So the idea is to make a new module that gets the last backed-up database from the production site (it should be in a specific folder on the production site) and then imports it.

The module should automatically get the database exported (by Backup and Migrate) from the production site and import it into the back-end site...

It should also keep the last xx copies of the database (this should be configurable)...

deploy module looks ace

jayboodhun's picture

Wow, the Deployment module looks ace. It makes it so much easier to synchronise database changes.

i think that the new solution

Wassim Ghannoum's picture

I think the new solution should also be based on Backup and Migrate Files.

Using these two modules, Backup and Migrate and Backup and Migrate Files, we get a completely backed-up site...

Deployment is not the best solution: it can only back up the database, so if a user posts a new node on the production site with an image (using ImageField or similar), that image will not be transmitted to the back-end site...

Also, if the administrator uploads a new module or makes changes to any theme or any file on the production site, those changes will not be applied to the back-end site...

So we have to find a solution that synchronises both the database and the files, and Deployment doesn't do that...
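A database-plus-files sync along these lines is essentially two commands, one for each half. The hostnames, paths, and database names below are placeholders, and as in a dry run the commands are collected and printed rather than executed:

```shell
# Hypothetical prod-to-staging pull of both halves of the site.
PLAN=""
step() {
  PLAN="$PLAN$*
"
}

# Half 1: the files directory (uploaded images etc.) that a
# database-only tool misses.
step "rsync -az prod.example.com:/var/www/html/sites/default/files/ /var/www/staging/sites/default/files/"

# Half 2: the database itself.
step "ssh prod.example.com mysqldump production_db | mysql staging_db"

printf '%s' "$PLAN"
```

Run from cron, this pair keeps both the files and the database of a staging copy trailing production.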

Google Wave as a conduit...

JayKayAu's picture

Hi there,

I've had an idea for how to approach this problem using Google Wave as a conduit. I've described it here:, and I'd be interested in your feedback.

Drupal Back Up / Migrate / Development Site

fcortez's picture

Check out the Backup Client-Server module:
It makes this process fairly simple. It not only backs up your database(s) but the website files too.

It's fairly straightforward, with multiple configuration options for the more advanced user.

Where are we at on this

robertcfi's picture

This is amazing! We have sites which are constantly being updated on preview and live. How far along is this? Being able to have a module keep our preview site up-to-date would be amazing!

Staging is not for backup

rogical's picture

Usually, we have several servers in the project:

development -- used only for development; all changes are allowed.
Changes (from developers): system tables, data structures, code files, and other non-user-generated content.

test -- new changes from development come here and are tested against test cases.

staging -- changes that pass testing come here, ready to migrate to production at a scheduled time.

production -- syncs changes from staging at a scheduled time, such as midnight;
before that, the site is put into maintenance mode.

So staging is just a change pool for production; every change is only allowed to migrate to production after the production administrator's approval.

A module such as Backup and Migrate is not capable of this.

One method for dev / staging

iancawthorne's picture

One method we use for dev / staging / live servers is a configuration-data-only Backup and Migrate profile. We manually go through the database checking which tables hold configuration and exclude any which do not, so that data on the live site is not lost, such as tables storing node data, user data, order data, taxonomy data, etc.

This works and hasn't had any side effects, but the possibility of data loss is quite high due to human error. I'd love to find a more foolproof and easy-to-manage solution.
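The same exclusion list can be fed straight to mysqldump, which lowers the human-error risk a little by keeping the list in one reviewable script. The table names below are a few illustrative Drupal 6 content tables, not a complete list, and the database name `drupal` is a placeholder:

```shell
DB=drupal
# Content tables to leave OUT of the configuration-only dump
# (illustrative, not exhaustive -- audit your own table list).
CONTENT_TABLES="node node_revisions comments users users_roles sessions watchdog"

ARGS=""
for t in $CONTENT_TABLES; do
  ARGS="$ARGS --ignore-table=$DB.$t"
done

# Echoed here so the full command can be inspected; drop 'echo' to run it.
echo mysqldump$ARGS "$DB" ">" config-only.sql
```

Restoring `config-only.sql` into the live database then touches only configuration tables, leaving node, user, and order data alone.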

I've been looking for a more

patrickavella's picture

I've been looking for a more foolproof and easy-to-manage solution since I started using Drupal about a year ago. So far, what you're doing is the closest I've seen to "what everyone else does". A module that creates dumps of tables depending on their usage (maybe with checkboxes?) might be useful, just to ease the pain of deciding which tables to push/pull without disrupting a high-traffic site.

One method for dev / staging

bmartinP4's picture

This is similar to what we do, in that we run shell scripts that rsync all the important directories (the theme, files, etc.) and then use the Backup and Migrate module to back up just the database on our stage site and restore it to the live site. This process would be useless for most sites, as it would overwrite comments and other user data, but we don't have any user-submitted data, so we get no conflicts. I'd much rather set up selective SQL table replication (or just have a module that does it for me), and so far Deploy's inability to work with certain very popular modules is preventing us from getting any use out of it. Better integration of Deploy with other modules, or another option entirely, is needed for content replication/testing.

Curious if you wouldn't mind

aniebel's picture

Curious if you wouldn't mind sharing which tables you found were configuration tables?

It depends on your set up.

iancawthorne's picture

It depends on your set up. The more contributed modules enabled, the more tables need examining for configuration tables.

The general rule of thumb was that the following tables are content related:


All others are config. User tables would be added if users are to be excluded.

Cheers, that's a great help!

aniebel's picture

Cheers, that's a great help!

'site_update' module solves the problem?

dtwork's picture

I saw a talk @ BadCamp2011 by Dave Cohen on his site_update module:

and have been testing it these last couple of days.

Basically, as I understand it, you create a 'base' site, in addition to your usual {dev,stage,live} sites.

Settings & configuration changes that are to be shared in common across ALL the sites are made to the 'base' site. Then, the module dumps the DB for 'base', storing it in your site tree, which you then check into your version control.

Any other site, e.g. dev, then checks out from 'base' -- including the dumped 'base' db -- and the module's 'dev' instance is then used to sync up changes.

What seems really nice about site_update's approach is a bit of SQL record-ID trickery. Normally, when you sync like this, all 'dev' changes would be overwritten -- but with site_update, any and all changes made to the 'dev' DB are written with record #'s > 1,000,000, i.e. 1000001, 1000002, etc. The dev site's local changes are NOT overwritten, as the 'base' DB's records are all in the low record numbers.
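For a sense of the trick (and independent of how site_update actually implements it), the per-table effect can be produced in MySQL directly; `node` here is just one example table:

```sql
-- Hypothetical illustration, not site_update's code: after loading the
-- 'base' dump, push each table's next AUTO_INCREMENT id above 1,000,000
-- so local dev inserts never collide with low-numbered base rows.
ALTER TABLE node AUTO_INCREMENT = 1000001;
```

A later re-import of the base dump then touches only the low-numbered rows, leaving local records above the threshold intact.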

Site code, of course, you manage as usual via checkouts from version control, e.g. git.

I've been taking it for a spin, and it seems to work as advertised.

What concerns me a bit is that no one else seems to be talking about it, or using it. The last thing I want to do is come to depend on an unsupported approach :-/

Seems like a great approach to me -- what do other people here think?

I think this might solve my problem

bailz777's picture

I am going to try this out as I think it might solve my problem.

I have a base site with the basic settings and have created a number of sites with the same settings, but different content.

I have been looking for a way to update all my sites' basic structure by updating the db, but without affecting any content on the respective sites.

Have you found any drawbacks to this yet?

I got stuck at stage one :(

bailz777's picture

am I supposed to update the site_update_dump script or just run it?

I tried running it, but it doesn't do anything.

I understand that it should be doing a mysqldump of the specified tables.

Please let me know how you got it to run.


can you post the command you

dtwork's picture

can you post the command you executed, and any console output you get?

Here is what happened in my Terminal

bailz777's picture

I included the path so you know exactly where I am at.

Now I am sure I am supposed to change something in the site_update_dump script, but the explanation given is very unclear.

[lance@vps1 bin]$ ./site_update_dump
USAGE: root/base-site/sites/all/modules/site_update/bin/site_update_dump /sites/default/settings.php [--mysqldump_arg=value]
(Typical mysqldump arg is --compatible=mysql40)
[lance@vps1 bin]$

[lance@vps1 bin]$

dtwork's picture

[lance@vps1 bin]$ ./site_update_dump
USAGE: root/base-site/sites/all/modules/site_update/bin/site_update_dump /sites/default/settings.php [--mysqldump_arg=value]

You're not doing what the USAGE snippet indicates -- where's your argument?

the explanation that is given is very unclear.

On the module's site, it suggests: "Be sure to read the README.txt before you enable and use the module!"

The README has simple, clear instructions of exactly what to do.

Read the README before asking the question

bailz777's picture

I did read the README.txt but maybe my question was not clear.

In the README.txt it states this.

The dump will by default go to stdout, but we want to save
the output to sites/all/database/site_update.sql. Why there? Because
update.php will look for it there later. So run the site_update dump
like this (change all paths as needed for your install).

cd /path/to/my/drupal
sites/all/modules/site_update/bin/site_update_dump site/localhost/settings.php > sites/all/database/site_update.sql

Once the script is run, you can commit
sites/all/database/site_update.sql into source code control, in order
to replicate it on every copy of the site. When settings are changed
on the base, re-run the dump script and commit the changed site_update.sql.

Unfortunately for me, I am very new to Drupal, SQL, and bash :(
I was hoping that you could help guide me to better understand the README.txt instructions and to setup the module correctly.

I did read the

dtwork's picture

I did read the README.txt

That was my point. DO what the README tells you to:

  So run the site_update dump like this (change all paths as needed for your install).

  cd /path/to/my/drupal
  sites/all/modules/site_update/bin/site_update_dump site/localhost/settings.php > sites/all/database/site_update.sql

NOT what you've done:

  cd /base-site/sites/all/modules/site_update/bin

More importantly, this sort of support should really go into the 'site_update' issue queue ...


charos's picture

I wonder why no one has mentioned Aegir to deploy a staging site and then rsync or some Drush tricks (drush rsync @live @dev)! Check also
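For anyone trying the Drush route: `drush rsync` and `drush sql-sync` work against site aliases. A sketch of an aliases file (conventionally somewhere like `~/.drush/aliases.drushrc.php`; the hosts and paths here are made up):

```php
<?php
// Hypothetical alias definitions for the two sites.
$aliases['live'] = array(
  'root'        => '/var/www/live',
  'uri'         => 'http://www.example.com',
  'remote-host' => 'live.example.com',
);
$aliases['dev'] = array(
  'root' => '/var/www/dev',
  'uri'  => 'http://dev.example.com',
);
```

With those in place, `drush sql-sync @live @dev` copies the live database into dev, and `drush rsync @live:%files @dev:%files` copies the files directory.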

And here's a fresh concept :

charos's picture

And here's a fresh concept : dog-new-era-drupal-sitebuilding


Christopher James Francis Rodgers's picture

dog (lower-case-only) = Drupal On Git (or is it Gitt?)

Vapor-ware ??

  • portable, deployable Drupal site packages.

All the best; intended.
-Chris (

"The number one stated objective for Drupal is improving usability." ~Dries Buytaert *

Agreed with bmartinP4 totally!

rmjackson's picture

I was really bored last night, so I decided to write up a general description of a solution I implemented not long ago. The production DB is set up to replicate downstream to both Staging and Development slave instances of the Drupal DB. Each of those, in turn, serves the actual Staging or Development instance of the site.

What's important is that both the Staging Slave and the Development Slave be set up as chained slaves -- very key!
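The "chained slave" requirement translates to one extra MySQL setting: the intermediate slave must write what it replicates into its own binary log, so the next slave down the chain can read from it. A minimal sketch of the intermediate slave's my.cnf (the server id and log name are placeholders):

```ini
[mysqld]
server-id         = 2
log-bin           = mysql-bin
# The key line for chaining: changes applied from the production master
# are written to this slave's own binlog, so the next slave in the
# chain can replicate from it in turn.
log-slave-updates = 1
read-only         = 1
```

Without `log-slave-updates`, the changes arriving from production would be applied locally but never passed further down the chain.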

When staging and dev need to be refreshed at the next code sprint, replication is turned off on both the Staging Slave and the Development Slave (assuming the table engines aren't all InnoDB), and those data directories are copied over to fresh instances of the Staging Master and Development Master with a bash script.

That's how you keep it all going all the time, for those who seem to never stop coding/bug fixing. :) You can leave production on a reliable/costly cloud instance, then move staging and development internally. It's a wonderful feeling when it all comes together and the development/testing flow starts to happen again the way it should.

Note: it takes some time to set up correctly -- my.cnf and replication -- but it's totally worth it in the long run.

[boring blog post]

This would integrate well

MisterSpeed's picture

This would integrate well with my LifeCycle project: