Drupal development - staging - production - deployment strategies

chinchunarayan's picture

Dear all,

I thought it would be helpful to bring this up here.

Our situation is that we are moving a large, multi-user knowledge management portal into private beta. While we want a "live" version of the site, with content creation, file and image uploads, and so on, we also want to keep active development going on several areas of functionality.

In general, developing functionality, modules, etc. requires modifying the database and creating nodes, views, content types, and so on, on the "development" version of the site, at the same time that users are adding nodes and content to the live "production" version of the site.

This would seem to make the advice "record and replicate what you've done" a little impractical. If you design a series of modules and nodes, replicating the nodes/content is not necessarily trivial. If you design portions of the site based on the node structure (nids or tids, etc.), and those numbers are taken by the actions of users on the live site, replicating the work can clearly NOT be trivial in many situations.

How do people handle issues such as these in "live, dynamic production environments?"

One solution that comes to mind is to link both the development and production versions to the same database, with very frequent automated backups of the db, so that "if anything breaks" because of db changes, the database can be quickly reverted to a previous state (and the broken database then inspected). Is this practical?

What other strategies are out there?

Here's my current process:

  1. Dump the production site's MySQL database.
  2. Overwrite my development site's database with the production MySQL dump file.
  3. Install the new module/functionality/configuration option code on my dev server.
  4. Reconfigure the module's admin settings on the dev server via the admin interface on the dev site.
  5. Write down - by hand - the module setting changes I've made on the dev site (so that I can replicate them on the production site).
  6. Push the module code changes to production.
  7. Log into the production site and replicate the module setting changes I've just made on the dev site.

This just seems like a lot of work. And what if I don't configure the module settings just the same way on the production site as I did on my dev site?

I really love Drupal, but until I can convince my organization that there are solid ways to test and deploy modules from dev to production, I may need to look at other CMSes or options.

Thanks,

Chinchu

Comments

Don't base on TID/NID

spp's picture

Hopefully you already have a pretty good handle on your taxonomy, so replicating TID changes shouldn't be particularly difficult. As for NIDs, don't base your site on them. Use pathauto or similar to create good path names and base your development on those paths. You might also want to look at some of the modules that let you easily dump and restore your database and migrate content from site to site.
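
For example, in Drupal 6 a path alias can be resolved back to its node at run time instead of hardcoding an NID. A minimal sketch (the "about-us" alias and the helper function are hypothetical):

```php
<?php
// Hypothetical helper: load the node behind the "about-us" alias
// rather than hardcoding its NID in a block or theme function.
function mymodule_get_about_node() {
  // drupal_lookup_path() returns the internal path, e.g. "node/42".
  $internal = drupal_lookup_path('source', 'about-us');
  if ($internal && preg_match('/^node\/(\d+)$/', $internal, $matches)) {
    return node_load($matches[1]);
  }
  return FALSE;
}
```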

Features

qbnflaco's picture

The Features module allows you to group a bunch of content types/views/spaces, etc. into modules which can easily be turned on. It also uses Strongarm, so you can even include entries from the variables table. You can also set which modules are required for each particular feature, and permissions are included as well. I think there is some sort of exportables integration with taxonomy(?). I'm starting to use this for reusable functionality groups on my sites so I don't have to start from scratch for image galleries, news sections, events calendars, page editors, etc.

I've also started looking into the Deploy module for creating nodes on dev and then moving them to live at a certain time.

How much do you use features then?

AndyF's picture

There are some modules that sound awesomely powerful that I really want to get familiar with: Features, Context, Spaces, Strongarm... I just haven't found the time yet. With my current knowledge (i.e. not knowing how to use these modules), the best way I know to tackle the OP's task is to use a custom module to make the necessary config changes in code, splitting different areas into different files to be a bit more source-control friendly. So my question: do you find that you use Features exclusively, or do you use a mixture of approaches (e.g. image galleries and news sections are features, smaller site-specific configs are made within a custom module, something else entirely???). Would love to hear more...

Thanks!

Andy

As an

qbnflaco's picture

As a themer/builder/can't-yet-create-a-module-on-my-own kind of guy, I've used existing modules for most functionality, and most of my projects have been new sites. On one of the new sites I've been experimenting with building with Features, so I can create a "page" feature, for example, with WYSIWYG, roles, and the required modules needed for page editing/creating functionality. If any modules use the variables table, I can also import those using the Strongarm module in Features. It takes a little bit to wrap your head around, but it seems to be pretty good so far.

I'm also going to create site-specific ones, like a "homepage" context that brings in site-specific blocks, as well as a "global" context that sets blocks on a global basis.

I haven't fully completed the project I'm starting this off with, so I'll let you know how it goes as it gets closer to the end. :-)

Thanks, sounds fascinating

AndyF's picture

Thanks, sounds fascinating stuff. Really must get round to checking them out.

Two aspects to dev-staging-production

mukesh.agarwal17's picture

IMO, there are two aspects to dev-staging-production:

  1. Code -- use CVS/SVN/Git to maintain code versions. I strongly recommend keeping all the contributed modules in sites/all/modules/contrib and all the custom modules in sites/all/modules/custom, and implementing the theme within sites/all/themes, so that you only ever need to update/commit from sites/all.

  2. Database -- the point you have been talking about is exactly what I gave a presentation on at a Drupal camp in Pune, India. Here is the link -- http://www.slideshare.net/drupalindia/migraine-drupal-syncing-your-stagi... -- it talks about using Migraine, but it does require some technical understanding. It has been very handy and has worked cleanly for all my projects. Let me know if it is helpful. If you are interested in trying this solution out, here is the link for the script required for D6 -- http://groups.drupal.org/node/19196

Cheers,
Mukesh Agarwal
www.innoraft.com

Link did not work

mukesh.agarwal17's picture

Try this -- http://bit.ly/c4IMUc

Cheers,
Mukesh Agarwal
www.innoraft.com

Hi Mukesh, I took a look at

AndyF's picture

Hi Mukesh,

I took a look at your slides a while back, and was rather scared off by the Limitations page (more generally by the idea that some tables and some changes might not be neatly categorised as just content or just configuration), and found myself preferring the idea of using a custom module to provide config changes. How big a problem have you found that to be with Migraine? Do you use Migraine in combination with other techniques to store config?

TIA

Andy

Not really limitations

mukesh.agarwal17's picture

Hi Andy,

You can assign each doubtful table as either configuration or content, and then repeat the minor configurations if required. I have been using Migraine with Drupal installations that have over 250 tables, and I haven't really had a problem yet. With the sequences table removed in Drupal 6, it has become a lot easier. I have been using only Migraine for database migration. It's also always a good idea to create a branch in your repository for your database dumps; that way you know which database to roll back to if you have a major code rollback.

Btw, a custom module will face a similar issue. I'm surprised that with a custom module you can achieve a generic database migration and synchronize it with code.

Cheers,
Mukesh Agarwal
www.innoraft.com

Thanks for the quick

AndyF's picture

Thanks for the quick response!

repeat the minor configurations if required

Could you elaborate on this?

Btw, a custom module will face a similar issue. I'm surprised that with a custom module you can achieve a generic database migration and synchronize it with code.

Just to make sure we're on the same page: when I say custom module, I'm actually talking about something a bit more specific. The idea I'm thinking of is to use a module's install file to handle configuration. This lets you make all sorts of changes from one place, in a source-control (and export) friendly way. The update hooks play nicely with this idea, and if you have many internal revisions to one live revision, you can refactor the small internal revisions into one update. I don't know of any issues with this approach, but a) it is a bit more complicated than using something like Features, and b) it doesn't necessarily allow as easy export and reuse (though it's much better than db dumps).
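
A minimal sketch of the kind of .install file this refers to (module name, variable names, and values are all hypothetical; Drupal 6 conventions):

```php
<?php
// mymodule.install -- hypothetical example of keeping configuration in code
// so it can travel through source control and be applied with update.php.

/**
 * Apply the site's baseline configuration on a fresh install.
 */
function mymodule_install() {
  variable_set('site_frontpage', 'news');
  variable_set('user_register', 0);  // Close open registration.
}

/**
 * Capture a later settings change, first worked out through the UI on dev.
 * Existing sites pick this up the next time update.php runs.
 */
function mymodule_update_6001() {
  $ret = array();
  variable_set('site_slogan', 'Knowledge management, in beta');
  return $ret;
}
```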

mukesh.agarwal17's picture

Hey Andy,

By repeating the minor configurations, I meant making the same change on the production site after having made it on the staging site. Again, this does not really help. Whether a table can be categorized as content or configuration depends on the use case. If the staging server holds the right setting, then you can treat the table as configuration, and if the right setting is on the production site, then you can categorize the table as content. You need to decide based on the transition you are making --> dev to staging / staging to production / production to staging.

As for the module approach you are describing, I really think it's a good idea. Update hooks will surely help. So all you will need to do when you move from staging to production is update the module, and the update hook will run ALL the changes. The question here is how you will capture the changes on the staging/dev site. I might make changes again and again before making the final call -- do you intend to capture all the update/insert queries? Even if the answer is yes, how would the module know that an update/insert/delete query is something you want to migrate, since you might just be updating/adding/removing content on staging to test a new feature? I'm curious how you can capture all the changes in a module, because that way we could contribute this and make it available for all Drupal developers.

Cheers,
Mukesh Agarwal
www.innoraft.com

Hey Mukesh change on

AndyF's picture

Hey Mukesh

change on the production site after having made it on the staging site

I think the Holy Grail is to eliminate that step... easier said than done of course.

The question here is how you will capture the changes on the staging/dev site.

Yes, that is the question! I've not been using this method for long, but I'll give you my two cents. As I see it, the module should use the core Drupal API, or something higher level (e.g. the Views API)*. If necessary, this means hand-coding changes, but where possible you want to avoid that. So you want to export whenever possible: modules like schema, node_export, views_export, content_copy, permissions_api, and variable_dump can help with this. Also look for support of CTools' exportables API - it'll be great when Wysiwyg supports it, for example. These are relatively low-level modules, for the nuts and bolts. There are a number of modules** designed to help with exporting anything you want, but I haven't found the time to investigate yet. I'd love to hear anyone's experience with those modules.
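
For instance, a view exported with views_export (or the Views UI export screen) can live in code rather than the database, following the usual default-views pattern. A sketch, where the module and view names are made up and the body of the export is elided:

```php
<?php
/**
 * Implementation of hook_views_default_views().
 * Lets Views pick this view up from code instead of the database.
 */
function mymodule_views_default_views() {
  $views = array();

  // Paste the output of the Views export screen (or views_export) here.
  $view = new view;
  $view->name = 'news_listing';  // Hypothetical view name.
  // ... the rest of the exported view definition ...

  $views[$view->name] = $view;
  return $views;
}
```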

do you intend to capture all the update/insert queries?

Yes. I think anything else is creating problems for yourself down the line.

how would the module know that the update/insert/delete query is something you want to migrate

Basically it's not an automagic approach - you're crafting the module to build your site, rather than setting it to record and then to play :)

And of course something like Features might make this approach seem old-fashioned :)

Andy

* James linked to a Tag1 Consulting article that uses the install/update method, but where the functions contain direct SQL queries. This seems more error-prone, less upgrade-friendly, and generally scarier to me, but I'd be interested in hearing people's thoughts.

** Not mentioned in the comparison, but high on my check-it-out list are feeds and Transformations -- Drupal data. (I think the top of my list is now potential alternatives: features and yamm.)

I've been using a custom

jherrmann's picture

I've been using a custom module to promote changes to multiple environments. I wasn't sure whether Features would handle all the updates that we need to address.

There are a lot of strategies out there for promotion. I've been combining them piecemeal into a comprehensive solution that works for our Dev, Test, Stage and 5 Prod boxes.

I would suggest reading this ( http://books.tag1consulting.com/scalability/drupal/start/staging ) on a sourcing strategy and watching this ( http://www.commonplaces.com/resources/drupalcon-paris ) on "Managing Your Project in Multiple Environments" from DrupalCon Paris.

James Herrmann
Sisters of Mercy Health System

Some interesting thoughts on

davidwatson's picture

Some interesting thoughts on the subject! As a solo developer, the workflow I currently use is something like this:

1) Back up the database on Dev.
2) Configure everything to my heart's content - using the UI if necessary - to "get a feel" of what I want and what modules/hooks to achieve it.
3) Create exports of everything exportable: Views, Rules, and Macros for the really off-the-wall stuff that's hard to achieve otherwise.
4) Write code to duplicate effects that aren't exportable, adding it to incrementing hook_update_n() and hook_install() for the modules in question as necessary. Core site functionality goes in its own custom module.
5) Revert the DB on Dev, and run update.php to confirm that changes are successful.
6) Commit to the repository.

7) Back up the database on QA/Staging.
8) Pull the changes from the Dev repository to QA/Staging, and run update.php.
9) Run functional/acceptance tests as necessary. If things break, file ticket(s), revert, and fall back to Step 2.

10) Put the site into offline mode, and pull the appropriate revision from the repository.
11) Back up the database on Production (while in offline mode!).
12) Run update.php, and confirm that everything has deployed correctly.
13) Bring the site back online. Repeat ad infinitum.

This ensures that everything changes the DB in an incremental way - that is, because functionality is versioned using hook_update_n(), it can always be upgraded regardless of the state of the DB on the target site. It also brings feature-related DB changes into revision control, making life much easier down the line.
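
To illustrate the incremental part (hypothetical module and settings, Drupal 6 update conventions): update.php records the highest update number already applied for each module, so every target site only runs the updates it hasn't seen yet, regardless of how far behind its database is.

```php
<?php
// mymodule.install -- hypothetical incremental updates.

/**
 * Update 6001: enable the modules the new gallery feature depends on.
 */
function mymodule_update_6001() {
  $ret = array();
  module_enable(array('imagecache', 'imagecache_ui'));
  return $ret;
}

/**
 * Update 6002: drop a variable left over from the old configuration.
 */
function mymodule_update_6002() {
  $ret = array();
  variable_del('mymodule_legacy_setting');
  return $ret;
}
```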

never heard of the macro

qbnflaco's picture

never heard of the macro module before, awesome! :-)

Macro is amazing, but...

davidwatson's picture

Macro is a strange, wondrous and dangerous beast - basically, it provides an API for mimicking form submissions in an exportable format. While this sounds absolutely fantastic (and don't get me wrong, it is for many things), there are a few design flaws in the module, the most notable being the way in which it handles cached data and the fact that it assumes each form will be submitted only once per session. This Lullabot article explains it a bit better, but suffice it to say that it's usually best to look for a module or hook better suited for the job, first. ;]
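
For anyone curious what that mimicry looks like: a programmatic form submission in Drupal 6 boils down to drupal_execute(). A rough sketch (the node type, field values, and username are hypothetical), assuming a bootstrapped Drupal:

```php
<?php
// Programmatically submit the node form for a hypothetical "story" node --
// the kind of form-submission mimicry described above.
module_load_include('inc', 'node', 'node.pages');

$form_state = array();
$form_state['values']['title'] = 'A programmatically created node';
$form_state['values']['body'] = 'Created by replaying the node form.';
$form_state['values']['name'] = 'admin';  // Author username.
$form_state['values']['op'] = t('Save');

// The third argument is the blank node object the form builder expects.
drupal_execute('story_node_form', $form_state, (object) array('type' => 'story'));
```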

Have you ever scripted the

jherrmann's picture

Have you ever scripted the creation of Relationship Types in the User Relationship module? What might be a good approach to this?

James Herrmann
Sisters of Mercy Health System

Not yet, at least...

davidwatson's picture

I can't say that I have, no. But personally I'd follow the same procedure I would for any similar task: if you have x feature of y module, see if the module provides an API to do it for you. It's almost always cleaner and more efficient (computationally and otherwise) than using Macro to export the entire $form_state and replay it, especially if the module provides its own export format like Rules or Views does.

The API Documentation for the module is always a good starting point. Although there is no method documented here for creating relationship types directly, we might look at the way the database is structured and attempt to do it manually using SQL if the query is simple enough (making those "back up your database" steps vitally important). user_relationships_types_load(), the method for loading all relationship types, should provide some insight as to how the tables are structured and how your INSERT query will have to look. If this proves too cumbersome, then I would feel comfortable falling back on Macro exports, bearing in mind the caveats outlined above.

Note, however, that this API documentation is tagged as 5.x, although the project page links to it - confirming that this part of the API remains unchanged in 6.x is an "exercise left for the reader," as they say in my field. :] If it is different, I'd definitely file an issue in their support queue.
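
With those caveats in mind, a rough sketch of what the manual approach might look like in an update hook. The table and column names here are guesses for illustration only; verify them against the schema in user_relationships' .install file (or against what user_relationships_types_load() returns) before relying on anything like this:

```php
<?php
/**
 * Hypothetical update: add a relationship type with a direct INSERT.
 * Column names are illustrative guesses -- check the module's schema first.
 */
function mymodule_update_6003() {
  $ret = array();
  $ret[] = update_sql("INSERT INTO {user_relationship_types}
    (name, plural_name, requires_approval, is_oneway)
    VALUES ('Colleague', 'Colleagues', 1, 0)");
  return $ret;
}
```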

HTH!

chinchunarayan has a solution?

Kevin P Davison's picture

chinchunarayan, have you found your solution from this thread of comments yet? I have yet to decide for myself, and I have a very similar situation between Development and Production. Users continue to add content while we continue to develop. Development falls behind, and then we need to migrate carefully, with as little downtime as possible (15 minutes MAX).

Kevin Davison, Web Development Manager

Down time not acceptable

mukesh.agarwal17's picture

Most of the time, any kind of downtime is not acceptable. Looking forward to a stable solution for this.

Cheers,
Mukesh Agarwal
www.innoraft.com

Not to say that it's the "One

davidwatson's picture

Not to say that it's the "One True Way" by any means (and I'd love to hear ideas on how to improve it), but using the method I've outlined above, I've never clocked more than three minutes of downtime. Maintenance Mode, back up the DB (while the site is frozen, to prevent data loss!), run updates, confirm, switch to live. This assumes you've done QA and deployment testing on another environment ahead of time, however.
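
The freeze step itself can be scripted: in Drupal 6, maintenance mode is just the 'site_offline' variable, so a deployment script can flip it around the update run. A sketch that assumes a bootstrapped Drupal (e.g. run via drush php-script) and omits error handling:

```php
<?php
// Freeze the site, leaving room to run updates, then bring it back online.
// Drupal 6 stores maintenance mode in the 'site_offline' variable.
variable_set('site_offline', 1);
variable_set('site_offline_message', 'Down briefly for a scheduled update.');

// ... run update.php (or drush updatedb), then verify the deployment here ...

variable_set('site_offline', 0);
```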

Hi timsethay01, You can make

krishnaa's picture

Hi timsethay01,

You can make packages of different sections with the Features module on the staging version -- including content types, views, panels, menus, taxonomy, module dependencies, and more -- and then use these packages on the production version of the website.

http://drupal.org/project/features

For content migration, you can use the Deploy module with Services.

http://drupal.org/project/deploy
http://drupal.org/project/services

Just getting into staging/deployment processes

Andy B's picture

Hi.

I used to just work on the live site. I know, not a very wise idea, as I found out a few times on some of our old projects. I am now just getting into the idea of staging from dev to prod/live sites. I am somewhat new to Drupal as well. For now, we are using nothing but Drupal 6.19 core and contrib modules, so custom programming/coding is nonexistent except for theming changes. Here is what the server setup looks like:

  1. "Local dev" which is on the computer I actually sit at. This runs xampp 1.7.2 with Drupal 6.19 on XP Home. This is where all of the proof of concept, goofing around and dev get pushed into. An example, "here is a module I might be interested in..." dump it on the dev server and see what it looks like. If it looks good, use it in the site. If not, remove it and try again.
  2. "test server" is where everything that passes testing and is going to be sent to the live site goes. This runs Windows Server 2003 with IIS 6. Since server 1 and 2 are totally different server setups, there is a need for a "test server" so fine changes can be made and an inital test run can confirm the dev push works.
  3. "live server" which runs on server setup 2 above. If everything on test server runs good, it gets sent to live server.

Is there any easy way to get this workflow between servers done? The other thing: since custom coding might be involved later (especially theming), how would you set up source control? It would need to be on the dev server, of course.

Deployment module may solve

Abhijeet Sandil's picture

Deployment module may solve your problem.
See http://drupal.org/project/deploy

Could you please not post

Xano's picture

Could you please not post every discussion you start in groups that have NOTHING to do with it?

Thanks.

Just bumping this a bit, I've

trothwell's picture

Just bumping this a bit. I've been looking heavily into this process and I've come across Drush and Aegir. I'm just wondering if anyone here has had any success using these tools to deploy between development -> staging -> production with very minimal downtime?

Cheers.

Not speaking for Aegir, but

threading_signals's picture

Not speaking for Aegir, but: rsync from localhost (use VirtualBox for application update testing on a distro-specific OS as well?), which gets synced on a user-defined basis to an IP-restricted "beta" site (deny robots as well), then multisite for staging to production with the use of symlinks. Make sure you have backups, and change the confs for each stage. I've heard of Capistrano for database scripts, but it takes some work.

Cheers.

Paul

Jesus answered, "It is written: 'Man does not live on bread alone, but on every word that comes from the mouth of God.' - Matthew 4:4