Staging Content to Production Servers

kbahey's picture

There is a discussion on the development list about some clients wanting to "push" content from a staging server to the production server.

One way to do this for content is using John VanDyk's publish and subscribe.

Another is Node Import/Export which is being written by Jeremy Epstein as part of SoC.

Those can be taken further, with users and roles also being migrated from staging to production; that would make for a truly federated environment.

You can see the original discussion in this thread.

Thoughts?

Comments

Hybrid approach

mfredrickson's picture

If I were to implement this, I would take a hybrid approach. First, I would set up two drupal installs in a single code base using standard multi-site (sharing modules and themes). I would share the users and roles tables between the two.

I would use pub-sub to push content from the production server to the staging server (so that it always has a live copy of the production materials).

When edits have to be made, enter mysqldump: dump the staging server db and import it into the production server. A few quick scripts would save most of the effort needed to make the dumps and uploads.
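
As a rough illustration of that dump-and-import step, here is a minimal sketch. All database names are invented, and it assumes staging and production each have their own database while the shared users, sessions, and roles tables live elsewhere (via table prefixes in settings.php), so they are untouched by the copy:

    # Back up production first, then overwrite it with the staging copy.
    mysqldump -u root -p production_db > /tmp/production-backup.sql
    mysqldump -u root -p staging_db    > /tmp/staging.sql
    mysql     -u root -p production_db < /tmp/staging.sql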

Perhaps I can find some time to test this idea (I doubt it) and report back...

confused

moshe weitzman's picture

kbahey is talking about pushing content into production, but it seems mfredrickson is talking about pushing content back to staging.

Different goals, different definitions?

mfredrickson's picture

Yes. To me a staging server is one where I try new modules, make settings changes, and otherwise screw around with the data structures. A production server holds the content, interacts with my users, etc.

My problem is that when I need to create a staging server (by dumping the production server database), it immediately goes out of sync with the production server. By pushing content from the production server back to the staging server, I can keep the content in sync while still making data structure/settings changes in the staging environment.

When I want to make my staging server the production version, I can dump the database and upload it (or, even easier, just change the database information in settings.php).
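
A minimal sketch of that settings.php switch, assuming a Drupal 5-style $db_url and made-up paths, credentials, and database names:

    # Point the live site's settings.php at the (now ready) staging database.
    cd /var/www/live/sites/default
    cp settings.php settings.php.bak
    sed -i 's#mysql://drupal:secret@localhost/live_db#mysql://drupal:secret@localhost/stage_db#' settings.php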

If you need to moderate content, just use workflow on the production server. Much less overhead, much less work, and it will do what you need.

-M

I agree with this, and have

jantzen's picture

I agree with this, and have in fact been looking for a way to do it. Currently I just test stuff out on one installation and then, if it works out, I install it on the production server. But it would be nice to have more of a "workflow" to push new modules, new flexinode types, and maybe even core Drupal upgrades (e.g. 4.6 to 4.7).

1.5 years ... what is the status

kbahey's picture

Now 1.5 years have passed since the original thread. The requirement is still there: some web sites want to create content on a staging server and then click a button to push that to the live server. No direct publishing on the live server.

Last we checked, publish and subscribe seemed to be the way to go. However, it does not have a 5.x version, and the world is now more about CCK, Views, etc.

Import/Export API is another route, but it does not do that at the moment.

I can't believe that I am the only one who gets this "staging server" requirement with clients?

Drupal performance tuning, development, customization and consulting: 2bits.com, Inc..
Personal blog: Baheyeldin.com.

Feed API

Boris Mann's picture

Feed API + improved pub sub...sort of.

This is very much a concern, and we may have more work to do on it. Djun Kim is leading this for us.

A solution?

freixas's picture

WorkHabit claims to have a solution. See http://www.workhabit.org/autopilot-beta-schedule. They have not released their final code, although there is an early version of the Autopilot module available at http://drupal.org/project/autopilot.

Tony Freixas
tony@tigerheron.com
http://www.tigerheron.com

Not for content

Boris Mann's picture

Khalid was specifically talking about content. AutoPilot is mainly for code and configuration.

Content staging would be for something like a magazine or other more rigorous workflow, where content is reviewed centrally and only the "final" version gets pushed live to a production server.

My problem is not how to get

ebeyrent's picture

My problem is not how to get content from the staging environment to production, but how to export all views, content types, configuration changes, etc. Sure, you can individually export some of this, but when you have a large change set of files and changes made through the admin interface, it's difficult to manage.

On my current project, I have five developers working on different features. Each developer has his own sandbox to make changes in, and we have a staging environment into which we merge all our individual efforts.

Since AutoPilot is still in its infancy, does anyone have any suggestions for how to migrate our changes from staging to production?

Amen!

janelle11's picture

We have the same problem, and perhaps this requires a new thread. My project is in its final stages of testing, preparing to beta launch in a few weeks, and we are very concerned about change management because of this problem and the large audience for the site. We've been hoping for AutoPilot to mature, but it looks like we'll need an intermediary solution for the time being. If anyone has a good system in place to manage frequent or large-scale changes across multiple environments, please share your experience. Thank you!

Export views and cck to a module

Boris Mann's picture

As soon as you finish using the GUI to create views or CCK types, they should be exported and added to a module, then checked in using SVN. They then become versionable through your regular code repository, rather than sitting in the DB.
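
A sketch of that check-in step, assuming the GUI export output has already been pasted into a hypothetical custom module called mysite_defaults:

    cd /var/www/site/sites/all/modules
    # mysite_defaults.module holds the pasted Views/CCK export code.
    svn add mysite_defaults
    svn commit -m "Export views and content types into mysite_defaults"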

But what about navigation

ebeyrent's picture

But what about navigation changes? Taxonomy? Settings in the variable table?

Would you recommend putting all of these changes in a module?

No

Boris Mann's picture

Just pointing out that we have solutions for some of these things. What you are talking about is deployment tasks that need to be rule-based and automated. And, also, off topic WRT content staging :P

Agreed. The problem is that

ebeyrent's picture

Agreed. The problem is that they are still manual and not integrated with each other into a single unified solution. You shouldn't have to write modules just to move large numbers of views, content types, etc. Maybe the solution is to pour as much support into AutoPilot as possible to speed up development?

a side benefit?

seanburlington's picture

I've been looking to improve the security of my Drupal installations - a key concern being the way eval() is called on database content.

This tends to be done in exactly the places you describe - the GUI parts of CCK and view construction.

It looks to me like it would be possible to disable the eval statements in the production setup you describe, while retaining the GUI on the staging (or at least development) server.

Sean Burlington
www.practicalweb.co.uk
London

Views already has the UI as

catch's picture

Views already has the UI as a sub module which can be switched off, whether or not you export your views to a module.

Repo is the Answer

Souvent22's picture

Yes, Boris is spot on as usual. You should export your "structures" (Views, CCK) to a module of some sort so that you don't have to worry about them, and pushing them to your site becomes really easy.

As far as "content" changes, that's a different animal. My personal feeling is that content changes should and do happen at the API level, thus enters the MacroMacro module. I think content changes from staging to production should happen through MacroMacro becuase it will exhibit the SAME exact behavoir that one does in the staging environment. This takes care of say content that exists in both production and staging and new content.
This means you could not only update the "About Us" page, but also add a new ( of 5 new ) announcements of type "story". The bottom line...you don't have to worry about ID's!

Also, regarding repos, are you branching and tagging? Meaning, when you get to a stable point for a staging/QA release, you branch the trunk to a new "1.3" branch, and when ready, tag the 1.3 branch as "QA-SITE-1-3-1", then "QA-SITE-1-3-2" to work on perhaps one feature in the 1.3 branch? (I made all that up for the sake of discussion.)

Again, I've always believed change management is a combination of technology AND best practices. You need both to make it all work.

I can't believe that I am

Dave Cohen's picture

I can't believe that I am the only one who gets this "staging server" requirement with clients?

Don't believe it. You're not alone.

You may be interested in these discussions:
http://drupal.org/node/181128
http://www.dave-cohen.com/node/1779

Re: Staging Content to Production Servers

hasan-gdo's picture

We're currently using a series of bash scripts to control the staging-to-production content push mechanism. As new or updated content is readied, PMs review it, and if it's good, it's pushed live in a matter of seconds. This works well even when content needs to be synced across several sites. I tend to prefer this over FTP, etc. until a "one-click" solution emerges... =)
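
For what it's worth, here is one guess at the shape of such a push script. The host names, credentials, paths, and table list are all invented, and it assumes staging is the authoritative copy of those content tables:

    #!/bin/bash
    STAGE_DB=stage_site
    PROD_DB=prod_site
    PROD_HOST=prod.example.com
    DB_PASS=changeme   # placeholder

    # 1. Copy the content tables from staging over to production
    #    (this replaces them wholesale on the production database).
    mysqldump -u deploy -p"$DB_PASS" "$STAGE_DB" node node_revisions url_alias \
      | mysql -h "$PROD_HOST" -u deploy -p"$DB_PASS" "$PROD_DB"

    # 2. Sync uploaded files referenced by that content.
    rsync -az /var/www/stage/files/ deploy@"$PROD_HOST":/var/www/prod/files/

    # 3. Clear caches on production so the new content shows up immediately.
    mysql -h "$PROD_HOST" -u deploy -p"$DB_PASS" "$PROD_DB" \
      -e "DELETE FROM cache; DELETE FROM cache_page; DELETE FROM cache_filter;"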

P E A C E

Hasan

New Files

Souvent22's picture

Hasan,

How does your script reconcile new file uploads (e.g. images, attachments, etc.)?
Are they just pushed 1-to-1 (qa.fid => prod.fid) along with the node?

I've just suggested a BOF at

Dave Cohen's picture

I've just suggested a BOF at DrupalCon to discuss issues like this. Please vote and/or comment on that proposal if you'll be at DrupalCon:

http://boston2008.drupalcon.org/session/updating-and-upgrading-live-sites

For reference, also see

1kenthomas's picture

http://drupal.org/node/140430#comment-726331

Those seem like the two best options to me...

~kwt

Looking for a solution on a smaller scale

chadarizona's picture

I think my requirements are a bit simpler. I do not manage a team of developers or coders; I work by myself, building a local site on my computer, then uploading it to my external web host as I make changes to the code and themes, add modules, etc.

Is there a simple tool, some simple steps, or even a tutorial as to how I can do this effectively?

I think I found it

chadarizona's picture

Ok, after some more searching I found the tutorial that I think I am looking for:

Site to site transfer with phpMyAdmin and a FTP Client
http://drupal.org/node/53479

If anyone has any other thoughts...

This Saturday: a video broadcast session about staging

JBI's picture

This Saturday there will be a session about staging. The France24.com team will explain their odd-and-even-numbers solution.

That way we will catch up with the BOF session at the last DrupalCamp, where we were not able to have a dialogue across the ocean because the sound was not working.

More information is at https://barcamp.pbwiki.com/DrupalCampParisStaging
Feel free to edit the page and tag the videos.

We are considering starting around 3 PM French time.

As there are time differences, could you please tell me in a comment the earliest time you would be available?

how is this going on? how

joetsuihk's picture

How is this going?
How about tracing the UPDATE, INSERT, and DELETE queries, storing them in a new table, and then pushing those queries to production?

Attachments could also be traced by investigating the queries.
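
A hedged sketch of the "store the changes in a new table" half of that idea, using MySQL triggers on the node table (the log table and credentials are made up, and pushing the logged rows to production would still need its own script):

    # Create the log table and the triggers on the staging database.
    mysql -u root -p staging_db -e "
      CREATE TABLE IF NOT EXISTS node_change_log (
        nid INT NOT NULL,
        op  VARCHAR(6) NOT NULL,
        changed TIMESTAMP DEFAULT CURRENT_TIMESTAMP
      );
      CREATE TRIGGER log_node_insert AFTER INSERT ON node
        FOR EACH ROW INSERT INTO node_change_log (nid, op) VALUES (NEW.nid, 'INSERT');
      CREATE TRIGGER log_node_update AFTER UPDATE ON node
        FOR EACH ROW INSERT INTO node_change_log (nid, op) VALUES (NEW.nid, 'UPDATE');
      CREATE TRIGGER log_node_delete AFTER DELETE ON node
        FOR EACH ROW INSERT INTO node_change_log (nid, op) VALUES (OLD.nid, 'DELETE');
    "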

A single click solution is

elvis2's picture

A single-click solution is very welcome! I might be doing this the hard way, but just in case someone is looking for other ideas...

Set up the sites using multisite:
www.something.com
dev.something.com

In the sites/ directory, www and dev each have their own modules and files directories.

When the client comes to me for changes, I back up the dev database, then mysqldump the www database and import it into the dev database. As a rule of thumb, I always compare the two databases side by side (removing the cache, cache_*, and watchdog tables) with a file comparison. I look for possible problems in the comparison, then move on to the tasks at hand. Note: this is not at all efficient for large databases. If needed, you can split the schema into one set of files and the data (INSERTs) into another, which helps a little. To make the comparison even quicker, alphabetically order the INSERT queries; this makes for quicker visual reads and gets the two SQL files as closely ordered to each other as possible.

Once it comes time to import the dev database into www, I do another side-by-side comparison and change what is needed. Again, this is not efficient for large sites with lots of nodes, files, comments, etc. I suppose the efficiency partly depends on how often you do work for the client.
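
A rough sketch of that comparison step (database names and the table list are placeholders); --skip-extended-insert keeps one row per INSERT so the diff stays readable:

    SKIP=""
    for t in cache cache_filter cache_menu cache_page watchdog; do
      SKIP="$SKIP --ignore-table=www_db.$t"
    done

    mysqldump -u root -p --skip-extended-insert $SKIP www_db > www.sql
    mysqldump -u root -p --skip-extended-insert ${SKIP//www_db/dev_db} dev_db > dev.sql

    # Eyeball the differences (the dump headers will always differ a little).
    diff -u www.sql dev.sql | less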

===
Elvis McNeely
Blogging about Drupal and web development: http://www.elvisblogs.org/drupal

forking or staging - D5 sites

tomhung's picture

One problem with site copying (forking or staging) is the database name: it ends up embedded in the data of many tables, including the variable table.

The variable table is a specific problem. If you do a mysqldump and search-and-replace every occurrence of the database name, you may be in for trouble: the variable table stores serialized strings, so if the replacement's length differs from the original's, you will get hosed when you import your SQL.

If you FORK or PUBLISH your SQL database to another system or another database name, you must use a name of the same string length.

We make the name of our database a constant string length as well as a unique string, e.g. "drupal_abccompany_123project_jkdsuikjlds"; the last bit is a random string that pads the complete name to 40 characters. This way you don't have to touch the serialization metadata when you search-and-replace the database name in the SQL dump! :)
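
A minimal sketch of that rename trick. The database names below are placeholders chosen to be the same length; the equal length is the whole point, since it keeps the s:NN:"..." headers of serialized strings in the dump valid:

    OLD='drupal_clienta_0001_aaaaaaaa'   # 28 characters
    NEW='drupal_clienta_0002_bbbbbbbb'   # also 28 characters
    mysqldump -u root -p "$OLD" > dump.sql
    sed "s/$OLD/$NEW/g" dump.sql > dump_renamed.sql
    mysql -u root -p "$NEW" < dump_renamed.sql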

One-click publish to staging, publish to live

kevcol's picture

I'm also trying to figure out how to set up a site with a staging server and a live server.

I'm talking purely content: editors would make updates, push a "publish to staging" button, review their work, and if it looked good (no typos, weird line breaks, bad links, etc.), click a "publish live" button to make it live. (In larger organizations, one editor pushes to the test server and gets another set of eyeballs to review it.)

Every major news and sports content site I've worked on has that and, frankly, I think it's a prerequisite for any professional site with a decent audience. Otherwise typos would be going live all the time.

I don't know anyone at The Onion or My Play, but I'm guessing they must have this, no?

If not, does anyone think they know how to build it? (And how much it would cost?)

(Code changes are a totally separate issue and need their own server -- so there are three levels: dev for code changes, staging for content preview and live.)

One more thing

kevcol's picture

Oh, one more thing:

The sites I'm working with do NOT want/need complex workflows with notifications and multiple steps to publishing. When news breaks, they need to move FAST: just a quick once-over on a content staging site, and then live. E-mail notifications, etc. are unworkable. (People literally shout across a newsroom or send an IM, and get a green light within seconds. E-mail or in-system notifications can stagnate for minutes.)

There are successful

moshe weitzman's picture

There are successful workflow systems where you use only the live site for this, and notifications need not be a part of it. But you seem convinced that those are undesirable, so far be it from me to argue otherwise. For pushing content to another environment, you should check out the newly revamped Deploy module. It has grown up and become a perfect match for your desired solution.

Thanks

kevcol's picture

Cool, I'll check out that deploy module.

BTW, my bias against workflows is based primarily on other CMSs. From Vignette's StoryServer in the mid-'90s to a woeful Rhythmyx setup I recently had to dismantle, every workflow I've encountered has been unsuited to the fast-paced publishing model I'm interested in. The perfect workflow for that may exist; I just haven't found it yet.

williadx1's picture

I need to be able to design changes in dev and push them to live without losing community-based and "live update" type articles. Currently, the only way of updating involves dual entry of articles into dev and live, then copying the entire site (files & db) to overwrite live. This means that:
a) downtime occurs on the live site;
b) dual entry of articles is required; and,
most importantly,
c) all community-type content on the live server is lost.
I am new to the community and would love to know how other people avoid these issues (especially the data-loss issue).

Sync Staging Site with Production Site

philbar's picture

I created a post for a module request to provide the functionality discussed in this thread. See it here:

http://groups.drupal.org/node/26008

Hell of a problem...

jhl.verona's picture

I've added my 2 eurocents as a comment to the thread "Maintaining live (production) and development versions of large dynamic site"

HTH

John

P.S. Where did the preview go?

My Situation

mahnster's picture

I think my situation is in line with this discussion, and I think one thing people are missing in understanding what the OP is saying is an example. Here is mine:

I have a new module which creates a content type called "Project Email". It uses the Title and Body of a regular node (for the email subject and email body), then adds fields such as the email type (HTML or plain) and which project the email is for (project is another custom content type, used for our newprojex.com site). Now that this is done, I need to enter, on the staging server, a bunch of emails that get sent when a "project" is at different stages or when users act upon the project. These emails are to be multilingual, with the translation that gets sent chosen by the user's language.

So, my production server doesn't have these nodes I am making. I have lots to create, and then translations to make, and I don't want to have to redo all this on the production server. However, new nodes have already been created by users on production (such as nodes of the project content type). I can't just export my node table from staging, not even with "WHERE type = 'project_email'", and import it into production, because there would be two nodes with the same NID (e.g. when production was cloned to staging, the top NID was 1000; I add nodes 1001 and 1002 on staging, a user adds node 1001 on production, and now I can't import my nodes into production). Also, my nodes rely on fields in another database table, project_email, and those are related by node ID. So even if I could import my node table into production and get new node IDs assigned, my project_email table would still have the old nids.

What's the solution???

Hmm....

Foreign Key

cleaver's picture

You will probably need an additional key. The problem seems to be that the node ID has two different meanings: it is the primary key of the node table, and in addition you are using it as the reference to the email table.

I'm assuming that the relationship between the node and the email table is one-to-one. That means you could have a foreign key in either the node table or the email table. The most straightforward approach seems to be to add a field to the node as a foreign key referencing the primary key of the email table. That key will not change, so you can import the nodes and not care whether the node ID changes.

To import the nodes, you might look at Feed API or its successor, Feeds.

The Problem of Clashing IDs

sassafrass's picture

The problem of unique IDs clashing from one environment to another is not new, nor is it unique to Drupal. The common solution to this problem in the software engineering world is to assign all database entries a set of IDs that is always unique to that environment. For example, allow the development environment to use only IDs from 0-10000 and the production environment to use IDs from 10001 and up. Another option, which I like better because it is less constrained, is to assign the development environment odd-numbered IDs and the production environment even-numbered IDs. As with all solutions, this isn't ideal: it requires writing code, and, on top of that, not all Drupal tables have a numeric unique ID field, so the workaround is to write additional code to add one yourself. It's a complicated solution to a complicated problem, but once written, like any piece of software, it automates those repetitive, manual, error-prone tasks that we humans are so poor at doing.
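
For tables whose IDs do come from a MySQL auto-increment column, the odd/even split can be approximated with stock MySQL settings rather than custom code. A hedged sketch (host names invented, and this does nothing for tables without such a key):

    # Staging hands out odd IDs, production hands out even IDs.
    mysql -h staging.example.com -u root -p \
      -e "SET GLOBAL auto_increment_increment = 2; SET GLOBAL auto_increment_offset = 1;"
    mysql -h www.example.com -u root -p \
      -e "SET GLOBAL auto_increment_increment = 2; SET GLOBAL auto_increment_offset = 2;"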

Your Situation

SandStorm's picture

You made everything dependent on the nid? I doubt there is a way to match data from staging to production... and match nids.

Maybe if you add a serial number field to your content type, to avoid being dependent on the nid, and then use Views to filter accordingly, you might get around the problem. You could then easily use the Backup and Migrate module to move the tables you need.

Would that help?

There's another alternative

braham.p's picture

There's another alternative that gets tossed around: the "Deploy" module. It's supposed to leverage Services to make staging content relatively painless.

Warm regards
Braham Pal Singh

Transfer Staging Data to Production

sumit22kalra's picture

Hi,

I am not able to understand how I can transfer my data from the staging server to the production server.

Problem:
On the production server, users register and nodes are created every day.

For example, on the production server nid 5256 is already used (for content created by a registered user), but on the staging server nid 5256 is used for a node of my new content type.

In the above case, if I export data from staging to the production server, the nids conflict.

Please tell me how I can solve this issue.

At my previous job, we have

wuinfo - Bill Wu's picture

@sumit22kalra

At my previous job, we have spent a lot of time and effort finding a safe way to sync between staging and production.

I have written a blog post about the process: http://wulei.ca/blog/drupal-deploy-strategy

I think it is a problem or a challenge that Drupal is facing. I hope Drupal 8 will solve it completely.