This should definitely not be the way to do this!

maurya777

So far I have been developing sites on my local machine and copying all the files plus the database to my live server once development is done. These were small projects for which I used some contributed modules and wrote a few small custom modules.

Now I have a requirement to provide some changes and updates to my live projects. The only way I can do it now is to keep working on my local site, keep track of all the steps I have performed, and then, once I am done with the local site, perform the same steps on the live server.

This definitely should not be the way to do it!

What is the proper way of releasing these updates?

As far as I have searched, I have found the Features module, but it will not support my custom modules. Those modules save their records in the database, but they are not nodes (I have not implemented them as a content type).

Comments

Try the Backup and Migrate module

abhijeet sandil

Try the Backup and Migrate module.

It will be helpful here.

Or you can make your custom modules compatible with Features

dalin

Or you can make your custom modules compatible with Features.
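One common way to do this in Drupal 7 (a sketch of the general technique, not necessarily what dalin had in mind) is to declare your module's records as CTools exportables via the `export` key in `hook_schema()`; Features can then offer them as a component. The table name `mymodule_preset` and its columns here are hypothetical:

```php
<?php
// mymodule.install — declare this table's rows as CTools exportables.
// Table and field names below are invented examples.
function mymodule_schema() {
  $schema['mymodule_preset'] = array(
    'export' => array(
      'key' => 'name',                 // machine name used in exported code
      'identifier' => 'preset',        // variable name used in the export
      'default hook' => 'default_mymodule_preset',
      'api' => array(
        'owner' => 'mymodule',
        'api' => 'default_mymodule_presets',
        'minimum_version' => 1,
        'current_version' => 1,
      ),
    ),
    'fields' => array(
      'name' => array('type' => 'varchar', 'length' => 255, 'not null' => TRUE),
      'settings' => array('type' => 'text', 'serialize' => TRUE),
    ),
    'primary key' => array('name'),
  );
  return $schema;
}
```

With an export definition like this (plus a matching `hook_ctools_plugin_api()` implementation), your records can be exported to code and deployed alongside views and content types instead of being copied table by table.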

--


Dave Hansen-Lange
Director of Technical Strategy, Advomatic.com
Pronouns: he/him/his

You can check the blog of Nuvole

DjebbZ

You can check the blog of Nuvole; they have interesting posts about Features-driven development and about how to store database configuration in code when the Features module isn't compatible with your custom stuff.

Specify the nature of "some changes and updates"

benoit.borrel

Depending on the nature of what you referred to as

some changes and updates

hook_update_N() and storing views in the codebase (rather than the database) could help you.
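For one-off changes, a hook_update_N() implementation in your module's .install file lets you script a change once on your local site and replay it on the live site via update.php or `drush updatedb`. A minimal Drupal 7 sketch (the module name and the specific changes are made-up examples):

```php
<?php
/**
 * Enable the Date module and point the front page at the new node.
 */
function mymodule_update_7001() {
  // module_enable() and variable_set() are core Drupal 7 API functions.
  module_enable(array('date'));
  variable_set('site_frontpage', 'node/123');
}
```

Each deployable change gets its own numbered update (7002, 7003, …), so every copy of the site applies exactly the updates it has not yet run.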

Benoit Borrel

This is a problem with all CMSes!

maurya777

Thanks, friends, for your replies.

I am planning a solution in which I will create a dump of the database schema, with or without data, on the machine of everybody working on the same project. These files will be kept in SVN and merged. Once everybody on the team is done with their changes, they will commit them to SVN. A script on the staging server will take an SVN update of this schema, compare it with the data in the database, and update the pre-configured* tables. If all tests pass, the same procedure will be applied to the live database. A new schema will then be generated on the live server and committed to SVN, which in turn will be used by the team again.

This is just a plan in my mind; what problems do you think I will face? Others will have tried this kind of solution, but nothing is available online so far. Please provide your thoughts. I guess this is a generic problem with all the CMSes in the world.

The solution may not even be a Drupal module, but just a PHP script run through the CLI or Drush for creating, comparing, and merging the database.

  • *Pre-configured: we have to manually select the set of tables, categorizing them as either configuration (settings) or data (content).

This method puts a heavy reliance on version control

c4rl

This method puts a heavy reliance on version control to do the "right" thing, which could be completely ambiguous in the case of conflicting commits or conflicting primary/unique-key data.

Even if certain items are not natively supported by the features module, it is still a best practice to have all configuration changes authored into code, and handled (generally) with hook_update_N(), drush scripts, batch API, etc.

@maurya777, if you figure out a solution

mpaler

@maurya777, if you figure out a relatively simple, automated solution to this problem, that is both robust and replicable, you will be forever revered in the hallowed halls of Drupal stardom. Good luck to you.

Good luck, but I don't think that will work

cweagans

Good luck, but I don't think that will work very well, and more importantly, I don't think it will be adopted by many people. It's one thing to build a good system for your own use, but once you get everybody else using the same system, sooner or later people will want to improve it.

The reason I say it probably won't work very well is that it's generally a bad idea to have software decide what to do with your database. I think a better approach would be a script that can deal with the following scenario:

1) I build version 1 of a site and dump it to version1.sql
2) I continue working on the same site and when I'm done, I dump it to version2.sql.
3) I run this hypothetical script like this: script.sh version1.sql version2.sql
4) The script runs a diff on the two database dumps and strips out all the unneeded stuff (like cache entries and that kind of thing). If there are new tables, the script figures out which module created them. Once it has all of this information, it writes a file called 'sitename_custom.install' with a hook_update_N() implementation containing a series of Drupal API calls that duplicate all of your changes on whatever site you enable the sitename_custom module on.

Not sure how much sense that made, but I'm imagining something like Features that works at the database level to figure out what needs to be done.

Honestly, your best bet is finding a good way to get all of your changes into code.

--
Cameron Eagans
http://cweagans.net

Ummm no, it's not really an issue with all CMSes

websupportguy

I have to take exception to the suggestion that devel-to-live migration is an issue with all CMS products. It really depends on what is being stored in the database and folders, versus what is stored in the configuration files.

If the CMS is set up so that site-specifics are all stored in one or two config files (e.g. settings.php for Drupal), then you can migrate your database from devel to live any time you like and it should work. Similarly, if you can quarantine your settings file from being uploaded, you can synchronise your site files any time you like and they should work.

Unfortunately Drupal is not quite built this way. But it should be.

Tony

A CLI script with Drush

jrsinclair

I've written up my approach to release management, which sounds similar to your approach (but uses Git instead of SVN). You can check it out at: http://www.opc.com.au/web-development/drupal-release-management-drush-an... The CLI script is available for download at the bottom of the page. The advantage of this approach is that you can customise your Drush site aliases to ignore certain tables, which makes moving data between development and production much easier.
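The table-skipping mentioned here is configured through Drush's structure-tables options: for the listed tables, `drush sql-sync` and `drush sql-dump` transfer only the schema, not the rows. A sketch (the alias names, paths, and table list are assumptions to adapt):

```php
<?php
// aliases.drushrc.php (hypothetical) — site aliases plus tables whose
// data should be skipped when running `drush sql-sync @live @dev`.
$aliases['dev'] = array(
  'root' => '/var/www/dev',
  'uri'  => 'dev.example.com',
);
$aliases['live'] = array(
  'root' => '/var/www/live',
  'uri'  => 'www.example.com',
  'remote-host' => 'live.example.com',
);

// Dump only the structure (no rows) for these volatile tables.
$options['structure-tables']['common'] = array(
  'cache', 'cache_*', 'history', 'sessions', 'watchdog',
);
$options['structure-tables-key'] = 'common';
```

Skipping cache and session rows keeps the transfer small, and it is also one way to keep some production data away from developers.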

The approach works well for smallish sites, but you will run into issues if there is 'secure' data on the production server that your developers shouldn't see.

Let us all know how you go.

A small team of developers can use version control and database dumps

decibel.places

A small team of developers can use version control (e.g. SVN) and database dumps made with Backup and Migrate as part of the controlled filesystem.

The repository can then be pushed to staging, and once tested there, to live.

Occasional discrepancies in the databases can usually be resolved with a diff/compare editor (IDEs like Eclipse have one; hint: decompress the SQL before comparing!).

The key when using SVN is to commit frequently, and when working with others, update then commit even more frequently.

High performance
