A critical look at the implementation of Plugin Manager - Underlying issues and possible solutions

adrian's picture

I have been harboring growing concerns about the direction of the plugin manager integration proposals for Drupal core almost since their inception, but I was not able to form an informed opinion on the subject until I had reviewed the proposed system in much greater depth.

While my impressions of the project and the current implementation are fairly negative, I have aimed to frame my criticisms in a positive constructive manner because I believe that the goals of the project are worthwhile, and that for a large section of users this could be a very flexible and usable solution.

Unfortunately, many of my concerns have proven well founded, and I believe that the implementation of this system needs to be taken in a different direction for it to succeed in its goals while not negatively affecting projects like Aegir, Open Atrium and many other 'serious' development efforts.

These are the underlying problems with the current implementation. Keep in mind that these aren't necessarily problems with the underlying 'idea' of the system, so they can still be worked around or somewhat resolved by adapting the implementation.

Stability problems

The system as it is implemented now relies on a full Drupal bootstrap, because it is a Drupal module. There are more ways to break a Drupal site than I can even count, and this system automates those breakages so that they occur with much higher frequency.

Updates can break the system, with no way to recover

These issues include, but are not limited to:

  1. Modules can and will conflict with each other
  2. Modules can break the system update page
  3. Only enabled modules can be updated

One example of this, which I actually ran into while testing this patch, is the dhtml_menu breakage shown in the screenshots attached to this post.

I could have used literally any mechanism to completely break this system, but the point I am getting at is this: it should not be a Drupal module.

This system needs to be insulated from all the ways Drupal can break, and still be available if the rest of Drupal isn't. If this is not done, it will cause an explosion in the support requests that are made by users who do not understand Drupal.

This system is not being developed with real data

During my research I struggled to find even a handful of projects that have multiple Drupal 7 releases at this point. I did, however, find the update_module_test project, but that just illustrates the point: this entire system is being built before we actually have a way to properly test it.

Almost all the other packages I found created side effects ranging from completely blowing away the site to subtle somewhat recoverable issues (such as blowing away my menus and blocks).

We simply cannot try to build and finalize this system until the packages it is meant to install have matured. To do otherwise would be putting the cart before the horse.

The current implementation cannot be opted out of.

Something that not a lot of people seem to understand is that this system is ONLY really useful for a very specific use case, namely that of the 'end user' installing Drupal on shared hosting where FTP is already available to them.

With this baked so deeply into the update module, it's impossible for the people this system will not work for to disable it. There needs to be opt-out functionality for this that can be easily deployed.

The requirements for running this system are in no way universal outside the target audience.

I have been a Drupal developer for many years, and not once have I allowed FTP access to any of the sites I maintain, because FTP is inherently insecure and because I simply do not want the site admins messing around with the code on the site. If I were to give out FTP access, it would only be for the files directory of each site.

There is the SSH fallback path, but that is even more difficult to roll out: there has never been a stable version of this PECL extension, and it is not packaged by most distribution vendors. There are also some fundamental security issues surrounding this extension that I will get into later in this document.

I had to go through a lot of trouble to even get this system working so I could test it. I've never run SSH or FTP servers on my local development environment, because I've never needed them. Why would I?

This system breaks build management.

This system only really works when there is one and only one instance of a site. In a real development environment you have the local development instances, the shared development instance, the staging instance and the production instance.

The code running these instances needs to be managed with a revision control system like SVN.

This system completely breaks that model by making each of the instances separately updateable at all times, with no oversight of what the other instances are running and no way to get all these changes back into a revision control system where they can actually be managed.

With no 'off switch' for this functionality, it will become a lot harder to follow proper development methodologies, and the lives of 'serious' Drupal developers will get a lot harder.

There's more on this subject later in this document, but suffice it to say this is a rather big issue for me and others.

The security model is security theater

A big deal has been made about how this system is supposedly more secure than the WordPress implementation, but as it currently exists it may actually be more dangerous.

You are entering your credentials in a potentially hostile environment

This is a major side effect of the functionality being a Drupal module and running inside a fully fledged Drupal.

Any XSS exploit that allows access to the uid 1 account will allow an attacker to set up a custom block or something similar that can then simply sit back and wait for the affected user to update their system.

The first thing people do when a security release is announced is update their system, and this allows an attacker to harvest system logins from infected sites, which will have much longer-lasting consequences than anything else.

There are many vectors this can be exploited through in a full Drupal: you could add some JS to the JavaScript cache, you could modify the menu router, you could use FAPI to plug into the functionality and write some files, and so on. The flexibility of the Drupal API is our own worst enemy for this use case.

The plugin manager system should not be part of the normal Drupal code flow, it needs to be able to depend on a set of KNOWN good files available to it.

It is unsafe by default because of plain text passwords

Even if you set up the SSH2 extension, you are still sending your password in plain text.
We could set up a packet sniffer at DrupalCon and harvest SSH / FTP logins for hundreds of sites over the course of the event.

To make this actually safe, you would need to set up HTTPS for the site. This is impossible for the main use case, and for the security conscious (who in all honesty would already be using Drush for this), it's simply another thing to manage.

I'm willing to let this slide ONLY because FTP itself sends passwords via plain text, which is a major reason why most security conscious users won't have FTP available for use.

The ssh2 extension is not really safer either

There has never been a stable release of the ssh2 extension in five years. It frequently doesn't even compile against the current stable version of libssh2, the requirements to build it are not commonly installed in hosting environments, and the extension itself even less so, as no major OS vendor packages it.

After you go through the entire rigmarole of getting the SSH2 extension working, you will still be sending your password in plain text. All the ssh2 extension does is encrypt the connection from the web server back to the web server itself over the localhost loopback.

It is easier and simpler to install an FTP daemon and configure it to only accept connections from localhost, and this gives you the same conceptual security once you have configured HTTPS for the site.
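To make the localhost-only idea concrete, here is an illustrative fragment, assuming the vsftpd daemon (any FTP server with an equivalent bind-address option will do; check your daemon's documentation):

```
# /etc/vsftpd.conf -- illustrative fragment, assuming vsftpd.
# Only accept FTP connections arriving over the loopback interface,
# so credentials never cross the network unencrypted.
listen=YES
listen_address=127.0.0.1
local_enable=YES
write_enable=YES
```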

This extension has no place in Drupal core. It has incredibly complex setup requirements for nearly no benefit.

Web-server-writable files are not really that insecure

Firstly, we need to make peace with the fact that Drupal already writes PHP files.
That's how we create settings.php.

What makes that 'secure' is that we set the permissions correctly after we have written it.
We don't need FTP or SSH2 available just to create that file, and we don't need them for module downloads either.
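The write-then-lock-down pattern can be sketched in a few lines of plain PHP. This is only an illustration of the idea, not the actual installer code:

```php
<?php
// Simplified sketch of the write-then-lock-down pattern Drupal already
// uses for settings.php. Illustrative only; not the real installer code.
$file = sys_get_temp_dir() . '/settings_example.php';
if (file_exists($file)) {
  chmod($file, 0644); // make a leftover copy from a previous run writable
}

// 1. While it is writable, the web server writes the file itself.
file_put_contents($file, "<?php\n// generated site settings\n");

// 2. Permissions are then tightened so the web server can no longer
//    modify it (0444 = read-only for everyone).
chmod($file, 0444);
clearstatcache();

echo substr(sprintf('%o', fileperms($file)), -3), "\n"; // prints "444"
```

The same sequence would work for a downloaded module directory: extract while writable, then tighten permissions afterwards.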

I actually went and looked at how WordPress does it, and their implementation is better than our current one.

If the permissions are correct for WordPress to make the filesystem-level changes itself, it just works. But if they are not, AND WordPress doesn't have the right permissions to change them, it uses FTP solely as a mechanism to log into the site and change the permissions so that WordPress IS able to update itself.

Once WordPress is done, it sets the permissions back to a safer level (the same way we do for settings.php).

That system is a lot less complex and probably slightly more secure than the current implementation of this system, as the only thing it does is a chmod.

It doesn't try to write files with a higher permission level than the web server and there's no way to plug in and extend this functionality.

This system is not suitably multi-site aware.

This is a big issue for large implementers, and is a direct consequence of the narrow focus on the use case this system suffers from.

What needs to be understood about this functionality is that Drupal has more in common with WordPress MU than with WordPress. A lot of the assumptions that WordPress can afford to make do not work for Drupal.

This system is destructive, replacing original packages.

I tested this by creating multiple D6 sites and placing a module in sites/all/modules. From one of my test sites, I updated this module, and the code went and replaced the shared instance of the module instead of placing the new version in sites/$site/modules.

What made it even more spectacular is that the version of the module that was downloaded broke the system badly, breaking not just the site I upgraded but ALL the other sites using this module.

Potentially hundreds of copies of modules.

If the system were correctly downloading modules into sites/$site/modules, every module could end up installed hundreds of times, in a different file system location for each site.

There is no way that you can sensibly use this system in a multi-site environment.

The FTP credential nightmare

You would need to provide FTP credentials for each of the sites, scoped so that the sites cannot wipe out each other's files, and laid out in a manner this system would understand.

This makes the requirements for using this system much more difficult.

Updates for shared modules

Because this system runs from WITHIN Drupal, it has no concept of any of the other sites running on the same Drupal stack. What this means is that if it updates a shared module, or does a Drupal core upgrade, it automatically breaks all the other sites on the same code base.

It can break them so badly that you can't even log into the sites to be able to run the updates.

This system needs to be far more intelligent than Drupal is about its own environment.

This system is too deeply Drupal.org oriented.

This is a side effect of not being able to disable this system, and goes deeper into why this system won't work for a wide variety of use cases.

Only drupal.org sourced packages are managed.

This system is only capable of feeding from Drupal.org. This means that if you developed a custom theme / module for this specific site, this system will not be able to install or update it.

There is currently a mechanism in the update module to override the update URL in the module's .info file, but future plans for this system include fetching and installing modules for install profiles and the like during installation.
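For reference, the override that chx and dww mention elsewhere on this page looks something like the following in a module's .info file; the module name and URL here are placeholders:

```
; mymodule.info -- hypothetical module pointing the update checker at a
; private release-history feed instead of updates.drupal.org.
name = My Module
description = Site-specific glue code.
core = 6.x
project status url = http://updates.example.com/release-history
```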

This is an issue related to build management, because the build is tested with a certain set of modules and other components. It's very likely that the code that is being automatically downloaded could break your sites in a variety of interesting ways.

But because this system doesn't allow you to easily syndicate your own glue code, it means that your entire build process is for naught.

It is very difficult to provide your own package source.

This is not an issue with the implementation of this system itself, but part of the underlying problem with it being centralized.

Until we had the feature server, the only way to provide a package source was to be running the project module locally, and also be running the cvs scripts and buy into the entire drupal.org packaging and management environment.

Doesn't care about local changes

Another case this will break down is if you needed to patch an upstream package to work with this specific environment. Even if you have contributed the patch back upstream and are working towards getting it included, this system will wipe out your modified files with no easy way to get back to what it should be running.

Even if you had completely valid reasons for needing to modify a module, there is no way to syndicate your modified version to use instead.

There is no mechanism to provide versioned dependencies.

This is as much an issue with Drupal core as with any system built on top of it. If you have a glue module for your project that provides a bunch of panels for Panels 2.x, there is no way to specify that the system should not simply upgrade to Panels 3.x when it comes out, blowing away your entire site with no way to recover.
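To make the gap concrete, a glue module would want to express something like the following in its .info file. The syntax below is purely hypothetical, since (as chx notes elsewhere on this page) the versioned-dependencies patch was still being worked on at the time of writing:

```
; Hypothetical .info syntax -- no such version constraint existed in
; core at the time of writing.
name = My Site Glue
dependencies[] = panels (2.x)   ; i.e. stay on the 2.x branch, never 3.x
```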

How can we fix this?

The implementation in wordpress works, and it works well, but even in wordpress I was able to find the cracks in their armor.

I fully believe we can implement this system, and implement it better than they have, but we have to accept that our requirements are far more complex than WordPress's.

This is a separate system.

Pure and simple, this cannot and should not be a standard Drupal module. For this system to work for Drupal, it will need to be a completely separate system that is shipped with Drupal and may use some of the Drupal core APIs.

What this entails is a separate 'plugin.php' or similar that operates with its own code flow and is never running when Drupal itself is.

The update module could still link into this system if it has been enabled for the site, but for security purposes it should use a different login / session ID, so that exploits on the site won't allow an attacker to gain access to the plugin manager.

Do not bootstrap Drupal fully

Similar to update.php, this system needs to be able to run even if the actual site is not running.
The moment we hit drupal_bootstrap_full, the contrib modules that may possibly be broken will be loaded, and people can start writing exploits for this functionality.

If we choose to use the Drupal API from within this system (and there are reasons we might choose not to), we would likely need to implement another bootstrap phase for Drupal, something like DRUPAL_BOOTSTRAP_CORE, which only loads the files we know we can depend on.
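As a toy illustration of the phase idea (every name below is hypothetical; DRUPAL_BOOTSTRAP_CORE does not exist in core, it is the phase proposed above):

```php
<?php
// Toy illustration of staged bootstrapping. All names are hypothetical.
// The point: the plugin manager would stop before the "full" phase, so
// possibly-broken contrib code is never loaded or executed.
const BOOTSTRAP_CONFIGURATION = 0; // read settings.php
const BOOTSTRAP_DATABASE      = 1; // connect to the database
const BOOTSTRAP_CORE          = 2; // proposed: known-good core files only
const BOOTSTRAP_FULL          = 3; // loads all enabled contrib modules

// Return the list of phases that bootstrapping to $target would execute.
function bootstrap_phases($target) {
  return range(BOOTSTRAP_CONFIGURATION, $target);
}

// plugin.php would stop at BOOTSTRAP_CORE...
echo implode(',', bootstrap_phases(BOOTSTRAP_CORE)), "\n"; // prints "0,1,2"
// ...while index.php goes all the way to BOOTSTRAP_FULL.
echo implode(',', bootstrap_phases(BOOTSTRAP_FULL)), "\n"; // prints "0,1,2,3"
```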

From the perspective of major Drupal version upgrades, it would be even better if this system did not rely on the underlying Drupal API at all, because then a broken core upgrade could still be fixed from within this system, which would remain functional.

Get rid of the false security

The code already in core is incredibly complex and actually counterproductive. Even if we decide not to go the web-writable route (which is a lot simpler, requires a lot less code, and is very likely equally or more secure), we should still get rid of the ssh2 extension.

Use a web writable 'overlay' search path

One of the things I noticed about the WordPress implementation is that, unlike Drupal, it only stores the modules it downloads in a single path that needs to be made writable, namely the wp-content directory.

For all cases other than a core system upgrade, I believe it is sensible to have the downloaded code stored in a new contrib/ top-level directory that mirrors the currently used directory search paths, as implemented in: api.drupal.org/api/function/drupal_system_listing/7

This would involve a small loop that simply prefixes 'contrib/' to the search paths, and it avoids a lot of build management issues WHILE providing a simple way to roll back to the original code base.
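The 'small loop' could look something like this; overlay_search_paths() is a hypothetical helper, not anything in the patch:

```php
<?php
// Sketch of the proposed contrib/ overlay (hypothetical helper name).
// Given the search paths drupal_system_listing() already walks, prefix
// each with contrib/ and search those copies first, so downloaded code
// overlays -- rather than overwrites -- the shipped code base.
function overlay_search_paths(array $paths) {
  $overlay = array();
  foreach ($paths as $path) {
    $overlay[] = 'contrib/' . $path; // web-writable overlay wins
    $overlay[] = $path;              // original, version-controlled copy
  }
  return $overlay;
}

$paths = array('modules', 'sites/all/modules', 'sites/example.com/modules');
print implode("\n", overlay_search_paths($paths)) . "\n";
```

Rolling back to the original code base is then just a matter of deleting the contrib/ copy.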

Provide a simple killswitch for the entire system

A large majority of current Drupal users will not be able to use this system due to the points I mentioned above, so it should be possible to touch a file or something similar to stop the update links from being generated and so forth.

Default-site-only login would also partially accomplish this: if there is no sites/default, the system would be silently disabled, as there would be no way to log into it.

A killswitch is a very simple way to address the facts that this system breaks build management and proper development methodology and is far too drupal.org-oriented, while still making the very useful functionality available to the users who can benefit from it.
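The check itself is nearly free. Here is a sketch of the 'touch a file' idea; every name below is hypothetical, nothing like this exists in the patch:

```php
<?php
// Hypothetical "touch a file" killswitch. If the admin touches
// sites/$site/plugin_manager.disabled, no update links are generated
// and the plugin manager refuses to run. Names are illustrative only.
function plugin_manager_enabled($site_dir) {
  clearstatcache(); // don't let PHP's stat cache hide a fresh file
  return !file_exists($site_dir . '/plugin_manager.disabled');
}

$dir = sys_get_temp_dir();
@unlink($dir . '/plugin_manager.disabled');  // start from a clean slate

var_dump(plugin_manager_enabled($dir));      // bool(true)
touch($dir . '/plugin_manager.disabled');    // the "off switch"
var_dump(plugin_manager_enabled($dir));      // bool(false)
unlink($dir . '/plugin_manager.disabled');
```

A deployment script can then opt out of the whole system with a single `touch` and opt back in with a single `rm`.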

This system is meta-Drupal

We need to accept the fact this system needs to know more about the Drupal installation than Drupal itself does. It needs to be aware of which sites are on the system and which modules they are using.

The way I envision it, there will be a single instance of this system for the entire Drupal installation.

To mitigate the issues in a multi-site environment, we could enforce that only uid 1 of sites/default has access to this system, and that when it operates it takes all the other sites offline while it runs the necessary updates on them.

Another possible option is to manage the permissions for this system separately, in an SQLite database. That would also be a convenient place to store the overarching list of packages on the entire system versus what's available, and it would remain usable even when the main database isn't.
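A minimal sketch of that idea, assuming PHP's PDO SQLite driver; the schema and data are purely illustrative:

```php
<?php
// Sketch of tracking the system-wide package list in SQLite, so the
// plugin manager can answer "which sites run which modules?" even when
// a site's main database is down. Schema is purely illustrative.
$db = new PDO('sqlite::memory:'); // real code would use a file on disk
$db->exec('CREATE TABLE packages (site TEXT, package TEXT, version TEXT)');

$insert = $db->prepare('INSERT INTO packages VALUES (?, ?, ?)');
$insert->execute(array('example.com', 'views', '6.x-2.6'));
$insert->execute(array('other.example.com', 'views', '6.x-2.6'));

// Before updating a shared module, list every site that would be affected.
$stmt = $db->prepare('SELECT site FROM packages WHERE package = ?');
$stmt->execute(array('views'));
print implode("\n", $stmt->fetchAll(PDO::FETCH_COLUMN)) . "\n";
```

With that list in hand, the plugin manager could take the affected sites offline before touching the shared code, as suggested above.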

Make an exception on code freeze for this system

After playing with this system, I've come to the conclusion that it is going to be very difficult to do an iterative development process on it until we have a good set of actual data to use with it.

If we do choose to develop this as a separate system that is shipped with Drupal, I feel that it would be prudent to allow this system to be implemented primarily during code freeze.

This would give us time to get it implemented correctly, and would allow us to build on the D7DX work going on in the community. As more packages become available for testing, we can automate upgrades and migrations between various versions of these packages and fully kick the tyres.

The potential for this system is huge, but we need to make sure we implement it right.

Other things we can do ..

There are some issues I haven't touched on in these suggestions, but once we have the basics right we can talk about things such as solving the 'many copies of the same module' and other issues.

Picture 27.png (18.08 KB)
Picture 26.png (15.33 KB)
Picture 29.png (28.61 KB)
Picture 25.png (25.65 KB)


A few minor corrections

chx's picture

You can specify a project status url in .info files already. Also, the versioned-dependencies patch is separate and being worked on. Finally, the FileTransfer class we already have is also going to be used for settings.php, because it sucks that you need to hand-copy it on install.

Great review, one quick thought: plugin.php for D6?

dww's picture

Thanks for taking the time to so carefully write up and explain the problems with the current approach, and to propose concrete steps to make something that's going to work for everyone.

One quick thought: why not build a version of plugin.php for D6 core? I haven't pondered this deeply, but it's not clear that you'd actually have to touch the rest of core at all to make it work. If you could have a version that works on D6, there's certainly plenty of real data to test with. ;)

Also, I don't know what exactly is in the current implementation that makes you think it's too d.o-centric. update_status lets the .info files specify the URL for their XML feed of available update data. I don't know what's in the current PM-in-core approach that breaks that. In principle, there's no reason it should...

Provide a simple killswitch for the entire system

pearcec's picture

+1 for touch a file.

I wish more systems did this. Touching a file to kill something that is known to have the potential for causing fatal problems can be a life saver, and the implementation is usually a piece of cake.




dmitrig01's picture

I disagree with several points, and some of this is irrelevant or wrong. That said, a lot of it is correct. I think if you eliminate all the things that are wrong (in my opinion), which I outline below, then you should be left with the stuff that's right.

1) dhtml_menu
You installed a version that broke your JavaScript. That has nothing to do with the patch, and it's dhtml_menu's problem. Try updating with some of my test modules (http://dmitrizone.com/update-fake).

2) "This system only really works for occasions when there is one and only one instance of a site. In a real development environment you have the local development instances, the shared development instance, the staging instance and the production instance."
And the current system does nothing. If you want to submit a better suggestion, I encourage that. Otherwise, this is better than nothing.

3) "With no 'off switch' for this functionality, it will become a hell of a lot harder to follow proper development methodologies and make the lives of 'serious' drupal developers a lot harder."
What do you mean, an off switch? It doesn't happen automatically, so you never need to make it happen. There also aren't off switches for a lot of other things; why should there be one for this?

4) "Any XSS exploit that allows access to the uid 1 account will allow an attacker to set up a custom block or something similar that can then simply sit back and wait for the affected user to update their system."
Sure, you can do tons of things with an XSS exploit. This is slightly worse, but it's not that bad.

5) "Even if you set up the SSH2 extension, you are still sending your password via plain text."
So we educate people of security risks... not that hard.

5) "it frequently doesn't even compile with the current stable version of libssh2"
It worked for me with the first stable release.

... You seem to be dissatisfied with SSH. You say it's bad to use, and it's also very hard to install. So only people who have some idea what they're doing will use it...

6) "This extension has no place in Drupal core. It has incredibly complex set up requirements for nearly no benefit."
FTP isn't that hard, is it? Only SSH is.

7) "it uses FTP solely as a mechanism to log into the site and change the permissions so that wordpress IS able to update itself"
How is it more secure? Either way it's FTPing in.

8) "This system is not suitably multi-site aware."
It should work: it uses drupal_get_path() to find the location of the item to update. But it should be easy to fix.

9) "breaking not just the site i upgraded but ALL the other sites"
We encourage a backup, and will potentially add backup functionality later.

10) "This system is too deeply Drupal.org oriented."
I doubt this is the case. I somehow managed to get the updates at dmitrizone.com/update-fake working, so I don't understand your complaint.

11) "Until we had the feature server, the only way to provide a package source was to be running the project module locally, and also be running the cvs scripts and buy into the entire drupal.org packaging and management environment."
That's the case with or without the patch, so I don't see how it has any relation to the patch.

12) "Another case this will break down is if you needed to patch an upstream package to work with this specific environment. Even if you have contributed the patch back upstream and are working towards getting it included, this system will wipe out your modified files with no easy way to get back to what it should be running."
We do encourage backups, as stated earlier. In the words of chx: if you want to shoot your own foot, we won't stop you.

Too late to reply to it all

adrian's picture

But the point of this item is:

1) dhtml_menu
You installed a version that broke your JavaScript. That has nothing to do with the patch, and it's dhtml_menu's problem. Try updating with some of my test modules (http://dmitrizone.com/update-fake).

The point is that we have no control over what the modules do, as contrib is a wild wild wild place. We should isolate the modules from being able to break the plugin manager.

The dhtml_menu was just an example I managed to take screenshots of, but this can happen in a hundred different ways with as many ways to trigger it.

You can't rely on the modules to be working correctly, and testing with a golden set of 'ideal use case' modules will not uncover any of the errors that this system will cause.

Problem of Assumptions

mfer's picture

I think we have a problem with some assumptions.

1) This functionality might not happen automatically, but what happens when an admin who isn't the developer decides to make a change outside the dev workflow of dev -> qa -> staging -> production, on the staging or production server? It creates a mess, and there are people who will do this.

2) "we educate people of security risks... not that hard." It is hard to educate people about security risks. Many don't care until their site gets hacked, and then they are mad at the tool. Most people who build sites with Drupal are not professional web/software developers or engineers. Educating them is hard. It should just work, without education.

3) "we encourage a backup". Sure, we tell people to back up. It's in the update docs and all. That doesn't mean people actually do it; lots of people don't do backups. We have to assume there is no backup.

I think we need to know our audiences better and make some good solid assumptions based on practice for professional devs and weekend hobbyists.

Matt Farina
www.mattfarina.com | www.palantir.net

that's the thing

adrian's picture

Due to really deep fundamental issues with the entire concept, this system will never work for professional developers.

Until you can point this system at your own code repositories, and until we build a lot of supporting systems around it, it will only break dev -> qa -> live even further.

A major part of my concern with this system is making sure it does not make the lives of real developers more difficult.

Staging solution vs plugin manager

Dries's picture

A staging solution should not be confused with a plugin manager, and vice versa. I think we all understand that these are two completely separate things for two completely different audiences. I don't think anyone suggests implementing the plugin manager in such a way that it prevents staging solutions from emerging. As Adrian said, care should be taken that we don't make the lives of real developers more difficult, but honestly, that should be easy to do. I don't see any problem here.

adrian's picture

One of the proposals mentioned was to integrate this system into the update.php file.

So we need to tread carefully with any changes we introduce to the existing code, because doing staging with drupal is already very difficult, and any changes we make to these systems that don't specifically keep this fact in mind could introduce more issues.

Re: Disagreements

JoshuaRogers's picture

Hey Dmitri. I see the points you're making. Let me see if I can help clarify.

1) As has already been pointed out in other responses, the fact that it is integrated means that other modules can break it. That means rollback functionality, even if added, would be broken too.
2) Maybe. The patches I've tested have only worked for sites/all/. I have a multi-site installation with SEVERAL sites, and I'd love for people to be able to upgrade modules for their own sites without upgrading ALL of the sites.
3) I think the point of this was that, the way it is now, even if you don't want this, you can't disable it. Not without disabling all of the update module.
4) XSS is definitely bad. A key logger or rootkit recording FTP or SSH credentials, though... much worse.
5) The point of this one was that there is no actual security here. There is no point in using SSH if you send your credentials in plain text. I would say that is even more dangerous than FTP: with FTP you can only transfer files, but with SSH you've gained a bit more power.
7) I don't think this point was about FTP. I think the point was that the web server can write files in a safe manner. (Well, at least relatively safe.)
12) This would be one of those times it would pay to use md5sums. We really don't want to blow people up.
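That md5sum idea could be sketched like this (a hypothetical helper, not anything in the patch): record a checksum for each shipped file, and refuse to overwrite any file whose current checksum differs, i.e. any file carrying a local patch.

```php
<?php
// Sketch of the md5sum idea: compare a file's current checksum against
// the checksum recorded when the package shipped. A mismatch means the
// file has been locally patched and should not be blindly overwritten.
function locally_modified($file, $shipped_md5) {
  return md5_file($file) !== $shipped_md5;
}

$file = tempnam(sys_get_temp_dir(), 'mod');
file_put_contents($file, "original upstream code\n");
$shipped = md5_file($file); // what a package manifest would record

var_dump(locally_modified($file, $shipped)); // bool(false)

file_put_contents($file, "original upstream code\n// local patch\n");
var_dump(locally_modified($file, $shipped)); // bool(true)
unlink($file);
```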

Debian for the win ...

kbahey's picture


Your review seems comprehensive and tells us why we should not consider the plugin manager for core in D7 as it stands. I was a co-mentor for it for D6 during SoC 2008, by the way.

The important thing is to have a replacement soon. I am fine with an exception for it after code freeze, if Dries agrees. I don't think it will be an issue if it does not break many APIs anyway. This is what dww is alluding to in making it a drop-in thing: the more it is so, the less resistance there will be to it post-freeze.

A .deb model is best, as we discussed years ago, and this is what you are aspiring to, I assume.

Drupal performance tuning, development, customization and consulting: 2bits.com, Inc..
Personal blog: Baheyeldin.com.


Okay.. ugh. Not fun to wake

JacobSingh's picture

Okay.. ugh. Not fun to wake up to.

I actually agree with many of the most important points, and have since the beginning. I just took it for granted coming in that people had already raised them and that the current path was decided. In all my conversations it was universally declared that it would have to transfer files up.

There are many points that I disagree with or that are based on outdated information, but the post is so massive it's hard to comment on everything.

Let's try to start by separating my responses:

Not at all related to this implementation, but general problems w/ Drupal

This system is not being developed with real data

True, but hard to help, really. There is no real data, so stubbing it out was the best start. I'm not too worried about this, though. In fact, developing a set of good dummy modules is most likely a good way to ensure consistency in tests (manual functional tests, not unit tests).

It is unsafe by default because of plain text passwords

I don't get what the problem is here... we can only be as secure as the infrastructure allows. If the user has HTTPS, great; if they don't, we have plain text. If they can use SSH, great; if not, FTP, which uses plain text. The structure is there to build an SSH client that uses keys; it just hasn't been built yet.

Outdated info or I think it is wrong

You are entering your credentials in a potentially hostile environment

Yes, this sucks, but if I got uid 1, couldn't I just execute whatever PHP code I wanted? In which case, what's the point of hacking the site to add a module that executes PHP code I could have just run in the first place?

This system is destructive, replacing original packages.

I don't know if that is still the case. Did you get the most recent patches? If a module is in sites/$site/modules, it will be updated there.

The dhtml_menu Javascript thing

Actually, the most recent patch as of Sunday, I made the whole thing work sans JS. If you read the issue, you can see it.

With this baked so deeply into the update module, it's impossible for people for whom this system will not work to disable it. There needs to be an opt-out functionality for this that can be easily deployed.

Why? We just need to add a couple of if statements and a variable_get(). Am I missing something?
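For illustration, the opt-out could be a sketch as small as this. Everything here is hypothetical: the variable name plugin_manager_enabled is made up, and variable_get() is stubbed so the snippet runs standalone rather than inside Drupal.

```php
<?php
// Stand-in for Drupal's variable_get(), so this sketch is self-contained.
// On a real site the value would come from the variable table or be
// forced in settings.php via $conf.
function variable_get($name, $default) {
  $conf = array('plugin_manager_enabled' => FALSE);
  return array_key_exists($name, $conf) ? $conf[$name] : $default;
}

// The "couple of if statements": gate every plugin manager page and
// action behind one variable check.
function plugin_manager_access() {
  if (!variable_get('plugin_manager_enabled', TRUE)) {
    return FALSE;
  }
  return TRUE;
}

var_dump(plugin_manager_access()); // bool(false): the kill switch is on
```

A site builder who manages code through version control could then disable the whole feature from settings.php without touching any modules.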

Meh... not too important

The ssh2 extension is not really safer either

I didn't realize all you wrote there, that's interesting and disconcerting that they ship it with PHP if that is the case. Here is why it makes sense: If we provide a key based FileTransfer class, it would not be plain text AND there are people who host on friend's servers who may not have FTP, and only SSH. It is unlikely, but possible. I don't have FTP, and although I would most likely not use this system in production, I would use it while running an install profile.

This system is too deeply Drupal.org oriented.

There is nothing stopping us from making it less d.o. oriented or fixing these problems. But at a first pass, we're not looking to replicate apt-get. I think it should be d.o. oriented. All of the points in this section are unrelated to how it is being built, just require a lot more coordination and a lot more engineering.

I agree

Updates can break the system and have no way to recover

This is a Drupal problem, exacerbated by the registry. No other system in Drupal would improve this. The apt-get spec has a conflicts[] field, I think; I don't think we do.

Web server write-able files are not really that insecure

Yes, I think there was too much panic about it. When I first entered the fray I was burned at the stake for suggesting this.

This system is not suitably multi-site aware.

I agree, but I think your points are related to a WIP system. Any new system is not going to be any more MS aware; it's just a matter of writing a little more code. I actually thought about this a lot, and I don't think there is a very usable way to articulate this to normal end-users. The best concept from the world of normal people using software is Windows' "Install for this user or all users", but then it can become quite complicated. I like your idea of a "site manager" site.

Next Steps

This is a separate system.

I totally agree here. I fought tooth and nail to keep any Drupal APIs out of the FileTransfer classes (as they were eventually called) for exactly this reason. I'd advocate a reduced bootstrap though, if possible.

Get rid of the false security

I agree here. Although it was very painful to write those classes and get them through the anti-OOP cabal, I'm personally not of the opinion they are needed (and they could have been easily gleaned from PEAR if I was allowed to). However, since they are there, and they are heavily reviewed and tested, I think we shouldn't just ditch them; they could be useful in other applications, such as chmod'ing (or this one, if we find problems with chmod'ing stuff).

Use a web writable 'overlay' search path

I disagree with this. The idea is sound, the implementation doesn't make sense to me. But that's a detail which can be dealt with later.

Make an exception on code freeze for this system

This would be nice, but we can also get what we've got in by code freeze. I am all for ripping parts of this out (although it hurts a lot) and rebuilding it in many of the ways you've suggested.
I also think throwing out a system which is 90% there is foolish just because it is not "the best idea". Ideas are fine, but code runs. Maybe we won't build the module browse and install screen until we've thought about it more, but updates are really really close, as are install profiles. This enables packaged distros people can bootstrap and update on their own. Nearly every point you've made has merit, and I thank you for spending time to write this, but also I think we can proceed with what we've got for now because:

  • The security concerns are really not that massive IMO, there are bigger fish to fry in Drupal
  • You don't have to use it, and you can disable it for your clients (I certainly would)
  • We tell people to take a backup.
  • Drupal Developers need to stop building tools for themselves and start building them for users, covering every edge case is not needed IMO
  • If you've ever watched someone who is not at all a developer running their own Drupal site, you'd know that this is the single most important feature Drupal can offer, and for them, with one site, no SVN, no budget, no developer, this is good enough. *except, taking a backup - which sucks, but a module can easily be written for it.

Also good to know

It also really sucks to be very nearly at a beta state AND THEN have someone come along and create a big shit storm, essentially invalidating a huge amount of my Sundays and evenings for 3 months and the work of many others over the past 5 months. These are incredibly huge, grueling patches which went through wireframes, UI review, engineering disputes, further UI review, etc. etc. by literally dozens of people. That is not cool. In the future, please try not to keep your feelings to yourself for 4 months. Also, before writing this novella, did it occur to you that you could reach out to individuals involved and discuss the problems / find resolutions as they came up in your head? We could actually be at the system you envision by now if you had just caught me in June.

Okay, end of rant. I probably should have had coffee before reading this post, but it's too late now.

Thanks for the write-up Adrian, see you in Paris.

All the best,

a bit OTT?

sime's picture

It also really sucks to be very nearly at a beta state AND THEN have someone come along and create a big shit storm ...

To be honest, this language jars me, especially when directed at a contributor like Adrian. What if only one comment from Adrian results in changes that save countless hours of extra work and frustration for hundreds of developers?

I'm not against the project

adrian's picture

I'm against parts of the implementation. I believe it can and will be a kick ass system for a certain class of users.

I'm also aware that I am not the use case for this system.

I only seriously started looking at this code 2 weeks ago. I've been aware of its existence and have been playing in the same space for a bit, and I had hoped I would be able to opt out of using it for my use case, but the current implementation meant that I wouldn't be able to.

I had tried to contact you via skype, and via IRC and I have been very public about my review process, but I felt that I shouldn't invade the issue queue until I had an actual case to make. Believe me, the original 'concerns' document that prompted this review was far more harsh in tone and I needed to properly evaluate what was going on before I could make some of these statements.

This is actually part of a larger document I was working on that illustrated how these issues present themselves for a set of 4 different use cases (the target use case, 'real developers' using proper development methodology, distribution developers and hosted service providers). I found that this system adequately serves the target use case but can very easily get in the way of the other use cases. This review illustrates all the issues, but the possible solutions mostly focus on ironing out the primary use case and providing a proper opt-out mechanism to avoid the issues the rest of the use cases will face.

Additionally, this does not mean that all the work is thrown away. The wireframes and UI design can still be implemented, we'd just be implementing them in a 'clean room', that drupal modules can't interfere with. In some ways this will make this system easier to write as we won't be wrangling large core patches, but instead using what we need from core and working in our own space.

With this functionality baked directly into update.module I fail to see how you can turn it off easily without physically removing the update module from all your installations. I completely agree that this is a system for users, but it needs to be implemented in a way that doesn't affect what developers need to do.

The ssh2 extension is only available in Debian sid; PHP does not 'ship with it' to my knowledge. The target use case for this system is not likely to have SSH but not FTP, and if they do, they are also unlikely to be able to install the ssh2 extension, which still offers no added security over the ftp extension. If we change the system so FTP is only used as a fallback to change the permissions on a directory/file, we can simply instruct users who have SSH installed to log in and change the permissions (if they can log in with SSH in the first place, they very likely know how to change permissions).

The system being destructive and not multi-site aware enough is still valid. If a module is installed in sites/all/modules, many sites have access to it. Under no circumstances can this system just assume that because the module is found there it should install it there. Unless it understands what is going on, and is able to update all the sites that have access to that module, it should always install the modules under sites/$site/modules.
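The multi-site rule argued for above can be sketched in a few lines. This is a hypothetical illustration, not code from the patch; the function and parameter names are invented.

```php
<?php
// Hypothetical sketch: decide where an install/update should write.
// Writing to sites/all/modules is only safe when the tool knows about,
// and can update, every site that shares that copy of the module.
// Otherwise it must fall back to the per-site directory.
function plugin_manager_target_dir($module, $site, $found_in_shared, $can_update_all_sites) {
  if ($found_in_shared && $can_update_all_sites) {
    return "sites/all/modules/$module";
  }
  // Conservative default: only the current site is affected.
  return "sites/$site/modules/$module";
}

// A shared copy exists, but the tool cannot update every site that uses
// it, so it installs a per-site copy instead of clobbering the shared one.
echo plugin_manager_target_dir('views', 'example.com', TRUE, FALSE), "\n";
// -> sites/example.com/modules/views
```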

I'll happily work with you to iron out these issues, and I apologize if I spoiled your morning.

Aegir, Open Atrium and feature servers

chx's picture

One could be paranoid and see this whole rant as an attempt to derail the plugin manager so there is more room for feature servers vs install profiles and Aegir. Of course, this is not likely as Adrian is working on install profiles too. But still, I needed to get this off my chest.


eaton's picture

Adrian has been helping build fundamental components of Drupal longer than you've been using it, chx, and he has a long track record of seeing architectural needs before the rest of the crowd realizes they are even on the horizon. His work on Aegir in particular has been done in public, with open discussion, collaboration, and transparent development for several years. It's uncalled for to suggest that he's trying to "derail" anything, especially when his complaints are clearly explained and his suggestions constructive.

Ad hominem attacks like this

moshe weitzman's picture

Ad hominem attacks like this have no place on this web site or in this community.

Not nice

deekayen's picture

It's not a rant. The topics touched on seem to be all issues with the underlying design and architecture to get the points covered. Stopping to write a patch to cover this post just wouldn't work.

We have a hard enough time getting people to even review patches in core. I think it's good to step back and look to see if the design works. There are plenty of people that would use it, decide they didn't like it for any combination of reasons, and never tell anyone why so it could be fixed.

adrian's picture

If we only write files in a single directory (except for major version upgrades, of course), and we use FTP only for changing the perms, you will be able to change the perms of the directory on your local development server by hand and use the system without needing to install any additional services.

Hey, that's clever!

dman's picture

A much tighter API/bridge/whatever to address the security issue, without having to write and test scripts that go out and move files around.
Just an FTP OR SSH login that unlocks the directory, code that directly messes around in there using usual means, then locks the door later.
So much easier to write.

Damn. With that working, I could even use a scripted CVS update or browser-invoked drush dl again.
Nice thought.
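The "unlock the directory, write with usual means, lock the door later" flow might look like the following sketch. Local chmod() calls stand in for the FTP/SSH round-trip that would flip permissions on a real host, and all names are illustrative, not from any patch.

```php
<?php
// Illustrative sketch: temporarily open a directory for writing, do the
// file operations with ordinary PHP I/O, then restore the old mode. On
// a live site, the two chmod() calls would instead be performed over
// the user's FTP or SSH connection.
function with_writable_dir($dir, $callback) {
  $old_mode = fileperms($dir) & 0777;
  chmod($dir, 0775);            // "unlock the door"
  try {
    $callback($dir);            // copy/extract files using usual means
  } finally {
    chmod($dir, $old_mode);     // "lock the door later"
  }
}

$dir = sys_get_temp_dir() . '/pm_demo_' . uniqid();
mkdir($dir, 0555);              // starts read-only, like a locked modules dir
with_writable_dir($dir, function ($d) {
  file_put_contents("$d/new_module.info", "name = New module\n");
});
echo file_get_contents("$dir/new_module.info");
```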

what i want out of this...

anarcat's picture

The issue with dhtml_menu is a good example, in my opinion. The fact that it's dhtml_menu's fault doesn't change anything: if any module upgrade breaks, you're screwed and you have no rollback. Adrian did present an alternative which was to install the module in a separate, temporary layer to allow for testing.

I'm amazed to read that it's "not hard" to "educate people of security risks". If we introduce a new user-based vulnerability issue, then it's a new issue. Fixing security bugs is fairly simple, close the loop, fix that bug... Educating people is exactly what is hard and introducing a vulnerability at that level has serious consequences that shouldn't be overlooked.

I like the idea of using FTP to only change permissions on the filesystem and let the application do the rest, then revert permissions to sane levels. The alternative (having the application have full access) is problematic because there is no separation whatsoever. Layers will bring additional security, blurring and removing them will reduce security.

The issue with using SSH in the backend is obvious. It tells the user the upgrade is "safe" while we are actually protecting the least vulnerable part of the stack: the communication layer between the webserver and the filesystem, which is usually (to say the least) on the same server. In the very great majority of Drupal sites out there, the frontend is not SSL protected and we /will/ be requesting users to type in their credentials over that unsafe, in the clear, channel, which is very often susceptible to XSS, wiretapping and other cute attacks.

I must also state that I am very uncomfortable with the direction core is taking with this. For me, this would require modifications before being shipped with Drupal 7 (but who am I to tell...), namely:

1) a kill switch: not everyone upgrades their modules that way, and some people will want to keep their systems tighter than others and close that hole. yes, it's probably just a matter of a few variable_set, but I'd prefer this would be a separate module that could be more easily turned off from outside.

2) provide SSH-based backend only if the frontend is SSL-protected, and clearly indicate the implications of typing that password in there. in fact, I would rather see a separate layer where the webserver has write access and that is in the search path than have Drupal fiddle with its own modules the way things stand right now, but I'd be ready for that compromise.

3) do not break interoperability. multi-site is one of the greatest features Drupal has to offer, and people are only starting to realize what it can do for us, even if it's been there for so long. i understand some of the issues pointed out by Adrian have been fixed, and that's good, but we should keep that in mind.
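Point 2) could be as simple as a guard like the following sketch. The function names are made up for illustration, and the server array is passed in (rather than reading $_SERVER directly) so the snippet is testable.

```php
<?php
// Hypothetical sketch: only offer password-based transfer backends when
// the page itself was served over SSL, so credentials never cross the
// wire in the clear.
function request_is_ssl(array $server) {
  return (!empty($server['HTTPS']) && $server['HTTPS'] !== 'off')
      || (isset($server['SERVER_PORT']) && (int) $server['SERVER_PORT'] === 443);
}

function password_backends_available(array $server) {
  // In real code, $server would be $_SERVER.
  return request_is_ssl($server) ? array('ssh', 'ftp') : array();
}

var_dump(password_backends_available(array('HTTPS' => 'on')));
// both backends offered over HTTPS
var_dump(password_backends_available(array('SERVER_PORT' => '80')));
// empty: no password prompt over plain HTTP
```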

I'm not totally against the plugin manager either. I have only recently (in the last 2 weeks) become aware of it. There's a lot of stuff happening in the Drupal community, and especially in Drupal 7, it's really hard to keep up with it, so it's a bit unfair to blame the community for not reviewing what is actually work in progress more quickly.

Full disclosure: I'm also developing Aegir, at Koumbit.org (@chx: hi! ;). We are using it for our needs, and we hope to make it a platform to maintain N Drupal sites easily. I do not agree with the "one site" theory. Most of the use cases I saw for our clients revolved around one site at first, but then a second one, and a third one, or a test one and a prod one, or one for me, then one for my friend. There's never "only one Drupal". And even if there's only one, it's never fire and forget. People have to take care of the security upgrades, and the trouble is not so much training people to use FTP and do backups, but "how to recover from that damn upgrade that broke my site". That's why we have maintenance contracts on Drupal sites. That's why we're developing Aegir with staging functionality: so that we can test upgrades, and "fix them once, run everywhere".

I'm all for making people more autonomous, in fact, it's one of our core founding values at Koumbit. But I'm not sure that slapping in a plugin manager will achieve that. People will still have to do backups before they run the upgrade, they will still have to fix broken upgrade paths (don't tell me that will stop happening). Our approach revolves around offering Drupal as a service instead. A site that is automatically maintained and updated. You don't have to worry about upgrades, backups, it's all being taken care of, it's the service you're paying for (through advertisement, directly, or heck, get it for free). I feel all the big shops are turning in that direction. Everyone of them has their collection of scripts, provisioning software, etc. It would be unwise to leave them dead in the water.

With Aegir, we hope to turn those efforts into a collective energy that will create something everybody can use. I can see how the plugin manager fostered a similar energy, and I hope we can work together to further those goals.

A kill switch or contrib

bonobo's picture

Like many of the other commenters in this thread, I'm not the target use case here --

But with that said, having something in core that has the potential to disrupt development best practice is not ideal. Of equal or greater concern, however, is the ability an untrained user has to break a site using something like the plugin manager. The dhtml_menu example is telling. If that had happened to a user unfamiliar with Drupal, this would have easily turned into another "Drupal Sucks" post -- "I tried to update my site using the core Plugin Manager and it broke completely." The fact that the problem resides in the contrib module and not the upgrade tool is a moot point to the non-technical end user -- and that non-technical end user is the use case for this.


Ideas are fine, but code runs. Maybe we won't build the module browse and install screen until we've thought about it more, but updates are really really close as are install profiles. This enables packaged distros people can bootstrap and update on their own.

This is awesome functionality, and has some amazing potential. This would be able to develop a lot more freely in contrib than in core. Having this in contrib for D7 would allow for people to see use cases that we haven't imagined yet, largely because we haven't seen this functionality tested with live sites.

RE educating people to security best practices: when it comes to non-technical users, the target use case for this, you can actually hear their eyes glaze over when you try and explain the difference between FTP, SSH, SFTP, clear text, and encryption. In a general sense, they care, but wrt the technical details, they don't care. My concern with a tool like this is that it creates the illusion of security, when in reality it is only secure when it is configured correctly, and it is more likely to be configured correctly by someone who understands security best practices, who wouldn't need this tool in the first place, and therefore isn't the target use case for this.

Non-technical users should definitely have a tool like this. But issues of this magnitude this close to code freeze make me very nervous.



Click. Connect. Learn.
Using Drupal in Education

i think we can make it

adrian's picture

I really think we have enough time to get it into core for D7

We're not scrapping everything we have and starting over, we're just re-arranging the pieces more sensibly.

If we choose to go for web writable directories, we would be simplifying code, not adding new features, so I feel that can happen in code freeze.

the great thing about gdo

chx's picture

is that I can unsubscribe from issues, and I do. Do whatever you want; I do not want to waste my time fighting. I am sick of fights.

Not fighting

Boris Mann's picture

If you read the thread above, there's a pretty good back and forth. Yes, it's unfortunate that Adrian didn't get a chance to weigh in until now.

So, let's go build plugin.php or whatever and make something that kicks ass for end users for once. That's about the one point I see here -- a lot of people going "yep, I'm a dev that will never use this" -- so I'd love to see how perhaps pieces of this system could a) help end users 1,000,000 times by helping with updates / install profiles / new modules / etc. and b) have code that could also be harnessed for various dev use cases.

Off the top of my head, distributing updates - esp. to install profile like "things" - seems like something that would be win-win for both end users and devs.


adrian's picture

We're aware that this system is not designed for us. And we're ok with that.

We want to make sure it works amazingly well for the people it was designed for, and doesn't break for the people it wasn't designed for.

We might end up seeing Drush make use of some of these APIs in the future, but not while Drush still needs to support D5 and D6.

Okay, I apologize for coming

JacobSingh's picture

Okay, I apologize for coming off a bit pissed. If anyone missed the point earlier:

1). I agree with almost all of what Adrian has written.

2). I don't think the current system should be scrapped, I think we should make code freeze with it, or something slightly better. Most developers think that most people who use Drupal are like their clients. I don't think they are. The vast majority are people with one site, on shared hosting, probably installed with fantastico or by their teenage kid or something - and they matter - a lot, and this group is the future of Drupal.

3). I still think the process here is a total drag. This needed to be raised a long time ago, iteratively, with individuals or in the queue, and it could have been. I never received any communique to this effect from Adrian; in fact, we haven't talked at all since DC. Basically, if this had been raised in June, we probably could have saved about 120-200 engineer hours; an expensive mistake. But I'm willing to let that go and try to salvage what we can and move in another direction if one of the two "holy" people who can actually commit this want to go there.

4). Ouch. Not much time. I will continue to slam on this, but we need to agree on a direction, and establish some stewards to take this forward who are committed to working on it, not just talking about it and we need a blessing from Dries or Angie before I waste any more of my time.

So can we get to work now?


we should meet up on irc

adrian's picture

I'm on almost all of the time. Vertice on irc.freenode.net.

2) I didn't say we should scrap it. I just feel weird about patching core files for code freeze just to unpatch them again afterwards. I'm pretty sure most of the current patch can be fairly easily translated to a separate file: some code from update.php, and replacing the menu hook with a function call. At the level we are bootstrapping to, the forms will still work, the batch API will still work, and we will still have access to the update module's tables and the system table to do login validation.

3) I wasn't involved yet 3 months ago. The patch queues related to this are also not the easiest thing to get into; it is a hydra of a beast and quickly evolving code (as was illustrated by my dev environment late last week already not having some of the fixes you mentioned).

4) I can contribute some time this week to help coordinate and review patches to try and get us back on track. It's only fair, since I am throwing the proverbial spanner in the works.

Let's chat on irc

A feature request

SLIU's picture

This thread inspired me to submit my first feature request to Drupal:


If nothing else happens .. I

Macronomicus's picture

If nothing else happens .. I implore you, please get the opt-out killswitch chingus in! I don't let any of my clients do updates because it wastes too much of my time. Plus, with my very own (hopefully soon) features server and Aegir platforms, it will be easier than ever for our clients to get the good stuff without worrying about the overtly techie bits. At the end of the day, it's what nearly every average user wants .. plug in, turn on & go on about their day job.

Great Points

irakli's picture

Great analysis, Adrian.

These are very real concerns that you raise and it's important that they are paid proper attention to. I will even take it one step further and say: we, the Drupal community, need to finally figure out where we're headed. Either we embrace the fact that Drupal is being considered so mature and enterprise-ready that even most popular US government websites do not shy away from it, or we try to target $5/month shared hosting installations and FTP-based updates. We can't do both. Any system that tries to do "everything", falls short of doing anything well, we all know that.

Don't get me wrong, I don't know a Drupal developer who would not welcome an improved plugin architecture. We should also, definitely, be appreciative of the work of people who are trying their best to create new, better things, but the "new" stuff should be built based on what has worked for Drupal (and what has not), and when somebody who has as much experience with Drupal as Adrian does raises so many concerns, that should ring a bell.


Thank yous

perandre-gdo's picture

I just want to say thank you to all developing & reviewing the plugin manager. You make people happy!

Late to the party...

bjaspan's picture

I had a chat with Joshua Rogers about this last night, enclosed below. I wish I had posted this last night but it was too late. Ah, well.

Executive summary: I don't think we want an update system running inside Drupal. I suggest a native client-side app (like DAMP), the Drupal Updater, which SSHes to the server and uses Drush for updates. A yum/apt-style package manager, as Khalid suggests, is possible too. Now, here's the chat log:

Joshua: Hi.
Joshua: Do you have a few minutes?
Barry: Hi.
Barry: Sure.
Joshua: So, I've read through the entire article that Adrian wrote.
Joshua: I'm talking with him in IRC now.
Joshua: He sounds like he's making some good points.
Joshua: I wanted a second opinion.
Joshua: I don't want to blindly follow the bandwagon.
Joshua: I thought you might have some insight.
Joshua: Mainly the articles that Adrian wrote. Would it be better for us to separate plugin manager from Drupal proper?
Joshua: much like update.php
Joshua: Would it really be better for us to directly write php files with the webserver?
Joshua: (Assuming of course that we fixed permissions before and after.)
Barry: Let's take the 2nd question for now.
Barry: I first learned of this "update files via ftp" approach from Joomla a year ago.
Barry: My initial reaction was: That's stupid. The reason to do it is because if the app can write its own code, then if you subvert the app, you can take over the server. But if the app can ftp to its server, then if you subvert the app, you can still write files and take over the server. So you've added complexity and fake security without gaining anything. However...
Barry: The advantage of the ftp/ssh approach is that the app then cannot write files without human intervention.
Barry: Which does seem like an improvement.
Barry: But it is an improvement that comes at its own risks.
Barry: It encourages users to transmit server-controlling credentials to a PHP app which is known to be periodically insecure (http, xss, etc.).
Barry: So requiring ftp/ssh with human intervention has some security benefits but some security risks. It also adds substantial extra complexity.
Barry: I guess I'd summarize by saying it seems like a really terrible idea, except the idea of giving the web server write permission to the modules directory seems even worse. HOWEVER, the fact is that every site in the world is basically running in a configuration where it has write access to its own docroot.
Joshua: Lesser of two evils then?
Barry: I'm trying to figure out how to summarize a position.
Barry: I think the idea of a web app downloading and installing add-ons is an obvious cool feature from the user's point of view, and a truly horrible idea with no acceptable implementation all at the same time.
Joshua: Wow.
Joshua: Surprisingly enough, that actually cleared up a good bit...
Barry: WP and Joomla have basically poisoned the well by making a bad decision to provide this functionality, thus making it important that Drupal does too.
Barry: What I'd like to see is a browser JS app that interacts via SSH directly from the browser to the server, not via PHP on the web server at all. That would be a true improvement. Except the browser certainly cannot be trusted to access your real SSH keys.
Barry: because browsers are so insecure themselves
Barry: So that brings us to a desktop app (e.g. drush).
Barry: The desktop app would be able to spawn ssh for secure communications.
Barry: Okay... so now what I'm thinking is that someone needs to produce something like the DAMP installer, except it will be the Drupal Updater, that users can download as an executable for their native OS and run locally.
Barry: This app would then interact with drush on the server.
Barry: And thus I am addressing your first question too.
Barry: Namely, yeah, I probably think that plugin manager should not run inside a booted drupal.
Barry: I'd almost forgotten, but this is practically the first thing I wrote for Acquia, in January 2008.
Barry: I basically concluded that you can't update Drupal from within Drupal, for many of the same reasons as Adrian's.
Joshua: You have been a great help.
Joshua: Thank you.
Barry: you're welcome.

Don't think it should be separate

Boris Mann's picture

As Dries said, I think update.php or plugin.php shipped with core is the only way this is going to work. See Firefox or WP as being the easy way things work - both included in the core distro which is why it gets used in the first place.

I totally get your "poisoned well" approach. I have barely gotten this functionality to work in some WP sites that I manage, because I had to jump through a dozen hoops to get the ssh and supporting libraries working correctly. As Adrian said, I don't run FTP on any of my sites.

But, the vast majority of people that will benefit from this DON'T have commandline access at all, and FTP is their only tool. The pain they go through in even just uploading a new module is ... well, it's painful to watch.

My thoughts (and decisions)

Dries's picture

Adrian, a big thank you for writing up this post. While a bit late, it is still very valuable. My mother always tells me: "better late than never". ;-)

Like most people, I find myself being mostly in agreement. I support changing our direction based on this discussion. Like always, I want us to focus on building the best possible solution.

All things considered, I still support a plugin manager in core.

  • Let's not confuse the plugin manager with a magical quality assurance tool. Broken modules are broken modules. It is not the task of the upgrade system or the plugin manager to deal with that. Regardless of the plugin manager, we need to raise the bar for module maintainers. One way to do so is to roll out the SimpleTest test bot for contributed modules, which we are working on. It will help module maintainers as well as end users. ;)

  • It would be nice (but not required) if at some point, we could support a private CVS or SVN repository as a "file transfer" protocol instead of FTP or SSH2 (e.g. check in uploaded or upgraded files). It would allow for nice rollback functionality.

  • To make rollback support possible in the future, I support moving to a stand-alone plugin.php, or better yet, integrating the code in update.php and building on the existing update.php system. Like install.php can now be executed from the command-line, it would be nice if update.php could be executed from the command-line (think apt-get like functionality) in addition to having a web UI. If we don't want to support rollback, I don't see the value of making this a stand-alone system. Without rollback support, a stand-alone system is equally destructive. (Does that align with your thinking, Adrian?)

  • We should improve versioning support in core. chx has been working on a patch for that. That will help the plugin manager too.

  • I'm happy to get rid of the sshx extension and to focus on FTP. Related to that, I'm also happy to follow more of the Wordpress model and go with web-server-writeable files, especially if it simplifies things. This sounds like the most important thing to do, and more important than making it a stand-alone system at this point. I recommend that we prioritize our efforts properly -- it is easy to obsess about the wrong things.

  • I'm undecided about providing an exception or extension for code freeze. My current thinking is that it will depend on how much progress we have made prior to code freeze. For an extension to be granted, I think the functionality needs to be 75% complete instead of the current 20% complete. I want to see it fully functional before code freeze. If so, we can obsess about the details and some extras after code freeze. The base functionality has to be there though.

  • I think it is good to focus on drupal.org for the time being, but I fully support opening up the system at a future point. It would be great if people could upload .zip or .tar files, connect to other repositories, etc. However, I think it is best to leave that as a follow-up patch. This shouldn't be top priority before code freeze.

  • I support adding a kill-switch for the simple purpose of enforcing certain development practices in certain environments. That said, the kill-switch could be as simple as a variable or even a permission. As Dmitri said, it is easy enough to ignore the feature if you don't want to use it. As said, it would be nice to have a kill-switch, but let's not obsess about the kill-switch -- that should be trivial to add regardless of any other implementation details, and it certainly isn't a release showstopper for me. This shouldn't be top priority before code freeze.

Code freeze

robertDouglass's picture

I'm undecided about providing an exception or extension for code freeze. My current thinking is that it will depend on how much progress we have made prior to code freeze. For an extension to be granted, I think the functionality needs to be 75% complete instead of the current 20% complete. I want to see it fully functional before code freeze. If so, we can obsess about the details and some extras after code freeze. The base functionality has to be there though.

Depending on how autonomous this system is it could have its own overlapping release cycle. By this I mean D7 goes into code freeze, and perhaps even releases, but that some amount of time later we release D7 + plugin manager.

If it's REALLY autonomous it could even get its own extra core release manager - the webchick of the plugin manager.


irakli's picture

Hmmm. The plugin manager should provide a means to safely recover after a plugin breaks the entire system. You should NOT have to go to a backup for such a thing. How would you feel if, after installing a plugin with a minor bug, you had to re-install Firefox?



Okay, so as long as no one

JacobSingh's picture

Okay, so as long as no one has a problem with it, Adrian and I are going to meet w/ Dries this week to hammer out a plan of action and a list of must haves to try to get something in by code freeze.

Adrian is willing to kick in some time, and I'm willing to keep at it a bit as well. We have to stay very focused on the possible, not the ideal, here, or this won't happen. It's very important this doesn't become everyone's pipe dream / laundry list. There is always a better way to do something; actually, there are as many better ways as there are people with user accounts on d.o.

We'll be re-tooling some issues as a result of our conversation, and we'll keep anyone interested in helping in the loop here and on the original issues.


P.S. While I'm totally into the ideas stated about using apt-get / deb or sshing in and running drush, etc. - I'm pretty sure the bar is raised here to include anyone hosting a Drupal site. This means Windows, so AFAIK both of these solutions are pretty much out. Plus, they are very, very far from being implemented on a server-side, client-side or cultural level. In summary: wicked cool, not gunna happen right now IMO.


Dries's picture

I just talked to Matt Mullenweg from Wordpress a bit, and they haven't run into major issues so far. He described the quality assurance as Mozilla-style "tested up to", which relies on the developer to update. Clearly, Mozilla and Wordpress get away with it ... As I wrote, let's not confuse the plugin manager with a magical quality assurance tool. Broken modules are broken modules. If we add module ratings, module reviews and automated testing, module maintainers might become (even) more careful.

According to Matt, the biggest thing they haven't tackled yet is compatibility checking -- he doesn't like how they do compatibility checking so far. It seems like with chx' patch, we should be in good shape there.

That said, we shouldn't look at Wordpress only -- there is real value in looking at some other plugin managers to see what we can learn from those.


adrian's picture

But as was pointed out, installing a broken Firefox extension doesn't hose your Firefox.

What Wordpress does is disable the module, do the upgrade, and then try to re-enable the module. If no errors occur it stays enabled, otherwise it is disabled so it doesn't interfere with the rest of the site.

Our problem is a fair amount more complex, since Wordpress doesn't actually have codified dependencies, and only ever updates one package at a time.
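That disable/upgrade/re-enable flow can be sketched roughly like this (Python rather than PHP, purely to illustrate the control flow; `safe_upgrade` and its arguments are invented names, not a real Wordpress or Drupal API):

```python
# Hypothetical sketch of the Wordpress-style "disable, upgrade,
# re-enable" flow described above. All names are illustrative.

def safe_upgrade(module, enabled, upgrade):
    """Disable a module, run its upgrade, and only re-enable it
    if the upgrade raised no errors."""
    enabled.discard(module)     # 1. disable so a broken module can't hose the site
    try:
        upgrade(module)         # 2. run the actual code/schema upgrade
    except Exception:
        return False            # 3. leave it disabled on failure
    enabled.add(module)         # 4. re-enable only on success
    return True
```

On failure the module stays disabled, which is exactly the property that keeps one broken upgrade from taking down the rest of the site.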

This wasn't brought up in this comment, but I am going to address it here.

We should not integrate this functionality directly into update.php, because people who are managing their own builds still need to run update.php as it exists now.

update.php should still exist as is, but it needs to be API-ified, so that this system can call that functionality when necessary. This is also the most non-invasive and future-tolerant way to implement this, as this system will eventually evolve to a point where it will update a module and then need to run update.php on multiple sites.

We already crossed this bridge in the Drush project, where we have two commands (updatecode and updatedb) which handle the relevant parts of the functionality, and an 'update' command which just calls those two commands for you.
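The split can be sketched like this (Python for brevity; the function names simply mirror the Drush commands mentioned above and are not real APIs):

```python
# Sketch of the Drush-style split: two primitives and a convenience
# command that composes them. Names mirror drush's updatecode /
# updatedb / update, but these are illustrative stubs.

def updatecode(log):
    log.append("fetched and unpacked new module code")

def updatedb(log):
    log.append("ran pending schema updates (the update.php part)")

def update(log):
    # 'update' is just the two primitives in order; a package manager
    # could call updatedb (an API-ified update.php) on its own, once
    # per site, after updating code a single time.
    updatecode(log)
    updatedb(log)
```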

Another small issue I have is naming.

Plugin manager is Wordpress terminology; I believe the most accurate name for this system is 'package manager', and therefore I suggest the entry point be something like 'packages.php'.

update.php is now mostly API-ified

dww's picture

I just helped get #233091: Move code from update.php into includes/update.inc into D7. It's not beautiful, but it's a start. So, packages.php (or whatever), should be able to just include includes/update.inc and invoke the same functions that update.php uses. I completely agree we should not merge this package manager stuff into update.php. That should stay its own beast for doing its own thing. I ranted about this in issue #233091 months ago.

Circular Dependencies

irakli's picture

While we are on the subject: let's not forget about the Achilles' heel of all package managers: circular dependencies and the need to install them all at once, not in sequence. That's something Drupal does not handle too well, even in the current UI-based, manual manager.


this isn't too much of a problem on upgrade

adrian's picture

You should read my post on the drupal 'ports' collection if you are interested in these problems : http://groups.drupal.org/node/21295

We should be able to get by during upgrade, unless a module has new dependencies added in a new release, which I don't think the current system accounts for. We would only be bitten by circular dependencies in this case if the newly added dependency is not already on the system and introduces a circular dependency.

We're also fscked because the dependencies are on modules, not packages. So you couldn't add a dependency on just views_ui, because you need to specify views, the main module of the package.

AFAIK, we don't have the entire dependency tree available to us yet, so we can't generate the necessary graphs ahead of time.
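To make the hazard concrete, here is a minimal sketch of the graph processing such a tool needs: a depth-first topological sort that reports circular dependencies instead of looping forever (Python for brevity; the data shape and module names are invented, and real resolution would also have to handle versions and packages vs. modules):

```python
from collections import defaultdict

def install_order(deps):
    """deps maps module -> list of modules it depends on.
    Returns an install order, or raises on a circular dependency."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = defaultdict(int)    # all modules start WHITE (0)
    order = []

    def visit(mod):
        if color[mod] == GRAY:
            # we re-entered a module currently being visited: a cycle
            raise ValueError(f"circular dependency involving {mod}")
        if color[mod] == BLACK:
            return              # already placed in the order
        color[mod] = GRAY
        for dep in deps.get(mod, []):
            visit(dep)
        color[mod] = BLACK
        order.append(mod)       # dependencies first, dependents after

    for mod in deps:
        visit(mod)
    return order
```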

I actually don't think it's technically possible to build a proper complete packaging tool within drupal itself, because the scope is a lot wider than just this drupal directory. You essentially need an installation of this system PER SERVER, not per Drupal directory.

We can however try to package some of the information in a way that's useful to this tool on drupal.org, but even then we are just making it harder to get your own package source up and running, which makes it harder to use this system in a proper development environment.

why not build an index?

miro_dietiker's picture

Hi adrian,

I'd like to point to:

Sure, we need not only per-module but also per-package meta information first.
Most package managers build complete local indexes after importing the referenced repository's package information. Sure, this sounds like a lot of data for a simple install... but if we don't have that, we could only query, for example, a repository server to return compiled information (to get chunks of the index).

Info files should remain minimal. Instead we should add an additional file type which contains the fully dumped dependency tree, including module specs. Maybe we should dump it as XML?

I'm sure everyone would profit from such a perspective. And that's only beginning with drupal.org, where we could publish much more reliable package information: relationships, relevance, conflicts, ...

I really hope you're going to improve Drupal's package handling. Good luck!


adrian's picture

That's what we need to do.

Also, my personal feeling is this should be a directory tree of YAML files.

Unfortunately an index like this will get very large. Check the ports post I linked to.
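For example, one entry in such an index might look like this (a purely hypothetical shape; every field name here is invented for illustration):

```yaml
# Hypothetical entry in a per-package dependency index.
views:
  version: 6.x-2.6
  core: 6.x
  modules: [views, views_ui, views_export]
  dependencies: []
```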

Perhaps you just threw YAML

Garrett Albright's picture

Perhaps you just threw YAML out there off the top of your head, but I would campaign against it. Not that YAML's bad, but it means that we'd have to add a YAML parser to Drupal. Drupal already has its own INI-style parser (which we'll hopefully be able to ditch soon now that PHP 5.3's parse_ini_file() supports array-like structures), so why don't we stick to using that?
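For what it's worth, the existing .info format (including its repeated `key[] = value` array syntax) is simple enough that a parser for it is tiny. A rough sketch, in Python rather than PHP and far cruder than Drupal's real parser:

```python
import re

# Minimal sketch of parsing Drupal's INI-style .info format,
# including the `key[] = value` array syntax. Not the real parser:
# it ignores sections, constants, and nested array keys.

def parse_info(text):
    data = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(";"):
            continue                      # skip blanks and comments
        m = re.match(r'^(\w+)(\[\])?\s*=\s*"?([^"]*)"?$', line)
        if not m:
            continue
        key, is_array, value = m.groups()
        if is_array:
            data.setdefault(key, []).append(value)
        else:
            data[key] = value
    return data
```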

because ..

adrian's picture

Drupal has a 'not invented here' issue, and we can't rely on PHP 5.3 for at least another two years (maybe Drupal 9. MAYBE).

In the meantime there are very decent YAML parsers, and YAML also happens to be the most sane serialization format out there.

That's all well and good,

Garrett Albright's picture

That's all well and good, but not compelling reasons to include a new parser in core when we've got one there already, plus the ones built in to PHP. But I suppose we're getting off track of the main issue.

JSON as YAML subset

Robin Millette's picture

Could it be JSON instead, if the appropriate subset of needed features is supported?

I think we should use JSON.

anarcat's picture

I think we should use JSON. We already use it in lots of other places, it's well supported and we don't want to have another language to parse....


adrian's picture

JSON is a subset of YAML

and it's not nearly as human-readable as YAML.

Also, PHP's JSON functions are not nearly the greatest.

But this is all bike shed colors right now.


irakli's picture

I am with Adrian here. Configuration markup must stay human-readable, hence it cannot be JSON. And yes - it is "bike shed colors" :)



anarcat's picture

I like JSON. :P Okay, I'll stop now. :)

Build Scripts Should Handle Packages

irakli's picture

Since a package is just a collection of modules (e.g. views is a collection of four modules), I think it should be the responsibility of the drupal.org build scripts (the ones that create the tar.gz downloads) to maintain information about packages. That way we would avoid information redundancy and the inconsistencies that can easily emerge from it.


More meta

miro_dietiker's picture

If a package were only a collection of modules with no need for further meta information, the modules would have been better published on their own.
A package introduces a name, a namespace, a display title, and since it's a container by intention it needs at least a meta description. With all that, when installing a package we could potentially suggest default or common configuration (module recommendations).

I fully agree that the package contents, as the sum of all modules, could be added automatically.

Senpai's picture

I support all the work that has gone into this system up until now, and I extend a huge thanks to everyone that has left their droplets of sweat on the patches that make this beast churn. I really think it's gonna make it in before code freeze. Cheers!

However, in all the above discussion regarding modules, versioning, and the like, I haven't heard anybody in this thread ask about themes. If this is to be a full-featured package manager, what about the themes?

If this package_manager.php system is to be a standalone, autonomous behemoth, will it be able to flawlessly cloak itself with the current, active administration theme set in the current user's Drupal site? And what if that theme is modified by a custom module like Skinr or Fusion which gives it all of its page, node, & block styles? Can we still make the user's Drupal site look just like the Drupal site from a few moments ago if we're not fully bootstrapping?

Also, what about different themes chosen by distinct administrative users of various subsites during a system-wide replacement of that theme? If Drupal is not bootstrapped, what will it look like during the upgrade of that theme(s)? And then, can two admin users of two different subsites run the upgrade for their theme at the same time?

I really like all the talk of making this thing into an autonomous, Debian-like upgrade system, but not at the expense of having to design, create, or maintain two distinct administrative themes + the regular theme for each and every subsite in a 1000+ Drupal installation.

Is this a problem?

Joel Farris | my 'certified to rock' score

I would like to see a further point of abstraction

bhuga-gdo's picture

There are two layers to what's being done here.

One is a fetching mechanism that fetches components--we're talking about modules, but the idea of 'fetch and make installable' applies not just to modules and themes but also to install profiles, patterns, 'features' (which are currently just modules but I would not think that the d.org cvs repo is necessarily the best place for that), and maybe other things I'm not thinking of.

The other layer is how to resolve dependencies and all of that sort of thing. This is separate from the fetching mechanism and needs to be done per component type. That's where your theme-specific issues come in (and I picked your post at random, so please don't think I'm implying your comment is somehow misguided--you're completely correct).

I would like to see the fetching mechanism considered separately from the installation/dependency mechanisms. As it stands, the conversation is muddied by an overlap of the two concepts. While they more or less need to be done in parallel, as having one without the other is not terribly useful, it's useful to separate out what's being done. With the two more separate, the fetching mechanism Adrian was discussing could be used for fetching themes. Meanwhile, some other set of code could define, for example, the 'roles' of per-theme regions that install profiles could use to determine how to configure a theme and do exactly what you mean here. Importantly, the two discussions can be separate and the best solution can be found for both.

it can't and it shouldn't

adrian's picture

The theme (or any custom code) loaded in this system is a vector for an exploit.
Themes are also likely to require function calls to modules which would not be enabled in this system.

This system should use drupal_maintenance_theme(), as update/install does, for security and stability reasons.


Rock it

eigentor's picture

I really appreciate Adrian jumping in. Concerns need to be raised. As long as it does not block the thing from happening.

There have been so many thoughts recently on how to use this: for install profiles, the Features module, etc. It could be a massive help in improving on the fact that a basic Drupal image gallery needs ten modules and a ton of configuration.

Thinking solely from an end user's perspective, this being an "outside" system makes sense. But it needs to work mostly automatically: you choose to install all needed modules, you are prompted to enter your credentials, and on it goes, in the end presenting you with the result on the modules page or wherever.

This means that even if it is outside the box, it still has to be able to interact with the rest of Drupal, be called and call other events. Would this also mean that only uid 1 can execute it, like update.php? No roles, no permissions?

The entire panic about it breaking your install: basically it does not change much. A broken Drupal module can break your Drupal installation now, and always could. WSOD and no way to roll back, especially for a non-techie.

The new danger is that it will now be so easy to install or update modules. One click of a button and everything is gone.
And this is the reason why I like Adrian's initiative: a way to easily roll back is exactly what we need. This can address almost any concern (apart from the dev vs. end-user perspective, which is another cup of tea...)

Maybe talk to Boomatower: he did some crazy duplication of the entire database in multiple instances for the Usability Testing Suite. As crazy as it sounds to replicate the entire database: it did not even take that long, and downloading and installing quite a few modules will take ten to twenty seconds at least anyway. I guess the user is prepared for that, given a nice progress bar... http://drupalmodules.com/module/usability-testing-suite
Replicated databases do not even have a performance penalty once they are there. I guess we need only one shadow copy, so this won't bloat your database too heavily.

Life is a journey, not a destination

Easy backup goes along with easy updating

Boris Mann's picture

WordPress instructs people to backup their database before auto upgrading, and includes backing up in core. If we're going to do updating, we're likely going to need to make it MUCH easier to backup. That is, I agree that these same end users we are targeting don't have interfaces that make it easy to backup their database.

There is a lot in Backup Migrate that is really easy and just works.

Backups can take a long time and disrupt uptime

irakli's picture

Again: who are we building future Drupal for? Mom and Pop's blogs? Or enterprise? What's the future here, really?

Eaton tweeted just today: "Tell #drupal people it's okay to admit WP is much better for blogs." By the same token - are we building Joomla in Drupal or are we looking ahead?

With all due respect, backing up before an update of every lousy module that may do something stupid (there're 4,000 of them, after all) may be OK for my personal blog, but not for some of the huge websites being built on Drupal. Backups can take a long time, and be disruptive. God forbid we need to actually recover from a DB backup. And all of that for what? Because we do not want the package manager to bypass the Drupal bootstrap and as such be capable of disabling a faulty module even if Drupal is broken? We are not asking for a magic tool here (at least I am not), but the basics must be covered.

I apologize if I sound harsh, but Drupal is the first PHP-based, open-source system that has a real future with the "big boys" and it just pains me to see us discussing shared hosting compatibility and browser-based updates, while large-scale needs are put at the back of the priority queue.

Again, I apologize if I hurt anybody's feeling by saying this. Just my two cents, coming from very sincere love towards Drupal.


The little guys need love too

JacobSingh's picture

Hey everyone,

It's funny: we've been hard at work on this for months, and hardly any of the people here commented on anything related to the project despite planet posts, IRC chats, etc. I didn't realize there were so many strong opinions out there!

There are a lot of use cases for this functionality, and a lot of reasons the little guys matter. Here are a few:

  1. Drupal is not "in with the big boys", I'm sorry, but it's still very far off. Drupal has recently (in the past year) become known as an "interesting new FOSS project" by some CIOs out there in the F500. It's not central to infrastructure, etc. There are some high-profile websites, I've worked on a couple, but it isn't going to bump off SharePoint anytime soon - but it should.

  2. Drupal got big because of people being able to use it quickly, and start contributing quickly. We are not going to go head2head with Sharepoint and Documentum, we need to come in from the side, and this means appealing to people at every level, and being a joy to use, not covering every feature under the sun. This will create a word of mouth tsunami type marketing campaign when the CEO of Nike's daughter's boyfriend makes a cool Drupal site for his band, etc. This is how we get in IMO.

  3. I volunteered to lead this effort because I see the new plugin manager having the following effects:

    • Opening up Drupal to non-technical users (this includes, btw many decision makers in the big players who want to be able to evaluate it quickly / try it out for their blog).
    • Improve the quality of contrib by changing the culture from "commit it and quit it" to umm... (holding back the backwards compatible joke).
    • Makes it easier for me NOT to work in core anymore, and build contrib modules which I know everyone can use. (ahh... peace and relaxation).
  4. I did NOT volunteer to:

    • Build apt-get
    • Turn this on for my clients
    • Use it myself for anything but install profiles and maybe prototyping.

Seriously, people: Reeeelaaax. If we give people a tool they didn't have before, and the tool works fine, but sometimes contrib modules are messed up, we can't do much about that. It's still an improvement, and it is okay. It's no different from FTP'ing into your server and replacing the directory or using CVS.

The real issue is:

By opening up Drupal to non-technical users, we (as developers) will be forced to produce higher quality, BC code and better documentation

And I think this scares people, hell, it scares me enough to say that I would certainly not allow my clients to use such a system, but that's okay - I still see the benefit in it.

Jacob, I (and I am sure most

irakli's picture


I (and I am sure most everybody here) have the utmost respect and appreciation for all the hard work you and others have done on this, way before we joined the discussion. Sorry if that was not clear.

Nor is it my intention to come in late in the game and try to derail the project. I may have my strong opinions and wishes for Drupal, but of course that's not necessarily where Drupal should head. All we ask is a clean and straightforward way to opt out of the "small guy" workflow and use an apt-get/drush-like tool for larger projects.

Thank you for your patience and for listening to us. I do realize it's not easy to take criticism late in the process.


Hi irakli, No hard feelings

JacobSingh's picture

Hi irakli,

No hard feelings at all, I'm just genuinely surprised there is so much interest all of the sudden. Thanks for being involved, would be great to get your feedback when we get it done too.

At present: Dries, Adrian and I met, we've developed the minimal viable product spec, I'm working on drafting it up, I'll post it publicly, and then Adrian and myself (and probably others) will start making new issues to address porting to an external file and a few other things.

A preview of the spec for Sep 1st (not forever) is:

  1. Be able to update modules one at a time
  2. Do dependency checking before update (but don't resolve them)
  3. Make a simple installer with no browsing interface, people can upload a tarball or copy-paste a link from d.o. and it will install it.
  4. A separate plugin.php, addon.php, etc., not in update.module.
  5. Using chmod instead of actually moving files up (this one not 100% P1 yet, but most likely happening).

There may be a little more, there may be a little less, but that's the basic gist. I'll be sharing a more in-depth backlog, etc early next week.



emjayess's picture

When we see Drupal gain favor from the CIO's office, in what would otherwise have been yet another SharePoint site (YASS?)... I believe it'll be because of the absence of a strategy to go head2head with the "big boys", and because Drupal stayed its course.

Folks who posit that there is a flaw in the current trajectory of Drupal, or that a target market choice must be made (either we exclude the enterprise or we exclude the little guy)... seem too easily to forget what has already been accomplished, and what this community is capable of; and I don't buy it.

Matt J. Sorenson (emjayess)
d.o. | g.d.o. | WEBJAX'd! | twitter


Linux Kernel as analogy

Justin Riddiough's picture

Something that has shaped my perspective of Drupal was an interview Dries did some time ago where he described starting Drupal. The general idea was that the foundation of his development methodology is rooted in Linux kernel development, especially in regards to modularity.

After that, I've always looked at Drupal from the perspective that it is something akin to an "OS" for a website. Not just any OS, I'm talking about the one that runs devices from cell phones to supercomputers and everything in between - the GNU/Linux Kernel. If the analogy holds, Drupal should be the solution for both the blogger and the enterprise, and of course, everything in between.

On occasion, I've even heard 'write one thing, and write it well' sentiments echoed in discussions...


adrian's picture

Linux isn't an operating system. It's a kernel. Red Hat, Debian and Ubuntu are operating systems.

Is Drupal the kernel or the operating system? That's what we need to be considering.

Drupal core: Linux

SLIU's picture

Drupal core: Linux kernel

Modules: Linux utilities & apps

Installation profile, features, ... : Linux distributions

There is enough room for everybody if we play by the rules (APIs).

Drupal : OS

Justin Riddiough's picture

kernel vs. os - Hadn't looked at it from this angle, but it makes sense. It even draws the line for comparison.

SLIU: Your analogies are pretty close to what I perceive, except the initial install of Drupal seems to be more on the OS side.

A kernel to me would be the base code that has a 'default installation profile' applied to it that sets up all of the defaults (page, story).

Optional core modules include a suite of applications (book, blog) that I wouldn't particularly expect with something like debian-netinst. Perhaps if I reach I can see them as a standard toolkit for interacting with users (i.e. ping, ssh).

db & file system backups before updates

purrin's picture

I know the time is very late to be discussing feature requests for Drupal 7, but in light of all the discussion regarding a plugin manager and potentially a Drupal updater, it seems like it would be pretty quick and straightforward work for us to one-up Wordpress in some regards as well, rather than just shooting to be "as good as."

One quick, simple way to do that would be to add two checkboxes to the plugin manager that would cause the system to gzip a quick copy of the entire filesystem and/or a dump of the db to the file system during that process. This would be a major boon to wider adoption of Drupal, and that is something that would ultimately be very beneficial to all of us. Would it be a potentially big (but temporary) server load? Sure. Is this going to benefit serious developers and major web projects? Probably not. I think for users in general, though, it would be immensely useful. That's one reason why Wordpress has had tremendous success - standard operating procedure is to make it easiest for the 80-90% of users, knowing that power users will figure out how to get their work done in short order.

It actually might make more sense to just put this functionality as a standalone bit in core, to make it easier for end users to stay within Drupal most of the time, including backups even when they are not upgrading any modules, in addition to firing off backups upon updating a module. One thing that really frightens novices is the idea "I am about to do something that could potentially blow up and my site will be down and I won't know how to fix it." This would allow them a much larger, easier-to-access comfort zone. I think anything we can do - especially when it is relatively easy to make happen - is great for users and good for Drupal as a whole...

-=- christopher


Totally with you here. I

JacobSingh's picture

Totally with you here.

I actually wrote a module to do just that for an Acquia product (our version of Wordpress MU) which we will be demoing at DrupalCon. I'm sure we will contrib it at some point.

In terms of the implementation, ours is super basic and will only work on a machine which can run shell scripts from PHP and uses MySQL. We looked at backup_migrate, but porting it to D7 was too much work given the timeframe, and it seemed a bit overkill for what we needed.

All things considered, I think that for the time being, having a backup module in contrib is better. It can be the first module new users download with the plugin manager :)

If we were able to build a simple enough one to put in core, that would be a good investment too IMO. But if we get to the point where normal users can install modules easily and install profiles don't require you to separately DL each module, then core can become a little more light weight (code wise), but still appear feature rich to the end user, and I think this is a good thing.


one note...

purrin's picture

The one thing I usually feel compelled to point out in these sorts of discussions is basically in defense of the novice (to Drupal). First, I do get keeping core as lightweight as possible... I really do. Some things, though, are so beneficial to so many people - read: not experts - that it's worth giving a second or third consideration. So the thing I want to point out is: every time we put a fundamental process a layer or more away from pure simplicity, we make assumptions about the user. In this case, making a solution like I'd mentioned a contrib module assumes that a) the user is aware that a solution exists that they need to be more 'safe,' b) they know where to look for it, c) they know how to install and use it, and d) they don't act like a normal human and be lazy :P (speaking from personal experience hehe)

On the other hand, properly executed, a plugin manager - with or without a backup system - will quite possibly in and of itself bring about fewer upgrade 'crashes' by removing the user error/human element from the equation, so at least there is that. Anyways, just wanted to add a few thoughts there and play devil's advocate for the little guy for a bit. :)

-=- christopher


Backup_Migrate pretty awesome

Boris Mann's picture

Sounds like it won't work on a lot of shared hosting setups in any case because of the shell scripts from PHP.

So, I'm pretty happy with backup_migrate. It's got a bunch of extra features around naming backups etc. - but things like the scheduled backup are pretty killer. Never mind the "migrate" part, so you can stick in a DB from your local machine - an actual help with a "workflow light" for less pro developers.

And, if plugin manager means that we can define an install profile in core that includes contrib modules like backup, then that would work.

PM is *not* required to make better install profiles

dww's picture

I don't know what's giving so many people the notion that we need PM to improve install profiles. For literally years now, I've been explaining that it'd be a relatively easy job to make the d.o packaging script smarter, so that install profiles checked into d.o's CVS can define a file that lists the other projects (modules, themes, translations) and the versions to include. It would check out core, put the profile in the right place, put all the modules, themes and/or translations you defined into the right spots, and tar/zip up the whole thing. Then, you really have a 3-step process: 1) download the initial file, 2) unpack it, 3) visit install.php -- everything's already there -- you have your site up in no time. Even works if your site is offline -- so long as you got the initial .tgz or .zip somehow (USB stick, CDR, whatever). That's vastly better on so many levels than PM-based install profiles could ever be...
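To make the packaging idea above concrete, here is a minimal sketch in Python of what such a script might do: read a manifest of projects and versions shipped with a profile, and decide where each one lands in the packaged tree. The manifest format, the profile name, and the directory layout are all hypothetical illustrations, not d.o's actual packaging scheme.

```python
def parse_manifest(text):
    """Parse 'project = version' lines into a dict, ignoring blanks and comments."""
    deps = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        project, _, version = line.partition("=")
        deps[project.strip()] = version.strip()
    return deps


def layout_paths(profile, deps):
    """Decide where each project would land inside the packaged tarball."""
    paths = {profile: f"profiles/{profile}"}
    for project in deps:
        paths[project] = f"sites/all/modules/{project}"
    return paths


# Hypothetical manifest shipped alongside an install profile.
manifest = """
# modules to bundle with this profile
views = 6.x-2.6
cck = 6.x-2.2
"""

deps = parse_manifest(manifest)
print(layout_paths("example_profile", deps))
```

A real packaging script would then check out each project at the listed version and tar the whole tree, but the manifest-plus-layout step is the heart of the idea.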

I'm putting on my post

JacobSingh's picture

I'm putting on my post Sep-1st hat here. This thread is all about D8 really, because D7 ain't gunna have any of this stuff and we can't even talk about it for D7 :) Okay, disclaimer over.

I think what dww is saying is right on. It is simpler and more reliable. AND, I can see a case in the future where install profiles are based on modules loaded from other repositories, or install profiles which are dynamically created. This would be harder to pull off.

I know it is a slippery slope to start comparing our setup with Ubuntu because there are many differences, but in Ubuntu land:

You have distros based on Ubuntu like Edubuntu.

These distros come on a CD you can download and install from. They have everything on board (this is the model @dww is proposing). It makes sense.

Then you have certain packages which are optional and don't fit on the CD, those are downloaded. We don't have this problem, because our software is very small and we don't need to send physical media anyway.

Sometimes you have "meta-packages"; these are more analogous to the Features work that DevSeed has been doing. These will actually install 5-6 other packages, and may contain startup/installation instructions, etc. for the batch.

Should these be bundled up? A Feature may request packages from multiple distros (in the future), so it won't make sense to pre-bundle, but it could. :) You may very well want it to be more dynamic also, and be able to generate a spec for it and have it still be runnable without re-bundling.

What is an install profile at the end of the day? It's just Drupal + a little procedural script to do some stuff. To make use of Drupal APIs to change things when the system is getting built. If we eventually support something like meta-packages, shouldn't an install profile just specify which meta-packages or "features" it wants, and then if it REALLY needs to, do some of its own setup?

So in a hierarchy of usability to flexibility: a Source (as in sources.lst) contains Install Profiles -> which reference Features -> which reference Modules and Themes.

And the Modules and themes could be included in the Install Profile, or could be downloaded separately.
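The hierarchy above can be sketched as a tiny resolver: a profile references features, and each feature references modules, so resolving a profile flattens everything down to one module list. All the names and the flat-dict layout here are made up for illustration only.

```python
# Hypothetical feature -> module and profile -> feature mappings.
FEATURES = {
    "blog_feature": ["views", "tagadelic", "pathauto"],
    "gallery_feature": ["imagecache", "views"],
}

PROFILES = {
    "community_site": ["blog_feature", "gallery_feature"],
}


def resolve_modules(profile):
    """Flatten a profile's features into a deduplicated, ordered module list."""
    modules = []
    for feature in PROFILES[profile]:
        for module in FEATURES[feature]:
            if module not in modules:
                modules.append(module)
    return modules


print(resolve_modules("community_site"))
# e.g. ['views', 'tagadelic', 'pathauto', 'imagecache']
```

Whether the resolved modules ship inside the profile tarball or get downloaded on demand is then just a packaging decision layered on top of this resolution step.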

Meh, dreaming, I actually don't think any of this is worth the time investment to get it perfect. :) I'd like to invest in integrating apt-get or yum or something with Drupal for these big problems, and build a simple PM for Windows folks and people on shared hosting who don't want to deal.



Boris Mann's picture

You are preaching to your biggest choir :P

My point being that if it ain't available in a core download, then the average end user on shared hosting just isn't going to see it -- or at least, very low penetration compared to "what comes out of the box".

And, did I mention that if anyone wants to run with this (that is, install profile bundling script upgrades for d.o.), the DA has matching funds for a fund raise available.

Both the Open Media Project

kreynen's picture

Both the Open Media Project and Radio Engage have already been talking to hunmonk and dww about funding install profile bundling on D.O. (basically treating profiles like modules and creating a single downloadable archive of the modules the profile uses). Both of these Knight-funded projects have to deliver an install profile as part of their next milestones. Getting the initial profile done isn't really an issue, but maintaining it is. If there are matching funds from the DA, both of these projects should have funds to contribute.

I can't speak for Radio Engage, but we'll take you up on that match! Ping me on #drupal-openmedia if you want to follow up.

Coming to drupal as I

Macronomicus's picture

Coming to drupal as I did.... after tinkering with joomla for a couple of months & reading up on WP et al. I was a bit hesitant with Drupal (4.x demo site theme was soooo bad! lol) but after going back to d.o a few times I couldn't help but notice the fantastical modules folks were creating for drupal, & the intellectual level of discussions. After looking deeper I realized that drupal was nothing like other cms's; drupal is highly developer centric, and as such comes with the proverbial learning curve some have discussed. I wouldn't trade that for anything! ...it accords a certain level of understanding that we're not merely some commercial interest resting on our laurels; we're constantly stabbing @ the bleeding edge of what's possible, & what's more fun than that?

I dunno .. nothing against WP/Joomla (& they are fantastic) but they are not a challenge to me, they are for a different crowd, & that's not a bad thing; we are all in the same community anyways, so why not let drupal remain dever centric? ...it's probably the single greatest decision made early on in Drupal's infancy IMO, & why it's light-years ahead.

My two cents: I hope Drupal remains a challenge & keeps evolving so much that it's never too "easy" for newbs; instead it should be appetizing to devers who will ultimately contribute more & yes, even come up with "ways" to get the power of drupal into non-techie hands, replete with options that do everything for them. & this is good, so long as the core stays universal enough to create the in-flight management/entertainment platform in my Airship.

Looking into my crystal ball...... ahhhh yes ... I see an updater that soothes maintenance for non-techies and cowers in the omnipresence of developers.

Where we are today

adrian's picture

I'm just going to follow up with what jacob said.

This review has been successful in that it has allowed us to open up the discussion of the assumptions going into this system, while we still have enough time to correct them.

I've been meeting with Dries and Jacob, and I will continue meeting with them regularly until this is done. We have isolated a spec that addresses all of the issues I have raised above, and will resolve any of the lingering problems I have with this system.

The new spec is completely capable of staying out of the way of serious Drupal developers, and will ship with a kill switch.

If this were an issue, I would feel no compunction about closing it as soon as we create the new issues for this spec.

Something that is broken now

irakli's picture


have you guys covered the following case at all?

One thing that was broken in Drupal 6, last time I checked: if module A depends on module B, then as long as they are enabled in one go Drupal does not complain, but it does NOT check that B is actually installed before A is installed.

Basically the way it works now is:

  1. Let's say module A declares module B as a dependency.
  2. Drupal makes sure that when A is enabled, B is either already enabled or selected in the same submit flow (they are enabled in one go).
  3. Drupal collects all the modules being enabled, orders them alphabetically (or randomly? but definitely not by dependencies), and installs them one by one. It does NOT pay attention to the fact that since A depends on B, B must actually be installed first.
  4. If A uses functions from B during installation (and since A comes first alphabetically, B won't be installed first), you get nasty errors and a broken installation.

A very real example of this is a module that deals with CCK and depends on content_copy, in the installation.

This is pretty bad. If correct queueing is hard to implement, we should at least ask the user not to enable A until the installation of B is fully completed.

P.S. The only way out, right now, is to implement requirements hook in the installation script of A. Not a clean solution, though, and since the initial problem is not that obvious - a source of many possible bugs.

Have you run into such problems?
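The ordering bug described in this comment is essentially a topological-sort problem: install each module only after everything it depends on. Here is a minimal sketch in Python (Drupal itself is PHP; the function, the dependency dict, and the module names are hypothetical illustrations, not Drupal's actual API), using the CCK example above where a module depends on content_copy, which in turn depends on content.

```python
def install_order(deps):
    """Return the modules ordered so each appears after all of its dependencies.

    deps maps a module name to the list of modules it depends on.
    Raises ValueError on circular dependencies.
    """
    order, visiting, done = [], set(), set()

    def visit(module):
        if module in done:
            return
        if module in visiting:
            raise ValueError(f"circular dependency involving {module}")
        visiting.add(module)
        for dep in deps.get(module, []):
            visit(dep)  # install dependencies first
        visiting.discard(module)
        done.add(module)
        order.append(module)

    for module in sorted(deps):  # alphabetical only as a tie-breaker
        visit(module)
    return order


# "a_module" sorts first alphabetically, yet must be installed last.
deps = {"a_module": ["content_copy"], "content_copy": ["content"], "content": []}
print(install_order(deps))  # ['content', 'content_copy', 'a_module']
```

With plain alphabetical ordering, a_module would be installed first and its install hooks would call into content_copy before it exists; the dependency-aware ordering avoids exactly the breakage described above.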



Hey, I have bookmarked that

waldmensch12's picture

Hey, I have bookmarked that page. I didn't know it before, but now I am really happy to know such a good page with great issues.

Easy backup with easy updating

vado's picture

I think update.php or plugin.php shipped with core is the only way this is going to work.

