I have been harboring growing concerns about the direction of the plugin manager integration proposals for Drupal core almost since their inception, but I was not able to form an informed opinion on the subject until I had reviewed the proposed system in much greater depth.
While my impressions of the project and the current implementation are fairly negative, I have aimed to frame my criticisms constructively, because I believe that the goals of the project are worthwhile, and that for a large section of users this could be a very flexible and usable solution.
Unfortunately, many of my concerns have proven well founded, and I believe that the implementation of this system needs to be taken in a different direction for it to succeed in its goals without negatively affecting projects like Aegir and Open Atrium, and the many other 'serious' Drupal developers.
These are the underlying problems with the current implementation. Keep in mind that these aren't necessarily problems with the underlying 'idea' of the system, so they can still be worked around or at least partially resolved by adapting the implementation.
The system as it is implemented now relies on a full Drupal bootstrap, because it is a Drupal module. There are more ways to break a Drupal site than I can count, and this system automates those failure modes so that they occur with a much higher frequency.
Updates can break the system, with no way to recover
These issues include, but are not limited to:
- Modules can and will conflict with each other
- Modules can break the system update page
- Only enabled modules can be updated
An example I actually ran into while testing this patch:
- I installed dhtml_menu alpha1 on a Drupal 7 site.
- When I went to the update page to update the site, the following happened: [broken ftp details form](http://groups.drupal.org/files/Picture 25.png)
- Using the broken form resulted in the Batch API JavaScript not working: [broken update batch page](http://groups.drupal.org/files/Picture 26_0.png)
- I then disabled the module, and found that the system would no longer upgrade disabled modules: [disabled modules not upgradeable](http://groups.drupal.org/files/Picture 27.png)
I could have used any number of mechanisms to completely break this system, but what I am getting at is this: it should not be a Drupal module.
This system needs to be insulated from all the ways Drupal can break, and it needs to remain available when the rest of Drupal isn't. If this is not done, it will cause an explosion in support requests from users who do not understand Drupal.
This system is not being developed with real data
During my research I struggled to find even a handful of projects that have multiple Drupal 7 releases at this point. I did find the update_module_test project, but that just illustrates the point: this entire system is being built before we actually have a way to properly test it.
Almost all the other packages I found caused side effects ranging from completely blowing away the site to subtler, somewhat recoverable issues (such as blowing away my menus and blocks).
We simply cannot try to build and finalize this system until the packages it is meant to install have matured. To do otherwise would be putting the cart before the horse.
The current implementation cannot be opted out of.
Something that not a lot of people seem to understand is that this system is ONLY really useful for a very specific use case: the 'end user' installing Drupal on shared hosting where they already have FTP available to them.
With this baked so deeply into the update module, it is impossible to disable for the people this system will not work for. There needs to be an easily deployed way to opt out of this functionality.
The requirements for running this system are in no way universal outside the target audience.
I have been a Drupal developer for many years, and not once have I allowed FTP access to any of the sites I maintain, because FTP is inherently insecure and I simply do not want site admins messing around with the code on the site. If I were to give out FTP access, it would only be for the files directory of each site.
There is the SSH fallback, but that is even more difficult to roll out, as there has never been a stable version of the PECL extension and it is not packaged by most distribution vendors. There are also some fundamental security issues surrounding this extension that I will get into later in this document.
I had to go through a lot of trouble just to get this system working so I could test it. I've never run SSH or FTP servers on my local development environment, because I've never needed them. Why would I?
This system breaks build management.
This system only really works when there is one and only one instance of a site. In a real development environment you have the local development instances, the shared development instance, the staging instance and the production instance.
The code running these instances needs to be managed with a revision control system like SVN.
This system completely breaks that model by making each of the instances separately updateable at all times, with no oversight of what the other instances are running and no way to get the changes back into a revision control system where they can actually be managed.
With no 'off switch' for this functionality, following proper development methodologies will become a hell of a lot harder, and the lives of 'serious' Drupal developers considerably more difficult.
There's more on this subject later in this document, but suffice it to say this is rather a big issue for me and others.
The security model is security theater
A big deal has been made about how this system is supposedly more secure than the WordPress implementation, but as it currently exists it may actually be more dangerous.
You are entering your credentials into a potentially hostile environment
This is a major side effect of the functionality being a Drupal module and running inside a fully fledged Drupal.
Any XSS exploit that allows an attacker to gain access to the uid 1 account will allow them to set up a custom block or something similar that can then simply sit back and wait for the affected user to update their system.
The first thing people do when a security release is announced is update their system, so an attacker can harvest system logins from infected sites, which will have much longer lasting consequences than anything else.
The plugin manager system should not be part of the normal Drupal code flow; it needs to be able to depend on a set of KNOWN good files being available to it.
It is unsafe by default because of plain text passwords
Even if you set up the SSH2 extension, you are still sending your password via plain text.
We could set up a packet sniffer at DrupalCon and harvest SSH and FTP logins for hundreds of sites over the course of the event.
To make this actually safe, you would need to set up HTTPS for the site. This is impossible for the main use case, and for the security-conscious (who in all honesty would already be using Drush for this), it's simply another thing to manage.
I'm willing to let this slide ONLY because FTP itself sends passwords in plain text, which is a major reason why most security-conscious users won't have FTP available for use.
The SSH2 extension is not really safer either
There has never been a stable release of the SSH2 extension in its five years of existence; it frequently doesn't even compile against the current stable version of libssh2; the requirements to build it are in no way commonly installed in hosting environments; and no major OS vendor even packages the extension itself.
After you go through the entire rigmarole of getting the SSH2 extension working, you will still be sending your password in plain text. All the SSH2 extension can do is encrypt the connection from the web server back to the web server itself, over the localhost loopback.
It is easier and simpler to install an FTP daemon and configure it to accept connections only from localhost, which gives you the same conceptual security once you have configured HTTPS for the site.
This extension has no place in Drupal core. It has incredibly complex setup requirements for nearly no benefit.
Web server writable files are not really that insecure
Firstly, we need to make peace with the fact that Drupal already does write PHP files.
That's how we create settings.php.
What makes that 'secure' is that we set the permissions correctly after we have written it.
We don't need FTP or SSH2 available just to create that file, and we don't need them for module downloads either.
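For illustration, a minimal sketch of that write-then-lock-down pattern (paths and file content simplified; the real logic lives in the installer):

```php
<?php
// Simplified sketch of the pattern the installer already uses: write the
// file with the web server's own privileges, then lock it down afterwards.
$settings = "<?php\n// ... generated site configuration ...\n";
file_put_contents('sites/default/settings.php', $settings);
chmod('sites/default/settings.php', 0444); // read-only once written
```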
I went and looked at how WordPress does it, and their implementation is actually better than our current one.
If the permissions are correct for WordPress to make the filesystem-level changes itself, it just works. But if they are not, AND WordPress doesn't have the rights to change them itself... it uses FTP solely as a mechanism to log into the site and change the permissions so that WordPress IS able to update itself.
Once WordPress is done, it sets the permissions back to a safer level (the same way we do for settings.php).
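A hedged sketch of that fallback flow as I understand it (the function names, paths and credentials here are illustrative, not WordPress's actual API, and it assumes the FTP root maps onto the web root):

```php
<?php
$dir = 'wp-content/plugins';
$ftp_user = 'owner';  // placeholder credentials
$ftp_pass = 'secret';

if (is_writable($dir)) {
  // Common case: the web server can already write, so just update.
  install_package($dir); // hypothetical update routine
}
else {
  // FTP is used only to flip permissions, never to transfer the files.
  $ftp = ftp_connect('localhost');
  ftp_login($ftp, $ftp_user, $ftp_pass);
  ftp_chmod($ftp, 0775, $dir);  // open the directory up
  install_package($dir);        // now the web server can write
  ftp_chmod($ftp, 0755, $dir);  // restore the safer permissions
  ftp_close($ftp);
}
```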
That system is a lot less complex and probably slightly more secure than the current implementation of this system, as the only thing it does over the wire is a chmod.
It doesn't try to write files with a higher permission level than the web server, and there is no way to plug in and extend this functionality.
This system is not suitably multi-site aware.
This is a big issue for large implementers, and is a direct consequence of the narrow use-case focus this system suffers from.
What needs to be understood with this functionality is that Drupal has more in common with WordPress MU than with WordPress. A lot of the assumptions that WordPress can afford to make do not work for Drupal.
This system is destructive, replacing original packages.
I tested this by creating multiple D6 sites and placing a module in sites/all/modules.
From one of my test sites, I updated this module, and the code went and replaced the shared instance of the module instead of placing the new version in sites/$site/modules.
What made it even more spectacular is that the downloaded version of the module was itself broken, taking down not just the site I upgraded but ALL the other sites using this module.
Potentially hundreds of copies of modules.
If the system were correctly downloading modules into sites/$site/modules, every module could end up installed hundreds of times in different filesystem locations, once for every site that uses it.
There is no way that you can sensibly use this system in a multi-site environment.
The FTP credential nightmare
You would need to provide FTP credentials for each of the sites, scoped so that sites do not have the ability to wipe out each other's files, in a form that this system would understand.
This makes the requirements for using this system much more difficult.
Updates for shared modules
Because this system runs from WITHIN Drupal, it has no concept of any of the other sites that are running on the same Drupal stack. What this means is that if it updates a shared module, or does a Drupal core upgrade, it automatically breaks all the other sites on the same code base.
It can break them so badly that you can't even log into the sites to be able to run the updates.
This system needs to be far more intelligent than Drupal is about its own environment.
This system is too deeply Drupal.org oriented.
This is a side effect of not being able to disable this system, and goes deeper into why this system won't work for a wide variety of use cases.
Only drupal.org sourced packages are managed.
This system is only capable of feeding from Drupal.org. This means that if you developed a custom theme or module for a specific site, this system will not be able to install or update it.
There is currently a mechanism in the update module to override the update URL in the module's .info file, but future plans for this system include fetching and installing modules for install profiles and the like during install.
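For reference, that override looks something like this in a module's .info file (assuming the 'project status url' key that the update checking code honors):

```
; Point update checking for this module at an alternate release server.
project status url = http://updates.example.com/release-history
```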
This is an issue related to build management, because the build is tested with a certain set of modules and other components. It's very likely that the code that is being automatically downloaded could break your sites in a variety of interesting ways.
But because this system doesn't allow you to easily syndicate your own glue code, your entire build process is for naught.
It is very difficult to provide your own package source.
This is not an issue with the implementation of this system itself, but part of the underlying problem with it being centralized.
Until we had the feature server, the only way to provide a package source was to run the project module locally, run the CVS scripts, and buy into the entire drupal.org packaging and management environment.
Doesn't care about local changes
Another case where this breaks down is when you need to patch an upstream package to work in your specific environment. Even if you have contributed the patch back upstream and are working towards getting it included, this system will wipe out your modified files, with no easy way to get back to what the site should be running.
Even if you had completely valid reasons for needing to modify a module, there is no way to syndicate your modified version to use instead.
There is no mechanism to provide versioned dependencies.
This is as much an issue with Drupal core as with any system built on top of it. If you have a glue module for your project that provides a bunch of panels for Panels 2.x, there is no way to specify that the site should not simply upgrade to Panels 3.x when it comes out, blowing away your entire site with no way to recover from it.
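For illustration, a versioned dependency could look something like this in the glue module's .info file; the syntax is hypothetical, as nothing currently parses a version constraint here:

```
; Hypothetical syntax: pin this dependency to the Panels 2.x branch.
dependencies[] = panels (2.x)
```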
How can we fix this?
The implementation in WordPress works, and it works well, but even in WordPress I was able to find the cracks in their armor.
I fully believe we can implement this system, and implement it better than they have, but we have to accept that our requirements are far more complex than WordPress's.
This is a separate system.
Pure and simple, this cannot and should not be a standard Drupal module. For this system to work for Drupal, it will need to be a completely separate system that is shipped with Drupal and may use some of the Drupal core APIs.
What this entails is a separate 'plugin.php' or similar that operates with its own code flow and is never running when Drupal itself is.
The update module could still link into this system if it has been enabled for that site, but for security purposes it should use a different login and session ID, so that exploits on the site won't allow the attacker to gain access to the plugin manager.
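As a rough sketch of what that separated session handling could look like inside a standalone plugin.php (all names here are hypothetical):

```php
<?php
// plugin.php — sketch of a standalone entry point. Contrib code is never
// in scope here, and the session namespace is entirely separate.
session_name('PLUGIN_MANAGER_SID'); // distinct from Drupal's session cookie
session_start();

if (empty($_SESSION['plugin_manager_user'])) {
  // A stolen Drupal uid 1 session cookie is worthless here; the user
  // must authenticate against the plugin manager separately.
  plugin_manager_show_login(); // hypothetical login form
  exit;
}
```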
Do not bootstrap Drupal fully
Similar to update.php, this system needs to be able to run even if the actual site is not running.
The moment we hit drupal_bootstrap_full, the contrib modules that may well be broken get loaded, and people can start writing exploits for this functionality.
If we choose to use the Drupal API from within this system (and there are reasons we might choose not to), we would likely need to implement another bootstrap phase for Drupal, something like DRUPAL_BOOTSTRAP_CORE, which only loads the files we know we can depend on.
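Something like the following, where both the constant and its position in the phase ordering are illustrative guesses:

```php
<?php
// Hypothetical: neither the constant nor its exact position in the phase
// ordering exists today; this only illustrates the idea.
define('DRUPAL_BOOTSTRAP_CORE', DRUPAL_BOOTSTRAP_DATABASE + 1);

// Bootstrap only this far; never DRUPAL_BOOTSTRAP_FULL, which is where
// potentially broken or hostile contrib code gets loaded.
drupal_bootstrap(DRUPAL_BOOTSTRAP_CORE);
```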
From the perspective of major Drupal version upgrades, it would be better if this system did not rely on the underlying Drupal API to function at all: a broken core upgrade could then still be fixed from within this system, as it would remain functional.
Get rid of the false security
The code already in core is incredibly complex and actually counterproductive. Even if we decide not to go the web-writable route (which is a lot simpler, requires a lot less code, and is very likely equally or more secure), we should still get rid of the SSH2 extension.
Use a web writable 'overlay' search path
One of the things I noticed about the WordPress implementation is that, unlike Drupal, it stores the modules it downloads in a single path that needs to be made writable, namely the wp-content directory.
For all cases other than a core system upgrade, I believe it is sensible to have the downloaded code stored in a new contrib/ top-level directory that mirrors the currently used directory search paths, as implemented in api.drupal.org/api/function/drupal_system_listing/7.
This would involve a small loop that just prefixes 'contrib/' to the search paths, and it avoids a lot of build management issues WHILE providing a simple way to roll back to the original code base.
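A minimal sketch of that loop, assuming $searchdir is the array of search paths drupal_system_listing() already builds:

```php
<?php
// Overlay every stock search path with a web-writable contrib/ mirror.
// Assumes $searchdir holds the paths drupal_system_listing() scans,
// e.g. 'modules', 'sites/all/modules', 'sites/example.com/modules'.
$overlay = array();
foreach ($searchdir as $dir) {
  $overlay[] = 'contrib/' . $dir;
}
// Appended last, so downloaded code shadows rather than replaces the
// code shipped in the original locations.
$searchdir = array_merge($searchdir, $overlay);
```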
Provide a simple killswitch for the entire system
A large majority of current Drupal users will not be able to use this system due to the points I mentioned above, so it should be possible to touch a file or something similar to stop the update links from being generated and so forth.
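The check itself could be as trivial as this (the marker file name is illustrative, not an existing convention):

```php
<?php
// Kill switch sketch: administrators opt out by creating a marker file,
// e.g. with: touch sites/default/plugin_manager.disabled
if (file_exists('sites/default/plugin_manager.disabled')) {
  exit('The plugin manager is disabled for this installation.');
}
```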
Allowing login from sites/default only would also partially accomplish this, because if there is no sites/default, the system is silently disabled as there is no way to log into it.
A kill switch is a very simple way to address the fact that this system breaks build management and proper development methodology and is far too drupal.org oriented, while still making the very useful functionality available to users who can make use of it.
This system is meta-drupal
We need to accept the fact this system needs to know more about the Drupal installation than Drupal itself does. It needs to be aware of which sites are on the system and which modules they are using.
The way I envision it, there will be a single instance of this system for the entire Drupal installation.
To mitigate the issues in a multi-site environment, we could enforce that only uid 1 of sites/default has access to this system, and have it take all the other sites offline while it runs the necessary updates on them.
Another possible option is to manage the permissions for this system separately, in an SQLite database. That would also be a convenient place to store the overarching list of packages installed across the entire system versus what is available, and it would remain accessible even when the site database isn't.
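A sketch of what that could look like (the schema and file location are hypothetical):

```php
<?php
// One SQLite file per installation, tracking which site runs which
// package at which version, readable even when a site's own database
// is broken.
$db = new PDO('sqlite:' . dirname(__FILE__) . '/plugin_manager.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS site_packages (
  site TEXT NOT NULL,
  package TEXT NOT NULL,
  version TEXT NOT NULL,
  PRIMARY KEY (site, package)
)');
$insert = $db->prepare('INSERT OR REPLACE INTO site_packages VALUES (?, ?, ?)');
$insert->execute(array('example.com', 'views', '6.x-2.6'));
```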
Make an exception on code freeze for this system
After playing with this system, I've come to the conclusion that it is going to be very difficult to do an iterative development process on it until we have a good set of actual data to use with it.
If we do choose to develop this as a separate system that is shipped with Drupal, I feel that it would be prudent to allow this system to be implemented primarily during code freeze.
This would allow us time to get it implemented correctly, and would allow us to build on the D7DX work going on in the community. As more packages become available for testing we can automate upgrades and migrations between various versions of these packages and fully kick the tyres.
The potential for this system is huge, but we need to make sure we implement it right.
Other things we can do
There are some issues I haven't touched on in these suggestions, but once we have the basics right we can talk about solving the 'many copies of the same module' problem and other issues.