Code review of existing contributed modules is frequently cited as a means of improving the quality of the contrib space. However, the task of performing a code review falls somewhere between documentation, writing and reviewing patches, reviewing CVS applications, and creating new modules.
This is a proposal to elevate the visibility of module review, and make it easier for newcomers to perform module reviews.
This proposal grew out of the discussion in the "IMPORTANT: Source code, projects, and commit access in a post-Git world" thread.
A. What's the Need?
The Problem: Many contributed modules have had little to no outside review. This creates widely varying quality within the contrib space.
The Solution: Create a clear quality standard that contributed projects must meet or exceed, and communicate it clearly to developers and reviewers.
The Problem: Projects take too long to get approved.
The Solution:
1. Focus on the big-impact items: security issues, licensing issues, knowledge of Drupal APIs, and blatant module duplication.
2. Get more people involved in reviewing modules.
Reviewing other modules helps everyone. The reviewer gets to hone their developer chops by seeing how other people have solved problems. New developers get to dig into Drupal's APIs, coding standards, security, and documentation methods.
People whose modules are getting reviewed receive the benefit of an external review.
The Drupal project gets the benefit of a more stable, higher quality contrib space.
B. Who Would Do This?
In an ideal world, we would see more Contributed Module Review Sprints. These could be done alongside/as part of code and documentation sprints. Module reviews provide another potential entry point for new developers looking to get familiar with Drupal.
Local DUGs could have Module Review Sprints as part of their meetups, and module reviews could be part of local Camps, and larger Cons.
Online discussion of modules and themes can happen in the Peer Review Group, which is shifting from site peer review (which has yielded no activity in over 2 years) to module peer review. (This is a new idea. We're open to ideas here.)
C. Steps to review contributed modules
These steps are largely distilled from the CVS Application Review handbook page, with some additions to make this specific to contrib reviews as opposed to CVS applications.
The review should be filed as an issue in the queue of the reviewed module.
C1. Initial Coder Review
Use the Coder module, Luke.
C2. Security
Is the code secure? Use http://drupal.org/writing-secure-code as an initial point of reference.
If your review reveals a security issue, DO NOT report it in the issue queue! Report it privately to the security team.
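The most common issue in this area is user-supplied text printed into markup without sanitization. A minimal sketch of what to look for, with a stand-in check_plain() so it runs outside a Drupal site (the real function also validates UTF-8):

```php
<?php
// Stand-in for Drupal's check_plain() so this sketch runs outside
// Drupal; the real function additionally checks UTF-8 validity.
function check_plain($text) {
  return htmlspecialchars($text, ENT_QUOTES, 'UTF-8');
}

// Insecure: printing user-supplied text directly allows XSS.
$name = '<script>alert(1)</script>';  // pretend this came from $_GET
// print $name;                       // vulnerable

// Secure: sanitize plain text before it reaches the page.
$safe = check_plain($name);
// $safe is now '&lt;script&gt;alert(1)&lt;/script&gt;'
```

When reviewing, flag any place where request data, node titles, or other user-controlled strings reach output without passing through a sanitization function like this.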
C3. Coding Standards
Does the module adhere to the coding standards? Use http://drupal.org/node/894256 as a point of reference.
C4. Effective use of APIs
Use api.drupal.org as a point of reference.
C5. Is all UI text run through t()?
In order for a module to be translatable, UI text must be run through t().
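A minimal sketch of the difference, with a stand-in t() so it runs outside Drupal (the real t() also performs the translation lookup before substituting placeholders); the strings here are invented for illustration:

```php
<?php
// Stand-in for Drupal's t(): the real function looks up a translation
// of the literal string first; this sketch only substitutes placeholders.
function t($string, array $args = array()) {
  return strtr($string, $args);
}

// Not translatable: the string never passes through t().
$bad = 'Saved 3 items.';

// Translatable: literal UI text goes through t(), with dynamic values
// supplied as placeholders rather than concatenated into the string.
$count = 3;
$good = t('Saved @count items.', array('@count' => (string) $count));
// $good is 'Saved 3 items.'
```

Reviewers should also flag strings built by concatenation (e.g. `t('Saved') . $count`), since translators need the whole sentence, placeholders included, as a single translatable string.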
C6. Documentation and Comments
Does the module implement hook_help() as needed?
Does the module follow Doxygen formatting conventions?
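A hedged sketch of both points together, using an invented "example" module name and a stand-in t() so it runs outside Drupal:

```php
<?php
// Stand-in for Drupal's t() so this sketch runs outside a Drupal site.
function t($string, array $args = array()) {
  return strtr($string, $args);
}

/**
 * Implements hook_help().
 *
 * A Doxygen-style docblock like this one is expected on every function;
 * the one-line "Implements hook_NAME()." summary is the conventional
 * form for hook implementations.
 */
function example_help($path, $arg) {
  switch ($path) {
    case 'admin/help#example':
      return '<p>' . t('The Example module does nothing, usefully.') . '</p>';
  }
}
```

A reviewer can check quickly whether each function carries a docblock of this shape and whether the module's help page text exists at all.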
C7. Identify Potential Duplication
If the module appears to overlap with any other modules, identify the overlaps here.
C8. Identify Common Functionality Suggestions
- Drush support
- themeable content
- an accessible UI and content
- Features support
- unit tests
These features are often relatively easy to implement. A little nudge in the right direction (toward an example module, relevant documentation, etc.) may help the developer add some of these "icing on the cake" features.
D. How to request a code review for your project
To request a review, create an issue and tag it with "peer-review". To specifically request a review of code, add the tag "code-review". See Issues Requesting Review.
E. Tracking reviews
Use the project issue queue to create a permanent record of who reviewed the module, the suggestions they made, and the project owner's responses. Create at least one issue for each area that you reviewed; mark it active while problems remain, or fixed if no problems were noted in that area. Issues that will definitely block approval should be marked critical, issues that could potentially block approval should be marked major, and all others should be marked normal priority. After the initial issue has been created, the workflow should alternate between 'needs review' and 'needs work' until the issue has been resolved; then it can be marked fixed.
On the project application issue queue, if you decide to take on a review, assign the issue to yourself. Assignment is a sortable field, so it helps you find your own issues and makes it possible to see who is doing what. If you cannot complete a review for any reason, mark the issue unassigned to signal that the review needs to be taken over by somebody else.
How about a 'view' that shows a report on the modules that have had the most reviews and the modules that have never been reviewed? To start, a simple format on project page descriptions, linked to the review issue:
Code reviewed by mlncn on 2011-02-08.