Code standards review is incredibly difficult - http://ventral.org/pareview should be set as the standard

bailey86's picture

Hi,

I've spent several days working on my first module and have found the code standards review to be incredibly difficult.

First of all - let me say that I completely agree with strict code standards - and it's vital that new modules stick to these standards in all ways.

My module was written for D6.

Since you guys are probably all fully signed-up module maintainers, it might be an idea for me to outline the steps I've worked through recently so you get an idea of the problems facing people publishing new modules.

Obviously, I installed coder and coder_tough_love.

OK - coder_tough_love is reporting 'Line 23: @param and @return syntax does not indicate the data type.' when in fact the data type has been defined. A bit of a nuisance if you know what it means - but a real pain when you don't. According to http://drupal.org/node/1354#functions the data types should be defined in the comment block.

But if I remove the data types (string, int, bool, etc.), coder_tough_love and coder report no errors.

Next, install pareview_sh.

This is really difficult to run - it needs to be run from the command line - and this takes some setting up. But eventually it can be run, and it reports its results in raw HTML. Again - no biggie - but not easy for new guys. It also means you need to have shell access to the server.
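For anyone following along, the command-line run looked roughly like this - a minimal sketch, and the clone URL and invocation are my assumptions from memory, so check the pareview.sh project page for the current instructions:

# Grab the review script (repository URL is an assumption - see the
# pareview.sh project page for the definitive one).
git clone http://git.drupal.org/project/pareviewsh.git
cd pareviewsh

# Run it against a local checkout of the module; the report comes back
# as raw HTML, so redirect it to a file and open that in a browser.
./pareview.sh ~/projects/mymodule > review.html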

BTW - I could not get it to work over http:// - had to get it to work on local files.

Also - it doesn't seem to want to use coder - but that's OK - we can run coder from the admin interface.

pareview_sh is complaining that the files aren't ending with a newline - when they are - but I added extra lines to all the files, which seems to make it happy.
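For anyone hitting the same complaint, here is the sort of shell one-liner I used to make sure every file ends with a newline - an illustration only (it assumes file names without spaces; adjust the extensions to suit):

# Append a newline to any module file whose last byte isn't one.
for f in $(find . -name '*.php' -o -name '*.module' -o -name '*.inc'); do
  [ -n "$(tail -c 1 "$f")" ] && echo >> "$f"
done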

And next - php code sniffer.

Follow the instructions on the page to set it up - not easy - but doable. Also, I'm not allowed to install PEAR modules on the server - so I'm having to mount the SSH share onto my local file system to enable my local phpcs to test the module.

I'm on D6 - so I can't just install the module - so I got phpcs from PEAR and installed it on the command line. But this version doesn't seem to be as strict as the PHP Code Sniffer Drupal module.
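For reference, the command-line route went roughly like this - a sketch only; the path to the Drupal standard is a placeholder and depends on where you put the drupalcs sniffs:

# Install PHP_CodeSniffer from PEAR (requires permission to install
# PEAR packages, which is exactly what I lack on the server).
pear install PHP_CodeSniffer

# Point phpcs at the Drupal sniffs - assumed here to live in a local
# drupalcs checkout - and run it over the module directory.
phpcs --standard=/path/to/drupalcs/Drupal --extensions=php,module,inc ~/projects/mymodule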

phpcs now complains about the newline endings, contradicting pareview_sh.

Problem

So I have three different tools - some of which are a real pain to get running - which conflict with each other. This is going to be a real barrier to people submitting modules.

Answer

What we need to do is: code, code, code - then run one tool to test the code against coding standards - tidy up - code, code, code, etc.

Now that http://ventral.org/pareview includes code sniffer it looks like all three tools are running from a single place. Obviously we need to upload the code to the repository before running a test - but no biggie - git is easy enough to use.
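The loop is then just commit, push, and paste the repository URL into the form - roughly like this (the branch name is an example; use whatever version branch your sandbox has):

# Commit the cleanup and push it to the drupal.org sandbox, then
# submit the repository URL at http://ventral.org/pareview.
git add -A
git commit -m "Coding standards cleanup"
git push origin 6.x-1.x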

Can it be put in the docs that http://ventral.org/pareview is now the single test which needs to be passed? For example - my project is http://drupal.org/sandbox/bailey86/1278830 - and I'd really like to know that, because it passes http://ventral.org/pareview, the code is fine as far as coding standards go.

I think I can ask my current employer to supply some server space and bandwidth if this would help.

Comments

The tools you mentioned are

jthorson's picture

The tools you mentioned are intended to help with reviewing, but I feel that publishing any message regarding "You need to pass tool 'X' to pass a review" is misleading. From a coding standards perspective, there is only one 'golden source' ... the single test which needs to be passed is http://drupal.org/coding-standards and all pages linked from there.

My motivation behind this is that I've encountered more than one person who already thought that my reviews were unfair or nit-picky, and since they were able to run Coder cleanly (on the loosest setting), they felt they were entitled to an RTBC. Coding standards are just one of many aspects of a review; and since standards checking can be automated, it can be checked and re-checked quickly ... the side effect is that these 'quick checks' can tend to dominate the application comments while the application waits for a detailed manual review of the other factors, which can cause people to assume that the quick-check issues are the entirety of the review process.

From the applicant perspective, it would be wonderful to have only one tool that we can point people towards. The longer-term goal is to have just this, with an automated tool tied directly into the application process ... so immediately upon activation, the bot would review your code and spit back the results. I initially started down this road back in May with the goal of automating Coder reviews through Drupal's automated testing infrastructure ... but it's been a long road, with a lot of hurdles along the way (primarily related to some of the inherent restrictions sandboxes have relative to full projects). In the meantime, a number of additional tools have been developed; and I anticipate that we're probably within a few months of the point where we can have the infrastructure auto-review an application, freeing up reviewer time to focus on some of the harder-to-automate checks.

Which of these tools becomes the gold standard? Well, as PAReview and Drupal Code Sniffer are both in their infancy (CS is less than a month old!), it's a bit premature to call them the new 'standard'. Coder is the incumbent, but likely to receive some revamping of its own, thanks to the inclusion of Coder results on the new 'automated testing' tab for projects, and another initiative to enhance it to enforce coding standards on Drupal Core.

Personally, I consider us to currently be in an 'evaluation' period from an automated tool perspective, and over the next few months, we'll take whatever learnings we can during these evaluations, and combine the best features of each into a single automated tool ... which will certainly address your 'which tool to use' question. However, I don't think we can 'just pick one' at this point in time ... we need to live with the status quo for a little longer, continue the development and evolution of the projects (both new and old); so that we can take the learnings from that evaluation and then move forward with decisions and actions based on results, experiences, and code which is slightly more stable/established (from a features perspective) than what we are currently using.

"Other factors"

beanluc's picture

"these 'quick checks' can tend to dominate the application comments while waiting for a detailed manual review on the other factors"

The output of the 'quick checks' is so verbose that maybe they should not be pasted into what is otherwise supposed to be a detailed manual review on other factors.

If the style tools mature to the point where they're (pretty) reliable and available in a hosted, self-service testing instance (so that absolutely anybody can run them without spending the weekend setting the tools up themselves), then applicants could be cued to "just use them" and not consider the results to be part of the code review.

In other words, code-review and coding-standards-checking would be presented as basically separate activities, both of which are necessary for successful project applications. The "Needs work" status would be for code-review hangups and a "Needs cleanup" status could be included, for when bad style is the only outstanding problem before a project passes.

Earlier reviewers could comment briefly that they observe style problems, then save their verbosity for actual manual review of other factors, and leave it to the applicant to use the style tools as well as to act on the actual code-review findings. There shouldn't be any reason, if the tools are that accessible, for reviewers to have to paste the style-tool outputs into the issue.

I agree completely. And I

doitDave's picture

I agree completely. And I want to add a point that really annoys me, before adding my proposal on topic (I already wrote this elsewhere, but hey).

While I waited for my "approval", I tried to pass the time by helping with code-checking and, as far as possible, deeper reviews of other applicants. I also contacted some reviewers, raised issues here and in the drupalcs/pareview issue queues, was "frustrated" at some level, and initially mistook some well-meant hints from more experienced users. Blame it on the excitement and, of course, don't mind if others do as well. But what really frustrates me as a REVIEWER is when someone steps into the queue, drops something "quick & dirty" with (confessed: not too often, luckily) obviously not the least commitment to what is understood as best practice and minimal requirements in the community he wants to join and, worse, complains that "no one looks at my stuff after now already 4 days!!1!eleven". Or, even worse, receives 600+ trivial errors that could all have been avoided by just reading the very first three pages of the coding standards section, and then demands that "someone could do a real review now??!?". Or just calls for the next review without responding to former reviewers' comments at all. And, worst of all, I found those few companions to be the ones who never ever helped with a single review themselves.

Don't get me wrong, guys - those are harsh words, of course - but if you think back for a minute, you will find some examples and probably agree that there is more than just the bare review process that needs justification. IMO we also fail to clarify that some of our expectations are not just a polite wishlist but, well, real requirements. IMHO there is absolutely nothing we need to apologize for if we demand some adherence to what we consider important. And, besides, I really don't see why strict standards should keep contributors from contributing. Standards are the blood and soul of any IT development, and what comes of ignoring them was demonstrated during the so-called "browser wars" of the late 1990s. No one wants that back.

So, to get back on topic: Yes. Coding standards should not be part of the review process. They should - and this, like many other things, is not new and not undocumented at all! - be dealt with BEFORE the review process even starts. So we should both ease self-checking a lot (Patrick's idea is a great step in this direction) and reject new applications which show evidence that there is no care for these requirements (and let them re-queue with a new issue once they feel finally ready). Or, in positive speak, let's reward those who initially prove they care for the standards with a fast review and approval.

By the way and related to this: May I ask for your opinions on this: http://groups.drupal.org/node/195768 as I have just moved it from a comment to a separate thread. :)

Thanks for reading!

PS, personal note: Adding to the "standards are blood and soul" point, I don't really share the popular complaint that separates "coding" from "formatting". Code is both syntactical AND visual standardization - especially when it is not just designed once to stay as it is, but is instead meant to be extended and thus quickly understood by everyone else in the future. We should really, really try to make applicants understand this goal, as it is crucial for Drupal.

Agreed on most points

bailey86's picture

Hi,

Thanks for the reply.

I agree completely that passing code standards alone does not warrant RTBC.

I agree that http://drupal.org/coding-standards is the gold standard - however, we can't check modules manually - it would take far too long (and Joomla would laugh at us) - we're coders, so we build tools for that boring stuff.

I think the problem is that some users can't set up pareview_sh or code sniffer (which are difficult to set up) - so they fix one format error - do some code - introduce another error - submit for review - then get the new error pointed out to them - and so on and so on.

Even if - like me - you can set up the tools - they are buggy and can conflict with each other.

What users need to be able to do is to run the same code testing tools that the reviewers are using - so that we can fix errors before submitting for review.

(I've been coding solidly for sixteen years - and tend to get it right first time - but even using emacs add-ons I'm still likely to leave in trailing whitespace or leave out an ending newline if I'm working hard.)

So, if all users test their code with http://ventral.org/pareview after pushing up to the repository, then the code should be good before the reviewers look at the module. This would save masses of time currently lost to looping around and around on coding standards errors in the review tickets.

The long term automated process of testing on submission sounds good - but will not fix this current blockage. As a Drupal developer you can probably submit any modules you want - but for us new guys (even experienced programmers working on a contract) this is incredibly frustrating.

I agree with your description

doitDave's picture

I agree with your description of "how it is right now". However, you point to a common misunderstanding, IMO:

they fix one format error - do some code - introduce another error

Not that I wouldn't have tended to this as well. It is probably human, at least for ambitious developers ;), but basically a fault if you keep "Get your project into a state you feel is release-ready" in mind.

I know we shouldn't blame each other for this, but we should eventually advise applicants NOT to "do some code" while in the process, but either to withdraw for another week and "do some code", or to wait until approved.

As I said, you are right. This IS a source of much overhead in the process. I am also not sure how best to deal with it; after all, "doing some code" is inevitable if the reviewer finds something to be fixed.

So the key will most likely be to ease self-checking. Yes.

I agree with jthorson in all

patrickd's picture

I agree with jthorson in all points.

It has to be clear to all applicants that merely passing automated tests does not mean they get RTBC.
Also, it's really annoying having these automated test results posted into applications.
-> I hope all these tools get stable soon, so they can be integrated into drupal.org's infrastructure and most coding standard issues can be resolved -before- the applicant creates an application issue, so reviewers can concentrate on in-depth reviews.

As jthorson said, first we have to wait until these tools are good enough to be integrated officially.
And this will take some time..

But it would already be helpful if more applicants would use these automated tools - starting now.
Documentation is not the only problem; how do we get them to read it? Not all, but many applicants give me the impression that they've never had a close look at the documentation. Often they have to be pointed to it - during the application process.
I don't think this is the way it should go.

Thanks millions!

bailey86's picture

Thanks millions for http://ventral.org/pareview - it will save Drupal!

I don't think waiting for full integration is an answer to the current problems - getting integration sorted could take years. In the meantime new module developments are getting seriously held up.

Pointing at the automated tools would be an idea - it would have saved me some time - but the problem is that these tools can be buggy and conflict with each other.

For example, I'm writing a D6 module - so I can't use the phpcs module (D7 only), and have used the PEAR version instead - but this would be very difficult to set up for most users. And then, the PEAR version doesn't pick up anything like the number of errors that the phpcs module does.

The key part about your site at http://ventral.org/pareview is that it runs phpcs and pareview_sh (which I think runs coder) and they don't conflict with each other!

If you want help with hosting, I'm sure my current employer - a large worldwide media organisation and big user of Drupal - would be happy to supply web space and bandwidth.

Also, if you want help in maintaining your tool then please get in touch - I think it will be the ideal tool to use for checking against coding standards. For one thing, it combines several tools; for another, it pulls from the git repository, so it actually checks the code which has been submitted.

Thanks for your offer, but

patrickd's picture

Thanks for your offer, but currently the server is powerful enough to handle it.
Before better servers are needed, the script must first be able to process multiple submissions at once.

But if necessary later, I'll definitely get in touch with you.

Thanks

bailey86's picture

Thanks for your reply.

Hopefully the load won't be too heavy - it's a very specific tool so won't be needed by thousands of people. But the offer still stands if needed.

And thanks again for putting it together. Well done for making the tools not conflict with each other!

Does it run coder as well as pareview and code sniffer?

That's cool, but I don't

patrickd's picture

That's cool, but I don't expect thousands of applicants using it at the same moment ^^

It's pareview.sh that puts them all together. It handles:

  • git branch checking, and some other stuff (most pareview functionality is moving to drupalcs at the moment)
  • Drupal Code Sniffer output
  • Coder (drupalcs reduced version) output

Most of this was developed by klausi - I just wrapped it in an online service.
So the biggest thanks should go to him ;-)

Agreed again. And we already

doitDave's picture

Agreed again. And we already agreed on this in our private mails; the "quiz" would be a perfect "have they read it?" check.

Process scratch:

  • Solve a quiz about coding standards.
  • Become able to run the online checks.
  • Solve another short quiz on how the process works.
  • Become able to apply.
    (... show evidence of having cheated and/or understood nothing -> go back to zero ;))

Problem explained - and solution proposed

bailey86's picture

OK.

There's a problem here for module developers - and it's causing a blockage. Basically, the code needs to match the standards - but the current tools used by reviewers have bugs and are failing correct code for the wrong reasons, causing us new module developers to go around and around in circles. Now we have one tool at http://ventral.org/pareview which can solve our issues - but we need to check it doesn't get overloaded.

First off - I think we're agreed on principles.

  1. Passing a coding standard review alone does not mean the module is ready for RTBC.

  2. The coding standard testing should be simply a first/minimum step which module developers need to pass. It can be used as a first test to weed out the really poor submissions. And I agree with beanluc - if the code format is the only reason the module is not ready then the module can be set to a status of 'Needs cleanup' or similar.

  3. Also, I don't think it is feasible to check code manually - reviewers need to use a tool - and coders need to use a tool to quickly identify errors.

So, the problem is this.

The current testing tools are still in development and buggy - but they are being used by reviewers to block module submissions.

Example 1

pareview_sh had a serious bug - http://drupal.org/node/1340276 - giving a completely wrong response, which meant people were changing their code incorrectly. This code was then failing code sniffer. The bug was only fixed a week ago - but not before I had changed every single file in my module, all of which then had to be changed back again.

Example 2

The documentation - http://drupal.org/node/1354#functions - recommends putting variable types in the function comment block, i.e.:

/**
 * @param int $number
 *   Description of this parameter, which takes an integer.
 * @param float $ratio
 *   Description of this parameter, which takes a floating-point number.
 */

but coder_tough_love has a bug and fails it - which conflicts with code sniffer. (I've filed a bug at http://drupal.org/node/1364816.)

As mentioned - these tools are being used by reviewers to fail a module submission and new module submitters are getting really frustrated with the process.

Moving forward

I know we want an automated submission tool etc., but if that is months away then it's not an answer for the problem we have today.

There is an answer: if submitters and reviewers were to standardise on using http://ventral.org/pareview, then submitters would have one place to test their code. It would save huge amounts of time.

The issue is that the site seems to be getting very busy. As mentioned, if patrickd wants to get in touch I'm sure my current employer (a large worldwide media organisation which strongly supports Drupal) will supply server space and bandwidth.

But at least this site seems to run all three tools (coder, pareview_sh, phpcs) and crucially - they don't conflict with each other.

You can use my project for an example/testing etc - it's at: http://drupal.org/sandbox/bailey86/1278830

Currently, it passes all the tests at http://ventral.org/pareview - and I'd like to get on with further development.

But if a reviewer now pastes in error messages from a buggy old version of pareview_sh or coder_tough_love it will block my module for the wrong reasons.

This is why you're getting messages about the code submitters getting annoyed.

As mentioned - code standards are great - some of us have been coding for donkey's years using them - but the current tools can be buggy - and reviewers are using them to block module applications. http://ventral.org/pareview at least gives us one target.

I'd be in full agreement with

jthorson's picture

I'd be in full agreement with updating the application documentation, suggesting that applicants use http://ventral.org/pareview before submitting their application. The 'Tips for ensuring a successful application' page would be a good candidate. But as explained earlier, I would not use the term 'standardized' for these tools right now.

My biggest concern is that lazy applicants will use the automated code review results as a substitute for actually reading the coding standards themselves ... this is already happening to an extent, and I would like to avoid making the problem worse. Recently, even reviewers are locking in to the tool results instead of actually referring to the standard. Blocking an application because it doesn't pass a script is not the recommended practice ... using a script to identify issues, and then blocking the application because those identified issues do not meet the standard, is fine - so long as the reviewer is referencing the standard and not just the script.

Incidentally, I also disagree with the common practice of pasting a 'wall-o-text' direct from a review script into a comment, which is extremely off-putting to new applicants. Look at the results, identify the common issues, and then explain the required changes in three or four lines ... attaching the script results as an attachment to provide a detailed line-number assessment for the applicant to review if desired.

And as a final side note, you have brought up Coder_Tough_Love on more than one occasion ... per the project page itself, Coder Tough Love goes far beyond 'coding standards' and will trip false positives. If it is actually being used as a basis for blocking applications, then this is a mistake - if you do encounter specific instances of this, please flag them so that we can ensure the practice is not continued.

IMHO I'm afraid your concern

patrickd's picture

IMHO, I'm afraid your concern is already reality.
At least the not-always-clearly-described errors from these automated reviews force applicants to take a look at the documentation.

I have now begun to point new applicants to ventral.org/pareview without changing the status to 'needs work'.
Example:

You have some coding standard issues (http://ventral.org/pareview/httpgitdrupalorgsandboxhenry01363692git); you can use http://ventral.org/pareview and try to fix them.
If you have any questions about that, please ask!

At least this should be done as soon as possible:
Moving from a master branch to a version branch. For additional resources please see the documentation about release naming conventions and creating a branch in git.
Remove "version" from the info file, it will be added by drupal.org packaging automatically.
Remove "project" from the info file, it will be added by drupal.org packaging automatically.
Remove "datestamp" from the info file, it will be added by drupal.org packaging automatically.

Now we could also think about putting it into the application documentation, but as jthorson says, this could lead them to get lazy before they ever read the coding standards.

So I think pointing to http://ventral.org/pareview at the beginning of their application is the best way.
We give them the tool so they can fix coding standard issues themselves, and there is no need to paste automated review results into the issue.
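(For the common fixes above, the mechanics are something like this - a rough sketch, assuming a 6.x module, GNU sed, and an info file named mymodule.info:)

# Work on a proper version branch instead of master.
git checkout -b 6.x-1.x

# Strip the lines that drupal.org packaging adds automatically.
sed -i '/^version/d; /^project/d; /^datestamp/d' mymodule.info

git commit -am "Move to a version branch, clean up the .info file"
git push origin 6.x-1.x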

I agree

bailey86's picture

I agree with your points.

But I don't think people will get lazy.

I think the review forces them to look into why their code isn't correct and they then have to correct it themselves. And if they have to correct it then they have to understand what needs to be done.

If the tool automatically corrected the errors, then I would say that would make them lazy and be a "Bad Thing"™.

First: big thanks for the

klausi's picture

First: big thanks for the pareview.sh online service - good job!

I disagree on this one detail: if there is a large number of coding standards errors, we should set the issue to "needs work". Before I do an in-depth review, I want the applicants to fix most of the code style issues, so that I can better read and understand their code. And the status changes are our workflow system: other reviewers should not waste time looking into applications that require numerous fixes anyway.

Most of the time I do a manual review as well, once the code style issues have come down to a smaller volume. I think it is a good idea to attach the drupalcs output instead of pasting it directly; I'm not sure how pareview.sh should handle that (adding the drupalcs result at the end might break the nice layout on ventral.org/pareview).

Feedback from a current applicant

barnettech's picture

Hello,

My Droogle project is currently being reviewed. Some feedback: it would help if it were very clear to applicants to run the coder module with the setting not set to normal, but set to find minor errors as well, since the active reviewers are looking for this level of compliance. Frustration will be minimized if it is very clear to run PAReview, and coder in minor mode.

The fact is, folks like myself don't necessarily derive much benefit from donating their code. We choose to contribute because it's the right thing to do: we want to give back to the community, code sharing is a great thing for society, and we all love the benefits of open source. That being said, I have learned quite a bit during this process (my module is still being reviewed), but I think the frustration is due to the fact that folks just want to quickly share their code - they don't really know if they will benefit from the exposure of getting something published on drupal.org; perhaps there is a benefit, perhaps not. So really, folks are just looking to quickly make a contribution, much like you want to quickly throw a quarter or a few dollars into the Salvation Army pot.

I don't mean to minimize my desire to contribute - I just want to help in this conversation. If it's not easy for folks to make a donation, folks (like myself) get frustrated; no matter how worthwhile keeping README.txt files to 80 characters wide or less may be, it is frustrating to a code donor who doesn't understand the requirement. There are a lot of coding standards, and as I go through the process I'm learning their true value. Some I still don't fully understand - like why, to have more readable code, I can't just right-click in Eclipse and choose "Source" and then "Format". I tried to be courteous, but I really prefer the look of how Eclipse formats my code. I wanted to just make a quick code contribution to drupal.org and be on my way toward contributing to the world of Drupal / Education (I work for a college).

Anyhow, I almost abandoned the application process - it often takes months, I see. I have in fact learned a lot, and opened myself up to learning why lots of these standards exist. But I wanted to add to this conversation thread, which I have really enjoyed, what I see as the biggest problem: people really do want to contribute, we all love Drupal, but folks just want to throw a few dollars in the Drupal Salvation Army bucket and be appreciated - not scorned for not having 2 spaces instead of tabs in their code. So far I've actually been really impressed with the reasoning for many of the coding standards, and I'm sad that it's been a frustrating process for my generous reviewers as well. We're all just looking to contribute.

I have enjoyed this thread, and I too am looking forward to there being a standard automated tool to ease the submission process. I hope to have my module accepted, and I hope to minimize the frustration in the process. Maybe I'll feel comfortable helping to review projects in the near future. Anyhow, I thought it might help to give feedback from a current applicant - since I know where we new applicants are coming from. We think our modules are great; my employer thinks my module is great, they already pay me handsomely for my module, so who are you to question how good it is? That is the mindset, and that is why new applicants don't read the code standards docs so quickly - they think their module is great already. As I said, I've come to respect lots of these code standards. Just remember that folks want to make a quick contribution.

To be clear, I'm not against these code standards. I am for helping folks want to contribute more, and that means ideally making it so they can make a quick contribution while still maintaining high quality. I applaud your efforts here, and will try not to be too annoying any longer as a newbie. My time is still very valuable (to me, anyhow) and I still regret that I may have missed a few items in trying to get my module (Droogle) approved, in my haste to do the 400 other things on my plate. Anyhow, I almost abandoned the process, but will stick with it, and hope to contribute other modules as well. It may be unavoidable that the process takes a long time currently? Perhaps, but it sounds like you're (we're) a lot closer to having a good automated tool pave the way toward easy donations.

Cheers,

Barnettech (feel free to review my Droogle module, which is in the queue ;)

For Eclipse ...

jthorson's picture

For Eclipse ... http://drupal.org/node/75242 should help you with the 'just click format' desire.

To be honest, I consider it a good thing that our review process isn't trivial for folks who just want to make a 'quick donation' ... part of this process is to make sure someone is willing to put in a bit of time and effort on their project. This helps to ensure that they are dedicated enough to stick around and actually maintain their project after listing it on Drupal.org; instead of simply uploading their code and considering the project 'finished' ... The current list of 'abandoned' modules is large enough as it is. :)

Just sharing is fine, but I

klausi's picture

Just sharing is fine, but I think sandboxes are the right place to dump your code. Full projects require more responsibility, as they take up space in the Drupal name space and should be maintained on a regular basis.

barnettech's picture

I did not mean to imply my project would not be maintained. I hoped my comment would help you understand that applicants want to contribute and to keep a project maintained, fed, and watered - but during the application process they still need convincing that these coding standards should matter for harried developers who are short on time. The value of many of these standards is not immediately apparent.

I think what you point out is

doitDave's picture

I think what you point out is helpful in that it shows some motivations. (Motivation is an important factor to consider in this process; you and the rest would probably want to add to http://groups.drupal.org/node/195768 as well.)

I personally think "just giving back some 'bucks'" is not really the most popular motivation. You mentioned "benefit", and here we get closer to the point. What benefit does one expect if he is "nothing but willing to give back some bucks"? There are, for sure, many motivations. Some apparently have a somewhat commercial background, others a rather indirect one, and still others are probably just about "coolness" and/or "being part of it". This is all fine, or at least nothing I would want to judge. But the REAL benefit for someone who really "just gives back" is the one he has already received in advance - and that is, in the case of Drupal, a rather fantastic free-of-charge CMS. ;)

However, what you point out and what I subscribe to is: a lack of understanding of WHY t.f. there are all those "annoying", "nit-picking" standards. And I agree that we all fail to clarify WHY we demand them, as there actually are really good reasons for almost every one of them. Instead, some of us tend to apologize for these demands - which does not really improve applicants' understanding. ;)

Finally, regarding the "maintaining" statement: just browse those thousands of modules and see how many of them have barely 5 commits (all within the first week), ten times as many open issues (hardly one ever solved), and terrible, sometimes barely understandable code (reviews as tough as we do now haven't been in place that long). Funny as it is, quite a few of them are literally claimed to be that "giving back buck" you mentioned, but incidentally "sponsored by [link_to_company]". Sure, you may say "at least it exists; better than if there were no such module at all". Partially agreed, but in many cases it is really annoying when they come out buggy and need more patching work than a complete rewrite would have taken.

TL;DR: More than additional rules, we probably really need clearer policies and statements. And we need everyone to understand WHY we have them, even if they don't all agree.

I agree with the klausi

bailey86's picture

I agree with klausi.

If you want to dump some quick and dirty code on the repository then it can go into the sandbox.

But, for code to become a full Drupal module it should be:

  • Professional.
  • Complete.
  • Fully working.
  • Fully commented/documented.
  • Most errors should be caught and handled.
  • A Drupal help page should be created.

Thank you for great job with

Vikom's picture

Thank you for the great job with the online tool!

A suggestion is to remove the reCAPTCHA and use Honeypot instead.

Thanks for the reply

bailey86's picture

I've added a comment to the Tips page. And I agree about not using the term 'standardized' for now - things are still in development and may change, and it sounds like there is a longer-term plan for a more integrated solution as well.

I agree - it is the standard which has to be met, not a tool as such. I think if the script automatically fixed the errors then the coders would get lazy - but making them fix the errors means they have to understand the standards - or at least read them!

If they're using hacky bodges to scrape past the tool, it should be pointed out to them.

I also agree about avoiding the wall-o-text - pointing out what has been done wrong would be better.

Being of the usual anal type, I was using coder_tough_love to block myself! There is a slight issue with data type declaration checking in coder_tough_love - http://drupal.org/node/1364816 - but I'm working through it with the developer currently.

I've added an item to the tips page

bailey86's picture

As mentioned, I've added the pointer to the online tool to the tips page at http://drupal.org/node/1187664

Being cheeky - if all this work I've done counts as helping with the review process then my current project for review is at:

http://drupal.org/node/1330454

:0)

I will carry on looking through the current requests for review and see what modules I can help with by reviewing.

Thanks.

I've added the

jthorson's picture

I've added the ventral.org/PAReview link to the page content itself, under the 'Coder' section.

Please add to the documentation ..

barnettech's picture

Should we also clarify to applicants that they should run coder in minor mode? This would have saved me lots of back and forth. Adding the link to the PAReview tool should help others quite a bit with the current review process - that is great.

Coder should run in minor mode by default

bailey86's picture

Hi,

It might be an idea to open an issue in the coder issue queue asking the maintainer to set coder to run in minor mode by default.
