Secure Drupal and Drupal 7

LittleLion's picture

I am implementing a site that will contain some information that should be secure. I looked up various posts and modules about security and it seems that by default all passwords are sent in clear text across the Internet in D6.

I found little to no information on securing D7 logins or pages.

Is there a simple solution to force the site into https?

Thanks for any advice,

Matt

Comments

If you want everything sent

christefano's picture

If you want everything sent over an SSL-encrypted connection, put this at the bottom of your .htaccess file:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Be sure you have SSL working before doing this or your site will immediately become unavailable.

If you want mixed mode, where some traffic is encrypted and some isn't based on the path, user roles, etc., I suggest reconsidering and using SSL for everything. There is almost no drawback to encrypting everything (the one downside being that the server CPU will have some extra work to do, depending on the cipher and key size of the SSL certificate).

I have been using secure

ahimsauzi's picture

I have been using Secure Pages for mixed-mode sites. The module has its issues, but for the most part it has worked for me.

Christefano, would having everything encrypted put more of a toll on the CPU than adding an additional module?

In the case of Secure Pages, each request has to go through the module to determine which mode it will be served in. That sounds like more work, but I would love to know for sure.

Uzi

Do some benchmarking

BTMash's picture

It's not always the case that adding an SSL certificate will result in only a little more work for the server; benchmarking what is going on is critical. Even though it's not necessarily very accurate (since lots of factors come into play depending on the type of audience coming to your site, etc.), using something like ab (short for ApacheBench) or JMeter will help you see what kind of traffic your server can handle. Your command would be something like ab -n 1000 -c 50 <site>, where -n is the total number of requests to make (so the command above would run until 1000 requests were fulfilled) and -c 50 is the number of concurrent connections to handle at a time (so it would simulate 50 users connecting to the site at the same time). Be very careful with the concurrent connections number, as it can bring sites down (and yes, I have done so on sites/servers I've managed). So start with small numbers.
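A minimal sketch of that workflow, assuming a staging copy of your own site (example.com is a placeholder) and ramping the concurrency up gently:

```shell
# ApacheBench comparison of plain HTTP vs HTTPS throughput.
# Run this only against a server you own; the hostname below is a
# placeholder. Start with small -n/-c values and increase gradually.

# 100 requests, 5 at a time, over plain HTTP:
ab -n 100 -c 5 http://example.com/

# The same page over SSL, for an apples-to-apples requests/second
# comparison (note ab requires a path, hence the trailing slash):
ab -n 100 -c 5 https://example.com/
```

Comparing the two "Requests per second" lines gives a rough sense of the SSL overhead on that particular box.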

Anyways, to show my point, I ran the command on one of my sites (this is a page that has already been cached). It was able to handle 150 requests per second. I ran the same command on another one of my sites that always runs in SSL on the same server (also a cached page; however, the second site uses fewer modules just to be clear), and it was able to handle 30 requests per second. So the drawback was the server could only handle 20% of the intended audience over SSL.

Something like Secure Pages will put some strain on your server as well, but you should check whether mixed-mode SSL gives you better load handling (and a better user experience) in your scenario while still keeping the server relatively secure.

Recommended reading: http://crackingdrupal.com/blog/greggles/drupal-and-ssl-multiple-recipes-...

That is a great article,

LittleLion's picture

That is a great article, thanks for passing it on. It confirms what I read elsewhere, but was not specific to Drupal 7. I have forced the site https for now and will profile the heck out of it before it goes live.

Unfortunately, none of the modules reviewed has been production-released for D7, a problem I have run into with most of the other documentation I have gone through.

Thanks for all the help!

Matt

Those numbers are pretty disillusioning

christefano's picture

Those numbers are pretty disillusioning. I wonder why your results are so bad. Whatever the answer is, it likely involves which version of OpenSSL is being used (newer versions use 10x less memory per connection), which cipher the certificate uses, whether it's a bundled certificate, what the key size of the certificate is, etc.

Did you run ab on localhost? Can you re-run the test on a static file so that MySQL and PHP are eliminated as bottlenecks?

The word on the street is that SSL is no longer computationally expensive. I think Overclocking SSL is the authoritative article on this (note that it was written before all the major certificate authorities started phasing out 1024-bit keys in favor of 2048-bit keys) and that Overclocking mod_ssl has a good interpretation of it.
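Those factors are easy to check from the command line with the openssl tool; a sketch, with example.com standing in for the real host:

```shell
# Report the locally installed OpenSSL build. (The server's build is
# what matters for its per-connection memory use, so run this on the
# server itself if you can.)
openssl version

# Negotiated protocol and cipher, plus the certificate chain sent by
# the server (shows whether an intermediate/bundled cert is in play):
openssl s_client -connect example.com:443 < /dev/null

# Key size and signature algorithm of the served certificate:
openssl s_client -connect example.com:443 < /dev/null 2>/dev/null \
  | openssl x509 -noout -text \
  | grep -E 'Public.Key|Signature Algorithm'
```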

Tried it on CHANGELOG.txt

BTMash's picture

Just gave it a try on CHANGELOG.txt, with similar results: 150 requests/sec without SSL and 30 requests/sec with SSL. I ran ab from an external source. We use a certificate from VeriSign, so certificate checking (dealing with an intermediate cert) more likely than not also comes into play.

With that said, I'm not saying "don't use SSL". Heck, the site that is on pure SSL right now is that way for a reason (well... I could probably tweak the settings so that only logged-in users get SSL, but then I'd be rambling on about the issue - I need SSL enabled in some capacity for that site regardless of the performance impact). I'm simply saying that there may be some level of performance impact on your hardware, large or small. It is important to always benchmark and see what kind of results you're getting so you're not in for a shock down the road (the 20%-of-performance situation I face shows it is real and something people might run into; people could also have a limitation from their hosting provider that they don't know about or can't do anything about).

I just ran that command on my

LittleLion's picture

I just ran that command on my local site, a quad-core Xeon stand-alone server with 6 GB of memory running Fedora 12, and the results were interesting.

SSL:
Concurrency Level: 50
Time taken for tests: 11.131 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Total transferred: 3211332 bytes
HTML transferred: 2939712 bytes
Requests per second: 89.84 [#/sec] (mean)
Time per request: 556.571 [ms] (mean)
Time per request: 11.131 [ms] (mean, across all concurrent requests)
Transfer rate: 281.73 [Kbytes/sec] received

Non-SSL:
Concurrency Level: 50
Time taken for tests: 0.305 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Total transferred: 3201198 bytes
HTML transferred: 2930928 bytes
Requests per second: 3281.97 [#/sec] (mean)
Time per request: 15.235 [ms] (mean)
Time per request: 0.305 [ms] (mean, across all concurrent requests)
Transfer rate: 10260.00 [Kbytes/sec] received

Running the same test to the same site on a Grid server:
SSL:
Blocked

Non-SSL:
Blocked

Oops! Looks like the ISP has marked me as a DoS site, all my computers are offline now...

Don't try this at home ;)

Matt

Ok, got timed back in. The

LittleLion's picture

OK, got timed back in. The difference on the cloud between SSL and clear HTTP is much less pronounced. The data is fairly consistent, albeit for a much smaller sample size. A 50% performance difference is much more acceptable.

How much of this test relies on the performance of the local system or the intervening network? What could explain the huge difference in performance in this test with my local server?

Here are the data from the Grid server:
SSL:
Concurrency Level: 1
Time taken for tests: 9.772 seconds
Complete requests: 10
Failed requests: 0
Write errors: 0
Total transferred: 44630 bytes
HTML transferred: 43470 bytes
Requests per second: 1.02 [#/sec] (mean)
Time per request: 977.156 [ms] (mean)
Time per request: 977.156 [ms] (mean, across all concurrent requests)
Transfer rate: 4.46 [Kbytes/sec] received

Non-SSL:
Concurrency Level: 1
Time taken for tests: 5.082 seconds
Complete requests: 10
Failed requests: 0
Write errors: 0
Total transferred: 44630 bytes
HTML transferred: 43470 bytes
Requests per second: 1.97 [#/sec] (mean)
Time per request: 508.151 [ms] (mean)
Time per request: 508.151 [ms] (mean, across all concurrent requests)
Transfer rate: 8.58 [Kbytes/sec] received

Matt

Great advice to always

greggles's picture

Great advice to always benchmark and thanks for suggesting the crackingdrupal article ;)

One other point is to make sure the session cookie is being set as secure so that a user who flips back to the http:// version (because of an external link, a bookmark, or an image that is referenced with the full URL) will not accidentally send their session cookie.

I believe that was fixed in Drupal 7, so that if the login happens on an SSL page the cookie will be set as secure-only, but it may not be fixed in Drupal 6 yet. It's worth confirming.
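One way to spot-check this from outside, assuming the login form lives at the usual Drupal path (example.com is a placeholder):

```shell
# Look at the cookies the SSL side of the site hands out. -I asks for
# headers only; -k skips certificate verification for self-signed
# test certs.
curl -skI https://example.com/user/login | grep -i 'set-cookie'
# If a session cookie appears, it should carry the "secure" attribute
# so browsers never send it back over plain http://. (Drupal may not
# set a session cookie for anonymous requests at all; in that case,
# log in with a browser and inspect the cookie there.)
```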

Can't find the right pages in Drupal.

paul_constantine's picture

Hi there,

I like the solution of running everything over secure pages, but I can't get it to work. For some reason, none of my links can be found by Drupal anymore.

The SSL works fine, the Apache web server restarts without any errors, and I have a little green padlock. But when I want to access the site, I just get the standard index.html page. So I pointed to the file index.php in my vhost file, and suddenly I get the right page. But all subsequent pages cannot be found either.

I believe there must be something wrong in my directives in my vhost file. My Drupal installation is located in:

/var/www/www.mydomain.com/htdocs/

This is what my vhost file - www.mydomain.com-ssl - looks like (the top section anyway):

<IfModule mod_ssl.c>
<VirtualHost mydomain.com:443>
        ServerAdmin me@mydomain.com
        ServerName www.mydomain.com:443
        DirectoryIndex index.php
        DocumentRoot /var/www/www.mydomain.com/htdocs/
        <Directory />
                Options FollowSymLinks
                AllowOverride None
        </Directory>
        <Directory /var/www/www.mydomain.com/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride None
                Order allow,deny
                allow from all
        </Directory>
        ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
        <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
        </Directory>
        ErrorLog /var/log/apache2/error.log
        # Possible values include: debug, info, notice, warn, error, crit,
        # alert, emerg.
        LogLevel warn
        CustomLog /var/log/apache2/ssl_access.log combined
        Alias /doc/ "/usr/share/doc/"
        <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
        </Directory>

Is the error in this file? Or do I need to change something else beside adding your entries to the bottom of the .htaccess file?

Best wishes,
Paul

Kind regards,
Paul

Zooot Net
Your Own Social Network.


Use secure communication with OpenPGP.

Encrypt with public key: 34A447B1675A9463
Verify key with hash: 570C 0051 3B68 CAFB 98BA 1EEF 34A4 47B1 675A 9463

Nothing is jumping out at me.

christefano's picture

Nothing is jumping out at me. What do your error and access logs say for the SSL version of your site when you get 404s?

Found the error.

paul_constantine's picture

Changing "AllowOverride None" to "AllowOverride All" fixed the problem.
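For anyone who lands here with the same symptom: AllowOverride None tells Apache to ignore .htaccess entirely, so Drupal's rewrite rules never run and every clean URL 404s. The relevant block from the vhost quoted above becomes (paths are Paul's placeholders):

```apache
<Directory /var/www/www.mydomain.com/>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride All
        Order allow,deny
        allow from all
</Directory>
```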

Thanks for your help,
Paul


Update.

paul_constantine's picture

When using https, the server shows the index.php page as specified in the vhost file.

But any page beyond that fails. When I try to log in, the server redirects to this address:

"https://www.mydomain.com/user/login"

and I get a 404 error. If I load

"http://www.mydomain.com/user/login"

everything works. So it appears that the address with the "s" after "http" does not exist.

Kind regards,
Paul


Secure Drupal and Drupal 7

LittleLion's picture

Thanks Christefano,

Is the spelling correct on those lines? ;)

I added the corrected code, but I still get the 404 error.

The sandbox site is at:
https://www.littlelionstudios.com/www/SGVLUG/
which translates to
https://www.littlelionstudios.com/www/SGVLUG/node?destination=node
at login.

I'll pop on an account for you to test if you like.

Thanks much for the help,

Matt

Whoops, yeah the spelling was

christefano's picture

Whoops, yeah the spelling was wrong. I was typing that on my phone while watching Battle: LA in the movie theater. Your support question was waaaaay more interesting.

I'll edit the code snippet so that others will find accurate rewrite rules in the thread.

Battle:L.A.

mcfilms's picture

LOL -- that is a pretty scathing review of Battle: L.A. You'd rather rewrite .htaccess files than watch the movie. I'll wait for it to come out on video, and keep a good book on hand, for sure.

Doh!

LittleLion's picture

Ok, the problem was that my SSL was a single-site SSL for the primary domain.

It was not working for the subdomain so I went back to accessing the site from the root.

My .htaccess file was still pointing at the subfolder that the subdomain was targeting.

Problem fixed, anyone else running across this feel free to write me for details.

Matt

Type of data

rjbrown99's picture

The type of information you need to secure is also something to think about. If you are dealing with nonpublic personal information, such as social security numbers, bank account/credit card numbers, health information, or similar data, there are a number of regulations that might come into play that you should think about. I can't be more specific without details.

Good evening RJ, Are you

LittleLion's picture

Good evening RJ,

Are you speaking to PCI SSC requirements?

https://www.pcisecuritystandards.org/

If so that is an area of interest, but I intend to deal with it by using outside vendors and not accepting, storing or transmitting personal financial information on sites.

On the other hand some of the people I'm working with are collaborating on business contracts, etc. and would like the sites to be secure from others. These so far are low volume sites which can survive full-time SSL operation so Christefano's suggestions were perfect.

Are there other considerations you have in mind?

Matt

Depends

rjbrown99's picture

PCI certainly would be in play if you are dealing with credit card data. I'm not referring to any one regulation or requirement as they all depend on what type of data you are dealing with. If it's health data, there may be a set of requirements like HIPAA. If it has privacy implications, there may be another set of laws or requirements. If it's nonpublic personal information like bank account numbers/SSN/drivers license numbers, there's another set of requirements. It also depends on what states are in play as they all have their own set of laws, especially if you are dealing with private information about consumers. I can't really give you any more details without specifics.

Now, based on your reply, if it's just business contracts - they may be sensitive to you or your partner/client companies but generally they are not going to fall under a regulation beyond whatever confidentiality you promise in the contract. In that case some of the basics like SSL, enforcement of a strong password, password expiration, account lockout on guessing attempts, logging of IPs and reviewing those logs, and similar basic practices are going to be a good approach. If you publish a privacy policy you are going to be held to it, so make sure it accurately represents whatever you promise to do and isn't just a cut-paste job from another website.

Hope that helps, but might just confuse things :)

This article just showed up in my RSS reader

christefano's picture

This article just showed up in my RSS reader this evening:

  HTTPS is more secure, so why isn't the Web using it?
  http://arstechnica.com/web/news/2011/03/https-is-more-secure-so-why-isnt...

So the Web is clearly moving toward more HTTPS connections; why not just make everything HTTPS?

That's the question I put to Yves Lafon, one of the resident experts on HTTP(s) at the W3C. There are some practical issues most Web developers are probably aware of, such as the high cost of secure certificates, but obviously that's not as much of an issue with large Web services that have millions of dollars.

The real problem, according to Lafon, is that with HTTPS you lose the ability to cache. "Not really an issue when servers and clients are in the same region (meaning continent)," writes Lafon in an e-mail to Webmonkey, "but people in Australia (for example) love when something can be cached and served without a huge response time."

Lafon also notes that there's another small performance hit when using HTTPS, since "the SSL initial key exchange adds to the latency." In other words, a purely security-focused, HTTPS-only Web would, with today's technology, be slower.

I'm not sure what Lafon means about not being able to cache when using SSL. Browsers can cache content when sent the correct HTTP headers, so maybe he's talking about something else.
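Browsers can indeed cache HTTPS responses when the headers allow it; what encryption rules out is caching by shared intermediaries (ISP proxies, some CDN edges) that never see the plaintext. A quick way to compare what a site advertises on each side (example.com is a placeholder):

```shell
# Headers-only requests over HTTP and HTTPS; grep out the caching
# headers for comparison.
curl -sI  http://example.com/ | grep -iE 'cache-control|expires|etag'
curl -skI https://example.com/ | grep -iE 'cache-control|expires|etag'
```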

SSL in NIC cards

bvirtual's picture

One approach for those with their own dedicated hardware is to install a NIC that does SSL encoding/decoding on board. This offloads the work from the CPU. Since the NIC does SSL in custom hardware, it is very fast, or so I have heard.

Anyone have experience with SSL on a NIC?

Another way is to have a front end proxy server do the SSL. You'd only want that if it reduced overall latency to the end user, I would think.

It's good to get some stats on SSL performance hit. A factor of 3 or more seems to be typical. I'm fine with that. But a slow site, going from 3 seconds a page to 9 seconds...

Does Google spider https web pages?

Peter

LA's Open Source User Group Advocate - Volunteer at DrupalCamp LA and SCALE

Agreed

BTMash's picture

While I don't have any experience with it, what you said regarding a NIC is true. It will speed up the site by a fair amount (I have been clamoring for the IT folks at my workplace to get one... maybe some day :)).

I don't have any experience with a front-end proxy doing SSL, but that sounds like a pretty good idea. And yes, Google does spider https pages :)

Used to do that

rjbrown99's picture

We used to do that quite a while ago. It's great as long as your drivers are fully supported and kept up-to-date. You don't want the vendor to retire driver support and then you are stuck at some old kernel version that has security vulnerabilities you can't patch.

We moved to offloading SSL to F5 gear and they do a good job for a mid-size website.

CDN

rjbrown99's picture

The way I read the caching issue as it is presented in the article is that CDNs may not be able to cache static content at edge locations. CDN+SSL can be a headache and a significant cost, if you can find providers that support it.

Probably not an issue for 90% of websites out there, but relying on 100% HTTPS/SSL for all users is a challenging and potentially expensive problem for high traffic sites, both from the cost and latency perspective.

While I'm at it...

rjbrown99's picture

While I'm carpet-bombing this discussion group with posts, here's a quick tip. If you are using Securepages, Securepages Prevent Hijack, and Memcache+sessions you might want to have a look at this issue and the issue linked in comment #1: http://drupal.org/node/1090610

Basically there's a bit of work involved to cleanly handle user logout and removing all of the cookies (since Securepages Prevent Hijack sets a second cookie.)

How to run an UNSECURE Drupal site

pillarsdotnet's picture

Courtesy of the New Zealand Parliament:

http://www.youtube.com/watch?v=AnOAeVaU5xM

Wow, that's unfortunate

greggles's picture

Wow, that's unfortunate :(

I'd say it's more of an Apache/web server configuration mistake than a Drupal mistake, but putting backup files anywhere near the webroot is something I'm generally opposed to as well...