Using wget to access contributed projects

jbc's picture

Can anyone point me towards how to use wget to download contributed modules / projects? I don't want to use a CVS system, but now that I've migrated to Ubuntu, I want to be able to use a command-line interface to quickly download and unpack Drupal modules.

I'm new to Ubuntu. I've previously used TortoiseCVS under Windows, but I don't really need it if I can simply use the command line to download projects / the latest module updates.

Thanks!

shalom from wales!
John

(sorry if anyone notices I posted twice...my first effort seemed unlikely to get noticed and I'm facing a bit of pressure to resolve this issue.)

Comments

simple

greggles's picture

Here's my workflow when I'm using Windows:

1) SSH into the server using PuTTY
2) cd /dir/to/your/sites/all/modules
3) In Firefox, go to http://drupal.org/project/pathauto - just for instance ;)
4) Right-click the "Download" link for the release you want and copy the link location. It points to the tarball.
5) In the terminal, run wget with the pasted URL and wait for the download
6) Run the command "tar zxf filename-6.x-1.1.tar.gz", which gunzips (z) and extracts (x) the file (f) named after the flags
7) Party like you just saved 5 minutes compared to using FTP

This takes me a few seconds to do. I have a bookmark in Firefox for http://drupal.org/project which I've edited to add the "keyword" of "pr". This way I open a Firefox tab and type "pr pathauto" and that takes me to the project page.
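Condensed into commands, the fetch-and-unpack part of the workflow looks like this (the path and the tarball URL are examples - copy the real link from the release's "Download" link on the project page):

```shell
# Fetch a release tarball straight into the modules directory and unpack it.
# The URL below is an example; right-click the release's "Download" link on
# the project page to get the real one.
cd /var/www/drupal/sites/all/modules
wget http://ftp.drupal.org/files/projects/pathauto-6.x-1.1.tar.gz
tar zxf pathauto-6.x-1.1.tar.gz   # z = gunzip, x = extract, f = the file named next
rm pathauto-6.x-1.1.tar.gz        # tidy up; the pathauto/ directory remains
```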

If doing this dance feels a little repetitive to you, there is also Drush which automates some of the process.

--
Open Prediction Markets | Drupal Dashboard

thanks, greggles

jbc's picture

I'm actually migrating FROM a Windows setup TO an Ubuntu LAMP stack, and I'm after advice on the best / easiest method of updating modules -- I have a lot of contributed modules, obviously not all used on every site, for a multi-site install of Drupal.

Thanks for the idea, though. I guess it works pretty much the same from Ubuntu? Or do you do something different under Ubuntu?

shalom from wales!
John


I made this same transition

sdboyer's picture

I made this same transition myself a couple months back and struggled for a little while, but I haven't looked back.

Drush can handle the situation you're describing (a multi-site install with a variety of modules) just fine, through the use of the -l switch. For example, drush pm install cck admin_menu panels views is going to nab you those four module packages into your sites/all/modules directory. If you have a multisite that you've got set to, say, http://example1.local, then running drush -l http://example1.local pm install cck admin_menu panels views will net you those same four modules into the sites/example1.local/modules directory.
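Laid out as commands (note that the drush pm install spelling is from the drush of this era; later releases call the same thing drush dl / drush pm-download, so check drush help on your install):

```shell
# Shared across the whole multisite: modules land in sites/all/modules.
drush pm install cck admin_menu panels views

# Scoped to one site: -l points drush at that site's base URL, so the same
# modules land in sites/example1.local/modules instead.
drush -l http://example1.local pm install cck admin_menu panels views
```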

As for maintaining the whole shebang, my approach is this (especially now that we've got that AWESOME svn plugin for drush - woohoo grugnog!): I maintain a single drupal-vendor svn repo on a dummy drupal install, and I use drush to automatically update that repo's set of contrib modules. From there, I use a yet-to-be-perfected method of partial repo mirroring to move changes that are made to the main vendor repo out and into the vendor branches I maintain for each particular project I'm working on. (I'm trying to grok svk atm for this purpose).

This all has the advantage of a) never updating a live site, or even a test site, with new modules without my direct approval, while b) providing an easy, extensible command-line-based interface for me to be able to choose where and how to distribute those changes within the constellation of drupal stuff I have.

drush, cvs or wget / download via update status

jbc's picture

Having had a look around a bit more, I wonder if I can widen the question.

I presently have a couple of websites which are running using cvs deploy, previously administered via TortoiseCVS under Windows. Presumably it is easy to maintain a cvs setup under Ubuntu, but what would be recommended for someone coming from the GUI of Windows / TortoiseCVS?

I understand drush is a good alternative for accessing the latest modules, but I gather it must be run with a cvs system. Does it only work on a local LAMP stack, or can it also be used on the live website over ssh? Can anyone speak about their experience with that setup?

Finally, I wonder whether it is not simplest to just download packages directly via the update status system and then upload?

I would only be using cvs for convenience; I am not going to be developing any modules myself in the near future. Can anyone make a recommendation of which route they think would be most suitable for me?

Thanks...

shalom from wales!
John


maintaining local machine or server?

greggles's picture

Are you asking about maintaining sites on your local machine or via a shell session to a Linux server?

I'll guess, based on your comments, that it's a local machine. In that case, you can apply my steps to the local command line (gnome-terminal instead of PuTTY).

The best, though, would be to learn the cvs command line (and/or drush). It's not so hard - there are even Drupal-specific docs: http://drupal.org/handbook/cvs
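As a hedged illustration of what that looks like in practice (the drupal.org CVS server has long since been retired; the repository path and release tag here follow the conventions those docs described, so treat them as examples rather than gospel):

```shell
# Anonymous checkout of one contrib module at a specific release tag
# from the (now-retired) drupal.org contributions repository.
cvs -d:pserver:anonymous:anonymous@cvs.drupal.org:/cvs/drupal-contrib \
    checkout -d pathauto -r DRUPAL-6--1-1 contributions/modules/pathauto
```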

--
Open Prediction Markets | Drupal Dashboard

Drush, cvs and wget

Owen Barton's picture

Hi John,

I also recommend drush - it can automatically figure out the recommended stable version and install it for you (if you do wget or cvs by hand you need to check drupal.org each time). It can install specific (older/newer) versions and development branch snapshots too.

Also, please note that drush CAN use wget instead of cvs (or vice versa) to download and update packages.

Drush can be used locally, or directly on a live site via ssh (as can wget and cvs) - but I would strongly recommend never, ever changing or upgrading live code (on sites you care about) without thorough testing on a staging site first, and good database backups in case it all goes wrong :)
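On the backups point, even a one-line database dump before touching live code goes a long way. A minimal sketch, assuming a MySQL-backed site (the database name and user are placeholders):

```shell
# Snapshot the site's database before upgrading any code; restore later with:
#   mysql -u dbuser -p drupal_db < pre-upgrade-YYYYMMDD.sql
mysqldump -u dbuser -p drupal_db > pre-upgrade-$(date +%Y%m%d).sql
```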

The advantage of using cvs over wget is that if you want to contribute a bugfix back (even if you don't maintain any modules yourself), you can just type 'cvs diff -u' to make a patch from the edited files and upload it to the issue queue, rather than having to copy the original file(s) somewhere else first. The other advantage is that if you run any modules from the development branch, you can easily update them just by typing 'cvs up -dP' (without having to check drupal.org and download/unpack a new tarball).
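The two cvs commands mentioned here, as they would be run from inside a module's checked-out directory:

```shell
# Turn your local edits into a patch you can attach to the issue queue:
cvs diff -u > my-fix.patch

# Update a development-branch checkout in place
# (-d picks up newly added directories, -P prunes now-empty ones):
cvs up -dP
```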

There are several CVS GUIs for Linux that you can install on Ubuntu with apt or Synaptic: gcvs, crossvc, cervisia, tkcvs or Eclipse.

Hope this helps!
- Owen

Thanks, Owen and Greg for a helpful round of comments.

jbc's picture

Thanks, Owen and Greg for a helpful round of comments.

I am thinking about a system that will apply to both local and live sites. Obviously, live sites must be backed up, and major updates / changes tested locally. But it would still help to have an efficient way of getting new modules into the live 'space', particularly where the change is minor.

Presently, I find maintaining several websites -- as well as learning CSS and theming skills -- challenge enough. I can't see myself learning sufficient PHP skills in the foreseeable future to justify the advantage you wrote about, Owen, of submitting patches / bugfixes of edited code. Similarly, I rarely run dev-branch stuff.

Consequently, at this stage, using Drush without a cvs system might be the way to go. It would be very helpful if you could point me towards any documentation of drush that specifically deals with its ability to use wget rather than cvs -- unless it's especially intuitive?

The advantage of this choice is that I can continue, for the time being, to use FTP, which I'm most familiar with (I never used cvs from the command line, but always uploaded my local test site wholesale to the live site using FTP - FileZilla)... while I become competent with drush. Otherwise, I would need to learn a new cvs system... and I'm not even certain that the cvs data generated by TortoiseCVS would work with Eclipse, gcvs, etc., though I'd always assumed that these systems were interoperable?

shalom from Wales!
John


Had a go at Drush...pretty cool!

jbc's picture

Thanks for the encouragement!

shalom from wales!
John


Maybe this blog post could

ztyx's picture

Maybe this blog post could give you another way of doing it.

Jens

another way to do it

toddgee's picture

May I humbly suggest that my own bash script library is better than drush? Check it out: http://toddgee.com/drupalScripts

I maintain a suite of 40-ish sites using them. They follow a specialized usage model rather than a generic plug-in model (each has its advantages), but they are all highly polished, do extensive parameter checking, and offer help.

Also, my script library promotes and leverages its own model of Drupal deployments. See the link above for more info.

On a more basic intro level, please?

webmeist's picture

If you don't mind, I'd like to readdress this topic with a more beginner's question.
I am reading this book as I'm planning on installing Drupal and creating my (first) site:
http://www.packtpub.com/drupal-7-with-mobile-tablet-devices-web-developm...

The author also recommends Drush, and I am going to use it to follow his development, but I guess I'm wondering if it's required to download all the files onto your own machine? I'm on Win XP, and I'm just thinking it might be better to have two sites, a dev site and a real site, on Dreamhost's servers, rather than a local one on my machine, which very likely might die or change, etc. (I just had two laptops stolen lately and a third one bit it yesterday. I'm working on a luckily very reliable desktop a neighbor gave me and I resurrected...)

:)

I'm sure the answer is yes, but basically, could Drush move the files without unpacking them on my PC? Is it really necessary to have a local version?

Thanks so much.
Just starting out...
'meist