Using Net::FTP to retrieve logfiles

Modified on 2012/06/02 07:32 by Jason. Categorized as Development, Perl.
This article was originally published around 2003-2004 at the InfiniteMonkeys Web Programming Resource.

by: Atrax

My ASP host retains only a week's worth of logfiles for my sites. It's a pain to try to remember to grab all your logfiles once a week by FTP, so I did the old pre-emptive laziness trick and wrote a little Perl script to retrieve my logs. And here it is.

The introduction pretty much explains why this script exists - I needed to grab my files regularly before they got deleted, so I wrote this Net::FTP script to do it for me. The example grabs the logfiles for this site, http://www.readthefuckingmanual.co.uk/, but it's generic enough that I can just change the host info to retrieve other sites' logfiles easily.
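To show what that reuse could look like, here's a quick sketch (not part of the original script; the commented-out second entry is a placeholder, and only the rtfm entry uses real values from the article) that keeps the per-site settings in a hash and loops over them:

#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# per-site settings: FTP host, login name and local folder for each site
my %sites = (
	rtfm => {
		host  => 'ftp.readthefuckingmanual.co.uk',
		user  => 'readthefuckingmanual.co.uk',
		local => 'd:\\wwwroot\\rtfmlogs',
	},
	# othersite => { host => '...', user => '...', local => '...' },
);

foreach my $name (keys %sites) {
	my $cfg = $sites{$name};
	print "fetching logs for $name from $cfg->{host} into $cfg->{local}\n";
	# connect with Net::FTP and run the download loop shown below,
	# substituting $cfg->{host}, $cfg->{user} and $cfg->{local}
}

The download loop itself stays exactly as written below; only the three strings move.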

First, a word about the folder structure of my site. On logging in by FTP, I'm presented with three folders: HTDocs contains my web content; Private contains database and script components, and other private content; and logfiles contains, obviously, my logfiles (actually in a subdirectory thereof, which is different for each site).

So the operation is this:

Log in
Change directory to /logfiles
Check what the subfolder name is
Change directory into subfolder
Run through the files, downloading them if I don't already have a copy
Log out

This is accomplished by the following Perl code:

#!/usr/bin/perl

use strict;
use warnings;

use Net::FTP;

# connect and log in (password elided here)
my $ftp = Net::FTP->new('ftp.readthefuckingmanual.co.uk')
	or die "cannot connect: $@";
$ftp->login('readthefuckingmanual.co.uk', '******')
	or die "cannot log in: " . $ftp->message;

# change into /logfiles, then into its per-site subfolder
$ftp->cwd('logfiles');
my @folders = $ftp->ls();
$ftp->cwd($folders[0]);

my @logfiles = $ftp->ls();
foreach my $file (@logfiles) {
	# check remote size (0 if the server can't report it)
	my $fileSize = $ftp->size($file) || 0;
	# check local size: element 7 of the stat() list, 0 if no local copy yet
	my $size = (stat("d:\\wwwroot\\rtfmlogs\\$file"))[7] || 0;
	print "fetching file : $file ($fileSize) local size : ($size)\n";
	# compare local and remote size; only download when they differ
	if ($fileSize != $size) {
		$ftp->get($file, "d:\\wwwroot\\rtfmlogs\\$file")
			or print "error: " . $ftp->message;
		print "file fetched\n";
	} else {
		print "file skipped\n";
	}
}
$ftp->quit;

print "operation successful\n";

As you can see, it logs in, switches through to the right directory, grabs a list of files and starts looping through them. For each file it checks the local copy with the stat() function, and if the local size differs from the remote size, it downloads the file. How simple.
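Incidentally, if the size is all you want, Perl's -s file test is a lighter alternative to unpacking the whole stat() list: it returns the file's size in bytes, or undef when the file doesn't exist. Inside the foreach loop above, these lines would replace the stat() call and the comparison (same behaviour, assuming the same d:\wwwroot\rtfmlogs layout):

	# -s gives the size in bytes, or undef if there's no local copy yet
	my $size = -s "d:\\wwwroot\\rtfmlogs\\$file" || 0;

	# fetch only when local and remote sizes differ
	if ($ftp->size($file) != $size) {
		$ftp->get($file, "d:\\wwwroot\\rtfmlogs\\$file")
			or print "error: " . $ftp->message;
	}

Either way, a missing local file counts as size 0, so new logfiles are always fetched.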

Jason Brown

 
