Automatic Web Server Monitoring

Hi there! Here’s another quick tip for everyone who has to deal with a lot of web server environments and needs to make sure they’re all alive and responsive. Sure, there are all sorts of tools for server monitoring and performance tracking, but I’ve found that most of the time you just need to know whether the server is up and running, and of course you should be notified if it’s not. Here’s my short Python script that does just that. You will need to tweak it a little for your environment.

First of all, I’ve set the content threshold to 2000 bytes, which means the expected length of the page content returned is no less than 2000 bytes. If it’s below this value, one can assume something fishy is going on. You might want to change this value depending on what your web server returns.

Another thing you’ll want to change is the servers dictionary in the main() method. The idea is that each web server is assigned a responsible person who needs to be informed if something is wrong with that server. List all the servers that need to be monitored along with the responsible persons’ e-mail addresses.

The final change is on the line where you instantiate the HeartbeatMonitor class. The first parameter, admins, is a list of e-mail addresses of the people who will receive a notification if any of the servers are down. You can leave this list empty if you want, but I usually put my own e-mail in there since I like to be in the loop. The second parameter, mailServer, is the SMTP server that will be used to send the notifications, so make sure this one is set correctly. This parameter is actually optional; localhost will be used as the default mail server if you decide to skip it.
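To give you an idea of the general shape of such a check, here’s a minimal Python 3 sketch. The helper names (content_ok, check_server, notify) are my own illustrations, not the actual API of the script:

```python
import smtplib
from email.mime.text import MIMEText
from urllib.request import urlopen

CONTENT_THRESHOLD = 2000  # minimum expected page size in bytes

def content_ok(body, threshold=CONTENT_THRESHOLD):
    """A response shorter than the threshold is treated as suspicious."""
    return len(body) >= threshold

def check_server(url):
    """Return True if the server responds with a large-enough page."""
    try:
        body = urlopen(url, timeout=10).read()
    except Exception:
        return False
    return content_ok(body)

def notify(admins, mail_server="localhost", subject="Server down"):
    """Send a plain-text alert via SMTP (localhost is the default, as in the post)."""
    msg = MIMEText("One of the monitored servers failed its heartbeat check.")
    msg["Subject"] = subject
    msg["From"] = "monitor@example.com"   # illustrative sender address
    msg["To"] = ", ".join(admins)
    s = smtplib.SMTP(mail_server)
    s.sendmail(msg["From"], admins, msg.as_string())
    s.quit()

# Example wiring, mirroring the servers dictionary described above:
# servers = {"http://example.com": "owner@example.com"}
# for url, owner in servers.items():
#     if not check_server(url):
#         notify([owner, "me@example.com"])
```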

Search for Files on Remote FTP Server

I’ll keep this one short. I don’t know about you, but I’ve always been frustrated that most FTP clients won’t let you search for files on a remote FTP server. I remember using a client that had this functionality, but that was quite a while ago, so I can’t even remember its name. If anyone knows a lightweight tool that lets you do this, please leave a comment.

Recently I needed to do some quick searches on servers that I have only FTP access to. And you know what they say: “when you can’t find the right tool for the job, write it yourself.” Actually, I don’t know if anyone says that :) but I decided to do just that and put together a Python script for the task. I’m not sure how many others have run into a similar problem, but I put it on GitHub anyway. I called it FTP search, for obvious reasons. You can find the script and some short documentation there. The only prerequisite is Python 2.7.x. I can’t promise it will work with Python 3.x, but you’re welcome to try. If you have any ideas on how to improve this little tool and wish to contribute, leave a comment below or contact me via the e-mail address specified on the GitHub project page.
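For the curious, the core idea behind such a tool is a recursive walk over the remote tree with the standard ftplib module. This is just a sketch of the approach, not the actual script from GitHub, and the function names are my own:

```python
import fnmatch
from ftplib import FTP

def matches(name, pattern):
    """Case-insensitive glob match, e.g. '*.log'."""
    return fnmatch.fnmatch(name.lower(), pattern.lower())

def ftp_search(ftp, pattern, path=""):
    """Recursively walk the remote tree and yield paths matching the pattern."""
    ftp.cwd(path or "/")
    for entry in ftp.nlst():
        full = (path + "/" + entry) if path else "/" + entry
        try:
            ftp.cwd(full)                 # succeeds only for directories
            for hit in ftp_search(ftp, pattern, full):
                yield hit
        except Exception:
            if matches(entry, pattern):   # cwd failed, so it's a file
                yield full

# Usage (illustrative host and credentials):
# ftp = FTP("ftp.example.com")
# ftp.login("user", "password")
# for path in ftp_search(ftp, "*.log"):
#     print(path)
```

Probing each entry with cwd is crude but portable, since plain NLST output doesn’t tell you which entries are directories.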

Creating thumbnails from photos with Python PIL

This time around I want to share a little Python library I found called PIL. If you’re a Python developer, chances are you already know of it, but I use Python only from time to time, usually to automate tasks, so I was very excited to come across such a library.

First things first: download PIL and install it. I’m using it on Windows, so I used the packaged binary version, but you can get the source code and use it on any platform that supports Python. There are also links to the documentation on the same page. If you already know Python, the library should be fairly easy to use. What I liked best about it is its speed, though my comparison is to Java-based image processing, which can be painfully slow.

One of the first things I did with it was create a simple little thumbnail generator to help me manage my photo collection, which you can see below. Simply specify a directory with .jpg files in it, and the script will go through each file, shrink it to fit the chosen format of 1024×768 while maintaining the aspect ratio, and create a thumbnail for it. All the created files are placed in a directory named album, relative to your working directory. Oh, I almost forgot: it also tries to read the EXIF data of each photo, and if the Orientation tag is found, it automatically rotates the image for your viewing pleasure.
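The shrink-to-fit step is worth spelling out, since it’s the one place where people usually get the math wrong. Here it is as a standalone function (fit_within is my own name for it; PIL’s Image.thumbnail does the equivalent internally):

```python
def fit_within(width, height, max_width=1024, max_height=768):
    """Scale (width, height) to fit inside the box, keeping the aspect ratio.

    The scale factor is the smaller of the two per-axis ratios, so the
    whole image fits. Capping it at 1.0 means images already inside the
    box are left untouched (no upscaling).
    """
    scale = min(max_width / float(width), max_height / float(height), 1.0)
    return int(width * scale), int(height * scale)

# With PIL itself the same result is a single call that resizes in place:
# from PIL import Image
# im = Image.open("photo.jpg")
# im.thumbnail((1024, 768))
```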

tail -f for Windows

Here’s a simple Python implementation of an equivalent of the UNIX “tail -f” command. I’ve used it a lot on Windows servers to monitor log files. The nice thing is that it also works over SMB (Windows shares).

#!/usr/bin/env python
import sys, os, time

def main(argv):
    if len(argv) < 2:
        print "Usage: tail filename.log"
        sys.exit(1)

    fp = open(argv[1], "r")
    # start at the current end of the file
    st_results = os.stat(argv[1])
    st_size = st_results[6]
    fp.seek(st_size)

    while 1:
        where = fp.tell()
        line = fp.readline()
        if not line:
            # no new data yet; wait a bit and retry from the same position
            time.sleep(1)
            fp.seek(where)
        else:
            print line,

if __name__ == "__main__":
    main(sys.argv)

This is useful when starting/stopping remote services with Windows Service Controller:

sc \\server1 stop "Macromedia JRun4 default Server"
sc \\server1 start "Macromedia JRun4 default Server"
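If you do this kind of restart often, the same calls are easy to wrap in Python with the subprocess module. A small sketch (the function names are mine, and the server and service names are just the ones from the example above):

```python
import subprocess

def sc_command(action, server, service):
    """Build the argument list for the Windows sc utility."""
    return ["sc", r"\\" + server, action, service]

def restart_service(server, service):
    """Stop and then start a service on a remote machine via sc."""
    subprocess.check_call(sc_command("stop", server, service))
    subprocess.check_call(sc_command("start", server, service))

# restart_service("server1", "Macromedia JRun4 default Server")
```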

Strings in Java

A few days ago I needed to extract all the strings from a set of .java files, and I thought it would also be a good idea to keep count of how many times each string is used. So I came up with this simple Python script. It’s a quick and dirty solution, but it met my needs for the particular task.

import sys, os, re
from operator import itemgetter

files = []
strings = {}
exp = re.compile("(\".+?\")")

def klist(bdir):
    # recursively collect all .java files under bdir
    for fname in os.listdir(bdir):
        path = os.path.join(bdir, fname)
        if os.path.isdir(path):
            klist(path)
        elif fname.endswith(".java"):
            files.append(path)

def get_strings(fname):
    fp = open(fname)
    data = fp.readlines()
    fp.close()
    print os.path.basename(fname) + ":"
    for line in data:
        k = 0
        while k < len(line):
            # find the next quoted string, starting at position k
            m = exp.search(line, k)
            if m != None:
                fstr = m.groups()[0]
                print "    " + fstr
                strings[fstr] = strings.get(fstr, 0) + 1
                k = m.end()
            else:
                k = len(line)

if __name__ == "__main__":
    if len(sys.argv) < 2:
        print "Usage: strings.py base_directory"
        sys.exit(1)

    klist(sys.argv[1])
    for fname in files:
        get_strings(fname)

    print "-" * 70
    di = strings.items()
    di.sort(key=itemgetter(1))
    for (k, v) in di:
        print v, ":", k

So what this basically does is gather the strings, print them out for each file, and then, after a separator line, print some usage stats. It might contain bugs, because I was in a hurry when I wrote it, so if you use it, do so at your own risk ;)