Git-ing Familiar with Python - Part 2: site_checker.py

Introduction

For this project, I wanted a simple Python script that I could run every hour to check if my external sites were up and running.

Some of the requirements are outlined below:

  1. The script should check each site in a site list for a response.
  2. If it receives an error, it will need to send a notification through Pushover.
  3. API keys and Site List should go into a secrets file that is ignored by GitHub.

Potential enhancements to base functionality to consider when developing:

  1. Explore adding Twilio SMS notifications and SMTP notifications as well.
  2. Specific errors lead to specific notifications (403, 404, 500, 502, 504, etc...)

I'm going to be working in VSCodium with the git extension.
I've created the repository at https://github.com/WendingTuo/site_checker.


File Structure

First, we'll create our api_secrets.py file and our sites.py file using touch commands in the local git repository. We'll add these to the .gitignore file by simply listing them by name.

# Untracked Files
api_secrets.py
sites.py

Then we'll make our primary script file, site_checker.py.

Once we've created all these files, we'll do an initial upload of the blank files. We'll use this opportunity to confirm that the .gitignore configuration is correct and functioning.
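One quick way to run that confirmation locally (a sketch, assuming you're inside the repository) is git's check-ignore command, which prints a path only if an ignore rule matches it:

```shell
# Should print both paths, confirming the ignore rules match them.
git check-ignore api_secrets.py sites.py

# site_checker.py should appear as untracked here; the secrets files should not.
git status --short
```

If check-ignore prints nothing for a file, that file is not covered by .gitignore and would be uploaded on the next push.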

Now that we've confirmed that the script file is tracked but the secrets files aren't, we can start to populate the secrets files with the variables we'll need for our specific deployment.


The Variables

We'll start with the api_secrets file, which for the time being only needs to include our Pushover API token and our Pushover user key. You can find these by logging in to https://pushover.net and selecting "Create an Application/API Token."

Once you've got those variables, we'll load them into our secrets file in this format:

# Pushover Authentication Details
pushover_api_token = '<insert_your_api_token>'
pushover_user_key = '<insert_your_user_key>'

Next, we'll populate our sites into the sites.py file using a list structure. I've included an example below using this site and a redirect to the same site. You can populate this list with as many sites as you'd like to monitor.

site_list = ['https://blog.blakehyatt.com', 'https://blakehyatt.com']

Looks good! Now we can start writing some code!


The Setup

First, we'll import requests - This is a Python library for making HTTP requests. (If you haven't used requests before, you'll need to install it using the instructions in its documentation.)

We'll also import time, sys, and some basic functions from pushover for notifications.

import time
import sys
import requests
from pushover import init, Client

Then, we'll import the variables from our secrets files:

from api_secrets import *
from sites import *

And we'll set up Pushover to send notifications:

pushoverClient = Client(pushover_user_key, api_token=pushover_api_token)

The Script

Next, for each item in our site_list, we'll call the get function from the requests library and assign the result to the variable r.

Then, we'll read the status_code attribute and validate whether the request succeeded by comparing it against requests.codes.ok.

If the website is up, this comparison will be True, meaning we just print the result to our logs and move on.

The else: branch handles all non-successful HTTP codes and sends a notification via Pushover.

Here's what that code looks like:

for i in site_list:
    r = requests.get(i)
    if r.status_code == requests.codes.ok:
        print(r.url, r.status_code)
    else:
        print(r.url, r.status_code)
        pushoverClient.send_message("The url " + r.url + " failed to resolve. Error code: " + str(r.status_code))
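One edge case worth guarding against (this is my own addition, not part of the original script): if a site is completely unreachable, requests.get raises an exception before any status_code exists, which would crash the loop and skip the remaining sites. A hedged sketch of a more defensive version, with check_site as a hypothetical helper name:

```python
import requests

def check_site(url):
    """Return (is_up, detail) for a single URL without raising."""
    try:
        r = requests.get(url, timeout=10)
    except requests.exceptions.RequestException as exc:
        # DNS failure, connection refused, timeout, etc. -- there is no
        # status_code in this case, so report the exception name instead.
        return False, type(exc).__name__
    return r.status_code == requests.codes.ok, str(r.status_code)
```

In the main loop, a failed check would then trigger the same Pushover message, using detail in place of r.status_code.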

Finally, we want to gracefully exit the script since we will be automating the execution of it via crontab.

print("Exiting in 3 seconds...")
time.sleep(3)
sys.exit()

Automation

Now, we want to set this script up to run automatically using crontab. I'd like this job to run every hour between 7 A.M. and 10 P.M., and I'd also like it to write its output to a log file on my system. To do this, I'll enter crontab -e into the terminal, which opens my user's crontab file in edit mode. Then, I'll add the following line to the bottom:

0 7-22 * * * python3 /home/wendingtuo/site_checker/site_checker.py >> /home/wendingtuo/logs/site_checker.log

Now save and exit (in nano, that's ctrl+X, then Y to confirm), and type crontab -l to validate that the crontab file has been updated accurately.
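For reference, here's how the five time fields at the start of that crontab line break down (the meanings below are standard cron syntax, not anything specific to this project):

```shell
# field          value   meaning
# minute         0       at minute 0 (the top of the hour)
# hour           7-22    every hour from 07:00 through 22:00
# day of month   *       every day
# month          *       every month
# day of week    *       every day of the week
```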


Last Minute Improvements

After letting the automation run at the top of the hour, I reviewed the logs and thought it would be nice to have each run's entries separated by a timestamp, so I added this line above the main loop: print(time.strftime("%c")).


Closing Thoughts

Looking back on my introduction, I believe I hit the mark on the core requirements, and building out the base functionality helped shape how I'd approach the stretch requirements.

If I were to add the functionality for more notification types, I would probably do the following:
1. Break the notifications out into a separate function that I would call in the else statement.
2. Incorporate some CLI arguments that determine the preferred notification platform.
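A rough sketch of what that separate function might look like (the dispatcher shape and the idea of registering sender callables are my own illustration, not code from this project):

```python
def notify(message, platform, senders):
    """Send message via the callable registered for platform."""
    try:
        send = senders[platform]
    except KeyError:
        raise ValueError(f"Unknown notification platform: {platform}")
    send(message)

# Each platform maps to a function that actually delivers the message, e.g.:
# senders = {
#     "pushover": pushoverClient.send_message,
#     "sms": send_twilio_sms,      # hypothetical Twilio wrapper
#     "email": send_smtp_email,    # hypothetical SMTP wrapper
# }
```

The else branch in the main loop would then call notify(message, preferred_platform, senders), with preferred_platform supplied from the command line.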

If I were to add the functionality to send more specific notifications for common status codes, I would do the following:
1. Create elif statements in the main loop to match r.status_code against specific error codes, leaving the existing else clause as the final catch-all.
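As a sketch of that idea (the message wording here is my own, not from the script), a dictionary lookup keeps things tidier than a long elif chain while still falling through to a catch-all:

```python
# Hypothetical mapping of common status codes to more specific messages.
STATUS_MESSAGES = {
    403: "403 Forbidden - check permissions or firewall rules.",
    404: "404 Not Found - the page may have moved.",
    500: "500 Internal Server Error - the application crashed.",
    502: "502 Bad Gateway - the upstream server is misbehaving.",
    504: "504 Gateway Timeout - the upstream server is too slow.",
}

def describe_status(code):
    """Return a specific message for known codes, a generic one otherwise."""
    return STATUS_MESSAGES.get(code, f"Unexpected status code: {code}")
```

The Pushover call in the else branch could then send describe_status(r.status_code) instead of the generic message.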

Well that about wraps up this project! I'm going to leave it running for a while and check back in on it periodically to ensure it's working as needed.
