Automating Boring Tasks with Python: Real-World Examples
"I will spend 6 hours automating a task that takes 5 minutes."
We've all been there. But sometimes, that automation pays off for years.
Python is the king of automation. Today, we're going to look at six real-world scripts I actually use to
save hours of my life.
No Hello World. No theory. Just code.
1. The File Organizer (Clean Up Your Downloads)
My Downloads folder is a graveyard of PDFs, PNGs, and ZIPs.
This script scans a folder and moves files into subfolders based on their extension.
import os
import shutil

DOWNLOADS_DIR = "C:/Users/Jim/Downloads"

DESTINATIONS = {
    "images": [".png", ".jpg", ".jpeg", ".gif"],
    "docs": [".pdf", ".docx", ".txt"],
    "installers": [".exe", ".msi"]
}

def organize():
    for filename in os.listdir(DOWNLOADS_DIR):
        file_path = os.path.join(DOWNLOADS_DIR, filename)
        if os.path.isfile(file_path):
            ext = os.path.splitext(filename)[1].lower()
            for folder_name, extensions in DESTINATIONS.items():
                if ext in extensions:
                    target_folder = os.path.join(DOWNLOADS_DIR, folder_name)
                    os.makedirs(target_folder, exist_ok=True)
                    shutil.move(file_path, os.path.join(target_folder, filename))
                    print(f"Moved {filename} to {folder_name}")

if __name__ == "__main__":
    organize()
Why this helps: I run this once a week. It keeps my mental space clean.
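If you'd rather have the cleanup happen automatically whenever a new file lands, the third-party watchdog library can re-run organize() on filesystem events. Here's a minimal sketch, assuming watchdog is installed (pip install watchdog) and reusing organize() and DOWNLOADS_DIR from the script above; DownloadsHandler is just a name I picked:

# Sketch: re-run organize() whenever something new appears in Downloads.
# Assumes the watchdog package is installed (pip install watchdog).
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class DownloadsHandler(FileSystemEventHandler):
    def on_created(self, event):
        # Only react to new files, not new folders
        if not event.is_directory:
            organize()

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(DownloadsHandler(), DOWNLOADS_DIR, recursive=False)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()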
2. The "Am I Down?" Monitor
Do you ever suspect your internet is acting up?
This script hits Google every 5 minutes and prints the result with a timestamp. If the request fails, it plays a beep (Windows-only, via winsound).
import time
import requests
import winsound  # Windows-only

def check_internet():
    try:
        requests.get("http://www.google.com", timeout=5)
        print(f"[{time.ctime()}] Online")
    except (requests.ConnectionError, requests.Timeout):
        print(f"[{time.ctime()}] OFFLINE!")
        winsound.Beep(1000, 1000)  # 1000 Hz for 1 second

while True:
    check_internet()
    time.sleep(300)  # every 5 minutes
Why this helps: It gives me proof when I call my ISP to complain.
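To actually have something to show the ISP, it helps to append each check to a file instead of just printing. A small sketch, assuming a file name of my choosing (connectivity_log.csv) and a log_status helper you'd call from both branches of check_internet:

# Sketch: append every check to a CSV so outages leave a paper trail.
# The file name connectivity_log.csv is arbitrary.
import csv
import time

def log_status(online: bool):
    with open("connectivity_log.csv", "a", newline="") as f:
        csv.writer(f).writerow([time.ctime(), "online" if online else "OFFLINE"])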
3. Bulk Image Resizer (For Bloggers)
If you manage a blog (like this one), you need to optimize images.
Uploading 5 MB PNGs tanks your page speed (and your SEO with it). This script resizes everything to 800px wide.
from PIL import Image
import os

TARGET_WIDTH = 800

def resize_images(folder):
    for filename in os.listdir(folder):
        if filename.lower().endswith(('.png', '.jpg', '.jpeg')):
            img = Image.open(os.path.join(folder, filename))
            # Calculate new height to maintain aspect ratio
            ratio = TARGET_WIDTH / float(img.size[0])
            new_height = int(img.size[1] * ratio)
            img = img.resize((TARGET_WIDTH, new_height), Image.Resampling.LANCZOS)
            img.save(os.path.join(folder, "resized_" + filename))
            print(f"Resized {filename}")

# Usage: resize_images("./blog_images")
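Resizing fixes dimensions, but file size is what page speed really cares about. Pillow's save() can also compress on the way out. A sketch of a helper you could call in place of the img.save(...) line above; save_compressed is my own name, and quality=85 is just a reasonable default to tune:

# Sketch: write a compressed JPEG copy instead of a straight re-save.
from PIL import Image
import os

def save_compressed(img, folder, filename, quality=85):
    # JPEG has no alpha channel, so convert to RGB first
    out_path = os.path.join(folder, "resized_" + os.path.splitext(filename)[0] + ".jpg")
    img.convert("RGB").save(out_path, "JPEG", quality=quality, optimize=True)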
4. Automated Email Sender (For Follow-Ups)
As a developer, you often need to send the same email to multiple people (status updates, meeting notes, etc.).
This script reads a CSV file and sends personalized emails using Gmail's SMTP.
import smtplib
import csv
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

GMAIL_USER = "your-email@gmail.com"
GMAIL_PASSWORD = "your-app-password"  # Use an App Password, not your regular password

def send_bulk_emails(csv_file):
    with open(csv_file, 'r') as file:
        reader = csv.DictReader(file)
        for row in reader:
            msg = MIMEMultipart()
            msg['From'] = GMAIL_USER
            msg['To'] = row['email']
            msg['Subject'] = row['subject']
            body = f"Hi {row['name']},\n\n{row['message']}"
            msg.attach(MIMEText(body, 'plain'))

            server = smtplib.SMTP('smtp.gmail.com', 587)
            server.starttls()
            server.login(GMAIL_USER, GMAIL_PASSWORD)
            server.send_message(msg)
            server.quit()
            print(f"Sent email to {row['name']}")

# CSV format: name,email,subject,message
Why this helps: I use this for weekly team updates. It takes 2 minutes to set up, saves 30 minutes every week.
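One thing worth noting: the loop above opens and closes a fresh SMTP connection for every row, which is slow and can trip Gmail's sending limits on longer lists. A sketch of the same loop reusing a single connection (the function name is mine; it assumes the same GMAIL_USER / GMAIL_PASSWORD and CSV layout as above, and the context manager closes the connection for you):

# Sketch: log in once, send many.
import smtplib
import csv
from email.mime.text import MIMEText

def send_bulk_emails_single_connection(csv_file):
    with smtplib.SMTP('smtp.gmail.com', 587) as server:
        server.starttls()
        server.login(GMAIL_USER, GMAIL_PASSWORD)
        with open(csv_file, newline='') as file:
            for row in csv.DictReader(file):
                msg = MIMEText(f"Hi {row['name']},\n\n{row['message']}")
                msg['From'] = GMAIL_USER
                msg['To'] = row['email']
                msg['Subject'] = row['subject']
                server.send_message(msg)
                print(f"Sent email to {row['name']}")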
5. Web Scraper for Price Monitoring
Want to know when a product goes on sale? This script checks a website every hour and sends you a notification if the price drops.
import requests
from bs4 import BeautifulSoup
import time

TARGET_URL = "https://example.com/product"
TARGET_PRICE = 50.00  # Alert if price drops below this

def check_price():
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
    }
    response = requests.get(TARGET_URL, headers=headers)
    soup = BeautifulSoup(response.content, 'html.parser')
    # Adjust selector based on website structure
    price_element = soup.find('span', class_='price')
    if price_element is None:
        print("Price element not found - check the selector")
        return False
    current_price = float(price_element.text.replace('$', '').replace(',', ''))
    if current_price < TARGET_PRICE:
        send_notification(f"Price dropped to ${current_price}!")
        return True
    return False

def send_notification(message):
    # Implement your notification method (email, Slack, etc.)
    print(f"ALERT: {message}")

while True:
    check_price()
    time.sleep(3600)  # Check every hour
Why this helps: I saved $200 on a monitor by catching a flash sale. The script paid for itself in one purchase.
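The send_notification stub is left for you to fill in. One easy option is to email yourself the alert by reusing the Gmail setup from script 4; a minimal sketch, assuming the same GMAIL_USER / GMAIL_PASSWORD app-password credentials are defined:

# Sketch: email the alert to yourself, reusing the Gmail credentials from script 4.
import smtplib
from email.mime.text import MIMEText

def send_notification(message):
    msg = MIMEText(message)
    msg['From'] = GMAIL_USER
    msg['To'] = GMAIL_USER  # send the alert to yourself
    msg['Subject'] = "Price alert"
    with smtplib.SMTP('smtp.gmail.com', 587) as server:
        server.starttls()
        server.login(GMAIL_USER, GMAIL_PASSWORD)
        server.send_message(msg)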
6. Database Backup Automation
If you manage any databases (SQLite, MySQL, PostgreSQL), automated backups are non-negotiable.
import subprocess
import datetime
import os

DB_NAME = "my_database"
BACKUP_DIR = "C:/backups"

def backup_database():
    timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_file = os.path.join(BACKUP_DIR, f"{DB_NAME}_{timestamp}.sql")

    # For PostgreSQL
    cmd = f'pg_dump -U username {DB_NAME} > {backup_file}'
    # For MySQL
    # cmd = f'mysqldump -u username -p {DB_NAME} > {backup_file}'

    subprocess.run(cmd, shell=True)
    print(f"Backup created: {backup_file}")

    # Keep only last 7 days of backups
    cleanup_old_backups(BACKUP_DIR, days=7)

def cleanup_old_backups(directory, days=7):
    cutoff = datetime.datetime.now() - datetime.timedelta(days=days)
    for filename in os.listdir(directory):
        filepath = os.path.join(directory, filename)
        if os.path.getmtime(filepath) < cutoff.timestamp():
            os.remove(filepath)
            print(f"Deleted old backup: {filename}")

# Schedule this to run daily (use Windows Task Scheduler or cron)
Why this helps: Database corruption happens. Having automated backups means you can restore in minutes, not hours.
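One practical note: pg_dump will prompt for a password when run interactively, which defeats the point of a scheduled job. Setting PGPASSWORD in the subprocess environment (or using a .pgpass file) avoids the prompt, and writing stdout to a file yourself lets you drop shell=True. A sketch, where the function name and the BACKUP_DB_PASSWORD environment variable are my own choices:

# Sketch: non-interactive pg_dump without shell=True.
# BACKUP_DB_PASSWORD is an assumed env-var name; a ~/.pgpass file works too.
import os
import subprocess

def backup_postgres(db_name, backup_file, user="username"):
    env = dict(os.environ, PGPASSWORD=os.environ["BACKUP_DB_PASSWORD"])
    with open(backup_file, "w") as f:
        # pg_dump writes the dump to stdout by default
        subprocess.run(["pg_dump", "-U", user, db_name],
                       stdout=f, env=env, check=True)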
Best Practices for Automation Scripts
1. Error Handling
Always wrap your automation in try-except blocks. A script that fails silently is worse than no script at all.
try:
    organize_files()
except Exception as e:
    print(f"Error: {e}")
    # Log to file or send notification
2. Logging
Adding logs is critical for debugging.
import logging

logging.basicConfig(
    filename='automation.log',
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s'
)

logging.info("Starting file organization")
3. Configuration Files
Hard-coding paths is a rookie mistake. Use config files or environment variables.
import json

with open('config.json', 'r') as f:
    config = json.load(f)

DOWNLOADS_DIR = config['downloads_dir']
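And the environment-variable version, for values you'd rather not keep in a file at all (the DOWNLOADS_DIR variable name is just an example):

import os

# Falls back to a default if the variable isn't set
DOWNLOADS_DIR = os.environ.get("DOWNLOADS_DIR", "C:/Users/Jim/Downloads")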
When NOT to Automate
Automation is powerful, but it's not always the answer:
- One-time tasks: If you'll only do it once, automation might take longer than doing it manually.
- Complex workflows: Sometimes, the automation is more complex than the task itself.
- Frequently changing requirements: If the process changes every week, maintaining the script becomes a burden.
The rule of thumb: Automate if you'll do it more than 5 times, or if it takes more than 10 minutes each time.
Conclusion
Automation isn't just about saving time. It's about saving cognitive load.
When you automate the boring stuff, you free up your brain for the fun stuff: building features and solving hard
problems.
Start with these scripts, tweak them to fit your life, and watch your productivity soar. Remember: The best automation is the one you actually use.
Next Steps:
- Pick one script that solves a real problem in your workflow.
- Run it manually first to verify it works.
- Schedule it (Task Scheduler on Windows, cron on Linux/Mac).
- Monitor it for a week, then forget about it.
That's the beauty of good automation: It becomes invisible.
