Python Automation Scripts: Streamlining Tasks with Python
Harnessing Python to Automate Repetitive Tasks
Automation is a key aspect of modern computing, and Python is one of the best languages for automating repetitive tasks due to its simplicity and the wide range of libraries available. In this guide, we'll explore various Python scripts to automate different types of tasks, including file management, web scraping, sending emails, and more.
1. File Management Automation
Automating file management tasks can save a lot of time and reduce errors. Python's os and shutil modules are powerful tools for file and directory operations.
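As a quick taste of what these modules offer before the full scripts below, here is a minimal sketch (all paths are placeholders) of a couple of common operations:
import os
import shutil

# List the entries in a directory (placeholder path)
for entry in os.listdir('/path/to/directory'):
    print(entry)

# Create a folder if it doesn't exist, then copy a file into it
os.makedirs('/path/to/backup', exist_ok=True)
shutil.copy2('/path/to/source.txt', '/path/to/backup/source.txt')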
1.1. Renaming Multiple Files:
This script renames all files in a directory by adding a prefix.
import os

def rename_files(directory, prefix):
    # Prepend the given prefix to every entry in the directory
    for filename in os.listdir(directory):
        os.rename(
            os.path.join(directory, filename),
            os.path.join(directory, prefix + filename)
        )

# Usage
rename_files('/path/to/directory', 'new_')
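If you prefer the newer pathlib API, an equivalent sketch (same placeholder directory, and it skips subfolders) looks like this:
from pathlib import Path

def rename_files_pathlib(directory, prefix):
    for path in Path(directory).iterdir():
        if path.is_file():
            # with_name() replaces only the final path component
            path.rename(path.with_name(prefix + path.name))

# Usage
rename_files_pathlib('/path/to/directory', 'new_')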
1.2. Organizing Files by Extension:
This script organizes files in a directory into subdirectories based on their file extensions.
import os
import shutil

def organize_files_by_extension(directory):
    for filename in os.listdir(directory):
        if os.path.isfile(os.path.join(directory, filename)):
            # Use the text after the last dot as the target folder name
            ext = filename.split('.')[-1]
            ext_dir = os.path.join(directory, ext)
            os.makedirs(ext_dir, exist_ok=True)
            shutil.move(os.path.join(directory, filename), os.path.join(ext_dir, filename))

# Usage
organize_files_by_extension('/path/to/directory')
2. Web Scraping Automation
Web scraping is the process of extracting data from websites. Python's BeautifulSoup and requests libraries make it easy to scrape web data.
2.1. Scraping a Web Page:
This script extracts and prints all the hyperlinks from a web page.
import requests
from bs4 import BeautifulSoup

def scrape_links(url):
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'html.parser')
    # Collect every anchor tag and print its href attribute
    links = soup.find_all('a')
    for link in links:
        print(link.get('href'))

# Usage
scrape_links('https://www.bytescrum.com')
2.2. Extracting Data from a Table:
This script extracts data from an HTML table and prints it in a structured format.
import requests
from bs4 import BeautifulSoup

def scrape_table(url):
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'html.parser')
    table = soup.find('table')
    headers = [header.text for header in table.find_all('th')]
    rows = table.find_all('tr')
    for row in rows:
        columns = row.find_all('td')
        if not columns:
            # Skip the header row (it contains <th> cells, not <td>)
            continue
        data = [column.text for column in columns]
        print(dict(zip(headers, data)))

# Usage
scrape_table('https://example.com/table_page')
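As a design alternative, if the table is well-formed, pandas can pull it in a single call; a minimal sketch, assuming pandas (and its lxml dependency) is installed:
import pandas as pd

def scrape_table_pandas(url):
    # read_html returns a list of DataFrames, one per <table> on the page
    tables = pd.read_html(url)
    print(tables[0])

# Usage
scrape_table_pandas('https://example.com/table_page')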
3. Sending Automated Emails
Automating email tasks can be very useful for sending notifications, reports, or reminders. Python's smtplib library allows you to send emails using an SMTP server.
3.1. Sending a Simple Email:
This script sends a simple email using an SMTP server.
import smtplib
from email.mime.text import MIMEText

def send_email(subject, body, to_email):
    from_email = 'your_email@example.com'
    password = 'your_password'

    # Build a plain-text message with the standard headers
    msg = MIMEText(body)
    msg['Subject'] = subject
    msg['From'] = from_email
    msg['To'] = to_email

    # Connect over SSL and send; replace host/port with your provider's settings
    with smtplib.SMTP_SSL('smtp.example.com', 465) as server:
        server.login(from_email, password)
        server.sendmail(from_email, to_email, msg.as_string())

# Usage
send_email('Test Subject', 'This is a test email.', 'recipient@example.com')
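If your automated reports need an attachment, the same SMTP approach can send a multipart message; a minimal sketch, assuming a placeholder file path and the same example server settings:
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.application import MIMEApplication

def send_email_with_attachment(subject, body, to_email, file_path):
    from_email = 'your_email@example.com'
    password = 'your_password'

    msg = MIMEMultipart()
    msg['Subject'] = subject
    msg['From'] = from_email
    msg['To'] = to_email
    msg.attach(MIMEText(body))

    # Attach the file as a binary payload
    filename = file_path.split('/')[-1]
    with open(file_path, 'rb') as f:
        part = MIMEApplication(f.read(), Name=filename)
    part['Content-Disposition'] = f'attachment; filename="{filename}"'
    msg.attach(part)

    with smtplib.SMTP_SSL('smtp.example.com', 465) as server:
        server.login(from_email, password)
        server.sendmail(from_email, to_email, msg.as_string())

# Usage
send_email_with_attachment('Daily Report', 'Report attached.', 'recipient@example.com', '/path/to/report.pdf')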
4. Automating System Tasks
Python can be used to automate various system tasks, such as scheduling scripts, monitoring system resources, or interacting with other applications.
4.1. Scheduling Tasks with schedule:
This script uses the schedule library to run a function at a specific time every day.
import schedule
import time

def job():
    print("Doing daily task...")

# Run job() every day at 10:00
schedule.every().day.at("10:00").do(job)

# Keep the script alive, checking once per second for pending jobs
while True:
    schedule.run_pending()
    time.sleep(1)
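The schedule library supports other intervals in the same fluent style; for example:
import schedule

def job():
    print("Doing scheduled task...")

schedule.every(10).minutes.do(job)             # every 10 minutes
schedule.every().hour.do(job)                  # every hour
schedule.every().monday.at("09:00").do(job)    # Mondays at 09:00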
4.2. Monitoring System Resources:
This script monitors CPU and memory usage using the psutil library.
import psutil
import time

def monitor_system(interval):
    while True:
        # cpu_percent(interval=1) samples CPU load over one second
        cpu_usage = psutil.cpu_percent(interval=1)
        memory_info = psutil.virtual_memory()
        print(f"CPU Usage: {cpu_usage}%")
        print(f"Memory Usage: {memory_info.percent}%")
        time.sleep(interval)

# Usage
monitor_system(5)
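For the "interacting with other applications" part mentioned above, the standard-library subprocess module is the usual starting point; a minimal sketch (the command shown is only an example):
import subprocess

def run_command(command):
    # Run an external program, capture its output as text,
    # and raise CalledProcessError on a non-zero exit code
    result = subprocess.run(command, capture_output=True, text=True, check=True)
    return result.stdout

# Usage (replace with whatever tool you need to drive)
print(run_command(['python', '--version']))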
5. Web Automation
Web automation involves controlling a web browser to perform tasks such as form submissions, web scraping, or testing web applications. Selenium is a popular library for web automation in Python.
5.1. Automating Web Interaction with Selenium:
This script automates the process of filling out a form on a webpage.
from selenium import webdriver
from selenium.webdriver.common.by import By

def automate_form_submission(url, form_data):
    driver = webdriver.Chrome()
    driver.get(url)
    # Fill each form field, looking it up by its name attribute
    for field_name, field_value in form_data.items():
        field = driver.find_element(By.NAME, field_name)
        field.send_keys(field_value)
    # Click the submit button (assumed to have name="submit")
    submit_button = driver.find_element(By.NAME, 'submit')
    submit_button.click()
    driver.quit()

# Usage
form_data = {
    'username': 'your_username',
    'password': 'your_password'
}
automate_form_submission('https://example.com/login', form_data)
5.2. Scraping Dynamic Content with Selenium:
This script uses Selenium to scrape data from a dynamically loaded webpage.
from selenium import webdriver
from selenium.webdriver.common.by import By

def scrape_dynamic_content(url):
    driver = webdriver.Chrome()
    driver.get(url)
    # Read the text of the element rendered by JavaScript
    content = driver.find_element(By.ID, 'dynamic-content').text
    print(content)
    driver.quit()

# Usage
scrape_dynamic_content('https://example.com/dynamic')
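Dynamically loaded content may not exist the moment the page opens, so an immediate lookup can fail; Selenium's explicit waits handle this. A variant of the script above, assuming the same 'dynamic-content' element ID, that waits up to 10 seconds for the element to appear:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def scrape_dynamic_content_with_wait(url):
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        # Block for up to 10 seconds until the element is present in the DOM
        element = WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, 'dynamic-content'))
        )
        print(element.text)
    finally:
        # Always close the browser, even if the wait times out
        driver.quit()

# Usage
scrape_dynamic_content_with_wait('https://example.com/dynamic')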
Conclusion
The scripts above cover file management, web scraping, automated email, system tasks, and browser automation; adapt them to your own workflows and they can save you hours of repetitive work. Thank you for reading our blog.
Happy Coding!