Celery is an open-source asynchronous task queue or job queue based on distributed message passing. While it supports scheduling, it focuses on real-time operations. When used with Pipenv, Python's tool for managing dependencies and virtual environments, developers can simplify their work and ensure consistent setups. Python-dotenv enhances this by loading environment variables from a .env file, keeping configurations clean and secure. Together, these tools create a strong system for handling background tasks like email sending in Python projects.
In this blog post, we'll show how to use Celery, Pipenv, and Python-dotenv to handle email sending tasks via SMTP in your Python projects.
Setting Up Your Project
Let's start by creating a new Python project with Pipenv and then add Celery to it.
Install Pipenv:
pip install pipenv
Create a new project directory and initialize Pipenv:
mkdir celery-email
cd celery-email
pipenv install
Install Celery, Redis (as a message broker), and Python-dotenv:
pipenv install celery redis python-dotenv
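You can confirm everything installed into the virtual environment by asking Celery for its version:
pipenv run celery --version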
Create a .env file in your project directory to store environment variables:
# Celery configuration
REDIS_URL=redis://localhost:6379/0

# Email configuration
SMTP_SERVER=smtp.mailtrap.io
SMTP_PORT=2525
SMTP_USERNAME=username
SMTP_PASSWORD=password
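Since the .env file holds real credentials, make sure it never ends up in version control. Assuming the project lives in a Git repository, you can add it to .gitignore:
echo ".env" >> .gitignore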
Configuring Celery
Create a celeryconfig.py file in your project directory with the following content:
from dotenv import load_dotenv
import os
# Load environment variables from .env file
load_dotenv()
# Celery configuration
broker_url = os.getenv('REDIS_URL')
result_backend = os.getenv('REDIS_URL')
broker_connection_retry_on_startup = True # Ensure broker connection retry on startup
In this setup, we use Python-dotenv to load the Celery broker and backend URLs from the .env file, keeping sensitive information out of our source code. The Celery app in tasks.py below picks these settings up via config_from_object.
Creating an Email Sending Task
Create a tasks.py file in your project directory:
from celery import Celery
from celery.utils.log import get_task_logger
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from dotenv import load_dotenv
import smtplib
import os

# Load environment variables from .env so the SMTP settings are available
load_dotenv()

logger = get_task_logger(__name__)

# Initialize the Celery app and apply the broker/backend settings from celeryconfig.py
app = Celery('tasks')
app.config_from_object('celeryconfig')
@app.task
def send_email(recipient_email):
    # Build the email message
    msg = MIMEMultipart()
    msg['From'] = os.getenv('SMTP_USERNAME')
    msg['To'] = recipient_email
    msg['Subject'] = 'Test Email from Celery'
    body = 'This is a test email sent asynchronously using Celery.'
    msg.attach(MIMEText(body, 'plain'))

    # SMTP server configuration
    smtp_server = os.getenv('SMTP_SERVER')
    smtp_port = int(os.getenv('SMTP_PORT', 587))  # Convert to integer; default to 587 if not set
    smtp_username = os.getenv('SMTP_USERNAME')
    smtp_password = os.getenv('SMTP_PASSWORD')

    try:
        # The with-block closes the SMTP connection even if an error occurs,
        # so no separate finally: server.quit() is needed
        with smtplib.SMTP(smtp_server, smtp_port) as server:
            server.starttls()
            server.login(smtp_username, smtp_password)
            server.sendmail(smtp_username, recipient_email, msg.as_string())
        logger.info(f"Email sent successfully to {recipient_email}")
    except Exception as e:
        logger.error(f"Failed to send email to {recipient_email}: {str(e)}")
This task sends an email asynchronously using Celery and smtplib.
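In practice, SMTP delivery can fail transiently (network hiccups, provider throttling), so you may want Celery to retry failed sends automatically. Below is a minimal sketch of such a variant, not part of the tutorial's code: it assumes Celery 4+ (for the autoretry_for and retry_backoff options) and deliberately omits the try/except, since an uncaught smtplib.SMTPException is what triggers the retry.
import smtplib

# Hypothetical retrying variant of send_email; Celery re-queues it on SMTPException
@app.task(autoretry_for=(smtplib.SMTPException,), retry_backoff=True, max_retries=3)
def send_email_with_retry(recipient_email):
    with smtplib.SMTP(os.getenv('SMTP_SERVER'), int(os.getenv('SMTP_PORT', 587))) as server:
        server.starttls()
        server.login(os.getenv('SMTP_USERNAME'), os.getenv('SMTP_PASSWORD'))
        # Raw RFC 822 message: a Subject header, a blank line, then the body
        server.sendmail(os.getenv('SMTP_USERNAME'), recipient_email,
                        'Subject: Test Email from Celery\n\nRetry-enabled test email.')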
Running Celery
To start the Celery worker, run the following command in your project directory:
pipenv run celery -A tasks worker --loglevel=info
This command starts a Celery worker that listens for tasks defined in tasks.py.
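Note that the worker needs the Redis broker from your .env to be running. If Redis is installed locally, you can start it and check that it responds before launching the worker (commands assume a default local Redis installation):
redis-server --daemonize yes
redis-cli ping   # should reply with PONG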
Using the Email Sending Task
You can now use the defined task in your Python scripts or web applications. Here is an example of how to call this task:
from tasks import send_email
# List of recipients
recipient_emails = ['rajendra@example.com', 'manish@example.com']
# Triggering the Celery task asynchronously
for recipient in recipient_emails:
    print(f"Queueing email for {recipient}")
    result = send_email.delay(recipient)
    print(f"Email task queued for {recipient}: {result}")
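delay() returns an AsyncResult handle, so the calling code can check on a task after queueing it. A brief example using the standard AsyncResult API (the timeout value here is arbitrary):
print(result.id)       # unique task id assigned by Celery
print(result.ready())  # True once the task has finished

# Block until the task completes; requires the result backend configured earlier
result.get(timeout=30)
print(result.status)   # e.g. 'SUCCESS' or 'FAILURE'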
Conclusion
Integrating Celery with Pipenv and Python-dotenv offers a robust solution for managing and optimizing Python email tasks. Besides email processing, Celery is excellent for many other asynchronous tasks. It can handle background job scheduling, periodic tasks, and real-time data processing. Celery's ability to distribute work across multiple workers can enhance the performance of data-heavy applications, web scraping, and machine learning model training.
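For example, the periodic tasks mentioned above are driven by Celery beat. Here is a minimal sketch reusing this post's app and send_email task; the daily 8:00 schedule and the 'daily-digest' name are purely illustrative:
from celery.schedules import crontab

app.conf.beat_schedule = {
    'daily-digest': {
        'task': 'tasks.send_email',
        'schedule': crontab(hour=8, minute=0),  # every day at 08:00
        'args': ('rajendra@example.com',),
    },
}
# Start the scheduler alongside the worker:
# pipenv run celery -A tasks beat --loglevel=info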
If you have any questions or need further clarification, feel free to reach out. Happy coding!