
If you are reading this blog, I assume you are already familiar with DAG creation in Apache Airflow. If not, please visit “Dag in Apache Airflow” first.
This blog explains:
– Sending email notifications using the EmailOperator.
– Sending an alert email when a DAG or task fails.
Here, we will schedule a DAG that consists of 3 tasks. Task_1 and Task_2 use the BashOperator, while the sending_email task uses the EmailOperator.
After Task_1 executes successfully, the sending_email task will send an email. Finally, an alert email will be sent when Task_2 fails.
Let’s start step by step…
1: Generate Google App Password
This generates a unique 16-character password that Google authorizes an app to use in place of your original password when two-factor authentication is enabled.
Steps for generating a Google App Password:
- Visit the App passwords page. After logging in, you will see the window below.
- Click Select app and choose the app you’re using.
- Click Select device and choose the device you’re using.
- Select Generate.
- The app password is generated (a 16-character code in the yellow bar), as shown in the image below.
- Select Done.
Note: Once you are finished, you won’t see that app password code again, so note it down somewhere safe.
2: Edit airflow.cfg
The airflow.cfg file is available in the airflow directory. Update the [email] and [smtp] sections as shown here:
[email]
email_backend = airflow.utils.email.send_email_smtp
[smtp]
smtp_host = smtp.googlemail.com
smtp_starttls = True
smtp_ssl = False
smtp_user = YOUR_EMAIL_ADDRESS
smtp_password = 16_DIGIT_APP_PASSWORD
smtp_port = 587
smtp_mail_from = YOUR_EMAIL_ADDRESS
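Before restarting Airflow, you can quickly check that the app password and SMTP settings are accepted by Google. The snippet below is a minimal sketch using Python’s standard smtplib, independent of Airflow; replace the placeholder address and app password with the same values you put in airflow.cfg.

import smtplib
from email.message import EmailMessage

# Placeholders -- use the same values as in the [smtp] section of airflow.cfg
SMTP_USER = "YOUR_EMAIL_ADDRESS"
APP_PASSWORD = "16_DIGIT_APP_PASSWORD"

msg = EmailMessage()
msg["Subject"] = "SMTP test"
msg["From"] = SMTP_USER
msg["To"] = SMTP_USER
msg.set_content("If you received this, the SMTP settings are correct.")

# Same host and port as airflow.cfg: STARTTLS on port 587
with smtplib.SMTP("smtp.googlemail.com", 587) as server:
    server.starttls()
    server.login(SMTP_USER, APP_PASSWORD)
    server.send_message(msg)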
3: Importing modules
from datetime import datetime
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.email_operator import EmailOperator
4: Default Arguments
Here, ‘email_on_failure‘ is set to True, so an alert email will be sent automatically whenever a task fails.
default_args = {
    "owner": "Kuldeep",
    "start_date": datetime(2022, 2, 16),
    "email": ["kuldeep.swaroop@knoldus.com"],
    "email_on_failure": True,
}
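Optionally, the same dictionary can also ask Airflow to retry a failed task and email you on each retry; ‘retries‘, ‘retry_delay‘, and ‘email_on_retry‘ are standard default_args keys. This is only a sketch and is not part of the example above:

from datetime import timedelta

default_args = {
    "owner": "Kuldeep",
    "start_date": datetime(2022, 2, 16),
    "email": ["kuldeep.swaroop@knoldus.com"],
    "email_on_failure": True,
    # Optional additions (not used in this example): retry once after 5 minutes
    "email_on_retry": True,
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}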
5: Instantiate a DAG
with DAG(
    dag_id="Sending_mail",
    schedule_interval="@once",
    default_args=default_args,
) as dag:
6: Set the Tasks
Here, the EmailOperator is used to perform the task of sending an email. Task_1 is a plain BashOperator; it appears in the complete example in the Code section below.
    sending_email = EmailOperator(
        task_id="sending_email",
        to="kuldeep.swaroop@knoldus.com",
        subject="Airflow Alert !!!",
        html_content="""<h1>Testing Email using Airflow</h1>""",
    )

    task_2 = BashOperator(
        task_id="task_2",
        bash_command="cd temp_folder",  # fails because temp_folder does not exist
    )
7: Setting up Dependencies
    task_1 >> sending_email >> task_2
Code
The attached screenshot shows the complete example.
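If you prefer copy-paste over the screenshot, here is a minimal sketch of the whole DAG. The bash_command of task_1 is not shown above, so a simple echo is assumed here:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.email_operator import EmailOperator

default_args = {
    "owner": "Kuldeep",
    "start_date": datetime(2022, 2, 16),
    "email": ["kuldeep.swaroop@knoldus.com"],
    "email_on_failure": True,
}

with DAG(
    dag_id="Sending_mail",
    schedule_interval="@once",
    default_args=default_args,
) as dag:

    # Assumed command -- the real task_1 is only visible in the screenshot
    task_1 = BashOperator(
        task_id="task_1",
        bash_command="echo 'Task_1 executed'",
    )

    # Sends a notification email after task_1 succeeds
    sending_email = EmailOperator(
        task_id="sending_email",
        to="kuldeep.swaroop@knoldus.com",
        subject="Airflow Alert !!!",
        html_content="""<h1>Testing Email using Airflow</h1>""",
    )

    # Fails intentionally because temp_folder does not exist,
    # which triggers the email_on_failure alert
    task_2 = BashOperator(
        task_id="task_2",
        bash_command="cd temp_folder",
    )

    task_1 >> sending_email >> task_2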
Result
- Task_1 succeeds
- Sending_email task succeeds
- Task_2 fails because there is no folder named temp_folder
Viewing DAG In Airflow
After running the code, open localhost:8080 in your browser to view the Airflow UI.
Click on your DAG
After clicking, you will get a detailed view of the tasks.
Tree View
Graph View
Received Email after successful execution of sending_email task
Received Alert Email after the failure of task_2
Note: If you are facing any issue related to the EmailOperator, try updating the docker-compose.yml file and setting:
volumes:
  - ./dags:/usr/local/airflow/dags
  - ./config/airflow.cfg:/usr/local/airflow/airflow.cfg
I hope you are now able to send emails in Apache Airflow. Stay tuned for the next part.
Read the Apache Airflow documentation for more details.
To gain more information, visit Knoldus Blogs.