From the course: Apache Airflow Essential Training
Using the PostgreSQL operator - Apache Airflow Tutorial
Using the PostgreSQL operator
- [Instructor] We are now ready to run our very first DAG connecting to Postgres. All of the SQL commands that I run, I'm going to place within a separate sub-folder called sql_statements under my AIRFLOW install directory. Our SQL code will thus be kept separate from our DAG definition. This first SQL command creates a table called customers if it doesn't already exist. I also have a create table customer_purchases SQL command, which creates a table called customer_purchases that references the customers table's id as a foreign key. You can see this on line five. My DAG definition and tasks are written in this postgres_pipeline.py file. On line five, I have an import for the Postgres operator, which is what you need to connect to Postgres and run SQL commands in that database. I've defined the DAG here using the with statement. The one new parameter that I have specified here is the template_searchpath. This points…
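The layout the instructor describes can be sketched as follows. This is a hedged illustration, not the course's exercise files: the column names, task id, connection id (postgres_default), and the /opt/airflow path are all assumptions. The DAG source is kept as a string here so the sketch runs without an Airflow installation.

```python
# Sketch of the project layout: SQL files live in a sql_statements/
# sub-folder, separate from the DAG definition file.
import pathlib
import tempfile

airflow_home = pathlib.Path(tempfile.mkdtemp())  # stands in for the AIRFLOW install directory
sql_dir = airflow_home / "sql_statements"
sql_dir.mkdir()

# Create the customers table only if it doesn't already exist.
(sql_dir / "create_table_customers.sql").write_text(
    """CREATE TABLE IF NOT EXISTS customers (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100)
);
"""
)

# customer_purchases references the customers table's id as a foreign key.
(sql_dir / "create_table_customer_purchases.sql").write_text(
    """CREATE TABLE IF NOT EXISTS customer_purchases (
    id SERIAL PRIMARY KEY,
    customer_id INTEGER REFERENCES customers (id),
    amount NUMERIC(10, 2)
);
"""
)

# Roughly what postgres_pipeline.py could look like. template_searchpath
# tells Airflow where to resolve the .sql filenames passed to tasks.
dag_source = '''
from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="postgres_pipeline",
    template_searchpath="/opt/airflow/sql_statements",
) as dag:
    create_customers = PostgresOperator(
        task_id="create_table_customers",
        postgres_conn_id="postgres_default",
        sql="create_table_customers.sql",
    )
'''

print(sorted(p.name for p in sql_dir.iterdir()))
```

Because template_searchpath points at the sql_statements folder, each task can name its SQL file directly (sql="create_table_customers.sql") instead of embedding SQL strings in the DAG file.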
Contents
- Installing PostgreSQL on macOS (2m 28s)
- Installing PostgreSQL on WSL (2m 27s)
- Connecting to PostgreSQL (4m 19s)
- Using the PostgreSQL operator (3m 55s)
- Performing PostgreSQL insert operations (2m 44s)
- Performing PostgreSQL join operations (3m 2s)
- A complete end-to-end pipeline with PostgreSQL (3m 48s)
- Configuring PostgreSQL as a metadata database and using the LocalExecutor (6m 6s)