Professional Training

Building ETL and Data Pipelines with Bash, Airflow and Kafka

edX, Online
Length
5 weeks
Next course start
Start anytime
Course delivery
Self-Paced Online

Course description

Well-designed and automated data pipelines and ETL processes are the foundation of a successful Business Intelligence platform. Defining your data workflows, pipelines, and processes early in the platform design ensures the right raw data is collected, transformed, loaded into the desired storage layers, and made available for processing and analysis when required.

This course is designed to provide you with the critical knowledge and skills that Data Engineers and Data Warehousing specialists need to create and manage ETL, ELT, and data pipeline processes.

Upon completing this course you’ll have gained a solid understanding of Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes; practiced extracting data, transforming data, and loading transformed data into a staging area; created an ETL data pipeline using Bash shell scripting; built a batch ETL workflow using Apache Airflow; and built a streaming data pipeline using Apache Kafka.
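
To make the extract, transform, and load stages concrete, here is a minimal, self-contained sketch in Python (the course itself implements this step with Bash shell scripting). The source file "sales.csv", the field names, and the staging table are hypothetical examples, not course materials.

    # Minimal ETL sketch: extract rows from a CSV source, transform them,
    # and load the result into a staging table. All names are illustrative.
    import csv
    import sqlite3

    def extract(path):
        """Extract: read raw records from a source CSV file."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Transform: normalise fields and drop incomplete records."""
        return [
            {"name": r["name"].strip().title(), "amount": float(r["amount"])}
            for r in rows
            if r.get("name") and r.get("amount")
        ]

    def load(rows, db_path="staging.db"):
        """Load: append the transformed rows to a staging table."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS staging (name TEXT, amount REAL)")
        con.executemany(
            "INSERT INTO staging (name, amount) VALUES (:name, :amount)", rows
        )
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("sales.csv")))  # "sales.csv" is a placeholder source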

You’ll gain hands-on experience through practice labs throughout the course and work on a project, inspired by real-world scenarios, in which you build data pipelines with several technologies. The project can be added to your portfolio to demonstrate your ability to perform as a Data Engineer.

This course requires prior experience working with datasets, SQL, relational databases, and Bash shell scripts.

Upcoming start dates

1 start date available

Start anytime

  • Self-Paced Online
  • Online
  • English

Suitability - Who should attend?

Prerequisites

Computer and IT literacy.

Outcome / Qualification etc.

What you'll learn

  • Describe and differentiate between Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes.
  • Define data pipeline components, processes, tools, and technologies.
  • Create batch ETL processes using Apache Airflow and streaming data pipelines using Apache Kafka (see the illustrative sketches after this list).
  • Demonstrate understanding of how shell-scripting is used to implement an ETL pipeline.
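
The two sketches below are illustrative only and are not course materials. First, a minimal batch ETL workflow expressed as an Airflow DAG, assuming an Apache Airflow 2.x installation; the DAG id, schedule, and task callables are hypothetical placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("extract raw data")       # placeholder for the real extract step

    def transform():
        print("transform staged data")  # placeholder for the real transform step

    def load():
        print("load into warehouse")    # placeholder for the real load step

    with DAG(
        dag_id="example_batch_etl",       # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",       # run the batch once per day
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load  # extract, then transform, then load

Second, a minimal streaming sketch, assuming the kafka-python client library and a Kafka broker reachable on localhost:9092; the topic name and the sample event are placeholders.

    from kafka import KafkaProducer, KafkaConsumer

    # Producer: publish an event to a (placeholder) topic.
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("events", b'{"order_id": 1, "amount": 9.99}')
    producer.flush()

    # Consumer: read events from the same topic as they arrive.
    consumer = KafkaConsumer(
        "events", bootstrap_servers="localhost:9092", auto_offset_reset="earliest"
    )
    for message in consumer:
        print(message.value)  # process each event in the stream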

Course delivery details

This course is offered through IBM, a partner institution of edX.

2–4 hours per week

Expenses

  • Verified Track - $99
  • Audit Track - Free