Coursera – BI Foundations with SQL, ETL and Data Warehousing Specialization

English | Tutorial | Size: 707.89 MB

A springboard to BI analytics success. Develop hands-on skills for building data pipelines, warehouses, reports, and dashboards.

What you’ll learn

• Write SQL queries to work with relational databases, including CREATE TABLE, SELECT, INSERT, UPDATE, DELETE, ORDER BY, JOIN, and functions

• Execute commonly used Linux commands; automate Extract, Transform and Load (ETL) jobs and data pipelines using Bash scripts, Apache Airflow, and Apache Kafka

• Design data warehouses using star and snowflake schemas; load and verify data in staging areas; build cubes, rollups, and materialized views/tables

• Analyze warehouse data using interactive reports and dashboards built with BI tools such as Cognos Analytics
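The SQL skills listed above can be sketched with a minimal example. This uses Python's built-in sqlite3 module as a stand-in for the relational databases covered in the courses; the tables and data are invented for illustration:

```python
import sqlite3

# In-memory database; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# CREATE TABLE
cur.execute("CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE emp (
    id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER,
    FOREIGN KEY (dept_id) REFERENCES dept(id))""")

# INSERT
cur.executemany("INSERT INTO dept VALUES (?, ?)", [(1, "BI"), (2, "ETL")])
cur.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                [(1, "Ana", 1), (2, "Raj", 2), (3, "Li", 1), (4, "Sam", 1)])

# UPDATE and DELETE
cur.execute("UPDATE emp SET dept_id = 2 WHERE name = 'Li'")
cur.execute("DELETE FROM emp WHERE name = 'Raj'")

# SELECT with a JOIN, an aggregate function, and ORDER BY
cur.execute("""
    SELECT d.name, COUNT(e.id) AS headcount
    FROM dept d LEFT JOIN emp e ON e.dept_id = d.id
    GROUP BY d.name
    ORDER BY headcount DESC""")
rows = cur.fetchall()
print(rows)  # [('BI', 2), ('ETL', 1)]
```

The same statements run, with minor dialect differences, against the PostgreSQL, MySQL, and DB2 databases used in the labs.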

Skills you’ll gain

• Python Programming
• Cloud Databases
• Relational Database Management System (RDBMS)
• Jupyter notebooks
• Data Warehousing
• Business Intelligence (BI)
• Cognos Analytics
• Cube and Rollup
• Star and Snowflake Schema
• Shell Script
• Bash (Unix Shell)
• Extract Transform and Load (ETL)
• Linux
• Linux Commands
• Data Engineer
• Apache Kafka
• Apache Airflow
• Data Pipelines

Specialization – 4 course series

Professionals with SQL, ETL, Enterprise Data Warehousing (EDW), Business Intelligence (BI) and Data Analysis skills are in great demand. This Specialization is designed to provide career relevant knowledge and skills for anyone wanting to pursue a job role in domains such as Data Engineering, Data Management, BI or Data Analytics.

The program consists of four online courses. In the first course you learn the basics of SQL and how to query relational databases with this powerful language. Next you learn to use essential Linux commands and create basic shell scripts. You then learn to build and automate ETL, ELT, and data pipelines using Bash scripts, Apache Airflow, and Apache Kafka. In the final course you learn about Data Lakes and Data Marts, and work with Data Warehouses. You also create interactive reports and dashboards to derive insights from data in your warehouse.
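The extract-transform-load pattern that the middle courses automate can be illustrated with a minimal Python sketch (the courses themselves use Bash, Airflow, and Kafka for this; the data, field names, and conversion rule below are invented):

```python
import csv
import io
import sqlite3

# Extract: in a real pipeline this would come from a file, API, or Kafka topic.
raw = "order_id,amount\n1,19.99\n2,5.00\n3,42.50\n"

# Transform: parse the CSV and convert dollar amounts to integer cents
# (an invented cleansing rule, typical of a transform step).
reader = csv.DictReader(io.StringIO(raw))
rows = [(int(r["order_id"]), round(float(r["amount"]) * 100)) for r in reader]

# Load: write the transformed rows into a staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id INTEGER, amount_cents INTEGER)")
conn.executemany("INSERT INTO staging_orders VALUES (?, ?)", rows)

total = conn.execute("SELECT SUM(amount_cents) FROM staging_orders").fetchone()[0]
print(total)  # 6749
```

In the specialization, each of these three steps becomes a task scheduled by cron or wired into an Airflow DAG, so failures can be retried and monitored per step.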

Note that this specialization places significant emphasis on hands-on practice with real tools used by data professionals. Every course has numerous hands-on labs as well as a course project. While some prior programming experience is beneficial, it is not required. The only prerequisites for this specialization are basic computer and data literacy, and a passion for self-directed online learning.

1. Hands-on Introduction to Linux Commands and Shell Scripting
2. Databases and SQL for Data Science with Python
3. ETL and Data Pipelines with Shell, Airflow and Kafka
4. Getting Started with Data Warehousing and BI Analytics

4 Courses Total, 337 Files, 57 Folders

Applied Learning Project

Each course provides extensive practice through hands-on labs and projects in cloud-based environments with real tools. Hands-on exercises include: running Linux commands and pipes, creating shell scripts, scheduling jobs using cron, building ETL and data pipelines, creating and monitoring Airflow DAGs, working with streaming data using Kafka, designing data warehouses with star and snowflake schemas, verifying data quality, loading staging and production warehouses, writing SQL queries and joins with PostgreSQL, MySQL, and DB2 databases, developing cubes, rollups, and materialized views/tables, creating interactive reports and dashboards, and analyzing warehouse data using BI tools like Cognos Analytics.
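The star-schema and rollup exercises mentioned above can be sketched in miniature. This example, again using sqlite3 with invented table and column names, shows one fact table joined to two dimension tables, with a rollup row appended via UNION ALL (SQLite lacks the GROUP BY ROLLUP syntax that warehouse databases provide):

```python
import sqlite3

# Toy star schema: a central fact table referencing two dimension tables.
# Names and values are illustrative, not taken from the course labs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Toys');
INSERT INTO dim_date    VALUES (10, 2023), (11, 2024);
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 11, 25.0);
""")

# Totals per (year, category), plus a grand-total row where both
# dimensions are NULL -- a hand-built stand-in for GROUP BY ROLLUP.
rollup = conn.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON f.product_id = p.product_id
    JOIN dim_date d    ON f.date_id    = d.date_id
    GROUP BY d.year, p.category
    UNION ALL
    SELECT NULL, NULL, SUM(amount) FROM fact_sales
""").fetchall()
print(rollup)
```

A snowflake schema extends this by normalizing the dimension tables themselves (e.g. splitting category out of dim_product into its own table), trading simpler joins for less redundancy.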
