Building a Modern Data Lakehouse on Azure Learning Path
As a Data Engineer, you need to master data ingestion, modeling, and serving in a modern data platform. You want full control over every aspect of constructing a data lakehouse to ignite a data-driven community.
At element61, we provide a comprehensive learning path that combines multiple training courses to equip individuals with the essential Data Engineer skills required to construct modern Data Lakehouses on Microsoft Azure. Completing the prerequisite courses before embarking on the individual ones can prove immensely beneficial: they impart the essential skills that pave the way for a successful journey.
In this data engineering learning path, we offer two options for the final course, depending on the leading data engineering platform in your organization’s architecture:
- The Microsoft Fabric course focuses on all skills needed for lakehouse or warehouse development. Topics include:
  - What is a data lakehouse, and how is it different from a classic data warehouse?
  - What are the building blocks of Microsoft Fabric?
  - What is OneLake?
  - What are the personas/experiences and the different workloads?
  - What is Direct Lake mode in Power BI?
  - How can I set up & design ingestion scripts in a Modern Data Platform with Azure Data Factory and Microsoft Fabric?
  - How can I leverage lakehouse best practices in data transformations, including the different transformation engines and workspace management?
  - What are the options for performance tuning in Microsoft Fabric, including cluster design and Delta Lake?
- The Azure Databricks course focuses on professionals who use Python, Scala, or Spark SQL to write data jobs & transformations (incl. data lakehousing). Topics include:
  - What are the differences between a lakehouse and a traditional data warehouse?
  - What is the medallion structure in Databricks?
  - How do I choose between Databricks SQL and Python development?
  - What is Delta Lake? And what is the role of the transaction log and the Parquet file format?
  - What is Unity Catalog?
  - How do Azure Data Factory, Databricks notebooks, and Azure Data Lake Storage come together in a modern data platform?
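To give a flavor of the medallion (bronze/silver/gold) pattern covered in the Databricks course, here is a minimal, illustrative sketch in plain Python. In a real lakehouse these layers would be Delta tables written with Spark; the sample records, column names, and cleaning rules below are assumptions chosen purely for illustration.

```python
# Minimal sketch of the medallion (bronze/silver/gold) pattern in plain Python.
# In a real Databricks lakehouse each layer would be a Delta table written with Spark;
# the raw records and cleaning rules here are illustrative assumptions only.

# Bronze: raw ingested records, kept as-is (including duplicates and bad rows)
bronze = [
    {"order_id": "1", "amount": "100.0", "country": "BE"},
    {"order_id": "1", "amount": "100.0", "country": "BE"},          # duplicate
    {"order_id": "2", "amount": "not-a-number", "country": "NL"},   # bad value
    {"order_id": "3", "amount": "250.5", "country": "BE"},
]

def to_silver(rows):
    """Silver: cleaned, deduplicated records with typed columns."""
    seen, silver = set(), []
    for row in rows:
        if row["order_id"] in seen:
            continue  # drop duplicates
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # drop rows that fail type validation
        seen.add(row["order_id"])
        silver.append({"order_id": row["order_id"],
                       "amount": amount,
                       "country": row["country"]})
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, e.g. revenue per country."""
    revenue = {}
    for row in rows:
        revenue[row["country"]] = revenue.get(row["country"], 0.0) + row["amount"]
    return revenue

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'BE': 350.5}
```

Each layer refines the previous one: bronze preserves the raw ingest, silver enforces quality and types, and gold serves business-ready aggregates.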
This learning path starts twice per year (typically in January & September) and runs over a period of 4 months, typically as follows:
- Month 1: Dimensional Data Modeling training (2 days)
- Month 2: Azure Fundamentals for Data Professionals (2 days)
- Month 2 or 4: Azure DevOps Basics (1 day)
- Month 3: Data Lakehouse development with Databricks or Data Lakehouse development with Fabric (1 day)
- € 3.500 per participant for 6 full days of training
Interested in knowing more?
For more information, please reach out to email@example.com and we can give you more details and practical information.
This training path can also be hosted in-company. For more information, reach out to our academy address.
The full element61 Training schedule (incl. when which training runs) can be found here.