14 Days of Code: The Complete Data Engineering in Fabric
Published 9/2025
Duration: 6h 24m | .MP4 1280x720 30 fps(r) | AAC, 44100 Hz, 2ch | 2.73 GB
Genre: eLearning | Language: English
Master Fabric Data Engineering in Just 14 Days—From Notebooks to Delta Lake
What you'll learn
- Master Microsoft Fabric Notebooks: Learn to schedule notebooks, manage environments, and handle export/import operations for real-time data workflows.
- Orchestrate Data Pipelines: Use parameters, notebookutils, and master notebooks to trigger parallel executions and integrate with KeyVaults and lakehouse context.
- Perform Advanced Data Ingestion & Transformation: Apply PySpark and Pandas to read complex formats like multiline JSON and implement transformations using select.
- Design and Optimize Delta Lake Architectures: Implement features like Time Travel, Restore, Partitioning, Z-Ordering, and SCD Type 1 & 2 with audit logging.
- Execute End-to-End Projects: Apply all concepts in full-scale, scenario-driven projects including lakehouse setup, pipeline orchestration, and log storage in database tables.
Requirements
- No prior Python experience required—Python will be taught hands-on within the course.
- Basic understanding of data concepts like tables, joins, and file formats (CSV, JSON)
- Familiarity with cloud platforms or data engineering tools is helpful but not mandatory.
- Curiosity and commitment to learning through real-world, project-based scenarios.
Description
Ready to master data engineering with Microsoft Fabric in just 14 days? This hands-on course is designed for learners who want to build real-world skills—not just watch theory-heavy videos. Whether you're a beginner or a working professional, you'll learn how to use Fabric Notebooks, Python, PySpark, and Delta Lake through guided projects and practical assignments.
You’ll start by exploring Fabric Notebook features like scheduling, environment setup, and orchestration. Then, dive into data ingestion, transformation, and lakehouse architecture using PySpark and Pandas. From there, you’ll unlock the full power of Delta Lake—Time Travel, Z-Ordering, SCD implementations, and audit logging. Finally, you’ll apply everything in two full-scale projects that simulate real production workflows.
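To give a flavor of the ingestion work described above, here is a minimal sketch of reading a multiline JSON document with Pandas (the sample data and names are illustrative, not taken from the course; in PySpark the equivalent would use `spark.read.option("multiLine", True).json(path)`):

```python
import io
import pandas as pd

# A small multiline JSON document: one array of records spanning several
# lines -- the kind of file Spark needs the multiLine option to parse.
raw = """[
  {"id": 1, "city": "Seattle"},
  {"id": 2, "city": "Austin"}
]"""

# Pandas parses a multiline JSON array of records directly into a DataFrame.
df = pd.read_json(io.StringIO(raw))
print(df)
```

The same records read line-delimited (one object per line) would instead use `lines=True` in Pandas or Spark's default JSON reader.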
No prior Python experience? No problem. Python is taught hands-on within the course, so you'll learn by doing—step by step.
What You'll Learn
Master Fabric Notebooks: scheduling, environment setup, and orchestration
Perform advanced data ingestion and transformation using PySpark and Pandas
Implement Delta Lake features including Time Travel, Partitioning, and SCD Types
Orchestrate notebooks with pipelines and store execution logs in database tables
Execute two end-to-end projects with real-world data engineering scenarios
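As a taste of the SCD patterns listed above, here is a minimal sketch of an SCD Type 1 upsert using Pandas (the table and column names are hypothetical; the course implements this against Delta Lake, where the idiomatic tool is a `MERGE INTO ... WHEN MATCHED THEN UPDATE WHEN NOT MATCHED THEN INSERT` statement):

```python
import pandas as pd

# Hypothetical dimension table and an incoming batch of changes.
dim = pd.DataFrame({"cust_id": [1, 2], "email": ["a@x.com", "b@x.com"]})
updates = pd.DataFrame({"cust_id": [2, 3], "email": ["b@new.com", "c@x.com"]})

# SCD Type 1 = overwrite in place: matched keys take the newest values,
# unmatched keys are inserted; no history is kept.
merged = (
    pd.concat([dim, updates])
      .drop_duplicates(subset="cust_id", keep="last")
      .sort_values("cust_id")
      .reset_index(drop=True)
)
print(merged)
```

SCD Type 2 differs in that matched rows are closed out (e.g. with an end date or current-flag column) and a new row version is inserted, so history survives.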
Who This Course Is For
Aspiring data engineers looking for job-ready skills
Professionals wanting to upskill in Microsoft Fabric and Delta Lake
Students and freshers seeking hands-on, project-based learning
Trainers and freelancers building real-time data solutions
Who this course is for:
- Aspiring data engineers who want to master Microsoft Fabric using real-world scenarios and hands-on notebooks.
- Working professionals looking to upskill in PySpark, Delta Lake, and Fabric orchestration without relying on theory-heavy content.
- Students and freshers seeking job-ready skills through guided projects and practical assignments.
- Trainers, freelancers, and educators who want to understand how to build and deliver Fabric-based data engineering solutions.
- Anyone curious about modern data engineering workflows and eager to learn through authentic, project-based learning.