
Data Engineering: Data Lake vs Data Warehouse (SQL ETL Coding)

Premiered May 6, 2026
16:16

This video demonstrates a hands-on SQL ETL workflow using a DoorDash delivery scenario. The objective is to move from raw operational data to an analytics-ready dataset suitable for reporting and business intelligence.

The workflow follows a structured data engineering process:
1. Create a raw operational table to simulate real-time delivery data
2. Inspect the dataset to validate structure and content
3. Insert sample delivery records, including both completed and cancelled orders
4. Transform raw timestamp data into actionable performance metrics using SQL
5. Generate an analytics-ready table to support business insights

Key transformations include:
- Converting timestamps into numeric values using julianday
- Calculating delivery performance metrics such as preparation time, driver wait time, route time, and total delivery time
- Converting time differences into minutes for interpretability
- Applying rounding for standardized reporting outputs

This process reflects a simplified ETL pipeline:
- Extract: raw delivery data
- Transform: computation of performance metrics
- Load: creation of a structured analytics table

The final output enables analysis of operational efficiency and identification of delivery delay patterns across cities. The demonstration highlights both technical SQL execution and the ability to translate raw data into meaningful business metrics.
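The pipeline described above can be sketched end to end with SQLite, whose julianday function returns timestamps as fractional days (so a difference times 1440 yields minutes). All table and column names below are illustrative assumptions, not taken from the video:

```python
# Minimal sketch of the described ETL flow, assuming SQLite and
# hypothetical table/column names (raw_deliveries, delivery_metrics, etc.).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw operational table simulating real-time delivery data.
cur.execute("""
    CREATE TABLE raw_deliveries (
        order_id     INTEGER PRIMARY KEY,
        city         TEXT,
        status       TEXT,   -- 'completed' or 'cancelled'
        order_time   TEXT,   -- customer places order
        ready_time   TEXT,   -- restaurant finishes preparation
        pickup_time  TEXT,   -- driver picks up
        dropoff_time TEXT    -- driver delivers
    )
""")
cur.executemany(
    "INSERT INTO raw_deliveries VALUES (?, ?, ?, ?, ?, ?, ?)",
    [
        (1, "Austin",  "completed", "2026-05-06 12:00:00", "2026-05-06 12:15:00",
         "2026-05-06 12:20:00", "2026-05-06 12:45:00"),
        (2, "Chicago", "completed", "2026-05-06 18:05:00", "2026-05-06 18:25:00",
         "2026-05-06 18:30:00", "2026-05-06 19:10:00"),
        (3, "Austin",  "cancelled", "2026-05-06 13:00:00", None, None, None),
    ],
)

# Transform + Load: julianday differences are in days, so multiply by
# 1440 (24 * 60) for minutes, then round for standardized reporting.
cur.execute("""
    CREATE TABLE delivery_metrics AS
    SELECT
        order_id,
        city,
        ROUND((julianday(ready_time)   - julianday(order_time))  * 1440, 1) AS prep_minutes,
        ROUND((julianday(pickup_time)  - julianday(ready_time))  * 1440, 1) AS driver_wait_minutes,
        ROUND((julianday(dropoff_time) - julianday(pickup_time)) * 1440, 1) AS route_minutes,
        ROUND((julianday(dropoff_time) - julianday(order_time))  * 1440, 1) AS total_minutes
    FROM raw_deliveries
    WHERE status = 'completed'
""")

for row in cur.execute("SELECT * FROM delivery_metrics ORDER BY order_id"):
    print(row)
```

Filtering on status keeps cancelled orders (with NULL timestamps) out of the analytics table; a per-city aggregate over delivery_metrics would then surface the delay patterns the video discusses.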

