


GlobalMart, a leading e-commerce company in India, serves millions of customers through its online platform. Its data team, previously relying on Azure Synapse Analytics, struggled with scalability, limited flexibility in managing growing data volumes, and inefficient data sharing across teams. As sales data continued to expand, these limitations began to slow analytics and decision-making.
Without a modern, scalable data architecture, GlobalMart risks delayed insights, poor query performance, and increased operational costs. The business needs a solution that can handle massive datasets efficiently, enable seamless data sharing, and support complex analytical queries for faster, data-driven decisions.
You'll act as a data engineer at GlobalMart, tasked with migrating raw sales data from Azure Blob Storage into Snowflake using the medallion architecture. Working with 10 datasets, you'll build an end-to-end data pipeline that transforms raw CSV files into an analytics-ready dimensional model. Using SQL and Snowflake's data loading capabilities, you'll implement a three-layer architecture that progressively refines data quality and structure.
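To make the ingestion step concrete, here is a minimal Snowflake SQL sketch of landing one CSV into a Bronze table. The container URL, SAS token placeholder, and table/column names (bronze.orders_raw, order_id, etc.) are illustrative assumptions, not GlobalMart's actual schema:

```sql
-- Bronze layer: land raw CSVs from Azure Blob Storage as-is.
CREATE SCHEMA IF NOT EXISTS bronze;

-- The container URL and SAS token below are placeholders.
CREATE OR REPLACE STAGE bronze_stage
  URL = 'azure://globalmartdata.blob.core.windows.net/sales'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<your-sas-token>');

CREATE OR REPLACE FILE FORMAT csv_format
  TYPE = 'CSV'
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

-- One Bronze table per dataset; columns assumed for illustration.
CREATE OR REPLACE TABLE bronze.orders_raw (
  order_id    VARCHAR,
  customer_id VARCHAR,
  order_date  VARCHAR,   -- kept as text in Bronze; typed later in Silver
  amount      VARCHAR
);

COPY INTO bronze.orders_raw
FROM @bronze_stage/orders.csv
FILE_FORMAT = (FORMAT_NAME = 'csv_format')
ON_ERROR = 'ABORT_STATEMENT';  -- fail fast on malformed rows
```

Keeping every Bronze column as VARCHAR is a deliberate choice in the medallion pattern: the raw layer preserves the source faithfully, and type casting happens in Silver where failures can be handled explicitly.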
What You'll Build:
Bronze Layer: Ingest the 10 raw CSV datasets from Azure Blob Storage into 10 Bronze tables, preserving the source data as-is.
Silver Layer: Apply data quality transformations to the Bronze tables to produce 10 cleaned, consistently typed Silver tables (see the sketch after this list).
Gold Layer: Model the Silver data into an analytics-ready dimensional model: 1 calendar dimension, 6 dimension tables, and 1 fact table (sketched below).
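As an illustration of the Silver step, the sketch below casts types, trims strings, and deduplicates. The column names and cleaning rules are assumptions for the hypothetical orders dataset above, not the project's prescribed transformations:

```sql
-- Silver layer: typed, cleaned, deduplicated copy of the Bronze data.
CREATE SCHEMA IF NOT EXISTS silver;

CREATE OR REPLACE TABLE silver.orders AS
SELECT
  TRIM(order_id)                        AS order_id,
  TRIM(customer_id)                     AS customer_id,
  TRY_TO_DATE(order_date, 'YYYY-MM-DD') AS order_date,  -- unparseable dates become NULL
  TRY_TO_NUMBER(amount, 10, 2)          AS amount
FROM bronze.orders_raw
WHERE order_id IS NOT NULL
-- Keep one row per order_id (latest by order_date) to drop duplicates.
QUALIFY ROW_NUMBER() OVER (
  PARTITION BY order_id ORDER BY order_date DESC
) = 1;
```

Using TRY_TO_DATE and TRY_TO_NUMBER instead of their strict counterparts means bad values surface as NULLs you can audit, rather than aborting the whole load.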
By the end, you'll have built a production-ready three-tier data architecture: 10 Bronze tables, 10 Silver tables, and a Gold layer with 1 calendar dimension, 6 dimension tables, and 1 fact table. With it, you can efficiently ingest raw data, apply data quality transformations, and create analytics-ready models for business intelligence.
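For the Gold layer, a common Snowflake pattern is generating the calendar dimension with a row generator and building the fact table from the Silver tables. The date range, keys, and measures here are illustrative assumptions:

```sql
-- Gold layer: dimensional model for analytics.
CREATE SCHEMA IF NOT EXISTS gold;

-- Calendar dimension: one row per day, generated rather than loaded.
CREATE OR REPLACE TABLE gold.dim_calendar AS
WITH days AS (
  SELECT ROW_NUMBER() OVER (ORDER BY NULL) - 1 AS n
  FROM TABLE(GENERATOR(ROWCOUNT => 3653))      -- ~10 years of dates
)
SELECT
  DATEADD(DAY, n, '2020-01-01'::DATE) AS date_day,
  YEAR(DATEADD(DAY, n, '2020-01-01'::DATE))    AS year,
  MONTH(DATEADD(DAY, n, '2020-01-01'::DATE))   AS month,
  DAYNAME(DATEADD(DAY, n, '2020-01-01'::DATE)) AS day_name
FROM days;

-- Fact table: measures from Silver, keyed to the dimensions.
CREATE OR REPLACE TABLE gold.fact_sales AS
SELECT
  o.order_id,
  o.customer_id,   -- joins to a customer dimension
  o.order_date,    -- joins to gold.dim_calendar
  o.amount
FROM silver.orders o;
```

Generating the calendar from GENERATOR rather than loading it from a file keeps the dimension complete and gap-free regardless of which dates appear in the sales data.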
Submit SQL code snippets and answer multiple-choice and short-answer questions to demonstrate your implementation.