Doeon Kim — Demonstrated capabilities across the Databricks Lakehouse Platform

Featured project: a production-grade Spark + Delta Lake medallion pipeline with data contracts, schema enforcement, and Snowflake export.

| Component | Detail |
|---|---|
| Architecture | Bronze / Silver / Gold medallion layers with Delta Lake |
| Data Contracts | Schema validation, quality checks, SLA enforcement |
| Processing | PySpark transformations, incremental loads, merge operations |
| Export | Snowflake warehouse integration for cross-platform analytics |
| Governance | Column-level lineage, data quality metrics, contract versioning |
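
The data-contract layer described above (schema validation plus quality checks) can be sketched as a small validator. This is an illustrative sketch in plain Python rather than PySpark so it is self-contained; the contract shape, column names, and the quality rule are hypothetical examples, not the pipeline's actual schema.

```python
# Illustrative data-contract check as applied between the bronze and
# silver medallion layers. Contract fields and rules are hypothetical.
CONTRACT = {
    "version": "1.2.0",
    "schema": {"order_id": int, "amount": float, "country": str},
    "required": ["order_id", "amount"],          # must be non-null
    "rules": [("amount_non_negative", lambda row: row["amount"] >= 0)],
}

def validate_batch(rows, contract):
    """Split rows into contract-passing rows and per-row violation lists."""
    valid, violations = [], []
    for i, row in enumerate(rows):
        problems = []
        # Schema enforcement: expected columns present with the right type.
        for col, typ in contract["schema"].items():
            if col not in row:
                problems.append(f"missing column {col}")
            elif row[col] is not None and not isinstance(row[col], typ):
                problems.append(f"{col} is not {typ.__name__}")
        # Required columns must be non-null.
        for col in contract["required"]:
            if row.get(col) is None:
                problems.append(f"{col} is null")
        # Row-level quality rules.
        for name, rule in contract["rules"]:
            if not rule(row):
                problems.append(f"rule failed: {name}")
        if problems:
            violations.append((i, problems))
        else:
            valid.append(row)
    return valid, violations

rows = [
    {"order_id": 1, "amount": 19.99, "country": "KR"},
    {"order_id": 2, "amount": -5.0, "country": "KR"},    # fails quality rule
    {"order_id": None, "amount": 3.5, "country": "US"},  # fails required check
]
good, bad = validate_batch(rows, CONTRACT)
print(len(good), len(bad))  # → 1 2
```

In a Spark pipeline the equivalent checks would typically run as DataFrame column expressions between the bronze and silver layers, with failing rows routed aside so quality metrics can be reported.
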

Skills summary:

| Category | Skills |
|---|---|
| Core Platform | Workspaces, clusters, notebooks, Unity Catalog, DBFS |
| Data Engineering | Delta Lake, Spark SQL, structured streaming, Auto Loader |
| ML/AI | MLflow, model serving, feature store, fine-tuning on GPU |
| Architecture | Medallion layers, data contracts, lakehouse design patterns |
| Governance | Unity Catalog, data lineage, access controls, quality monitoring |
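
The merge operations listed under Data Engineering follow Delta Lake's `MERGE INTO` semantics: update rows that match on a key, insert rows that do not. A minimal plain-Python sketch of that upsert behavior, keyed on a hypothetical `id` column:

```python
# Minimal sketch of Delta Lake MERGE INTO (upsert) semantics in plain
# Python. The row shapes and the "id" merge key are illustrative
# assumptions, not a real table definition.

def merge_upsert(target, source, key="id"):
    """WHEN MATCHED THEN UPDATE, WHEN NOT MATCHED THEN INSERT."""
    merged = {row[key]: dict(row) for row in target}  # index target by key
    for row in source:
        if row[key] in merged:
            merged[row[key]].update(row)   # matched: update existing row
        else:
            merged[row[key]] = dict(row)   # not matched: insert new row
    return list(merged.values())

target = [{"id": 1, "status": "old"}, {"id": 2, "status": "old"}]
source = [{"id": 2, "status": "new"}, {"id": 3, "status": "new"}]
result = merge_upsert(target, source)
print(sorted(r["id"] for r in result))  # → [1, 2, 3]
```

In Spark SQL the same operation is expressed declaratively, e.g. `MERGE INTO silver USING updates ON silver.id = updates.id ...`, which is what makes incremental loads into the silver layer idempotent.
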

Certification: Databricks Certified Platform Architect — validated expertise in lakehouse architecture, Unity Catalog governance, and production workload design.