About the Engagement

The Case Coordination Center's Data Reporting and Analytics team is undertaking a platform migration from Microsoft SQL Server and SSIS to Databricks. We are seeking an experienced Databricks consultant to accelerate this transition by working alongside our existing ETL engineering team, transferring hands-on skills, and helping us deliver our first production pipelines on the new platform.

This is a knowledge-transfer-focused engagement. The ideal candidate will not just do the work - they will coach our team to do it themselves. Our engineers have strong SQL and SQL Server backgrounds and are adopting a Spark SQL-first approach to Databricks development. The consultant will also be expected to configure BI tool connectivity and collaborate closely with our IT department on environment management and production deployments.

Engagement Details:
• Contract - contract-to-hire considered for outstanding candidates
• Initial 3-month contract with option to extend to 6 months
• Hybrid / Remote - some on-site presence may be requested for workshops
• Hours: Full-time (40 hours/week)
• Start Date: Within 60-90 days
• Reports To: Data Reporting and Analytics Consultant VI, Case Coordination Center

What You Will Do:
Responsibilities include but are not limited to:
• Lead the architecture and implementation of our SQL Server to Databricks migration, following the Medallion Architecture (Bronze / Silver / Gold)
• Configure and operationalize Lakeflow Connect for SQL Server ingestion, replacing existing SSIS extract packages
• Build production-grade ETL pipelines using Spark SQL and Databricks Workflows
• Implement Delta Lake best practices: MERGE-based incremental loads, OPTIMIZE, VACUUM, and Z-ordering
• Set up Unity Catalog governance: schemas, permissions, and table lineage
• Migrate and re-implement existing SQL Server Agent jobs as Databricks Workflows
• Conduct weekly knowledge-transfer sessions and hands-on workshops with our ETL engineering team
• Document all patterns, decisions, and runbooks so the team can maintain the platform independently after the engagement
• Configure and optimize BI tool connectivity to Databricks SQL Warehouses (Power BI), including DirectQuery setup, dataset publishing, and performance tuning
• Collaborate with the IT department on Dev, Staging, and Production environment setup, including deployment pipelines, access controls, and production promotion workflows
• Assist with migration validation: row count checks, data quality comparisons, and UAT support
• Help build monitoring solutions

Required Qualifications:
• 5-10 years of data engineering experience, including hands-on Databricks work
• Databricks Certified Data Engineer Associate or Professional certification
• Demonstrated experience migrating from Microsoft SQL Server or SSIS to Databricks
• Deep expertise in Spark SQL and Delta Lake (MERGE, time travel, schema evolution, table maintenance)
• Experience with Databricks Workflows, job clusters, and pipeline orchestration
• Hands-on experience with the Medallion Architecture in a production environment
• Proficiency in T-SQL - ability to review and translate existing SQL Server stored procedures and SSIS packages
• Experience with Unity Catalog or Databricks data governance
• Strong communication and teaching skills - this role is 50% engineering, 50% coaching
• Experience configuring BI tools (Power BI) to connect to Databricks SQL Warehouses, including DirectQuery, published datasets, and row-level security
• Experience working across Dev, Staging, and Production environments in an enterprise IT governance model, including coordinating production deployments with IT operations teams

Preferred Qualifications:
• Experience with Lakeflow Connect (specifically the SQL Server connector)
• Experience with Azure Data Lake Storage Gen2 (ADLS) or Amazon S3 as a data lake
• Familiarity with Databricks Asset Bundles for CI/CD deployment across Dev, Staging, and Production workspaces
• Experience with Delta Live Tables (DLT) and streaming pipelines
• Prior consulting or staff augmentation experience with structured knowledge-transfer deliverables
• Familiarity with IT change management processes (CAB approvals, deployment windows, rollback procedures)
• Experience in healthcare, case management, or government data environments
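For context on the Delta Lake responsibilities listed above, the Spark SQL pattern we expect the consultant to implement and teach looks roughly like the following sketch. All table and column names (bronze.case_updates, silver.cases, case_id, updated_at) are illustrative placeholders, not actual Case Coordination Center objects:

```sql
-- Illustrative sketch only: Silver-layer incremental upsert from a Bronze
-- staging table, followed by routine Delta table maintenance.
MERGE INTO silver.cases AS tgt
USING bronze.case_updates AS src
ON tgt.case_id = src.case_id
WHEN MATCHED AND src.updated_at > tgt.updated_at THEN
  UPDATE SET *
WHEN NOT MATCHED THEN
  INSERT *;

-- Compact small files and co-locate rows by the common filter/join key.
OPTIMIZE silver.cases ZORDER BY (case_id);

-- Remove files no longer referenced by the table (7-day retention).
VACUUM silver.cases RETAIN 168 HOURS;
```

In production, a job like this would run as a scheduled Databricks Workflow task replacing the corresponding SQL Server Agent job.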