Data Engineer (Databricks Certified + Snowflake + Iceberg)
Data Engineer – Iceberg Migration
Databricks + Snowflake + Iceberg
Remote
Overview
We are looking for a Data Engineer / Data Architect to drive the migration of Snowflake datasets to Databricks Unity Catalog-managed Iceberg tables. The role focuses on ensuring a smooth transition, maintaining cross-platform data accessibility, and supporting advanced analytics and AI-driven workloads.
Key Responsibilities
Lead and execute the migration of Snowflake tables to Databricks Unity Catalog (Iceberg format)
Assess existing Snowflake data models, pipelines, and dependencies
Establish dual-access capabilities so migrated tables remain queryable from both Snowflake (via external access) and Databricks
Identify Snowflake-specific queries and convert them to Spark SQL-compatible syntax
Work closely with data platform and datahub teams to ensure seamless onboarding
Perform data validation, reconciliation, and consistency checks post-migration
Enhance and optimize data pipelines and query performance in Databricks
Ensure adherence to data governance, access control, and Unity Catalog best practices
Troubleshoot and resolve migration-related issues efficiently
Develop and contribute to automation frameworks and reusable migration components
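The validation and reconciliation work described above can be sketched as a minimal row-count comparison between the two platforms. This is an illustrative example only, assuming per-table counts have already been collected from Snowflake and Databricks; all table names and counts are hypothetical:

```python
# Minimal post-migration reconciliation sketch: compare per-table row
# counts captured from the source (Snowflake) and target (Databricks).
# Table names and counts below are hypothetical examples.

def reconcile(source_counts: dict[str, int], target_counts: dict[str, int]) -> dict[str, str]:
    """Return a status per source table: 'ok', 'mismatch (...)', or 'missing'."""
    report = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt is None:
            report[table] = "missing"
        elif tgt != src:
            report[table] = f"mismatch (source={src}, target={tgt})"
        else:
            report[table] = "ok"
    return report

snowflake_counts = {"orders": 1_200_000, "customers": 48_000, "events": 9_500_000}
databricks_counts = {"orders": 1_200_000, "customers": 47_998}

print(reconcile(snowflake_counts, databricks_counts))
```

In practice this check would be one layer of a broader validation suite (checksums, sampled column comparisons, schema diffs), but a count-level pass is a common first gate before cutover.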
Required Skills
Strong expertise in Snowflake and Databricks (Spark, Unity Catalog)
Hands-on experience with Apache Iceberg, Delta Lake, or other open table formats
Proficiency in SQL and Spark SQL
Experience in data migration, ETL/ELT processes, and data modeling
Familiarity with AWS (S3, IAM, networking) or equivalent cloud environments
Solid understanding of data governance and access management
Ability to troubleshoot and optimize performance across distributed systems
Preferred Qualifications
Experience with cross-platform data sharing (Snowflake + Databricks)
Knowledge of REST catalog integration and Iceberg external tables
Exposure to AI/BI workloads and analytics ecosystems
Understanding of enterprise data platforms and data mesh architecture