Databricks Data Engineer
Title: Databricks Data Engineer
Location: Columbus, OH (1st choice), Remote US or Canada (2nd choice)
Experience Required: 7 - 12 years
Job Overview:
We are seeking a skilled Data Engineer to join our team. The successful candidate will be responsible for developing and optimizing data pipelines, implementing robust data checks, and ensuring the accuracy and integrity of data flows. This role is critical in supporting data-driven decision-making, especially in the context of our insurance-focused business operations.
Key Responsibilities:
- Collaborate with data analysts, the reporting team, and business advisors to gather requirements and define data models that effectively support business needs.
- Develop and maintain scalable and efficient data pipelines to ensure seamless data flow across various systems; address any issues or bottlenecks in existing pipelines.
- Implement robust data checks to ensure the accuracy and integrity of data. Summarize and validate large datasets to ensure they meet quality standards.
- Monitor data jobs for successful completion. Troubleshoot and resolve any issues that arise to minimize downtime and ensure continuity of data processes.
- Regularly review and audit data processes and pipelines to ensure compliance with internal standards and regulatory requirements.
- Work within Agile methodologies: Scrum, sprint planning, backlog refinement, etc.
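By way of illustration only (this sketch is not part of the job requirements), the kind of data-quality check described above might look like the following in plain Python; the field names ("policy_id", "premium") and rules are hypothetical examples, and in a Databricks pipeline the same logic would typically be expressed as PySpark transformations:

```python
# Illustrative sketch: a minimal data-quality check that a pipeline
# might run before publishing a dataset. Field names are hypothetical.

def validate_records(records):
    """Split a list of dict records into (valid_rows, errors)."""
    valid, errors = [], []
    for i, row in enumerate(records):
        if not row.get("policy_id"):
            # Reject rows with a missing or empty key field.
            errors.append((i, "missing policy_id"))
        elif not isinstance(row.get("premium"), (int, float)) or row["premium"] < 0:
            # Reject rows with a non-numeric or negative premium.
            errors.append((i, "invalid premium"))
        else:
            valid.append(row)
    return valid, errors

sample = [
    {"policy_id": "P-1001", "premium": 1200.0},
    {"policy_id": "", "premium": 500.0},      # fails: missing id
    {"policy_id": "P-1002", "premium": -10},  # fails: negative premium
]
valid, errors = validate_records(sample)
```

A real pipeline would route the rejected rows to a quarantine table and surface the error counts to monitoring, rather than silently dropping them.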
Candidate Profile:
- 7-12 years of experience in a data engineering role working with Databricks and cloud technologies.
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Strong proficiency in PySpark, Python, and SQL.
- Strong experience in data modeling, ETL/ELT pipeline development, and automation.
- Hands-on experience with performance tuning of data pipelines and workflows
- Proficient with Azure cloud components: Azure Data Factory, Azure Databricks, Azure Data Lake, etc.
- Experience with Delta Lake and data warehousing.
- Experience with Delta Live Tables, Autoloader, and Unity Catalog.
- Preferred: knowledge of the insurance industry and its data requirements.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Excellent communication skills to work effectively with diverse teams.
- Strong problem-solving skills and the ability to work under tight deadlines.
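For illustration only (again, not part of the posting), the ETL/ELT pipeline development mentioned in the profile can be sketched as a minimal extract-transform-load step in plain Python; the source rows, field names, and aggregation are hypothetical, and in practice each stage would read from and write to real systems via PySpark:

```python
# Illustrative ETL sketch: extract -> transform -> load over in-memory
# data. All data and names below are hypothetical examples.

def extract():
    # Stand-in for reading raw rows from a source system.
    return [
        {"state": "OH", "claim_amount": "250.00"},
        {"state": "oh", "claim_amount": "100.50"},
        {"state": "CA", "claim_amount": "75.25"},
    ]

def transform(rows):
    # Normalize types and casing, then aggregate claim totals per state.
    totals = {}
    for row in rows:
        state = row["state"].upper()
        totals[state] = totals.get(state, 0.0) + float(row["claim_amount"])
    return totals

def load(totals, target):
    # Stand-in for writing the aggregated result to a warehouse table.
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
```

The separation into three small functions mirrors how pipeline stages are typically kept independently testable and monitorable.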