AceStack Verified
Software, Data Analytics, SaaS

Lead Data Engineer : Raleigh, NC / Remote : Full-Time, Permanent

Raleigh, North Carolina, United States · Hybrid · Full Time · Lead · Posted 1 month ago · Visa sponsorship available


Job Role : Lead Data Engineer - Snowflake, DBT and Qlik

Location : Raleigh, NC/Remote

Job Type : Full-Time, Permanent

Job Description

Lead Data Engineer - Snowflake, DBT and Qlik

• Data Architecture and Strategy - Design and implement scalable, efficient data architectures. Lead the development of a data strategy aligned with business objectives. Evaluate and integrate new technologies to enhance data capabilities.

• Hands-on Data Pipeline Development – As the technical lead, implement complex data pipelines for real-time and batch processing. Optimize data flows for high-volume, high-velocity data environments. Develop advanced ETL processes for diverse data sources.

• Data Governance and Quality Management – Establish and enforce data governance policies and best practices. Implement data quality frameworks and monitoring systems. Ensure compliance with data regulations and standards.

• Performance Optimization and Troubleshooting – Analyze and optimize system performance for large-scale data operations. Troubleshoot complex data issues and implement robust solutions.

• Mentorship and Knowledge Sharing - Mentor junior data engineers and provide technical guidance. Stay current on data technologies and recommend best practices and standards. Collaborate with cross-functional teams and leadership to drive data literacy.

• Testing & Automation – Write and enforce unit test cases, validate and review data integrity and consistency results, and drive automated data pipelines using GitLab, GitHub, and CI/CD tools.

• Code Deployment & Release Management – Review and approve code promotions, and enforce release management procedures for promoting code to various environments, including production and disaster recovery, and for support activities.

• Collaborate with business stakeholders to understand data requirements and translate them into efficient architecture solutions.

Technical/Business Skills:

• Deep expertise in designing and building robust, metadata-driven, automated data pipeline solutions, leveraging modern cloud-based data technologies and tools for large data warehouse and database platforms.

• Deep experience applying data security and governance methodologies that meet data compliance requirements.

• Strong hands-on experience designing and building medallion-architecture ELT pipelines, Snowpipe, and streaming frameworks using Qlik Replicate, DBT Cloud transformations, Snowflake, and GitLab with CI/CD.

• Strong experience designing and building data integrity solutions across multiple data sources and targets, such as SQL Server, Oracle, Mainframe DB2, flat files, and Snowflake.

• Strong design and development experience in Python/PySpark and advanced SQL for ingestion frameworks and automation.

• Strong experience architecting and building solutions using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS.

• Strong experience working with structured and semi-structured data files: CSV, fixed-width, JSON, XML, Excel, and mainframe VSAM.

• Strong orchestration experience using DBT Cloud and Astronomer Airflow.

• Design and implement logging, monitoring, alerting, and observability using tools like Dynatrace.

• Strong problem-solving and performance-tuning experience.

• Design and implement schema drift detection and schema evolution patterns.

• Strong experience designing and implementing sensitive-data protection strategies: tokenization, Snowflake data masking policies, dynamic and conditional masking, and role-based masking rules.

• Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems.

• Strong experience enforcing and adopting release management guidelines, deploying code to various environments, implementing disaster recovery strategies, and leading production activities.

• Ability to make technical data decisions for key business use cases.

• Ability to prioritize investment in new data-related technologies in conjunction with business units.

• Must have one or more certifications in the relevant technology fields.

• Financial banking experience is a plus.

Ready to apply?
You'll be redirected to AceStack's application page.