
Lead Technical Consultant – Level 3 (Data Platform Engineer)
Location: Remote (RTP, NC preferred)
Pay Rate: $95 – $115/hour (W2 only)
Important:
- No sponsorship is available for this role.
- We cannot work with third-party vendors or C2C candidates. Candidates must be able to work directly on Alphanumeric’s W2.
About the Role
Alphanumeric is hiring a Lead Technical Consultant – Level 3 (Data Platform Engineer) to support our client of 20+ years, a global leader committed to improving lives through medical and pharmaceutical innovation.
This team operates as a full-stack data and platform engineering organization, spanning product and portfolio leadership, data engineering, infrastructure & DevOps, AI/ML platforms, and advanced analytics environments.
Their mission is to transform how scientific and enterprise data is leveraged by:
- Building next-generation, metadata-driven data platforms to reduce time spent on “data mechanics”
- Delivering best-in-class AI/ML environments to accelerate predictive capabilities
- Engineering data at scale as a unified, real-time asset
- Automating end-to-end data pipelines across high-throughput domains (genomics, multi-omics, etc.)
- Enabling governance-by-design for internal and external data
- Creating domain-specific data products for computational scientists
- Supporting full traceability and data provenance
- Driving scalable, reusable, and efficient data engineering practices
Position Overview
As a Data Platform Engineer, you will take full ownership of delivering scalable, high-impact data platform solutions, from problem definition through deployment and ongoing operations.
You will act as a technical leader and mentor, setting the standard for engineering excellence while contributing hands-on to development. This role requires a strong balance of architecture, coding, platform engineering, and operational excellence.
Key Responsibilities
- Design, build, and maintain cloud-native data platform solutions
- Develop reusable frameworks and components to accelerate data product development
- Own end-to-end delivery of platform services, including monitoring and optimization
- Implement observability, performance tuning, and reliability improvements
- Define and track metrics to measure platform performance and impact
- Mentor junior engineers and provide technical leadership across projects
- Collaborate cross-functionally with data engineers, scientists, and platform teams
Core Skills & Experience
- Strong experience with Google Cloud Platform (GCP), including Cloud Run, GKE, Cloud Storage, Artifact Registry, and IAM
- Advanced Python development (automation, tooling, pipelines)
- Hands-on experience with Docker: multi-stage builds, image optimization, security best practices
- Experience building and maintaining CI/CD pipelines for containerized environments
- Strong background in debugging and observability: logging, monitoring, distributed tracing (Cloud Logging, Cloud Monitoring, etc.)
- Proven ability to diagnose and resolve performance issues in cloud-native systems
Preferred / Bonus Skills
- Experience with Nextflow (workflow orchestration)
- Exposure to bioinformatics / genomics / multi-omics / imaging pipelines
- Experience with GCP Batch for large-scale workloads
Education & Experience
- Bachelor’s degree with 8+ years of experience, or
- Master’s degree with 5+ years of experience
Why Join?
- Work on cutting-edge data platforms in life sciences and healthcare
- Contribute to real-world impact in biopharmaceutical development
- Collaborate with top-tier engineering and scientific talent
- Be part of a team driving innovation at scale