
Euclid Innovations
Energy Management, Smart Buildings, Software, Artificial Intelligence, IoT
GCP Data Engineer
Charlotte, North Carolina, United States · Hybrid · Contract · Posted 25 days ago · Visa sponsorship available
Hi Rahul,
Hope you are doing well!
This is Rahul from Euclid Innovations. Please find the JD below and let me know if you are interested.
GCP Data Engineer
Charlotte, NC
12 Months
We are seeking experienced Data Engineers to support a large-scale data platform transformation for a leading banking client.
This role focuses on building Spark-based data pipelines and enabling data movement between GCP (Google Cloud Platform) and on-prem systems (DPC) based on governance and model requirements.
Key Responsibilities
- Design and build scalable ETL/data pipelines using Spark and Python
- Develop data workflows to ingest, transform, and move large datasets
- Implement data routing logic to direct data to:
  - GCP (BigQuery, Dataflow, Dataproc)
  - On-prem platforms (DPC)
- Ensure data quality, validation, and reconciliation across systems
- Collaborate with data science and platform teams to support predictive model pipelines
- Optimize performance and scalability for high-volume data processing
Required Skills
- Strong hands-on experience with Apache Spark / PySpark for large-scale data processing
- Proficiency in Python for data engineering (ETL pipelines)
- Experience designing and developing data pipelines / data engineering workflows
- Solid background in ETL, data ingestion, transformation, and data movement
- Experience working with big data technologies and handling large datasets (batch/streaming)
- Experience with cloud platforms: GCP (Google Cloud Platform)
  - BigQuery, Dataflow, Dataproc, GCS (Google Cloud Storage)
- Experience with data migration / data integration projects
- Understanding of data pipeline architecture and distributed systems
Preferred Skills
- Experience with GCP (BigQuery, Dataflow, Dataproc, GCS)
- Exposure to hybrid environments (cloud + on-prem)
- Familiarity with ML/data pipelines (supporting models, not building them)
- Experience in financial services / banking domain (nice to have)