About the Role:
We are hiring an experienced Cloud Data Engineer with strong expertise in Google Cloud Platform (GCP), Python, and SQL to join a high-performing data engineering team at a global banking leader. In this role, you will design, develop, and maintain scalable cloud-native data pipelines and drive automation in data workflows. This is an exciting opportunity to work on enterprise-scale analytics and data modernization initiatives.
Key Responsibilities:
- Design and implement robust ETL pipelines using Python and PySpark
- Develop and optimize data solutions with Google Cloud services including BigQuery, Cloud Composer, and Dataflow
- Write advanced SQL queries for data transformation and reporting
- Automate and orchestrate workflows with Cloud Composer (GCP's managed Apache Airflow service)
- Collaborate with cross-functional teams including data analysts, architects, and business stakeholders
- Ensure high code quality with unit testing, version control, and CI/CD pipelines
- Troubleshoot, monitor, and enhance data solutions for performance and reliability
- Contribute to the development of scalable and secure cloud-based data architectures
Required Skills & Qualifications:
- 5–8 years of experience as a Data Engineer or Cloud Data Developer
- Proven experience with Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Cloud Composer
- Strong programming skills in Python; experience with PySpark preferred
- Proficiency in SQL and data transformation logic
- Hands-on experience with workflow orchestration tools such as Apache Airflow
- Experience with version control and CI/CD tooling (e.g., Git, Jenkins, Cloud Build)
- Familiarity with data governance, data quality, and security best practices
- Strong analytical, communication, and problem-solving skills
Preferred Qualifications:
- Google Cloud Professional Data Engineer certification
- Experience in the banking or financial services domain
- Knowledge of Terraform or other Infrastructure as Code (IaC) tools and practices