About the Role:
We are looking for a Data Engineer skilled in AI, GCP BigQuery, Airflow, Dataflow, Pub/Sub, and Cloud Functions, with a minimum of 5 years of experience in Data Engineering or Software Engineering, including at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred).
Requirements:
- Strong proficiency in SQL, Java, and Python, with practical experience designing and deploying cloud-based data pipelines using GCP services such as BigQuery, Dataflow, and Dataproc.
- Skills required: BigQuery, AI, Airflow, GCP, and Python, along with the ability to communicate and work with cross-functional teams and all levels of management.
- Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform.
- Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery).
- Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.
- Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools such as Terraform and Tekton, and other automation frameworks.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues.
- Experience monitoring and optimizing cost and compute resources for processes running on GCP services (e.g., BigQuery, Dataflow, Cloud Run, Dataproc).
#LI-Hybrid #LI-MK1