Job Description
Robert Half is seeking a skilled Data Engineer. This role is integral to managing and optimizing our data pipelines and infrastructure within a Google Cloud Platform (GCP) environment. The ideal candidate will bring a strong understanding of data quality and orchestration, along with a data-driven mindset to support decision-making across the organization.
Key Responsibilities
Data Pipeline Management:
- Oversee data pipelines deployed through a mixture of Terraform, Python, PowerShell, and SQL in a GCP environment
- Use Cloud Composer, GCS, and BigQuery as the primary, though not exhaustive, services for managing ETL pipelines (a minimal sketch follows this list)
- Will be responsible for the entirety of data pipelines and orchestration in GCP, including deployment, bug fixes, issue investigation and resolution, and more
- Must be experienced in evaluating data quality and validation practices from ingestion through reporting
- Will oversee an ML-based data mastering solution in tandem with the solution vendor
- Other duties as needed
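For context on the orchestration stack above, the following is a minimal sketch of a Cloud Composer (Airflow) DAG that loads files from GCS into BigQuery. The DAG id, bucket, and dataset/table names are hypothetical placeholders, not details from this posting.

# Minimal Cloud Composer (Airflow) DAG: load CSV files from GCS into BigQuery.
# All names below (DAG id, bucket, dataset.table) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",                        # hypothetical bucket
        source_objects=["events/*.csv"],
        destination_project_dataset_table="analytics.events",  # hypothetical table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
    )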
Interviews are currently being scheduled; for immediate consideration, please apply.
The successful Data Engineer will have:
- Proficiency in coding languages such as Python and SQL for data engineering tasks.
- Experience with orchestration tools and a strong understanding of pipeline deployment processes.
- Hands-on expertise with key GCP services, including Cloud Composer, GCS, and BigQuery.
- A strong focus on data quality and the ability to evaluate data validation practices effectively (see the sketch after this list).
- A data analyst mindset, with a focus on ensuring data usability and quality across systems.
- Familiarity with Terraform for infrastructure-as-code practices.
- Knowledge of data modeling and schema design is a plus.
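As a flavor of the data-quality evaluation called out above, here is a minimal sketch of a post-load validation check using the BigQuery Python client. The table name, the event_id column, and the null-rate threshold are hypothetical assumptions for illustration, not details from this posting.

# Minimal post-load data-quality check against BigQuery.
# Table name, event_id column, and threshold are hypothetical placeholders.
from google.cloud import bigquery

def validate_table(table: str = "analytics.events", max_null_rate: float = 0.01) -> None:
    client = bigquery.Client()
    query = f"""
        SELECT
          COUNT(*) AS row_count,
          COUNTIF(event_id IS NULL) AS null_ids
        FROM `{table}`
    """
    # Run the query and read the single summary row it returns.
    row = next(iter(client.query(query).result()))
    if row.row_count == 0:
        raise ValueError(f"{table} is empty after load")
    null_rate = row.null_ids / row.row_count
    if null_rate > max_null_rate:
        raise ValueError(f"{table}: null id rate {null_rate:.2%} exceeds {max_null_rate:.2%}")

if __name__ == "__main__":
    validate_table()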
A background check is required prior to employment.