Job Overview:
We are seeking an experienced Cloud Data Engineer with strong expertise in Google Cloud Platform (GCP) to join our Operations Enablement team. The ideal candidate will have experience in supporting, developing, testing, and maintaining GCP-based data pipelines. Proficiency in BigQuery and comfort handling operational support tasks will be crucial in this role. You will collaborate closely with cross-functional teams including support, development, data science, analysts, and other engineers to ensure that data workflows are streamlined and align with business objectives. This position will also contribute to a significant project focused on migrating legacy ETL pipelines from Informatica to GCP-based solutions.
The successful candidate will play a key role in optimizing our data systems as part of an important transformation initiative. We are looking for a driven, motivated Cloud Data Engineer ready to make a substantial impact on our business.
Experience: 4+ years
Key Skills:
- GCP Data Pipelines
- BigQuery
- Composer
- GitLab
- Cloud Functions
- Python
- SQL Server
- ETL
- Automation
- Performance Tuning
- Data Warehousing
- Data Integration
Key Responsibilities:
Pipeline Development & Maintenance:
- Design, implement, and optimize GCP-based data pipelines utilizing services like Cloud Functions, Cloud Composer, and others.
- Drive complex workstreams from start to finish, ensuring high-quality, on-time delivery.
- Develop and automate ETL processes for large-scale data integration and transformation.
- Optimize data pipeline scalability, performance, and reliability.
- Take part in technical discussions and communicate complex ideas to non-technical stakeholders.
- Align technical decisions with business objectives and propose impactful solutions.
- Mentor junior engineers and share expertise across teams.
BigQuery Management:
- Design, optimize, and maintain BigQuery environments to support large datasets.
- Develop efficient, cost-effective queries and data models within BigQuery.
- Monitor and troubleshoot BigQuery performance, while managing data storage and associated costs.
Data Integration & Transformation:
- Collaborate with internal teams to understand data needs and design solutions to integrate data from multiple sources into GCP.
- Ensure consistency, quality, and compliance of data across pipelines and systems.
Collaboration & Support:
- Work alongside Data Scientists, Analysts, and other engineers to develop and deploy data-driven solutions.
- Provide support for troubleshooting data pipeline issues and optimizing performance.
Documentation & Reporting:
- Create and maintain comprehensive documentation for GCP data pipeline architectures and configurations.
- Provide regular reports and dashboards on pipeline performance, data quality, and health.
Required Skills & Qualifications:
- Extensive experience with Google Cloud Platform (GCP), including services such as BigQuery, Google Kubernetes Engine, Cloud Composer (Airflow), Cloud Storage, Dataflow, Pub/Sub, and more.
- Familiarity with big data technologies such as Spark, and experience with programming languages like Python or Scala.
- Strong proficiency in SQL, particularly for querying and optimizing large datasets in BigQuery.
- Expertise in Python for building data pipelines and automation scripts.
- Experience with ETL/ELT in both batch and streaming contexts, with strong pipeline troubleshooting skills.
- Knowledge of advanced pipeline concepts, including idempotency, scaling, security, testing, version control, and handling schema changes.
- Some experience with Informatica is a plus.
- Familiarity with CI/CD pipelines and automation tools, preferably in GitLab.
- Experience in optimizing pipeline performance and cost efficiency.
- Some understanding of GCP IAM, monitoring, and logging tools to maintain secure and efficient operations.
- Background in data warehousing, data integration, and performance tuning.
We offer a competitive total rewards package including a base salary determined based on the role, experience, skill set, and location. For those in eligible roles, discretionary incentive compensation may be awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility.
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. By submitting an application for this job, you acknowledge that any personal data or personally identifiable information that you provide to us will be processed in accordance with our Candidate Privacy Notice.
What We Do
Revionics LLC, an Aptos Company, provides enterprise retailers around the world with leading science-based solutions for pricing, promotions and markdowns. As a trusted partner for top retailers across a variety of industries and markets, Revionics delivers unparalleled results in ROI, profit lift, process efficiency and more.
Powered by robust analytics and advanced AI models, Revionics equips retailers with clarity and confidence to make optimal pricing decisions. With science at the center, Revionics’ machine learning capabilities translate consumer, competitor, and market data into actionable insights and transparent pricing recommendations for high-impact results.