Data Engineer

Posted 15 Days Ago
Hiring Remotely in Bengaluru, Karnataka
Remote
Senior level
Machine Learning • Software • Database • Analytics
The Role
As a Senior Data & Analytics Engineer, you will build scalable data architectures for SaaS platforms, focusing on AWS Data Engineering stack and Postgres. Responsibilities include optimizing data pipelines, implementing efficient migration scripts, and ensuring data management for high-performance analytics and reporting.

Empowering enterprises to keep the planet habitable for all, Terrascope aspires to be the easiest carbon measurement and decarbonization platform for companies in the land, nature, and net-zero economy sectors.


Terrascope is a leading decarbonisation software platform designed specifically for the Land and Nature (LAN) and Net-Zero Economy (NZE) sectors. As the easiest-to-use platform for these sectors, our comprehensive solution blends deep industry expertise with advanced climate science, data science, and machine learning. Terrascope enables companies to effectively manage emissions across their supply chains.


Our integrated platform offers solutions for Product and Corporate Carbon Footprinting, addressing Scope 3 and land-based emissions, SBTi FLAG & GHG Protocol LSR reporting, and supporting enterprise decarbonisation goals.


Publicly launched in June 2022, Terrascope works with customers across sectors, from agriculture, food & beverages, manufacturing, retail and luxury, to transportation, real estate, and TMT.


Terrascope is globally headquartered in Singapore and operates in major markets across APAC, North America, and EMEA. Terrascope is a partner of the Monetary Authority of Singapore’s ESG Impact Hub, a CDP Gold Accredited software provider, and a signatory of The Climate Pledge to achieve Net Zero by 2040, and has been independently assured by Ernst & Young.



We are seeking a Senior Data & Analytics Engineer to design and implement scalable data architectures for SaaS platforms in both single-tenant and multi-tenant environments. The role focuses on the AWS Data Engineering stack, Postgres, and advanced analytics processing techniques, including the creation of materialized views to enable high-volume data analytics. The ideal candidate is skilled in Python scripting and Java or Go, and proficient in handling large-scale data processing workflows. This role reports to the Director of Engineering & Tech and will be crucial in shaping the future of climate-tech SaaS products.

In this role you will be responsible for:

  • Building a robust and scalable data platform to support seamless data onboarding and management for a SaaS platform.
  • Designing scalable single-tenant and multi-tenant SaaS data platforms, optimizing materialized views for analytics, and developing pipelines using the AWS Data Engineering stack.
  • Designing and implementing efficient data migration scripts and workflows to enable smooth data onboarding for new and existing clients.
  • Writing clean, efficient code in Python, Java, or Go, and designing robust data models using Postgres ORM frameworks.
  • Processing large-scale datasets, optimizing Postgres databases for high performance, and implementing best practices for scaling analytics solutions.
  • Optimizing Postgres indexes for query performance.
  • Creating materialized views and analytics-ready datasets for headless BI.
  • Implementing row-level security and designing multi-tenant database architectures for scalability and security (a brief sketch follows this list).
  • Developing pipelines and processes to integrate diverse data connectors into the SaaS platform while ensuring data integrity and consistency.
  • Enabling data accessibility and transformation for data science teams by creating analytics-ready datasets and facilitating model integration.
  • Ensuring the data platform and migration workflows are optimized for scalability, high performance, and low latency.
  • Working closely with product, engineering, and data science teams to align platform capabilities with analytics and machine learning requirements.
  • Managing and scaling AWS infrastructure and automating workflows using the GitHub DevOps stack.
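
Several of the responsibilities above (materialized views, index tuning, row-level security for multi-tenancy) combine naturally. Below is a minimal, hedged sketch in Python with psycopg2 of what that pattern can look like; the schema (an emissions table with tenant_id, measured_at, and co2e_kg columns), the connection string, and the app.tenant_id session setting are illustrative placeholders, not Terrascope's actual design.

    # Sketch only: table, column, and setting names below are hypothetical.
    import psycopg2

    DDL = [
        # Pre-aggregate per-tenant monthly totals for analytics / headless BI.
        """
        CREATE MATERIALIZED VIEW IF NOT EXISTS emissions_monthly AS
        SELECT tenant_id,
               date_trunc('month', measured_at) AS month,
               SUM(co2e_kg) AS total_co2e_kg
        FROM emissions
        GROUP BY 1, 2
        """,
        # A unique index is what allows REFRESH ... CONCURRENTLY, so readers
        # are never blocked while the view rebuilds.
        """
        CREATE UNIQUE INDEX IF NOT EXISTS emissions_monthly_uq
        ON emissions_monthly (tenant_id, month)
        """,
        # Row-level security isolates tenants on the base table. Postgres does
        # not support RLS policies on materialized views themselves, so access
        # to the view is controlled with GRANTs instead.
        "ALTER TABLE emissions ENABLE ROW LEVEL SECURITY",
        # CREATE POLICY has no IF NOT EXISTS; a real migration would guard this.
        """
        CREATE POLICY tenant_isolation ON emissions
        USING (tenant_id = current_setting('app.tenant_id')::uuid)
        """,
    ]

    def main():
        conn = psycopg2.connect("dbname=saas")  # placeholder DSN
        conn.autocommit = True  # REFRESH ... CONCURRENTLY cannot run inside a transaction block
        with conn.cursor() as cur:
            for stmt in DDL:
                cur.execute(stmt)
            # A scheduled job would run this periodically to keep the view fresh.
            cur.execute("REFRESH MATERIALIZED VIEW CONCURRENTLY emissions_monthly")

    if __name__ == "__main__":
        main()

The design point worth noting: precomputing tenant-scoped aggregates in a materialized view keeps high-volume analytics off the transactional tables, while the RLS policy on the base table enforces tenant isolation no matter which application path reads the data.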

To be successful in this role, you should have or be:

  • A Bachelor’s degree in a STEM field.
  • 5 to 8 years of experience as a Data and Analytics Engineer building data platforms for SaaS applications, large-scale data processing workflows, and advanced analytics pipelines.
  • Experience in database migration projects.
  • Experience building or migrating multi-tenant databases.
  • Deep competence in Python scripting for writing ETL pipelines, custom migration scripts, and automating AWS tasks.
  • Deep competence in Java/Go for building high-performance, scalable tools to handle complex migration needs.
  • Deep competence in data storage and management using AWS RDS (Postgres), S3, and DocumentDB.
  • Deep competence in Postgres database architecture and functionality, including indexes, partitioning, and query optimization.
  • Deep competence in materialized views, including their creation, refresh strategies, and use cases for analytics.
  • Advanced SQL skills to design complex queries that aggregate, filter, and transform data effectively for materialized views.
  • Deep competence in data processing using AWS Glue, Lambda, and Step Functions (see the orchestration sketch after this list).
  • Experience with AWS Database Migration Service (DMS).
  • Deep competence in data processing and analytics using AWS Athena and AWS Redshift.
  • Deep competence in security and monitoring using AWS IAM, AWS CloudWatch and AWS CloudTrail. 
  • Experience designing mappings between MongoDB’s flexible document schema and Postgres’ relational schema (see the migration sketch after this list).
  • Experience in data enrichment and cleaning techniques. 
  • Proven experience with scalable, large data sets and high-performance SaaS applications.
  • A strong ability to work with and optimize large-scale data systems.
  • A strong background in building scalable analytics solutions in startup environments.
  • A passion for creating efficient data processing systems and driving analytics innovation.
  • A problem solver with excellent programming skills and a focus on performance optimization.
  • A collaborative team player who enjoys mentoring peers and sharing best practices.
  • Prior experience in startups and with remote teams.
  • Comfortable with change and ambiguity.
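
Since several requirements above name AWS Glue, Lambda, and Step Functions, here is a hedged boto3 sketch of the orchestration pattern they describe: starting a Glue ETL job, polling it to completion, and kicking off a Step Functions state machine. The job name, state machine ARN, and input payload are placeholders, not real resources.

    # Sketch only: resource names and ARNs below are placeholders.
    import json
    import time

    import boto3

    glue = boto3.client("glue")
    sfn = boto3.client("stepfunctions")

    def run_glue_job(job_name: str, args: dict) -> str:
        """Start a Glue job run and poll until it reaches a terminal state."""
        run_id = glue.start_job_run(JobName=job_name, Arguments=args)["JobRunId"]
        while True:
            state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
            if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
                return state
            time.sleep(30)

    def start_onboarding(tenant_id: str) -> str:
        """Kick off a Step Functions state machine chaining the Glue/Lambda steps."""
        resp = sfn.start_execution(
            stateMachineArn="arn:aws:states:ap-southeast-1:123456789012:stateMachine:onboarding",
            input=json.dumps({"tenant_id": tenant_id}),
        )
        return resp["executionArn"]

    if __name__ == "__main__":
        print(run_glue_job("etl-emissions", {"--tenant_id": "acme"}))
        print(start_onboarding("acme"))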
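
And for the MongoDB-to-Postgres mapping requirement, a hedged sketch of the usual approach: well-known document fields map to relational columns, embedded arrays flatten into a child table, and unmapped fields land in a jsonb column so nothing is silently dropped. Collection, table, and field names (companies, facilities, and so on) are hypothetical.

    # Sketch only: collections, tables, and fields below are hypothetical, and
    # leftover document fields are assumed to be JSON-serializable.
    import psycopg2
    from psycopg2.extras import Json, execute_values
    from pymongo import MongoClient

    KNOWN = {"_id", "name", "country", "facilities"}

    def main():
        mongo = MongoClient("mongodb://localhost:27017")["app"]
        with psycopg2.connect("dbname=saas") as pg, pg.cursor() as cur:
            company_rows, facility_rows = [], []
            for doc in mongo["companies"].find():
                # Map known fields to columns; stash everything else in jsonb.
                extra = {k: v for k, v in doc.items() if k not in KNOWN}
                company_rows.append(
                    (str(doc["_id"]), doc.get("name"), doc.get("country"), Json(extra))
                )
                # Flatten the embedded array into a 1:N child table.
                for fac in doc.get("facilities", []):
                    facility_rows.append((str(doc["_id"]), fac.get("name"), fac.get("city")))
            execute_values(
                cur,
                "INSERT INTO companies (mongo_id, name, country, extra) VALUES %s",
                company_rows,
            )
            execute_values(
                cur,
                "INSERT INTO facilities (company_mongo_id, name, city) VALUES %s",
                facility_rows,
            )

    if __name__ == "__main__":
        main()

Buffering all rows in memory is fine for a sketch; a production migration would stream in chunks and verify row counts after each batch.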

Even better if you are:

  • Familiar with Rust programming language.
  • An entrepreneurial problem solver comfortable managing risk and ambiguity.

We're committed to creating an inclusive environment for our strong and diverse team. We value diversity and foster a community where everyone can be their authentic self.

Top Skills

Go
Java
Python

The Company
106 Employees
Remote Workplace
Year Founded: 2021

What We Do

Terrascope is an enterprise-grade, end-to-end smart carbon measurement and management SaaS platform, and we are on a mission to empower companies to build a credible pathway to net zero.

By combining data science, machine learning and sustainability expertise, our platform provides the data, analytics and digital tools to help large companies decarbonise their business operations and supply chains.

Powered by technology, data science, and deep sustainability expertise, Terrascope is on a mission to drive decarbonisation at scale by helping enterprises:

- Measure with confidence. Terrascope increases the speed, accuracy, and confidence of scope 1, 2, and 3 emissions measurement, while ensuring compliance with GHG protocol, reporting frameworks and assurance standards.

- Manage complexity. Terrascope helps enterprises focus decarbonisation efforts where it matters the most by identifying emission hotspots and defining next best actions to make tangible progress towards net zero.

- Collaborate seamlessly. Terrascope enables collective action and shared accountability by allowing internal and external stakeholders to centralise data.
