Northbeam is building the world’s most advanced marketing intelligence platform for growth. Our attribution modeling technology and customizable dashboards provide our customers with a unified view of their e-commerce business data. The smartest brands in e-commerce trust Northbeam to accurately attribute their advertising spend, understand the entire customer journey, and make data-driven decisions to grow profitably.
Northbeam’s team and customer base are growing quickly, and it’s essential that we invest in the right people and systems to scale our business. Our product has found incredible product-market fit and continues to grow rapidly. This is a career-defining opportunity for an experienced engineer to accelerate their growth and contribute to a rapidly scaling company.
The Northbeam team is composed of hard-working and talented individuals focused on collaboration, personal growth, and technical excellence. We would love for you to join us!
Job Description
Northbeam is fundamentally a data company. We don’t sell shoes, or ads, or games. We sell data: quality integrations with a variety of platforms, fresh and reliable data pulls, robust data-ingest APIs, and correct aggregations and algorithmic insights on top of that data, all packaged up in a user-facing application.
What this means for you is that high-quality, robust data integration is at the core of what we do, and your work will have a direct connection to the company’s success.
We are looking for a Senior Software Engineer with experience in data integration, API-based ETL pipelines, and cloud-native architecture. You will work with a small engineering team to create a platform that consolidates third-party data from a wide range of sources, including advertising platforms, e-commerce systems, customer data warehouses, ERP, POS, and CRM systems. You will need to think about concerns like scalability, multi-tenancy, batch vs streaming trade-offs, data validation, API design, and more. You will work with experienced engineers who are eager to share their knowledge and experience, and to learn alongside you.
Curiosity, willingness to do the hard thing, attention to developer ergonomics, and an enjoyment of a startup pace of development will be the key to success in this role.
About the Role
This is a startup. The one thing that’s constant is change. To start with, you can expect to:
- Design and implement scalable, high-performance data pipelines to ingest and transform data from a variety of sources, ensuring reliability, observability, and maintainability.
- Build and maintain APIs that enable flexible, secure, and tenant-aware data integrations with external systems.
- Work with event-driven and batch processing architectures, ensuring data freshness and consistency at scale.
- Drive clean API design and integration patterns that support both real-time and batch ingestion while handling diverse authentication mechanisms (OAuth, API keys, etc.).
- Implement observability, monitoring, and alerting to track data freshness, failures, and performance issues, ensuring transparency and reliability.
- Optimize data flows and transformations, balancing cost, efficiency, and rapid development cycles in a cloud-native environment.
- Collaborate with data engineering, infrastructure, and product teams to create an integration platform that is flexible and extensible, and that makes it easy to onboard new sources.
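To give a flavor of the work described above, here is a minimal, illustrative sketch of one recurring pattern: pulling records from a cursor-paginated third-party API with retries on transient failures. This is not Northbeam’s actual code; the function names and the `{"records", "next_cursor"}` response shape are hypothetical.

```python
import time
from typing import Callable, Dict, Iterator, Optional

def paginate(fetch_page: Callable[[Optional[str]], Dict],
             max_retries: int = 3) -> Iterator[object]:
    """Yield records from a cursor-paginated source, retrying transient failures.

    `fetch_page(cursor)` is assumed to return a dict shaped like
    {"records": [...], "next_cursor": "<cursor>" or None}.
    """
    cursor: Optional[str] = None
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(cursor)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # retries exhausted: surface the failure
                time.sleep(2 ** attempt)  # exponential backoff
        yield from page["records"]
        cursor = page.get("next_cursor")
        if cursor is None:
            return
```

In practice this kind of helper sits inside an orchestrated workflow, with per-tenant credentials, observability, and freshness tracking layered around it.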
You will work with great people who have done this many times before. You will teach them some new tricks, and maybe learn some old ones.
If this sounds like your kind of chaos, we’d love to hear from you.
About You
Requirements
- 5+ years of experience in data engineering, software engineering, or integration engineering, with a focus on ETL, APIs, and data pipeline orchestration.
- Strong proficiency in Python.
- Experience with API-based ETL, handling REST, GraphQL, and webhooks.
- Experience implementing authentication flows.
- Proficiency in SQL and BigQuery.
- Experience with orchestration frameworks (e.g., Airflow) to manage and monitor complex data workflows.
- Familiarity with containerization (Docker, Kubernetes) to deploy and scale workloads.
- Ability to drive rapid development while ensuring maintainability, balancing short-term delivery needs with long-term platform stability.
Nice to Haves
- Detailed understanding of authentication mechanisms (OAuth 2.0, API keys, secrets management) and secure multi-tenant architectures.
- Experience working with ERP systems, CRMs, CDPs, or other complex enterprise data tools and their APIs.
- Exposure to event-driven architectures and real-time data processing tools.
- Knowledge of data governance, compliance (GDPR, SOC2), and security best practices for handling sensitive data.
- Experience working in a multi-tenant SaaS or large-scale data-intensive environment.
Values
These are the values we share as the Northbeam community:
- Growth mindset - we’re always learning and growing
- Customer focus - we want to make the customer happy with our product
- Ownership mentality - we think like owners in the business
- Radical candor - we’re transparent and give direct feedback to one another
Benefits
- Equity package
- Generous base salary
- Healthcare Benefits (medical, dental, vision)
- Travel to meet with the team
- Flexible PTO Policy
- 12 Company Paid Holidays
What We Do
Through first-party data collection and machine learning, Northbeam delivers actionable insights that give DTC and e-commerce brands a clear picture of their customers’ buying behavior.