Belvedere Trading is a leading proprietary trading firm eagerly expanding into the heart of Boulder, Colorado. Our traders work hard to provide liquidity to the market using our automated trading systems and have mastered a diverse set of products, including commodities, interest rates, exchange-traded funds (ETFs), and equity index options. Our trading models and software systems are continually re-engineered, optimized, and maintained to stay on top of the industry. This would not be possible without the dedicated efforts of our technology teams, who develop and perfect our innovative technology solutions.
High-performance proprietary development is the source of our success and competitive advantage at Belvedere, further fueling our passion for performance. We are a team driven by intellectual curiosity, seeking answers that will change not only how we trade in this technological age, but also the future landscape of the trading industry. We place a high premium on defining, developing, and deploying high-performance trading software using a team-based, holistic development approach. We are looking for passionate team members whose contributions will be critical to our continued success.
Belvedere Trading is looking for a Senior Data Engineer to join our team out of either Chicago, Illinois, or Boulder, Colorado, in a hybrid working model. Our Data Engineers are responsible for data services, including ELT and ETL pipeline implementation, data warehouse architecture, data quality automation, and analytic visualization. We expect a Senior Data Engineer to need minimal guidance on architecture best practices. You will work closely with System Operators, Quant Researchers, and Software Developers to support data-driven decision-making across the organization. As a Senior Data Engineer at Belvedere, you will have the opportunity to support cutting-edge research in financial markets.
If you thrive in a fast-paced environment and have a passion for solving complex data challenges, we’d love to hear from you.
What You'll Do
- Design, build, and maintain robust data pipelines and data infrastructure
- Collaborate with cross-functional teams to define and implement data solutions
- Develop and optimize ELT and ETL processes to ensure high-quality, accurate, and timely data delivery
- Work with large datasets to perform data wrangling, cleaning, and transformation tasks
- Monitor and troubleshoot data systems to ensure their reliability