About the Company
At Bloomfield, we are revolutionizing the way crops are monitored and managed. Our AI-powered imaging technology provides continuous, plant-level health and performance insights from seed to harvest. Our mission is to empower farmers with the tools they need to increase crop productivity and quality while using fewer scarce resources, ultimately contributing to a more sustainable and food-secure future.
In 2024, Kubota Corporation, a global leader in agricultural machinery and solutions, acquired Bloomfield through its North American subsidiary, Kubota North America Corporation. This acquisition unites Bloomfield’s innovative technology with Kubota’s extensive resources and its commitment to providing comprehensive smart agriculture solutions to farmers worldwide. Our combined expertise and resources will drive innovation and deliver benefits to farmers, ensuring a more sustainable and prosperous agricultural industry.
About the Role
We are seeking an Infrastructure Software Engineer to join our growing team. You will play a key role in building and maintaining our core infrastructure and data platforms, both in AWS and on the edge. This position requires a balance of technical skills and innovative thinking to support our evolving technology stack.
Responsibilities
- Designing, implementing, and maintaining our data pipeline, including data schema definition, ingestion, transformation, and serving
- Ensuring the reliability, quality, security, and performance of our data pipelines
- Troubleshooting and debugging issues with our data pipelines
- Working with other engineers to understand their data needs and ensure that our data ecosystem can support their requirements
- Optimizing the data pipeline for efficiency, scalability, and reliability
- Developing and maintaining documentation for the data pipeline
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a similar field
- 5+ years of experience as a Data Engineer or a similar role
- Strong proficiency in dbt, SQL, and Python
- Good knowledge of data warehouse solutions such as GCP BigQuery or AWS Redshift; experience with Dremio is a plus
- Experience with data warehouse design and maintenance
- Experience with real-time data monitoring and alerting in production
- Experience with self-serve BI tools such as Metabase, Looker, or Tableau; experience with LightDash is a plus
- Familiarity with data cleaning, processing, and analytics of image data
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration abilities
What We Do
We build tools that help farmers, breeders, and scientists better understand plant growth using a combination of computer vision and AI.