Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.
WORKING AT OPORTUN
Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.
Position Overview:
We are seeking a highly skilled and experienced Senior Machine Learning Engineer to join our dynamic team and lead the development of our ML infrastructure, from model training to deployment, enabling us to deliver advanced and impactful solutions to our clients. As the Senior Machine Learning Engineer at Oportun, you will play a pivotal role in elevating our ML capabilities, conceiving and implementing a state-of-the-art machine learning infrastructure. Your mastery of the ML domain enables you to take on business problems and solve them with AI/ML solutions. With your depth of expertise and leadership abilities, you will actively contribute to architectural decisions, mentor junior ML engineers, and collaborate closely with diverse teams, including data scientists and engineers, to deliver high-quality ML solutions that redefine the norms of FinTech. Your profound expertise in architecting and deploying machine learning models will be instrumental in propelling our products to new levels of sophistication and success. In this role, you will have the opportunity to lead the technology effort, from technical requirements gathering to final delivery of the ML solution, for large cross-functional, multi-month initiatives.
Responsibilities:
ML Infrastructure Development:
- Design and implement scalable ML pipelines using Databricks, PySpark, AWS SageMaker, and Python to support model training, testing, and deployment.
- Leverage FastAPI for building and deploying lightweight, high-performance RESTful APIs for model serving (see the illustrative sketch after this list).
- Utilize Kubernetes and Docker for containerization and orchestration to ensure fault-tolerant and distributed ML workflows.
- Integrate with databases like MongoDB, MariaDB, and DynamoDB for efficient data storage and retrieval.
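For illustration only, a minimal FastAPI model-serving endpoint in this kind of stack might look like the sketch below; the model artifact path, feature names, and payload schema are hypothetical, not Oportun's actual implementation.

```python
# Illustrative sketch only -- model path, feature names, and schema are hypothetical.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-serving")

# Load a pre-trained model artifact at startup (path is an assumption).
model = joblib.load("artifacts/credit_risk_model.joblib")

class ScoringRequest(BaseModel):
    # Hypothetical features; a real payload would mirror the production feature set.
    income: float
    loan_amount: float
    tenure_months: int

@app.post("/score")
def score(request: ScoringRequest) -> dict:
    features = [[request.income, request.loan_amount, request.tenure_months]]
    probability = float(model.predict_proba(features)[0][1])
    return {"default_probability": probability}
```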
Feature Engineering and Data Pipelines:
- Develop and optimize real-time and batch feature pipelines using PySpark on Databricks to handle large-scale data processing (see the pipeline sketch after this list).
- Ensure smooth data integration across NoSQL (MongoDB, DynamoDB) and SQL (MariaDB) databases.
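As an illustration of the kind of batch feature pipeline this involves, the following PySpark sketch aggregates raw transactions into member-level features; the table names and columns are assumptions for the example only.

```python
# Illustrative PySpark sketch -- table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("member-feature-pipeline").getOrCreate()

# Read raw transactions (on Databricks this would typically be a Delta table).
transactions = spark.read.table("raw.transactions")

# Aggregate into member-level batch features.
features = (
    transactions
    .groupBy("member_id")
    .agg(
        F.count("*").alias("txn_count"),
        F.avg("amount").alias("avg_txn_amount"),
        F.max("timestamp").alias("last_txn_at"),
    )
)

# Persist the feature table for downstream training and serving.
features.write.mode("overwrite").saveAsTable("features.member_transaction_features")
```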
Model Deployment and Monitoring:
- Deploy ML models in production using AWS SageMaker or FastAPI for API-based deployments, ensuring high performance and low latency (see the deployment sketch after this list).
- Set up monitoring and alerting with tools like New Relic to ensure the reliability of deployed models.
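For illustration, deploying a trained model artifact as a real-time SageMaker endpoint might look like the sketch below; the image URI, S3 path, IAM role, and endpoint name are placeholders, not real resources.

```python
# Illustrative SageMaker deployment sketch -- image URI, S3 path, role, and
# endpoint name are placeholders.
import sagemaker
from sagemaker.model import Model

session = sagemaker.Session()

model = Model(
    image_uri="<account>.dkr.ecr.us-west-2.amazonaws.com/credit-model:latest",
    model_data="s3://example-bucket/models/credit_risk_model.tar.gz",
    role="arn:aws:iam::<account>:role/SageMakerExecutionRole",
    sagemaker_session=session,
)

# Stand up a real-time HTTPS endpoint for low-latency scoring.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
    endpoint_name="credit-risk-model",
)
```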
Collaboration and Mentorship:
- Work closely with data scientists to transition research-grade models into scalable production systems.
- Mentor junior engineers on best practices in ML development, FastAPI, and scalable deployment strategies.
CI/CD and Automation:
- Build and maintain automated CI/CD pipelines using Jenkins and Docker, ensuring smooth integration and deployment of ML workflows.
- Automate retraining pipelines to ensure models adapt to changing data and maintain performance (see the retraining sketch below).
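As a simplified illustration of retraining automation, a scheduled job might retrain on fresh data and promote the new model only if it outperforms the currently deployed one; the data source, target column, metric, and promotion rule below are all hypothetical.

```python
# Illustrative retraining sketch -- data source, target, metric, and promotion rule are hypothetical.
import joblib
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def retrain_and_maybe_promote(data_path: str, current_model_path: str) -> None:
    data = pd.read_parquet(data_path)
    X, y = data.drop(columns=["defaulted"]), data["defaulted"]
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

    # Train a candidate model on the most recent data.
    candidate = GradientBoostingClassifier().fit(X_train, y_train)
    candidate_auc = roc_auc_score(y_val, candidate.predict_proba(X_val)[:, 1])

    # Score the currently deployed model on the same validation split.
    current = joblib.load(current_model_path)
    current_auc = roc_auc_score(y_val, current.predict_proba(X_val)[:, 1])

    # Promote the candidate only if it beats the current model.
    if candidate_auc > current_auc:
        joblib.dump(candidate, current_model_path)
```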
Qualifications:
Experience:
- 5+ years in ML system design and deployment, with hands-on expertise in Databricks, PySpark, AWS SageMaker, and FastAPI.
Technical Skills:
- Strong proficiency in Python and PySpark, and with AWS services such as S3, DynamoDB, and SageMaker.
- Experience with containerization (Docker) and orchestration (Kubernetes).
- Familiarity with monitoring tools like New Relic and databases like MongoDB, MariaDB, and DynamoDB.
Other Attributes:
- A tech-agnostic mindset with the ability to adapt to new tools and frameworks.
- Strong problem-solving and collaboration skills.
We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.
California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/.
We will never request personally identifiable information (bank, credit card, etc.) before you are hired. We do not charge you pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).