PENN Entertainment, Inc. is North America’s leading provider of integrated entertainment, sports content, and casino gaming experiences. From casinos and racetracks to online gaming, sports betting and entertainment content, we deliver the experiences people want, how and where they want them.
We’re always on the lookout for those who are passionate about creating and delivering cutting-edge online gaming and sports media products. Whether it’s through ESPN BET, Hollywood Casino, theScore Bet Sportsbook & Casino, or theScore media app, we’re excited to push the boundaries of what’s possible. These state-of-the-art platforms are powered by proprietary in-house technology, a key component of PENN’s omnichannel gaming and entertainment strategy.
When you join PENN Entertainment’s digital team, you’ll not only work on these cutting-edge platforms through theScore and PENN Interactive, but you’ll also be part of a company that truly cares about your career growth. We’re committed to supporting you as you expand your skills and explore new opportunities.
With locations throughout North America, you can build a future at PENN Entertainment wherever you are. If you want to challenge conventions in gaming, media and entertainment, we want to talk to you.
About the Role & Team
You will be working with a distributed team of smart, friendly, and dedicated engineers, product managers, and designers determined to deliver some of the best apps the market has to offer. The Sports Modeling Automation Team is responsible for integrating models and data from our Data Science team into our internal services; it consists of Machine Learning Engineers, Data Engineers, and Software Engineers.
About the Work
As a key member of our Sports Modeling Automation Team, you will:
- Design, implement, and maintain backend services and APIs using Python (primarily FastAPI or Flask).
- Build and manage complex data workflows with Argo Workflows (Kubernetes-native workflow engine supporting DAG and step-based workflows).
- Develop event-driven distributed systems that process large amounts of data and integrate with downstream backend services (experience with Kafka or another event streaming/message queue platform preferred).
- Work in containerized environments using Docker and Kubernetes.
- Build internal tools and libraries to help accelerate other backend teams.
- Work with data science and data engineering teams to build best-in-class SDLC processes.
- Oversee the design and maintenance of data systems and contribute to the continual enhancement of the data platform.
- Collaborate with the team to define, track, and meet SLOs.
- Maintain and expand existing systems, tooling, and infrastructure.
- Ensure System Reliability: Implement robust monitoring and alerting mechanisms using tools like DataDog.
- Participate in Agile Processes: Engage in the design, architecture, and delivery of new features within a collaborative agile/scrum environment.
- Deploy to Cloud Infrastructure: Manage deployments of services and applications to our cloud platforms.
- Strategic Partnership: Work closely with the tech lead and engineering manager to help set the team's direction.
- Demonstrate Technical Proficiency: Showcase expertise in the team's tech stack, tooling, and architecture to lead wide-ranging projects effectively.
- On-Call Rotation: Participate in our on-call rotation to address critical issues during off-business hours.
About You
- Strong Computer Science Foundation: Solid understanding of data structures, distributed systems, and software design.
- Passionate About Clean Code: Commitment to clean architecture and software craftsmanship.
- Versatile Developer: Experience with modern web frameworks and API development.
- Adaptable Learner: Proficiency in Python with a willingness to learn new technologies and frameworks.
- Workflow Orchestration: Hands-on experience with workflow orchestration tools such as Argo Workflows (or Airflow).
- Database Proficiency: Strong experience with relational databases such as PostgreSQL and MySQL, and NoSQL databases such as Bigtable, MongoDB, and DynamoDB.
- Comfortable with Command Line: Proficient in terminal operations.
- Familiar with Containerization: Knowledge of Kubernetes and container orchestration.
- Caching Knowledge: Understanding of caching strategies and tools.
- Problem-Solving Skills: Excellent analytical abilities and independent troubleshooting.
- Strong Communicator: Ability to convey complex technical concepts to both technical and non-technical stakeholders.
- Nice to have: Knowledge of other programming languages (e.g., Elixir, Java, Go).
What We Offer
- Competitive compensation package.
- Comprehensive benefits package.
- Fun, relaxed work environment.
- Education and conference reimbursements.
#LI-REMOTE
Candidates residing in Ontario requiring special accommodation can email [email protected]
theScore is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability or age.
What We Do
theScore, a wholly-owned subsidiary of PENN Entertainment, empowers millions of sports fans through its digital media and sports betting products. Its media app ‘theScore’ is one of the most popular in North America, delivering fans highly personalized live scores, news, stats, and betting information from their favorite teams, leagues, and players. theScore’s sports betting app ‘theScore Bet Sportsbook & Casino’ delivers an immersive and holistic mobile sports betting and iCasino experience. theScore Bet is currently live in the Company's home province of Ontario. theScore also creates and distributes innovative digital content through its web, social and esports platforms.