Mid-Level Data Scientist / Quant – Risk & Trading

Company: Sleeper
Location: Las Vegas, NV, USA; Seattle, WA, USA; San Francisco, CA, USA; Los Angeles, CA, USA
Salary: $90,000 – $175,000
Type: Full-Time
Experience Level: Mid Level, Senior

Requirements

  • 3–5 years in data science, machine learning, or quant research; comfortable owning end-to-end projects.
  • Fluent in Python, SQL, and modern ML tooling (scikit-learn, XGBoost, Airflow or similar).
  • Familiar with sports data and the economics of fantasy / sportsbook markets; a plus if you’ve built pricing or risk models.
  • Systems thinker who anticipates failure modes and edge cases in real-time environments.
  • Willing to flex hours around major game slates; we’re a remote-first team and optimize schedules for coverage & work-life balance.

Responsibilities

  • Feature engineering & model tuning – Own the pipelines that transform raw bet, player, and market data into features for our pricing and exposure models (BigQuery + SQLX, Python, Pandas).
  • Predictive modeling – Train, validate, and deploy supervised and probabilistic models that forecast player performance, market volatility, and user value (a minimal sketch follows this list).
  • Guardrail automation – Ship rule-based limiters and anomaly-detection jobs that run every few seconds, flagging and throttling outlier exposure before it becomes tail risk (see the second sketch after this list).
  • Dashboards & alerting – Build Grafana dashboards and SQLX reports that surface live liability, promo uptake, and top-line KPIs to trading and exec stakeholders.
  • Light on-call rotation – During peak sports windows, respond to automated alerts and, if necessary, execute a manual override (price suspension / limit change); under 2 hrs/week on average.
  • Cross-functional collaboration – Pair with Backend & Data Engineers to productionize models, and with Product to iterate on game mechanics and promos.
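
For a flavor of the modeling work, here is a minimal sketch of a probabilistic player-performance forecast using quantile gradient boosting in scikit-learn. The features (minutes, usage rate), the synthetic data, and the chosen quantiles are illustrative assumptions, not our production model.

```python
# A minimal, illustrative sketch of probabilistic player-performance
# forecasting with quantile gradient boosting. Features, data, and
# quantiles are assumptions, not the production model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2_000
X = np.column_stack([
    rng.normal(28, 4, n),   # assumed feature: minutes played
    rng.normal(18, 6, n),   # assumed feature: usage rate
])
# Synthetic target: fantasy points as a noisy function of the features.
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 4, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One model per quantile yields a point forecast plus an uncertainty band,
# which matters more for pricing than a single point estimate.
models = {
    q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X_train, y_train)
    for q in (0.1, 0.5, 0.9)
}
lo, med, hi = (models[q].predict(X_test[:1])[0] for q in (0.1, 0.5, 0.9))
print(f"forecast: {med:.1f} pts (80% interval {lo:.1f}-{hi:.1f})")
```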
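
And a second sketch of the guardrail idea: a polling job that flags markets whose net exposure drifts far outside its recent rolling behavior. The z-score threshold, window size, and the fetch_exposure() stub are assumptions standing in for the real BigQuery feed and limiter service.

```python
# A minimal, illustrative guardrail job: flag markets whose net exposure
# deviates sharply from its recent rolling behavior. Thresholds, column
# names, and fetch_exposure() are assumptions, not the real pipeline.
import numpy as np
import pandas as pd

Z_THRESHOLD = 4.0   # assumed cut-off for outlier exposure
WINDOW = 60         # assumed look-back, one row per polling tick

def fetch_exposure() -> pd.DataFrame:
    """Stand-in for a per-market net-exposure pull; injects one spike."""
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "market_id": np.repeat(["mkt_a", "mkt_b"], 120),
        "net_exposure": rng.normal(10_000, 1_500, 240),
    })
    df.loc[230, "net_exposure"] = 60_000  # simulated runaway liability
    return df

def flag_outliers(df: pd.DataFrame) -> pd.DataFrame:
    g = df.groupby("market_id")["net_exposure"]
    mean = g.transform(lambda s: s.rolling(WINDOW, min_periods=10).mean())
    std = g.transform(lambda s: s.rolling(WINDOW, min_periods=10).std())
    z = (df["net_exposure"] - mean) / std
    return df.assign(z=z)[z.abs() > Z_THRESHOLD]

if __name__ == "__main__":
    for market_id in flag_outliers(fetch_exposure())["market_id"].unique():
        # In production this would call a limiter service; here we just log.
        print(f"throttle candidate: {market_id}")
```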

Preferred Qualifications

  • Experience with BigQuery, Looker, dbt, or similar analytics stacks.
  • Exposure to real-time streams (Kafka, Pub/Sub) and event-driven architectures.
  • Prior work building user-level segmentation or LTV models.