Manager – Dev Ops & Data Engineering

Company: PatientPoint
Location: Cincinnati, OH, USA
Salary: Not Provided
Type: Full-Time
Experience Level: Senior, Expert or higher

Requirements

  • 7+ years of experience in data/DevOps engineering positions
  • 3+ years in a managerial or team leadership role developing data infrastructure and technology systems
  • Experience managing cloud data warehousing platforms (Snowflake, Google BigQuery) and ensuring cost efficiency
  • Prior experience implementing security best practices for data storage, processing, and access control
  • Significant experience with modern software development practices, including CI/CD, automated testing, and cloud-native architecture
  • Experience in engineering large-scale enterprise applications end-to-end, with a focus on Data Engineering and Insights
  • Proven ability to evaluate the current technology stack and advocate for strategic enhancements to optimize value delivery to the business
  • Experience working with modern data architectures and data transformation tooling
  • Experience with Snowflake or Google BigQuery required
  • Ability to ensure high system availability and quick resolution of incidents

Responsibilities

  • Lead and mentor a team of data and DevOps engineers, fostering a culture of growth and excellence
  • Define and execute the technical roadmap for data infrastructure and DevOps practices
  • Build and scale a high-performing team of data engineers to enhance data pipelines, develop new data sources, refine existing datasets, and improve data models as PatientPoint’s products evolve
  • Conduct regular performance evaluations, establish individual objectives, and deliver constructive feedback to support professional development
  • Drive team and organizational growth by actively participating in the recruitment and onboarding of top-tier data engineering talent
  • Collaborate with peers, leadership, and stakeholders to define project scope, objectives, and deliverables that align with organizational goals
  • Develop and monitor project timelines, ensuring milestones are achieved and deliverables are met on schedule
  • Partner with cross-functional teams to ensure seamless integration of data solutions across the organization
  • Work closely with peers, data modelers, and the QA team to maintain scalable, reliable, and high-performance data pipelines
  • Own and maintain the CI/CD framework and pipelines for data and application deployments
  • Act as the subject matter expert for the organization’s data platform and pipelines, contributing to their development and strategic direction
  • Oversee the creation of automated data quality assurance processes and operational support strategies to maintain data reliability

Preferred Qualifications

  • BA/BS in Computer Science, Computer Engineering, Information Technology, Management Information Systems, Software Engineering, Software Development, Data Engineering or related field
  • Experience with collaboration tools (Teams, Slack, Jira, Confluence)
  • Prior experience designing and implementing Snowflake role-based access control (RBAC)
  • Experience in managing SOWs and vendor contracts
  • Experience with dbt
  • Experience with data management and data quality concepts
  • Experience with Terraform
  • Expertise in common programming languages and orchestration tooling, such as SQL, Python, and Airflow