Senior Data Science Workbench Analyst – Engineer – Databricks or Snowflake

Company: M&T Bank
Location: Bridgeport, CT, USA
Salary: $119,400.84 – $199,001.40
Type: Full-Time
Degrees: Bachelor’s, Master’s
Experience Level: Senior

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or related field.
  • 5+ years of experience in cloud-based infrastructure management and AI/ML workbench administration.
  • Expertise in Databricks or Snowflake and in cloud-based data platforms (AWS, Azure, GCP).
  • Strong programming skills in Python, SQL, and automation scripting (Terraform, Bash, or similar).
  • Experience with workflow orchestration tools such as Apache Airflow or Prefect.
  • Deep understanding of cloud security, IAM roles, and governance best practices.
  • Proven ability to lead projects and mentor junior engineers.

Responsibilities

  • Lead automation initiatives to improve the scalability and efficiency of AI/ML workflows.
  • Architect and maintain highly available cloud-based data science environments on platforms like Databricks and Snowflake.
  • Enhance monitoring and observability for AI/ML infrastructure, ensuring performance optimization and cost efficiency.
  • Improve and enforce security and compliance measures across data science environments.
  • Develop and refine CI/CD pipelines to streamline model deployment and management in production.
  • Collaborate with data scientists and engineers to drive innovation and operational excellence.
  • Mentor junior engineers by providing guidance on infrastructure best practices, cloud security, and automation.
  • Optimize cloud cost management strategies to ensure efficient resource utilization.

Preferred Qualifications

  • Certifications: AWS Solutions Architect Professional, Databricks Advanced Developer, Snowflake Advanced Architect.
  • Experience integrating machine learning models into production pipelines.
  • Proficiency in Kubernetes, Docker, and containerized AI/ML workloads.
  • Experience working with real-time data streaming technologies such as Kafka or Kinesis.
  • Strong knowledge of FinOps for cloud cost optimization.