Data Operations Engineer II
| Company | AccuWeather |
|---|---|
| Location | State College, PA, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | Bachelor's |
| Experience Level | Mid Level, Senior |
Requirements
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 3–5 years of professional experience in data operations, scripting, and automation.
- Advanced proficiency in scripting languages such as Python, PowerShell, or similar.
- Proficiency in Object-Oriented Programming (OOP) languages such as C++, Java, or similar.
- In-depth understanding of data processing concepts and data integration tools.
- Experience managing relational (SQL) databases.
- Experience working with cloud platforms (e.g., AWS, Azure, GCP).
- Exceptional analytical and problem-solving skills with meticulous attention to detail.
- Excellent communication skills with the ability to collaborate effectively within a team environment.
- Ability to adapt to changing priorities and manage multiple tasks simultaneously.
- Proactive mindset with a strong willingness to learn and explore new technologies.
- Demonstrated ability to debug systems, tracing issues back to their source.
Responsibilities
- Develop, maintain, and optimize complex scripts using languages such as Python, Bash, or similar, to automate data collection and monitoring.
- Improve and maintain data infrastructure, ensuring the reliability and efficiency of data processes.
- Build and manage end-to-end monitoring systems and automated alert mechanisms to ensure the health and performance of data pipelines.
- Create and maintain comprehensive documentation of scripts, automation and monitoring workflows, data pipelines, and deployment procedures for knowledge sharing and future reference.
- Collaborate closely with cross-functional teams, providing support for scripting needs and contributing to the development of an effective automation strategy.
Preferred Qualifications
- Experience with DataDog for monitoring and performance tracking.
- Familiarity with Databricks for data engineering and analytics.
- Experience with version control systems (e.g., Git).
- Demonstrated scripting and automation experience in a data context.
- Experience working with deployment automation (CI/CD) tools and processes.