Data Engineer Senior
| Company | Associated Bank |
| --- | --- |
| Location | Madison, WI, USA; Chicago, IL, USA; St. Paul, MN, USA |
| Salary | $93,520 – $160,320 |
| Type | Full-Time |
| Degrees | Bachelor’s |
| Experience Level | Senior |
Requirements
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field, or an equivalent combination of education and experience.
- 5–6 years of data engineering experience focused on using ETL or ELT patterns (or related technologies) to build automated data pipelines.
- 3–5 years of experience with cloud platforms (AWS, Azure, GCP, Snowflake; Snowflake preferred), plus Python programming or advanced SQL coding and performance-tuning skills.
Responsibilities
- Securely design, develop, test, and deploy streaming and batch ingestion methods and pipelines across a variety of data domains leveraging programming languages, application integration software, messaging technologies, REST APIs and ETL/ELT tools.
- Ensure data pipelines are high-throughput, low-latency, and fault-tolerant by applying best practices to data mapping, code development, error handling, and automation.
- As part of an agile team, design, develop and maintain an optimal data pipeline architecture using both structured data sources and big data for both on-premises and cloud-based environments.
- Develop and automate ETL/ELT code using scripting languages and ETL tools to support all reporting and analytical data needs.
- Following DataOps best practices, enable orchestration of data, tools, environments, and code.
- Design and build dimensional data models to support the data warehouse initiatives.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data pipeline performance, and redesigning infrastructure for greater scalability and access to information.
- Participate in requirements gathering sessions to distill technical requirements from business requests.
- Collaborate with business partners to productionize, optimize, and scale enterprise analytics.
Preferred Qualifications
- Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Thorough understanding of relational, columnar, and NoSQL database architectures and industry best practices for development.
- Understanding of dimensional data modeling for designing and building data warehouses.
- Experience with Infrastructure as Code (IaC) tools such as Terraform.
- Experience with big data streaming technologies such as Kafka.
- Experience parsing data formats such as XML and JSON and leveraging external APIs.