Specialist – Data Engineer – Annuity Data Platform
| Company | Nationwide |
|---|---|
| Location | Des Moines, IA, USA; Scottsdale, AZ, USA; Columbus, OH, USA |
| Salary | $93,500 – $139,500 |
| Type | Full-Time |
| Degrees | |
| Experience Level | Mid Level, Senior |
Requirements
- Hands-on experience with ETL tools such as Talend (Open Studio/Cloud).
- Strong proficiency in SQL, including stored procedures, indexing, and query optimization.
- Experience with relational database platforms such as Oracle, SQL Server, PostgreSQL, and Snowflake.
- Exposure to job orchestration/scheduling tools like Control-M, Apache Airflow, or equivalent.
- Experience working in cloud environments (e.g., AWS, Azure).
- Experience containerizing applications and orchestrating them with Kubernetes.
- Familiarity with data modeling concepts, both logical and physical.
Responsibilities
- Design, develop, and maintain ETL pipelines using Talend for large-scale data integration projects.
- Optimize complex SQL queries to extract, transform, and load data across various RDBMS platforms (e.g., Oracle, SQL Server, PostgreSQL, Snowflake).
- Collaborate with data architects and analysts to translate business requirements into scalable data solutions.
- Implement and monitor job schedules, handling job failures with appropriate recovery strategies.
- Participate in peer code reviews to ensure adherence to development best practices and coding standards.
- Provide basic to moderate technical consultation on data product projects by analyzing end-to-end data product requirements and existing business processes to lead the design, development, and implementation of data products.
- Produce data building blocks, data models, and data flows for varying client demands, such as dimensional data, standard and ad hoc reporting, data feeds, dashboard reporting, and data science research and exploration.
- Apply secure software and systems engineering practices throughout the delivery lifecycle to ensure data and technology solutions are protected from threats and vulnerabilities.
- Translate business data stories into a technical story breakdown structure and work estimate so that value and fit for a schedule or sprint can be determined.
- Create simple to moderate business-user access methods for structured and unstructured data using techniques such as mapping data to a common data model, NLP, AI, statistical computations, transforming data as necessary to satisfy business rules, and validating data content.
- Assist the enterprise DevSecOps team and other internal organizations with CI/CD best practices, using tools such as JIRA, Jenkins, and Confluence.
- Implement production processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Develop and maintain scalable data pipelines for both streaming and batch requirements, and build out new API integrations to support continuing increases in data volume and complexity.
- Write and perform data unit/integration tests for data quality. Using input from business requirements/stories, create and execute test data and scripts to validate that quality and completeness criteria are satisfied, and build automated testing programs and data that are reusable for future code changes.
- Practice code management and integration with Git repositories, following engineering principles and practices.
Preferred Qualifications
- Familiarity with data warehousing concepts and dimensional modeling.
- Knowledge of Python or shell scripting for automation and auxiliary data tasks.
- Experience applying GenAI tools for rapid development and/or analysis.