Data Engineer

Company: Voya Financial
Location: New York, NY, USA
Salary: $113,250 – $141,560
Type: Full-Time
Degrees: Bachelor’s
Experience Level: Senior

Requirements

  • Bachelor’s Degree in Computer Science, MIS, Engineering or a directly related field.
  • 5+ years of experience in data engineering, with a strong background in building and maintaining data pipelines and ELT processes to ingest data from external data providers.
  • Strong proficiency in SQL and experience with relational database design.
  • Strong proficiency in Python.
  • Experience with Snowflake and Snowpark.
  • Experience with AWS or Azure.
  • Experience with orchestration tools such as Airflow, Prefect, or Dagster.
  • Knowledge of Investment Management and financial data.
  • Excellent communication skills with the ability to collaborate effectively with cross-functional business and technology groups.
  • Commitment to learning and curiosity about new tools and technologies.

Responsibilities

  • Design, develop, and maintain scalable and efficient data pipelines and ELT processes.
  • Parse, analyze, and understand datasets with a focus on application in business use cases.
  • Manage data storage solutions, including design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions and break down technical problems.
  • Perform data reconciliations, validations, and quality checks across various data sources and systems, and identify enrichment opportunities.
  • Develop new systems and maintain and modify existing systems as required.
  • Provide support in system acceptance testing and validation activities.
  • Analyze project requirements and provide technical and functional recommendations accordingly.
  • Lead junior data engineers, fostering a culture of continuous learning and improvement, and providing guidance and support in their professional development.
  • Additional responsibilities, as required.

Preferred Qualifications

  • Experience with LLMs and NLP.
  • Experience with Databricks and PySpark.
  • Experience with data testing and quality frameworks such as Soda, Great Expectations, or dbt tests.
  • Experience with dbt.
  • Experience with Market Data and/or Alternative Data.
  • Experience with scheduling tools such as Tidal, Control-M, or cron.