Director – Data Engineer – PGIM Asset Management – Hybrid

Company: Prudential Financial
Location: Newark, NJ, USA
Salary: $150,000 – $190,000
Type: Full-Time
Degrees: Bachelor’s
Experience Level: Senior, Expert or higher

Requirements

  • 15+ years of experience building data pipelines in Java/Scala
  • 10+ years of experience working in AWS, especially services such as S3, EMR, Lambda, AWS Glue, and Step Functions
  • Strong knowledge of Azure and Microsoft Fabric environments
  • 5+ years of experience with Spark
  • 10+ years of financial domain experience, including prior experience working with BlackRock Aladdin and its interface files
  • Ability to lead a team of developers
  • Experience working in an Agile environment with a Scrum Master/Product Owner, and the ability to deliver
  • Strong experience with data lakes, data marts, and data warehouses
  • Ability to communicate status and challenges and align with the team
  • Demonstrated ability to learn new skills and work as part of a team

Responsibilities

  • Build new applications and enhance existing ones in preparation for a newly launched business
  • Partner with business teams and other PGIM teams to assess business needs and transform them into scalable applications
  • Build and maintain code to manage data received from heterogeneous sources, including web-based sources, internal/external databases, and flat files in varied formats (binary, ASCII)
  • Help build the new enterprise data warehouse and maintain the existing one
  • Design and support effective storage and retrieval of very large internal and external data sets, and think ahead about the convergence strategy with our AWS cloud implementation
  • Assess the impact of scaling up and scaling out, and ensure sustained data management and data delivery performance
  • Build interfaces to support evolving and new applications and to accommodate new data sources and data types

Preferred Qualifications

  • Experience working with Hadoop or other big data platforms
  • Exposure to deploying code through a deployment pipeline
  • Good exposure to containers and container services such as Docker or ECS
  • Direct experience supporting multiple business units on foundational data work, and a sound understanding of capital markets within Fixed Income
  • Knowledge of Jira, Confluence, the SAFe development methodology, and DevOps
  • Excellent analytical and problem-solving skills, with the ability to think quickly and offer alternatives, both independently and within teams
  • Proven ability to work quickly in a dynamic environment
  • Bachelor’s degree in Computer Science or a related field