Senior Data Engineer – Customer Data and Insights

Company: Duke Energy
Location: Charlotte, NC, USA
Salary: Not Provided
Type: Full-Time
Degrees: Bachelor’s
Experience Level: Senior

Requirements

  • Bachelor’s degree in Management of Information Systems, Engineering, Mathematics, Computer Science, or another related field
  • In addition to the required degree, three (3) or more years of related work experience
  • In lieu of the Bachelor’s degree AND three (3) or more years of related work experience listed above: High School diploma/GED AND five (5) or more years of related work experience

Responsibilities

  • Support or collaborate with application developers, database architects, data analysts and data scientists to ensure optimal data delivery architecture throughout ongoing projects/operations.
  • Design, build, and manage analytics infrastructure that can be utilized by data analysts, data scientists, and non-technical data consumers, enabling the big data platform’s analytics functions.
  • Develop, construct, test, and maintain architectures, such as databases and large-scale processing systems, that help analyze and process data as the Analytics organization requires.
  • Develop highly scalable data management interfaces, as well as software components by employing programming languages and tools.
  • Work closely with data subject matter experts to determine which data management systems are appropriate, and with data scientists to determine which data is needed.
  • Work with stakeholders to understand the information needs and translate these into technical solutions.
  • Work closely with a team of Data Science staff to take existing or new models and convert them into scalable analytical solutions.
  • Design, document, build, test and deploy data pipelines that assemble large complex datasets from various sources and integrate them into a unified view.
  • Identify, design, and implement operational improvements: automating manual processes, data quality checks, error handling and recovery, re-designing infrastructure as needed.
  • Create data models that will allow analytics and business teams to derive insights about customer behaviors.
  • Build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications.
  • Obtain data from the System of Record and establish batch or real-time data feeds to provide analysis in an automated fashion.
  • Develop techniques supporting trending and analytic decision-making processes.
  • Apply technologies for responsive front-end experience.
  • Ensure systems meet business requirements and industry practices.
  • Research opportunities for data acquisition and new uses for existing data.
  • Develop data set processes for data modeling, mining and production.
  • Integrate data management technologies and software engineering tools into existing structures.
  • Employ a variety of languages and tools (e.g. scripting languages) for integration.
  • Recommend ways to improve data reliability, efficiency and quality.

Preferred Qualifications

  • Experience architecting, modeling, and implementing a solution for multiple work management systems, including SS9, Maximo, and Field Collection System, in AWS Redshift
  • Proficiency in writing code.
  • Experience with relational databases and query authoring (SQL), as well as familiarity with a variety of RDBMSs.
  • Knowledge of building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Knowledge of open analytics platforms (such as the Hadoop ecosystem)
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Excellent verbal and written communication skills
  • Self-starter who works with minimal supervision and collaborates well in teams with diverse skill sets
  • Ability to comprehend customer requests and provide the correct solution
  • Strong analytical mind to help take on complicated problems
  • Desire to resolve issues and dig into potential problems
  • Familiarity with the Agile Methodology
  • Strong analytic skills related to working with unstructured datasets.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including SQL Server and Cassandra.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
  • Understanding of distributed computing principles.