Data Engineer II

Company: McKinsey & Company
Location: Atlanta, GA, USA
Salary: Not Provided
Type: Full-Time
Degrees: Bachelor’s
Experience Level: Mid Level, Senior

Requirements

  • Undergraduate degree required; advanced graduate degree (e.g., MBA, PhD) or equivalent work experience preferred
  • 5+ years of corporate and/or professional services experience
  • Excellent organizational skills, including the ability to initiate tasks independently and see them through to completion
  • Strong communication skills, both verbal and written, in English and local office language(s), with the ability to adjust your style to suit different perspectives and seniority levels
  • Proficient in rational decision-making based on data, facts, and logical reasoning
  • Must-have technical skills, with hands-on experience (see the sketch after this list): AWS – Glue, S3; Snowflake (SQL, data sharing, Snowpark, views, functions/stored procedures, Snowpipe); Python, PySpark, Pandas; Git/GitHub, GitHub Actions; Terraform; designing and implementing ETL pipelines with an understanding of common transformations
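
For illustration only, below is a minimal sketch of the kind of ETL pipeline work these requirements describe: reading raw data from S3 with PySpark, applying common transformations, and writing curated output back to S3. The bucket names, paths, and column names are hypothetical placeholders, and this is one possible approach rather than the team's actual stack or code.

```python
# A minimal sketch of an S3-to-S3 ETL transformation in PySpark.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV files landed in an S3 prefix (hypothetical path).
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: type casting, deduplication, and a derived column --
# examples of the "common transformations" the posting mentions.
curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet to a curated S3 prefix (hypothetical path).
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```

In practice, a pipeline like this might run as an AWS Glue job and feed Snowflake (for example via Snowpipe), per the tooling listed above.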

Responsibilities

  • Be a leading contributor to the creation and maintenance of data pipelines and database views that represent required information sets
  • Build and test sophisticated end-to-end data transformation pipelines that address the specific challenges of different data sources and data types (e.g., master, transaction, reference, and metadata)
  • Be technically hands-on and comfortable writing code to meet business requirements
  • Design solutions and use SQL, Python, PySpark, or other programming tools to consume, transform, and write data according to its processing requirements
  • Follow and help to enforce coding and data best practices with the team
  • Develop, promote, and use reusable patterns for consuming, transforming, and storing different kinds of data from diverse sources (see the sketch after this list)
  • Apply a quality- and security-first mindset and ensure these principles are upheld by leading by example
  • Stay abreast of new technologies and trends and meaningfully investigate their potential
  • Act as a thought leader within the team when assessing the technical feasibility of building solutions around a conceptual idea
  • Consistently act as an enabler when working with distributed development teams
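
As one way to read the "reusable patterns" responsibility above, here is a short Python sketch of a reusable, composable transformation step pattern. The function names, columns, and data are hypothetical illustrations under assumed conventions, not an actual McKinsey pattern.

```python
# A hedged sketch of a reusable transformation pattern in pandas.
# Each step is a plain function DataFrame -> DataFrame, so the same steps can
# be composed into different pipelines and unit-tested in isolation.
# All names and columns are hypothetical.
from __future__ import annotations

from typing import Callable, Iterable
import pandas as pd

Transform = Callable[[pd.DataFrame], pd.DataFrame]

def normalize_column_names(df: pd.DataFrame) -> pd.DataFrame:
    """Lower-case and snake_case column names."""
    return df.rename(columns=lambda c: c.strip().lower().replace(" ", "_"))

def drop_exact_duplicates(df: pd.DataFrame) -> pd.DataFrame:
    """Remove fully duplicated rows."""
    return df.drop_duplicates()

def parse_dates(columns: Iterable[str]) -> Transform:
    """Return a step that parses the given columns as datetimes."""
    def step(df: pd.DataFrame) -> pd.DataFrame:
        out = df.copy()
        for col in columns:
            out[col] = pd.to_datetime(out[col], errors="coerce")
        return out
    return step

def run_pipeline(df: pd.DataFrame, steps: Iterable[Transform]) -> pd.DataFrame:
    """Apply transformation steps in order."""
    for step in steps:
        df = step(df)
    return df

# Example usage with hypothetical data.
if __name__ == "__main__":
    raw = pd.DataFrame(
        {"Order ID": [1, 1, 2],
         "Order Date": ["2024-01-01", "2024-01-01", "2024-01-02"]}
    )
    curated = run_pipeline(
        raw,
        [normalize_column_names, drop_exact_duplicates, parse_dates(["order_date"])],
    )
    print(curated)
```

Keeping each step a pure function makes the pattern easy to promote across teams and, with minor changes, to port to Snowpark or PySpark DataFrames.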

Preferred Qualifications

  • AWS – RDS, DynamoDB, Lambda, API Gateway, EC2, SageMaker
  • Airflow
  • Databricks
  • Kafka
  • Iceberg, Flink
  • Snowflake Cortex
  • Data warehousing
  • Data lakes/lakehouses
  • Data modeling