
Senior – Data Engineer
Company | Walmart
---|---
Location | Sunnyvale, CA, USA
Salary | $117,000 – $234,000
Type | Full-Time
Degrees | Bachelor’s, Master’s
Experience Level | Senior
Requirements
- Experience with Google Cloud, SQL, BigQuery, Hive/Spark, Unix scripting, Python, Tableau, and Looker.
- 5+ years of experience working in a high-growth analytics team.
- 2+ years of experience with data visualization tools such as Tableau or Looker.
- 2+ years of programming experience with at least one modern object-oriented language such as Python, Scala, or Java.
- Fluency in SQL, along with an understanding of statistical analysis and the common pitfalls of data analysis. A basic understanding of query logic is a MUST.
- Experience parsing, cleansing, and querying unstructured data using Hive/Spark (see the PySpark sketch after this list).
- Experience with optimization and automation using scripting.
- Strong problem-solving skills with an emphasis on product development.
- Experience developing at scale on cloud platforms such as Google Cloud Platform (preferred), AWS, Azure, or Snowflake.
- Experience designing data engineering solutions using open-source and proprietary cloud data pipeline tools such as Airflow, Glue, and Dataflow.
- Excellent written and verbal communication skills for coordinating across teams.
- Bachelor’s degree in Statistics, Computer Science, Business Administration, or a related field. Relevant experience in product analytics is a PLUS.
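
To make the Hive/Spark requirement above concrete, here is a minimal PySpark sketch of parsing, cleansing, and querying unstructured log lines. The bucket path, log format, and field names are illustrative assumptions, not details from the posting.

```python
# Minimal PySpark sketch: parse semi-structured log lines, cleanse them,
# and query the result with Spark SQL. Paths and the log format are
# hypothetical, used only to illustrate the workflow.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("log_cleanse_demo").getOrCreate()

# Assume raw lines like: "2024-05-01 12:00:03 INFO checkout user=123 amount=59.99"
raw = spark.read.text("gs://example-bucket/raw/app-logs/*.log")  # hypothetical path

parsed = raw.select(
    F.regexp_extract("value", r"^(\S+ \S+)", 1).alias("event_ts"),
    F.regexp_extract("value", r"user=(\d+)", 1).cast("long").alias("user_id"),
    F.regexp_extract("value", r"amount=([\d.]+)", 1).cast("double").alias("amount"),
)

# Cleanse: drop rows where parsing failed, then remove exact duplicates.
clean = parsed.where(F.col("user_id").isNotNull() & (F.col("amount") > 0)).dropDuplicates()

# Query with SQL, mirroring how the same logic would run in Hive or BigQuery.
clean.createOrReplaceTempView("checkout_events")
daily = spark.sql("""
    SELECT to_date(event_ts) AS event_date,
           COUNT(DISTINCT user_id) AS buyers,
           ROUND(SUM(amount), 2) AS revenue
    FROM checkout_events
    GROUP BY to_date(event_ts)
    ORDER BY event_date
""")
daily.show()
```

The same parse-cleanse-query shape applies whether the data lands in Hive tables or BigQuery; only the read and write endpoints change.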
Responsibilities
- Consults with senior leadership and business partners to understand overall goals and functional objectives, decipher key business challenges, identify tactics to anticipate and mitigate issues, and develop success criteria.
- Builds and leads a team to execute on business area or category objectives.
- Owns complex projects end-to-end for a business area, category or prototype.
- Manages the development and implementation of prototypes and production-ready complex algorithms for complex projects.
- Collaborates with functional leadership and cross-functional teams to develop consultative solutions for complex projects for a business area, category or prototype.
- Crafts simple, dependable data pipelines and automated reporting to surface metrics (see the Airflow sketch after this list).
- Designs and implements big data architecture, data pipelines, reporting, dashboarding, data exploration, processes, and quality controls to enable self-service business intelligence.
- Explores data to find actionable insights and makes product recommendations through funnels, long-term trends, segmentation, and more as necessary.
- Designs, builds, and maintains the ETL pipeline of the analytical data warehouse.
- Provides and supports the implementation of business solutions applying data modeling, SQL and data warehousing skills.
- Applies fluency in SQL and a deep understanding of statistical analysis and the common pitfalls of data analysis.
- Develops automated scripts and builds reusable tools in any programming language, preferably Python.
- Identifies bottlenecks in data flow and analysis, then architects and implements solutions such as automating data-processing pipelines, optimizing data queries, and enhancing data visualization and interpretation.
- Keeps current on big data and data visualization technology trends; evaluates technologies, builds proofs of concept, and makes recommendations based on their merit.
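
As a companion to the pipeline and automated-reporting responsibilities above, here is a minimal Airflow sketch of a dependable daily ETL with an explicit validation step. It assumes Airflow 2.4+; the DAG id, task bodies, and target table name are hypothetical stand-ins.

```python
# Minimal Airflow sketch of a daily ETL: extract, validate, load.
# Assumes Airflow 2.4+ (for the `schedule` argument). All names are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull the partition for the logical date (stubbed here).
    print("extracting partition", context["ds"])


def validate(**context):
    # Fail fast on basic quality checks (row counts, nulls) so bad data
    # never reaches the warehouse; a dependable pipeline surfaces this early.
    print("running quality checks for", context["ds"])


def load(**context):
    # Load the cleansed partition into the analytical warehouse,
    # e.g. a BigQuery table such as analytics.daily_orders (hypothetical).
    print("loading partition", context["ds"])


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_validate >> t_load
```

Keeping validation as its own task means retries and failure alerts point at the quality check itself, which is what makes the pipeline dependable rather than merely scheduled.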
Preferred Qualifications
- Experience in data engineering, database engineering, business intelligence, or business analytics, including ETL tools and working with large data sets in the cloud.
- Master’s degree in Computer Science or a related field and 3 years’ experience in software engineering.
- Knowledge of implementing Web Content Accessibility Guidelines (WCAG) 2.2 AA standards, working with assistive technologies, and integrating digital accessibility seamlessly.