Senior Data Engineer
| Company | Cricut |
|---|---|
| Location | South Jordan, UT, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | Bachelor’s |
| Experience Level | Senior, Expert or higher |
Requirements
- Bachelor’s degree in Computer Science or a related field
- 7+ years of experience leading modern data platforms and solving business problems using data and advanced analytical methods
- Experience with cloud data stack solutions such as AWS, Redshift, Snowflake, and Airflow
- Experience handling real-time event data feeds is required
- Experience writing production-level code for data pipelines and real-time applications, and contributing to a large code repository
- Significant experience working with structured and unstructured data at scale and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMSes and data warehouses.
Responsibilities
- Design, build, and integrate data from various sources
- Manage big data pipelines that are easily accessible and performance-optimized
- Develop, architect, and implement core data engineering and data warehouse frameworks
- Design and build a data quality monitoring framework to ensure data completeness and data integrity
- Champion security and governance and ensure the data engineering team adheres to all company guidelines
- Lay a solid foundation for integrating new data sources
- Provide direct and ongoing leadership for a team of individual contributors designing, building, and maintaining highly scalable, predictable, and modern data pipelines
- Partner with the front-end team to design efficient data models
- Work closely with the Business Intelligence and Data Science teams to provide the data platform as a service
- Work with the Data Science team to deploy ML models
Preferred Qualifications
- Strong experience building data warehouse, data mart, and analytics solutions
- Strong experience in data modeling, mapping, analysis, and design
- Strong experience with relational and NoSQL databases
- Expertise in developing end-to-end data pipelines, from data collection, through data validation and transformation, to making the data available to processes and stakeholders
- Expertise in distributed data processing frameworks such as Apache Spark, Flink, or similar
- Expertise in OLAP databases such as Snowflake or Redshift
- Expertise in stream processing systems such as Kafka, Kinesis, Pulsar, or similar
- Organized and capable of managing multiple complex projects on tight deadlines without compromising quality, and comfortable working with dynamically evolving requirements
- Experience in Agile project management methodologies
- Great communication skills, with the ability to work with senior management, business leads, analysts, and product managers, and to lead strategic cross-company projects
- Outstanding verbal and written communications skills with the ability to listen, articulate positions, and influence outcomes beyond direct areas of ownership
- Consistently demonstrates and follows high standards of integrity in business decision-making