Matterport – Senior Data Engineer
| Company | CoStar Group |
| --- | --- |
| Location | Irvine, CA, USA |
| Salary | $127,500 – $190,000 |
| Type | Full-Time |
| Degrees | Bachelor’s |
| Experience Level | Senior |
Requirements
- Bachelor’s degree (or equivalent experience) in Computer Science, Computer Engineering, Mathematics, or an adjacent field from an accredited, not-for-profit university or college
- A track record of commitment to prior employers
- Working knowledge of data models within an Enterprise Data Lakehouse/Warehouse
- Advanced SQL, Excel, and Python skills
- 5+ years of data engineering experience
- Familiarity with machine learning concepts, tools, and libraries (e.g., TensorFlow, PyTorch, Scikit-learn, MLflow)
- Ability to collaborate with cross-functional teams, working closely with business analysts, data scientists, DBAs, and DevOps engineers to ensure successful data platform implementations
- Knowledge of business intelligence software (e.g., Looker, Power BI, Tableau) and web analytics tools (e.g., Google Analytics, Google Tag Manager, BigQuery)
- Ability to retrieve, synthesize, and present critical data in structures that are immediately useful for answering specific ad-hoc questions
Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes using Airflow, Snowflake, S3, Databricks, and other AWS services, as well as PostgreSQL, MySQL, and MS SQL (see the Airflow sketch after this list)
- Implement and optimize Snowpark jobs, data transformations, and Snowpipe data ingestion workflows
- Build and deploy code using Snowpark Python: process incremental data, orchestrate with Snowflake tasks, and deploy via a CI/CD pipeline (see the Snowpark sketch after this list)
- Understand various data domains and be able to translate them through data transformations to suit various business needs
- Oversee automated data infrastructure: Collaborate with software engineering and product teams to streamline data tracking in our products.
- Integrate data from different sources into existing data models via cross-domain data stitching
- Develop and maintain secure data warehouses: Implement data models, data quality checks, and governance practices to ensure reliable and accurate data (see the data quality sketch after this list)
- Perform hands-on data coding, optimization, and query performance tuning
- Liaise with key company teams (DBA, DevOps, SecOps…) for data accessibility and analysis
- Mentor and manage engineers and analysts in the normal execution of their responsibilities
- Successfully implement software development projects to specification and within scheduled timelines and budget parameters
- Leverage AWS DevOps and CI/CD best practices to automate the deployment and management of data pipelines and infrastructure
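
As a rough illustration of the pipeline work described above, here is a minimal Airflow DAG sketch in the stack this role names: extract from a source system, stage to S3, load into Snowflake. The DAG id, task names, schedule, and stubbed callables are hypothetical, not taken from the posting.

```python
# Minimal Airflow 2.x DAG sketch: extract -> stage to S3 -> load to Snowflake.
# All ids, names, and the stubbed bodies below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    # Pull incremental rows from the source system and stage them in S3
    # (stubbed here; in practice this would use a DB hook and boto3).
    pass


def load_to_snowflake(**context):
    # COPY the staged files from S3 into a Snowflake table
    # (stubbed here; in practice this would use a Snowflake hook or operator).
    pass


with DAG(
    dag_id="example_daily_etl",  # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    extract >> load  # load runs only after extract succeeds
```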
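
For the Snowpark bullet, a minimal sketch of incremental processing with the Snowpark Python API, assuming hypothetical RAW_EVENTS and CURATED_EVENTS tables with a LOAD_TS column; in practice a Snowflake task would invoke this on a schedule and a CI/CD pipeline would deploy the code artifact.

```python
# Minimal Snowpark Python incremental-processing sketch. RAW_EVENTS,
# CURATED_EVENTS, LOAD_TS, and the connection parameters are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, max as max_


def run_incremental(session: Session) -> None:
    # High-water mark: only process rows newer than what the target holds.
    watermark = (
        session.table("CURATED_EVENTS")
        .select(max_(col("LOAD_TS")).alias("WM"))
        .collect()[0]["WM"]
    )
    new_rows = session.table("RAW_EVENTS")
    if watermark is not None:
        new_rows = new_rows.filter(col("LOAD_TS") > watermark)
    # Append only the new slice; a Snowflake TASK or stored procedure would
    # run this on a schedule.
    new_rows.write.mode("append").save_as_table("CURATED_EVENTS")


if __name__ == "__main__":
    # Credentials would come from a secrets manager in practice.
    session = Session.builder.configs(
        {"account": "...", "user": "...", "password": "..."}
    ).create()
    run_incremental(session)
```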
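
And for the data quality bullet, a minimal sketch of a check that fails loudly on an empty load or NULL keys, again against the hypothetical CURATED_EVENTS table; the column name and thresholds are illustrative only.

```python
# Minimal data quality check sketch: raise when the curated table is empty
# or carries NULL keys. Table and column names are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count, lit, when


def check_curated_events(session: Session) -> None:
    df = session.table("CURATED_EVENTS")
    stats = df.agg(
        count(lit(1)).alias("ROW_COUNT"),  # total rows
        count(when(col("EVENT_ID").is_null(), 1)).alias("NULL_KEYS"),  # NULL ids
    ).collect()[0]
    if stats["ROW_COUNT"] == 0:
        raise ValueError("CURATED_EVENTS is empty; upstream load may have failed")
    if stats["NULL_KEYS"] > 0:
        raise ValueError(f"{stats['NULL_KEYS']} rows have a NULL EVENT_ID")
```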
Preferred Qualifications
- Experience with Postgres/RDBMS
- Working knowledge of data visualization tools such as Looker and Power BI
- Experience working with and managing large consumer datasets to derive insights