Senior Data Engineer
| Company | Recorded Future |
| --- | --- |
| Location | Boston, MA, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | Bachelor’s, Master’s |
| Experience Level | Senior |
Requirements
- 4+ years of Python programming
- 2+ years of experience with cloud computing tools, e.g., AWS, Azure, or Google Cloud
- Experience writing scalable, production-grade applications and ETL/ELT pipelines
- Efficient & accurate problem-solving skills, including the ability to debug both software and data
- Proven ability to analyze data and apply statistical techniques to draw accurate, impactful conclusions
- Proven success in delivering projects from design and implementation to release
- Excellent attention to detail & ability to work independently while delivering high-quality results
- Excellent written & verbal communication when collaborating with colleagues across various locations and time zones, designing technical approaches, and writing documentation
- Eagerness to continue learning and to teach new skills to team members in order to raise the bar across the team
Responsibilities
- Work with the Graph Quality team to align, analyze, and ingest asset maps into the Security Intelligence Graph
- Develop, productize, monitor, and maintain data pipelines to analyze and ingest data at scale
- Build tools and APIs to facilitate access to data and analytics developed from the intelligence graph
- Analyze and explain patterns in data to drive business-critical decisions
- Create technical project plans and drive the successful execution of projects, with input from our Product team and other developers on the team
- Collaborate with Data Scientists, Data Engineers, and business leaders to develop and refine technical solutions
- Onboard and guide junior members of the team
- Assist in setting team goals, planning sprints, and leading Agile scrum meetings
Preferred Qualifications
- Familiarity with both batch and streaming pipelines
- Familiarity with any of the following: message buses (e.g., Kafka, RabbitMQ), NoSQL databases (e.g., MongoDB, AWS Neptune, Neo4j), Elasticsearch
- Bachelor’s/Master’s degree in Computer Science, Mathematics, Statistics, Engineering, or equivalent experience
- Exposure to ML approaches, including experience productizing ML models
- Experience developing REST APIs with Python frameworks (e.g., Flask, Django, FastAPI)
- Leadership experience, with a track record of presenting information to stakeholders with varying levels of technical expertise