Senior Architect – Data
| Company | Credera |
| --- | --- |
| Location | Dallas, TX, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | Bachelor’s, Master’s |
| Experience Level | Senior, Expert or higher |
Requirements
- 8+ years of technical, hands-on experience building, optimizing, and implementing data pipelines and architecture
- Experience leading teams to wrangle, explore, and analyze data to answer specific business questions and identify opportunities for improvement
- Strong communication and interpersonal skills, with the ability to engage customers at both a business and a technical level
- Deep understanding of data governance and data privacy best practices
- Experience incorporating AI and code-assistance tooling into everyday workflows to improve efficiency
- Degree in Computer Science, Computer Engineering, Engineering, Mathematics, Management Information Systems or a related field of study
Responsibilities
- Lead teams in implementing modern data architecture, data engineering pipelines, and advanced analytical solutions
- Act as the primary architect and technical lead on projects to scope and estimate work streams
- Architect and model technical solutions to meet business requirements
- Serve as a technical expert in client communications
- Mentor junior project team members
- Participate in design sessions, build data structures for an enterprise data lake or statistical models for machine learning algorithms, coach junior team members, and manage technical backlogs and release management tools
- Seek out new business development opportunities at existing and new clients
Preferred Qualifications
- Recent technical knowledge of programming languages (e.g., Python, Java, C++, Scala)
- SQL and NoSQL databases (MySQL, DynamoDB, Cosmos DB, Cassandra, MongoDB, etc.)
- Data pipeline and workflow management tools (Airflow, Dagster, AWS Step Functions, Azure Data Factory, etc.)
- Stream-processing systems (e.g., Storm, Spark Streaming, Pulsar, Flink)
- Data warehouse design (Databricks, Snowflake, Delta Lake, Lake Formation, Iceberg)
- MLOps platforms (SageMaker, Azure ML, Vertex AI, MLflow)
- Container orchestration (e.g., Kubernetes, Docker Swarm)
- Metadata management tools (Collibra, Atlas, DataHub, etc.)
- Experience with the data platform components on one or more of the following cloud service providers: AWS, Google Cloud Platform, Azure