Senior Data Operations Engineer
| Company | Acrisure |
|---|---|
| Location | Austin, TX, USA |
| Salary | Not provided |
| Type | Full-Time |
| Degrees | Bachelor’s |
| Experience Level | Senior, Expert or higher |
Requirements
- Bachelor’s degree or equivalent experience preferred, along with a demonstrated commitment to continuing education and improvement.
- 8–10 years of hands-on experience working with Azure cloud services, including Data Factory, Databricks, Power BI, and SQL Server.
- Strong hands-on experience with data lake and Delta Lake concepts.
- Strong hands-on experience with Databricks Unity Catalog and its use with Delta tables.
- Ability to analyze, summarize, and characterize large or small data sets of varying fidelity or quality, and to identify and explain any insights or patterns within them.
- Experience with multi-source data warehouses.
- Experience with other cloud environments (e.g., GCP) a definite plus.
- Experience in data analytics and reporting, particularly with Power BI, a plus.
- Hands-on experience building logical and physical data models.
- Write SQL fluently, recognize and correct inefficient or error-prone SQL, and perform test-driven validation of SQL queries and their results.
- Create and share standards, best practices, documentation, and reference examples for data warehouse, integration/ELT, and end-user reporting systems.
- Apply a disciplined approach to testing software and data, identifying data anomalies, and correcting both data errors and their root causes.
- Should be well versed in Key Vault: creating, maintaining, and using secrets in both Databricks and ADF (a secret-scope sketch follows this list).
- Should be knowledgeable in stored procedures and functions and able to invoke them from ADF and Databricks, as this is a widely used practice at Acrisure.
- Should be familiar with the DevOps process for Azure artifacts and database artifacts.
- Should be well versed in ADF concepts such as chaining pipelines, passing parameters, and using the ADF and Databricks APIs to perform various activities (a REST sketch follows this list).
- Should be well versed in Agile and Scrum principles and procedures, and comfortable working in Jira.
- Proven ability to build, manage, and troubleshoot data pipelines in Databricks and ADF, ensuring reliability and performance.
- Deep familiarity with SQL Server administration, from query tuning and index management to backups and performance optimization.
- Strong knowledge of SQL Server security best practices, including patching, maintenance plans, and access control.
- Willingness to participate in an on-call rotation for Sev 0 alerts, including occasional weekend support as part of a shared schedule.
- Solid understanding of monitoring and observability tools like Azure Monitor, Log Analytics, and Key Vault for secure and visible operations.
- Comfortable managing secret rotation and credential security in cloud environments, with a strong awareness of data security best practices.
- Versatile in working with various structured and unstructured data formats and sources, including CSV, JSON, XML, binary column-oriented formats (Parquet, Arrow), databases, and Azure storage.
- Strong grasp of relational and dimensional data modeling, with experience translating models into efficient, scalable systems.
- Skilled in debugging, tuning, and enhancing workflows across Databricks notebooks and SQL-based transformations.
- Detail-oriented when it comes to documentation, producing clear, actionable runbooks and technical guides.
- Collaborative and communicative across functions, able to explain technical issues clearly and work effectively in cross-functional teams.
- Adaptable in fast-paced environments, with a proactive approach to ownership and accountability.
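As an illustrative, non-authoritative sketch of the Key Vault usage described above: reading a Key Vault-backed secret through a Databricks secret scope and using it for a JDBC read. The scope, key, host, and table names are hypothetical, and `dbutils` and `spark` exist only inside the Databricks runtime.

```python
# Minimal sketch: fetch a Key Vault-backed secret via a Databricks secret
# scope, then use it in a SQL Server JDBC read. All names are placeholders.
password = dbutils.secrets.get(scope="kv-prod", key="sqlserver-etl-password")

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example.database.windows.net:1433;databaseName=edw")
    .option("dbtable", "dbo.policies")
    .option("user", "etl_reader")
    .option("password", password)
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)
df.display()  # Databricks redacts the secret value itself in notebook output
```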
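And for the ADF API usage mentioned above, a hedged sketch of triggering a parameterized pipeline run through the documented `createRun` REST endpoint. The subscription, resource group, factory, pipeline, and parameter names are all placeholders, not Acrisure's actual resources.

```python
# Hypothetical sketch: start a parameterized ADF pipeline run via the
# Azure Management REST API. Resource names below are placeholders.
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/rg-data/providers/Microsoft.DataFactory"
    "/factories/adf-edw/pipelines/pl_load_claims/createRun"
    "?api-version=2018-06-01"
)

# The JSON body maps directly onto the pipeline's declared parameters.
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"load_date": "2025-01-01", "full_refresh": "false"},
)
resp.raise_for_status()
print(resp.json()["runId"])  # handle for polling the run's status later
```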
Responsibilities
- Onboard and curate data sources, including data preparation/ELT and modeling, to enable data consumption by analytics and AI teams.
- Act as a solution architect and technology leader, able to make decisions in the face of ambiguity and to solve difficult technical problems.
- Onboard, lead, and mentor junior and mid-level developers.
- Work closely with data source contacts, analysts, and the Product and Data Intelligence teams to identify opportunities and assess improvements to our products and services.
- Contribute to workshops with the business user community to further their knowledge and use of the data ecosystem.
- Produce and maintain accurate project documentation, project plans, and presentations.
- Collaborate with various data providers to resolve dashboard, reporting and data related issues.
- Perform data benchmarking, enhancements, optimizations, and platform analytics.
- Participate in the research, development, and adoption of trends in technology, data, and analytics.
- Monitor and troubleshoot nightly ETL pipelines, primarily built in Databricks, Azure Data Factory (ADF), and SQL Server, to ensure reliable data flow and minimal downtime.
- Resolve data issues, collaborating with BI Analysts and Engineering teams to address data latency, quality, and integration problems quickly and effectively.
- Implement and maintain monitoring and alerting systems using Azure Monitor, Log Analytics, and custom dashboards to proactively catch and address failures.
- Administer and optimize SQL Server databases, including indexing, performance tuning, backups, Always On availability groups, transactional replication, and resource management to support fast and efficient queries.
- Ingest and land external data securely using replication, SFTP, or Azure Private Endpoints, ensuring readiness for EDW integration.
- Develop automation scripts to reduce manual work in data validation, pipeline checks, and recovery workflows (a validation sketch follows this list).
- Manage secret rotation and credential handling securely through Azure Key Vault, ensuring SOC 2 compliance (a rotation sketch follows this list).
- Contribute to modernization efforts, migrating legacy SSIS workflows to scalable ADF and Databricks platforms.
- Optimize performance and scalability of workflows in Databricks notebooks and SQL Server stored procedures, addressing bottlenecks and improving execution speed.
- Participate in root cause analysis and drive incident resolution and continuous improvements through post-incident reviews.
- Support data governance and compliance, working with security teams on access controls, auditing, and policy enforcement.
- Document operational procedures and maintain runbooks, playbooks, and support guides to enable consistency, onboarding, and knowledge transfer.
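As a minimal sketch of the validation automation described above (not Acrisure's actual tooling): one nightly check reconciling row counts between a SQL Server source table and its Delta target in Unity Catalog. All connection details and table names are hypothetical; `spark` and `dbutils` come from the Databricks runtime.

```python
# Hedged sketch: reconcile source vs. target row counts in a nightly job.
password = dbutils.secrets.get(scope="kv-prod", key="sqlserver-etl-password")

src_count = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example.database.windows.net:1433;databaseName=edw")
    # Push the aggregate down so SQL Server returns a single row.
    .option("dbtable", "(SELECT COUNT(*) AS n FROM dbo.policies) src")
    .option("user", "etl_reader")
    .option("password", password)
    .load()
    .first()["n"]
)

tgt_count = spark.table("prod.bronze.policies").count()  # three-level Unity Catalog name

if src_count != tgt_count:
    # Fail loudly so the job's alerting (e.g., Azure Monitor) surfaces the mismatch.
    raise RuntimeError(f"Row-count mismatch: source={src_count}, target={tgt_count}")
```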
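And a hedged sketch of one rotation step using the `azure-keyvault-secrets` SDK: writing a new secret version while older versions remain retrievable for rollback. The vault URL and secret name are placeholders, and changing the password on the SQL Server side is a separate step omitted here.

```python
# Hypothetical rotation step: generate a replacement credential and store it
# as a new version of an existing Key Vault secret.
import secrets

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://kv-prod.vault.azure.net",
    credential=DefaultAzureCredential(),
)

new_value = secrets.token_urlsafe(32)                    # replacement credential
client.set_secret("sqlserver-etl-password", new_value)   # creates a new secret version
# ADF linked services and Key Vault-backed Databricks secret scopes resolve
# the latest version on their next fetch, so consumers need no code change.
```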
Preferred Qualifications
- Bonus: Experience with IaC (Terraform), Azure Web Applications, dbt, or involvement in data governance and compliance initiatives.
- Excellent organizational skills with the ability to prioritize, communicate and execute tasks across multiple projects with tight deadlines and aggressive goals.
- Expert working knowledge of SQL, Python, Scala, and Spark, and demonstrated ability to create ad-hoc SQL queries to analyze data, build prototypes, etc. (an example follows this list).
- Ability to understand complex issues and clearly articulate complex ideas.
- Demonstrated ability to champion change, influence, and drive results in a complex organization.
- Ability to mentor junior developers and contribute to the growth of the team.
- Excellent verbal and written communication skills.
- Experience working in a multi-cloud environment a plus.
- Knowledge of the insurance industry or FinTech a plus.
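As a small illustration of the ad-hoc Spark SQL analysis called out above: a hypothetical profiling query summarizing volume and freshness per source system. The table and column names are assumptions, and `spark` is provided by the Databricks runtime.

```python
# Hedged example of an ad-hoc profiling query; names are placeholders.
profile = spark.sql("""
    SELECT source_system,
           COUNT(*)            AS row_count,
           MAX(load_timestamp) AS last_loaded
    FROM prod.bronze.policies
    GROUP BY source_system
    ORDER BY last_loaded
""")
profile.show(truncate=False)
```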