Specialist Application Engineer
| Company | Waystar |
|---|---|
| Location | Louisville, KY, USA; Lehi, UT, USA |
| Salary | Not provided |
| Type | Full-Time |
| Degrees | Bachelor’s, Master’s |
| Experience Level | Senior |
Requirements
- Bachelor’s or Master’s degree in Computer Science or Data Engineering, or comparable experience.
- 5–7 years of experience in data engineering with a focus on cloud-based data platforms.
- Experience working with relational databases such as SQL Server and Postgres.
- Proven experience designing and implementing large-scale data pipelines with a focus on value delivery and performance.
- Strong understanding of data warehousing, data lakes, and modern data architectures.
- Experience working with data-sharing frameworks and licensing models.
- Strong business-focused mindset with a clear drive to deliver measurable outcomes.
- Strong attention to detail and focus on data accuracy and quality.
Responsibilities
- Design and implement scalable, high-performance data pipelines using Snowflake and Google BigQuery to deliver licensed data products to customers.
- Develop and maintain ETL/ELT processes to ingest, transform, and store structured and unstructured data, ensuring data accuracy and timeliness.
- Build data infrastructure to support real-time and batch processing use cases for customer-facing products.
- Establish and enforce data quality and privacy standards, with monitoring to ensure consistent and accurate data delivery to customers.
- Monitor and troubleshoot data pipeline performance issues and implement proactive improvements to minimize downtime and data delivery delays.
- Design, develop, and maintain RESTful APIs to enable secure external access to licensed data products, drawing on strong experience building APIs for secure data delivery and integration (a minimal endpoint sketch follows this list).
- Design and implement data models to support data licensing and resale use cases, ensuring they meet customer requirements and business goals.
- Integrate data from multiple source systems, including SQL Server, Postgres, and external APIs, ensuring data consistency and reliability.
- Create and maintain data dictionaries and metadata to support customer understanding and self-service analytics.
- Implement data partitioning, indexing, and clustering strategies to improve query performance and reduce processing time (see the partitioning sketch after this list).
- Develop and maintain secure de-identification and data-sharing mechanisms to ensure appropriate access controls for licensed data products.
- Work closely with product and business teams to define data product requirements and deliver licensed data products that meet customer expectations.
- Implement secure data-sharing protocols to ensure that licensed data is accurately and efficiently delivered to customers.
- Design and implement data anonymization and de-identification processes to meet data privacy and HIPAA compliance requirements (see the de-identification sketch after this list).
- Create scalable data delivery mechanisms to support on-demand and scheduled data access for external clients.
- Ensure data licensing agreements and customer contracts are reflected accurately in data access controls and delivery processes.
- Partner with business and product teams to understand customer goals and translate them into data engineering solutions.
- Ensure that data products and pipelines deliver measurable business value and directly improve customer outcomes (e.g., better claim processing insights, faster payment reconciliation).
- Establish KPIs to track the business impact of data solutions and continuously improve those solutions.
- Actively identify opportunities to improve data quality and speed of delivery to enhance customer experience and support data product growth.
- Automate repetitive data processing tasks using Python and SQL to reduce manual effort and increase efficiency.
- Monitor and optimize data pipeline performance, addressing bottlenecks and failures to improve customer satisfaction.
- Leverage Snowflake’s performance features (e.g., micro-partitions, clustering, materialized views) to optimize query performance (see the Snowflake sketch after this list).
- Implement data orchestration using tools such as Airflow or dbt to automate and monitor workflows (see the DAG sketch after this list).
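Illustrative Sketches
The responsibilities above name several concrete techniques; the short sketches below show one plausible shape for each. First, the RESTful delivery APIs: a minimal read-only endpoint with an API-key check. FastAPI, the endpoint path, the key store, and the stub response are all assumptions for illustration, not Waystar’s actual service.
```python
# Minimal sketch of a read-only data-delivery endpoint with an API-key check.
# The route, key store, and response shape are hypothetical placeholders.
from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI(title="Licensed Data Delivery (sketch)")

VALID_KEYS = {"demo-key-123"}  # placeholder; real keys belong in a secrets store

def require_api_key(x_api_key: str = Header(...)) -> str:
    if x_api_key not in VALID_KEYS:
        raise HTTPException(status_code=401, detail="invalid API key")
    return x_api_key

@app.get("/v1/extracts/{dataset}")
def get_extract(dataset: str, api_key: str = Depends(require_api_key)):
    # A real service would check the caller's license entitlements and
    # stream the dataset from the warehouse; this returns a stub.
    return {"dataset": dataset, "status": "available", "rows": 0}
```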
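For the partitioning and clustering bullet, a sketch of a date-partitioned, clustered BigQuery table created through the google-cloud-bigquery client so date-filtered queries prune partitions instead of scanning the full table. The project, dataset, table, and column names are hypothetical.
```python
# Hypothetical sketch: date-partitioned, clustered BigQuery table to cut
# scan costs on time-range queries. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.licensed_data.claims`
(
  claim_id STRING,
  payer_id STRING,
  service_date DATE,
  amount NUMERIC
)
PARTITION BY service_date        -- prune partitions on date filters
CLUSTER BY payer_id, claim_id    -- co-locate rows for common predicates
"""
client.query(ddl).result()  # blocks until the DDL job finishes
```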
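For the anonymization and de-identification bullet, a pandas-based sketch that drops direct identifiers and replaces the member ID with a salted hash, so rows stay joinable without exposing PHI. Column names and the salt handling are assumptions; a real HIPAA workflow would follow Safe Harbor or Expert Determination.
```python
# Illustrative de-identification pass; column names and salt are hypothetical.
import hashlib
import pandas as pd

DIRECT_IDENTIFIERS = ["patient_name", "ssn", "street_address", "phone"]
SALT = b"rotate-me-from-a-secrets-manager"  # placeholder, not a real practice value

def pseudonymize(value: str) -> str:
    """Deterministic salted hash: same input -> same token, but irreversible."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def deidentify(df: pd.DataFrame) -> pd.DataFrame:
    # Drop direct identifiers outright, then tokenize the join key.
    out = df.drop(columns=DIRECT_IDENTIFIERS, errors="ignore")
    out["member_token"] = out.pop("member_id").astype(str).map(pseudonymize)
    return out
```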
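For the Snowflake performance bullet, a sketch using snowflake-connector-python to set a clustering key (so queries prune micro-partitions) and to precompute a hot aggregate as a materialized view. Connection parameters and table names are placeholders, and materialized views require Snowflake Enterprise Edition or higher.
```python
# Sketch of two Snowflake performance levers: a clustering key for
# micro-partition pruning and a materialized view for a hot aggregate.
# Credentials and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="LICENSED", schema="PUBLIC",
)
cur = conn.cursor()

# Cluster the fact table on the columns most queries filter by.
cur.execute("ALTER TABLE claims CLUSTER BY (service_date, payer_id)")

# Precompute a frequently requested aggregate; Snowflake keeps it fresh.
cur.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS daily_payer_totals AS
    SELECT service_date, payer_id, SUM(amount) AS total_billed
    FROM claims
    GROUP BY service_date, payer_id
""")
cur.close()
conn.close()
```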
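Finally, for the orchestration bullet, a minimal Airflow DAG chaining a daily extract, transform, and publish flow. Task bodies are stubs, and the dag_id and schedule are hypothetical; per the posting, dbt would be an equally valid choice.
```python
# Minimal Airflow 2.x DAG sketch: daily extract -> transform -> publish.
# On Airflow versions before 2.4, use schedule_interval instead of schedule.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): ...   # stub: pull from SQL Server / Postgres / external APIs
def transform(): ... # stub: apply de-identification and modeling logic
def publish(): ...   # stub: load licensed extract to Snowflake / BigQuery

with DAG(
    dag_id="licensed_data_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="publish", python_callable=publish)
    t1 >> t2 >> t3  # linear dependency chain
```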
Preferred Qualifications
- Experience working with healthcare data and understanding RCM processes (claims, billing, payments).
- Familiarity with data orchestration tools (e.g., Airflow, dbt).
- Experience with data security, access controls, and healthcare data compliance (HIPAA).
- Experience with data licensing agreements and customer-facing data products.
- Strong problem-solving skills and ability to work in a fast-paced environment.