
Enterprise Solution Architect – Data Quality
| Company | Collibra |
| --- | --- |
| Location | Lebanon, KS, USA |
| Salary | $140,000 – $175,000 |
| Type | Full-Time |
| Degrees | Bachelor’s |
| Experience Level | Senior, Expert or higher |
Requirements
- 5-10 years of relevant industry consulting experience.
- A minimum of 2 years of experience working with cloud platforms, network-based API integration (REST/JSON), and deployment automation.
- Experience with Java or Python for custom software implementations in Enterprise Linux or Unix environments.
- Proven ability to quickly learn new technologies and business requirements.
- Hands-on experience with Data Quality, Data Pipelines, Data Orchestration, or Job Control tools.
- A Bachelor’s degree or equivalent in a business, technical, or related field is preferred.
Responsibilities
- Partnering with clients to understand their data quality challenges and needs.
- Conducting interviews with client stakeholders from both Business and IT to uncover current data quality issues and identify emerging trends.
- Defining current and future states for the Data Quality vision.
- Leading efforts to improve client data quality through profiling, analysis, and long-term monitoring of their data.
- Profiling datasets to identify data quality issues and challenges.
- Ensuring service delivery aligns with Collibra’s Engagement Model, Processes, Methodologies, and Artifacts.
- Achieving utilization targets and contributing to the growth of Collibra’s services business.
Preferred Qualifications
- Knowledgeable about data management processes, including data governance, data stewardship, master data management, data cataloging, data warehousing, ETL, data integration, and business rules management.
- Able to work with business and analytics leaders to identify and align on solutions to data quality challenges and advocate for the importance of preserving data integrity.
- Experienced in leading blueprinting, design workshops, and requirements gathering activities.
- Proficient with big data technologies such as Kubernetes, Spark, and Hadoop.
- Familiar with enterprise security solutions such as LDAP or Kerberos.
- Highly skilled in developing functional specifications, technical specifications, and test specifications.