Data Engineer
| Company | Brillio |
|---|---|
| Location | Dallas, TX, USA |
| Salary | Not Provided |
| Type | Full-Time |
| Degrees | |
| Experience Level | Junior, Mid-Level |
Requirements
- Strong understanding of ETL fundamentals and modern data engineering principles.
- Expertise in SQL (both basic and advanced), PL/SQL, and T-SQL.
- Experience with Google Cloud Platform services: BigQuery, Dataflow, Dataproc, Data Fusion, Cloud Composer, Cloud Trace, Cloud Logging, Cloud Storage, Data Catalog.
- Proficiency in Python for data engineering tasks and automation.
- Solid understanding of data warehousing and data modeling practices.
- Experience building pipelines in a modern data platform environment.
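In practice, the SQL and Python skills above come together in small pipeline steps: stage raw data, transform it with SQL, and load a reporting table. A minimal illustrative sketch, using Python's built-in `sqlite3` as a stand-in for a warehouse such as BigQuery (the table and column names `raw_orders` and `daily_revenue` are hypothetical, not from the posting):

```python
import sqlite3

# Minimal ETL sketch: extract/stage raw rows, transform with SQL,
# load an aggregated reporting table. sqlite3 stands in for a real warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract/load: stage raw data into a landing table.
cur.execute("CREATE TABLE raw_orders (order_day TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("2024-01-01", 10.0), ("2024-01-01", 5.5), ("2024-01-02", 7.25)],
)

# Transform: aggregate with SQL into a table ready for analytics.
cur.execute(
    """
    CREATE TABLE daily_revenue AS
    SELECT order_day, SUM(amount) AS revenue
    FROM raw_orders
    GROUP BY order_day
    """
)

rows = cur.execute(
    "SELECT order_day, revenue FROM daily_revenue ORDER BY order_day"
).fetchall()
print(rows)
```

The same extract/transform/load shape applies when the SQL runs in BigQuery and the orchestration happens in Cloud Composer; only the client library and scale change.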
Responsibilities
- Design, develop, and maintain scalable ETL pipelines for structured and unstructured data.
- Write and optimize complex SQL queries, from basic to advanced, to support analytics and reporting.
- Work with Google BigQuery for data warehousing, transformation, and performance optimization.
- Develop distributed data processing solutions using Dataproc, Dataflow, and Data Fusion.
- Orchestrate data workflows using Cloud Composer (Airflow).
- Implement logging and monitoring using Cloud Trace, Cloud Logging, and Cloud Storage.
- Build and maintain metadata repositories with Data Catalog for effective data governance.
- Apply strong data modeling fundamentals to design normalized and denormalized data structures.
- Collaborate with cross-functional teams to implement modern data platform fundamentals.
- Write and manage stored procedures using PL/SQL, T-SQL, and other scripting tools.
- Leverage Python for scripting, automation, and data manipulation tasks.
- Ensure data quality, accuracy, and performance across all data platforms.
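Cloud Composer, the orchestrator named above, is managed Apache Airflow, where a pipeline is expressed as a DAG of tasks with explicit dependencies. A minimal dependency-ordering sketch using only the Python standard library (the task names `extract`, `transform`, and `load` are illustrative):

```python
from graphlib import TopologicalSorter

# Illustrative DAG: each key lists its predecessors.
# "extract" must finish before "transform", and "transform" before "load".
dag = {
    "transform": {"extract"},
    "load": {"transform"},
}

# Resolve a valid execution order, as an Airflow scheduler would.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)  # ['extract', 'transform', 'load']
```

In Airflow itself the same dependencies would be declared with operators and `>>` chaining; the point here is only the DAG ordering that Composer enforces.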
Preferred Qualifications
- GCP certification (e.g., Professional Data Engineer or equivalent).
- Experience with CI/CD practices in data engineering.
- Familiarity with other cloud platforms (AWS, Azure) is a plus.
- Exposure to data security and governance practices.