ETL Developer
Company | Leidos
---|---
Location | Ashburn, VA, USA
Salary | $126,100 – $227,950
Type | Full-Time
Degrees | Bachelor’s
Experience Level | Expert or higher
Requirements
- BA/BS with 12+ years of relevant experience, or a Master’s degree with 10+ years of relevant experience; 4 additional years of experience may be considered in lieu of a degree
- 10+ years of experience in ETL development, data integration, and data engineering
- Must be able to obtain and maintain a CBP Background Investigation prior to start
- Expertise in Informatica PowerCenter and Informatica IICS (Cloud Data Integration & Application Integration)
- Hands-on experience with Databricks (PySpark, Delta Lake, Notebooks, Workflows); see the PySpark sketch after this list
- Strong experience in SQL, stored procedures, and performance tuning (SQL Server, PostgreSQL, Redshift, Snowflake, etc.)
- Experience working with cloud-based data platforms (AWS, Azure, or GCP)
- Knowledge of data warehousing concepts, data modeling, and ETL best practices
- Familiarity with REST APIs, JSON, and XML data processing
- Experience with job scheduling tools like Control-M, Airflow, or similar; an Airflow sketch appears after the Responsibilities list
- Strong problem-solving skills and ability to work independently or in a team environment
- Excellent communication skills and ability to collaborate with cross-functional teams
- Proven experience working on large-scale, complex projects with high-volume data transactions
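To make the Databricks requirement above concrete, here is a minimal PySpark sketch of the kind of work involved: ingesting raw JSON and writing a partitioned Delta table. All paths, column names, and the transformation logic are hypothetical placeholders, not details of this role.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Read semi-structured source data (JSON) into a DataFrame.
raw = spark.read.json("/mnt/raw/transactions/")

# Basic cleansing: drop records missing a key, derive typed columns.
cleaned = (
    raw.dropna(subset=["transaction_id"])
       .withColumn("event_ts", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Persist as a Delta table, partitioned for downstream query performance.
(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("event_date")
        .save("/mnt/curated/transactions/"))
```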
Responsibilities
- Design, develop, and maintain ETL processes using Informatica PowerCenter, IICS, and Databricks.
- Develop and optimize data pipelines to integrate structured and unstructured data from various sources.
- Work closely with data architects and business analysts to understand data requirements and translate them into scalable ETL solutions.
- Perform data profiling, quality checks, and implement best practices for data governance.
- Optimize performance of ETL jobs and data pipelines to ensure efficiency and scalability.
- Support cloud data migration efforts, including integrating Informatica with cloud platforms (AWS, Azure, or GCP).
- Troubleshoot and resolve issues related to data integration, transformations, and workflow execution.
- Document ETL designs, processes, and technical specifications.
- Handle high-volume data transactions efficiently and ensure system scalability.
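The scheduling tools named in the requirements (Control-M, Airflow) orchestrate ETL jobs like those described above. Below is a minimal Airflow 2.x sketch of a nightly run; the DAG id, schedule, and the ETL callable are hypothetical placeholders.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_etl():
    # Placeholder for the actual extract/transform/load step, e.g.
    # triggering an Informatica task flow or a Databricks job run.
    pass

with DAG(
    dag_id="nightly_etl",              # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",     # run daily at 02:00
    catchup=False,
) as dag:
    PythonOperator(task_id="run_etl", python_callable=run_etl)
```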
Preferred Qualifications
- Active CBP Background Investigation (BI)
- Experience with CBP PSPD
- Experience with Kafka, Spark Streaming, or other real-time data processing technologies (see the streaming sketch below).
- Familiarity with CI/CD pipelines for ETL deployments.
- Experience with Python, Scala, or Shell scripting for data transformation and automation.
- Knowledge of Data Governance, Security, and Compliance standards.
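For the real-time processing technologies listed above, a common pattern is Spark Structured Streaming consuming a Kafka topic and appending to a Delta table. The sketch below assumes a Databricks-style environment; the broker address, topic name, and storage paths are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

# Subscribe to a Kafka topic; message values arrive as raw bytes.
stream = (
    spark.readStream
         .format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "transactions")
         .load()
)

# Decode the message payload to a string for downstream parsing.
decoded = stream.select(F.col("value").cast("string").alias("payload"))

# Append to Delta with checkpointing so the stream can recover on restart.
query = (
    decoded.writeStream
           .format("delta")
           .option("checkpointLocation", "/mnt/chk/transactions/")
           .start("/mnt/streaming/transactions/")
)
query.awaitTermination()
```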