Cloud Engineer
Company | KBR |
---|---|
Location | Reston, VA, USA |
Salary | Not Provided |
Type | Full-Time |
Degrees | Bachelor’s |
Experience Level | Expert or higher |
Requirements
- Active TS/SCI with a Polygraph
- Bachelor’s degree
- Minimum of 10 years of relevant experience
- Experience with cybersecurity, IT systems, and A&A processes
- Experience with facilitation across multi-contractor and staff teams
- Experience collaborating with project teams or multiple entities to define project requirements and acceptance criteria
- Experience monitoring project progress, identifying roadblocks, and implementing mitigation strategies
- Experience designing, building, and maintaining data pipelines for ingestion, processing, and transformation
- Experience collaborating with data scientists and analysts to ensure data quality and accessibility
- Experience building and implementing data governance policies and procedures for data integrity and security
- Experience with design, implementation, and management of secure and scalable cloud infrastructure solutions
- Experience with implementation and management of infrastructure-as-code solutions and collaboration with development teams
- Experience with implementation of CI/CD pipelines and automation of infrastructure provisioning, application deployment, and system monitoring
- Experience envisioning and delivering solutions to business problems using machine learning techniques, including model training, development, and lifecycle management
- Technical architecture expertise, including systems integration and technical leadership
- Proficiency in complex multi-network and security enclave environments
Responsibilities
- Support technical program management: With oversight, support management of the technical program's scope, schedule, budget, risks, and communication
- Deliver Agile-based technical solutions: Work within and across Agile teams to design, develop, test, implement, and support technical solutions, including innovation efforts to modernize and evolve the core infrastructure, platform, and data lakehouse
- Maintain dashboards and documentation: Create customer-facing dashboard presentations, maintain system requirements and development process documentation, and monitor report generation and delivery
- Enhance cybersecurity and IT operations: Provide recommendations for cybersecurity and IT operations enhancements, and identify solutions and tools for an efficient, streamlined, scalable approach while maintaining high-quality service
- Collaborate on platform development: Collaborate with in-house development teams to construct and implement feature enhancements to the analytic platform with a UI/UX mindset; integrate external systems; and develop, implement, and maintain a data lakehouse
- Integrate datasets with ML tools: Integrate datasets with corporate machine learning tooling
- Develop scalable data solutions: Develop, implement, and maintain solutions that allow platform users to quickly understand and gain value from massive data sets
Preferred Qualifications
- Experience championing SecDevOps best practices for collaboration, efficiency, and quality
- Experience deploying and managing solutions utilizing retrieval augmented generation techniques
- Experience with big data processing and analysis tools such as Splunk, SOAR, NiFi, or Cribl
- Experience with data visualization and data lakehouse technologies
- Experience in AWS environment networking, security, logging, administration, and provisioning
- Experience with accreditation and security, including security controls, system hardening, and compliance requirements
- Experience with data transfer tools such as NiFi or Cribl
- Experience creating ad-hoc Python scripts and working with JSON
- Experience troubleshooting network connections or scanning
- Experience with JIRA or another ticketing system for task tracking
- Experience or familiarity with the DevOps lifecycle and the ability to coordinate requirements with development teams
- Experience in Agile and Scrum methodologies for software development
- Technical proficiency in Linux system administration and DevOps tools
- Experience in software engineering, cloud computing, data management, and system integration
- Experience with machine learning frameworks such as TensorFlow or PyTorch, and with deploying models in cloud environments (e.g., SageMaker)
- Experience with data engineering and feature engineering for big data processing and machine learning algorithms
- Understanding of cloud platforms (AWS, OCI, Azure, or GCP) and best practices for cloud-based services and security
- Experience with technology, mission, and business systems, as well as architectural vision