Senior Architect
Company | Bank of America |
---|---|
Location | Newark, NJ, USA; Charlotte, NC, USA; Addison, TX, USA |
Salary | Not Provided |
Type | Full-Time |
Degrees | Master’s |
Experience Level | Senior, Expert or higher |
Requirements
- 8+ years of technical architecture and solutions experience, including 5+ years in relevant Big Data/Analytics areas and 6+ years of hands-on experience in software development and data engineering.
- Deep understanding of, and 5+ years’ experience with, Big Data technologies including Hadoop, Impala, Hive, Spark, and NoSQL.
- Experience in platform architecture, solution design, development, and deployment.
- A broad set of technical skills and knowledge across hardware, software, platforms and solutions development across more than one technical domain.
- Experience building end-to-end advanced analytics and Big Data platforms that enable reporting, real-time analytics, and insight generation using AI/ML techniques. Extensive hands-on experience designing, developing, and maintaining software frameworks using Kafka, Spark, Hadoop/MapReduce, Spark Streaming, Python, etc.
- Demonstrated experience in real-world IT or other solution environments, including creating (independently or with a team) a product or IT solution in the area of Big Data and AI analytics.
- Strong verbal and written communication skills; experienced in interacting with both technologists and business representatives. Comfortable representing the company in industry standards organizations and/or industry technical forums.
- Strong technical team leadership and mentoring skills; highly collaborative with cross-domain stakeholders. Ability to develop technical relationships with LOB partners and with Platform Architecture, Information Management, Security, and Development teams.
- Prior experience with analytics and machine learning tools and technologies (Anaconda, Python, RStudio, SAS).
- Experience architecting and designing end-to-end systems with integrated self-serve capabilities. Expert experience designing, building, and deploying streaming and batch data pipelines capable of processing and storing large datasets quickly and reliably.
- Deep knowledge of metadata management, data lineage, data privacy and protection methodologies, and principles of data governance.
Responsibilities
- Works across the business, operations, and technology to create the solution intent and architectural vision for complex solutions, and prioritizes functional and non-functional requirements into a technology backlog so the technology roadmap and functionality can support evolving capabilities and services
- Contributes to the creation of the architecture roadmap of defined domains (Business, Application, Data, and Technology) in support of the product roadmap and the development of best practices including standardized templates
- Clarifies the architecture, assists with system design to support implementation, and provides solution options to resolve any architectural impediments
- Facilitates solution driven discussions, leads the design of complex architectures, and finds creative solutions through knowledge of domain, practical experiments, and proof of concepts while ensuring architecture is flexible, modular, and adaptable
- Educates team members on the technology practices, standardization strategies, and best practices to create innovative solutions
- Supports the team as needed to select the technology stack required for solutions and helps select preferred technology products
- Performs design and code reviews to ensure all non-functional requirements are sufficiently met (for example, security, performance, maintainability, scalability, usability, and reliability)
- Responsible for understanding emerging and evolving end-user usage models and requirements in Big Data and Analytics; documenting those usage models and the associated business, technical, and user requirements; designing solutions to meet those requirements; and specifying an implementation hardware and software solution stack that works seamlessly on-premises and in the cloud and enables self-service capabilities for technical and line-of-business users
Preferred Qualifications
- Underlying infrastructure for Big Data solutions (clustered/distributed computing, storage, data center networking)
- Cloud and AI/ML modeling technologies, and industry vertical applications for Big Data/Analytics
- Strong understanding of object-oriented programming and architecture in Java or Python. Experience designing, building, and operating cloud services in IT, systems integrator, or service provider environments.
- Experience designing and building full-stack solutions utilizing distributed computing or multi-node database paradigms. Well versed in processing and deployment technologies such as YARN, Kubernetes/containers, and serverless. Hands-on experience with Java, Scala, or Python.
- Expert-level experience in performance engineering and managing large data platforms that serve hundreds of petabytes of data and thousands of tenants. Designs effective proactive monitoring solutions and tests to ensure platform stability, resilience, and performance. Ensures non-functional performance requirements (e.g., CPU usage, latency) are met.
- MS in Computer Science or related field