Apps Systems Engineer 5

With Wells Fargo in Hyderabad - IN


Posted on January 20, 2020

About this job

Job type: Full-time
Role: System Administrator


Tags: hadoop, apache, web-services

Job description

About Wells Fargo

Wells Fargo & Company (NYSE: WFC) is a leading global financial services company with $2.0 trillion in assets and offices in over 37 countries. Founded in 1852 and headquartered in San Francisco, Wells Fargo provides asset management, capital raising and advisory, financing, foreign exchange, payments, risk management, and trade finance services to support customers who conduct business in the global economy. At Wells Fargo, we want to satisfy our customers’ financial needs and help them succeed financially. We also value the viewpoints of our team members and encourage them to be their best. Join our diverse and inclusive team where you will feel valued and inspired to contribute your unique skills and experience. We are looking for talented people who will put our customers at the center of everything we do. Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you. Learn more at our International Careers website.

Market Job Description

About Enterprise Global Services
Enterprise Global Services (EGS) enables global talent capabilities for Wells Fargo Bank, N.A., by supporting over half of Wells Fargo's business lines and staff functions across Technology, Business Services, Risk Services, and Knowledge Services. EGS operates in Hyderabad, Bengaluru, and Chennai in India, and in Manila, Philippines. Learn more about EGS at our International Careers website.

Department Overview
The Enterprise Technology Infrastructure (ETI) team within Enterprise Information Technology (EIT) is responsible for enterprise-wide infrastructure services across the global footprint, including networking, production services, storage, data centers, mainframe, midrange and cloud-based systems, distributed systems, and international infrastructure.

About the role
We require a lead Big Data engineer for platform data and build activities, responsible for setting up and managing the data services platform for Risk & Finance core services. The role involves delivering near-term data to satisfy regulatory requirements while strategically building out the data services platform.
The person who fills this position will be a vital part of our team and will be responsible for implementing, managing, and administering Big Data infrastructure.


Manage application support for various critical bank applications and Big Data infrastructure

Manage several Hadoop clusters' reference applications in development and production environments

Ensure proper metrics instrumentation in software components to facilitate real-time and remote troubleshooting and performance monitoring

Perform incident analysis and review, including impact analysis of critical issues

Work with development teams and IT service groups to encourage process transparency, mutually beneficial design and delivery practices, and a clear view of roles and responsibilities across the deployment pipeline

Understand and own component security analysis, including code and data-flow review, and collaborate with the security team to implement and verify secure coding techniques

Contribute to an efficient development pipeline by leveraging best-in-class tools

Identify and apply industry best practices to deliver efficient, elegant solutions while minimizing cost

Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications

Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage

Act as an expert technical resource to programming staff in the program development, testing, and implementation process

Essential Qualifications

8+ years of application and infrastructure implementation and support experience

5+ years of experience in administration of Hadoop clusters and NoSQL databases

Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm

Strong knowledge of middleware technologies such as HTTP servers, Tomcat, and load balancers

Strong experience building automation for infrastructure health checks and performance analysis

Experience with code-deployment tools such as Artifactory and uDeploy

Experience in Python or shell scripting to develop scripts, tools, and test vectors that automate test execution and improve testing efficiency, leading to robust products

Desired Qualifications:
Knowledge and understanding of project management methodologies used in waterfall or Agile development projects

Operational risk, conduct risk or compliance domain experience

Knowledge of processes and data related to risk and finance functions

Experience in leading Data Warehouse and Data Analytics applications for Risk and Finance

Experience with public/private cloud platforms and Dockerization of applications

Market Skills and Certifications

L3 Support

We Value Diversity

At Wells Fargo, we believe in diversity and inclusion in the workplace; accordingly, we welcome applications for employment from all qualified candidates, regardless of race, color, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity or any other status protected by applicable law. We comply with all applicable laws in every jurisdiction in which we operate.
