Technology Lead

With Wells Fargo in Hyderabad - IN

Posted on February 14, 2020

About this job

Job type: Full-time
Role: System Administrator

Technologies

hadoop, apache, web-services

Job description

About Wells Fargo

Wells Fargo & Company (NYSE: WFC) is a leading global financial services company with $2.0 trillion in assets and offices in over 37 countries. Founded in 1852 and headquartered in San Francisco, Wells Fargo provides asset management, capital raising and advisory, financing, foreign exchange, payments, risk management, and trade finance services to support customers who conduct business in the global economy. At Wells Fargo, we want to satisfy our customers’ financial needs and help them succeed financially. We also value the viewpoints of our team members and encourage them to be their best. Join our diverse and inclusive team where you will feel valued and inspired to contribute your unique skills and experience. We are looking for talented people who will put our customers at the center of everything we do. Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you. Learn more at our International Careers website.

Market Job Description

About Enterprise Global Services
Enterprise Global Services (EGS) enables global talent capabilities for Wells Fargo Bank, N.A., by supporting over half of Wells Fargo's business lines and staff functions across Technology, Business Services, Risk Services and Knowledge Services. EGS operates in Hyderabad, Bengaluru and Chennai in India and in Manila, Philippines. Learn more about EGS at our International Careers website.

About the role
We require a lead Big Data engineer for platform data and build activities, responsible for setting up and managing the data services platform for the Risk & Finance core services. The role must deliver near-term data to satisfy regulatory requirements while strategically building out the data services platform.
The person who fills this position will be a vital part of our team and will be responsible for implementing, managing and administering the Big Data infrastructure.

Department Overview
Enterprise Function Technology within the Enterprise Information Technology (EIT) business is responsible for supporting Enterprise Data technology, Enterprise technology initiatives related to messaging & collaboration, risk technology, corporate systems data, and various internal systems and tools.

Responsibilities

  • Managing application support for various critical bank applications and the Big Data infrastructure

  • Managing reference applications across several Hadoop clusters in development and production environments

  • Ensure proper metrics instrumentation in software components to help facilitate real-time and remote troubleshooting/performance monitoring

  • Incident analysis/review, impact analysis of critical issues

  • Work with development teams and IT service groups to encourage process transparency, mutually-beneficial design and delivery practices, and a clear view of roles and responsibilities across the deployment pipeline

  • Automate routine jobs and perform regular performance reviews of the Hadoop infrastructure

  • Understand and own component security analysis, including code and data flow review. Collaborate with security team to implement and verify secure coding techniques

  • Contribute to efficient development process pipeline by leveraging best-in-class tools

  • Identify and utilize best practices in the industry to maximize efficient and elegant solutions while minimizing cost

  • Should be well versed in version control systems and able to manage deployment strategy efficiently

Required Qualifications

  • 8+ years of application and infrastructure implementation and support experience

  • 5+ years of experience in administration of Hadoop clusters and NoSQL databases

  • Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark or Apache Storm

  • Strong knowledge of middleware technologies such as HTTP servers, Tomcat and load balancers

  • Strong automation build experience for infrastructure health checks and performance analysis

  • Experience with code deploy tools like Artifactory and UDeploy

  • Experience in Python or Shell scripting to develop scripts, tools, and test vectors that automate test executions and improve testing efficiency, leading to robust products

Desired Qualifications

  • Knowledge and understanding of project management methodologies used in waterfall or Agile development projects

  • Operational risk, conduct risk or compliance domain experience

  • Must be knowledgeable in processes and data related to risk and finance functions

  • Experience in leading Data Warehouse and Data Analytics applications for Risk and Finance

  • Experience with public/private cloud platforms and Dockerization of applications

Market Skills and Certifications

Big Data
DevOps
Hadoop
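The automation duties above (infrastructure health checks, performance reviews of the Hadoop clusters) might be approached with a small script like the sketch below, which reads YARN's ResourceManager REST metrics endpoint. This is an illustrative sketch only, not part of the role description: the hostname and thresholds are hypothetical, and a real deployment would add authentication, alerting, and scheduling.

```python
import json
import urllib.request

# Hypothetical ResourceManager endpoint -- replace with your cluster's host.
RM_METRICS_URL = "http://resourcemanager.example.com:8088/ws/v1/cluster/metrics"


def evaluate_cluster_health(metrics, max_unhealthy=0, min_active=1):
    """Check a YARN clusterMetrics dict against simple thresholds.

    Returns (ok, problems), where problems is a list of human-readable
    descriptions of any threshold violations found.
    """
    problems = []
    if metrics.get("unhealthyNodes", 0) > max_unhealthy:
        problems.append("unhealthy nodes: %d" % metrics["unhealthyNodes"])
    if metrics.get("activeNodes", 0) < min_active:
        problems.append("too few active nodes: %d" % metrics.get("activeNodes", 0))
    if metrics.get("lostNodes", 0) > 0:
        problems.append("lost nodes: %d" % metrics["lostNodes"])
    return (not problems, problems)


def fetch_metrics(url=RM_METRICS_URL, timeout=10):
    """Fetch the clusterMetrics block from the ResourceManager REST API."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)["clusterMetrics"]


# Evaluate a sample snapshot locally (no cluster needed for this example).
sample = {"activeNodes": 12, "unhealthyNodes": 0, "lostNodes": 0}
ok, problems = evaluate_cluster_health(sample)
print("HEALTHY" if ok else "UNHEALTHY: " + "; ".join(problems))
```

Run from cron or a scheduler, a check like this gives an early warning before node loss degrades the reference applications the role supports.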

We Value Diversity

At Wells Fargo, we believe in diversity and inclusion in the workplace; accordingly, we welcome applications for employment from all qualified candidates, regardless of race, color, gender, national or ethnic origin, age, disability, religion, sexual orientation, gender identity or any other status protected by applicable law. We comply with all applicable laws in every jurisdiction in which we operate.

Apply here