Desired Skills and Experience
- Provide expertise regarding systems and infrastructure to various project stakeholders.
- Develop and document system and infrastructure configurations utilizing the SDLC methodology.
- Participate in the preparation of system implementation plans and support procedures.
- Provide ongoing system automation management support to Information Excellence teams and related business partners.
- Contribute to the ongoing development of the team by sharing information, knowledge, expertise and lessons learned on a regular basis.
- Evaluate value-added Hadoop tools and/or utilities to enhance Information Excellence services and platforms.
- Post-secondary degree: Computer Science, Engineering or similar degree preferred.
- 3 to 5 years of experience in system administration, information management, system automation and testing.
- A minimum of 1 year of Big Data and Hadoop experience preferred, or strong proficiency in Linux shell scripting and system administration.
- Experience with information technology and with data and systems management; knowledge of Unix/Linux, especially RHEL, is a requirement. Experience with Hadoop administration and utilities, Java, virtual environments, and configuration and deployment automation, along with knowledge of RESTful API-based web services, is preferred but not mandatory.
- Demonstrated history of being self-motivated, energetic, results-driven, and executing with excellence.
- Effective interpersonal skills and the ability to work well with a fast-moving team; able to build and maintain strong relationships with business and technology partners.
- Demonstrated ability to work and deliver on multiple complex projects on time.
- Understanding of Hadoop tools and utilities (HDFS, Pig, Hive, MapReduce, Sqoop, Flume, Spark, Kafka) and CDH.
- Understanding of Linux/Unix, especially RHEL.
- Working experience using a scripting language such as Bash, Python or Perl.
- Ability to debug/trace Java or Scala code is an asset.
- Good understanding of and experience with system automation, scheduling, agile code promotion, system access and proactive system management (DevOps).
- Familiarity with orchestration workflows and high-level configuration management concepts and implementations.
- Experience in process analytics and process flow documentation.
- Knowledge of source code repository systems and data lineage standards, including the ability to use revision control systems such as Git.
- Proficient in operating and/or developing Java applications.
- Familiarity with hosting models consistent with Google, Amazon, Microsoft, and other next-generation technology companies.
- Experience using RESTful API-based web services and applications.
- Familiarity with orchestration systems and automation tools such as Puppet, Chef, Ansible or SaltStack.
- Database experience with MySQL, PostgreSQL, DB2 or Oracle.
- Experience with cloud infrastructure and virtual environments: KVM, Docker or Kubernetes.
- Familiarity with networking, firewalls and load balancing.
- Proactive and organized, with excellent analytical and problem-solving skills.
- Canadian Retail, including TD Canada Trust, TD Auto Finance Canada, TD Wealth, TD Direct Investing, and TD Insurance.
- U.S. Retail, including TD Bank, America’s Most Convenient Bank, TD Auto Finance U.S., and an investment in TD Ameritrade.
- Wholesale Banking, including TD Securities.