Desired Skills and Experience
- Improve scalability, stability, accuracy, speed and efficiency of our existing data systems
- Design, build, test, and deploy new libraries, frameworks, or full systems for our core platform, holding to the highest standards of testing and code quality
- Work with experienced engineers and product owners to identify and build tools that automate large-scale data-management and analysis tasks
- Proactively and reactively monitor Elasticsearch/Kafka clusters to provide 24x7 uptime and availability across multiple data centres
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field
- At least 3 years’ experience working with modern systems languages
- Experience debugging and reasoning about production issues
- A good understanding of data architecture principles
- Able to participate in an on-call rotation
- Any experience with ‘Big Data’ technologies/tools is a plus
- Strong systems administration skills in Linux
- Strong experience with JVM languages (Java and Scala in particular)
- Python/Perl/Go/shell scripting skills are also a plus
- Experience working with Elasticsearch and Kafka architectures
- Experience working with open source products
- Experience working in an agile environment using test-driven methodologies