Experienced Big Data DevOps

With Kindred Group in Stockholm - SE


Posted on March 18, 2020

About this job

Location options: Visa sponsor
Job type: Full-time
Experience level: Mid-Level, Senior
Role: DevOps
Industry: Casino, Gambling
Company size: 1k–5k people
Company type: Public

Technologies

hadoop, bigdata

Job description

The role

You will be part of the Big Data team that is bringing state-of-the-art analysis techniques to Kindred. You will own and manage the Data Operations work stream that is responsible for provisioning, automation, maintenance and support of the Hadoop and Cloud platform ecosystem.

We work in an Agile way with open communication, trust and compassion, always striving to improve ourselves, to learn from other members of the team and to share knowledge.

We are a fast-paced international team working around our core tenets: we are individuals united by a common goal; we challenge the status quo and push the boundaries; we are trusting and always friendly; we innovate through experimentation and exploration - always.

This position can be based in Stockholm or London.

What will you do?

  • Keep the Cloud platform up and running.
  • Provision the Hadoop cluster and complementary tools.
  • Monitor and maintain the ecosystem to guarantee uptime.
  • Automate deployments of application code and configuration changes.
  • Implement security and audit mechanisms to guarantee data security.
  • Support the Big Data deployment and handle incidents.
  • Continuously improve performance of the team and the technology stack.
  • Bring world-class knowledge on processes to ensure data quality.
  • Work closely with development teams and other business stakeholders. 

What have you done?

  • Extensive experience with AWS Cloud, backed by an AWS certification.
  • Strong experience in Linux system administration and Hadoop platform setup, monitoring, maintenance and support, backed by an HDP Certified Administrator or CCA Administrator certification.
  • Clear hands-on experience with Kerberos, LDAP and Active Directory.
  • Experience programming in Bash and Python, and preferably Java.
  • Experience maintaining and deploying applications on Spark, Kafka and Storm.
  • Experience using Splunk for system monitoring.
  • Deep understanding of real-time data processing concepts and knowledge of industry best practices.
  • A keen mind with an appetite for problem solving.
  • Passion for open source technologies and a desire to apply them to large volumes of data.
  • Extensive experience with Docker and Kubernetes.
  • Experience using SQL and NoSQL databases.

Apply here