Responsibilities

  • Lead and mentor the Infrastructure team.
  • Define short- and long-term roadmaps aligned with the company's vision.
  • Work in a collaborative, multidisciplinary team of software engineers, data scientists, marketers, and product owners.
  • Improve, maintain, provision, and scale our core cluster holding terabytes of data.
  • Guarantee high availability for all of the company's services and products.
  • Define and implement security policies that restrict access to our infrastructure and the data residing there.

What We Offer

  • Complex and unique problems to challenge you every day.
  • The opportunity to work on projects involving cutting-edge technologies.
  • The opportunity to present your work at workshops.
  • The opportunity to find a subject for your thesis!
  • Work-related travel opportunities.
  • Online or in-person training according to your career plan.
  • An entrepreneurial and scientific environment with a competitive salary.
  • Coffee and a fridge full of drinks.
  • Performance-based annual bonus.
  • English classes at the office.
  • Free lunch every Friday.
  • Health insurance for you and your family.

Desired Skills and Experience
  • Proven ability to lead and mentor a team.
  • Experience with hardware/software load balancers, including F5 BIG-IP, HAProxy, NGINX, and Apache HTTP Server.
  • Solid experience with configuration management tools such as Puppet, Chef, CFEngine, Juju, Rudder, Ansible, or Salt.
  • Strong knowledge of network protocols (TCP/IP, HTTP, HTTPS, DNS, SMTP, Telnet, DHCP, SSH, FTP, SFTP, etc.).
  • Solid experience working in and administering Linux environments.
  • Experience automating IT processes using a scripting language (Bash, Python, etc.).
  • Experience with monitoring and alerting tools (Nagios, Zenoss, Zabbix, etc.).
  • Experience with server virtualization using VM or container tools (VirtualBox, VMware, QEMU, Docker, etc.).
  • Understanding of cryptographic algorithms and experience with cryptography tools (e.g., GPG).
  • Experience provisioning and maintaining Big Data infrastructure (HDFS, Hadoop, Spark, YARN, Hive, ZooKeeper, Kafka, Atlas, Ranger, Zeppelin, etc.).
  • Experience defining and implementing security and privacy policies.
  • Experience with a Big Data platform distribution, ideally Hortonworks.
  • Experience with cluster management tools (Kubernetes, Mesos, Docker Swarm).
  • Experience with the Kerberos authentication protocol.
  • Experience installing and maintaining databases (MySQL, PostgreSQL, etc.).
  • Experience with CI/CD.
  • Advanced English communication skills.
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.

Apply