Desired Skills and Experience

Responsibilities

  • You will build, administer and scale data pipelines that process billions of messages a day across multiple data centers
  • You will be comfortable navigating the following technology stack: Linux, the JVM, Java/Scala, C#, Go, Kafka, scripting (Bash/Python), Hadoop, Elasticsearch
  • You will develop and expand the existing frameworks that teams throughout Agoda use to produce messages to the data pipeline
  • You will build and manage data ingestion into multiple systems (Hadoop, Elasticsearch and other distributed systems)
  • You will build tools that monitor the data pipeline against strict data-accuracy SLAs
  • You will fix production problems
  • You will profile and tune systems for performance, stability and self-recovery
  • You will collaborate with other teams and departments
  • You will automate system tasks via code as needed
  • You will explore new technologies that improve our data quality, processes and data flow
  • You will develop quality software through design reviews, code reviews and test-driven development

Requirements

  • You’ll probably have a B.Sc. in Computer Science, Information Systems, Computer Engineering or a related field
  • You have two or more years of industry experience, preferably at a tech company
  • A passion for big data (petabytes’ worth of it)
  • Good knowledge of data architecture principles
  • You have operational experience debugging production issues
  • You have experience with at least one of the following: Scala, Java, C#, Go or any functional language
  • You’re an experienced coder who can stand your ground, with a track record of building purposeful systems that are flexible, well-tested, maintainable and scalable
  • You’re detail-oriented, considering every outcome of a particular decision
  • You have no issues being on-call and working at odd hours as needed
  • You can communicate fluently in technical English, both verbally and in writing

Nice to have

  • A good understanding of how Kafka works
  • Kafka administration experience
  • Experience producing messages to Kafka from any one of the following languages: Java, Scala, C# or Go (a minimal Java sketch follows this list)
  • An understanding of schema registry and schema evolution concepts
  • Experience working with serialization formats such as Protocol Buffers, Avro or Thrift
  • Proficiency with Elasticsearch
  • Development experience on Hadoop (MapReduce, Spark, Hive, Impala, Spark SQL)
  • Experience with data ingestion from Kafka into Hadoop, Elasticsearch and other distributed systems
  • Strong systems administration skills in Linux
  • Experience working on or contributing to an open-source project
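
To give a flavour of the producer work mentioned above, here is a minimal sketch of sending a message to Kafka from Java. The broker address, topic name and payload are hypothetical placeholders for illustration only, not part of Agoda's actual setup:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PipelineProducer {
    public static void main(String[] args) {
        // Basic producer configuration; the broker address is a placeholder.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all"); // wait for full acknowledgement, favouring accuracy over latency

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Topic name, key and payload are illustrative only.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("booking-events", "booking-123", "{\"status\":\"confirmed\"}");

            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace(); // in production, surface this to monitoring instead
                } else {
                    System.out.printf("Delivered to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any buffered records before returning
    }
}
```

In practice, internal frameworks would typically wrap this kind of producer logic and use schema-registry-backed Avro or Protocol Buffers serialization rather than plain strings.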

Benefits

  • Generous annual leave that increases each year, with additional business and family leave days
  • 15 days of public holidays
  • Provident fund
  • Local or international health insurance options, with additional options available for family members
  • Staff discounts on hotel bookings
  • Staff discount scheme on restaurants, flights and local amenities
  • Free Thai or English classes (if you want to learn or improve these skills)