We have lots of data and want to get a proper grip on it. That’s why we need you to help us perfect our data storage capabilities and provide real-time access to traffic and revenue data. You’ll be working as part of the data analytics team, which is responsible for dashboarding, reporting, and some machine learning modelling. You’ll also be working with the data science team, which focuses on data research, including NLP. We want you to be a self-starter and bring your own ideas to the table.

Location:

  • Anywhere / remote
  • We prefer that you are based between UTC-3 and UTC-8
  • If you prefer working in an office, Toronto and Grand Cayman are the options

Responsibilities:

  • You’ll be in charge of maintaining our BigQuery datasets, making sure the data is securely stored, always up to date, and accessible.
  • You’ll have free rein over the BigQuery data architecture, keeping in mind the needs of VerticalScope.
  • You’ll combine data from different sources (PostgreSQL, MySQL, flat files, MS Dynamics).
  • You’ll also be responsible for making sure that various Python scripts run correctly every day; they’re currently scheduled with Jenkins. The scripts pull data from our partners, sometimes via APIs and sometimes via web scraping, and store it in BigQuery. You should understand how to work with APIs, Selenium, and other scraping techniques (see the sketch after this list).
  • You’re happy dealing with small data (a few thousand rows), as well as large (TB+) data.
  • You should be able to assess how various (sometimes non-technical) data needs translate into data engineering requirements.
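
To give a flavour of the kind of daily script involved, here is a minimal, hypothetical sketch of a job (the sort Jenkins would trigger) that pulls rows from a partner API and appends them to a BigQuery table. The endpoint, project, table, and field names are made-up placeholders, not our actual setup:

    # Illustrative only: a daily job that pulls JSON records from a
    # hypothetical partner API and appends them to a BigQuery table.
    import requests
    from google.cloud import bigquery

    PARTNER_API_URL = "https://partner.example.com/v1/traffic"   # hypothetical endpoint
    TABLE_ID = "example-project.analytics.partner_traffic"       # hypothetical table

    def fetch_partner_rows() -> list[dict]:
        # Pull the latest traffic figures from the partner's REST API.
        response = requests.get(PARTNER_API_URL, timeout=30)
        response.raise_for_status()
        return response.json()["rows"]

    def load_into_bigquery(rows: list[dict]) -> None:
        # Append the rows to the destination table; BigQuery validates them
        # against the table's existing schema.
        client = bigquery.Client()
        errors = client.insert_rows_json(TABLE_ID, rows)
        if errors:
            raise RuntimeError(f"BigQuery insert failed: {errors}")

    if __name__ == "__main__":
        load_into_bigquery(fetch_partner_rows())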

Desired Skills and Experience:

  • Google BigQuery experience
  • Python scripting knowledge significantly beyond print('Hello World!')
  • Strong data architecture know-how
  • 3+ years of work experience
  • Elasticsearch knowledge
  • Java and Groovy knowledge
  • 6+ years of work experience
  • Remote working experience