Software Engineer - ML Data Platform

With Criteo in Ann Arbor, MI, US


Posted on March 18, 2020

About this job

Job type: Full-time
Experience level: Junior, Mid-Level, Senior
Role: System Administrator
Industry: Ad Tech, Advertising Technology, AI Research
Company size: 1k–5k people
Company type: Public

Technologies

apache, java, web-services

Job description

Who we are

Criteo (NASDAQ: CRTO) is the global technology company powering the world’s marketers with trusted and impactful advertising. 2,800 Criteo team members partner with over 20,000 customers and thousands of publishers around the globe to deliver effective advertising across all channels, by applying advanced machine learning to unparalleled data sets. Criteo empowers companies of all sizes with the technology they need to better know and serve their customers.

How you’ll make an impact

You will join our team of machine learning researchers and software engineers to design and build machine learning platforms that will be used to power experimentation and production ML applications at Criteo. Your responsibilities will include building libraries, services and datasets that will be used by ML researchers and practitioners across Criteo.

  • Contribute directly to the development of Criteo’s infrastructure for experimentation/productionizing of ML applications
  • Write high-quality, maintainable code and mentor other engineers.
  • Build scalable, distributed big-data processing systems for machine learning, using industry-standard services such as Apache Flink, Apache Spark, Presto, and Hive, in languages like Java and Scala (a brief sketch of this kind of work follows this list).
  • Collaborate with other engineers on system architecture and design.
  • Develop open source projects. Consider becoming a committer. We are big users of open source and want to give back to the community.
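
To give a concrete flavor of this kind of work, here is a minimal sketch of a Spark batch job in Scala that builds per-campaign features from a click log. It is illustrative only: the paths, schema, and column names (userId, campaignId, clicked) are hypothetical, not Criteo's actual pipelines.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ClickFeatureJob {
  def main(args: Array[String]): Unit = {
    // Entry point for a batch job submitted with spark-submit.
    val spark = SparkSession.builder()
      .appName("click-feature-job")
      .getOrCreate()

    import spark.implicits._

    // Hypothetical click-log dataset with userId, campaignId, clicked, and timestamp columns.
    val clicks = spark.read.parquet("hdfs:///data/clicks/2020-03-18")

    // Aggregate per-campaign counts and a click-through rate that a downstream ML model could consume.
    val features = clicks
      .groupBy($"campaignId")
      .agg(
        count(lit(1)).as("impressions"),
        sum(when($"clicked" === 1, 1).otherwise(0)).as("clicks")
      )
      .withColumn("ctr", $"clicks" / $"impressions")

    // Write the feature table back out for training and experimentation.
    features.write.mode("overwrite").parquet("hdfs:///data/features/campaign_ctr")

    spark.stop()
  }
}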

What we are looking for

  • BS/MS in Computer Science or relevant experience
  • 2+ years of programming experience in a high-level language such as Python, Java, Scala, or C++, or equivalent experience (5+ years for senior engineers)
  • Rock-solid foundation in Computer Science (data structures, algorithms) as well as the basics of machine learning
  • Experience with large scale big-data processing in the Hadoop ecosystem
  • Strong hands-on skills in sourcing, cleaning, manipulating and analyzing large volumes of data
  • Experience developing and extending systems of moderate complexity
  • Strong written and oral communication skills; a team player who works efficiently with others
  • A strong sense of ownership and taking pride in your work

Apply here