We are looking for a self-motivated, responsible team player to join a modern Big Data project from its very beginning.

Requirements:

  • 5+ years in the software development industry
  • General knowledge of the Hadoop ecosystem
  • Apache Spark expertise is a must
  • PostgreSQL, Citus
  • SQL Server
  • Python
  • English: Intermediate level, with proven ability to communicate with English-speaking clients in writing and verbally

Would be a plus:

  • MS Azure, .NET Core
  • Databricks
  • Web scraping expertise

Responsibilities:

  • Helping develop a new data collection and processing platform for a leading US vacation-rental market analytics provider
  • Data scraping, data modelling, building data pipelines, and enhancing the data platform to handle billions of records under demanding performance requirements
  • Preparing data for ML at later stages of the project

What we offer:

  • Working with top-notch experts in Data Engineering and Data Science
  • Modern technology stack
  • Remote work and flexible working hours
  • Competitive compensation
  • Possible business trips to the USA

Interested?

Apply