Our partner is a rapidly expanding international technology and data science business. They build high-quality SaaS solutions that automate data science using advanced machine learning and deep learning techniques. They work with some of the most modern technology around, so you will never get bored of doing the same thing.
About the role
They are looking for a talented developer to join their team. The role involves building and operating high-performance big data pipelines that power their SaaS products for some of the world's leading brands. You will be part of a remote team of developers and data scientists based in the United Kingdom, South Africa and Hungary.
Requirements:
- Experience writing testable, functional Scala in a production-grade system.
- Experience using Apache Spark with Scala in a production system (see the sketch after this list).
- Experience with Spark orchestration tools such as Apache Airflow is a plus.
- Experience architecting and building data pipelines on a cloud platform.
- Ability to pick up new technologies quickly and deliver features in an agile way.
- Comfort administering a Hadoop cluster on a cloud platform such as Databricks.
- Experience deploying systems with Docker containers.
- A strong JVM development background, including Spring.
- Experience with data streaming technologies such as Kafka.
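To give a flavour of the kind of work described above, here is a minimal sketch of a testable, functional Spark job in Scala. The object name, column names, paths and filter threshold are purely illustrative assumptions, not details taken from the role description; in practice paths and configuration would come from an orchestrator such as Airflow.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object EventAggregator {

  // Pure transformation, kept separate from I/O so it can be unit-tested
  // against a local SparkSession. Column names ("brand", "revenue") are
  // hypothetical examples.
  def revenueByBrand(events: DataFrame): DataFrame =
    events
      .filter(col("revenue") > 0)
      .groupBy(col("brand"))
      .agg(sum("revenue").as("total_revenue"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-aggregator")
      .getOrCreate()

    // Placeholder paths; real values would be supplied by configuration
    // or the pipeline orchestrator.
    val events = spark.read.parquet("s3://example-bucket/events/")

    revenueByBrand(events)
      .write
      .mode("overwrite")
      .parquet("s3://example-bucket/aggregates/revenue_by_brand/")

    spark.stop()
  }
}
```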
What they offer:
- Remote and flexible working
- Competitive Salary
- Career Development
- Exciting Clients and Projects
- Talented Teams
- Benefits: stock option plan, staff referral scheme, quarterly staff events, wellness day, volunteering opportunities, birthday lie-in, sunshine hours, Christmas gift cards, flexible leave policy, private health care, cafeteria system, enhanced maternity & paternity leave, drinks & snacks, fruit.