
Data Engineer

  • Anywhere

Huge opportunity for a Data Engineer to join a company that has outlined an ambitious and exciting data strategy.

They offer perks by the bucketload, including:

  • Up to 15% bonus
  • 36 days' holiday including bank holidays, plus buy and sell options
  • Staff discount
  • Gym membership
  • Remote working
  • Private healthcare

They are focused on creating an internal analytics capability that will create value by helping everyone make better decisions: decisions that improve internal welfare, enable a brilliant customer experience and optimise the working life of all partners.

The strategy will enable them to:

  • Get more impactful, relevant and timely insight out to colleagues and partners
  • Increase the velocity and effectiveness of activity and customer experience journeys
  • Leverage advanced analytics to optimise supply chain, customer spend, workload, colleague rotas, pricing, next best action and profitability

They have taken a forward-looking approach and placed data and analytics at the heart of the business, particularly as they have a successful customer scheme that they can leverage far more than they do today.

They believe that harnessing the power of their data will open up new opportunities. Maximising those opportunities requires a best-in-class analytics capability within the organisation.

They are very early in their journey, so as a Data Engineer you will have plenty of opportunity to influence the technical and strategic direction of the team. You will also have the opportunity to take ownership of parts of the existing platform as well as new greenfield development.

Key role responsibilities

  • Develop, maintain and improve our cloud data platform and help plan, design, monitor, communicate and execute data projects
  • Assist the analytics teams in the implementation of their machine learning use-cases
  • Evangelise our data platform, products and Data Engineering capabilities to other departments, in order to bring more relevant data into our ecosystem and develop future data products that solve real business problems
  • Maintain simple but useful technical documentation; this is key to ensuring that our services and applications are easy for the analytics community to understand and use
  • Deliver software that is scalable, highly available and fault-tolerant
  • Drive automation, particularly in continuous integration pipelines, infrastructure management and configuration
  • Ensure that data coming into and going out of the analytical platform is of the highest quality and meets legal requirements
  • Adopt and improve software development patterns and best practices, particularly around open-source components
  • Apply continuous delivery and DevOps experience to drive the Data Engineering team forward in infrastructure automation, monitoring, logging, auditing and security practices
  • Propose improvements to software development patterns and best practices for an analytical platform
  • Conduct code reviews, pair programming and knowledge sharing sessions

What you can bring:

  • Demonstrable experience of working with and designing a cloud-based analytical platform including best practices around data ingestion on an industrial scale (batch and streaming ETL/ELT) and turning data science/machine learning algorithms into production-grade products
  • Strong software development skills (particularly in Python) – object-oriented and/or functional design, coding and testing patterns, the relevant DevOps principles, and the ability to write clean documentation
  • Solid knowledge of data modelling and structures, and experience with data lake and data warehouse tools and techniques (BigQuery, Spanner, Snowflake, Redshift, etc.)
  • Hands-on experience with ingesting and processing streaming data – message queues (RabbitMQ, Kafka, Pub/Sub, etc.) and data flow orchestration (Dataflow, Apache NiFi, Airflow, Luigi, etc.)
  • Strong understanding of the end-to-end deployment process for data products (from raw code to scalable deployment), the relevant CI/CD tools (Jenkins, Spinnaker, TeamCity) and containerisation (Docker, Kubernetes, Helm)

Job Overview

  • Category: BI and Data Analytics
  • Offered Salary: 55000
  • Job Location: