Data Engineer

Date Posted
9th June 2018
Salary
£50,000 to £70,000 Per Annum

Data Engineer - Python, Spark, AWS
Marketing Analytics Consultancy
Central London - Oxford Street

A Marketing Analytics Company are looking to add a further Data Engineer to their five-strong team based in Central London. You will build and maintain data pipelines in Python and Spark, model data using AWS technologies, and deploy machine learning models to drive their cutting-edge Data Science product and consultancy offerings. The data is a diverse blend of social media, transactional, and financial services data, among others, which should provide interesting challenges operating at scale. This is a great opportunity to join a young, fun team delivering interesting client-led solutions.

There is also the opportunity to do more data science focused work further down the line when projects demand it.


The business are just moving out of the start-up phase and making a profit, with a team of around 50 people. They access unique data sources to provide analysis on products with low-volume transactional data, meaning they have to get creative with different data sets and analysis techniques to add value. You can expect a young, friendly, ambitious and energetic team with a collaborative approach to work.


As a Data Engineer, you will:

  • Develop data pipelines on Hadoop, PostgreSQL and Spark
  • Design and develop new algorithms for extracting insight from social data using SQL and Python
  • Develop crawlers to extract data from the web or APIs
  • Build infrastructure around existing internal tools to enhance capabilities and improve data flow
  • Implement statistical models and algorithms including clustering on large scale graph data
  • Deploy to, work on, and manage AWS, especially EMR and S3


To qualify for this Data Engineering role, you will need one or more of the following:

  • Ideally Python programming skills, although other languages can be considered
  • SQL and Data Modelling
  • Experience in Cloud technology (e.g. AWS or GCP)
  • An understanding of distributed computing in Hadoop or Spark
  • A driven and collaborative approach to work.


To apply for this role, please do so via this site. For more information, contact Ross at Harnham on 0208 408 6070.

KEYWORDS: Python, Data Engineer, AWS, GCP, Hadoop, Spark, SQL

