Senior Data Engineer

Data is central to everything we do at LADbible. We use data to understand how the overall business is performing, how and why content resonates with our audiences, and to identify the best opportunities for viral growth. It gives us the edge that keeps us ahead of our competitors. At the scale of our operation, getting maximal value from this data requires both technical excellence and creativity. The successful candidate will be a key player in a team that is blazing a trail for Data Science work in this sector, with great results.


We’re looking for someone who can push forward the capabilities of our Data Warehouse, so it can support core business operations internally as well as present insights externally. Alongside supporting the team on key deliverables, the successful candidate will plan and execute a long-term strategy for the Data Warehouse: reviewing the entire stack, proposing architectural changes, and writing ETL pipelines that help the Data Scientists present data in a meaningful way.

Data Engineers at LADbible Group are problem solvers, responsible for delivering robust, scalable technical solutions that feed into the Data Warehouse. You will be passionate and goal-driven. Data Science is an area of investment and excitement within the company, and the successful candidate will enjoy unrivalled opportunities for development, progression, and reward.

We are looking for someone to join a vibrant and enthused team, picking up some of our current techniques, and equally bringing with them their own ideas and areas of expertise to enrich the collective knowledge base of the department.

Skills & Responsibilities

Maintaining a Data Warehouse means tracking a moving target. Because we consume data from a variety of sources, those integrations require constant attention and maintenance to ensure the validity and consistency of the data.

  • Be responsible for maintaining and progressing the entire stack
  • Mentor & support engineers and juniors to deliver excellent solutions
  • Be comfortable working within an agile environment
  • Utilise data to support decisions
  • Contribute to technical design and delivery of projects and solutions
  • Support day-to-day operations within the team
  • Produce clear and concise documentation
  • Author scalable ETL pipelines for the collection of data from public APIs
  • Document methods used to build and maintain pipelines
  • Create backup and version control strategies for both the pipelines and the databases within the warehouse
  • Evaluate the technical stack used in the current warehouse and suggest improvements to ensure longevity and stability
  • Support the team to ensure the correct data is in place for further analysis and manipulation
  • Explore new data sources to identify potentially useful data for incorporation into bigger-picture deliverables
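To give a flavour of the pipeline work described above, here is a minimal sketch of an idempotent extract-transform-load step in Python. The payload shape, field names, and `posts` table are purely illustrative assumptions; a production pipeline would pull from a real public API and load into the warehouse rather than into SQLite:

```python
import json
import sqlite3

def transform(records):
    """Keep only the fields the warehouse table needs, coercing types
    and dropping malformed records (illustrative schema)."""
    return [
        (str(r["id"]), r["title"].strip(), int(r.get("views", 0)))
        for r in records
        if "id" in r and "title" in r
    ]

def load(rows, conn):
    """Idempotent load: upsert on the primary key so reruns are safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS posts "
        "(id TEXT PRIMARY KEY, title TEXT, views INTEGER)"
    )
    conn.executemany(
        "INSERT INTO posts VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET title=excluded.title, views=excluded.views",
        rows,
    )
    conn.commit()

# Example run against a stubbed API payload (a real extract step
# would fetch this JSON from the source API instead).
raw = json.loads('[{"id": 1, "title": " Hello ", "views": "42"}]')
conn = sqlite3.connect(":memory:")
load(transform(raw), conn)
```

Making the load step an upsert rather than a plain insert is one way to keep re-runs of a failed pipeline from duplicating rows.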

Proficiency in modern technologies:

  • ETL Pipelines
  • Amazon Redshift
  • Python and R

Essential attributes:

  • Experience with a Data Science role or similar (Data Analyst, BI Analyst, Statistical Programmer, Analyst Programmer)
  • Experience working with relational and non-relational database technologies: SQL, NoSQL, basics of database design and operation
  • Experience manipulating and processing data with Python or a similar scripting language. Experience interfacing between this and the data storage/retrieval layer
  • Communication skills: the ability to translate simple requirements into potentially complex solutions, and to explain the output concisely to non-technical stakeholders
  • Comfort working both independently and as part of a team
  • A productive and personable outlook, enriching the culture of the business
  • Experience with data infrastructure considerations: storage and pipelines
  • Experience with cloud data products: AWS, GCP, etc.
  • Experience with Linux