Senior Data Engineer

  • Temporary
  • London
  • Negotiable GBP / Year

Our client, a leading global supplier of IT services, requires Senior Data Engineers to be based at their client’s office in London, UK.

This is a hybrid role – you can work remotely within the UK and attend the London office three days per week.

This is a 6+ month temporary contract to start ASAP.

Day rate: competitive market rate

This is an exciting opportunity to work with a talented team, harness cutting-edge technologies such as Data Vault, Azure, Snowflake, DBT, and Airflow, and drive innovative solutions that shape the future of the organisation.

As a Senior Data Engineer, you will play a crucial role in designing, developing, and maintaining scalable data pipelines, ensuring high-quality data is available for decision-making across the business. You’ll collaborate closely with Data Architects, Product teams, Analysts, and Data Scientists to implement data solutions that power analytics and reporting.

The ideal candidate will have strong experience in data engineering, cloud platforms such as Azure and AWS, tools such as DBT, data modelling, and ETL/ELT processes.

Key Responsibilities

  • ETL Pipeline Development: Develop, optimise, and maintain ETL pipelines to efficiently extract, transform, and load data from various sources, ensuring high data quality
  • Monitor and troubleshoot production data pipelines, ensuring their performance and reliability
  • Mentor Junior Engineers and lead technical discussions to drive best practices and innovation within the team
  • Stay up to date with the latest trends and technologies in data engineering and recommend solutions to improve data processing capabilities
  • Query Optimisation & Data Transformation: Write and optimise SQL queries, ensuring data integrity, performance, and scalability, using best practices and techniques
  • Data Vault Model Implementation: Implement a flexible Data Vault model in Snowflake to support large-scale analytics and business intelligence
  • Cross-Team Collaboration: Collaborate with Data Engineers, Product Managers, and Data Scientists to deliver solutions that support data-driven insights and innovation
  • Stakeholder Engagement: Engage with business stakeholders to understand requirements and translate them into technical solutions that add value
  • Data Quality & Governance: Implement and enforce data governance and quality processes, ensuring accurate and consistent data flows across all systems
  • Cloud & Infrastructure Support: Work with cloud platforms such as AWS/Azure and DBT with Snowflake to build and maintain scalable data solutions
  • Continuous Improvement: Proactively look for ways to improve data systems, processes, and tools, ensuring efficiency and scalability

Key Requirements

Essential Skills:

  • ETL/ELT & Data Pipelines: Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform, and Airflow
  • Experience in designing and implementing data products and solutions on cloud-based architectures
  • Cloud Platforms: Experience working with cloud data warehouses and analytics platforms such as Snowflake and AWS/Azure
  • GitHub Skills and Experience: Proficiency with Git and GitHub for version control, code collaboration, and managing data engineering projects
  • Data Governance and Compliance: Expertise in implementing data governance frameworks in Alation, including data quality management and compliance with industry regulations
  • Communication & Collaboration: Excellent interpersonal and communication skills for engaging stakeholders, presenting technical concepts, and working cross-functionally with engineers and analysts
  • SQL Proficiency: Expertise in writing complex SQL queries, query optimisation, and database design for analytics
  • Problem-Solving & Analytical Thinking: Ability to think critically and solve complex problems, translating business requirements into actionable insights

Desirable Skills:

  • Knowledge of Data Visualisation Tools: Experience with tools such as MicroStrategy / Power BI
  • Terraform: Experience with Terraform and Terragrunt for infrastructure as code
  • GenAI: Experience with Generative AI technologies

Due to the volume of applications received, unfortunately we cannot respond to everyone.

If you do not hear back from us within 7 days of sending your application, please assume that you have not been successful on this occasion.

Please do keep an eye on our website https://projectrecruit.com/jobs/ for future roles.
