Tableau (or similar). Strong knowledge of SQL and experience in testing databases and data warehouses (e.g., Snowflake - preferred, Redshift, BigQuery) with dbt. Strong knowledge of workload-automation platforms such as Apache Airflow and dbt (Data Build Tool). Familiarity with CI/CD tools (e.g., Azure DevOps - preferred, Jenkins) and experience integrating automated tests into pipelines. Experience with cloud platforms (AWS … for developing test automation scripts and frameworks. Proficiency with automation testing frameworks (Cucumber, Gherkin, TestNG, or similar) for data-testing workloads. Knowledge of performance-testing and load-testing tools (Apache JMeter or Gatling). Experience: proven track record in supporting and improving test processes in data-related projects. Experience in leadership, mentoring, or training roles is highly advantageous. Desirable Qualifications
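The warehouse-testing work described above can be illustrated with a minimal sketch. An in-memory SQLite database stands in for the actual warehouse (Snowflake, Redshift, or BigQuery), and the `orders` table and `check_no_null_keys` helper are hypothetical names invented for this example, not from any listing:

```python
# Minimal sketch of an automated data-quality check, assuming a SQL-accessible
# warehouse. sqlite3 is a stand-in; a real suite would use the warehouse's
# own connector and run inside a CI/CD pipeline stage.
import sqlite3

def check_no_null_keys(conn, table, key_column):
    """Return True if no row has a NULL in its key column."""
    cur = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
    return cur.fetchone()[0] == 0

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.99), (2, 14.50)])

assert check_no_null_keys(conn, "orders", "order_id")
print("data-quality check passed")
```

Checks of this shape are what frameworks like Cucumber/Gherkin scenarios or dbt schema tests express declaratively.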
our data systems, all while mentoring your team and shaping best practices. Role Lead Technical Execution & Delivery - Design, build, and optimise data pipelines and data infrastructure using Snowflake, Hadoop, Apache NiFi, Spark, Python, and other technologies. - Break down business requirements into technical solutions and delivery plans. - Lead technical decisions, ensuring alignment with data architecture and performance best practices. - Optimise … data pipeline efficiency. All About You Technical & Engineering Skills - Extensive demonstrable experience in data engineering, with expertise in building scalable data pipelines and infrastructure. - Deep understanding of Snowflake, Hadoop, Apache NiFi, Spark, Python, and other data technologies. - Strong experience with ETL/ELT processes and data transformation. - Proficiency in SQL, NoSQL, and data modelling. - Familiarity with cloud data platforms
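At its simplest, the pipeline work this role describes is an extract-transform-load stage. The sketch below uses plain Python as a stand-in for Spark or NiFi, and the record shape and function names are invented for illustration:

```python
# Illustrative ETL stage, not any specific framework's API. In a real
# pipeline, extract() would read from Snowflake/HDFS/a NiFi flow and
# load() would write to a warehouse table rather than aggregate in place.
def extract():
    # Hypothetical raw records as they might arrive from a source system.
    return [{"user": "a", "spend": "10.5"}, {"user": "b", "spend": "3.0"}]

def transform(records):
    # Cast string fields to numeric types, as a Spark job might via a cast.
    return [{"user": r["user"], "spend": float(r["spend"])} for r in records]

def load(records):
    # Stand-in sink: aggregate and return instead of writing out.
    return sum(r["spend"] for r in records)

total = load(transform(extract()))
print(total)  # 13.5
```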
London, England, United Kingdom Hybrid / WFH Options
DEPOP
experience managing data infrastructure or platform teams at scale, ideally in a consumer or marketplace environment. Deep understanding of distributed systems and modern data ecosystems - including experience with Databricks, Apache Spark, Apache Kafka and dbt. Demonstrated success in managing data platforms at scale, including both batch processing and real-time streaming architectures. Deep understanding of data warehousing concepts
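The batch-versus-streaming distinction mentioned above can be made concrete with a toy example: a tumbling-window count over an event stream, the kind of aggregation Spark Structured Streaming or Kafka Streams performs continuously. Plain Python and hypothetical timestamps are used here for illustration only:

```python
# Toy tumbling-window aggregation: bucket events into fixed-size windows
# keyed by window start time. A streaming engine does this incrementally
# over an unbounded stream; here we process a finite list.
from collections import defaultdict

def tumbling_window_counts(event_times, window_seconds):
    """Count events per fixed-size window, keyed by window start."""
    counts = defaultdict(int)
    for ts in event_times:
        window_start = ts - (ts % window_seconds)
        counts[window_start] += 1
    return dict(counts)

events = [0, 2, 5, 7, 11]  # event timestamps in seconds (illustrative)
print(tumbling_window_counts(events, 5))  # {0: 2, 5: 2, 10: 1}
```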
London, England, United Kingdom Hybrid / WFH Options
SBS
and Lakehouse Design: Strong data modelling, design, and integration expertise. Data Mesh Architectures: In-depth understanding of data mesh architectures. Technical Proficiency: Proficient in dbt, SQL, Python/Java, Apache Spark, Trino, Apache Airflow, and Astro. Cloud Technologies: Awareness and experience with cloud technologies, particularly AWS. Analytical Skills: Excellent problem-solving and analytical skills with attention to detail.
focus on automation and data process improvement. Demonstrated experience in designing and implementing automation frameworks and solutions for data pipelines and transformations. Strong understanding of data processing frameworks (e.g., Apache Spark, Apache Kafka) and database technologies (e.g., SQL, NoSQL). Expertise in programming languages relevant to data engineering (e.g., Python, SQL). Hands-on data preparation activities using
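One common shape for the automation frameworks this listing asks about is a registry of named data checks that all run against a dataset and report results together. The sketch below is a minimal illustration of that idea; the check names and sample rows are hypothetical, not from any specific framework:

```python
# Minimal check-registry pattern for pipeline validation: checks register
# themselves via a decorator, and run_all executes every registered check.
CHECKS = {}

def check(name):
    """Decorator that registers a check function under a name."""
    def register(fn):
        CHECKS[name] = fn
        return fn
    return register

@check("row_count_positive")
def row_count_positive(rows):
    return len(rows) > 0

@check("no_negative_amounts")
def no_negative_amounts(rows):
    return all(r["amount"] >= 0 for r in rows)

def run_all(rows):
    """Run every registered check; return {check_name: passed}."""
    return {name: fn(rows) for name, fn in CHECKS.items()}

rows = [{"amount": 5.0}, {"amount": 0.0}]
print(run_all(rows))  # {'row_count_positive': True, 'no_negative_amounts': True}
```

Frameworks like TestNG or pytest provide the same registration-and-run loop with richer reporting.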
Glue Catalog, and AWS Glue DataBrew. They are experienced in developing batch and real-time data pipelines for Data Warehouse and Data Lake, utilizing AWS Kinesis and Managed Streaming for Apache Kafka. They are also proficient in using open-source technologies like Apache Airflow and dbt, Spark/Python or Spark/Scala on AWS Platform. The data engineer
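The orchestration tools named across these listings (Apache Airflow in particular) share one core idea: tasks form a dependency graph and run in topological order. A toy sketch of that idea, using the standard library rather than Airflow's API, with hypothetical task names:

```python
# Toy DAG runner in the spirit of Airflow: declare task dependencies,
# then execute each task only after its upstream tasks have run.
# graphlib.TopologicalSorter (stdlib, Python 3.9+) does the ordering.
from graphlib import TopologicalSorter

def run_dag(tasks, deps):
    """Execute task callables in dependency order; return that order."""
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()  # run the task
    return order

tasks = {
    "extract": lambda: "raw",
    "transform": lambda: "clean",
    "load": lambda: "done",
}
deps = {"transform": {"extract"}, "load": {"transform"}}
print(run_dag(tasks, deps))  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, retries, and distributed execution on top of this dependency-ordering core.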
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
PyTorch). Experience working with financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's on Offer Competitive salary up
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Proficiency in cloud-based data architecture (AWS, Azure, GCP, Snowflake). Understanding of Data Mesh, Data Fabric, and product-led data strategies. Technical Knowledge: Familiarity with big data technologies (Apache Spark, Hadoop). Knowledge of programming languages such as Python, R, or Java. Experience with ETL/ELT processes, SQL, NoSQL databases, and DevOps principles. Understanding of AI and
skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least