London, England, United Kingdom Hybrid / WFH Options
Marshmallow
are often financially disadvantaged when they arrive in the UK. Innovative Tech Stack: Work with modern tools like Snowflake, DBT, AWS Kinesis, Step Functions, Airflow, Datahub, and Looker. What You’ll Be Doing Platform Ownership: As the subject matter expert for our data platform, you'll manage and enhance … services like S3, Kinesis, Lambda, and Glue, and how they combine to create an effective data platform. Data Orchestration: Familiar with orchestration tools like Airflow or AWS Step Functions. SQL & Data Modelling: Advanced SQL skills and experience in data modelling. Technical Proficiency: Skilled in Python, DBT, Terraform, and Docker More ❯
deployment pipelines Proficient in cloud AI services (AWS SageMaker/Bedrock) Deep understanding of distributed systems and microservices architecture Expert in data pipeline platforms (Apache Kafka, Airflow, Spark) Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases Strong containerization and orchestration skills (Docker, Kubernetes) Experience with More ❯
In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their … the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate More ❯
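Several listings in this digest pair Snowflake with dbt and Airflow; the pattern they usually describe is incremental, watermark-based loading. As a rough sketch only (table and column names are invented, and SQLite stands in for Snowflake — a real pipeline would express this as a dbt model or an Airflow task):

```python
import sqlite3

def incremental_load(conn, last_watermark):
    """Copy only rows newer than the watermark from a source table into a
    target table, returning the new watermark. Schema is illustrative."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM src_orders WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()
    conn.executemany(
        "INSERT OR REPLACE INTO tgt_orders (id, amount, updated_at) VALUES (?, ?, ?)",
        rows,
    )
    # Advance the watermark to the newest row copied this run.
    return max((r[2] for r in rows), default=last_watermark)

# Demo with an in-memory database and made-up rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER, amount REAL, updated_at TEXT)")
conn.execute("CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO src_orders VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-02"), (3, 30.0, "2024-01-03")],
)
wm = incremental_load(conn, "2024-01-01")  # picks up rows 2 and 3 only
```

The same idea underpins dbt's incremental materializations: each run filters the source on a high-water mark instead of rebuilding the whole table.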
London, England, United Kingdom Hybrid / WFH Options
Scope3
ORM & PostgreSQL REST and GraphQL APIs React w/Next.js for frontend applications Low latency + high throughput Golang API Big Query Data warehouse Airflow for batch orchestration Temporal for event orchestration Apache Beam (dataflow runner) for some batch jobs Most transformations are performed via SQL directly in … with Google Cloud Platform and/or Amazon Web Services Expertise in Python, SQL Big Query or equivalent data warehouse experience (Redshift, Snowflake, etc.) Airflow or equivalent in-house data platform experience (Prefect, Dagster, etc.) Experience with Clickhouse Demonstrated experience perpetuating an inclusive and collaborative working environment Preference may More ❯
London, England, United Kingdom Hybrid / WFH Options
Solirius Reply
schemas to support business requirements Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based. Working collaboratively with the client …/Azure SQL, PostgreSQL) You have framework experience within either Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving More ❯
London, England, United Kingdom Hybrid / WFH Options
Merantix
Preferred Qualifications Hands-on experience with: Distributed computing frameworks, such as Ray Data and Spark. Databases and/or data warehousing technologies, such as Apache Hive. Data transformation via SQL and DBT. Orchestration platforms, such as Apache Airflow. Data catalogs and metadata management tools. Vector data stores. More ❯
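The "vector data stores" qualification above refers to similarity search over embeddings. The core operation, which dedicated vector databases accelerate with approximate-nearest-neighbour indexes, can be sketched with brute-force cosine similarity in the standard library (the vectors and ids here are toy data, not from any real store):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, store, k=2):
    """Return the k ids whose vectors are most similar to the query."""
    ranked = sorted(store, key=lambda item: cosine(query, item[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy "store" of (id, embedding) pairs.
store = [("a", [1.0, 0.0]), ("b", [0.0, 1.0]), ("c", [0.9, 0.1])]
best = top_k([1.0, 0.05], store, k=2)
```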
London, England, United Kingdom Hybrid / WFH Options
Datapao
companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. … Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You’re proficient in SQL and Python, using them to transform and optimize data like a pro; You know … at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, you'll get dedicated time for learning More ❯
London, England, United Kingdom Hybrid / WFH Options
DATAPAO
most complex projects - individually or by leading small delivery teams. Our projects are fast-paced, typically 2 to 4 months long, and primarily use Apache Spark/Databricks on AWS/Azure. You will manage customer relationships either alone or with a Project Manager, and support our pre-sales … GCP); Proven experience with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); Extensive ETL/ELT and data pipeline orchestration experience (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions); Proficiency in SQL and Python for data transformation and optimization; Knowledge of CI/CD pipelines and Infrastructure as Code More ❯
London, England, United Kingdom Hybrid / WFH Options
Whitehall Resources Ltd
systems into scalable cloud-native solutions • Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow • Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code • Participate in code reviews, ensuring adherence to best … experience working with Teradata and Informatica. • Proficiency in working with legacy systems and traditional ETL workflows • Solid experience building data pipelines using modern tools (Airflow, DBT, Glue etc.) and working with large volumes of structured and semi-structured data • Demonstrated experience with SQL and Python for data manipulation, pipeline More ❯
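"Structured and semi-structured data" in ads like the one above usually means flattening nested records (for example JSON events) into relational rows before loading. A minimal stdlib-only sketch, with an invented event shape and SQLite standing in for the warehouse:

```python
import json
import sqlite3

def flatten_events(raw_lines):
    """Parse newline-delimited JSON events and flatten the nested
    'user' object into top-level columns (shape is invented)."""
    for line in raw_lines:
        event = json.loads(line)
        yield (event["id"], event["user"]["country"], event["amount"])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, country TEXT, amount REAL)")

raw = [
    '{"id": 1, "user": {"country": "UK"}, "amount": 9.5}',
    '{"id": 2, "user": {"country": "DE"}, "amount": 4.0}',
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", flatten_events(raw))
total_uk = conn.execute(
    "SELECT SUM(amount) FROM events WHERE country = 'UK'"
).fetchone()[0]
```

Warehouse engines such as Snowflake can also query the nested form directly (via a VARIANT-style column), but the flatten-then-load pattern above is the one traditional ETL workflows describe.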
London, England, United Kingdom Hybrid / WFH Options
Ziff Davis
independently. The ideal candidate will have strong software development experience with Python and SQL, along with a keen interest in working with Docker, Kubernetes, Airflow, and AWS data technologies such as Athena, Redshift, and EMR. You will join a team of over 25 engineers across mobile, web, data, and … with CI/CD practices and tools Understanding of Agile Scrum development lifecycle What you’ll be doing: Implementing and maintaining ETL pipelines using Airflow and AWS technologies Contributing to data-driven tools, including content personalization Managing ingestion frameworks and processes Monitoring and maintaining our data infrastructure in AWS More ❯
London, England, United Kingdom Hybrid / WFH Options
RELX
The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platforms that enable a … driving a culture of iterative improvement. Modern data stack – hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, DBT, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Coding languages – deployable and reusable Python, JavaScript, and Jinja templating languages for ETL/ELT data More ❯
Cambridge, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Position: Data Engineer Location: Cambridge/Luton, UK (Hybrid 2-3 days onsite in a week) Duration: Long Term B2B Contract Job Description: The ideal candidate will have a minimum of 5+ years of experience working with Snowflake, DBT, Python More ❯
Mountain View, California, United States Hybrid / WFH Options
LinkedIn
LinkedIn is the world's largest professional network, built to create economic opportunity for every member of the global workforce. Our products help people make powerful connections, discover exciting opportunities, build necessary skills, and gain valuable insights every day. We More ❯
We're looking for a Senior Data Engineer to join Pleo's Business Analytics team. This team is responsible for delivering and enhancing high-quality, robust data solutions that drive commercial performance, revenue More ❯
London, England, United Kingdom Hybrid / WFH Options
Endava Limited
with business objectives. Key Responsibilities Architect, implement, and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake, or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target … ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Data Modelling: Designing dimensional, relational, and hierarchical data models. Scalability & Performance: Building fault-tolerant, highly available data architectures. Security More ❯
London, England, United Kingdom Hybrid / WFH Options
Workato
modern data architecture, including data lakes, data warehouses, structured and semi-structured data processing. Experience with data transformation tools (DBT, Coalesce) and orchestration frameworks (Airflow, Dagster) to build scalable pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). … Familiarity with emerging data technologies like open table formats such as Apache Iceberg, and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms (Denodo, Domo) to enable seamless self-service data exploration and analytics. Strong background in cloud platforms (AWS, Azure, Google Cloud) and their More ❯
London, England, United Kingdom Hybrid / WFH Options
Harnham
Gateway, and Kinesis. Integrating third-party APIs into the data platform and transforming data for CRM delivery. Migrating R-based data streams into modern Airflow-managed Python/DBT pipelines. Ensuring observability and reliability using CloudWatch and automated monitoring. Supporting both BAU and new feature development within the data … services including Lambda, API Gateway, S3, Kinesis, and CloudWatch. Strong programming ability in Python and data transformation skills using SQL and DBT. Experience with Airflow for orchestration and scheduling. Familiarity with third-party API integration and scalable data delivery methods. Excellent communication and the ability to work in a More ❯
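At its core, the Airflow orchestration this listing describes is running tasks in dependency order, with scheduling, retries and observability layered on top. A toy scheduler sketch using only the standard library (task names and payloads are invented) shows the underlying topological execution that Airflow generalises:

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in an order that respects dependencies.
    deps maps task name -> set of upstream task names."""
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        # Each task receives the results of everything already run.
        results[name] = tasks[name](results)
    return order, results

log = []
tasks = {
    "extract": lambda r: log.append("extract") or [1, 2, 3],
    "transform": lambda r: log.append("transform") or [x * 2 for x in r["extract"]],
    "load": lambda r: log.append("load") or len(r["transform"]),
}
deps = {"transform": {"extract"}, "load": {"transform"}}
order, results = run_pipeline(tasks, deps)
```

In a real Airflow DAG the same extract → transform → load edges would be declared between operators, and the scheduler, rather than a loop, decides when each task runs.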
City of London, London, United Kingdom Hybrid / WFH Options
Immersum
of 5). Salary: £130,000 – £150,000 + benefits Location: West London - Hybrid (3 days p/w in-office) Tech: AWS, Snowflake, Airflow, DBT, Python The Company: Immersum have engaged with a leading PropTech company on a mission to revolutionise how the property sector understands people, places … teams Leading a small team of 5 data engineers What you’ll bring: Strong leadership experience in data engineering Deep expertise with AWS, Snowflake, Airflow, and DBT A pragmatic, product-first approach to building data systems Excellent communication and stakeholder management skills Solid understanding of agile data development lifecycles More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
tech. The team are using Databricks and AWS and they’re keen for someone who’s worked across data warehouse architecture, orchestration tools like Airflow, and configuration-driven development. You’ll also work closely with analysts, scientists and other business teams, so you’ll need to be able to … modelling, and ETL/ELT pipelines Experience using tools like Databricks, Redshift, Snowflake, or similar Comfortable working with APIs, CLIs, and orchestration tools like Airflow Confident using Git and familiar with CI/CD processes (Azure DevOps or similar) Experience working in an Agile environment A proactive mindset — you More ❯
London, England, United Kingdom Hybrid / WFH Options
Zoopla
CI/CD, observability, versioning, and testing in data workflows Architect and evolve our data platform, including data warehousing (Redshift), lakehouse (Databricks), and orchestration (Airflow, Step Functions) capabilities Lead efforts around data governance/cataloging, compliance, and security, ensuring data is trustworthy and well-managed Requirements Essential skills & experience … implementing scalable data platforms and ETL/ELT pipelines Knowledge of data warehousing and data lake architectures, and modern orchestration tools (e.g. Step Functions, Airflow) Experience with infrastructure as code (e.g. Terraform) Understanding of data governance and data quality practices Ability to communicate technical concepts clearly and influence senior More ❯
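The "data governance and data quality practices" this role asks for typically reduce to automated checks on each batch: key uniqueness, required fields, value ranges. A minimal sketch of such a check (column names are invented; frameworks like dbt tests or Great Expectations formalise the same idea):

```python
def check_batch(rows, key, required):
    """Return human-readable data-quality failures for a batch of dict rows.
    Checks: unique key values, no missing required fields."""
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        if row.get(key) in seen:
            failures.append(f"row {i}: duplicate {key}={row.get(key)}")
        seen.add(row.get(key))
        for field in required:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing {field}")
    return failures

batch = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": ""},        # duplicate id AND missing email
    {"id": 2, "email": "b@x.com"},
]
problems = check_batch(batch, key="id", required=["email"])
```

In a pipeline these checks would gate promotion: a non-empty failure list fails the task, keeping bad batches out of downstream tables.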
London, England, United Kingdom Hybrid / WFH Options
HeliosX Group
looking to hire an inquisitive, solutions-focused Analytics Engineer to join our team. You’ll work with the latest cloud technologies (AWS, Snowflake, dbt, Airflow, etc.), building exceptional data products across the full stack from ingestion through to visualisations. You’ll also have the opportunity to work on exciting … SQL and Python, alongside an understanding of data warehouse design principles/best practices. Practical experience using data transformation (e.g. dbt) and orchestration (e.g. Airflow DAGs) tools, as well as experience using Git for version control. Understanding of data visualisation tools (we use Metabase) and applied experience building dashboards More ❯
About 9fin The world's largest asset class, debt, operates with the worst data. Technology has revolutionized equity markets with electronic trading, quant algos and instantaneous news. However, in debt capital markets, the picture is completely different. It still behaves More ❯