City of London, London, United Kingdom Hybrid / WFH Options
83data
continue to scale their data infrastructure, they're seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
City of London, London, United Kingdom Hybrid / WFH Options
Hartree Partners
pandas, xarray, SciPy/PyMC/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but …
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
PyTorch). Experience working with financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's on Offer: Competitive salary …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ Years data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: DBT, Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus, Up to 14% Pension Contribution, 29 Days Annual Leave + Bank Holidays, Free Company Shares. Interviews are ongoing; don't miss your chance.
City of London, England, United Kingdom Hybrid / WFH Options
Jefferson Frank
the business's data arm. Requirements: * 3+ Years data engineering experience * Snowflake experience * Proficiency across an AWS tech stack * DBT Expertise * Terraform Experience Nice to Have: * Data Modelling * Data Vault * Apache Airflow Benefits: * Up to 10% Bonus * Up to 14% Pension Contribution * 29 Days Annual Leave + Bank Holidays * Free Company Shares Interviews are ongoing; don't miss your chance.
City of London, London, United Kingdom Hybrid / WFH Options
Kolayo
complex data-related issues. Nice to Have: Familiarity with cloud platforms (AWS, GCP, Azure) and cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration and pipeline technologies (Apache Airflow/Kafka, Azure Data Factory, etc.). Experience with DBT for modelling. Server administration and networking fundamentals …
City of London, London, United Kingdom Hybrid / WFH Options
Prism Digital
on cloud experience (AWS, Azure, or GCP) with IaC (Terraform) and CI/CD (Jenkins). Experience in developing and deploying AI/ML pipelines. Familiarity with Airflow, Redshift, and dbt. What you will do: Architect, build, and optimise robust ETL processes for efficient data extraction, transformation, and loading. Develop an automation model and sophisticated data … pipelines using Python, Airflow, Redshift, and more. Collaborate with data scientists to design, implement, and deploy AI & ML pipelines, turning raw data into actionable insights. Drive scalable data architectures using Terraform and Jenkins, ensuring optimal performance and security. Benefits: Annual Leave: 24 days to start, rising to 29 days. Parental Leave: Enhanced provisions for growing families. Health …
City of London, London, United Kingdom Hybrid / WFH Options
Block MB
cost. Collaborate with engineers to integrate data best practices across teams. Champion data quality, governance, and documentation. Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for clean, maintainable code …
more: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Why Join Us? Work with large-scale, high-impact data. Solve real-world problems with a top-tier …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
based work as well as migration tasks. Due to your seniority you will also be tasked with mentoring junior engineers. Work extensively and proficiently with Snowflake, AWS, DBT, Terraform, Airflow, SQL, and Python. Requirements: 3+ Years data engineering experience, Snowflake experience, Proficiency across an AWS tech stack, DBT expertise, Terraform experience, Expert SQL and Python, Data Modelling, Data Vault, Apache Airflow. My client has very limited interview slots and is looking to fill this vacancy ASAP. I have limited slots for 1st stage interviews next week, so if you're interested, get in touch ASAP with a copy of your most up to date CV and email me at m.fox@tenthrevolution.com or call me on …
Spark (PySpark). Experience with Azure Databricks, Delta Lake, and data architecture. Familiarity with Azure cloud, version control (e.g., Git), and DevOps pipelines. Experience with tools like Apache Airflow, dbt, and Power BI. Working knowledge of NoSQL databases, API integration, and economic/financial data. Azure certifications (e.g., Azure Data Fundamentals) are highly desirable.
team are working on the core research data platform, as well as data infrastructure, ingestion pipelines and back end services for the aforementioned trading desks. Tech Stack: Python, ETL, Airflow, SQL, AWS. Please apply if this is of interest.
years+ experience in a relevant role. This will mean exposure to market data vendors, the ability to communicate with traders, and mentoring junior engineers. Tech stack: Deep Python, Pandas, AWS, Airflow, Kubernetes, ETL, SQL. Please apply if this is of interest.
City of London, London, United Kingdom Hybrid / WFH Options
Radley James
Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring. Clear communicator, comfortable with cross-functional teams. Desirable Experience: APIs from major financial data providers, dbt, Snowflake, Kafka, Airflow, Java feedhandler support, Migration of legacy systems (e.g. MATLAB). This position offers a competitive compensation package and hybrid working model.
City of London, London, United Kingdom Hybrid / WFH Options
Winston Fox
in a Hedge Fund, Trading Firm, and/or working with Quant Trading Technology. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other Cloud Data Warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset…). SDLC and DevOps …
City of London, London, United Kingdom Hybrid / WFH Options
Winston Fox
… years' experience gained in a Hedge Fund, Investment Bank, FinTech or similar. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other Cloud Data Warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset…). SDLC and DevOps: Git …
data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL and experience with schema design and query optimization for large …
City of London, England, United Kingdom Hybrid / WFH Options
ACLED
English, problem-solving skills, attention to detail, ability to work remotely. Desirable: Cloud architecture certification (e.g., AWS Certified Solutions Architect). Experience with Drupal CMS, geospatial/mapping tools, Apache Airflow, serverless architectures, API gateways. Interest in conflict data, humanitarian tech, open data platforms; desire to grow into a solution architect or technical lead role. Application Process: Submit …
in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps …
City of London, London, United Kingdom Hybrid / WFH Options
Immersum
Data Engineer (leading a team of 5). Salary: £130,000 – £150,000 + benefits. Location: West London, Hybrid (3 days p/w in-office). Tech: AWS, Snowflake, Airflow, DBT, Python. The Company: Immersum have engaged with a leading PropTech company on a mission to revolutionise how the property sector understands people, places, and data. By combining cutting … product, data science, and engineering teams. Leading a small team of 5 data engineers. What you'll bring: Strong leadership experience in data engineering. Deep expertise with AWS, Snowflake, Airflow, and DBT. A pragmatic, product-first approach to building data systems. Excellent communication and stakeholder management skills. Solid understanding of agile data development lifecycles. Why Join: Be a key …
Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with the Data, Architecture, Business Analyst, and Data Steward teams to integrate and align requirements, specifications and constraints of each element … and system reliability. Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create and maintain data pipelines using Airflow and Snowflake as primary tools. Create SQL stored procedures to perform complex transformations. Understand data requirements and design optimal pipelines to fulfil the use-cases. Creating logical and physical data models …
City of London, England, United Kingdom Hybrid / WFH Options
Cognify Search
needs. An understanding of data in relation to dynamic pricing, revenue operations & management, demand forecasting, competitor analysis etc. would be useful! SQL, Python, Cloud (GCP, Azure, AWS or Snowflake), Airflow/dbt etc. Interested in hearing more? Apply.