City of London, London, United Kingdom Hybrid / WFH Options
83data
continue to scale their data infrastructure, they're seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
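The orchestration duties listed above (scheduling, monitoring, alerting) are what Airflow provides per task out of the box. As a rough illustration of the retry-then-alert behaviour, here is a plain-Python sketch; the `alert` hook is a made-up stand-in for a real email/Slack/PagerDuty notification, and the flaky task is invented for the example:

```python
import time

def run_with_retries(task, retries=3, delay_seconds=0, alert=print):
    """Run a zero-argument task, retrying on failure and alerting when retries are exhausted."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == retries:
                # Orchestrators fire an on-failure callback here (email, pager, etc.).
                alert(f"task failed after {retries} attempts: {exc}")
                raise
            time.sleep(delay_seconds)

# Simulated task that fails twice before succeeding.
calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient error")
    return "loaded"

result = run_with_retries(flaky)
print(result)  # "loaded", reached on the third attempt
```

In Airflow itself this maps to per-task `retries`/`retry_delay` settings and an `on_failure_callback`, rather than hand-written loops.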
City of London, London, United Kingdom Hybrid / WFH Options
Hartree Partners
pandas, xarray, SciPy/PyMC/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but …
City of London, London, United Kingdom Hybrid / WFH Options
SGI
for deployment and workflow orchestration. Solid understanding of financial data and modelling techniques (preferred). Excellent analytical, communication, and problem-solving skills. Experience with data engineering & ETL tools such as Apache Airflow or custom ETL scripts. Strong problem-solving skills with a keen analytical mindset, especially in handling large data sets and complex data transformations. Strong experience in setting …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; DevOps experience building and deploying using Terraform. Nice to Have: DBT, Data Modelling, Data Vault, Apache Airflow. Benefits: up to 10% bonus, up to 14% pension contribution, 29 days annual leave + bank holidays, free company shares. Interviews are ongoing, so don't miss your chance …
more: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Why Join Us? Work with large-scale, high-impact data. Solve real-world problems with a top-tier …
City of London, London, United Kingdom Hybrid / WFH Options
Roc Search
Manage deployments with Helm and configuration in YAML. Develop shell scripts and automation for deployment and operational workflows. Work with Data Engineering to integrate and manage data workflows using Apache Airflow and DAG-based models. Perform comprehensive testing, debugging, and optimization of backend components. Required Skills Bachelor's degree in Computer Science, Software Engineering, or a related field … and YAML for defining deployment configurations and managing releases. Proficiency in shell scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Familiarity with database systems (SQL and NoSQL) and proficiency in writing efficient queries. Solid understanding of software development best practices, including …
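The DAG model mentioned above is, at heart, dependency ordering: a task runs only after all of its upstream tasks have completed. A minimal sketch of that idea using only the Python standard library (this is not Airflow code; the task names are invented for illustration):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
# Airflow resolves run order from a DAG definition in the same spirit.
dag = {
    "extract":   set(),
    "validate":  {"extract"},
    "transform": {"validate"},
    "load":      {"transform"},
    "report":    {"load"},
}

# static_order() yields tasks so that dependencies always precede dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'validate', 'transform', 'load', 'report']
```

The "acyclic" requirement matters: `TopologicalSorter` (and Airflow) will reject a graph where tasks depend on each other in a loop, since no valid run order exists.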
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
based work as well as migration tasks. Due to your seniority you will also be tasked with mentoring junior engineers. Work extensively and proficiently with Snowflake, AWS, DBT, Terraform, Airflow, SQL, and Python. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; DBT expertise; Terraform experience; expert SQL and Python; Data Modelling; Data Vault; Apache Airflow. My client has very limited interview slots and is looking to fill this vacancy ASAP. I have limited slots for first-stage interviews next week, so if you're interested, get in touch ASAP with a copy of your most up-to-date CV and email me at m.fox@tenthrevolution.com or call me on …
and AWS tools (Redshift, Glue, S3, Lambda, etc.). Strong grasp of data modeling, governance, and pipeline orchestration. Excellent communication and collaboration skills in Agile teams. Desirable: experience with dbt, Airflow, Monte Carlo, and BI tools (Power BI, Tableau, QuickSight); knowledge of data product principles, real-time pipelines, and data enablement programs. Join us to shape a data-driven future and …
City of London, London, United Kingdom Hybrid / WFH Options
Radley James
Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring. Clear communicator, comfortable with cross-functional teams. Desirable Experience: APIs from major financial data providers; dbt, Snowflake; Kafka, Airflow; Java feedhandler support; migration of legacy systems (e.g. MATLAB). This position offers a competitive compensation package and hybrid working model.
Python, and AWS tools (Redshift, Glue, S3, Lambda, etc.). Strong grasp of data modeling, governance, and pipeline orchestration. Excellent communication and collaboration skills in Agile teams. Experience with dbt, Airflow, Monte Carlo, and BI tools (Power BI, Tableau, QuickSight). Knowledge of data product principles, real-time pipelines, and data enablement programs. Join us to shape a data-driven future and …
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
data warehouses). Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring. Clear communicator, comfortable with cross-functional teams. APIs from major financial data providers; dbt, Snowflake; Kafka, Airflow; Java feedhandler support; migration of legacy systems (e.g. MATLAB). This position offers a competitive compensation package and hybrid working model.
City of London, London, United Kingdom Hybrid / WFH Options
Winston Fox
in a Hedge Fund, Trading Firm, and/or working with Quant Trading Technology. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other Cloud Data Warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset, …). SDLC and DevOps …
City of London, London, United Kingdom Hybrid / WFH Options
Winston Fox
Experience gained in a Hedge Fund, Investment Bank, FinTech or similar. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other Cloud Data Warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset, …). SDLC and DevOps: Git …
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
in a Hedge Fund, Trading Firm, and/or working with Quant Trading Technology. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other Cloud Data Warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset, …). SDLC and DevOps …
City of London, London, England, United Kingdom Hybrid / WFH Options
Avanti
data modelling concepts (e.g. dimensional models, star/snowflake schemas). Knowledge of data quality, access controls, and compliance frameworks. Nice to Have: Experience with orchestration or pipeline frameworks like Airflow or dbt. Familiarity with BI platforms (e.g. Power BI, Tableau, QuickSight). Exposure to streaming data, observability, or data lineage tools. Comfort working with diverse data sources such as APIs …
in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps …
As a Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with Data Architects, Business Analysts, and Data Stewards to integrate and align the requirements, specifications and constraints of each element … and system reliability. Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create and maintain data pipelines using Airflow and Snowflake as primary tools. Create SQL stored procedures to perform complex transformations. Understand data requirements and design optimal pipelines to fulfil the use cases. Create logical and physical data models …
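The data-quality tools named above (Great Expectations, Soda) codify checks as declarative "expectations" run against each batch of data. Here is a minimal, hand-rolled sketch of that idea; the function names, sample rows, and thresholds are invented for illustration and are not the API of either tool:

```python
def expect_no_nulls(rows, column):
    """Return (passed, failure_count) for a not-null expectation on one column."""
    failures = sum(1 for row in rows if row.get(column) is None)
    return failures == 0, failures

def expect_values_between(rows, column, low, high):
    """Return (passed, failure_count) for a range expectation; nulls are skipped."""
    failures = sum(
        1 for row in rows
        if row.get(column) is not None and not (low <= row[column] <= high)
    )
    return failures == 0, failures

# Toy batch with one null and one out-of-range value.
trades = [
    {"id": 1, "price": 101.5},
    {"id": 2, "price": None},
    {"id": 3, "price": -4.0},
]

print(expect_no_nulls(trades, "price"))               # (False, 1)
print(expect_values_between(trades, "price", 0, 1e4))  # (False, 1)
```

In practice such checks run as a pipeline step (for example, an Airflow task after the load), failing the run or raising an alert when an expectation is not met rather than silently propagating bad data.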
City of London, England, United Kingdom Hybrid / WFH Options
Parser
you'll make: Design and develop reusable Python packages (pip/conda) to productionize data science solutions. Process big data at scale using Hadoop/Spark and optimize workflows with Airflow/orchestration tools. Build scalable applications and REST/RPC APIs (Flask/FastAPI/gRPC) for global products. Advocate for engineering best practices, including CI/CD, DevOps … and containerization (Docker/Kubernetes). Mentor junior engineers and lead initiatives to enhance research tooling and dashboards. Languages: Python, SQL. Tools: Spark, Hadoop, Airflow, Docker, FastAPI/Flask. Cloud: AWS, CI/CD pipelines (Jenkins, Git). What you'll bring to us: 7+ years in data engineering/science, with expertise in Python, Spark, and SQL. Proven experience …
years+ experience in a relevant role. This will mean exposure to market data vendors, the ability to communicate with traders, and experience mentoring junior engineers. Tech stack: deep Python, Pandas, AWS, Airflow, Kubernetes, ETL, SQL. Please apply if this is of interest.
Financial Services (FS). Deep expertise in the full data engineering lifecycle—from data ingestion through to end-user consumption. Practical experience with modern data tools and platforms, including Redshift, Airflow, Python, DBT, MongoDB, AWS, Looker, and Docker. Strong grasp of best practices in data modelling, transformation, and orchestration. Proven ability to build and support both internal analytics solutions and …
data warehousing - any of BigQuery, Redshift, Snowflake or Databricks is fine. Experience working with cloud infrastructures - AWS and/or GCP being most advantageous. 👍 Bonus points for experience with: Airflow; RudderStack, Expo and/or Braze (or similar tools); working in a small, fast-growing start-up, comfortable navigating unstructured/fuzzy environments.
Computer Science or a related discipline. Solid Python programming skills. Good working knowledge of SQL. Comfortable using Git for version control. Desirables: exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster); experience with cloud data warehouses (Azure SQL, Snowflake) or dbt; basic familiarity with Docker and BI tools (Power BI, Tableau); interest in shipping, financial markets, or commodities. Package …
. Strong understanding of data security, quality, and governance principles. Excellent communication and collaboration skills across technical and non-technical teams. Bonus Points For: Experience with orchestration tools like Apache Airflow. Familiarity with real-time data processing and event-driven systems. Knowledge of observability and anomaly detection in production environments. Exposure to visualization tools like Tableau or Looker. Relevant …
City of London, England, United Kingdom Hybrid / WFH Options
Jefferson Frank
Lead Data Engineer - Snowflake, DBT, Airflow - London - Up to £100k. I'm working with a key client of ours here at TRG who are looking to grow out their Data & Analytics function. My client are globally renowned for being a leader within their field. Whilst they are a very well-recognised household brand, they are also known … happy employee means higher productivity! So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required: * Expert in Snowflake * Strong DBT experience * Strong Airflow experience * Expert knowledge and understanding of Data Warehousing * Strong AWS experience. This is a great opportunity to join an outstanding organisation who pride themselves on being one of the best … is of interest then get in touch ASAP. Send across your CV to t.shahid@nigelfrank.com or alternatively, give me a call on 0191 3387551. Keywords: Snowflake, DBT, SQL, Airflow, AWS, Engineer, DWH, Data Warehouse, Data Warehousing, Architecture, London