South East London, England, United Kingdom Hybrid / WFH Options
83data
continue to scale their data infrastructure, they’re seeking a seasoned Principal Data Engineer to help design, build, and optimise a modern data platform, with a focus on orchestration (Airflow), scalable data warehouse architecture, and high-performance pipelines. The ideal candidate will bring both technical depth and strategic thinking, with the ability to communicate effectively across business and technical … warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud …
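The orchestration duties this listing describes (dependency-ordered scheduling, retries, failure alerting) are what Apache Airflow provides out of the box. As a hedged illustration only, not the employer's code, here is a minimal pure-Python toy of that DAG-style execution model; all task names are invented:

```python
# Toy sketch of DAG-style orchestration (ordering, retries, failure alerts),
# the pattern Apache Airflow implements at scale. Task names are hypothetical.
from graphlib import TopologicalSorter

def run_with_retries(name, func, retries=2, on_failure=print):
    """Run func, retrying on failure; emit an alert if all attempts fail."""
    for attempt in range(retries + 1):
        try:
            return func()
        except Exception as exc:
            if attempt == retries:
                on_failure(f"ALERT: task {name!r} failed after {retries + 1} attempts: {exc}")
                raise

def run_dag(tasks, deps):
    """tasks: name -> callable; deps: name -> set of upstream task names."""
    order = list(TopologicalSorter(deps).static_order())
    results = {name: run_with_retries(name, tasks[name]) for name in order}
    return order, results

# Hypothetical ELT flow: extract runs before transform, transform before load.
tasks = {
    "extract": lambda: [1, 2, 3],
    "transform": lambda: "transformed",
    "load": lambda: "loaded",
}
deps = {"transform": {"extract"}, "load": {"transform"}}
order, _ = run_dag(tasks, deps)
print(order)  # upstream tasks always run first
```

In real Airflow the same ideas appear as DAG definitions, task `retries`, and `on_failure_callback` hooks, with a scheduler and monitoring UI on top.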
South East London, England, United Kingdom Hybrid / WFH Options
Hartree Partners
pandas, xarray, SciPy/PyMC/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record turning research code into production services (CI/CD, containers etc). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
PyTorch). Experience working with financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's on Offer: Competitive salary …
and implementation of distributed data solutions using API and microservice-based architecture. Deep understanding of ETL/ELT architecture, streaming, and event-driven processing; familiarity with tools like dbt, Airflow, Kafka, or equivalents. Familiarity with mid-sized firm tech stacks, especially in financial services, including systems such as NetSuite, Salesforce, and Addepar. Experience with Atlassian Jira or Microsoft DevOps and …
opportunities to optimise data workflows, adopt emerging technologies, and enhance analytics capabilities. Requirements: Technical Proficiency: Hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/GCP/Azure for …
Brighton, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
opportunities to optimise data workflows, adopt emerging technologies, and enhance analytics capabilities. Requirements: Technical Proficiency: Hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/GCP/Azure for …
South East London, England, United Kingdom Hybrid / WFH Options
Ignite Digital Talent
Data Engineer - Python, SQL, DBT, DWH, Airflow, Kafka, Snowflake - Challenger Digital Bank Location: London (Hybrid, 2 days per week in office) Type: Permanent Salary: Competitive + 20% bonus + excellent benefits Are you a Data Engineer ready to shape the future of a digital-first banking experience? Join a fast-growing start-up challenger bank where data is at … bank’s next-generation data infrastructure and enable insight-led growth from the ground up. What you will be doing: Build and optimise robust data pipelines using Python, Kafka, Airflow, and DBT. Design and manage Snowflake data warehouse objects to support scalable analytics. Write clean and efficient SQL to support reporting, dashboards and data products. Collaborate across engineering, analytics … Python in a data context. Proven skills in SQL. Experience with Data Warehousing (DWH), ideally with Snowflake or similar cloud data platforms (Databricks or Redshift). Experience with DBT, Kafka, Airflow, and modern ELT/ETL frameworks. Familiarity with data visualisation tools like Sisense, Looker, or Tableau. Solid understanding of data architecture, transformation workflows, and pipeline orchestration. Clear communicator who …
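The "clean and efficient SQL to support reporting" this role asks for usually means warehouse aggregates behind dashboards. As a hedged sketch only, here is that kind of query run against SQLite as a local stand-in for Snowflake; the table and column names are invented, not from the listing:

```python
# Hedged sketch: a typical dashboard-backing aggregate (monthly net flow per
# account), using SQLite in place of Snowflake. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (account_id TEXT, amount REAL, txn_date TEXT);
    INSERT INTO transactions VALUES
        ('a1', 120.0, '2024-01-05'),
        ('a1', -40.0, '2024-01-09'),
        ('a2',  75.0, '2024-01-07');
""")
# Aggregate to one row per account per month -- the shape BI tools consume.
rows = conn.execute("""
    SELECT account_id,
           substr(txn_date, 1, 7) AS month,
           ROUND(SUM(amount), 2)  AS net_flow
    FROM transactions
    GROUP BY account_id, month
    ORDER BY account_id
""").fetchall()
print(rows)  # [('a1', '2024-01', 80.0), ('a2', '2024-01', 75.0)]
```

In a DBT project the same SELECT would live in a model file, with DBT materialising it as a table or view in Snowflake on each run.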
marketing strategists, data analysts, data engineers, and product owners to define use cases and deliver scalable solutions. Model Deployment & Monitoring: Deploy models using MLOps practices and tools (e.g., MLflow, Airflow, Docker, cloud platforms) ensuring performance, reliability, and governance compliance. Innovation & Research: Stay current on advancements in AI/ML and proactively bring forward new ideas, frameworks, and techniques that … or notebooks-based workflows for experimentation. Data Access & Engineering Collaboration: Comfort working with cloud data warehouses (e.g., Snowflake, Databricks, Redshift, BigQuery). Familiarity with data pipelines and orchestration tools like Airflow. Work closely with Data Engineers to ensure model-ready data and scalable pipelines. Nice to have: Prior experience working in financial services or within a marketing analytics function. Knowledge …
more: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Why Join Us? Work with large-scale, high-impact data. Solve real-world problems with a top-tier …
South East London, England, United Kingdom Hybrid / WFH Options
Burns Sheehan
inclusive and collaborative culture, encouraging peer-to-peer feedback and evolving healthy, curious and humble teams. Tech Stack: Python, Javascript/Typescript, React/React Native, AWS, GraphQL, Snowflake, Airflow, DDD. This is an incredible opportunity for a Senior Software Engineer to join a unique company as they embark on a period of significant growth to take their fantastic …
South East London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
based work as well as migration tasks. Due to your seniority you will also be tasked with mentoring junior engineers. Work extensively and proficiently with Snowflake, AWS, DBT, Terraform, Airflow, SQL, and Python. Requirements: 3+ years data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT expertise Terraform experience Expert SQL and Python Data Modelling Data Vault … Apache Airflow My client has very limited interview slots and is looking to fill this vacancy ASAP. I have limited slots for first-stage interviews next week, so if you’re interested, get in touch ASAP with a copy of your most up-to-date CV and email me at m.fox@tenthrevolution.com or call me on …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Familiarity with legacy systems (e.g. C#) and willingness to interact with them where necessary. Exposure to Cloudera Data Platform or similar big data environments. Experience with tools such as Apache Hive, NiFi, Airflow, Azure Blob Storage, and RabbitMQ. Background in investment management or broader financial services, or a strong willingness to learn the domain. The Role Offers the …
team are working on the core research data platform, as well as data infrastructure, ingestion pipelines and back-end services for the aforementioned trading desks. Tech Stack: Python, ETL, Airflow, SQL, AWS. Please apply if this is of interest.
years+ experience in a relevant role. This will mean exposure to market data vendors, the ability to communicate with traders, and mentoring junior engineers. Tech stack: Deep Python, Pandas, AWS, Airflow, Kubernetes, ETL, SQL. Please apply if this is of interest.
South East London, England, United Kingdom Hybrid / WFH Options
Radley James
Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring Clear communicator, comfortable with cross-functional teams Desirable Experience APIs from major financial data providers dbt, Snowflake Kafka, Airflow Java feedhandler support Migration of legacy systems (e.g. MATLAB) This position offers a competitive compensation package and hybrid working model.
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Relevant experience gained in a Hedge Fund, Investment Bank, FinTech or similar. Expertise in Python and SQL, and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other cloud data warehouses, preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet...). SDLC and DevOps: Git …
/Haskell/F# etc.) Nice to Have: A CompSci degree from a top-rated uni Experience in a fast-paced startup environment Task orchestration frameworks (e.g. Luigi, Dask, Airflow + Celery) Experience owning or being involved longer-term in an open-source project Demonstrable Rust experience or keen interest Data pipelines and big data tech Docker: both …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Allica Bank Limited
production-grade tools, have a test-driven approach, and write consistent, well-documented code). You have strong SQL skills. Deployed applications on cloud services. Experience using orchestration tools (Airflow, Dagster or Prefect). Experience with container technology (Docker, Kubernetes). Experience with CI/CD pipelines (preferably Azure DevOps). Experience with application deployment to cloud services (GCP …
Software Engineer - Data Reporting | Client: Selby Jennings | Location: Slough, United Kingdom | EU work permit required: Yes | Posted: 31.05.2025
data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop) … or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL and experience with schema design and query optimization for large …
in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps …
pipelines using AWS Lambda, API Gateway, and Kinesis. Integrating third-party APIs into the data platform and transforming data for CRM delivery. Migrating R-based data streams into modern Airflow-managed Python/DBT pipelines. Ensuring observability and reliability using CloudWatch and automated monitoring. Supporting both BAU and new feature development within the data engineering function. KEY SKILLS AND … REQUIREMENTS Proven experience with AWS services including Lambda, API Gateway, S3, Kinesis, and CloudWatch. Strong programming ability in Python and data transformation skills using SQL and DBT. Experience with Airflow for orchestration and scheduling. Familiarity with third-party API integration and scalable data delivery methods. Excellent communication and the ability to work in a collaborative, agile environment. HOW TO …
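This role's "transforming data for CRM delivery" via Lambda and Kinesis follows a well-known shape: decode each streamed record, validate it, and reshape it for the downstream system. A hedged, stdlib-only sketch of that handler pattern (field names and the CRM payload shape are invented, not the employer's schema):

```python
# Hypothetical sketch of a Lambda-style handler for Kinesis records.
# Kinesis delivers each record's payload base64-encoded under
# event["Records"][i]["kinesis"]["data"]; the CRM field names are invented.
import base64
import json

def handler(event, context=None):
    """Decode records, drop malformed ones, reshape for CRM delivery."""
    out = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        payload = json.loads(raw)
        if "email" not in payload:  # skip records the CRM cannot use
            continue
        out.append({"contact_email": payload["email"],
                    "source": payload.get("source", "unknown")})
    return out

# Simulated event in the shape Kinesis hands to Lambda:
event = {"Records": [
    {"kinesis": {"data": base64.b64encode(
        json.dumps({"email": "x@example.com", "source": "web"}).encode()).decode()}},
    {"kinesis": {"data": base64.b64encode(b'{"no_email": true}').decode()}},
]}
print(handler(event))  # [{'contact_email': 'x@example.com', 'source': 'web'}]
```

The CloudWatch observability the ad mentions would sit around this: structured logs from the handler plus alarms on the function's error and iterator-age metrics.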
degree in Computer Science. -Python and SQL experience. -1-5 years experience in a data or software engineering role. -Familiarity with cloud/data warehousing. -Experience with Snowflake, Kafka, Airflow would be helpful. -Experience with financial data sets/vendors would be helpful.
Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (i.e. Azure, Airflow, etc.) and automation. The Senior Data Engineer will work closely with the data, Architecture, Business Analyst, Data Stewards to integrate and align requirements, specifications and constraints of each element … and system reliability. Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create & maintain data pipelines using Airflow & Snowflake as primary tools Create SQL Stored procs to perform complex transformation Understand data requirements and design optimal pipelines to fulfil the use-cases Creating logical & physical data models …
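The data-quality work this listing assigns to tools like Great Expectations or Soda boils down to declaring expectations over a batch and reporting which rows break them. A hedged, stdlib-only sketch of that idea (the column names and rules are invented; real tools add suites, profiling, and docs on top):

```python
# Minimal sketch of declarative data-quality checks, the pattern tools like
# Great Expectations formalise. Columns and thresholds are hypothetical.

def expect_not_null(rows, col):
    """Indexes of rows where col is missing or null."""
    return [i for i, r in enumerate(rows) if r.get(col) is None]

def expect_between(rows, col, lo, hi):
    """Indexes of rows where col falls outside [lo, hi]."""
    return [i for i, r in enumerate(rows) if not (lo <= r.get(col, lo) <= hi)]

def validate(rows, expectations):
    """Return {check_name: failing_row_indexes} for checks that failed."""
    failures = {}
    for name, check in expectations.items():
        bad = check(rows)
        if bad:
            failures[name] = bad
    return failures

batch = [
    {"account_id": "a1", "balance": 120.0},
    {"account_id": None, "balance": 50.0},
    {"account_id": "a3", "balance": -10.0},
]
report = validate(batch, {
    "account_id_not_null": lambda r: expect_not_null(r, "account_id"),
    "balance_non_negative": lambda r: expect_between(r, "balance", 0, 1e9),
})
print(report)  # {'account_id_not_null': [1], 'balance_non_negative': [2]}
```

In an Airflow pipeline a validation step like this typically runs between load and transform tasks, failing the DAG run (and alerting) when the report is non-empty.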
South East London, England, United Kingdom Hybrid / WFH Options
Cognify Search
needs An understanding of data in relation to dynamic pricing, revenue operations & management, demand forecasting, competitor analysis etc. would be useful! SQL, Python, Cloud (GCP, Azure, AWS or Snowflake), Airflow/dbt etc. Interested in hearing more? Apply.