London, England, United Kingdom Hybrid / WFH Options
HeliosX Group
and services internationally. We are looking to hire an inquisitive, solutions-focused Analytics Engineer to join our team. You’ll work with the latest cloud technologies (AWS, Snowflake, dbt, Airflow, etc.), building exceptional data products across the full stack, from ingestion through to visualisation. You’ll also have the opportunity to work on exciting machine learning projects, training and … engineering role. Strong knowledge of SQL and Python, alongside an understanding of data warehouse design principles and best practices. Practical experience with data transformation (e.g. dbt) and orchestration (e.g. Airflow DAGs) tools, as well as experience using Git for version control. Understanding of data visualisation tools (we use Metabase) and applied experience building dashboards to specifications from business stakeholders. …
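Several of these roles mention orchestration with Airflow DAGs. The core idea, running tasks in dependency order, can be sketched in a few lines of plain Python (a toy illustration of the concept, not Airflow's API; the task names are invented):

```python
from graphlib import TopologicalSorter

# Toy dependency graph: each task maps to the tasks it depends on.
# Task names are illustrative, not from any real pipeline.
dag = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
    "dashboard_refresh": ["load"],
}

def run_dag(dag):
    """Run tasks in topological (dependency-respecting) order,
    which is what an orchestrator like Airflow does at its core."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

run_dag(dag)
```

In real Airflow the same dependencies would be declared on task objects inside a `DAG`, and the scheduler handles retries, backfills and parallelism on top of this ordering.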
eager to work with complex and fast-growing datasets, show a strong desire to learn, and communicate well. Our stack includes, but is not limited to, Airflow, dbt (Data Build Tool), a multitude of AWS services, Stitch and CI/CD pipelines. As a Data Engineer you will have the opportunity to promote emerging technologies where they can … worked with (or are keen to do so) Data Analysts, Data Scientists and Software Engineers. Practical knowledge of (or strong desire to learn) the following or similar technologies: Python, Airflow, Data Warehousing (Redshift/Snowflake), SQL (we use dbt for modelling data in the warehouse), Data Architecture including Dimensional Modelling. Otherwise an interest in learning these, with the support …
City of London, London, United Kingdom Hybrid / WFH Options
Cognify Search
needs An understanding of data in relation to dynamic pricing, revenue operations & management, demand forecasting, competitor analysis etc. would be useful! SQL, Python, Cloud (GCP, Azure, AWS or Snowflake), Airflow/dbt etc. Interested in hearing more? Apply.
experience with Cloud infrastructure (ideally AWS), DevOps technologies such as Docker or Terraform, and CI/CD processes and tools. Have previously worked with MLOps tools like MLflow and Airflow, or on common problems such as model and API monitoring, data drift and validation, autoscaling, and access permissions. Have previously worked with monitoring tools such as New Relic or Grafana … and associated ML/DS libraries (scikit-learn, NumPy, pandas, LightGBM, LangChain/LangGraph, TensorFlow, etc.) PySpark AWS cloud infrastructure: EMR, ECS, ECR, Athena, etc. MLOps: Terraform, Docker, Spacelift, Airflow, MLflow Monitoring: New Relic CI/CD: Jenkins, GitHub Actions More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work-from-abroad policy, 2-for …
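The "data drift and validation" problem this role mentions is often first tackled by comparing a live feature's distribution against a training-time reference. A minimal sketch (the threshold and sample values are invented; production MLOps stacks typically use richer tests such as PSI or Kolmogorov-Smirnov):

```python
import statistics

def drift_score(reference, live):
    """Absolute shift of the live mean, measured in reference
    standard deviations -- a simple first-pass drift signal."""
    ref_mean = statistics.mean(reference)
    ref_std = statistics.stdev(reference)
    return abs(statistics.mean(live) - ref_mean) / ref_std

def has_drifted(reference, live, threshold=3.0):
    """Flag drift when the live mean moves more than `threshold`
    standard deviations from the reference mean. The threshold
    here is an arbitrary illustrative choice."""
    return drift_score(reference, live) > threshold

# Illustrative data: reference centred at 0, live shifted to 5.
reference = [-1.0, -0.5, 0.0, 0.5, 1.0]
live = [4.0, 4.5, 5.0, 5.5, 6.0]
print(has_drifted(reference, live))  # a large shift should flag drift
```

A check like this would typically run inside the orchestrator (an Airflow task) after each batch of scoring data lands, alerting before model quality degrades.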
have experience/expertise/knowledge in the following (in rough priority order): AWS, Kubernetes (EKS), data/network security, Python, Docker, Grafana, Postgres, CDC systems, data-related products (Airflow, Jupyter, Spark, etc.) The projects will be varied and we’re looking for someone who can work autonomously and proactively to scope problems and deliver pragmatic solutions … Our Tech Stack... Python as our main programming language Terraform on Spacelift for our infrastructure definition and deployment Kubernetes for data services and task orchestration Airflow for job scheduling and tracking CircleCI for continuous deployment Databricks for our data lake platform Parquet and Delta file formats on S3 for data lake storage Postgres/Aurora for our relational …
will need experience in the following: A track record of consolidating data sources, creating pipelines, and ideally a software engineering mindset. Excellent experience with containers. Strong knowledge of Python, Airflow, SQL, and Linux. Proficient in AWS Lambda, ECS, Athena, S3, Kinesis, Glue, CloudWatch/Terraform/CDK, Tableau Online. They are particularly keen on Senior Data Engineers who utilise …
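For context on the AWS Lambda proficiency asked for here: a Lambda function is just a Python handler that AWS invokes with an event. A minimal sketch, invoked locally for illustration (the event shape and field names are made up; a real deployment would receive S3/Kinesis events from AWS):

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: count the records it was given.
    The "Records" key mimics the shape of common AWS event payloads,
    but this example is purely illustrative."""
    records = event.get("Records", [])
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(records)}),
    }

# Direct local invocation -- no AWS involved:
result = handler({"Records": [{"id": 1}, {"id": 2}]}, None)
print(result["body"])
```

Keeping the handler a plain function like this makes it trivially unit-testable before it is packaged and deployed (e.g. via Terraform or CDK).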
in data engineering Deep expertise in building and maintaining large-scale data pipelines and ETL processes Strong programming skills in Python, SQL, and experience with modern data tools (Spark, Airflow, dbt) Experience using Databricks, Azure, AWS, and Snowflake Track record of leading technical teams and mentoring junior engineers Excellent communication skills and ability to translate business requirements into technical …
Experience with cloud platforms such as AWS (e.g., S3, Redshift, Lambda), Azure (e.g., Data Factory, Synapse), or GCP (e.g., BigQuery, Cloud Functions). Familiarity with data orchestration tools (e.g., Airflow, dbt) and version control (Git). Solid understanding of data governance, security, and compliance frameworks. Nice to Have: Experience with data lake architectures (Delta Lake, Lakehouse). Familiarity …
of data warehousing concepts and data modeling. Excellent problem-solving and communication skills focused on delivering high-quality solutions. Understanding or hands-on experience with orchestration tools such as Apache Airflow. Deep knowledge of non-functional requirements such as availability, scalability, operability, and maintainability. Seniority level: Mid-Senior level. Employment type: Full-time. Job function: …
Strong experience with Snowflake, including building full solutions from scratch with security and user access controls. Knowledge of dbt and general data modelling; Vault experience is desirable. Experience with Airflow and Python. Proficiency with AWS services such as Lambda, S3, SNS, and CDK for DevOps. Ability to build, deploy, and manage infrastructure using Terraform. Benefits: Bonus opportunity - up to …
London, England, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
in a Data Engineering position. Strong experience with building data pipelines in the cloud (AWS, Azure or GCP). Excellent knowledge of PySpark, Python and SQL fundamentals. Familiar with Airflow, Databricks and/or BigQuery. Ability to work on messy, complex real-world data challenges. Comfortable working in a fast-paced environment. Previous experience working in finance would be …
Docker, Jenkins/TeamCity, monitoring, automated testing. Ability to communicate clearly with technical and non-technical colleagues. Experience in the following areas would be ideal: dbt and Snowflake, Kafka, Airflow. Job responsibilities: Building a streaming platform to capture and aggregate large volumes of tick data. Developing ELT pipelines to ingest and transform datasets with Python, Snowflake and dbt. Enhancing …
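The ELT responsibilities described here (land raw data first, then transform inside the warehouse with SQL, dbt-style) can be sketched with sqlite3 standing in for Snowflake; the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "EL": land raw tick data untransformed in the warehouse.
conn.execute("CREATE TABLE raw_ticks (symbol TEXT, price REAL)")
conn.executemany(
    "INSERT INTO raw_ticks VALUES (?, ?)",
    [("ABC", 10.0), ("ABC", 12.0), ("XYZ", 5.0)],
)

# "T": transform with SQL inside the database, as dbt would,
# materialising a derived table from the raw one.
conn.execute("""
    CREATE TABLE avg_prices AS
    SELECT symbol, AVG(price) AS avg_price
    FROM raw_ticks
    GROUP BY symbol
""")

rows = conn.execute(
    "SELECT symbol, avg_price FROM avg_prices ORDER BY symbol"
).fetchall()
print(rows)  # [('ABC', 11.0), ('XYZ', 5.0)]
```

In a real stack the `CREATE TABLE ... AS SELECT` would live in a dbt model file, with dbt managing materialisation, dependencies and tests, and an orchestrator scheduling the run.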
City of London, London, United Kingdom Hybrid / WFH Options
Bounce Digital
rules, set up monitoring/logging, and support architecture decisions. What You Bring: Strong SQL & Python (PySpark); hands-on with GCP or AWS. Experience with modern ETL tools (dbt, Airflow, Fivetran). BI experience (Looker, Power BI, Metabase); Git and basic CI/CD exposure. Background in a quantitative field; AI/ML interest a plus. Bonus: Experience in manufacturing …
actionable for operational and ML-driven use cases. About you: Strong background in software and data engineering leadership. Proficient in Python, SQL, and modern ELT practices (e.g. dbt, Fivetran, Airflow). Deep knowledge of data warehousing (Snowflake), AWS services (e.g. Lambda, Kinesis, S3), and IaC (Terraform). Experienced in building data platforms with a focus on governance, reliability, and business value …
analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
using Python within financial markets (ideally for a systematic trading desk). Strong knowledge of SQL and relational databases. In-depth knowledge of data streaming technologies like Kafka, S3 and Airflow. Degree or higher in Computer Science or similar field. Willingness to do support and occasional on-call work as and when required. Seniority level: Mid-Senior …
production environments. Expertise in AWS (especially Lambda, ECS/EC2, S3, and RDS). Strong hands-on experience with Terraform and infrastructure-as-code practices. Familiarity with tools like Airflow, dbt, and cloud data platforms such as Snowflake or Databricks. Understanding of CI/CD, observability, and platform reliability practices in cloud-native environments. Experience with container orchestration …
Salisbury, England, United Kingdom Hybrid / WFH Options
Tedaisy Insurance Group
Pandas, SQL, NoSQL. Has strong experience working with modern database platforms such as Cosmos, Athena, and MySQL. Is comfortable with both NoSQL and relational database design. Is highly proficient with Airflow. Enjoys working independently in a small team. Has a good understanding of AWS and Azure architecture. There is also an opportunity to get involved …
a senior role. Deep expertise with AWS services (S3, Glue, Lambda, SageMaker, Redshift, etc.). Strong Python and SQL skills; experience with PySpark a bonus. Familiarity with containerization (Docker), orchestration (Airflow, Step Functions), and infrastructure as code (Terraform/CDK). Solid understanding of the machine learning model lifecycle and best practices for deployment at scale. Excellent communication skills and the ability …
London, England, United Kingdom Hybrid / WFH Options
Principle
and knowledge systems. 5+ years’ experience with SQL and Python. Strong data modelling and visualisation skills (e.g. Tableau, MicroStrategy). Experience with orchestration/integration tools (e.g. Azure Data Factory, Airflow). Proven background in data engineering within product-led environments. Nice to Have: Experience with analytics and user-facing support tools. Exposure to big tech, eCommerce, or high-scale digital …