and contribute to technical roadmap planning Technical Skills: Strong SQL skills with experience in complex query optimization Strong Python programming skills with experience in data processing libraries (pandas, NumPy, Apache Spark) Hands-on experience building and maintaining data ingestion pipelines Proven track record of optimising queries, code, and system performance Experience with open-source data processing frameworks (Apache Spark, Apache Kafka, Apache Airflow) Knowledge of distributed computing concepts and big data technologies Experience with version control systems (Git) and CI/CD practices Experience with relational databases (PostgreSQL, MySQL or similar) Experience with containerization technologies (Docker, Kubernetes) Experience with data orchestration tools (Apache Airflow or Dagster) Understanding of data warehousing concepts and More ❯
staying current with emerging data technologies. Technical Requirements Proficiency in SQL , including complex query design and optimisation. Strong Python programming skills, particularly with libraries such as pandas , NumPy , and Apache Spark . Experience building and maintaining data ingestion pipelines and optimising performance. Hands-on experience with open-source data frameworks such as Apache Spark , Apache Kafka , or … Apache Airflow . Knowledge of distributed computing and big data concepts. Experience using version control systems (Git) and CI/CD practices. Familiarity with relational databases (PostgreSQL, MySQL, or similar). Experience with containerisation technologies ( Docker , Kubernetes ). Understanding of data orchestration tools (e.g., Airflow or Dagster). Knowledge of data warehousing principles and dimensional modelling . More ❯
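The pandas/NumPy data-processing skills these listings repeatedly ask for can be illustrated with a minimal sketch. This is a toy example on a hypothetical sales table, not code from any listing; at scale the same aggregation would typically be pushed down into SQL or run on Spark:

```python
import pandas as pd

# Toy dataset standing in for an ingested table (hypothetical schema).
df = pd.DataFrame({
    "region": ["EU", "EU", "US", "US", "US"],
    "revenue": [100.0, 150.0, 200.0, 50.0, 75.0],
})

# Aggregate revenue per region, sorted descending -- the basic
# groupby/aggregate transform underlying most analytical pipelines.
summary = (
    df.groupby("region", as_index=False)["revenue"]
      .sum()
      .sort_values("revenue", ascending=False)
)
print(summary.to_string(index=False))
```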
Technical Expertise: Solid experience in Python programming, particularly using data manipulation and processing libraries such as Pandas, NumPy, and Apache Spark. Hands-on experience with open-source data frameworks like Apache Spark, Apache Kafka, and Apache Airflow. Strong proficiency in SQL, including advanced query development and performance tuning. Good understanding of distributed computing principles and big … automation pipelines. Experience working with relational databases such as PostgreSQL, MySQL, or equivalent platforms. Skilled in using containerization technologies including Docker and Kubernetes. Experience with workflow orchestration tools like Apache Airflow or Dagster. Familiar with streaming data pipelines and real-time analytics solutions. More ❯
slough, south east england, united kingdom Hybrid / WFH Options
Novatus Global
containerisation (Docker, Kubernetes), and modern DevOps practices. Exposure to regulatory reporting, compliance systems, or financial services technology. Experience with big data tools and the modern data stack (e.g., dbt, Airflow, Kafka). Knowledge of security best practices and data governance. Benefits: Private Medical Insurance (AXA) – includes mental health, dental, vision, and private GP access Employee Assistance Program Enhanced parental More ❯
slough, south east england, united kingdom Hybrid / WFH Options
Intellect Group
of ETL/ELT processes and database fundamentals. A motivated, proactive mindset with a strong desire to learn and grow. Nice to Have: Exposure to data pipeline tools (e.g. Apache Airflow, dbt). Experience with containerisation tools like Docker. Familiarity with version control and CI/CD practices. Knowledge of BI tools (Looker, Tableau, or similar). If More ❯
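The ETL/ELT fundamentals mentioned above can be sketched end to end in a few lines. This is a minimal extract-transform-load example using only the Python standard library and an in-memory SQLite database; the CSV source and table schema are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: a CSV source, inlined here (in practice a file, API, or queue).
RAW = "user_id,amount\n1,10.5\n2,bad\n3,4.0\n"

def extract(text: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    # Drop rows that fail type validation -- a stand-in for real
    # data-quality checks (in production, bad rows would be quarantined).
    clean = []
    for row in rows:
        try:
            clean.append((int(row["user_id"]), float(row["amount"])))
        except ValueError:
            continue
    return clean

def load(rows: list[tuple]) -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 14.5
```

An ELT variant would load the raw rows first and run the validation as SQL inside the warehouse.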
slough, south east england, united kingdom Hybrid / WFH Options
SGI
skills with the ability to interact effectively with both technical and trading stakeholders. Desirable: Previous front-office or systematic trading desk experience. Familiarity with modern MLOps (Docker, Kubernetes, MLflow, Airflow) and distributed computing (Spark, Ray). Experience with alpha signal generation, regime detection, or portfolio optimization. Exposure to alternative/ESG datasets, macroeconomic indicators, and sentiment analysis. More ❯
inclusive and collaborative culture, encouraging peer-to-peer feedback and evolving healthy, curious and humble teams. Tech Stack: Python, JavaScript/TypeScript, React/React Native, AWS, GraphQL, Snowflake, Airflow, DDD. This is an incredible opportunity for a Senior Software Engineer to join a unique company as they embark on a period of significant growth to take their fantastic More ❯
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a More ❯
slough, south east england, united kingdom Hybrid / WFH Options
Kubrick Group
or GCP native stacks) Experience with platform observability and CI/CD for data platforms Hands-on experience with modern data engineering tools such as dbt, Fivetran, Matillion or Airflow History of supporting pre-sales activities in a product or consultancy-based business What Kubrick offers: A fast moving and fast growth business which is doing something seriously innovative More ❯
and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge More ❯
role, you will: Lead, mentor, and inspire analytics and data engineering teams, fostering technical excellence and a culture of innovation. Architect and optimise ELT pipelines and workflows using dbt, Airflow, SQL, and Python. Oversee the design and evolution of large-scale analytics platforms and data warehouses (Snowflake, BigQuery, Redshift). Ensure best practices in governance, modelling, data quality, and More ❯
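The pipeline orchestration that tools like Airflow and dbt perform reduces to resolving a directed acyclic graph of task dependencies. A conceptual sketch using Python's standard-library `graphlib` (the task names are hypothetical, and real orchestrators add scheduling, retries, and state on top of this):

```python
from graphlib import TopologicalSorter

# Hypothetical ELT workflow: task -> set of upstream dependencies,
# mirroring how an Airflow DAG or dbt model graph is resolved.
dag = {
    "extract_orders": set(),
    "extract_users": set(),
    "stage_orders": {"extract_orders"},
    "stage_users": {"extract_users"},
    "build_mart": {"stage_orders", "stage_users"},
}

# static_order() yields a valid execution order: every task appears
# only after all of its upstream dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```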
Solid understanding of data modeling concepts (star/snowflake schemas, normalization). Experience with version control systems (e.g., Git) and CI/CD practices. Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable More ❯
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
development efficiency and deployment effectiveness, including Azure DevOps or GitHub Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components, such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark etc Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use Strong More ❯
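The medallion architecture referenced above layers data as bronze (raw as ingested), silver (cleaned and typed), and gold (curated for analytics). A minimal sketch with invented records, using plain Python in place of the Spark or SQL transforms a real lakehouse would use:

```python
# Bronze: raw records exactly as ingested (hypothetical event feed).
bronze = [
    {"id": "1", "amount": "10.0", "country": "gb"},
    {"id": "1", "amount": "10.0", "country": "gb"},   # duplicate
    {"id": "2", "amount": "oops", "country": "US"},   # malformed value
    {"id": "3", "amount": "5.0", "country": "us"},
]

# Silver: typed, validated, and deduplicated on the business key.
seen, silver = set(), []
for rec in bronze:
    try:
        row = {"id": int(rec["id"]), "amount": float(rec["amount"]),
               "country": rec["country"].upper()}
    except ValueError:
        continue  # a real pipeline would quarantine these rows
    if row["id"] not in seen:
        seen.add(row["id"])
        silver.append(row)

# Gold: curated aggregate ready for analytical use (revenue by country).
gold = {}
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]
print(gold)
```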
background at a leading finance or big tech firm Strong hands-on expertise in Python and modern ETL frameworks Experience designing and maintaining cloud-based data pipelines (e.g. AWS, Airflow, Snowflake) Deep understanding of data modelling, validation, and pipeline resilience Familiarity with financial or alternative datasets preferred More ❯
as pytest. Preferred Qualifications Experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data). Knowledge of bioinformatics or large-scale research data. Familiarity with Nextflow, Airflow, or Google Workflows. Understanding of NLP techniques and processing unstructured data. Experience with AI/ML-powered applications and containerised development (Docker). Contract Details Day Rate: £750 (Inside More ❯
slough, south east england, united kingdom Hybrid / WFH Options
Montash
S3 and Redshift, particularly around storage, computation and security. Familiarity with modern BI tools such as Power BI or AWS QuickSight. Experience with open-source data stack tools like Airflow, dbt, Airbyte or similar. Strong grasp of software development best practices and CI/CD processes. Skilled in performance tuning, testing, and automation within data engineering environments. Excellent communication More ❯
proactive, curious mindset and a willingness to take initiative. Desirable Experience with dashboarding/BI tools (Power BI, Tableau, Streamlit). Familiarity with Linux scripting or workflow orchestration tools (Airflow, Prefect). Exposure to financial markets , asset management, or consulting environments. This role would suit someone who is technically strong, logical, and ambitious, with the maturity to take ownership More ❯
slough, south east england, united kingdom Hybrid / WFH Options
Oscar
PyTorch). Solid knowledge of SQL and data warehousing concepts. Demonstrated experience with cloud platforms ( GCP & Azure preferred). Strong understanding of data modelling, transformation, and orchestration (e.g. dbt, Airflow, Azure Data Factory). Familiarity with digital marketing data flows (e.g. GA4, ad servers, programmatic platforms, CRMs). Proven ability to manage or mentor junior team members, providing technical More ❯
lakes for MFT and HFT quant teams, ingest vast amounts of data into the company, which can be used for quant research and model development. Stack: Python, AWS, S3, Airflow, SQL The role needs a talented data engineer who can build a scalable and bespoke data platform, but at the same time someone comfortable with communicating technical solutions to More ❯
NoSQL Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems Should be able to work independently with minimal help/guidance Good understanding of Airflow, Data Fusion, and Dataflow Strong background and experience in data ingestion, transformation, modeling, and performance tuning. Migration experience from Cornerstone to GCP will be an added advantage Support More ❯
slough, south east england, united kingdom Hybrid / WFH Options
Experis
Proven experience in developing and deploying machine learning models in production. Solid understanding of data structures, algorithms, and software engineering principles. Experience with ML pipelines and orchestration tools (e.g., Airflow, Kubeflow, MLflow). Proficiency in working with cloud services (AWS, GCP, or Azure). Strong understanding of CI/CD, containerisation (Docker), and orchestration (Kubernetes). Excellent problem-solving More ❯
and Python programming languages. · Strong understanding of graph databases (e.g., RDF, Neo4j , GraphDB). · Experience with data modeling and schema design. · Knowledge of data pipeline tools and frameworks (e.g., Apache Airflow, Luigi). · Excellent problem-solving and analytical skills. · Ability to work independently and as part of a team. Clinical knowledge More ❯
research, staging, and production environments. Design and implement model registries, versioning systems, and experiment tracking to ensure full reproducibility of all model releases. Deploy ML workflows using tools like Airflow or similar, managing dependencies from data ingestion through model deployment and serving. Instrument comprehensive monitoring for model performance, data drift, prediction quality, and system health. Manage infrastructure as code More ❯
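The model registry and versioning requirements above come down to recording, for each release, a monotonically increasing version and a content hash that ties the release to its exact parameters. A minimal in-memory sketch (the model name and parameters are invented; production teams would use a tool like MLflow rather than hand-rolling this):

```python
import hashlib
import json
from datetime import datetime, timezone

# Minimal in-memory model registry: each release records a content hash
# of its parameters so any deployment can be traced back and reproduced.
registry: dict[str, list[dict]] = {}

def register(name: str, params: dict) -> dict:
    # Canonical JSON (sorted keys) so identical params always hash the same.
    blob = json.dumps(params, sort_keys=True).encode()
    entry = {
        "version": len(registry.get(name, [])) + 1,
        "params_hash": hashlib.sha256(blob).hexdigest()[:12],
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }
    registry.setdefault(name, []).append(entry)
    return entry

v1 = register("churn_model", {"lr": 0.1, "depth": 3})
v2 = register("churn_model", {"lr": 0.05, "depth": 3})
print(v1["version"], v2["version"])
```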
slough, south east england, united kingdom Hybrid / WFH Options
twentyAI
critical client outcomes. Analytics Engineering Leaders: Architect and optimise data pipelines, ELT workflows, and cloud warehouse platforms (Snowflake, BigQuery, Redshift). Lead teams working with dbt, SQL, Python, and Airflow to drive data transformation at scale. Ensure data governance, quality, and modelling standards are upheld across solutions. Work closely with data scientists and stakeholders to turn clean data into … engineering. Have led engineering teams and mentored technical talent in high-performance environments. Are proficient in either modern software stacks (Python, React, cloud-native) or analytics tooling (SQL, dbt, Airflow, cloud warehouses) . Bring a strategic mindset , with the ability to connect technical execution to business value. Are committed to innovation, collaboration, and data-driven transformation. Meet eligibility requirements More ❯