in data leadership – ideally within fintech, SaaS, or regulated tech environments. Technical depth across data engineering, analytics, or data science. Hands-on familiarity with modern data stacks – SQL, dbt, Airflow, Snowflake, Looker/Power BI. Understanding of the AI/ML lifecycle – including tooling (Python, MLflow) and best-practice MLOps. Comfortable working across finance, risk, and commercial functions. Experience …
Manchester, North West, United Kingdom Hybrid / WFH Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication …
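The transformation steps named in this listing (handling missing values, standardising formats, de-duplicating) can be sketched in plain Python. This is a minimal illustration, not any employer's pipeline; the record fields, defaults and date format are hypothetical.

```python
# Toy cleaning step: fill missing values, standardise a date format,
# and de-duplicate on a natural key. All field names are hypothetical.
from datetime import datetime

def clean_records(records):
    """Clean a batch of raw records (list of dicts)."""
    seen = set()
    cleaned = []
    for rec in records:
        # 1. Handle missing values: default a missing country field.
        country = rec.get("country") or "UNKNOWN"
        # 2. Standardise formats: parse DD/MM/YYYY into ISO 8601.
        date = datetime.strptime(rec["signup_date"], "%d/%m/%Y").date().isoformat()
        # 3. De-duplicate on (email, date).
        key = (rec["email"].lower(), date)
        if key in seen:
            continue
        seen.add(key)
        cleaned.append({"email": rec["email"].lower(),
                        "country": country,
                        "signup_date": date})
    return cleaned

raw = [
    {"email": "A@x.com", "country": None, "signup_date": "01/02/2024"},
    {"email": "a@x.com", "country": "UK", "signup_date": "01/02/2024"},  # duplicate key
]
print(clean_records(raw))
```

In a production pipeline the same three concerns would typically live in an orchestrated task (Airflow, Prefect, Dagster) rather than a single function, but the logic is the same.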
C2S). • Familiar with Amazon Web Services (AWS) managed services. • Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar. • Proficient with JavaScript, Elasticsearch, JSON, SQL, XML. • Working knowledge of datastores: MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis. • Familiar with Linux/…
West Midlands, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Erwin, Lucidchart). Strong grasp of data governance, metadata management, and compliance frameworks (e.g., GDPR). Hands-on understanding of Python, SQL, and modern data engineering tools (e.g., Airflow, dbt, Atlan). Exceptional consulting and stakeholder management skills. Career experience within leading consultancies or SIs (e.g., IBM, Thoughtworks, EY, Capgemini). What's on Offer …
projects Support talent acquisition and continuous learning initiatives Knowledge and Experience Knowledge of ML model development and deployment frameworks (MLflow, Kubeflow) Advanced data querying (SQL) and data engineering pipelines (Airflow) Extensive experience with comprehensive unit testing, integration testing, and test coverage strategies Experience working with Product Management teams and ability to translate complex technical concepts for non-technical stakeholders …
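The unit-testing expectation above can be illustrated with a tiny, self-contained example: a pipeline transformation plus assertions covering the normal case and both boundary cases. The function and its cases are hypothetical, not from the listing.

```python
# Minimal sketch of unit-testing a pipeline transformation.
# normalise_score and its bounds are illustrative only.

def normalise_score(raw, lo=0.0, hi=100.0):
    """Clamp a raw score into [lo, hi] and rescale to [0, 1]."""
    clamped = max(lo, min(hi, raw))
    return (clamped - lo) / (hi - lo)

def test_normalise_score():
    assert normalise_score(50) == 0.5
    assert normalise_score(-10) == 0.0   # clamped at the lower bound
    assert normalise_score(250) == 1.0   # clamped at the upper bound

test_normalise_score()
print("all assertions passed")
```

In practice these assertions would live in a pytest module, with integration tests exercising the full pipeline against representative fixtures.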
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
development efficiency and deployment effectiveness, including Azure DevOps or GitHub Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components, such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark etc Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use Strong …
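The medallion layering this listing refers to (raw bronze, validated silver, curated gold) can be sketched in pure Python. This is a toy flow under assumed field names; real implementations would use Spark/Databricks or a warehouse, with quarantine tables for rejects.

```python
# Toy medallion flow: raw events (bronze) are typed and validated
# into silver, then aggregated into a gold-layer daily revenue table.
# All field names and values are hypothetical.

bronze = [
    {"order_id": "1", "date": "2024-01-01", "amount": "10.0"},
    {"order_id": "2", "date": "2024-01-01", "amount": "bad"},   # malformed row
    {"order_id": "3", "date": "2024-01-02", "amount": "5.5"},
]

# Silver: only typed, validated records survive.
silver = []
for row in bronze:
    try:
        silver.append({**row, "amount": float(row["amount"])})
    except ValueError:
        pass  # a real pipeline would quarantine, not drop silently

# Gold: curated aggregate built for analytical use (revenue per day).
gold = {}
for row in silver:
    gold[row["date"]] = gold.get(row["date"], 0.0) + row["amount"]

print(gold)
```

The key design point is that each layer is derived from the one below and can be rebuilt from it, so the gold dimensional models stay reproducible.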
large scale bioinformatics datasets. Experience using Nextflow pipelines. Knowledge of NLP techniques and experience of processing unstructured data, using vector stores, and approximate retrieval. Familiarity with orchestration tooling (e.g. Airflow or Google Workflows). Experience with AI/ML powered applications. Experience with Docker or containerized applications. Why GSK? Uniting science, technology and talent to get ahead of disease …
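The vector-store retrieval mentioned above can be sketched with exact cosine similarity over a toy in-memory store. Production systems use approximate indexes (e.g. HNSW via FAISS or a managed vector database) to scale; the document names and vectors here are invented for illustration.

```python
# Minimal similarity retrieval over a toy in-memory "vector store".
# Exact cosine search stands in for approximate retrieval here.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

store = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}

def top_k(query, k=2):
    """Return the k document ids most similar to the query vector."""
    ranked = sorted(store, key=lambda d: cosine(query, store[d]), reverse=True)
    return ranked[:k]

print(top_k([1.0, 0.0, 0.0]))
```

Approximate methods trade a small amount of recall for sub-linear query time, which is what makes retrieval over millions of embeddings practical.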
VMWare General/Usage Technical Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused Data Pipeline Orchestration, and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte and Singer. Message Brokers, streaming data processors, such as Apache Kafka Object Storage, such as S3, MinIO, LakeFS CI/CD Pipeline, Integration …
set of ML and NLP models - Build and maintain batch and real-time feature computation pipelines capable of processing complex structured and unstructured data using technologies such as Spark, Apache Airflow, AWS SageMaker etc. - Contribute to the implementation of foundational ML infrastructure such as feature storage and engineering, asynchronous (batch) inference and evaluation - Apply your keen product mindset …
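A batch feature computation of the kind described above can be sketched in plain Python: per-user aggregates over raw events, which a Spark or Airflow-scheduled job would produce at scale and write to a feature store. Event and feature names are hypothetical.

```python
# Illustrative batch feature computation: per-user event count and
# average amount, the sort of aggregates a feature pipeline emits.
from collections import defaultdict

events = [
    {"user": "u1", "amount": 20.0},
    {"user": "u1", "amount": 40.0},
    {"user": "u2", "amount": 10.0},
]

def compute_features(events):
    totals, counts = defaultdict(float), defaultdict(int)
    for e in events:
        totals[e["user"]] += e["amount"]
        counts[e["user"]] += 1
    # Feature vector per user: event count and average amount.
    return {u: {"n_events": counts[u], "avg_amount": totals[u] / counts[u]}
            for u in totals}

print(compute_features(events))
```

The same aggregation expressed in Spark would be a `groupBy("user").agg(...)`; the point of a feature store is that batch jobs like this and real-time paths compute the identical definitions.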
b2c who you will have heard of/used, who are seeking 2x Data Engineers to join them ASAP on an initial 6-month contract. Python, SQL, AWS, dbt, Airflow (Spark/Scala a bonus) 6 months Outside IR35 (Ltd company) £500-600 per day ASAP start …
developing with Python and related ML libraries Functional programming experience a plus Willing to learn and develop in new technologies as required Experience with MongoDB Experience with MLflow or Airflow a plus Other tools: Maven, Git, Linux. Location: Customer Site, Telework Telework: 75% (in office at least one day a week) …
in AWS environments* Deep understanding of cloud-native data services: S3, Redshift, Glue, Athena, EMR, Kinesis, Lambda* Strong hands-on expertise in data modelling, distributed systems, and pipeline orchestration (Airflow, Step Functions)* Background in energy, trading, or financial markets is a strong plus* Excellent knowledge of Python, SQL, and data governance frameworks* Experience working with stakeholders in fast-paced …
Science, or a related technical field required • 4+ years in data engineering, preferably within secure or classified environments • Strong proficiency in Python, Spark, SQL, and orchestration tools such as Airflow • Hands-on experience with classified data management, secure networking, and infrastructure performance tuning Preferred Experience • Familiarity with secure cloud environments (e.g., AWS GovCloud, C2S) • Strong troubleshooting and optimization skills …
Data Engineer who enjoys coaching & mentoring others, and knowledge sharing. You'll bring: Strong industry experience within Data Engineering & Cloud Experience with, or familiarity with, these tools: Python, SQL, Kafka, Airflow, Azure, AWS DevOps experience would be a plus (desirable) Formal OR informal Training or Coaching experience (Coach/Mentor/Trainer/Instructor/Tutor/Lecturer etc.) Resilience …
roles and least privilege access. Governance: Data models, metadata, documentation, and standards. Collaboration: Work closely with global colleagues; communicate designs clearly. Requirements AWS data stack, hands-on experience: Glue, Airflow/MWAA, Athena, Redshift, RDS, S3. Strong coding skills in Python & PySpark/Pandas & SQL. CI/CD automation: AWS CodePipeline, CloudFormation (infrastructure as code). Architect-level experience …
Cloud Platform (GCP) and cloud data tools Background in CRM systems and customer data structures Understanding of data warehousing concepts and cloud architecture Experience with ETL tools and frameworks Airflow, Git, CI/CD pipeline Data Insights reporting experience Competent with real-time data processing and streaming technologies Proficiency in Tableau or other data visualisation tools is desirable ** PLEASE …