New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
hands-on background in data engineering, with 5+ years working on modern data platforms
- Experience leading cloud data migrations - GCP and BigQuery strongly preferred
- Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling
- Excellent understanding of data architecture, governance, and DevOps best practices
- Proven leadership or team management experience within a regulated or mid-to-large tech
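Several of these roles pair SQL with dbt for warehouse transformations. The core dbt pattern is materialising a model (a table) from a SELECT over raw data; a minimal, dependency-free sketch of that idea, using SQLite in place of a warehouse like BigQuery (table and column names are made up for illustration):

```python
import sqlite3

# Hypothetical ELT step: load raw rows, then transform with SQL --
# the pattern dbt models apply on top of a warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 10.0, "complete"), (2, 5.5, "cancelled"), (3, 7.5, "complete")],
)

# "Transform": materialise a model as a table built from a SELECT.
conn.execute(
    """
    CREATE TABLE fct_revenue AS
    SELECT status, SUM(amount) AS total_amount, COUNT(*) AS n_orders
    FROM raw_orders
    GROUP BY status
    """
)

rows = {
    status: (total, n)
    for status, total, n in conn.execute(
        "SELECT status, total_amount, n_orders FROM fct_revenue"
    )
}
print(rows)
```

In dbt the CREATE TABLE AS step is generated for you from a .sql model file; the sketch just shows the load-then-transform shape the listings refer to.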
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices in testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution.
Tech Stack: Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker (You don't need to tick every box - if you've worked with comparable tools, that's great too.)
What We're Looking For
systems and APIs (RESTful/GraphQL), with solid experience in microservices and databases (SQL/NoSQL). You know your way around big data tools (Spark, Dask) and orchestration (Airflow, dbt). You understand NLP and have experience working with Large Language Models. You're cloud-savvy (AWS, GCP, or Azure) and comfortable with containerization (Docker, Kubernetes). You
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and familiarity with Git
- Strong communicator, eager to learn, and naturally curious
- Comfortable working across multiple business areas with varied responsibilities
Nice-to-Haves
- Exposure to tools like Prefect, Airflow, or Dagster
- Familiarity with Azure SQL, Snowflake, or dbt
Tech Stack/Tools: Python, SQL (on-prem + Azure SQL Data Warehouse), Git
Benefits: £35,000 - £40,000 starting
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
contexts.
Bonus Experience (Nice to Have)
- Exposure to large language models (LLMs) or foundational model adaptation.
- Previous work in cybersecurity, anomaly detection, or behavioural analytics.
- Familiarity with orchestration frameworks (Airflow or similar).
- Experience with scalable ML systems, pipelines, or real-time data processing.
- Advanced degree or equivalent experience in ML/AI research or applied science.
- Cloud platform
gaming, food, health care).
- Knowledge of regulatory reporting and treasury operations in retail banking
- Exposure to Python, Go or similar languages.
- Experience working with orchestration frameworks such as Airflow/Luigi
- Have previously used dbt, Dataform or similar tooling.
- Used to Agile ways of working (Kanban, Scrum)
The Interview Process: Our interview process involves 3 main stages
- Cloud usage
- VMWare usage
- Technical Leadership & Design
- DevSecOps tooling and practices
- Application Security Testing
- SAFe (scaled agile) Processes
Data Integration Focused:
- Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer
- Message Brokers and streaming data processors like Apache Kafka
- Object Storage solutions such as S3, MinIO, LakeFS
- CI/CD
- VMWare General/Usage
- Technical Leadership & Design
- DevSecOps tooling and practices
- Application Security Testing
- SAFe (scaled agile) Processes
Data Integration Focused
- Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte and Singer.
- Message Brokers, streaming data processors, such as Apache Kafka
- Object Storage, such as S3, MinIO, LakeFS
- CI/CD Pipeline, Integration
London, South East, England, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills:
** Senior Data Scientist experience
** Commercial experience in Generative AI and recommender systems
** Strong Python and SQL experience
** Spark/Apache Airflow
** LLM experience
** MLOps experience
** AWS
Additional information: This role offers a strong salary of up to £95,000 (Depending on experience/skill) with hybrid working
and Jenkins
- Proficient in Python and shell scripting
- Experience with Delta Lake table formats
- Strong data engineering background
- Proven experience working with large datasets
Nice to Have:
- Familiarity with Airflow
- Background in full stack development
Team & Culture:
- Join a collaborative team of 10 professionals
- Friendly, delivery-focused environment
- Replacing two outgoing contractors - the handover will ensure a smooth start
intelligence tool, providing reliable data access to users throughout Trustpilot
- Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms
- Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency
- Maintain and
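Airflow-style pipeline orchestration, which this and several other listings centre on, boils down to running tasks in dependency order. A dependency-free sketch of that idea using the standard library (the task names are hypothetical, not any company's actual pipeline):

```python
from graphlib import TopologicalSorter

# Hypothetical DAG: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG orders extract -> transform -> load steps.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load", "transform"},
}

run_order = []

def run(task):
    # A real orchestrator would execute an operator here; we just log it.
    run_order.append(task)

for task in TopologicalSorter(dag).static_order():
    run(task)

print(run_order)  # ['extract', 'transform', 'load', 'report']
```

Airflow adds scheduling, retries, and monitoring on top, but the dependency-ordered execution shown here is the core concept.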
Are you a problem-solver with a passion for data, performance, and smart engineering? This is your opportunity to join a fast-paced team working at the forefront of data platform innovation in the financial technology space. You'll tackle More ❯
Basingstoke, Hampshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure)
- Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices
- Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python)
- Strong understanding of software design patterns including microservices, cloud-native, and OO design
- Eligible and willing to undergo high-level security clearance
Bath, Somerset, South West, United Kingdom Hybrid / WFH Options
Cathcart Technology
- FastAPI with either Flask or Django for API frameworks for web services
- AWS CDK experience with Infrastructure as Code for AWS deployments
- Docker containerization for local development
- Prefect, Airflow, or Dagster experience for pipeline orchestration frameworks
What you'll be doing: Develop and optimize FastAPI applications integrated with AWS CDK infrastructure. Collaborate closely with developers, subject matter experts … with AWS services, deployment processes, and infrastructure as code approaches AWS CDK, Terraform. Comfortable working with Docker containers for local development. Familiarity with pipeline orchestration frameworks such as Prefect, Airflow, or Dagster. Excellent communication skills with a collaborative mindset.
Contract Details:
- Location: Fully remote, UK based candidates only.
- Length: 9 months.
- Rate: £450 to £465 per day, Outside IR35.
troubleshoot, and manage daily Data Ops, including ETL workflows and Power BI releases
- Be the escalation point for Level 2/3 support, resolving issues hands-on across AWS (Airflow, S3, Redshift, Glue)
- Lead and coordinate with a team of 5–6 offshore data engineers and suppliers
- Support and automate Power BI deployments and data pipeline releases
- Own release … communications and issue resolution across stakeholders
- Work closely with the BI Operations Manager and wider tech/data teams
Tech Stack You'll Use:
- AWS: Glue, S3, Redshift, Airflow
- Power BI: Deployments, troubleshooting, performance tuning
- ETL & Scripting: SQL, Python (desirable)
- Monitoring & incident response in a live production data environment
What You'll Need
✅ Extensive experience in analytics engineering
Strong
Technologies & Tools:
- Salesforce platform (Admin, Developer, and Deployment tools)
- Snowflake Data Cloud
- Git, Bitbucket, GitHub
- Jenkins, Azure DevOps, or GitLab CI/CD
- Jira, Confluence
- DataOps tools (e.g., dbt, Airflow) – Desirable
The Ideal Candidate:
- Has previously built environments for large-scale IT transformation programmes
- Brings hands-on experience with automation and process optimisation
- Is highly proficient in JIRA
- Has
ideally with some prior management or lead responsibility.
- A real passion for coaching and developing engineers.
- Hands-on experience with their tech stack - any cloud, Snowflake (or equivalent), Python, Airflow, Docker
- Ability to juggle multiple products and effectively gather requirements.
- Experience with real-time data products is a big plus.
- Strong communication skills and a good academic background.
HOW
efficiency and predictive capabilities, with demonstrable performance metrics.
Desirable
- Bachelor's degree in Computer Science or Software Engineering.
- Experience deploying AI within AWS and MS Azure.
- Experience using Docker, Airflow and OpenShift.
- Cloud Certification.
- Data science model review, code refactoring, model optimization, containerisation, deployment, versioning, monitoring of model quality and non-functional requirements.
The role offers a strong
software architecture, perform thorough code reviews, and ensure high coding standards. Foster innovative thinking and strive to create robust and scalable systems.
Your Skills:
- Proficiency in Python, Kubernetes, Docker, Airflow, Harness, and Jenkins.
- Knowledge and experience with CI/CD pipelines and methods.
- Experience building ML pipelines and working with classical ML algorithms.
- Ability to develop and maintain high
Lumi Space is empowering the future prosperity of earth - making space scalable and sustainable using ground-based laser systems. We work with global companies and institutions to build products and services to precisely track satellites and remove the dangers of More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
to our bespoke data pipeline and associated API services.
Key Requirements
- 5+ years of Python experience with frameworks like Flask, FastAPI, and Django.
- Strong command of orchestration tools (e.g. Prefect, Airflow), Docker, and AWS infrastructure (CDK, Terraform).
- Solid understanding of API services, authentication methods (JWT, SSO), and clear, pragmatic communication skills.
- Maintain, upgrade, and improve existing systems and custom
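This listing names JWT authentication for API services. A JWT is just two base64url-encoded JSON segments plus an HMAC signature; a minimal stdlib-only sketch of signing and verifying an HS256 token (the secret and claims are made up for illustration, and real services should use a vetted library such as PyJWT):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with the padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = f"{b64url(json.dumps(header).encode())}.{b64url(json.dumps(claims).encode())}"
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(b64url(expected), sig)

token = sign_jwt({"sub": "user-123"}, b"demo-secret")
print(verify_jwt(token, b"demo-secret"))   # True
print(verify_jwt(token, b"wrong-secret"))  # False
```

Production use also requires checking the header's declared algorithm and expiry claims, which this sketch omits for brevity.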
teams to embed data-driven thinking across the organisation
- Ensure data governance, quality, and security best practices are in place
- Utilising the AWS tech stack including S3, Lambda, Redshift, Airflow
- Preferably some experience with Snowflake although this is not essential
This is a hands-on leadership role - ideal for someone who enjoys both strategic thinking and technical delivery.
Benefits