with an ability to work on multiple projects simultaneously Strong interpersonal and communication skills Quick, self-learning capabilities and creativity in problem-solving Preferred: Familiarity with Python Familiarity with Airflow, ETL tools, Snowflake and MSSQL Hands-on with VCS (Git)
SRE, and product teams. We'd Love to See Experience with semantic technologies: ontologies, RDF, or graph databases (e.g., Neo4j, RDF4J). Familiarity with ETL or EIS platforms like Apache Camel or Airflow. Knowledge of financial market data, especially around latency, availability, and correctness. Experience building or contributing to observability platforms or knowledge graph tooling. Bloomberg is an equal …
development of a new global data platform (PaaS) Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting skills in Python. Knowledge of …
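For roles like the one above that call for ETL/ELT pipelines in Dagster or Airflow, the day-to-day artefact is typically a scheduled DAG of extract/transform/load tasks. A minimal Airflow 2.x sketch; the task bodies, schedule, and warehouse target are illustrative placeholders rather than anything specified in the listing:

```python
# Minimal Airflow ETL sketch (assumes Airflow 2.4+ for the "schedule" argument).
# Task bodies and the load target are hypothetical placeholders.
from datetime import datetime
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_etl():
    @task
    def extract():
        # e.g. pull rows from a source API or database
        return [{"id": 1, "amount": 42.0}]

    @task
    def transform(rows):
        # apply cleaning / business rules
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows):
        # write to the warehouse (Snowflake, BigQuery, ...) via a hook
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


daily_etl()
```

Dropping a file like this into the configured dags/ folder is enough for the scheduler to pick it up and run it on the daily schedule.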
/CD processes GitHub experience including actions and CI/CD Data Modeling experience, including extensive experience designing dimensional models based on business use cases and reporting needs Airflow experience (Task scheduler and orchestrator) Python experience (Programming Language) Soft Skills: Interpersonal skills to engage and communicate effectively with customers and audiences of different backgrounds within the organization Please …
role. A strong understanding of data quality concepts, methodologies, and best practices. Proficiency in SQL and data querying for data validation and testing purposes. Hands-on experience with Snowflake, Airflow or Matillion would be ideal Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases, and data modelling concepts. Excellent analytical and …
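On the data-validation side of such a role, SQL checks against Snowflake are commonly wrapped in a small script that fails loudly. A rough sketch using the Snowflake Python connector; the table, credentials, and pass/fail rules are illustrative assumptions:

```python
# Illustrative data-quality assertions against Snowflake. Table names,
# credentials, and thresholds are placeholders, not from the listing.
import os
import snowflake.connector

checks = {
    "no_null_ids": "SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL",
    "recent_load": "SELECT COUNT(*) FROM analytics.orders "
                   "WHERE loaded_at >= DATEADD(day, -1, CURRENT_TIMESTAMP())",
}

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    cur = conn.cursor()
    for name, sql in checks.items():
        count = cur.execute(sql).fetchone()[0]
        # the null check expects zero offending rows; the freshness check expects at least one
        ok = count == 0 if name == "no_null_ids" else count > 0
        status = "PASS" if ok else f"FAIL ({count})"
        print(f"{name}: {status}")
finally:
    conn.close()
```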
City of London, London, United Kingdom Hybrid / WFH Options
1st Formations
especially in areas like compliance, payments, or SaaS metrics. Exposure to statistical analysis or experimentation (A/B testing, cohort analysis, etc.). Familiarity with data pipeline tools (e.g., Airflow) and modern data stack concepts. Why join us Unique opportunity. Be part of a company with direct access to tens of thousands of new businesses in the UK.
similarly complex operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational teams We're flexible on experience …
.NET framework backend services and React frontends. You'll utilise tools such as Terraform for infrastructure-as-code (IaC), AWS (Lambda, EC2, EKS, Step Functions, VPC, etc.) for ETL, Airflow pipelines, Snowflake, and ensure architectural alignment with AI/ML initiatives and data-driven services. You will serve as the go-to engineer for: End-to-end data migration …
sensitive and global datasets. Hands-on experience integrating advanced AI/ML capabilities into operational and analytical data platforms. Extensive knowledge of modern data orchestration and workflow technologies (e.g., Airflow, Kubeflow), and infrastructure automation frameworks (Terraform, CloudFormation). Demonstrated leadership in managing technical product roadmaps, agile delivery practices, and stakeholder management in complex environments. Boston Consulting Group is an …
software development and code quality. Vendor Collaboration: Work closely with third-party vendors to integrate their solutions, ensuring they meet our high standards for production environments. Workflow Automation: Utilize Airflow to automate and optimize workflows, ensuring efficient and reliable operations. Required 5-7 years of experience in software development with a focus on production-grade code. Proficiency in Java …
intelligence tool, providing reliable data access to users throughout Trustpilot Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency Maintain and …
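Reverse ETL of the kind described above (syncing modelled warehouse data back into commercial systems) often reduces to a query-and-push loop. A hedged sketch against BigQuery; the dataset, query, and downstream CRM endpoint are hypothetical, not Trustpilot specifics:

```python
# Hypothetical reverse-ETL step: read a modelled table from BigQuery and push
# rows to a downstream commercial system. Dataset, query, and endpoint are
# illustrative assumptions only.
import requests
from google.cloud import bigquery

CRM_ENDPOINT = "https://crm.example.com/api/accounts"  # placeholder

client = bigquery.Client()
rows = client.query(
    "SELECT account_id, health_score FROM analytics.account_health"
).result()

for row in rows:
    resp = requests.post(
        CRM_ENDPOINT,
        json={"account_id": row["account_id"], "health_score": row["health_score"]},
        timeout=10,
    )
    resp.raise_for_status()  # surface sync failures rather than silently dropping rows
```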
Spring Boot, Python/FastAPI - Frontend: TypeScript, React, Next.js. Headless CMS, Design systems. - CI/CD: ArgoCD, GitHub Actions - Infrastructure: GCP, Kubernetes, Terraform, Grafana - Data: Postgres, dbt, BigQuery, Airflow, Hex What You'll Be Doing Igniting Potential: Lead, mentor, and empower a high-performing team of 3-6 engineers, focusing relentlessly on outcomes that wow our members, not …
Job Title: Data Modeller Salary: £85,000 - £95,000 + Benefits and Bonus Location: London (3 days a week onsite) The Role: As a Data Modeller, you will be responsible for designing and implementing data models to support complex data …
only clean, high-quality data flows through the pipeline. Collaborate with research to define data quality benchmarks. Optimize end-to-end performance across distributed data processing frameworks (e.g., Apache Spark, Ray, Airflow). Work with infrastructure teams to scale pipelines across thousands of GPUs. Work directly with the leadership on the data team roadmaps. Manage the … on experience in training and optimizing classifiers. Experience managing large-scale datasets and pipelines in production. Experience in managing and leading small teams of engineers. Expertise in Python, Spark, Airflow, or similar data frameworks. Understanding of modern infrastructure: Kubernetes, Terraform, object stores (e.g. S3, GCS), and distributed computing environments. Strong communication and leadership skills; you can bridge the gap …
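A sketch of the kind of cleaning and dedup pass such a pipeline might run with Spark before data reaches the training cluster; the paths, columns, and thresholds are assumptions, not details from the role:

```python
# Sketch of a large-scale cleaning pass with PySpark. Input/output paths and
# the quality columns are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dataset-cleaning").getOrCreate()

raw = spark.read.parquet("s3a://datasets/raw/")            # placeholder path

clean = (
    raw.filter(F.col("text").isNotNull())
       .filter(F.length("text") > 32)                      # drop trivially short records
       .filter(F.col("quality_score") >= 0.8)              # e.g. score from an upstream classifier
       .dropDuplicates(["content_hash"])                   # exact-duplicate removal
)

clean.write.mode("overwrite").parquet("s3a://datasets/clean/")  # placeholder path
spark.stop()
```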
Basingstoke, Hampshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure) Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python) Strong understanding of software design patterns including microservices, cloud-native, and OO design Eligible and willing to undergo high-level security clearance …
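For the Kafka-plus-Python integration pipelines mentioned in the listing above, the consuming side is often just a small poll loop. A hedged sketch using confluent-kafka; the broker address, topic, and group id are placeholders:

```python
# Minimal Kafka consumer loop (confluent-kafka) feeding an integration
# pipeline. Broker, topic, and group id are illustrative placeholders.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "integration-pipeline",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["source-events"])         # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        # hand the event to the downstream transform / API layer here
        print(f"received event: {event}")
finally:
    consumer.close()
```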
Southampton, South East England, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure) Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python) Strong understanding of software design patterns including microservices, cloud-native, and OO design Eligible and willing to undergo high-level security clearance …
features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback; MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab Actions, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows; Cross-Functional Collaboration: Participate in code reviews, architectural discussions, and sprint planning to deliver features end-to-end …
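Experiment tracking with MLflow, as referenced above, usually means wrapping a training run and logging its parameters and metrics. A minimal sketch; the experiment name and toy model are stand-ins, not a production workflow:

```python
# Minimal MLflow experiment-tracking sketch. Experiment name, the toy model,
# and the logged values are illustrative assumptions.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

mlflow.set_experiment("demo-classifier")

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    params = {"C": 0.5, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    mlflow.log_params(params)   # hyperparameters for this run
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
```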
Greater London, England, United Kingdom Hybrid / WFH Options
Harnham
and resolve ETL failures or data issues Collaborate with cross-functional and offshore teams, as well as suppliers Hands-on support for tools like Power BI, AWS, SQL and Airflow Staying ahead of emerging AI tech and research to propose exciting solutions Proactively manage and escalate data issues SKILLS AND EXPERIENCE Required 5+ years industry experience (flexible depending on … quality of experience) Airflow (must-have), AWS (Redshift, S3, Glue), Power BI Strong SQL, as well as Python AWS ecosystem familiarity is essential Both hands-on and management/leadership experience is required Able to work in a fast-paced, dynamic environment This includes a two-stage interview process! This role cannot sponsor. Apply below.
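Much of the ETL-failure troubleshooting described above is easier when retries and failure callbacks are configured up front on the Airflow DAG. A rough sketch under that assumption; the alert function body and the task itself are placeholders:

```python
# Sketch of retry and failure-alerting defaults for an Airflow 2.x DAG.
# The alert body and the task callable are illustrative placeholders.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # swap in Slack / PagerDuty / email integration here
    print(f"task {context['task_instance'].task_id} failed: {context.get('exception')}")


default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
    "on_failure_callback": notify_on_failure,
}

with DAG(
    dag_id="nightly_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    PythonOperator(task_id="load_redshift", python_callable=lambda: print("load step"))
```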
Bath, Somerset, United Kingdom Hybrid / WFH Options
Cathcart Associates Group Ltd
FastAPI with either Flask or Django for API frameworks for web services AWS CDK experience with Infrastructure as code for AWS deployments Docker Containerization for local development Prefect or Airflow or Dagster experience for pipeline orchestration frameworks What you'll be doing: Develop and optimize FastAPI applications integrated with AWS CDK infrastructure. Collaborate closely with developers, subject matter experts … with AWS services, deployment processes, and infrastructure as code approaches (AWS CDK, Terraform). Comfortable working with Docker containers for local development. Familiarity with pipeline orchestration frameworks such as Prefect, Airflow, or Dagster. Excellent communication skills with a collaborative mindset. Contract Details: Location: Fully remote, UK-based candidates only. Length: 9 months. Rate: £450 to £465 per day, Outside IR35.
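The FastAPI work described in this contract is typically a thin service layer in front of the data and AWS pieces. A minimal sketch of such a service; the routes and response model are illustrative, not the project's actual API:

```python
# Minimal FastAPI service sketch. Routes and the response model are
# illustrative placeholders, not the project's real API.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="pipeline-api")


class RunStatus(BaseModel):
    run_id: str
    state: str


# stand-in for a real store (database, Prefect/Airflow API, etc.)
_RUNS = {"demo-run": RunStatus(run_id="demo-run", state="completed")}


@app.get("/health")
def health():
    return {"status": "ok"}


@app.get("/runs/{run_id}", response_model=RunStatus)
def get_run(run_id: str):
    run = _RUNS.get(run_id)
    if run is None:
        raise HTTPException(status_code=404, detail="run not found")
    return run
```

Run locally with "uvicorn app:app --reload", assuming the file is saved as app.py.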
troubleshoot, and manage daily Data Ops, including ETL workflows and Power BI releases Be the escalation point for Level 2/3 support, resolving issues hands-on across AWS (Airflow, S3, Redshift, Glue) Lead and coordinate with a team of 5–6 offshore data engineers and suppliers Support and automate Power BI deployments and data pipeline releases Own release … communications and issue resolution across stakeholders Work closely with the BI Operations Manager and wider tech/data teams Tech Stack You'll Use: AWS: Glue, S3, Redshift, Airflow; Power BI: Deployments, troubleshooting, performance tuning; ETL & Scripting: SQL, Python (desirable); Monitoring & incident response in a live production data environment What You'll Need ✅ Extensive experience in analytics engineering Strong …
Technologies & Tools: Salesforce platform (Admin, Developer, and Deployment tools) Snowflake Data Cloud Git, Bitbucket, GitHub Jenkins, Azure DevOps, or GitLab CI/CD Jira, Confluence DataOps tools (e.g., dbt, Airflow) – Desirable The Ideal Candidate: Has previously built environments for large-scale IT transformation programmes Brings hands-on experience with automation and process optimisation Is highly proficient in JIRA Has …