intelligence tool, providing reliable data access to users throughout Trustpilot. Design, build, maintain, and rigorously monitor robust data pipelines and transformative models using our modern data stack, including GCP, Airflow, dbt, and potentially emerging technologies like real-time streaming platforms. Develop and manage reverse ETL processes to seamlessly integrate data with our commercial systems, ensuring operational efficiency. Maintain and …
Spring Boot, Python/FastAPI - Frontend: TypeScript, React, Next.js. Headless CMS, Design systems. - CI/CD: ArgoCD, GitHub Actions - Infrastructure: GCP, Kubernetes, Terraform, Grafana - Data: Postgres, dbt, BigQuery, Airflow, Hex. What You'll Be Doing: Igniting Potential: Lead, mentor, and empower a high-performing team of 3-6 engineers, focusing relentlessly on outcomes that wow our members, not …
Are you a problem-solver with a passion for data, performance, and smart engineering? This is your opportunity to join a fast-paced team working at the forefront of data platform innovation in the financial technology space. You'll tackle …
Job Title: Data Modeller Salary: £85,000 - £95,000 + Benefits and Bonus Location: London (3 days a week onsite) The Role: As a Data Modeller, you will be responsible for designing and implementing data models to support complex data …
only clean, high-quality data flows through the pipeline. Collaborate with research to define data quality benchmarks. Optimize end-to-end performance across distributed data processing frameworks (e.g., Apache Spark, Ray, Airflow). Work with infrastructure teams to scale pipelines across thousands of GPUs. Work directly with the leadership on the data team roadmaps. Manage the … Hands-on experience in training and optimizing classifiers. Experience managing large-scale datasets and pipelines in production. Experience in managing and leading small teams of engineers. Expertise in Python, Spark, Airflow, or similar data frameworks. Understanding of modern infrastructure: Kubernetes, Terraform, object stores (e.g. S3, GCS), and distributed computing environments. Strong communication and leadership skills; you can bridge the gap …
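For illustration only (not part of the role description above): a minimal PySpark sketch of the kind of data-quality filtering such a pipeline might apply before data reaches training. The bucket paths, column names, and threshold are hypothetical placeholders.

```python
# Minimal sketch: keep only rows meeting basic quality benchmarks.
# Paths, column names, and the 0.8 threshold are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-quality-filter").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/")  # hypothetical input location

clean = (
    raw.filter(F.col("text").isNotNull())        # drop empty records
       .filter(F.length("text") > 0)
       .filter(F.col("quality_score") >= 0.8)    # assumed classifier score column
)

clean.write.mode("overwrite").parquet("s3://example-bucket/clean/")  # hypothetical output
```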
Basingstoke, Hampshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure). Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices. Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python). Strong understanding of software design patterns including microservices, cloud-native, and OO design. Eligible and willing to undergo high-level security clearance …
Southampton, South East England, United Kingdom Hybrid / WFH Options
Anson Mccade
e.g., Archimate), and cloud platforms (AWS, Azure). Hands-on experience with DevSecOps tooling, automation platforms (Power Platform, UiPath), and secure software development practices. Familiarity with data integration pipelines (Kafka, Apache Airflow), API management, and scripting (Python). Strong understanding of software design patterns including microservices, cloud-native, and OO design. Eligible and willing to undergo high-level security clearance …
ML Pipelines: Engage with data and ML scientists to plan the architecture for end-to-end machine learning workflows. Implement scalable training and deployment pipelines using tools such as Apache Airflow and Kubernetes. Perform comprehensive testing to ensure reliability and accuracy of deployed models. Develop instrumentation and automated alerts to manage system health and detect issues in real time …
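For illustration only: a minimal Airflow DAG sketching a train → evaluate → deploy workflow of the sort described above, using the TaskFlow API (assuming Airflow 2.4+ for the schedule argument). Task bodies, names, and the artifact path are hypothetical placeholders, not the actual pipeline.

```python
# Minimal sketch of a train/evaluate/deploy DAG. All task logic is stubbed;
# task names and the model path are hypothetical.
from datetime import datetime
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def train_and_deploy():

    @task
    def train_model() -> str:
        # Fit the model and persist the artifact, returning its location.
        return "s3://example-bucket/models/latest"  # hypothetical path

    @task
    def evaluate(model_path: str) -> bool:
        # Run reliability/accuracy checks before any rollout.
        return True

    @task
    def deploy(model_path: str, approved: bool) -> None:
        # e.g. roll the new artifact out to a Kubernetes serving deployment.
        if approved:
            print(f"deploying {model_path}")

    model_path = train_model()
    deploy(model_path, evaluate(model_path))


train_and_deploy()
```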
features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback; MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab Actions, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows; Cross-Functional Collaboration: Participate in code reviews, architectural discussions, and sprint planning to deliver features end-to-end …
Bath, Somerset, South West, United Kingdom Hybrid / WFH Options
Cathcart Technology
FastAPI, with either Flask or Django, as API frameworks for web services; AWS CDK experience with infrastructure as code for AWS deployments; Docker containerization for local development; Prefect, Airflow, or Dagster experience for pipeline orchestration frameworks. What you'll be doing: Develop and optimize FastAPI applications integrated with AWS CDK infrastructure. Collaborate closely with developers, subject matter experts … with AWS services, deployment processes, and infrastructure as code approaches (AWS CDK, Terraform). Comfortable working with Docker containers for local development. Familiarity with pipeline orchestration frameworks such as Prefect, Airflow, or Dagster. Excellent communication skills with a collaborative mindset. Contract Details: Location: Fully remote, UK-based candidates only. Length: 9 months. Rate: £450 to £465 per day, Outside IR35. …
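As a brief, purely illustrative sketch (not the team's actual service): a minimal FastAPI application of the kind such a role would develop, runnable locally with uvicorn inside a Docker container. The endpoint paths and the Item model are hypothetical.

```python
# Minimal FastAPI sketch; endpoints and the Item model are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="example-service")


class Item(BaseModel):
    name: str
    value: float


@app.get("/health")
def health() -> dict:
    # Lightweight readiness probe, handy behind a load balancer or in Kubernetes.
    return {"status": "ok"}


@app.post("/items")
def create_item(item: Item) -> dict:
    # In a real service this might persist the item or kick off a pipeline run.
    return {"received": item.name, "value": item.value}

# Run locally, e.g. inside the Docker container: uvicorn main:app --reload
```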
troubleshoot, and manage daily Data Ops, including ETL workflows and Power BI releases. Be the escalation point for Level 2/3 support, resolving issues hands-on across AWS (Airflow, S3, Redshift, Glue). Lead and coordinate with a team of 5–6 offshore data engineers and suppliers. Support and automate Power BI deployments and data pipeline releases. Own release communications and issue resolution across stakeholders. Work closely with the BI Operations Manager and wider tech/data teams. Tech Stack You'll Use: AWS: Glue, S3, Redshift, Airflow; Power BI: deployments, troubleshooting, performance tuning; ETL & Scripting: SQL, Python (desirable); Monitoring & incident response in a live production data environment. What You'll Need: ✅ Extensive experience in analytics engineering. Strong …
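Purely as an illustration of the hands-on AWS troubleshooting described above: a minimal boto3 sketch that lists recent failed runs of a Glue job so they can be triaged. The job name is a hypothetical placeholder.

```python
# Minimal sketch: surface recent failed Glue job runs for triage.
# "example-etl-job" is a hypothetical job name.
import boto3

glue = boto3.client("glue")

response = glue.get_job_runs(JobName="example-etl-job", MaxResults=25)
failed = [run for run in response["JobRuns"] if run["JobRunState"] == "FAILED"]

for run in failed:
    # Print the run id and any recorded error so it can be escalated.
    print(run["Id"], run.get("ErrorMessage", "no error message recorded"))
```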
Technologies & Tools: Salesforce platform (Admin, Developer, and Deployment tools); Snowflake Data Cloud; Git, Bitbucket, GitHub; Jenkins, Azure DevOps, or GitLab CI/CD; Jira, Confluence; DataOps tools (e.g., dbt, Airflow) - desirable. The Ideal Candidate: Has previously built environments for large-scale IT transformation programmes. Brings hands-on experience with automation and process optimisation. Is highly proficient in Jira. Has …
ideally with some prior management or lead responsibility. A real passion for coaching and developing engineers. Hands-on experience with their tech stack - any cloud, Snowflake (or equivalent), Python, Airflow, Docker. Ability to juggle multiple products and effectively gather requirements. Experience with real-time data products is a big plus. Strong communication skills and a good academic background. HOW …
Lambda, Azure). Strong problem-solving skills and critical thinking. Strategic planning abilities. Additional Skills (desired): Experience with protein or DNA bioinformatics; MLOps expertise; software engineering skills, data pipelining (e.g., Airflow), and cloud deployment experience; familiarity with Agile methodologies. If you're interested in joining as our new Head of AI, we'd love to hear from you. …
problem-solving skills and critical thinking in AI research. Strategic planning abilities. Additional Skills (desired): Experience with protein or DNA bioinformatics; MLOps expertise; software engineering skills, data pipelining (e.g., Airflow), cloud deployment experience; knowledge of Agile methodologies. Interested in joining as our Head of AI? We look forward to hearing from you. …
efficiency and predictive capabilities, with demonstrable performance metrics. Desirable: Bachelor's degree in Computer Science or Software Engineering. Experience deploying AI within AWS and MS Azure. Experience using Docker, Airflow and OpenShift. Cloud certification. Data science model review, code refactoring, model optimization, containerisation, deployment, versioning, monitoring of model quality and non-functional requirements. The role offers a strong …
ideally in early-stage startups. Deep expertise in React Native and TypeScript, with the ability to dip into Swift or Objective-C. Experience navigating backend or infrastructure (Python, GCP, Airflow) to unblock yourself. A scrappy, resourceful mindset and comfort moving fast and iterating quickly. Excellent judgment on what good looks like in design, performance, and user value. Strong product …
such as Lightdash, Looker or Tableau. Comfortable with version control, testing frameworks, and CI/CD in a data context. Familiarity with Python and orchestration tools like Dagster or Airflow is highly desirable. Experience with Ecommerce and/or Subscription services is desirable. The Interview Process: Meet & Greet call with a Talent Partner; call with the Hiring Manager and …
System Reliability Engineer. Experience with building, maintaining and continuously enhancing automations needed for scalability & efficiency in running the Network Infrastructure. Experience in infrastructure automation and orchestration frameworks, e.g. Ansible, Airflow, Terraform, Chef, Salt. Proven experience with object-oriented programming languages, preferably in Python. A bachelor's or master's degree in Computer Science, Engineering, Mathematics, or a similar field of …
What we'd like to see from you: • Extensive experience designing and deploying ML systems in production • Deep technical expertise in Python and modern ML tooling (e.g. MLflow, TFX, Airflow, Kubeflow, SageMaker, Vertex AI) • Experience with infrastructure-as-code and CI/CD practices for ML (e.g. Terraform, GitHub Actions, ArgoCD) • Proven ability to build reusable tooling, scalable services …
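For illustration only: a minimal MLflow experiment-tracking sketch of the sort of tooling listed above. The experiment name, parameters, and metric are hypothetical placeholders.

```python
# Minimal MLflow tracking sketch; names and values are hypothetical.
import mlflow

mlflow.set_experiment("example-experiment")

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_param("n_estimators", 200)
    # ... train and validate the model here ...
    mlflow.log_metric("val_auc", 0.91)
```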
support analytics platforms like MicroStrategy, ThoughtSpot, or similar tools. Strong knowledge of Zero Trust Architecture and secure enclave management in government cloud implementations. Familiarity with data pipeline integration (e.g., Airflow, Redshift, SQL, Python) in cloud environments. Ability to serve as a lead architect on large federal programs with classified data requirements. Certifications such as AWS Certified Solutions Architect - Professional …