City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
on development/engineering background Machine Learning or Data Background Technical Experience: PySpark, Python, SQL, Jupyter Cloud: AWS, Azure (Cloud Environment) - Moving towards Azure Nice to Have: Astro/Airflow, Notebook Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
ML evaluation methodologies and key IR metrics Passion for shipping high-quality products and a self-motivated drive to take ownership of tasks Tech Stack Core: Python, FastAPI, asyncio, Airflow, Luigi, PySpark, Docker, LangGraph Data Stores: Vector Databases, DynamoDB, AWS S3, AWS RDS Cloud & MLOps: AWS, Databricks, Ray. Unlimited vacation time - we strongly encourage all of our employees to take …
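As a minimal sketch of the kind of FastAPI/asyncio retrieval service this stack implies (the endpoint, toy corpus, and scoring logic below are hypothetical, not taken from the advert):

```python
# Toy async search service sketch (hypothetical example).
# Run with: uvicorn search_app:app --reload
import asyncio

from fastapi import FastAPI

app = FastAPI(title="toy-search")

# Stand-in corpus; a real service would query a vector database instead.
DOCUMENTS = [
    "airflow schedules batch pipelines",
    "pyspark processes large datasets",
    "fastapi serves async http endpoints",
]


async def score(query: str, doc: str) -> float:
    """Toy relevance score: token overlap between query and document."""
    await asyncio.sleep(0)  # placeholder for an async I/O call, e.g. a vector DB lookup
    q_tokens, d_tokens = set(query.lower().split()), set(doc.lower().split())
    return len(q_tokens & d_tokens) / max(len(q_tokens), 1)


@app.get("/search")
async def search(q: str, k: int = 3):
    # Score all documents concurrently, then return the top-k by score.
    scores = await asyncio.gather(*(score(q, d) for d in DOCUMENTS))
    ranked = sorted(zip(DOCUMENTS, scores), key=lambda pair: pair[1], reverse=True)
    return {"query": q, "results": [{"doc": d, "score": s} for d, s in ranked[:k]]}
```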
of building performant, maintainable, and testable systems Solid background in microservices architecture Proficiency with Postgres & MongoDB (relational + non-relational) Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.) Solid coding practices (clean, testable, automated) The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges Bonus points if you've worked with …
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools such as Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. Due …
data pipelines Strong understanding of data modeling, schema design, and lakehouse principles Familiarity with data governance, lineage, and quality frameworks Experience working on enterprise-class applications Preferred: Experience with Apache Spark Hands-on experience with AWS data services (e.g., S3, Glue, Lambda, MSK) Capable in batch or streaming data processing using technologies such as Spark, Kafka, Flink, and DLT … Proficient in CI/CD pipelines, automated testing, code quality enforcement, and environment management for production-grade data systems Expert in orchestration and transformation frameworks such as Airflow, dbt, and Dagster, along with cloud-native platforms like Databricks Financial services or FinTech industry experience Knowledge and Skills: Operates with full autonomy on large-scale, complex data projects. Go to …
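For readers unfamiliar with the Spark batch processing named above, here is a minimal, illustrative PySpark job; the bucket paths and column names are placeholders, not details from the role:

```python
# Illustrative PySpark batch job (hypothetical paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-trades-rollup").getOrCreate()

# Read raw events, e.g. landed in S3 by an upstream ingestion job.
raw = spark.read.parquet("s3://example-bucket/raw/trades/")

# Basic cleansing and a simple daily aggregate.
daily = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("trade_date", F.to_date("executed_at"))
       .groupBy("trade_date", "instrument")
       .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("trade_count"))
)

# Write a partitioned, query-friendly table for downstream consumers.
daily.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/daily_trades/"
)
spark.stop()
```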
Royal Leamington Spa, England, United Kingdom Hybrid / WFH Options
Kwalee
At Kwalee, we foster an environment where creativity and collaboration come together. Specialising in both the development and publishing of casual and hybrid casual games, we also bring our creative touch to publishing PC & Console titles, ensuring a diverse and …
City of London, London, United Kingdom Hybrid / WFH Options
Futuria
data integrity, consistency, and accuracy across systems. Optimize data infrastructure for performance, cost efficiency, and scalability in cloud environments. Develop and manage graph-based data systems (e.g. Kuzu, Neo4j, Apache AGE) to model and query complex relationships in support of Retrieval Augmented Generation (RAG) and agentic architectures. Contribute to text retrieval pipelines involving vector embeddings and knowledge graphs, for … workflows. Proficiency with cloud platforms such as Azure, AWS, or GCP and their managed data services. Desirable: Experience with asynchronous Python programming Experience with graph technologies (e.g., Kuzu, Neo4j, Apache AGE). Familiarity with embedding models (hosted or local): OpenAI, Cohere, etc., or HuggingFace models/sentence-transformers. Solid understanding of data modeling, warehousing, and performance optimization. Experience with … messaging middleware + streaming (e.g. NATS Jetstream, Redis Streams, Apache Kafka or Pulsar etc.) Hands-on experience with data lakes, lakehouses, or components of the modern data stack. Exposure to MLOps tools and best practices. Exposure to workflow orchestration frameworks (e.g. Metaflow, Airflow, Dagster) Exposure to Kubernetes Experience working with unstructured data (e.g., logs, documents, images). Awareness …
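As a rough sketch of the vector-embedding retrieval step mentioned above (the model name, corpus, and query are placeholders, and a sentence-transformers install is assumed):

```python
# Toy embedding-based retrieval sketch (illustrative only; names are placeholders).
import numpy as np
from sentence_transformers import SentenceTransformer

# A small, widely used open model; any sentence-embedding model would work similarly.
model = SentenceTransformer("all-MiniLM-L6-v2")

corpus = [
    "Kuzu is an embedded graph database.",
    "Vector embeddings map text to dense numeric vectors.",
    "Airflow orchestrates scheduled data pipelines.",
]

# Encode documents and query, normalising so a dot product equals cosine similarity.
doc_vecs = model.encode(corpus, normalize_embeddings=True)
query_vec = model.encode(["How do embeddings represent text?"], normalize_embeddings=True)[0]

scores = doc_vecs @ query_vec
best = int(np.argmax(scores))
print(f"Best match ({scores[best]:.3f}): {corpus[best]}")
```

In a RAG pipeline, the top-scoring passages would then be fed to a language model as context.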
Staffordshire, England, United Kingdom Hybrid / WFH Options
MSA Data Analytics Ltd
and strengthen the organisation’s data engineering and analytics capability within its AWS-based environment. We’re ideally looking for someone with strong hands-on experience across AWS services, Airflow, Python, and SQL. You’ll play a key role in designing, building, and maintaining modern data infrastructure that powers insight-led decision-making across the business. Working within a … and key stakeholders to deliver practical, scalable solutions that make a real impact. Key Responsibilities Design, build, and maintain robust, scalable ETL/ELT pipelines using tools such as Airflow and AWS services (S3, Redshift, Glue, Lambda, Athena). Integrate new data sources and continuously optimise performance and cost efficiency. Ensure data quality, integrity, and security across all systems. … with new tools and trends in data engineering, particularly within the AWS ecosystem. Skills & Experience Strong hands-on experience with AWS (S3, Redshift, Glue, Lambda, Athena). Skilled in Airflow for workflow orchestration. Advanced SQL and proficient in Python for data engineering. Experience with data modelling (e.g. dimensional) and familiarity with NoSQL databases (e.g. Elasticsearch). Confident using Git …
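For context on the Airflow orchestration referenced in this role, a minimal Airflow 2.x TaskFlow-style DAG might look like the sketch below; the task names and logic are hypothetical stand-ins, not the employer's pipeline:

```python
# Minimal Airflow 2.x TaskFlow DAG sketch (hypothetical tasks).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def daily_sales_pipeline():
    @task
    def extract() -> list[dict]:
        # In a real pipeline this might pull files from S3 or query an API.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

    @task
    def transform(rows: list[dict]) -> float:
        # Trivial aggregation standing in for real business logic.
        return sum(row["amount"] for row in rows)

    @task
    def load(total: float) -> None:
        # A real task would write to Redshift or a warehouse; here we just log it.
        print(f"Daily total: {total}")

    load(transform(extract()))


daily_sales_pipeline()
```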
City of London, London, United Kingdom Hybrid / WFH Options
Immersum
Data Engineer (leading a team of 4). Salary: £100,000 – £130,000 + benefits Location: West London - Hybrid (3 days p/w in-office) Tech: AWS, Snowflake, Airflow, DBT, Python The Company: Immersum have engaged with a leading PropTech company on a mission to revolutionise how the property sector understands people, places, and data. By combining cutting … Implement redundancy, backups, and DB triggers to ensure reliability and data integrity. Work with Python to build scalable data solutions. Introduce and adopt new technologies such as Kafka, Docker, Airflow, and AWS. Define and enforce data hygiene practices (ontology, storage, artifacts, version control). Reduce engineering load per person through automation and efficient design. Collaborate closely with a …
Yarnton, Kidlington, Oxfordshire, England, United Kingdom Hybrid / WFH Options
Noir
Machine Learning Engineer - AI for Advanced Materials - Oxford/Remote (UK) (Tech stack: Python, PyTorch, TensorFlow, Scikit-learn, MLflow, Airflow, Docker, Kubernetes, AWS, Azure, GCP, Pandas, NumPy, SciPy, CI/CD, MLOps, Data Visualization, Bayesian Modelling, Probabilistic Programming, Terraform) We're looking for a Machine Learning Engineer to join a rapidly scaling deep-tech company that … Our client is seeking Machine Learning Engineers with experience in some or all of the following (full training provided to fill any gaps): Python, PyTorch, TensorFlow, Scikit-learn, MLflow, Airflow, Docker, Kubernetes, Pandas, NumPy, SciPy, CI/CD, Data Visualization, Bayesian Modelling, Probabilistic Programming, Terraform, Azure, AWS, GCP, Git, and Agile methodologies. Join a team that's fusing AI …
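To illustrate the scikit-learn/NumPy end of the stack listed above, a small regression workflow on synthetic data might look like this (the features, target, and model choice are invented for the sketch and unrelated to the employer's work):

```python
# Illustrative scikit-learn workflow on synthetic data (not the employer's models).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic dataset: three numeric features and a noisy target property.
X = rng.normal(size=(500, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Scale features, then fit a random forest regressor.
model = make_pipeline(StandardScaler(), RandomForestRegressor(n_estimators=200, random_state=0))
model.fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```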
City of London, London, United Kingdom Hybrid / WFH Options
Swissblock Technologies AG
management to analyze and improve our business processes Conduct workload and complexity assessments Design, plan and implement new efficient software solutions in Python and Go Maintain and enhance our Airflow data pipelines Improve observability and scalability to support our constantly growing client base Continuously adapt to changing requirements in a business-driven environment Build and extend knowledge of financial … innate curiosity to learn new things Preferred qualifications Background in traditional finance or digital assets, ideally in trading domain Hands-on experience with Python libraries and frameworks (NumPy, Pandas, Airflow, FastAPI, Flask, SQLAlchemy) Highly proficient in asynchronous, event-driven distributed systems Working knowledge of cloud-native architectures, GCP preferred Experience in Go and working with real-time data streams …