frameworks, and cloud-based data platforms (AWS, Azure, or GCP). Proven track record in credit risk modelling, fraud analytics, or similar financial domains. Familiarity with big data technologies (Spark, Hive) and MLOps practices for production-scale deployments. Excellent communication skills to engage stakeholders and simplify complex concepts. Desirable Extras: Experience with regulatory frameworks (e.g., Basel, GDPR) and model …
City of London, London, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions
on leadership and communication, ensuring all key builds and improvements flow through this individual. Working with a modern tech stack including AWS, Snowflake, Python, SQL, DBT, Airflow, Spark, Kafka, and Terraform, you'll drive automation and end-to-end data solutions that power meaningful insights. Ideal for ambitious, proactive talent from scale-up or start-up environments, this position …
City of London, London, United Kingdom Hybrid/Remote Options
Hunter Bond
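The stack above names Airflow for orchestration. As a rough illustration of the pipeline automation involved, here is a minimal Airflow DAG sketch; the DAG id, schedule, and task logic are hypothetical, not taken from the listing.

```python
# Minimal Airflow DAG sketch: a daily extract -> load pipeline.
# Assumes Airflow 2.4+; dag_id, schedule, and task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: in practice this might pull from an API or object storage.
    print("extracting orders...")


def load_to_warehouse():
    # Placeholder: in practice this might COPY INTO a Snowflake table.
    print("loading to warehouse...")


with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load  # run extract before load
```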
reliability. Enjoy experimenting with emerging technologies and tools; value writing clean, modular, and maintainable code; are excited to learn more about financial markets and trading systems. Bonus experience: Ruby, Spark, Trino, Kafka; financial markets exposure; SQL (Postgres, Oracle); cloud-native deployments (AWS, Docker, Kubernetes); observability tools (Splunk, Prometheus, Grafana). Why Apply? This is a fantastic opportunity to join a …
cloud adoption, resiliency engineering, and operational observability. Drive technical governance, platform strategy, and roadmap decisions for Operations Technology. Provide hands-on guidance for distributed systems development using Akka, Kafka, Spark, and related technologies. Support delivery teams with architecture reviews, performance tuning, and integration patterns across upstream and downstream platforms. Required Skills & Experience: Expert-level experience with Scala and functional …
City of London, London, United Kingdom Hybrid/Remote Options
Harnham
infrastructure-as-code: Docker; Kubernetes (EKS, GKE, AKS); Jenkins, GitLab CI, or GitHub Actions; Terraform or CloudFormation; Prometheus, Grafana, Datadog, or New Relic; Slurm, Torque, LSF; MPI; Hadoop or Spark. Experience with high-performance computing, distributed systems, and observability tools. Strong communication and executive presence, with the ability to translate complex technical concepts for diverse audiences …
Practical knowledge of infrastructure as code, CI/CD best practices, and cloud platforms (AWS, GCP, or Azure). Experience with relational databases and with data processing and query engines (Spark, Trino, or similar). Familiarity with monitoring, observability, and alerting systems for production ML (Prometheus, Grafana, Datadog, or equivalent). Understanding of ML concepts. You don't need to …
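The role above asks for familiarity with monitoring and alerting for production ML. Here is a minimal sketch using the prometheus_client Python library; the metric names and the predict stub are hypothetical.

```python
# Sketch: exposing inference metrics with prometheus_client.
# Metric names and the predict() stub are hypothetical.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Total predictions served")
LATENCY = Histogram("model_inference_seconds", "Inference latency in seconds")


def predict(features):
    # Stand-in for a real model call.
    time.sleep(random.uniform(0.01, 0.05))
    return 1 if sum(features) > 0 else 0


@LATENCY.time()          # observe inference latency per call
def serve(features):
    PREDICTIONS.inc()    # count every request
    return predict(features)


if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes :8000/metrics
    while True:
        serve([random.gauss(0, 1) for _ in range(4)])
```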
a plus). Experience with model lifecycle management (MLOps), including monitoring, retraining, and model versioning. Ability to work across data infrastructure, from SQL to large-scale distributed data tools (Spark, etc.). Strong written and verbal communication skills, especially in cross-functional contexts. Bonus Experience (Nice to Have): Exposure to large language models (LLMs) or foundational model adaptation. Previous …
London, South East, England, United Kingdom Hybrid/Remote Options
Method Resourcing
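Model versioning as described in the listing above is typically handled through a model registry. Below is a minimal sketch using MLflow; note that MLflow is an assumption (the listing names no specific tool) and the model name is hypothetical.

```python
# Sketch: logging and registering a model version with MLflow.
# MLflow is an assumption here; the listing names no specific tool.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LogisticRegression().fit(X, y)

with mlflow.start_run():
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # registered_model_name creates (or advances) a named model version
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="credit_scorer",  # hypothetical name
    )
```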
Greater London, England, United Kingdom Hybrid/Remote Options
Primis
conceptual, logical, physical) and enterprise data management principles. Hands-on experience with data modelling tools (e.g., Erwin, Sparx, PowerDesigner) and big data/cloud ecosystems (e.g., Databricks, Snowflake, Redshift, Spark). Solid grasp of data governance, metadata, and data quality frameworks. Excellent stakeholder engagement skills: able to communicate complex data concepts to both technical and business audiences. Bonus Points For …
Databricks platform. Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Ability to program in languages such as SQL, Python, R, YAML, and JavaScript. Data Integration: Integrate data from various sources, including relational databases, APIs, and … best practices. Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage …
City of London, London, United Kingdom Hybrid/Remote Options
Peaple Talent
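To illustrate the transformation and data-quality work the Databricks listing above describes, here is a minimal PySpark/Delta Lake sketch; all table and column names are hypothetical.

```python
# Sketch: a simple cleanse-and-aggregate step on Databricks with Delta Lake.
# Table and column names are hypothetical; assumes a live SparkSession.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.table("raw.transactions")  # hypothetical source table

# Basic data-quality rules: drop rows failing validation checks.
clean = (
    raw.filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
       .dropDuplicates(["transaction_id"])
)

# Enrich and aggregate for downstream consumers.
daily = (
    clean.withColumn("txn_date", F.to_date("created_at"))
         .groupBy("txn_date", "merchant_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("txn_count"))
)

# Write as a managed Delta table (Delta is the Databricks default format).
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_txns")
```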
delivered solutions in Google Cloud Platform (GCP). Strong experience designing and delivering data solutions using BigQuery. Proficient in SQL and Python. Experience working with Big Data technologies such as Apache Spark or PySpark. Excellent communication skills, with the ability to engage effectively with senior stakeholders. Nice to haves: GCP Data Engineering certifications; BigQuery or other GCP tool certifications …
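For a sense of the BigQuery work the listing above describes, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical.

```python
# Sketch: running a parameterised BigQuery query from Python.
# Project/dataset/table names are hypothetical; assumes default GCP credentials.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my_project.analytics.events`   -- hypothetical table
    WHERE event_date >= @start_date
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.user_id, row.events)
```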
monitoring processes to maintain data integrity and reliability.
* Optimise data workflows for performance, cost-efficiency, and maintainability using tools such as Azure Data Factory, AWS Data Pipeline, Databricks, or Apache Spark.
* Integrate and prepare data for Tableau dashboards and reports, ensuring optimal performance and alignment with business needs.
* Collaborate with visualisation teams to develop, maintain, and enhance …
technical stakeholders. What You Bring: 3–5+ years of experience in software engineering (Python); experience with FastAPI, cloud platforms (AWS, Azure, or GCP), Docker. Bonus: experience with ML workflows, Spark, Airflow, or trading systems. Why Join: Up to £140K; up to 100% bonus; relocation package; flat, entrepreneurial team structure …
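As a flavour of the FastAPI work named above, here is a minimal service sketch; the route, schema, and pricing logic are hypothetical.

```python
# Sketch: a minimal FastAPI service; route and schema are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class PriceRequest(BaseModel):
    symbol: str
    quantity: int


@app.post("/quote")
def quote(req: PriceRequest) -> dict:
    # Placeholder pricing logic; a real service would call a model or order book.
    return {"symbol": req.symbol, "notional": req.quantity * 100.0}

# Run with: uvicorn main:app --reload  (assumes this file is main.py)
```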
modelling, machine-learning, clustering and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau, …)
London, South East, England, United Kingdom Hybrid/Remote Options
Executive Facilities
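To illustrate the clustering and classification techniques the listing above names, here is a small scikit-learn sketch on synthetic data; the dataset and parameters are illustrative only.

```python
# Sketch: classification and clustering with scikit-learn on synthetic data.
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=6, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Classification: a simple baseline classifier.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

# Clustering: group the same points without using the labels.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=42).fit(X)
print("cluster sizes:", [int((kmeans.labels_ == k).sum()) for k in (0, 1)])
```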
domains. Proficiency in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and …
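As a sketch of the A/B-testing knowledge mentioned above, here is a two-proportion z-test with statsmodels; the conversion counts are made up.

```python
# Sketch: two-proportion z-test for an A/B experiment (numbers are made up).
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]     # successes in variant A, variant B
visitors = [10_000, 10_000]  # sample sizes per variant

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")
# A small p-value suggests the conversion rates genuinely differ.
```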
including monitoring, alerting, and automated checks. Optimise data workflows for performance, cost-efficiency, and maintainability, using platforms such as Azure Data Factory, AWS Data Pipeline, Glue, Lambda, Databricks, and Apache Spark. Support the integration of transformed data into visualisation and analytical platforms, including Power BI, ServiceNow, and Amazon QuickSight. Ensure compliance with data governance, security, and privacy standards across …
pace with evolving technologies and techniques. Candidate Profile: 1+ years' experience in a data-related role (Analyst, Engineer, Scientist, Consultant, or Specialist); experience with technologies such as Python, SQL, Spark, Power BI, AWS, Azure, or GCP; strong analytical and problem-solving skills; comfortable working directly with clients and stakeholders; excellent communication and teamwork abilities. Must hold active SC or …