services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java) Demonstrated experience owning complex technical systems end-to-end, from design through production Excellent communication skills with the ability to explain technical concepts to both technical and non-technical
the payments industry. Experience using Azure Databricks. Experience using containerised services such as Docker/Kubernetes. Experience using IaC tools such as Terraform/Bicep. Experience using programming languages: Scala, PowerShell, YAML. Comprehensive, payments industry training by in-house and industry experts. Excellent performance-based earning opportunity, including OKR-driven bonuses. Future opportunity for equity, rewarded to high performers. Personal
Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical
complex real-world problems Drive innovation in fraud analytics, data engineering, and AI applications What You Bring Proven experience in Java or a similar OOP stack (e.g., C#, Kotlin, Scala) Strong grasp of REST APIs, microservices, and containerisation (Docker/Kubernetes) Agile mindset with strong software engineering fundamentals Passion for innovation, learning, and leading by example Familiarity with DevOps tools
to their growth and development Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively Essential Skills & Experience Extensive hands-on experience with programming languages such as Python, Scala, Spark, and SQL Strong background in building and maintaining data pipelines and infrastructure In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP) Familiarity with
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and
GitHub proficiency Strong organizational, analytical, problem-solving, and communication skills Comfort working with remote teams and distributed delivery models Additional skills that are a plus: Programming languages such as Scala, Rust, Go, Angular, React, Kotlin Database management with PostgreSQL Experience with ElasticSearch, observability tools like Grafana and Prometheus What this role can offer Opportunity to deepen understanding of AI and
Farnborough, Hampshire, South East, United Kingdom
Peregrine
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity
these requirements. In order to secure one of these Senior Data Engineer roles you must be able to demonstrate the following experience: Experience in prominent languages such as Python, Scala, Spark, SQL. Good experience using Databricks. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB etc. Experience with the design, build and maintenance
such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or SQL. Familiarity with containerization technologies (Docker) and orchestration tools (Kubernetes). Strong knowledge of software development principles, including object-oriented design, design patterns, and clean code practices. Excellent
Azure Event Hubs) Solid understanding of SQL, data modelling, and lakehouse architecture Experience deploying via CI/CD tools (e.g., Azure DevOps, GitHub Actions) Nice to Have: Knowledge of Scala/Java Understanding of GDPR and handling sensitive data This is a contract role (UK-based) offering the chance to work on high-impact projects shaping the future of finance
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Snowflake, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, Junit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles including a Data Engineer position to
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
re solving for reliability, compliance, performance, and speed - at once. You'll be key to making it work. Required Skills: Knowledge of one or more programming languages (Java/Scala, TypeScript, Python). Validated experience operating distributed systems at scale in production. Cloud AWS (primary), Kubernetes (future), Docker (current), Terraform. Excellent debugging skills across network, systems, and data stack. Observability
security). Working knowledge of AWS core services, including S3, EC2/EMR, IAM, Athena, Glue or Redshift. Hands-on experience with Databricks Spark on large datasets, using PySpark, Scala, or SQL. Familiarity with Delta Lake, Unity Catalog or similar data lakehouse technologies. Proficient in Linux environments, including experience with shell scripting, basic system operations, and navigating file systems. Deep
project deadlines. Strong collaborative spirit, working seamlessly with team members and external clients. Proficiency in R or Python. Solid understanding of SQL; experience working with Spark (Java, Python, or Scala variants) and cloud platforms like Databricks is a plus. Strong statistical knowledge, including hypothesis testing, confidence intervals, and A/B testing. Ability to understand and communicate the commercial impact
architecture, integration tools & approaches AI methods & techniques and the challenges faced in establishing effective delivery of value across the business Some hands-on coding experience with SQL, Python or Scala would be advantageous but not compulsory Relevant experience in Data Platform Technologies (knowledge of any or all including Snowflake, Databricks, Microsoft Fabric would be beneficial) Previous consulting experience would be
d love to hear from you. Experience working with data: wrangling, cleaning, and transforming datasets for analysis. Familiarity with tools such as Alteryx, DataRobot, SAS, Databricks, SPSS, R, Python, Scala, Java, or Spark. Exposure to data visualisation platforms like Tableau or Power BI. Understanding of machine learning concepts and algorithms (e.g. classification, clustering, regression). Interest in or experience with
your development journey! Key skills and experience: Experience with automated testing frameworks like Playwright Experience with profiling tools and load testing Commercial experience with Java or JVM languages (Groovy, Scala, Kotlin) API Engineering expertise Relational database knowledge, e.g., PostgreSQL DevOps skills: CI/CD, Docker, Git Additional skills that are a plus: Understanding of Software Engineering Principles: SOLID, design patterns
experience: Experience with automated testing frameworks such as Playwright Experience with profiling tools and load testing Commercial experience with a core programming language like Java or JVM languages (Groovy, Scala, Kotlin) Expertise in API Engineering Experience with relational databases like PostgreSQL DevOps: CI/CD, Docker, Git The following skills and technologies are a plus: Understanding of Software Engineering Principles
and BI Teams. Skills & Experience: * Proficiency in Azure tools (Data Factory, Databricks, Synapse, etc.). * Strong SQL and experience with data warehousing (Kimball methodology). * Programming skills in Python, Scala, or PySpark. * Familiarity with Power BI, SharePoint, and data integration technologies. * Understanding of DevOps, CI/CD, and cloud concepts (IaaS, PaaS, SaaS). * PowerShell scripting, containerisation, Semarchy MDM, NEC
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
in designing and implementing data warehousing solutions using Snowflake and AWS. The ideal candidate must have: Strong experience as an AWS Data Engineer Software Engineering background Coding languages: Java, Scala, C# or Python Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB
tools and services (preferably Azure) Required Skills & Experience: Proven experience as a Data Engineer on cloud-based projects Strong hands-on skills with Databricks, Apache Spark, and Python or Scala Proficient in SQL and working with large-scale data environments Experience with Delta Lake, Azure Data Lake, or similar technologies Familiarity with version control, CI/CD, and infrastructure-as-code