services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java). Demonstrated experience owning complex technical systems end-to-end, from design through production. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
complex real-world problems. Drive innovation in fraud analytics, data engineering, and AI applications. What You Bring: Proven experience in Java or a similar OOP stack (e.g., C#, Kotlin, Scala). Strong grasp of REST APIs, microservices, and containerisation (Docker/Kubernetes). Agile mindset with strong software engineering fundamentals. Passion for innovation, learning, and leading by example. Familiarity with DevOps tools.
to their growth and development. Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively. Essential Skills & Experience: Extensive hands-on experience with programming languages and frameworks such as Python, Scala, Spark, and SQL. Strong background in building and maintaining data pipelines and infrastructure. In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP). Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage, Azure SQL Database. Solid understanding of data modeling, ETL/ELT, and warehousing concepts. Proficiency in SQL and one or more programming languages (e.g., Python, Scala). Exposure to Microsoft Fabric, or a strong willingness to learn. Experience using version control tools like Git and knowledge of CI/CD pipelines. Familiarity with software testing methodologies and …
GitHub proficiency. Strong organizational, analytical, problem-solving, and communication skills. Comfort working with remote teams and distributed delivery models. Additional skills that are a plus: languages and frameworks such as Scala, Rust, Go, Angular, React, or Kotlin; database management with PostgreSQL; experience with Elasticsearch and observability tools like Grafana and Prometheus. What this role can offer: opportunity to deepen understanding of AI and …
Farnborough, Hampshire, South East, United Kingdom
Peregrine
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity …
Mandatory: Proficient in either GCP (Google Cloud) or AWS. Hands-on experience in designing and building data pipelines using Hadoop and Spark technologies. Proficient in programming languages such as Scala, Java, or Python. Experienced in designing, building, and maintaining scalable data pipelines and applications. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of Infrastructure as Code tools.
data platform that powers smarter decisions, better insights, and streamlined operations. Key skills and responsibilities: * Proven experience in data engineering and data platform development * Strong programming skills in Python, Java, Scala, or similar * Advanced SQL and deep knowledge of relational databases * Hands-on experience with ETL tools and building robust data pipelines * Familiarity with data science, AI/ML integration, and …
Leeds, England, United Kingdom Hybrid / WFH Options
Anson McCade
review standards and agile delivery. Contributing to technical strategy and mentoring other engineers. What You’ll Need: Strong experience in data engineering with expertise in languages and frameworks such as Python, Scala, or Spark. Proficiency in designing and building data pipelines, working with both structured and unstructured data. Experience with cloud platforms (AWS, Azure, or GCP), using native services for data workloads.
of emerging technology and trends. Provides out-of-hours support for applications to ensure the shop stays open and fully functional. Essential knowledge and skills: Proficient in Python or Scala. Familiarity with Java. Experience with a marketing technology stack and third-party tools. Broad experience of working within AWS, from infrastructure (VPC, EC2, security groups, S3, etc.) to AWS data …
such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or SQL. Familiarity with containerization technologies (Docker) and orchestration tools (Kubernetes). Strong knowledge of software development principles, including object-oriented design, design patterns, and clean code practices. Excellent …
Azure Event Hubs). Solid understanding of SQL, data modelling, and lakehouse architecture. Experience deploying via CI/CD tools (e.g., Azure DevOps, GitHub Actions). Nice to have: Knowledge of Scala/Java. Understanding of GDPR and handling sensitive data. This is a contract role (UK-based) offering the chance to work on high-impact projects shaping the future of finance.
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile) We have several fantastic new roles including a Data Engineer position to …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
re solving for reliability, compliance, performance, and speed - at once. You'll be key to making it work. Required Skills: Knowledge of one or more programming languages (Java/Scala, TypeScript, Python). Validated experience operating distributed systems at scale in production. Cloud: AWS (primary), Kubernetes (future), Docker (current), Terraform. Excellent debugging skills across the network, systems, and data stack. Observability …
platforms, preferably with GCP expertise. Deep knowledge of cloud architecture, data engineering, pipelines, and big data technologies (e.g., BigQuery, Dataflow, Pub/Sub). Proficiency in Python, Java, or Scala; familiarity with microservices, Docker, Kubernetes, CI/CD tools (e.g., Jenkins, GitLab CI), and cloud monitoring. Proven experience in digital transformation and Agile environments. Preferred understanding of banking risk management.
project deadlines. Strong collaborative spirit, working seamlessly with team members and external clients. Proficiency in R or Python. Solid understanding of SQL; experience working with Spark (Java, Python, or Scala variants) and cloud platforms like Databricks is a plus. Strong statistical knowledge, including hypothesis testing, confidence intervals, and A/B testing. Ability to understand and communicate the commercial impact …
platforms (Azure) and data engineering best practices. Advanced proficiency in Power BI, including DAX, Power Query, and data modeling. Strong programming skills in Python, SQL, and/or Scala for data processing and automation. Experience with ETL/ELT, data warehousing, and event-driven architectures. Knowledge of AI/ML applications in data analytics and business intelligence. Proven …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Searchability
as a Data Engineer, with Python & SQL expertise. Familiarity with AWS services (or equivalent cloud platforms). Experience with large-scale datasets and ETL pipeline development. Knowledge of Apache Spark (Scala or Python) beneficial. Understanding of agile development practices, CI/CD, and automated testing. Strong problem-solving and analytical skills. Positive team player with excellent communication abilities. TO BE CONSIDERED …
architecture, integration tools & approaches. AI methods & techniques and the challenges faced in establishing effective delivery of value across the business. Some hands-on coding experience with SQL, Python, or Scala would be advantageous but not compulsory. Relevant experience in data platform technologies (knowledge of any or all of Snowflake, Databricks, and Microsoft Fabric would be beneficial). Previous consulting experience would be …
d love to hear from you. Experience working with data: wrangling, cleaning, and transforming datasets for analysis. Familiarity with tools such as Alteryx, DataRobot, SAS, Databricks, SPSS, R, Python, Scala, Java, or Spark. Exposure to data visualisation platforms like Tableau or Power BI. Understanding of machine learning concepts and algorithms (e.g., classification, clustering, regression). Interest in or experience with …
experience: Experience with automated testing frameworks such as Playwright. Experience with profiling tools and load testing. Commercial experience with a core programming language like Java or other JVM languages (Groovy, Scala, Kotlin). Expertise in API engineering. Experience with relational databases like PostgreSQL. DevOps: CI/CD, Docker, Git. The following skills and technologies are a plus: Understanding of software engineering principles.