write clean, scalable, robust code using Python or similar programming languages. Background in software engineering a plus. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks …
Strong understanding of DevOps methodologies and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g. Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal, and presentation skills. Desired …
GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing …
Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross and multi-platform experience. Team building and leading. You must be: Willing to work on client sites, potentially for extended periods. Willing to travel for work …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
security, compliance, and data governance standards (e.g. GDPR, RBAC) Mentor junior engineers and guide technical decisions on client engagements ✅ Ideal Experience Strong in Python and SQL, plus familiarity with Scala or Java Experience supporting AI/ML workflows and working with Data Scientists Exposure to cloud platforms: AWS, Azure, or GCP Hands-on with modern data tooling: Spark, Databricks, Snowflake …
degree or higher in an applicable field such as Computer Science, Statistics, Maths or similar Science or Engineering discipline Strong Python and other programming skills (Java and/or Scala desirable) Strong SQL background Some exposure to big data technologies (Hadoop, Spark, Presto, etc.) NICE TO HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and …
are looking for data engineers who have a variety of different skills which include some of the below. Strong proficiency in at least one programming language (Python, Java, or Scala) Extensive experience with cloud platforms (AWS, GCP, or Azure) Experience with: Data warehousing and lake architectures ETL/ELT pipeline development SQL and NoSQL databases Distributed computing frameworks (Spark, Kinesis …
technical skills and knowledge Proficiency in Programming Languages: Strong proficiency in Python is essential, along with experience in Bash/Shell scripting. Familiarity with additional languages such as Java, Scala, R, or Go is a plus. Understanding of Machine Learning Fundamentals: A solid understanding of machine learning concepts, including algorithms, data pre-processing, model evaluation, and training. Familiarity with ML …
Mandatory: Proficient in either GCP (Google) or AWS cloud. Hands-on experience in designing and building data pipelines using Hadoop and Spark technologies. Proficient in programming languages such as Scala, Java, or Python. Experienced in designing, building, and maintaining scalable data pipelines and applications. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of Infrastructure as Code tools.
to their growth and development Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively Essential Skills & Experience Extensive hands-on experience with programming languages and frameworks such as Python, Scala, Spark, and SQL Strong background in building and maintaining data pipelines and infrastructure In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP) Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage, Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and …
Farnborough, Hampshire, South East, United Kingdom
Peregrine
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity …
data platform that powers smarter decisions, better insights, and streamlined operations. Key skills and responsibilities: * Proven experience in data engineering and data platform development * Strong programming skills in Python, Java, Scala, or similar * Advanced SQL and deep knowledge of relational databases * Hands-on experience with ETL tools and building robust data pipelines * Familiarity with data science, AI/ML integration, and …
Leeds, England, United Kingdom Hybrid / WFH Options
Anson McCade
review standards and agile delivery. Contributing to technical strategy and mentoring other engineers. What You’ll Need: Strong experience in data engineering with expertise in languages and frameworks such as Python, Scala, or Spark. Proficiency in designing and building data pipelines, working with both structured and unstructured data. Experience with cloud platforms (AWS, Azure, or GCP), using native services for data workloads.
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar Strong programming skills in languages such as SQL, Python, Go or Scala Demonstrable understanding and effective use of AI tooling in your development process A growth mindset and eagerness to work in a fast-paced, mission-driven environment Good …
of emerging technology and trends. Provides out-of-hours support for applications to ensure the shop stays open and fully functional. Essential knowledge and skills Proficient in Python or Scala Familiarity with Java Experience in a Marketing technical stack and 3rd party tools Broad experience of working within AWS, including infrastructure (VPC, EC2, security groups, S3, etc.) to AWS data …
Azure Event Hubs) Solid understanding of SQL, data modelling, and lakehouse architecture Experience deploying via CI/CD tools (e.g., Azure DevOps, GitHub Actions) Nice to Have: Knowledge of Scala/Java Understanding of GDPR and handling sensitive data This is a contract role (UK-based) offering the chance to work on high-impact projects shaping the future of finance …
City of London, London, United Kingdom Hybrid / WFH Options
Eden Smith Group
as Actuarial and Finance, to help with their data presentation requirements and help deliver data for visualisation solutions, such as Power BI. Key skills required: Expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Strong knowledge of Power BI. Strong SQL experience. Familiarity with technical data structures, data …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles including a Data Engineer position to …
and advise on technical implications. Contribute to technical proposals and support pre-sales activities. Essential Skills & Experience: Proven leadership in engineering teams delivering data-intensive systems. Proficiency in Java, Scala, or Python. Experience with platforms like Informatica, Azure Databricks, or other ETL tools. Strong SQL expertise and understanding of distributed data systems. Ability to communicate complex technical designs clearly.
platforms, preferably with GCP expertise. Deep knowledge of cloud architecture, data engineering, pipelines, and big data technologies (e.g., BigQuery, Dataflow, Pub/Sub). Proficiency in Python, Java, or Scala; familiarity with microservices, Docker, Kubernetes, CI/CD tools (e.g., Jenkins, GitLab CI), and cloud monitoring. Proven experience in digital transformation and Agile environments. Preferred understanding of banking risk management …