City of Westminster, London, United Kingdom Hybrid / WFH Options
nudge Global Ltd
focus on data mining and big data processing. Strong knowledge of open banking frameworks, protocols, and APIs (e.g. Open Banking UK, PSD2). Proficiency in programming languages such as Python, Scala, or Java. Experience with cloud data platforms such as GCP (BigQuery, Dataflow) or Azure (Data Factory, Synapse). Expert in SQL, MongoDB, and distributed data systems such as Spark, Databricks, or …
technical skills, knowledge and professional qualifications provided: Databases – a deep understanding of relational databases. Use of DevOps methods and work in an Agile environment. Programming – expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Expertise in SQL. AI and machine learning knowledge as it applies to end …
Strong understanding of DevOps methodologies and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g. Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal, and presentation skills. Desired …
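Several requirements that recur across these listings (a scripting language, SQL, ETL/ELT) come together in even a minimal pipeline. A small illustrative sketch only, not tied to any advertised role, using Python's built-in sqlite3 as a stand-in warehouse; the table and column names are invented:

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system
raw_rows = [
    ("2024-01-03", "GBP", "1250.50"),
    ("2024-01-03", "gbp", "99.50"),
    ("2024-01-04", "USD", "310.00"),
]

# Transform: normalise currency codes and cast amounts to numbers
clean_rows = [(day, cur.upper(), float(amt)) for day, cur, amt in raw_rows]

# Load: into a relational store (in-memory SQLite stands in for a warehouse)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (day TEXT, currency TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", clean_rows)

# A downstream SQL aggregation of the kind these roles describe
totals = dict(conn.execute(
    "SELECT currency, SUM(amount) FROM payments GROUP BY currency ORDER BY currency"
))
print(totals)  # {'GBP': 1350.0, 'USD': 310.0}
```

The same extract/transform/load shape scales up to the Spark, Databricks, or BigQuery stacks named above; only the engine changes.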
GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
security, compliance, and data governance standards (e.g. GDPR, RBAC). Mentor junior engineers and guide technical decisions on client engagements. ✅ Ideal Experience: Strong in Python and SQL, plus familiarity with Scala or Java. Experience supporting AI/ML workflows and working with Data Scientists. Exposure to cloud platforms: AWS, Azure, or GCP. Hands-on with modern data tooling: Spark, Databricks, Snowflake …
are looking for data engineers who have a variety of different skills, including some of the below: Strong proficiency in at least one programming language (Python, Java, or Scala). Extensive experience with cloud platforms (AWS, GCP, or Azure). Experience with: data warehousing and lake architectures; ETL/ELT pipeline development; SQL and NoSQL databases; distributed computing frameworks (Spark, Kinesis …
technical skills and knowledge: Proficiency in Programming Languages: Strong proficiency in Python is essential, along with experience in Bash/Shell scripting. Familiarity with additional languages such as Java, Scala, R, or Go is a plus. Understanding of Machine Learning Fundamentals: A solid understanding of machine learning concepts, including algorithms, data pre-processing, model evaluation, and training. Familiarity with ML …
Mandatory: Proficient in either GCP (Google) or AWS cloud. Hands-on experience in designing and building data pipelines using Hadoop and Spark technologies. Proficient in programming languages such as Scala, Java, or Python. Experienced in designing, building, and maintaining scalable data pipelines and applications. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of Infrastructure as Code tools.
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage, Azure SQL Database. Solid understanding of data modeling, ETL/ELT, and warehousing concepts. Proficiency in SQL and one or more programming languages (e.g., Python, Scala). Exposure to Microsoft Fabric, or a strong willingness to learn. Experience using version control tools like Git and knowledge of CI/CD pipelines. Familiarity with software testing methodologies and …
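The testing and CI/CD requirements in listings like this one are easiest to meet when transformations are kept as pure functions that a pipeline triggered from Git can check on every push. A hedged sketch; the function, record shape, and field names are all invented for the example:

```python
def deduplicate_latest(records):
    """Keep only the newest record per key, a common ELT cleaning step.

    `records` is an iterable of (key, timestamp, value) tuples.
    """
    latest = {}
    for key, ts, value in records:
        # Retain the record with the highest timestamp seen for this key
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, value)
    return {k: v for k, (ts, v) in latest.items()}

# A unit test of the kind a CI/CD pipeline would run before deployment
rows = [("a", 1, "old"), ("a", 2, "new"), ("b", 1, "only")]
assert deduplicate_latest(rows) == {"a": "new", "b": "only"}
```

Because the function touches no database or cloud service, the same test runs identically on a laptop and in a CI runner.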
data platform that powers smarter decisions, better insights, and streamlined operations. Key skills and responsibilities: * Proven experience in data engineering and data platform development * Strong programming skills in Python, Java, Scala, or similar * Advanced SQL and deep knowledge of relational databases * Hands-on experience with ETL tools and building robust data pipelines * Familiarity with data science, AI/ML integration, and …
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go, or Scala. Demonstrable, effective use of AI tooling in your development process. A growth mindset and eagerness to work in a fast-paced, mission-driven environment. Good …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile) We have several fantastic new roles, including a Data Engineer position to …
in motivated teams, collaborating effectively and taking pride in your work. Strong problem-solving skills, viewing technology as a means to solve challenges. Proficiency in a programming language (e.g., Scala, Python, Java, C#) with understanding of domain modelling and application development. Knowledge of data management platforms (SQL, NoSQL, Spark/Databricks). Experience with modern engineering tools (Git, CI/…
broad range of problems using your technical skills. Demonstrable experience of utilising strong communication and stakeholder management skills when engaging with customers. Significant experience of coding in Python, and Scala or Java. Experience with big data processing tools such as Hadoop or Spark. Cloud experience, GCP specifically in this case, including services such as Cloud Run, Cloud Functions, BigQuery, GCS …
years of software development with a minimum of 2 years of Ops or DevOps experience in trading environments, with expertise in large-scale distributed systems; recent work in Java, Go, or Scala is preferred. Hands-on experience with container orchestration (Kubernetes, Docker, etc.) and cloud infrastructure, especially AWS; familiarity with Infrastructure-as-Code tools like Terraform or CloudFormation. Strong CI/CD …
platforms (Azure) and data engineering best practices. Advanced proficiency in Power BI, including DAX, Power Query, and data modeling. Strong programming skills in Python, SQL, and/or Scala for data processing and automation. Experience with ETL/ELT, data warehousing, and event-driven architectures. Knowledge of AI/ML applications in data analytics and business intelligence. Proven …
Experience with at least one Cloud provider (AWS, Azure) and traditional Data Platforms. Knowledge of high-load and high-throughput Data Platform architectures. Hands-on experience with Python/Scala/Java and SQL development. Experience with continuous delivery tools and technologies. Knowledge of Serverless Designs, MPP Architectures, Containers, and Resource Management systems. Ability to research, compare, and select appropriate …
web/mobile applications or platforms with either the Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python, and Golang will be a plus. Ability to read/write code and expertise with various design patterns. Have used NoSQL databases such as MongoDB, Cassandra, etc. Work on …
activities. We are looking for experience in the following skills: Relevant work experience in data science, machine learning, and business analytics. Practical experience in coding languages, e.g. Python, R, Scala, etc. (Python preferred). Strong proficiency in database technologies, e.g. SQL, ETL, NoSQL, DW, and big data technologies, e.g. pySpark, Hive, etc. Experienced working with structured and unstructured data, e.g. …
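The structured/unstructured distinction in the listing above can be made concrete without a cluster: the map-then-aggregate pattern that pySpark applies at scale works on a plain Python list. An illustrative sketch only; the log format is invented:

```python
from collections import Counter

# Unstructured input: free-text log lines
logs = [
    "2024-01-03 ERROR payment gateway timeout",
    "2024-01-03 INFO user login ok",
    "2024-01-04 ERROR payment gateway timeout",
]

# Map: parse each line into a structured (date, level) record
records = [tuple(line.split(" ", 2)[:2]) for line in logs]

# Reduce: count by level, the same shape as a Spark map/reduceByKey job
by_level = Counter(level for _date, level in records)
print(by_level["ERROR"])  # 2
```

In pySpark the list would become an RDD or DataFrame and the `Counter` a `groupBy().count()`, but the parsing-then-aggregation logic is identical.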
education) in a STEM discipline. Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate. Hands-on experience in Java, Spark, and Scala (or Java). Production-scale hands-on experience writing data pipelines using Spark or any other distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong …
team from scratch. The Head of Data Engineering will require the following experience: A quantitative degree such as Maths, Physics, Computer Science, Engineering, etc. Software development experience in Python or Scala. An understanding of big data technologies such as Spark, messaging services like Kafka or RabbitMQ, and workflow management tools like Airflow. SQL & NoSQL expertise, ideally including Postgres, Redis, MongoDB, etc. …
metrics, and error handling to support monitoring and incident response. Align implementations with InfoSum's privacy, security, and compliance practices. Required Skills and Experience: Proven experience with Apache Spark (Scala, Java, or PySpark), including performance optimization and advanced tuning techniques. Strong troubleshooting skills in production Spark environments, including diagnosing memory usage, shuffles, skew, and executor behavior. Experience deploying and managing …
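On the skew point in the last listing: a first diagnostic is simply the per-key record distribution of a shuffle stage, since one hot key ballooning past the mean is what stalls an executor. A minimal sketch of that check, with invented sample data, of the kind one runs before repartitioning or salting a hot key:

```python
from collections import Counter

# Hypothetical key sample drawn from one stage's shuffle input
keys = ["uk"] * 9000 + ["fr"] * 500 + ["de"] * 500

counts = Counter(keys)
mean = sum(counts.values()) / len(counts)
skew_ratio = max(counts.values()) / mean  # values much greater than 1 signal a hot key

print(round(skew_ratio, 1))  # 2.7
```

In a real Spark job the counts would come from `df.groupBy(key).count()` rather than a Python list, but the ratio is read the same way.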