GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing …
security, compliance, and data governance standards (e.g. GDPR, RBAC) Mentor junior engineers and guide technical decisions on client engagements ✅ Ideal Experience Strong in Python and SQL, plus familiarity with Scala or Java Experience supporting AI/ML workflows and working with Data Scientists Exposure to cloud platforms: AWS, Azure, or GCP Hands-on with modern data tooling: Spark, Databricks, Snowflake …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
complex real-world problems Drive innovation in fraud analytics, data engineering, and AI applications What You Bring Proven experience in Java or a similar OOP stack (e.g., C#, Kotlin, Scala) Strong grasp of REST APIs, microservices, and containerisation (Docker/Kubernetes) Agile mindset with strong software engineering fundamentals Passion for innovation, learning, and leading by example Familiarity with DevOps tools …
to their growth and development Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively Essential Skills & Experience Extensive hands-on experience with programming languages such as Python, Scala, Spark, and SQL Strong background in building and maintaining data pipelines and infrastructure In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP) Familiarity with …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and …
Leeds, England, United Kingdom Hybrid / WFH Options
Anson McCade
review standards and agile delivery. Contributing to technical strategy and mentoring other engineers. What You’ll Need: Strong experience in data engineering with expertise in languages such as Python, Scala, or Spark. Proficiency in designing and building data pipelines, working with both structured and unstructured data. Experience with cloud platforms (AWS, Azure, or GCP), using native services for data workloads. …
as Actuarial and Finance, to help with their data presentation requirements and help deliver data for visualisation solutions, such as Power BI Key skills required: Expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Strong knowledge of Power BI Strong SQL experience. Familiarity with technical data structures, data …
City of London, London, United Kingdom Hybrid / WFH Options
Eden Smith Group
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Snowflake, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, Junit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile, Data Engineer, SQL Data Engineer) We have several fantastic new roles including a Data Engineer position to …
solving and logical thinking skills. Ability to work independently in a remote environment. Preferred Qualifications: Exposure to tools like Apache Airflow, Spark, or Kafka. Basic experience with Python or Scala for scripting and automation. Knowledge of cloud platforms like AWS, Azure, or GCP. Previous academic or personal projects related to data pipeline development or big data handling. Internship Details: Duration …
of performance. Your Experience Must have: 4+ years of industry experience in applied ML with recent experience in AI/LLM systems Strong proficiency with Python, and familiarity with Scala, Go, or Rust Cloud platform expertise with AWS, GCP, or Azure, including AI-specific services (SageMaker, Vertex AI, Azure AI) Databricks platform experience with Unity Catalog, MLflow, and Databricks Machine …
be helpful in this role. While SQL is a language everybody on the team knows very well, Snowflake is fast evolving to allow native integrations with Python, Java, and Scala via Snowpark; if you have experience with Spark you will definitely be able to put these skills to good use. DataOps or DevOps for data teams is getting ever more …
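As an illustrative aside on the Spark-to-Snowpark point made in the listing above: the sketch below (not taken from any listing) shows how a Snowpark Python query mirrors the familiar PySpark DataFrame style. The connection parameters, table, and column names are hypothetical placeholders; a reachable Snowflake account is assumed.

```python
# Minimal sketch: a Snowpark query written in the same style as PySpark DataFrame code.
# All identifiers below (ORDERS, AMOUNT, REGION, connection values) are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account>",      # placeholder values, replace with real credentials
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Roughly equivalent PySpark:
#   spark.table("ORDERS").filter(col("AMOUNT") > 100)
#        .groupBy("REGION").agg(F.sum("AMOUNT").alias("TOTAL_AMOUNT"))
orders = session.table("ORDERS")
totals = (
    orders.filter(col("AMOUNT") > 100)          # lazy, pushed down to Snowflake
          .group_by("REGION")
          .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
totals.show()

session.close()
```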
education) in a STEM discipline. Proven experience in software engineering and development, and a strong understanding of computer systems and how they operate. Hands-on experience in Java, Spark, Scala (or Java) Production-scale hands-on experience writing data pipelines using Spark or any other distributed real-time/batch processing framework. Strong skill set in SQL/Databases Strong …
governance. Strong expertise in balancing trade-offs within complex distributed systems, focusing on data quality, performance, reliability, availability, and security. Proficient in software engineering with modern languages (e.g. Python, Scala, Java), applying best practices to create maintainable, scalable, and robust code. A continuous learner, up-to-date with the latest technology trends, with the ability to assess new technologies pragmatically …
Azure DevOps). Excellent communication and stakeholder management skills. Bonus Points for: Previous experience in a consultancy environment. Hands-on coding experience in additional languages like Python, Ruby, Scala, PHP, or C++. Knowledge of performance testing tools like JMeter, Gatling, K6 or Neoload. What's in It for You? At Ten10, we believe in recognizing and rewarding great …
Purview or equivalent for data governance and lineage tracking Experience with data integration, MDM, governance, and data quality tools. Hands-on experience with Apache Spark, Python, SQL, and Scala for data processing. Strong understanding of Azure networking, security, and IAM, including Azure Private Link, VNETs, Managed Identities, and RBAC. Deep knowledge of enterprise-scale data architecture patterns, including …
champion a culture of innovation and challenge. We have over 300 tech experts across our teams all using the latest tools and technologies including Docker, Kubernetes, AWS, Kafka, Java, Scala, Python, iOS, Android, .NET Core, Swift, Kotlin, Node.js and MongoDB. There's something for everyone. We're a place of opportunity. You'll have the tools and autonomy to drive your own career …
systems Experience with full implementation of a Generative AI application is highly desirable Strong expertise in Generative AI, LLMs, or related NLP technologies Proficiency in Python and/or Scala; experience with ML libraries such as TensorFlow, PyTorch, HuggingFace, or scikit-learn Experience with Databricks, distributed data systems (e.g., Spark, Hadoop), and cloud platforms (AWS, GCP, or Azure) Ability to …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
in designing and implementing data warehousing solutions using Snowflake and AWS. The ideal candidate must have: Strong experience as an AWS Data Engineer Software Engineering background Coding languages Java, Scala, C# or Python Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB …
to life. We foster a culture of innovation and challenge. Our teams of over 300 tech experts use the latest tools and technologies including Docker, Kubernetes, AWS, Kafka, Java, Scala, Python, iOS, Android, .NET Core, Swift, Kotlin, Node.js, and MongoDB. There's something for everyone. We offer opportunities for growth. You'll have the tools and autonomy to develop your …
MLlib for machine learning tasks Strong understanding of predictive modeling techniques (e.g., regression, classification, clustering) Experience with distributed systems like Hadoop for data storage and processing Proficiency in Python, Scala, or Java for ML development Familiarity with data preprocessing techniques and feature engineering Knowledge of model evaluation metrics and techniques Experience with deploying ML models in production environments Permanent or …