frameworks and practices. Understanding of machine learning workflows and how to support them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks …
write clean, scalable, robust code using Python or similar programming languages. Background in software engineering is a plus. …
Strong understanding of DevOps methodologies and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g., Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal, and presentation skills. Desired …
GeoServer). Technical Skills: Expertise in big data frameworks and technologies (e.g., Hadoop, Spark, Kafka, Flink) for processing large datasets. Proficiency in programming languages such as Python, Java, or Scala, with a focus on big data frameworks and APIs. Experience with cloud services and technologies (AWS, Azure, GCP) for big data processing and platform deployment. Strong knowledge of data warehousing …
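Many of these roles centre on Spark-based batch processing. As a purely illustrative sketch of that kind of work — not taken from any listing; the input path and column names are hypothetical, and a local PySpark installation is assumed:

```python
# Minimal PySpark sketch: read raw events, aggregate per user per day,
# write partitioned Parquet. Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

# Schema is inferred here for brevity; a production job would declare
# an explicit schema for stability.
events = spark.read.json("data/events/*.json")

daily = (
    events
    .withColumn("event_date", F.to_date("event_time"))
    .groupBy("user_id", "event_date")
    .agg(
        F.count("*").alias("event_count"),
        F.max("event_time").alias("last_seen"),
    )
)

# Partitioning the output by date is a common choice for efficient
# downstream warehouse loads.
daily.write.mode("overwrite").partitionBy("event_date").parquet("out/daily_user_events/")

spark.stop()
```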
Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, DataStage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross- and multi-platform experience. Team building and leading. You must be: Willing to work on client sites, potentially for extended periods. Willing to travel for work …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
security, compliance, and data governance standards (e.g. GDPR, RBAC). Mentor junior engineers and guide technical decisions on client engagements. ✅ Ideal Experience: Strong in Python and SQL, plus familiarity with Scala or Java. Experience supporting AI/ML workflows and working with Data Scientists. Exposure to cloud platforms: AWS, Azure, or GCP. Hands-on with modern data tooling: Spark, Databricks, Snowflake …
degree or higher in an applicable field such as Computer Science, Statistics, Maths or a similar Science or Engineering discipline. Strong Python and other programming skills (Java and/or Scala desirable). Strong SQL background. Some exposure to big data technologies (Hadoop, Spark, Presto, etc.). NICE TO HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and …
Mandatory: Proficient in either GCP (Google) or AWS cloud. Hands-on experience in designing and building data pipelines using Hadoop and Spark technologies. Proficient in programming languages such as Scala, Java, or Python. Experienced in designing, building, and maintaining scalable data pipelines and applications. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of Infrastructure as Code tools. …
and Capital Markets is a plus. Experience with Big Data technologies (e.g., Kafka, Apache Spark, NoSQL). Knowledge of BI tools like Power BI, MicroStrategy, etc. Exposure to Python and Scala. Exposure to the Salesforce ecosystem. About S&P Global Ratings: At S&P Global Ratings, our analyst-driven credit ratings, research, and sustainable finance opinions provide critical insights that are essential …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage. Azure SQL Database. Solid understanding of data modeling, ETL/ELT, and warehousing concepts. Proficiency in SQL and one or more programming languages (e.g., Python, Scala). Exposure to Microsoft Fabric, or a strong willingness to learn. Experience using version control tools like Git and knowledge of CI/CD pipelines. Familiarity with software testing methodologies and …
define requirements and deliver innovative solutions. Support internal capability development by sharing your expertise and experience. What We're Looking For: Strong hands-on experience with Python, Java, or Scala. Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow). Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques. Experience building CI …
serving layers. Experience with data lakehouse architecture, schema design, and GDPR-compliant solutions. Working knowledge of DevOps tools and CI/CD processes. Bonus Points For: Development experience in Scala or Java. Familiarity with Cloudera, Hadoop, Hive, and the Spark ecosystem. Understanding of data privacy regulations, including GDPR, and experience working with sensitive data. Ability to learn and adapt to new technologies …
collaborating with stakeholders to define optimal solutions - Strong SQL skills with experience across relational and non-relational databases - Proficiency in a modern programming language such as Python, Java or Scala - Hands-on experience using dbt for pipeline development and transformation - Familiarity with cloud platforms such as AWS, Azure or GCP - Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory …
large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable, effective use of AI tooling in your development process. A growth mindset and eagerness to work in a fast-paced, mission-driven environment. Good …
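Orchestration tools like Airflow come up repeatedly in these requirements. For illustration only, a minimal Airflow DAG sketch (Airflow 2.4+ assumed; task names and the stubbed extract/transform/load logic are hypothetical):

```python
# Minimal Airflow DAG sketch: a daily extract -> transform -> load chain.
# Task bodies are stubs; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Pull the day's records from a source system (stubbed).
    print("extracting for", context["ds"])

def transform(**context):
    # Clean and reshape the extracted data (stubbed).
    print("transforming for", context["ds"])

def load(**context):
    # Load the transformed data into the warehouse (stubbed).
    print("loading for", context["ds"])

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule" requires Airflow 2.4+
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3
```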
City of London, London, United Kingdom Hybrid / WFH Options
Eden Smith Group
as Actuarial and Finance, to help with their data presentation requirements and help deliver data for visualisation solutions, such as Power BI. Key skills required: Expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Strong knowledge of Power BI. Strong SQL experience. Familiarity with technical data structures, data …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Investment Banking - London/Hybrid (Data Engineer, SQL Data Engineer, Java, Python, Spark, Scala, SQL, Snowflake, OO programming, Databricks, Data Fabric, design patterns, SOLID principles, ETL, Unit testing, NUnit, MSTest, JUnit, Microservices Architecture, Continuous Integration, Azure DevOps, AWS, Jenkins, Agile) We have several fantastic new roles including a Data Engineer position to …
and working in a fast-paced and ever-changing environment. Ideally, you are also experienced with at least one programming language such as Java, C++, Spark/Scala, Python, etc. Major Responsibilities: - Work with a team of product and program managers, engineering leaders, and business leaders to build data architectures and platforms to support the business - Design, develop, and …
in motivated teams, collaborating effectively and taking pride in your work. Strong problem-solving skills, viewing technology as a means to solve challenges. Proficiency in a programming language (e.g., Scala, Python, Java, C#) with understanding of domain modelling and application development. Knowledge of data management platforms (SQL, NoSQL, Spark/Databricks). Experience with modern engineering tools (Git, CI/…
Wandsworth, Greater London, UK Hybrid / WFH Options
Datatech
control, task tracking). Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the development of data ingest and transformation routines and services using modern, cloud-based approaches and technologies. Understanding of the principles of data …
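The ETL scripting this listing describes can be illustrated with a minimal, standard-library-only Python sketch; the file, table, and column names below are hypothetical:

```python
# Minimal ETL sketch in plain Python: CSV extract -> clean -> SQLite load.
# File name, table, and columns are hypothetical.
import csv
import sqlite3

def run_etl(csv_path: str = "orders.csv", db_path: str = "warehouse.db") -> None:
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS orders (
               order_id TEXT PRIMARY KEY,
               amount   REAL,
               country  TEXT
           )"""
    )

    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = []
        for record in csv.DictReader(f):
            # Transform: coerce amounts, normalise country codes,
            # and skip malformed rows.
            try:
                amount = float(record["amount"])
            except (KeyError, ValueError):
                continue
            rows.append((record["order_id"], amount, record.get("country", "").upper()))

    # Batched insert keeps the load step efficient.
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    run_etl()
```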
years of professional software development experience. Excellent problem-solving skills and a proactive approach to addressing challenges. Strong programming skills in languages such as Go, SQL, Python, Java or Scala. Experience in the effective application of AI tooling in your development process. A growth mindset and eagerness to work in a fast-paced, mission-driven environment. Excellent communication skills, with …