data services – Databricks, ADF, ADLS, Power BI. Proficiency in SQL and data profiling for test design and validation. Hands-on experience with test automation frameworks such as Python/PySpark, Great Expectations, Pytest, or dbt tests. Practical understanding of CI/CD integration (Azure DevOps, GitHub Actions, or similar). Strong problem-solving skills and the ability to work …
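By way of illustration, a minimal sketch of the kind of automated data-quality test this listing describes, pairing PySpark with Pytest; the `bronze.orders` table, its columns, and the row-count bounds are hypothetical:

```python
# Hedged sketch of PySpark data-quality checks driven by pytest.
# Table name, columns, and thresholds are made up for illustration.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    # Local session for test runs; a real suite would point at the platform.
    return SparkSession.builder.master("local[2]").appName("dq-tests").getOrCreate()

def test_no_null_order_ids(spark):
    df = spark.read.table("bronze.orders")  # hypothetical source table
    null_count = df.filter(df.order_id.isNull()).count()
    assert null_count == 0, f"{null_count} rows with NULL order_id"

def test_row_count_within_expected_range(spark):
    count = spark.read.table("bronze.orders").count()
    assert 1_000 <= count <= 10_000_000, f"unexpected row count: {count}"
```

A suite like this slots naturally into the Azure DevOps or GitHub Actions pipelines the listing mentions, failing the build when a data check regresses.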
Practical experience with Deep Learning frameworks (e.g., TensorFlow, PyTorch) for applications like NLP (Transformer models, BERT) or computer vision. Big Data Tools: Experience with big data platforms like Spark (PySpark) for handling large-scale datasets. MLOps: Familiarity with MLOps tools and concepts (e.g., Docker, Kubernetes, MLflow, Airflow) for model deployment and lifecycle management. Financial Domain Knowledge: Direct experience with …
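For context, the Transformer/BERT work referenced here can be as short as a few lines with the Hugging Face `transformers` library; the model choice and input text below are illustrative:

```python
# Minimal sketch of Transformer-based NLP inference.
from transformers import pipeline

# The default checkpoint for this task is a BERT-family (DistilBERT) model.
classifier = pipeline("sentiment-analysis")
print(classifier("The quarterly results beat expectations."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]
```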
City of London, London, United Kingdom Hybrid / WFH Options
ECS
Requirements: 10+ years in cloud architecture/engineering with a strong focus on building scalable data pipelines Expertise in Azure Databricks (7+ years) including building and managing ETL pipelines using PySpark or Scala (essential) Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts Hands-on experience with Azure Data Lake Storage Gen2, Azure Data Factory, and Azure …
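As a sketch of the ADLS Gen2 to Delta Lake pipeline work this role centres on; the storage account, container paths, and columns are hypothetical, and `spark` is the session Databricks provides in every notebook:

```python
# Hedged sketch: land raw CSV from ADLS Gen2, cleanse, write to Delta Lake.
from pyspark.sql import functions as F

raw = (spark.read
       .option("header", "true")
       .csv("abfss://landing@examplestore.dfs.core.windows.net/sales/"))  # hypothetical path

cleaned = (raw
           .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
           .filter(F.col("amount").isNotNull()))

(cleaned.write
 .format("delta")
 .mode("append")
 .save("abfss://curated@examplestore.dfs.core.windows.net/sales_delta/"))
```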
EC4N 6JD, Vintry, United Kingdom Hybrid / WFH Options
Syntax Consultancy Ltd
modelling techniques + data integration patterns. Experience of working with complex data pipelines, large data sets, data pipeline optimization + data architecture design. Implementing complex data transformations using Spark, PySpark or Scala + working with SQL/MySQL databases. Experience with data quality, data governance processes, Git version control + Agile development environments. Azure Data Engineer certification preferred - e.g. …
Studio and data transformation logic Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies ETL/ELT orchestration for structured and unstructured data Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation Performance monitoring and orchestration of Fabric solutions Power BI semantic models and Fabric data modelling DevOps deployment using ARM/Bicep templates …
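A small sketch of the Fabric notebook pattern implied here: shape data with PySpark, then publish a Lakehouse table for a Power BI semantic model to pick up. Table and column names are hypothetical, and `spark` is the session a Fabric notebook provides:

```python
# Hedged sketch of a Fabric notebook cell: PySpark transform -> Lakehouse table.
from pyspark.sql import functions as F

orders = spark.read.table("orders")  # table in the notebook's default lakehouse

daily_sales = (orders
               .groupBy(F.to_date("order_ts").alias("order_date"))
               .agg(F.sum("amount").alias("total_sales")))

# A managed table the Power BI semantic model can reference directly.
daily_sales.write.mode("overwrite").saveAsTable("daily_sales")
```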
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
CDC. Knowledge of public/enterprise cloud technologies (AWS EC2, S3 Bucket, GCP, Azure) is advantageous but not required. Some skills/experience with automated testing frameworks (Java, Python, PySpark, Bitbucket, Gitlab, Jenkins) is advantageous but not required. Strong Environment Management skills. Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to …
Azure DevOps. Required Skills & Experience Strong expertise in Power BI, DAX, SQL, and data architecture. Proven experience with Azure Data Factory, Fabric Lakehouse/Warehouse, and Synapse. Proficiency in PySpark, T-SQL, and data transformation using Notebooks. Familiarity with DevOps, ARM/Bicep templates, and semantic modelling. Experience delivering enterprise-scale BI/data warehouse programmes. Exposure to Microsoft …
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
Team Leading experience - REQUIRED/Demonstrable on CV (Full Support from Engineering Manager is also available) Hands-on development/engineering background Machine Learning or Data Background Technical Experience: PySpark, Python, SQL, Jupyter Cloud: AWS, Azure (Cloud Environment) - Moving towards Azure Nice to Have: Astro/Airflow, Notebook Reasonable Adjustments: Respect and equality are core values to us. We …
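For the Airflow "nice to have", a hedged sketch of a minimal DAG; the DAG id, schedule, and task body are illustrative, and the `schedule=` argument assumes Airflow 2.4 or later:

```python
# Hedged sketch of a daily Airflow DAG wrapping a feature-build step.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def build_features():
    print("building ML features...")  # placeholder for a PySpark job submission

with DAG(
    dag_id="ml_feature_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="build_features", python_callable=build_features)
```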
Contract duration: 6 months (extendable) Location: London Must have skills: Primary Skills - SAS Admin, Enterprise Guide, Basic SAS coding skills Secondary Skills - Unix Scripting, Gitlab, YAML, Autosys, PySpark, Snowflake, AWS, Agile practice, SQL The candidate should have strong experience in SAS administration and expert SAS coding skills, with more than 6 years' experience. Should have very good …
PySpark + Fabric Developer (Contract) | London | Office-Based Location: London (Office-based) Contract: 6 months (potential extension) Start: ASAP Rate: Market rate - Inside IR35 We’re looking for experienced PySpark + Fabric Developers to join a major transformation programme with a leading global financial data and infrastructure organisation. This is an exciting opportunity to work on cutting-edge … enhance throughput. Collaborate with analysts and stakeholders to translate business needs into technical solutions. Maintain clear documentation and contribute to internal best practices. Requirements Strong hands-on experience with PySpark (RDDs, DataFrames, Spark SQL). Proven ability to build and optimise ETL pipelines and dataflows. Familiar with Microsoft Fabric or similar lakehouse/data platform environments. Experience with Git …
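A sketch of the throughput work this role alludes to: repartitioning a large fact table and broadcasting a small dimension to avoid a shuffle. Table names and the partition count are hypothetical, and `spark` is assumed to be a notebook-provided session:

```python
# Hedged sketch of a common PySpark join optimisation.
from pyspark.sql.functions import broadcast

trades = spark.read.table("raw.trades").repartition(200, "trade_date")
instruments = spark.read.table("ref.instruments")  # small dimension table

# Broadcasting the small side avoids shuffling the large trades table.
enriched = trades.join(broadcast(instruments), "instrument_id", "left")
enriched.cache()  # reused by several downstream writes
enriched.write.format("delta").mode("overwrite").saveAsTable("curated.trades_enriched")
```

The broadcast trades a little executor memory for the removal of a shuffle, often the single biggest throughput win in joins of this shape.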
Job Title: Data Quality Engineer Work Location: Cardiff, UK (Twice a month) The Role: Data Quality Engineer Responsibilities: As part of a multi-discipline team challenged with building a cloud data platform, you will be responsible for ensuring the quality …
Job Description Role/Job Title: Developer (PySpark + Fabric) Work Location: London (Office Based) The Role The role will be integral to realizing the customer's vision and strategy in transforming some of their critical application and data engineering components. As a global financial markets infrastructure and data provider, the customer leverages cutting-edge technologies to support its …
Central London, London, United Kingdom Hybrid / WFH Options
iDPP
someone who enjoys building scalable data solutions while staying close to business impact. The Role As a Data Analytics Engineer, you'll design, build, and maintain reliable data pipelines, primarily using PySpark, SQL, and Python, to ensure business teams (analysts, product managers, finance, operations) have access to well-modeled, actionable data. You'll work closely with stakeholders to translate business needs into … spend more time coding, managing data infrastructure, and ensuring pipeline reliability. Who We're Looking For Data Analytics: Analysts who have strong experience building and maintaining data pipelines (particularly in PySpark/SQL) and want to work on production-grade infrastructure. Data Engineering: Engineers who want to work more closely with business stakeholders and enable analytics-ready data solutions. Analytics … Professionals who already operate in this hybrid space, with proven expertise across big data environments, data modeling, and business-facing delivery. Key Skills & Experience Strong hands-on experience with PySpark, SQL, and Python Proven track record of building and maintaining data pipelines Ability to translate business requirements into robust data models and solutions Experience with data validation, quality checks …
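As an illustration of the analytics-engineering transforms described here: de-duplicating raw events with a window function and publishing a business-facing table. The source table, keys, and output name are hypothetical, with `spark` assumed as a notebook session:

```python
# Hedged sketch: keep the latest event per user/session, publish an analytics table.
from pyspark.sql import functions as F
from pyspark.sql.window import Window

events = spark.read.table("raw.web_events")

latest_first = Window.partitionBy("user_id", "session_id").orderBy(F.col("event_ts").desc())

sessions = (events
            .withColumn("rn", F.row_number().over(latest_first))
            .filter(F.col("rn") == 1)  # one row per user/session
            .drop("rn"))

sessions.write.mode("overwrite").saveAsTable("analytics.user_sessions")
```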
contract assignment. In order to be successful, you will have the following experience: Extensive AI & Data Development background Experience with Python (including data libraries such as Pandas, NumPy, and PySpark) and Apache Spark (PySpark preferred) Strong experience with data management and processing pipelines Algorithm development and knowledge of graphs will be beneficial SC Clearance is essential Within this …
Data Developer for an urgent contract assignment. Key Requirements: Proven background in AI and data development Strong proficiency in Python, including data-focused libraries such as Pandas, NumPy, and PySpark Hands-on experience with Apache Spark (PySpark preferred) Solid understanding of data management and processing pipelines Experience in algorithm development and graph data structures is advantageous Active SC clearance …
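A small sketch combining the Python data stack with the graph work both of these listings mention; the edge list is made up for illustration:

```python
# Hedged sketch: build a graph from a Pandas edge list and query it.
import pandas as pd
import networkx as nx

edges = pd.DataFrame({"src": ["a", "a", "b", "c"],
                      "dst": ["b", "c", "d", "d"]})  # illustrative data

g = nx.from_pandas_edgelist(edges, source="src", target="dst")
print(nx.shortest_path(g, "a", "d"))  # e.g. ['a', 'b', 'd']
```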
Role: Developer (PySpark + Fabric) Location: London Contract (6 months+) Hybrid (Inside IR35) The Role The role will be integral to realizing the customer's vision and strategy in transforming some of their critical application and data engineering components. As a …
London, South East, England, United Kingdom Hybrid / WFH Options
Oliver James
I'm currently working with a leading insurance broker who is looking to hire a Lead Azure Data Engineer on an initial 12-month fixed-term … an Azure-based data lakehouse. Key requirements: * Proven experience working as a principal or lead data engineer * Strong background working with large datasets, with proficiency in SQL, Python, and PySpark * Experience managing and mentoring engineers with varying levels of experience * Hands-on experience deploying pipelines within Azure Databricks, ideally following the Medallion Architecture framework Hybrid working: Minimum two days …
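For the Medallion Architecture deployment named above, a hedged sketch of one hop, upserting de-duplicated bronze records into a silver Delta table. Table names and the merge key are hypothetical; `spark` is a Databricks notebook session and the Delta runtime is assumed:

```python
# Hedged sketch of a bronze -> silver Delta Lake upsert (Medallion style).
from delta.tables import DeltaTable

updates = spark.read.table("bronze.policies").dropDuplicates(["policy_id"])

silver = DeltaTable.forName(spark, "silver.policies")
(silver.alias("t")
 .merge(updates.alias("s"), "t.policy_id = s.policy_id")
 .whenMatchedUpdateAll()    # refresh existing policies
 .whenNotMatchedInsertAll() # add new ones
 .execute())
```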