data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
Data architecture and solution design experience Hands-on experience with modern data tools such as dbt, Fivetran, Matillion, or similar data integration platforms Programming skills in Python, Java, or Scala Relevant cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP Data Engineering certifications) Experience with DataOps, CI/CD practices, and infrastructure-as-code Knowledge of data governance, data
City of London, London, United Kingdom Hybrid/Remote Options
Az-Tec Talent
and collaboratively within client teams. Desirable: Consulting experience or client-facing delivery background. Familiarity with tools such as dbt, Fivetran, Matillion, or similar. Programming skills in Python, Java, or Scala. Cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP). Knowledge of DataOps, CI/CD, and infrastructure-as-code concepts. What’s on Offer: Hybrid working model
on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience
Machine Learning fundamentals and strong knowledge in a specific domain (e.g., Computer Vision, Deep Learning, NLP). Expert coding skills in Python and at least one of these languages: Scala, C++, Java. Hands-on experience with database management languages (e.g., SQL, PostgreSQL). Hands-on experience in cloud-based infrastructures (AWS/GCP/Azure). Hands-on experience with
models. Good understanding of ML fundamentals and strong knowledge in a domain (e.g., Computer Vision, Deep Learning, NLP). Expert coding skills in Python and at least one of Scala, C++, Java. Hands-on experience with SQL, PostgreSQL, and other database languages. Experience with cloud-based infrastructures (AWS/GCP/Azure). Knowledge of Unix command line and DevOps
Leeds, England, United Kingdom Hybrid/Remote Options
KPMG UK
to having resided in the UK for at least the past 5 years and being a UK national or dual UK national. Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB etc. Experience with the design, build and maintenance of data pipelines and infrastructure
background in data modeling, ETL/ELT processes, data integration, and real-time data pipelines. Experience with ingestion frameworks, time-series databases, and streaming analytics. Programming proficiency in Python, Scala, or Java for data engineering and automation. Strong SQL skills and familiarity with NoSQL and graph database technologies. Excellent communication skills, with the ability to engage both technical and non-technical
with Azure Data Factory. Knowledge of Banking/finance is an advantage. Expert knowledge of relational databases like SQL DB and Oracle. Knowledge and experience with SQL, Python, or Scala is a must. Familiarity with data formats such as JSON, Parquet, and XML, as well as REST APIs. Experience with CI/CD tools (GitLab/GitHub, Jenkins, Ansible, Nexus) for automated
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity
Nottingham, England, United Kingdom Hybrid/Remote Options
Nottingham Building Society
long-term data strategy. About you - Extensive Technical Expertise: Strong knowledge of Microsoft Fabric components including OneLake, Lakehouse/Warehouse, Delta Lake, Direct Lake, Data Factory, Spark (PySpark/Scala) and Power BI (DAX and semantic modelling). Advanced Programming and Data Engineering Skills: Proficient in Python, SQL and T-SQL with experience in PySpark; familiarity with KQL for real-time
databases, and financial data sources into Azure Databricks. Optimize pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex transformations and processing logic using Spark (PySpark/Scala) for cleaning, enrichment, and aggregation, ensuring accuracy and consistency across the data lifecycle. Work extensively with Unity Catalog, Delta Lake, Spark SQL, and related services. Apply best practices for development
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
grow our collective data engineering capability. What we’re looking for Solid experience as a Senior/Lead Data Engineer in complex enterprise environments. Strong coding skills in Python (Scala or functional languages a plus). Expertise with Databricks, Apache Spark, and Snowflake (HDFS/HBase also useful). Experience integrating large, messy datasets into reliable, scalable data products. Strong
focus on cloud-based data pipelines and architectures Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management Proficiency in Python, SQL, Scala, or Java Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory Strong understanding of data architecture principles, data modelling, and data governance Experience with
Experience Expert in Azure Databricks (Unity Catalog, DLT, cluster management). Strong experience with Azure Data Factory, Synapse Analytics, Data Lake Storage, Stream Analytics, Event Hubs. Proficient in Python, Scala, C#, .NET, and SQL (T-SQL). Skilled in data modelling, quality, and metadata management. Experience with CI/CD and Infrastructure as Code using Azure DevOps and Terraform. Strong
Burton-on-Trent, Staffordshire, England, United Kingdom
Crimson
these requirements. In order to secure one of these Senior Data Engineer roles you must be able to demonstrate the following experience: Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB etc. Experience with the design, build and maintenance of data pipelines and infrastructure
tools. Monitor and optimize system performance, reliability, and scalability. Participate in technical roadmap discussions and provide input on architectural decisions. Required Skills & Experience Strong backend development experience in Java (Scala is a strong plus). Hands-on experience with AWS services (e.g., EC2, S3, Lambda, IAM, etc.). Proficiency with Kafka for event streaming and messaging. Experience with Kubernetes, preferably