data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively …
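The "data quality checks and validation rules within data pipelines" responsibility above can be sketched in miniature with plain Python. The rule names and record fields below are invented for illustration; in a real Azure Databricks pipeline the same predicates would be applied over Spark DataFrames rather than Python dicts.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical validation-rule pattern: each rule is a named predicate
# applied to every record flowing through the pipeline.
@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]

RULES = [
    Rule("amount_non_negative", lambda r: r.get("amount", 0) >= 0),
    Rule("customer_id_present", lambda r: bool(r.get("customer_id"))),
]

def validate(records):
    """Split records into (valid, rejected-with-failure-reasons)."""
    valid, rejected = [], []
    for rec in records:
        failures = [rule.name for rule in RULES if not rule.check(rec)]
        (rejected if failures else valid).append((rec, failures))
    return [r for r, _ in valid], rejected

good, bad = validate([
    {"customer_id": "c1", "amount": 10.0},
    {"customer_id": "", "amount": -5.0},
])
```

Keeping rejected records alongside the names of the rules they failed makes it straightforward to route them to a quarantine table for review rather than silently dropping them.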
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
Data architecture and solution design experience Hands-on experience with modern data tools such as dbt, Fivetran, Matillion, or similar data integration platforms Programming skills in Python, Java, or Scala Relevant cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP Data Engineering certifications) Experience with DataOps, CI/CD practices, and infrastructure-as-code Knowledge of data governance, data …
City of London, London, United Kingdom Hybrid/Remote Options
Az-Tec Talent
and collaboratively within client teams. Desirable: Consulting experience or client-facing delivery background. Familiarity with tools such as dbt, Fivetran, Matillion, or similar. Programming skills in Python, Java, or Scala. Cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP). Knowledge of DataOps, CI/CD, and infrastructure-as-code concepts. What’s on Offer Hybrid working model …
in big data technologies At least 1 year of experience with cloud computing (AWS, Microsoft Azure, Google Cloud) Preferred Qualifications: 7+ years of experience in application development including Python, SQL, Scala, or Java 4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud) 4+ years of experience with distributed data/computing tools (Spark, MapReduce, Hadoop, Hive, EMR, Kafka …
on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience …
London, South East, England, United Kingdom Hybrid/Remote Options
CV TECHNICAL LTD
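The orchestration side of the stack above (Airflow-style dependency ordering of pipeline tasks) can be illustrated with the standard library alone. Task names here are hypothetical; a real Airflow DAG would wrap each step in an operator and let the scheduler handle ordering and retries.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: task names and dependencies are illustrative.
# Airflow expresses the same idea with operators and `>>`; here the
# stdlib TopologicalSorter resolves the same dependency graph.
TASKS = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load": {"aggregate"},
}

def run_pipeline(tasks):
    order = list(TopologicalSorter(tasks).static_order())
    ran = []
    for task in order:
        ran.append(task)  # in Airflow this would execute an operator
    return ran

print(run_pipeline(TASKS))
```

Because each task declares only its direct upstream dependencies, independent branches can later be parallelised without changing the declaration style.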
Coventry, West Midlands, United Kingdom Hybrid/Remote Options
Coventry Building Society
tools like AWS (S3, Glue, Redshift, SageMaker) or other cloud platforms. Familiarity with Docker, Terraform, GitHub Actions, and Vault for managing secrets. Experience in coding SQL, Python, Spark, or Scala to work with data. Experience with databases used in Data Warehousing, Data Lakes, and Lakehouse setups. You know how to work with both structured and unstructured data. Experience in testing …
Coventry, Warwickshire, United Kingdom Hybrid/Remote Options
Coventry Building Society
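Working with "both structured and unstructured data", as the listing above asks, often means landing semi-structured payloads into a queryable table. A small stdlib-only sketch (table, column, and field names are invented for illustration; a warehouse like Redshift would replace SQLite):

```python
import json
import sqlite3

# Hypothetical semi-structured event payloads, one JSON document per line.
events = [
    '{"user": "a", "props": {"plan": "pro"}}',
    '{"user": "b", "props": {"plan": "free"}}',
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, plan TEXT, raw TEXT)")
for line in events:
    doc = json.loads(line)
    conn.execute(
        "INSERT INTO events VALUES (?, ?, ?)",
        (doc["user"], doc["props"]["plan"], line),  # keep the raw payload too
    )

# Once structured, the data is open to ordinary SQL aggregation.
rows = conn.execute(
    "SELECT plan, COUNT(*) FROM events GROUP BY plan ORDER BY plan"
).fetchall()
print(rows)
```

Storing the raw document alongside the extracted columns is a common lakehouse habit: it lets you re-derive columns later if the schema evolves.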
Leeds, England, United Kingdom Hybrid/Remote Options
KPMG UK
to having resided in the UK for at least the past 5 years and being a UK national or dual UK national. Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective (Oracle, MySQL, MongoDB, etc.). Experience with the design, build, and maintenance of data pipelines and infrastructure …
flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity …
and enhance data performance and infrastructure. Key Skills & Experience: Strong experience with SQL/NoSQL databases, data warehousing, and big data (Hadoop, Spark). Proficient in Python, Java, or Scala with a solid understanding of OOP and design patterns. Expertise in ETL tools, DevOps, and orchestration frameworks (Airflow, Apache NiFi). Hands-on experience with cloud platforms (AWS, Azure, or GCP) and …
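The "OOP and design pattern understanding" asked for above is the kind of thing a template-method ETL base class demonstrates: the base class fixes the extract → transform → load sequence, and subclasses fill in the details. Class and method names here are illustrative, not from any specific framework.

```python
from abc import ABC, abstractmethod

# Template-method pattern for an ETL step (hypothetical names).
class EtlJob(ABC):
    def run(self):
        # The sequence is fixed here; subclasses only supply the steps.
        return self.load(self.transform(self.extract()))

    @abstractmethod
    def extract(self): ...

    @abstractmethod
    def transform(self, data): ...

    def load(self, data):
        return data  # default: hand results back to the caller

class UppercaseNames(EtlJob):
    def extract(self):
        return ["alice", "bob"]

    def transform(self, data):
        return [name.upper() for name in data]

print(UppercaseNames().run())
```

The pattern keeps orchestration concerns (ordering, logging, retries) in one place while each concrete job stays a few lines of business logic.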
grow our collective data engineering capability. What we’re looking for Solid experience as a Senior/Lead Data Engineer in complex enterprise environments. Strong coding skills in Python (Scala or functional languages a plus). Expertise with Databricks, Apache Spark, and Snowflake (HDFS/HBase also useful). Experience integrating large, messy datasets into reliable, scalable data products. Strong …
DevOps teams to deliver robust streaming solutions. Required: • Hands-on experience with Apache Kafka (any distribution: open-source, Confluent, Cloudera, AWS MSK, etc.) • Strong proficiency in Java, Python, or Scala • Solid understanding of event-driven architecture and data streaming patterns • Experience deploying Kafka on cloud platforms such as AWS, GCP, or Azure • Familiarity with Docker, Kubernetes, and CI/CD …
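The event-driven architecture this role centres on reduces to publish/subscribe: producers emit events to a topic and any number of consumers react independently. A broker-free, in-memory sketch follows (topic and handler names are invented; a production system would use a real Kafka client and broker):

```python
from collections import defaultdict

# Minimal in-memory sketch of the publish/subscribe pattern behind
# Kafka-style event-driven systems. Not a Kafka client.
class Bus:
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        # Every subscriber to the topic sees every event, independently.
        for handler in self.handlers[topic]:
            handler(event)

bus = Bus()
seen = []
bus.subscribe("orders", seen.append)      # consumer 1: records events
bus.subscribe("orders", lambda e: None)   # consumer 2: independent no-op
bus.publish("orders", {"id": 1, "total": 9.99})
print(seen)
```

The decoupling shown here is the point: the producer never knows who consumes, which is what lets streaming systems add and remove consumers without touching upstream code.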
focus on cloud-based data pipelines and architectures Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management Proficiency in Python, SQL, Scala, or Java Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory Strong understanding of data architecture principles, data modelling, and data governance Experience with …
Experience Expert in Azure Databricks (Unity Catalog, DLT, cluster management). Strong experience with Azure Data Factory, Synapse Analytics, Data Lake Storage, Stream Analytics, Event Hubs. Proficient in Python, Scala, C#, .NET, and SQL (T-SQL). Skilled in data modelling, quality, and metadata management. Experience with CI/CD and Infrastructure as Code using Azure DevOps and Terraform. Strong …
Burton-on-Trent, Staffordshire, England, United Kingdom
Crimson
the team and role for you! What You Need To Succeed 6+ years of experience developing and delivering production-grade software and data systems. Proficiency in Python, Java, or Scala - comfortable writing robust, testable, and scalable code. Deep experience with AWS (Lambda, ECS/EKS, EMR, Step Functions, S3, IAM, etc.). Strong knowledge of distributed systems and streaming/ …
full-stack development tools and technologies Work with a team of developers with deep experience in machine learning, distributed microservices, and full-stack systems Utilize programming languages like Java, Scala, and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Redshift and Snowflake Share your passion for staying on top of tech trends, experimenting …