… data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively …
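The Spark and data-quality responsibilities in this listing are the kind of work a short PySpark sketch can illustrate. The example below is purely hypothetical: the source path, column names, and quality rule are assumptions, not details from the advert.

```python
# Minimal PySpark sketch: clean, enrich, and aggregate a dataset with a simple
# in-pipeline data-quality check. All paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleaning").getOrCreate()

raw = spark.read.parquet("/mnt/datalake/raw/orders")  # hypothetical input path

# Cleaning and enrichment: drop malformed rows, standardise types, derive a column.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Simple data-quality rule: fail the pipeline if any negative amounts remain.
bad_rows = clean.filter(F.col("amount") < 0).count()
if bad_rows > 0:
    raise ValueError(f"Data quality check failed: {bad_rows} rows with negative amount")

# Aggregate for downstream consumers and persist the curated output.
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))
daily.write.mode("overwrite").parquet("/mnt/datalake/curated/daily_orders")
```

In practice, checks like this are often expressed through a dedicated framework (for example Great Expectations or Delta Live Tables expectations) so that rules are declarative and reported centrally rather than hand-rolled.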
City of London, London, United Kingdom (Hybrid / WFH Options)
Billigence
… Data architecture and solution design experience. Hands-on experience with modern data tools such as dbt, Fivetran, Matillion, or similar data integration platforms. Programming skills in Python, Java, or Scala. Relevant cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP Data Engineering certifications). Experience with DataOps, CI/CD practices, and infrastructure-as-code. Knowledge of data governance, data …
City of London, London, United Kingdom (Hybrid / WFH Options)
Az-Tec Talent
… and collaboratively within client teams. Desirable: Consulting experience or client-facing delivery background. Familiarity with tools such as dbt, Fivetran, Matillion, or similar. Programming skills in Python, Java, or Scala. Cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP). Knowledge of DataOps, CI/CD, and infrastructure-as-code concepts. What’s on Offer: Hybrid working model …
London, South East, England, United Kingdom (Hybrid / WFH Options)
CV TECHNICAL LTD
… on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience …
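Several of these listings pair strong SQL with a cloud data warehouse such as Snowflake. As a rough illustration only, querying Snowflake from Python with the snowflake-connector-python package looks broadly like the sketch below; the credentials, warehouse, database, and table names are placeholder assumptions.

```python
# Minimal sketch of querying a cloud data warehouse (Snowflake) from Python.
# All connection details and object names are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT order_date, SUM(total_amount) FROM daily_orders GROUP BY order_date"
    )
    for order_date, total in cur.fetchall():
        print(order_date, total)
finally:
    conn.close()
```

The same pattern applies to BigQuery or Redshift, swapping in their respective client libraries.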
… AI Services, Azure OpenAI, AI Search, Data Lake Storage, Data Factory, Databricks, HDInsight, Azure Synapse Analytics, Azure SQL Database, Functions, and Azure DevOps. Strong programming skills in Python, Java, Scala, R, or .NET/C#. Proficiency in SQL and database design. Solid understanding of data models, data mining, analytics, and segmentation techniques. Experience with ETL pipelines and …
… grow our collective data engineering capability. What we’re looking for: Solid experience as a Senior/Lead Data Engineer in complex enterprise environments. Strong coding skills in Python (Scala or functional languages a plus). Expertise with Databricks, Apache Spark, and Snowflake (HDFS/HBase also useful). Experience integrating large, messy datasets into reliable, scalable data products. Strong …
City of London, London, United Kingdom (Hybrid / WFH Options)
Tenth Revolution Group
… in consultancy or client-facing roles is highly desirable. Familiarity with CI/CD pipelines and version control tools (e.g., Git, Azure DevOps). Desirable: Exposure to Python or Scala for data engineering tasks. Knowledge of Power BI or other visualisation tools. Experience with data governance frameworks and metadata management. If you’re interested, get in touch ASAP with a …
… DevOps teams to deliver robust streaming solutions. Required: • Hands-on experience with Apache Kafka (any distribution: open-source, Confluent, Cloudera, AWS MSK, etc.) • Strong proficiency in Java, Python, or Scala • Solid understanding of event-driven architecture and data streaming patterns • Experience deploying Kafka on cloud platforms such as AWS, GCP, or Azure • Familiarity with Docker, Kubernetes, and CI/CD …
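For the Kafka and event-streaming requirements above, a minimal producer sketch using the confluent-kafka Python client is shown below. The broker address, topic name, and event shape are hypothetical assumptions rather than anything specified in the listing.

```python
# Minimal Kafka producer sketch for an event-driven pattern, using the
# confluent-kafka client. Broker, topic, and payload are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Invoked once per message when the broker acknowledges (or rejects) it.
    if err is not None:
        print(f"Delivery failed: {err}")

event = {"order_id": "12345", "status": "CREATED"}
producer.produce(
    "orders.events",
    key=event["order_id"],
    value=json.dumps(event).encode("utf-8"),
    on_delivery=on_delivery,
)
producer.flush()  # block until outstanding messages are delivered
```

A matching consumer would subscribe to the same topic and poll for messages; in production, schema management (for example via a schema registry) and delivery-failure handling usually take more design effort than the happy path.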
… for all major data initiatives. DATA MANAGER – ESSENTIAL SKILLS: Proven experience as a Senior or Lead Data Engineer, Data Manager, or similar leadership role. Strong proficiency in Python (or Scala/Java) and SQL. Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data …
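The orchestration tools named above (Airflow, dbt, Dagster, Prefect) all express pipelines as dependency graphs of tasks. As a rough illustration only, a minimal Airflow DAG with two dependent tasks might look like this; the DAG id, schedule, and task logic are assumptions.

```python
# Minimal Airflow DAG sketch: two placeholder tasks with an explicit dependency.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")  # placeholder task logic

def transform():
    print("clean and aggregate")  # placeholder task logic

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```

dbt, Dagster, and Prefect express the same idea with different abstractions (models, assets, and flows respectively).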
… tools (e.g. Kafka, Flink, dbt, etc.), data storage (e.g. Snowflake, Redshift, etc.) and also IaC (e.g. Terraform, CloudFormation). Software development experience with one or more languages (e.g. Python, Java, Scala, Go). Pragmatic approach to solving problems. Nice to have: keen interest in modern AI/ML techniques. Why Mesh-AI: Fast-growing start-up organisation with huge opportunity for career …
… LEAD DATA ENGINEER – ESSENTIAL SKILLS: Proven experience as a Senior or Lead Data Engineer in a fast-scaling tech or data-driven environment. Strong proficiency in Python (or Scala/Java) and SQL. Deep experience with data pipeline orchestration tools (Airflow, dbt, Dagster, Prefect). Strong knowledge of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery …
… in technology consulting, enterprise and solutions architecture, architectural frameworks, and data modelling, with experience in ERWIN modelling. • Hands-on experience in ADF, Azure Databricks, Azure Synapse, Spark, PySpark, Python/Scala, SQL. • Hands-on experience designing and building data lakes from multiple source systems/data providers. • Experience in data modelling, architecture, implementation, and testing. • Experienced in designing and implementing End …
City of London, London, United Kingdom (Hybrid / WFH Options)
KPMG UK
… is required. Demonstrable experience in leading client data engineering and integration projects for major clients. Hands-on experience of designing and implementing Quantexa solutions for clients. Technical excellence in Scala, Python, and Databricks. Skills we’d love to see/Amazing Extras: Experience delivering Quantexa in Financial Services, Fraud Detection, AML, or KYC domains. Exposure to DevOps and CI/…