system infrastructures, favouring infrastructure-as-code practices using tools such as Terraform and Pulumi. Programming Languages: Proficient in Python and SQL, with additional experience in programming languages like Java, Scala, GoLang, and Rust considered advantageous. CI/CD Implementation: Knowledgeable about continuous integration and continuous deployment practices using tools like GitHub Actions and ArgoCD, enhancing software development and quality assurance. More ❯
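For illustration only, a minimal Pulumi program in Python showing the infrastructure-as-code style these roles describe; the cloud provider, resource name, and tags are assumptions invented for the example, not anything specific to the listing.

```python
import pulumi
import pulumi_aws as aws

# Hypothetical example: a single S3 bucket for raw data landing,
# declared as code so it can be reviewed, versioned, and reproduced.
raw_bucket = aws.s3.Bucket(
    "raw-data-landing",
    tags={"team": "data-engineering", "managed_by": "pulumi"},
)

# Expose the bucket name so downstream stacks or pipelines can reference it.
pulumi.export("raw_bucket_name", raw_bucket.id)
```

In a CI/CD setup of the kind mentioned above (GitHub Actions, ArgoCD), such a program would typically have `pulumi preview` run on pull requests and `pulumi up` run on merge, keeping infrastructure changes reviewable.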
on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience More ❯
Burton-on-Trent, Staffordshire, England, United Kingdom
Crimson
Experience Expert in Azure Databricks (Unity Catalog, DLT, cluster management). Strong experience with Azure Data Factory, Synapse Analytics, Data Lake Storage, Stream Analytics, Event Hubs. Proficient in Python, Scala, C#, .NET, and SQL (T-SQL). Skilled in data modelling, quality, and metadata management. Experience with CI/CD and Infrastructure as Code using Azure DevOps and Terraform. Strong More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
databases, and financial data sources into Azure Databricks. Optimize pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex transformations and processing logic using Spark (PySpark/Scala) for cleaning, enrichment, and aggregation, ensuring accuracy and consistency across the data lifecycle. Work extensively with Unity Catalog, Delta Lake, Spark SQL, and related services. Apply best practices for development More ❯
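As a hedged sketch of the cleaning, enrichment, and aggregation work this role describes, a small PySpark job that promotes data from a bronze to a silver Delta table; the table names, columns, and business logic are invented for the example and assume a Databricks/Delta Lake environment.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw trade events from a (hypothetical) bronze Delta table.
raw = spark.read.format("delta").table("bronze.trades")

# Cleaning and enrichment: drop malformed rows, normalise currency codes, derive a date.
clean = (
    raw.dropna(subset=["trade_id", "amount"])
       .withColumn("currency", F.upper(F.col("currency")))
       .withColumn("trade_date", F.to_date("trade_ts"))
)

# Aggregation: daily totals per currency for downstream consumers.
daily = clean.groupBy("trade_date", "currency").agg(
    F.sum("amount").alias("total_amount"),
    F.count("trade_id").alias("trade_count"),
)

# Persist to a silver Delta table.
daily.write.format("delta").mode("overwrite").saveAsTable("silver.daily_trade_totals")
```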
grow our collective data engineering capability. What we're looking for Solid experience as a Senior/Lead Data Engineer in complex enterprise environments. Strong coding skills in Python (Scala or functional languages a plus). Expertise with Databricks, Apache Spark, and Snowflake (HDFS/HBase also useful). Experience integrating large, messy datasets into reliable, scalable data products. Strong More ❯
these requirements. In order to secure one of these Senior Data Engineer roles you must be able to demonstrate the following experience: Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance of data pipelines and infrastructure More ❯
for all major data initiatives. DATA MANAGER - ESSENTIAL SKILLS: Proven experience as a Senior or Lead Data Engineer, Data Manager, or similar leadership role. Strong proficiency in Python (or Scala/Java) and SQL. Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data More ❯
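As a rough illustration of hands-on experience with the orchestration tools named above, a minimal Airflow DAG (Airflow 2.x assumed); the DAG id and task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull yesterday's records from a source system.
    print("extracting...")


def transform():
    # Placeholder: clean and load the extracted records.
    print("transforming...")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```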
support modelling and data analytics · As an AWS Solutions Architect or Lead, or similar · Strong Python experience, or another object-oriented or functional programming language such as Java, Scala, C#, or R · Data warehousing – building operational ETL data pipelines across several sources, and constructing relational and dimensional data models · Built and maintained scalable infrastructure through IaC frameworks (e.g., Terraform More ❯
Greater Manchester, North West, United Kingdom Hybrid/Remote Options
Searchability (UK) Ltd
Paternity Charity Volunteer Days Cycle to work scheme And More.. DATA ENGINEER - ESSENTIAL SKILLS Proven experience building data pipelines using Databricks. Strong understanding of Apache Spark (PySpark or Scala) and Structured Streaming. Experience working with Kafka (MSK) and handling real-time data. Good knowledge of Delta Lake/Delta Live Tables and the Medallion architecture. Hands More ❯
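To illustrate the Kafka plus Structured Streaming combination mentioned in this listing, a minimal PySpark sketch that reads a hypothetical topic and lands it in a bronze Delta table; the broker address, topic, schema, and checkpoint path are all assumptions, and the Spark Kafka connector is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()

# Assumed event schema, purely for the example.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Read the raw byte stream from Kafka (MSK endpoint and topic are placeholders).
raw = (
    spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "msk-broker:9092")
        .option("subscribe", "game-events")
        .load()
)

# Parse the JSON payload into typed columns.
events = raw.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

# Append into the bronze layer of a medallion-style Delta lakehouse.
query = (
    events.writeStream.format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/game_events")  # placeholder path
        .outputMode("append")
        .toTable("bronze.game_events")
)
```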
Central London, London, United Kingdom Hybrid/Remote Options
McCabe & Barton
Implement governance and security measures across the platform. Leverage Terraform or similar IaC tools for controlled and reproducible deployments. Databricks Development Develop and optimise data jobs using PySpark or Scala within Databricks. Implement the medallion architecture (bronze, silver, gold layers) and use Delta Lake for reliable data transactions. Manage cluster configurations and CI/CD pipelines for Databricks deployments. Monitoring More ❯
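A small, hedged sketch of what reliable Delta Lake transactions can look like when promoting data between medallion layers, as this listing describes; the table names and join key are illustrative, and delta-spark is assumed to be available on the cluster.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Recompute a daily aggregate from the silver layer (table names are placeholders).
updates = (
    spark.read.table("silver.orders")
         .groupBy("order_date")
         .agg(F.sum("amount").alias("daily_revenue"))
)

# MERGE into the gold table so the promotion is a single atomic Delta transaction.
gold = DeltaTable.forName(spark, "gold.daily_revenue")

(
    gold.alias("t")
        .merge(updates.alias("s"), "t.order_date = s.order_date")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
)
```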
tools (e.g. Kafka, Flink, DBT etc.), data storage (e.g. Snowflake, Redshift, etc.) and also IaC (e.g. Terraform, CloudFormation) Software development experience with one or more languages (e.g. Python, Java, Scala, Go) Pragmatic approach to solving problems Nice to have: keen interest in modern AI/ML techniques Why Mesh-AI Fast-growing start-up organisation with huge opportunity for career More ❯
and database design (normalisation, indexing, query optimisation). Strong experience with ETL/ELT tools, e.g. Azure Data Factory, Databricks, Synapse Pipelines, SSIS, etc. Experience with Python, PySpark, or Scala for data processing. Familiarity with CI/CD practices. Experience with Data Lake, Data Warehouse and Medallion architectures. Understanding of API integrations and streaming technologies (Event Hubs). Version control More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Syntax Consultancy Limited
+ data integration patterns. Experience of working with complex data pipelines, large data sets, data pipeline optimization + data architecture design. Implementing complex data transformations using Spark, PySpark or Scala + working with SQL/MySQL databases. Experience with data quality, data governance processes, Git version control + Agile development environments. Azure Data Engineer certification preferred, e.g. Azure Data Engineer More ❯
EC4N 6JD, Vintry, United Kingdom Hybrid/Remote Options
Syntax Consultancy Ltd
+ data integration patterns. Experience of working with complex data pipelines, large data sets, data pipeline optimization + data architecture design. Implementing complex data transformations using Spark, PySpark or Scala + working with SQL/MySQL databases. Experience with data quality, data governance processes, Git version control + Agile development environments. Azure Data Engineer certification preferred, e.g. Azure Data Engineer More ❯
to demonstrate the following experience: Commercial experience gained in a Data Engineering role on any major cloud platform (Azure, AWS or GCP) Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Some experience with the design, build and maintenance of data pipelines and More ❯
and monitoring in Databricks. CI/CD: Knowledge of DevOps practices for data pipelines. Certifications: Azure Data Engineer or Azure Solutions Architect certifications. Mandatory Skills: Python for Data, Java, Python, Scala, Snowflake, Azure Blob, Azure Data Factory, Azure Functions, Azure SQL, Azure Synapse Analytics, Azure Data Lake, ANSI-SQL, Databricks, HDInsight. If you're excited about this role then we would More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Randstad Technologies
and monitoring in Databricks. CI/CD: Knowledge of DevOps practices for data pipelines. Certifications: Azure Data Engineer or Azure Solutions Architect certifications. Mandatory Skills: Python for Data, Java, Python, Scala, Snowflake, Azure Blob, Azure Data Factory, Azure Functions, Azure SQL, Azure Synapse Analytics, Azure Data Lake, ANSI-SQL, Databricks, HDInsight. If you're excited about this role then we would More ❯
Edinburgh, City of Edinburgh, United Kingdom Hybrid/Remote Options
Cathcart Technology
technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand what it takes to design data systems that are scalable, reliable and built for the long haul. In return, they are offering a competitive More ❯
Employment Type: Permanent
Salary: £80000 - £100000/annum Bonus, Pension and Shares
Preston, Lancashire, North West, United Kingdom Hybrid/Remote Options
Circle Group
processing frameworks and technologies. AWS or Azure cloud experience. Experience with data modelling, data integration, ETL processes and designing efficient data structures. Strong programming skills in Python, Java, or Scala. Data warehousing concepts and dimensional modelling experience. Any data engineering skills in Azure Databricks and Microsoft Fabric would be a bonus. This new role involves leading a data team, fostering More ❯
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid/Remote Options
Cathcart Technology
technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand what it takes to design data systems that are scalable, reliable and built for the long haul. In return, they are offering a competitive More ❯
Nottingham, Nottinghamshire, England, United Kingdom Hybrid/Remote Options
BUZZ Bingo
re Looking For Essential Skills & Experience: Proven experience as a Data Engineer or similar role, with strong knowledge of data warehousing and modelling. Proficiency in C#, Python, Java, or Scala. Hands-on experience with ETL tools (e.g., SSIS) and orchestration tools (e.g., Azure Data Factory). Strong SQL skills and experience with relational databases (MSSQL, PostgreSQL, MySQL). Familiarity More ❯
Leeds, West Yorkshire, Yorkshire and the Humber, United Kingdom
AND Digital
data projects across the full SDLC from design to migration, integration and live service in a multi-vendor environment Strong skills in languages such as Python, R, SQL or Scala, alongside experience of using modern and traditional data technologies including: ElasticSearch, MongoDB, PostgreSQL, MySQL/MariaDB, Oracle, SQL Server, Hadoop, Kafka, Splunk/ELK or other logging and monitoring tools More ❯
Tadworth, Surrey, South East, United Kingdom Hybrid/Remote Options
Pfizer
data science models to solve problems in a business environment setting. Relevant Experience Experience with machine learning technology, such as: big data, Java, Python, R, AWS, LLM models, Scala, and visualization techniques, including Dash, Tableau and Angular. Experience in understanding brand content, strategy, and tactics. Ability to effectively utilize dashboards and data products to derive insights. Experience with supporting More ❯
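As a light illustration of the Dash visualization experience referenced in this listing, a minimal app that plots a small invented dataset; the data, figure, and layout are placeholders with no connection to the role itself.

```python
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Invented sample data purely for the example.
df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "engagement": [120, 135, 150, 170],
})

fig = px.line(df, x="month", y="engagement", title="Monthly engagement (sample data)")

app = Dash(__name__)
app.layout = html.Div([
    html.H2("Brand engagement dashboard (illustrative)"),
    dcc.Graph(figure=fig),
])

if __name__ == "__main__":
    app.run(debug=True)  # Dash 2.x; older versions use app.run_server
```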