London (City of London), South East England, United Kingdom
HCLTech
data retention and archival strategies in cloud environments. Strong understanding and practical implementation of Medallion Architecture (Bronze, Silver, Gold layers) for structured data processing. Advanced programming skills in Python, PySpark, and SQL, with the ability to build modular, efficient, and scalable data pipelines. Deep expertise in data modeling for both relational databases and data warehouses, including Star and Snowflake …
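The Bronze/Silver/Gold flow the listing above refers to can be sketched in plain Python (Spark is omitted so the example stays self-contained; all record and field names are invented for illustration):

```python
# Illustrative medallion-style layering: raw records land in bronze,
# are cleaned and deduplicated into silver, then aggregated into gold.
# Field names here are hypothetical, not from any real schema.

bronze = [  # raw ingested events, duplicates and bad rows included
    {"order_id": 1, "amount": "10.0", "country": "UK"},
    {"order_id": 1, "amount": "10.0", "country": "UK"},  # duplicate
    {"order_id": 2, "amount": None, "country": "UK"},    # invalid record
    {"order_id": 3, "amount": "5.5", "country": "FR"},
]

# Silver: drop invalid rows, deduplicate on the business key, cast types.
seen = set()
silver = []
for row in bronze:
    if row["amount"] is None or row["order_id"] in seen:
        continue
    seen.add(row["order_id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: aggregate into an analytics-ready shape (revenue per country).
gold = {}
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]

print(gold)  # {'UK': 10.0, 'FR': 5.5}
```

In a real PySpark pipeline each layer would be a Delta table and the loops would be DataFrame transformations, but the layering contract is the same.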
Databricks, or equivalent) Proficiency in ELT/ETL development using tools such as Data Factory, Dataflow Gen2, Databricks Workflows, or similar orchestration frameworks Experience with Python and/or PySpark for data transformation, automation, or pipeline development Familiarity with cloud services and deployment automation (e.g., Azure, AWS, Terraform, CI/CD, Git) Ability to deliver clear, insightful, and performant …
record in data integration, ETL processes, and optimising large-scale data systems Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks) Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation Competent in integrating relational …
Wiltshire, England, United Kingdom Hybrid / WFH Options
Data Science Talent
and DevOps and collaborate with the Software Delivery Manager and data engineering leadership. What you'll need: Hands-on Databricks experience; strong Azure Cloud knowledge; proficiency in SQL, Python, and PySpark; ETL & pipeline design (Matillion preferred, alternatives acceptable); practical data modelling & pipeline architecture; Terraform or Bicep for IaC. About the company The company is one of the longest-established financial …
Swindon, Wiltshire, South West England, United Kingdom Hybrid / WFH Options
Data Science Talent
and DevOps and collaborate with the Software Delivery Manager and data engineering leadership. What you'll need: Hands-on Databricks experience; strong Azure Cloud knowledge; proficiency in SQL, Python, and PySpark; ETL & pipeline design (Matillion preferred, alternatives acceptable); practical data modelling & pipeline architecture; Terraform or Bicep for IaC. About the company The company is one of the longest-established financial …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
including Azure DevOps or GitHub Considerable experience designing and building operationally efficient pipelines, utilising core Cloud components, such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark etc Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use Strong understanding and/or use of …
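A curated dimensional model in the gold layer, as mentioned above, typically joins a fact table to conformed dimensions at query time. A minimal sketch with invented tables and keys:

```python
# Hypothetical star-schema lookup: the fact table holds surrogate keys and
# measures; the dimension holds descriptive attributes for slicing.

dim_product = {
    101: {"name": "Widget", "category": "Hardware"},
    102: {"name": "Gizmo", "category": "Hardware"},
}

fact_sales = [
    {"product_key": 101, "qty": 3},
    {"product_key": 102, "qty": 1},
    {"product_key": 101, "qty": 2},
]

# Denormalised reporting view: resolve each fact row's dimension key.
report = [
    {"product": dim_product[f["product_key"]]["name"], "qty": f["qty"]}
    for f in fact_sales
]

total_widgets = sum(r["qty"] for r in report if r["product"] == "Widget")
print(total_widgets)  # 5
```

In a warehouse this join would be SQL over gold-layer tables; the shape (fact keys resolved against dimensions) is the point of the example.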
engagement. Drive innovation through advanced analytics and research-based problem solving. To be successful you should have: 10 years' hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Previous experience in implementing best practices for data engineering, including data governance, data quality, and data security. Proficiency …
Liverpool, Merseyside, North West, United Kingdom Hybrid / WFH Options
Forward Role
regular company events What you'll need: Solid experience in data engineering, management and analysis Strong experience with Azure Data Warehouse solutions and AWS Databricks platforms Exceptional Python/PySpark + additional languages for data processing Strong SQL with experience across both relational databases (SQL Server, MySQL) and NoSQL solutions (MongoDB, Cassandra) Hands-on knowledge of AWS S3 and …
DevOps best practices. Collaborate with BAs on source-to-target mapping and build new data model components. Participate in Agile ceremonies (stand-ups, backlog refinement, etc.). Essential Skills: PySpark and SparkSQL. Strong knowledge of relational database modelling. Experience designing and implementing in Databricks (DBX notebooks, Delta Lakes). Azure platform experience. ADF or Synapse pipelines for orchestration. Python …
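Source-to-target loads into Delta tables of the kind this listing describes are commonly expressed as a SparkSQL MERGE. A sketch that only builds the statement as a string, so it runs without a Spark session (table, view, and column names are placeholders):

```python
# Hypothetical upsert from a staging view into a Delta table via SparkSQL.
# In Databricks this string would be passed to spark.sql(merge_sql); here we
# just construct it.

target, source, key = "silver.orders", "stg_orders", "order_id"

merge_sql = f"""
MERGE INTO {target} AS t
USING {source} AS s
ON t.{key} = s.{key}
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *
""".strip()

print(merge_sql.splitlines()[0])  # MERGE INTO silver.orders AS t
```

The MATCHED/NOT MATCHED clauses give idempotent upserts, which is why MERGE is the usual pattern for incremental silver-layer loads.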
For further details or to enquire about other roles, please contact Nick Mandella at Harnham. KEYWORDS Python, SQL, AWS, GCP, Azure, Cloud, Databricks, Docker, Kubernetes, CI/CD, Terraform, PySpark, Spark, Kafka, machine learning, statistics, Data Science, Data Scientist, Big Data, Artificial Intelligence, private equity, finance.
Bedford, Bedfordshire, England, United Kingdom Hybrid / WFH Options
Reed Talent Solutions
data tooling such as Synapse Analytics, Microsoft Fabric, Azure Data Lake Storage/OneLake, and Azure Data Factory. Understanding of data extraction from vendor REST APIs. Spark/PySpark or Python skills a bonus, or a willingness to develop these skills. Experience with monitoring and failure recovery in data pipelines. Excellent problem-solving skills and attention to detail.
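Extracting from vendor REST APIs, as this listing mentions, usually means walking paginated responses until the API stops returning a continuation token. A sketch with a stubbed fetcher standing in for the real HTTP call (the endpoint, payload shape, and token field are invented):

```python
# Hypothetical paginated extraction. fetch_page stands in for a real HTTP GET
# (e.g. via the requests library); each page returns rows plus a next token.

PAGES = {
    None: {"rows": [{"id": 1}, {"id": 2}], "next": "p2"},
    "p2": {"rows": [{"id": 3}], "next": None},
}

def fetch_page(token):
    # A real client would call the vendor endpoint here and parse JSON.
    return PAGES[token]

def extract_all():
    rows, token = [], None
    while True:
        page = fetch_page(token)
        rows.extend(page["rows"])
        token = page["next"]
        if token is None:  # no continuation token: extraction is complete
            return rows

print(len(extract_all()))  # 3
```

Monitoring and failure recovery, also asked for above, would wrap `fetch_page` with retries and checkpoint the last successful token.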
technical stakeholders. Business acumen with a focus on delivering data-driven value. Strong understanding of risk, controls, and compliance in data management. Technical Skills: Hands-on experience with Python, PySpark, and SQL. Experience with AWS (preferred). Knowledge of data warehousing (DW) concepts and ETL processes. Familiarity with DevOps principles and secure coding practices. Experience: Proven track record …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Forward Role
financial datasets Python experience, particularly for data processing and ETL workflows Hands-on experience with cloud platforms (Azure) Experience designing and maintaining data pipelines using tools like Databricks and PySpark Knowledge of data warehousing solutions - Snowflake experience would be brilliant Understanding of CI/CD processes for deploying data solutions Some exposure to big data technologies and distributed processing …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid / WFH Options
True North Group
business needs. Contribute to pre-sales, proposals, and thought leadership activities. Requirements Strong experience designing and maintaining data platforms and ETL/ELT solutions. Solid knowledge of Databricks, Python, PySpark, Spark SQL, Azure and/or AWS. Data modelling expertise (Inmon, Kimball, Data Vault). Familiar with DataOps practices and pipeline monitoring. Experience with sensitive data and applying security …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Amtis Professional Ltd
Troubleshoot issues and continuously improve data infrastructure Explore AI-driven enhancements to boost data accuracy and productivity Requirements: Strong experience with: Azure Databricks, Data Factory, Blob Storage Python/PySpark SQL Server, Parquet, Delta Lake Deep understanding of: ETL/ELT, CDC, stream processing Lakehouse architecture and data warehousing Scalable pipeline design and database optimisation A proactive mindset, strong …
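Change data capture (CDC), listed above, boils down to replaying insert/update/delete events against a target table in order. A minimal sketch with made-up events (the event shape is illustrative, not any specific connector's format):

```python
# Hypothetical CDC apply loop: each event carries an operation, a business
# key, and (for inserts/updates) the new row image.

events = [
    {"op": "insert", "key": 1, "row": {"name": "a"}},
    {"op": "update", "key": 1, "row": {"name": "b"}},
    {"op": "insert", "key": 2, "row": {"name": "c"}},
    {"op": "delete", "key": 2, "row": None},
]

table = {}
for e in events:
    if e["op"] == "delete":
        table.pop(e["key"], None)  # tolerate deletes for unseen keys
    else:
        # insert and update are both upserts against the key
        table[e["key"]] = e["row"]

print(table)  # {1: {'name': 'b'}}
```

In Databricks the same replay is what a Delta MERGE or `applyChanges` stream performs at scale; the ordering guarantee is the critical part.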
and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. Experience working across both technical and non-technical teams, with the ability …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Datatech
and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. Experience working across both technical and non-technical teams, with the ability …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Datatech Analytics
and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. Experience working across both technical and non-technical teams, with the ability to …
engineering and Azure cloud data technologies. You must be confident working across: Azure Data Services, including: Azure Data Factory Azure Synapse Analytics Azure Databricks Microsoft Fabric (desirable) Python and PySpark for data engineering, transformation, and automation ETL/ELT pipelines across diverse structured and unstructured data sources Data lakehouse and data warehouse architecture design Power BI for enterprise-grade …
complex data sets. Collaborate with data scientists to deploy machine learning models. Contribute to strategy, planning, and continuous improvement. Required Experience: Hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation. Strong Python and SQL skills for data processing and analysis. Deep understanding of data governance, quality, and security. Knowledge of market data and its business …
Atherstone, Warwickshire, England, United Kingdom Hybrid / WFH Options
Aldi
end-to-end ownership of demand delivery Provide technical guidance for team members Providing 2nd or 3rd level technical support About You Experience using SQL, SQL Server DB, Python & PySpark Experience using Azure Data Factory Experience using Databricks and Cloudsmith Data Warehousing Experience Project Management Experience The ability to interact with the operational business and other departments, translating …
London, South East England, United Kingdom Hybrid / WFH Options
Hexegic
and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base salary of …
Slough, South East England, United Kingdom Hybrid / WFH Options
Hexegic
and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base salary of …