London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control tools. Experience in Agile environments, with solid CI/CD …
Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory … in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis. Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows. Azure Data Services: Hands-on experience with Azure services like Azure Data …
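For orientation, since several of these roles describe the same pattern, here is a minimal sketch of a PySpark job of the kind the listing above asks for: ingest raw files, standardise them, and publish a Delta table. The storage account, paths, and column names are hypothetical, and the Delta writer assumes a Delta-enabled runtime such as Databricks.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-transform").getOrCreate()

# Ingest raw CSV landed in the lake (storage account and path are illustrative).
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/orders/"
)

# Standardise types and guard data integrity before publishing.
cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)

# Publish to the curated zone as Delta for downstream consumers.
cleaned.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/orders/"
)

In a production pipeline this step would typically run as one task in a Databricks Workflow or an Azure Data Factory pipeline, which handles the orchestration the listing mentions.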
create bespoke, scalable data solutions Support data migration efforts from Azure to Databricks Use Terraform to manage and deploy cloud infrastructure Build robust data workflows in Python (e.g., pandas, PySpark) Ensure the platform is scalable, efficient, and ready for future AI use cases REQUIRED SKILLS & EXPERIENCE Strong experience with Azure and Databricks environments Advanced Python skills for data engineering … pandas, PySpark) Proficiency in designing and maintaining ETL pipelines Experience with Terraform for infrastructure automation Track record of working on cloud migration projects, especially Azure to Databricks Comfortable working onsite in London 2 days/week and engaging cross-functionally Strong communication and problem-solving abilities NICE TO HAVES Experience with Qlik or other data visualisation tools Exposure to …
Bracknell, Berkshire, South East, United Kingdom Hybrid / WFH Options
Halian Technology Limited
business intelligence, reporting, and regulatory needs Lead the integration and optimisation of large-scale data platforms using Azure Synapse and Databricks Build and maintain robust data pipelines using Python (PySpark) and SQL Collaborate with data engineers, analysts, and stakeholders to ensure data quality, governance, and security Ensure all solutions adhere to financial regulations and internal compliance standards Key Skills … Experience: Proven experience as a Data Architect within the financial services sector Hands-on expertise with Azure Synapse Analytics and Databricks Strong programming and data engineering skills in Python (PySpark) and SQL Solid understanding of financial data and regulatory compliance requirements Excellent stakeholder communication and documentation skills …
and managing project changes and interventions to achieve project outputs. Documenting all aspects of the project for future reference and audits. Technical Responsibilities: Developing SQL scripts (stored procedures) and PySpark notebooks. Creating and managing ingestion, ETL & ELT processes. Designing and configuring Synapse pipelines. Data modelling in various storage systems. Analysing existing data designs and suggesting improvements for performance, stability … Experience in Project Management within the Defence & Security sector. Strong technical skills in API, Java, Python, Web Development, SQL, and Azure. Proficiency in developing and managing SQL scripts and PySpark notebooks. Understanding of ETL & ELT processes and Synapse pipeline design and configuration. Experience in data modelling and improving existing data designs. Knowledge of real-time data processing. Capable of …
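As a sketch of the notebook work this listing describes, mixing Spark SQL and PySpark for an ELT step: the schema and table names below are hypothetical, and CREATE OR REPLACE TABLE assumes a Delta-backed catalog (Databricks, or Synapse Spark with Delta).

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt-notebook").getOrCreate()

# An ELT step as it might appear in a Synapse or Databricks notebook cell:
# aggregate raw orders into a curated reporting table.
spark.sql("""
    CREATE OR REPLACE TABLE curated.customer_orders AS
    SELECT c.customer_id,
           c.region,
           COUNT(o.order_id) AS order_count,
           SUM(o.amount)     AS total_spend
    FROM raw.customers c
    LEFT JOIN raw.orders o
      ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.region
""")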
Northampton, Northamptonshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
office 2/3 days in a week) Duration: 6 months Inside IR35 Job description: This will be a Tech Lead who is proficient in developing complex logic using PySpark in AWS, and who can help and lead the team. 7+ years of experience in designing and developing complex logic for data pipelines using PySpark in AWS. He/she needs to be experienced and skilled in PySpark, Glue, Python, SQL and data processing. This involves designing ETL processes, ensuring data security, and collaborating with other teams for data analysis and business requirements. Skilled in scalable, reliable, and efficient data solutions, often using AWS services like S3, Redshift, EMR, Glue, and Kinesis …
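To make that stack concrete, a minimal sketch of an AWS Glue PySpark job follows: read raw events from S3, aggregate them, and write the result back to a curated bucket. Bucket names and the event schema are hypothetical.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw events from S3 (bucket and layout are illustrative).
events = spark.read.json("s3://example-raw-bucket/events/")

# Aggregate to a daily summary before loading it to the curated zone.
daily = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date", "event_type")
          .count()
)

daily.write.mode("overwrite").parquet("s3://example-curated-bucket/daily/")
job.commit()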
Work with the team to support ETL processes What you'll need to succeed Seasoned knowledge of the Azure Databricks platform and associated functionalities Strong Python programming knowledge, ideally PySpark A logical and analytical approach to problem-solving Awareness of the modern data stack and associated methodologies What you'll get in return A rewarding contract providing exposure to …
Employment Type: Contract
Rate: £500 - £650 per day
of ETL Processing/Data Warehousing testing, including Databricks and Data Factory. - Hands-on experience with SQL or Azure SQL. - Experience using automated testing on Python frameworks (Pytest/PySpark) - Experience with Specflow and other frameworks. If you are interested in this Data Tester role, please apply with your most recent CV; alternatively, reach out to me: jordan . …
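For context on the automated-testing requirement, a minimal Pytest sketch for a PySpark transformation might look like the following; add_total is a hypothetical stand-in for the code under test.

import pytest
from pyspark.sql import SparkSession, functions as F


def add_total(df):
    # Hypothetical transformation under test: total = quantity * unit_price.
    return df.withColumn("total", F.col("quantity") * F.col("unit_price"))


@pytest.fixture(scope="session")
def spark():
    # A local Spark session is enough for unit-level pipeline tests.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_add_total(spark):
    df = spark.createDataFrame([(2, 5.0), (3, 1.5)], ["quantity", "unit_price"])
    result = {r["quantity"]: r["total"] for r in add_total(df).collect()}
    assert result == {2: 10.0, 3: 4.5}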
NHS Arden and Greater East Midlands Commissioning Support Unit
have strong coding skills to support the automation of reporting and the development of pipelines to feed dashboards. In particular, this includes proficiency in coding using RStudio and PySpark, the ability to use GitHub, and the ability to develop dashboards using Tableau, Databricks, and/or FDP. The post-holder should have strong requirements gathering skills to understand and design …
Skills & Experience: 4+ years of experience in AI/ML engineering or data-intensive systems. Strong proficiency in Python for AI, ML, and data engineering tasks. Deep experience with Apache Spark (PySpark or Scala-based implementations). Solid understanding and hands-on experience with modelling intelligent agents, including symbolic, neural, or hybrid approaches. Experience deploying to and managing workloads on MCP or distributed …
and end users to define, test and deliver technical and functional requirements. - You will need experience implementing Master Data Management programmes; this is mandatory for the role. - SQL and PySpark knowledge. - Azure Databricks experience. - Experience with data querying and data profiling. - Experience working with large data sets, especially analysing and cleansing said data. - Strong communication skills. - Experience working within …
TECHNICAL PROGRAMME MANAGER - DATA INGESTION (PHARMA/SNOWFLAKE)
UP TO £560 PER DAY
HYBRID (1/2 DAYS PER WEEK IN SPAIN & GERMANY)
6 MONTHS
THE COMPANY: A global data and analytics consultancy are delivering a large-scale data ingestion …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Proven experience as a Programme or Delivery Manager on data-centric programmes Solid understanding of data ingestion processes and Snowflake data warehousing Familiarity with AWS Glue, S3, DBT, SnapLogic, PySpark (not hands-on, but able to converse technically) Strong governance and delivery background in a data/tech environment Excellent communication and stakeholder management skills (must be assertive). Pharma …