via the Apply link on this page. For further details or to enquire about other roles, please contact Nick Mandella at Harnham. KEYWORDS Python, SQL, AWS, GCP, Azure, Cloud, Databricks, Docker, Kubernetes, CI/CD, Terraform, Pyspark, Spark, Kafka, machine learning, statistics, Data Science, Data Scientist, Big Data, Artificial Intelligence, private equity, finance.
on-prem Analysis Services to the cloud, moving our WMS, ERP and bespoke apps to scalable platforms, and expanding AI/ML initiatives. We are evaluating both Fabric and Databricks as a potential strategic fit. As our hands-on Data & Analytics Engineer, you'll play a key role in shaping and influencing that decision, helping assess the best platform for our … ownership across tooling, standards, engineering and enablement rather than being confined to a narrow BI silo. Key Responsibilities Data Platform Build-out Design relational & Lakehouse schemas in Fabric or Databricks Lead the re-architecture of SSAS cubes to modern Lakehouse models Set up medallion architecture and govern data pipelines. Contribute to the evaluation and selection of data platform architecture based … dimensional modelling, 3NF) Experience designing and building modern data pipelines and Lakehouse architectures Hands-on experience with at least one enterprise-grade data platform (e.g., Microsoft Fabric, Azure Synapse, Databricks, or equivalent) Proficiency in ELT/ETL development using tools such as Data Factory, Dataflow Gen2, Databricks Workflows, or similar orchestration frameworks Experience with Python and/or PySpark for …
Rugby, Warwickshire, West Midlands, United Kingdom Hybrid / WFH Options
Data Careers
DevOps for version control and deployment Confident presenting to stakeholders and running end-user training sessions Passion for clean design and data storytelling Any exposure to Microsoft Fabric or Databricks is a bonus.
machines - both Windows and Linux. Familiarity with server patching and maintenance. Strong understanding of security best practices within Azure and ideally AWS. Experience of configuring cloud data services (preferably Databricks) in Azure and ideally AWS. Excellent communication and collaboration skills, with the ability to work across multiple technical and non-technical teams. What happens now? After submitting your application for …
City of London, London, United Kingdom Hybrid / WFH Options
Fynity
You will be adaptable and able to work with different tools and technologies, depending on the client and project needs, but will ideally possess strength in: MS Fabric stack Databricks Azure SQL Tableau/Power BI A knowledge of data modelling and of general IT architecture and systems integration is also required. Other technologies such as Azure Data Factory, Redshift …
Liverpool, Merseyside, North West, United Kingdom Hybrid / WFH Options
Forward Role
scheme, long service recognition, and regular company events What you'll need: Solid experience in data engineering, management and analysis Strong experience with Azure Data Warehouse solutions and AWS Databricks platforms Exceptional Python/PySpark + additional languages for data processing Strong SQL with experience across both relational databases (SQL Server, MySQL) and NoSQL solutions (MongoDB, Cassandra) Hands-on knowledge …
London, South East, England, United Kingdom Hybrid / WFH Options
Avencia Consulting
within the insurance sector, ideally with London Market experience. Strong knowledge of insurance data models, including policy, claims, and risk-related structures. Expertise in Azure Cloud Services (Data Factory, Databricks, Key Vault, Azure SQL) and data warehousing methodologies (Kimball, Inmon). Proficiency in T-SQL for data transformation and summarisation, with Python skills for building data pipelines. Advanced Excel skills …
Experience building production-grade data pipelines (primarily in Python and SQL) in cloud environments, with an emphasis on scalability, code clarity, and long-term maintainability Hands-on experience with Databricks and/or Spark, especially Delta Lake, Unity Catalog, and MLflow Deep familiarity with cloud platforms, particularly AWS and Google Cloud Proven ability to manage data architecture and production pipelines …
Python and AI/ML frameworks such as PyTorch, LangChain, LangGraph, GraphRAG, and AutoGen. Experience with modern vector and graph databases (e.g., ChromaDB, Neo4j) and LLMOps platforms (e.g., Azure, Databricks, Azure OpenAI). Proven track record of delivering scalable AI solutions in enterprise settings, preferably in life sciences. Excellent communication and interpersonal skills, with the ability to lead projects and …
strategy roles, ideally within a complex, regulated, or customer-centric environment Degree in a quantitative field (e.g., Data Science, Economics, Engineering, Business Analytics) Deep understanding of data platforms (e.g., Databricks, Tableau Cloud), data governance, and AI/ML applications in business contexts Strategic thinker with a bias for action and a passion for unlocking business value through data Excellent communication …
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Deloitte LLP
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Deloitte LLP
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Deloitte LLP
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar …
like web applications on Azure cloud infrastructure with Docker, Kubernetes, automation testing and DevOps (ideally Terraform + GitHub Actions). Create, monitor and maintain data pipelines and batch processes (Databricks, PySpark, lakehouses, Kafka). Productionise quant models into software applications, ensuring robust day-to-day operation, monitoring and back testing are in place. Stakeholder engagement: Elicit, validate and translate business …
Leicester, Leicestershire, England, United Kingdom
Harnham - Data & Analytics Recruitment
and manage ETL pipelines and APIs to improve data delivery and accuracy Contribute to the overall data strategy alongside BI and analytics colleagues Tech You'll Use SQL Python Databricks Experience with web scraping or similar techniques would be a bonus. Interview Process Initial interview with the Data Manager Technical discussion/assessment (you'll walk through your approach to …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid / WFH Options
True North Group
to meet business needs. Contribute to pre-sales, proposals, and thought leadership activities. Requirements Strong experience designing and maintaining data platforms and ETL/ELT solutions. Solid knowledge of Databricks, Python, PySpark, Spark SQL, Azure and/or AWS. Data modelling expertise (Inmon, Kimball, Data Vault). Familiar with DataOps practices and pipeline monitoring. Experience with sensitive data and applying …
through advanced analytics and research-based problem solving. To be successful you should have: 10 years' hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Previous experience in implementing best practices for data engineering, including data governance, data quality, and data security. Proficiency in data processing and …
in BFSI or enterprise-scale environments is a plus. Preferred: Exposure to cloud platforms (AWS, Azure, GCP) and their data services. Knowledge of Big Data platforms (Hadoop, Spark, Snowflake, Databricks). Familiarity with data governance and data catalog tools.