Solihull, West Midlands (County), United Kingdom
Pontoon
public sector setting. Strong programming skills: Python, R, SQL, and ideally knowledge of Scala, Java, or C++. Deep familiarity with the Microsoft technology stack - including Azure Data Factory, Synapse, Databricks, Power BI, Azure ML. Sound understanding of machine learning methods (KNN, Naïve Bayes, SVM, Decision Forests). Solid statistical and mathematical foundations (regression, distributions, linear algebra, multivariable calculus). Excellent …
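As a brief illustration of the k-nearest-neighbours method named in the requirements above, here is a minimal sketch in plain Python. The toy data, function name, and choice of Euclidean distance are illustrative assumptions, not taken from any listing.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest labelled points.

    `train` is a list of (features, label) pairs; distance is Euclidean.
    """
    dists = sorted((math.dist(x, query), label) for x, label in train)
    top_labels = [label for _, label in dists[:k]]
    return Counter(top_labels).most_common(1)[0][0]

# Two toy clusters: class "a" near the origin, class "b" near (5, 5).
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (0.5, 0.5)))  # -> a
print(knn_predict(train, (5.5, 5.5)))  # -> b
```

In practice a library implementation (e.g. a classifier from a standard ML toolkit) would be used rather than hand-rolled code; the sketch only shows the underlying idea of distance-based majority voting.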
Rochdale, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Footasylum Ltd
share best practice. Within the teams we recognise individual skillsets and encourage knowledge sharing sessions and self-development. About You Experience with finance/financial systems and concepts Azure Databricks Azure Data Factory Excellent SQL skills Good Python/Spark/pyspark skills Experience of Kimball Methodology and star schemas (dimensional model). Experience of working with enterprise data warehouse …
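The Kimball-style star schema mentioned above centres a fact table of measures on surrogate keys into dimension tables. A minimal sketch of that shape in plain Python (all table contents and names are invented for illustration):

```python
# Dimension tables keyed by surrogate key; the fact table holds measures
# (units, revenue) plus keys into each dimension.
dim_product = {1: {"name": "Trainers", "category": "Footwear"},
               2: {"name": "Hoodie",   "category": "Apparel"}}
dim_date = {101: {"day": "2024-01-01", "month": "2024-01"}}

fact_sales = [
    {"product_key": 1, "date_key": 101, "units": 2, "revenue": 120.0},
    {"product_key": 2, "date_key": 101, "units": 1, "revenue": 45.0},
    {"product_key": 1, "date_key": 101, "units": 1, "revenue": 60.0},
]

def revenue_by_category(facts, products):
    """Join the fact table to the product dimension and sum a measure."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["revenue"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
# -> {'Footwear': 180.0, 'Apparel': 45.0}
```

In a warehouse the same join-and-aggregate would be a SQL `GROUP BY` over fact and dimension tables; the dict version just makes the star shape explicit.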
in BFSI or enterprise-scale environments is a plus. Preferred: Exposure to cloud platforms (AWS, Azure, GCP) and their data services. Knowledge of Big Data platforms (Hadoop, Spark, Snowflake, Databricks). Familiarity with data governance and data catalog tools.
and Finance. Develop target-state architectures and data strategies in line with business needs. Create and manage conceptual, logical, and physical data models. Design and implement Lakehouse Architectures using Databricks and other modern data platforms. Ensure robust data governance, integration, and quality across systems. Collaborate with IT and business stakeholders to deliver scalable, cloud-based solutions. What We're Looking …
and hands-on experience with Spark. Experience building, maintaining, and debugging dbt pipelines. Strong proficiency in developing, monitoring, and debugging ETL jobs. Deep understanding of SQL and experience with Databricks, Snowflake, BigQuery, Azure, Hadoop, or CDP environments. Hands-on technical support experience, including escalation management and adherence to SLAs. Familiarity with CI/CD technologies and version control systems like …
order to be considered for this role you will need to have - Experience with data management and data architectures (SQL/NoSQL), database systems, modern data warehouse platforms (Snowflake, Databricks, BigQuery), ETL/ELT pipeline development, and data lake/warehouse implementations for integrating structured and unstructured laboratory and sequencing data sources. Master Full-Stack technologies, such as Python, JavaScript …
Do Develop and maintain BI solutions from end-to-end, collaborating with scientists and analysts. Build automated data models and pipelines for large, complex datasets, with a focus on Databricks in Azure. Drive data standardization and best practices to support a strong data culture. What You'll Bring Proven expertise in Power BI, DAX, Power Query (M), Power Apps, and …
City of London, London, United Kingdom Hybrid / WFH Options
Infoplus Technologies UK Ltd
of relational databases - Microsoft SQL Server and PostgreSQL databases/queries. Knowledge of AWS skills, navigating through data analytics services/databases, and experience with data platforms such as Databricks/Azure Datahub and similar ones. Data visualization and reporting tools using Power BI, Microsoft Office 365 skills on Excel integration and SharePoint. Experience with large-scale enterprise-wide IT environments …
Cambridge, Cambridgeshire, East Anglia, United Kingdom
InfinityQuest Ltd,
Job Description Agent Design & Delivery : Architect and implement production-ready AI agents that integrate with enterprise platforms (SAP, Salesforce, ServiceNow, CI/CD, Databricks), with human-in-the-loop controls, observability, and auditability. Hybrid Skillset : Strong grounding in traditional machine learning and data workflows (forecasting, scoring models, risk detection, NLP, etc.) alongside Gen AI/LLM-based architectures. Able to …
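The "human-in-the-loop controls, observability, and auditability" pattern named above can be sketched as an approval gate plus an audit log wrapped around every action an agent proposes. All names here (`AuditedAgent`, `approve`) are illustrative assumptions, not from any specific framework:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class AuditedAgent:
    """Every proposed action passes an approval hook and is logged."""
    approve: Callable[[str], bool]      # human reviewer or stand-in policy
    audit_log: List[str] = field(default_factory=list)

    def act(self, action: str) -> str:
        if self.approve(action):
            self.audit_log.append(f"EXECUTED: {action}")
            return "executed"
        self.audit_log.append(f"BLOCKED: {action}")
        return "blocked"

# A policy standing in for a human reviewer: block destructive actions.
agent = AuditedAgent(approve=lambda a: "delete" not in a)
print(agent.act("create ServiceNow ticket"))  # -> executed
print(agent.act("delete production table"))   # -> blocked
```

In production the `approve` hook would route to an actual reviewer (or a policy engine) and the log to an observability backend; the gate-plus-log structure is the point.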
growth & improvement. Experience of using digital analytics tools such as Google Analytics (or equivalent). A minimum of intermediate SQL ability with experience of using tools such as Google BigQuery, Databricks (or equivalent). Use of visualisation tools (Looker Studio/Power BI/Tableau). Strong stakeholder engagement and ability to work collaboratively across multiple teams. Proactive nature, experience being …
Glasgow Area, Scotland, United Kingdom Hybrid / WFH Options
Infoplus Technologies UK Ltd
Management: tracing and performance monitoring across huge distributed systems Collaborating with cross-functional teams to understand data requirements, and design efficient, scalable, and reliable ETL processes using Python and Databricks Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing …
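The extract-transform-load lifecycle described above can be sketched in plain Python (standing in for Databricks/Spark). The stage names, sample records, and in-memory "warehouse" are illustrative assumptions:

```python
def extract(source):
    """Pull raw records from a source (here, just an in-memory list)."""
    return list(source)

def transform(rows):
    """Cleanse: drop rows with missing amounts, normalise names and types."""
    return [
        {"customer": r["customer"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r.get("amount") is not None
    ]

def load(rows, warehouse):
    """Append transformed rows to the target table; return rows loaded."""
    warehouse.extend(rows)
    return len(rows)

raw = [{"customer": " alice ", "amount": "10.5"},
       {"customer": "bob", "amount": None}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)
# -> 1 [{'customer': 'Alice', 'amount': 10.5}]
```

On Databricks each stage would typically be a Spark DataFrame operation writing to managed tables, but the extract/transform/load separation and the cleansing step are the same.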
regulatory reporting systems in the banking sector. Strong analytical skills with the ability to communicate effectively across teams. Excellent time management and collaboration skills. Desirable Skills Exposure to Azure, Databricks , and middleware (IIB, Kafka, MQ). Experience in creating and managing data marts . Familiarity with modern data integration platforms . Why Apply? This is a fantastic opportunity to work …
written and verbal Self-motivated with a proactive approach to continuous learning Familiarity with query performance tuning and execution plans Experience with Python, Snowflake, PowerShell Exposure to Apache Spark, Databricks, or other streaming platforms Awareness of cloud computing concepts and data warehousing Understanding of EMIR or MiFIR regulatory reporting solutions Why join us Career coaching, mentoring and access to upskilling …
with Celery for distributed task execution and background job processing, particularly in data pipeline or microservices environments Hands-on experience with Azure cloud services, especially Azure Data Factory, Azure Databricks, Azure Storage, and Azure Synapse. Proficiency in designing and deploying CI/CD pipelines using Azure DevOps (YAML pipelines, release management, artifact handling).
DATABRICKS ENGINEER 6-MONTH CONTRACT £550-£600 PER DAY This role offers a great opportunity for an Azure Databricks Engineer to join a renewable energy firm based in London. You'll play a hands-on role in developing and optimising modern data lakehouse solutions on Azure, while supporting critical analytics and data delivery systems. The environment encourages technical ownership, collaboration … You'll be responsible for data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems. Your responsibilities will include: Designing and implementing scalable data lakehouse architectures using Databricks on Azure. Building efficient ETL/ELT pipelines for structured and unstructured data. Working with stakeholders to ensure high-quality, accessible data delivery. Optimising SQL workloads and data flows for … Automating infrastructure deployment using Terraform and maintaining CI/CD practices. Supporting secure and performant data access via cloud-based networking. KEY SKILLS AND REQUIREMENTS Strong experience with Azure Databricks in production environments. Background with Azure Data Factory, Azure Functions, and Synapse Analytics. Proficient in Python and advanced SQL, including query tuning and optimisation. Hands-on experience with big data …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
AMS CWS
the Project Manager include: Experience in large-scale cloud migration programmes. Proven experience delivering data migration or reporting projects within the financial services domain. Knowledge of Power BI, Azure Databricks, and Microsoft Fabric. Excellent stakeholder and third party management experience. Familiarity with data governance practices and tools. Why Tesco Insurance and Money Services? Seeing your impact all around you: there …
Stevenage, Hertfordshire, South East, United Kingdom
Queen Square Recruitment Limited
building and managing Docker containers . Strong Linux Infrastructure knowledge. Solid background in Azure Infrastructure engineering . Nice to Have: Hands-on experience with Domino , Azure DevOps , Python , GitHub , Databricks . Familiarity with Agile Scrum methodologies . Why Join? This is an exciting opportunity to lead technical innovation within HPC, working on impactful projects at scale. You'll have the chance …
Principal Data Engineer - Azure Databricks (Unity Catalog) - Contract Location: Bristol - Hybrid - 2 days a week in the office Contract Length: 12 Months Day Rate: Negotiable Job Ref: J12998 A data for good organisation that is in the early stages of building a modern Analytics and Data Engineering function, are looking to bring in a Principal Data Engineer to support and … responsible for designing and implementing scalable, reusable data pipelines to support a range of analytics and AI-driven projects. The organisation is currently transitioning to Azure as well as Databricks and Principal Data Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embed best practices across the function. They will also lead on … organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence led decision-making. Experience Required: · Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. · Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. · Experience working across both technical and non …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Datatech
Skills: Proven experience in enterprise data modelling. Proficiency with tools such as Erwin or dbt. Knowledge of relational, dimensional, and NoSQL data modelling. Familiarity with cloud platforms (Snowflake, AWS, Databricks). Strong communication and documentation skills. Eligibility: Contractor must be BPSS eligible. PAYE through Umbrella only.