London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
availability of data assets. Experience: 3-5 years of hands-on experience as a Data Engineer working in Azure cloud environments. Strong proficiency in Azure Data Factory, Synapse Analytics, Databricks, and Azure SQL. Solid understanding of data modeling, ETL/ELT design, and data warehousing principles. Proficiency in SQL and Python for data transformation and automation. Experience working with version More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
. Familiarity with Snowflake cost monitoring, governance, replication, and environment management. Strong understanding of data modelling (star/snowflake schemas, SCDs, lineage). Proven Azure experience (Data Factory, Synapse, Databricks) for orchestration and integration. Proficient in SQL for complex analytical transformations and optimisations. Comfortable working in agile teams and using Azure DevOps for CI/CD workflows. Nice to Have More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
technologies in data engineering, and continuously improve your skills and knowledge. Profile The Data Engineer will have mastery of data management and processing tools, including Power BI, Data Factory, Databricks, SQL Server, and Oracle. Proficient in SQL and experienced in database administration. Familiarity with cloud platforms such as Azure and AWS. Excellent problem-solving and analytical skills, with strong attention More ❯
Newark, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
The Wildlife Trust
people-focused data engineering and analysis. You will have experience in a data engineering role, ideally with practical experience in, or the ability to upskill in, cloud services such as Azure, Databricks and ESRI, as well as excellent proven proficiency in SQL and Python. Ideally you would be familiar with developing pipelines that support Analysts who use RStudio, Power BI and/ More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Additional Resources Ltd
Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience monitoring live services using tools such as Grafana, Prometheus More ❯
Warwick, Warwickshire, West Midlands, United Kingdom
Stackstudio Digital Ltd
effectively with stakeholders, ensuring clear communication throughout project lifecycles. Skills, Experience, and Abilities Required: Essential: Expert-level experience with Azure Data Factory (ADF) Hands-on experience with Snowflake and Databricks Proficiency in Python programming Strong Azure and SQL knowledge Excellent debugging and troubleshooting abilities Good communication skills Desirable: 6 to 8 years of relevant experience in data engineering or related More ❯
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
data pipelines Automating and orchestrating workflows (AWS Glue, Azure Data Factory, GCP Dataflow) Working across leading cloud platforms (AWS, Azure, or GCP) Implementing and optimising modern data architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background More ❯
track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience implementing More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Executive Facilities
Experience domains. Proficiency in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and More ❯
frameworks (e.g., Pandas, NumPy, Scikit-learn, PyTorch, TensorFlow). Experience developing and fine-tuning LLMs and working with generative AI tools. Hands-on experience with Microsoft Azure (Azure ML, Databricks, or other AI services). Ability to handle, process, and analyse large, complex datasets. Strong understanding of data ethics, governance, and responsible AI principles. Must have SC Clearance and More ❯
backend components of a next-gen trading platform. Work with Java and/or Kotlin to deliver robust solutions. Deploy containerised applications using Kubernetes and Docker. Leverage MongoDB and Databricks for data processing and analytics. Integrate with relational databases and support legacy data migration. Collaborate with stakeholders and contribute to technical decision-making. Ensure code quality through testing, debugging, and More ❯
published data models and reports. Experience required: Strong background in data engineering, warehousing, and data quality. Proficiency in Microsoft 365, Power BI, and other BI tools Familiarity with Azure Databricks and Delta Lake is desirable. Ability to work autonomously in a dynamic environment and contribute to team performance. Strong communication, influencing skills, and a positive, can-do attitude. Knowledge of More ❯
business impact. What You'll Bring: 8+ years in Data Analytics/Science. Expertise in SQL and dashboarding tools (Tableau/Qlik). Familiarity with big data tools (Snowflake, Databricks) and ETL. Experience with A/B testing and Python/R is preferred. Contract Details: Location: London, UK Duration: 10 Months Rate: Up to £277 (Umbrella) Ready to More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
Experience in Data Analytics or Data Science, ideally in Customer Success, Operations or Digital Team. Strong SQL skills and experience with dashboarding tools such as Tableau. Familiarity with Snowflake, Databricks, Spark, and ETL processes. Python or automation is a plus. Knowledge of A/B testing and statistics. Strong communication and storytelling skills to influence diverse audiences. Experience with call More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Akkodis
Akkodis is partnering with a reputable client in Finance and they are looking for a Data Engineer with strong Databricks experience to expand their IT Team. They are currently improving their CX transformation programme and making changes to their institutional reporting suite. Job title: Data Engineer - Databricks. Start: ASAP. Location: Liverpool Street - 3 days a week (non-negotiable), the rest WFH. Duration: … months rolling contract. Type of contract: Freelance, Inside IR35. Level: Mid-Senior. Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues. Requirements: Proven experience with … Databricks and Snowflake. Strong proficiency in SQL and Python. Extract Load Transform procedures, Data Lineage & Analysis. Familiarity with cloud platforms (e.g., Azure, AWS, or GCP). Experience with CI/CD and version control tools (e.g., Git). Excellent problem-solving and communication skills. Nice to have: Financial data knowledge on funds, securities, positions, trades and background in Asset Management. This opportunity is for a More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Oliver James
background working with large datasets, with proficiency in SQL, Python, and PySpark * Experience managing and mentoring engineers with varying levels of experience * Hands-on experience deploying pipelines within Azure Databricks, ideally following the Medallion Architecture framework Hybrid working: Minimum two days per week on-site in London. If this sounds like something you'd be interested in, please don't More ❯
and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will also have a number of years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Adecco
within a consumer function. Key Skills & Experience: Previous experience working within Collections Strong knowledge of Credit Reference Agencies A minimum of 2 years' experience in Credit Risk Advanced skills in Databricks, Python, SQL and Power BI Expert-level skills in Microsoft Excel Proven ability to translate complex data sets into clear, actionable insights for senior leadership Excellent communication skills and senior stakeholder More ❯
Bournemouth, Dorset, England, United Kingdom Hybrid / WFH Options
Sanderson
GDPR, BCBS 239, DORA). Oversee data lineage, cataloging, and metadata management using tools such as Collibra, Informatica, Microsoft Purview, or Alation. Support governance integration within platforms like Azure, Databricks, Microsoft Fabric, and BI tools. Promote data literacy and deliver training across the organisation. Provide audit and regulatory evidence, supporting internal/external audits and exams. Report on data governance More ❯
global initiatives, ensuring consistency with enterprise data architecture standards and integration principles. Translate business and information requirements into scalable, secure, and performant data solutions leveraging the enterprise platforms (Azure, Databricks, Power BI, SAP BTP, Salesforce, etc.). Ensure data models, pipelines, and analytics solutions are built with reuse, interoperability, and data quality in mind. Enterprise Alignment Embed the Data & AI … Key Interfaces Enterprise Architecture Domains: Integration, Cybersecurity, Application, Infrastructure Data & AI CoE and Data Governance Teams Business Domains: Commercial, Supply Chain, Enabling Functions (Finance, HR, Legal, etc.) Vendors: Microsoft, Databricks, SAP, Salesforce, Informatica, Collibra Skills and experience: Essential Proven experience designing enterprise-scale data and analytics solutions across hybrid cloud environments. Strong understanding of data modelling, data integration, metadata management … and data governance. Experience with modern data platforms such as Azure Data Lake, Databricks, Power BI, and SAP BTP. Solid grasp of enterprise integration patterns (APIs, streaming, ETL/ELT, event-driven architectures). Ability to translate complex data concepts into clear, value-focused business outcomes. Excellent stakeholder management and communication skills across technical and non-technical audiences. Desirable Experience More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Huxley
prioritise and deliver value What We're Looking For 10+ years of Agile/Scrum delivery experience in banking or financial services Strong understanding of data platforms, ideally with Databricks, Data Lakehouse, or Medallion Architecture Experience working with Finance data/reporting in a corporate or investment banking context Excellent stakeholder management and communication skills Confident using JIRA and leading More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
data-driven decision-making. Responsibilities for the Senior Data Engineer: Design, build, and maintain scalable data pipelines and architectures, ensuring reliability, performance, and best-in-class engineering standards Leverage Databricks, Spark, and modern cloud platforms (Azure/AWS) to deliver clean, high-quality data for analytics and operational insights Lead by example on engineering excellence, mentoring junior engineers and driving … protection of customer data Continuously improve existing systems, introducing new technologies and methodologies that enhance efficiency, scalability, and cost optimisation Essential Skills for the Senior Data Engineer: Proficient with Databricks and Apache Spark, including performance tuning and advanced concepts such as Delta Lake and streaming Strong programming skills in Python with experience in software engineering principles, version control, unit testing … Understanding of machine learning model enablement and operationalisation within data architectures Experience working within Agile delivery environments If you are an experienced Senior Data Engineer with strong expertise in Databricks , Azure , and data architecture design , looking for a hands-on role where you can influence technical direction and drive meaningful change, please apply in the immediate instance. Data Engineer, Senior More ❯