London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
availability of data assets. Experience: 3-5 years of hands-on experience as a Data Engineer working in Azure cloud environments. Strong proficiency in Azure Data Factory, Synapse Analytics, Databricks, and Azure SQL. Solid understanding of data modeling, ETL/ELT design, and data warehousing principles. Proficiency in SQL and Python for data transformation and automation. Experience working with version More ❯
robust data platforms. Innovation: Stay current with emerging Azure technologies and best practices in data architecture. Required Skills & Experience - Technical Expertise: Extensive experience with Azure Data Services: Data Factory, Databricks, Synapse, Data Lake, Azure SQL. Strong understanding of data modeling, data warehousing, and distributed computing. Proficiency in Python, SQL, and Spark for data engineering tasks. Financial Services Domain: Proven track More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
sets. Troubleshoot and resolve data issues proactively. Required Qualifications: 3-5+ years of experience as a Data Engineer or similar role. Strong experience with Azure data services (ADF, Databricks, ADLS, Synapse, Event Hub, etc.). Proficiency in SQL and experience with Python/PySpark . Hands-on experience building ETL/ELT pipelines in cloud environments. Solid understanding of More ❯
Newark, Nottinghamshire, East Midlands, United Kingdom Hybrid/Remote Options
The Wildlife Trust
people-focused data engineering and analysis. You will have experience in a data engineering role, ideally with practical experience or the ability to upskill in cloud services like Azure, Databricks and ESRI, as well as excellent proven proficiency in SQL and Python. Ideally you would have familiarity with developing pipelines which support Analysts who use RStudio, Power BI and/ More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
The Data Engineer will have demonstrated experience as a Data Engineer, with a strong track record of success. Mastery of data management and processing tools, including Power BI, Data Factory, Databricks, SQL Server, and Oracle. Proficient in SQL and experienced in database administration. Familiarity with cloud platforms such as Azure and AWS. Excellent problem-solving and analytical skills, with strong attention More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Additional Resources Ltd
Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience monitoring live services using tools such as Grafana, Prometheus More ❯
Warwick, Warwickshire, West Midlands, United Kingdom
Stackstudio Digital Ltd
effectively with stakeholders, ensuring clear communication throughout project lifecycles. Skills, Experience, and Abilities Required: Essential: Expert-level experience with Azure Data Factory (ADF) Hands-on experience with Snowflake and Databricks Proficiency in Python programming Strong Azure and SQL knowledge Excellent debugging and troubleshooting abilities Good communication skills Desirable: 6 to 8 years of relevant experience in data engineering or related More ❯
validation routines, including monitoring, alerting, and automated checks. Optimise data workflows for performance, cost-efficiency, and maintainability, using platforms such as Azure Data Factory, AWS Data Pipeline, Glue, Lambda, Databricks, and Apache Spark. Support the integration of transformed data into visualisation and analytical platforms, including Power BI, ServiceNow, and Amazon QuickSight. Ensure compliance with data governance, security, and privacy standards More ❯
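The validation and monitoring duties described above typically boil down to small automated checks that run after each pipeline stage. Below is a minimal sketch of one such check, assuming a Databricks/PySpark environment; the table name, column, and threshold are illustrative placeholders, not details taken from the listing.

```python
# Minimal data-validation sketch (hypothetical table/column/threshold).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("validation-check").getOrCreate()

# Hypothetical staging table produced by an upstream pipeline step.
df = spark.read.table("staging.customer_orders")

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
null_ratio = null_ids / total if total else 1.0

# Threshold-based check: fail the run if more than 1% of rows lack an ID.
if null_ratio > 0.01:
    raise ValueError(f"Validation failed: {null_ratio:.2%} of rows missing order_id")

print(f"Validation passed: {total} rows, {null_ratio:.2%} null order_id")
```

In practice the failure branch would be wired into the platform's monitoring and alerting (for example a Databricks job alert or an Azure Monitor rule) rather than simply raising an exception.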
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid/Remote Options
VANLOQ LIMITED
solutions. Support the migration and modernisation of data infrastructure using cloud-based technologies. Skills & Experience: Strong proficiency in Python for data engineering and automation tasks. Hands-on experience with Databricks or Snowflake (experience in both is a plus). Proven track record in developing and maintaining high-quality data solutions within complex environments. Background in financial services and/or More ❯
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
data pipelines Automating and orchestrating workflows (AWS Glue, Azure Data Factory, GCP Dataflow) Working across leading cloud platforms (AWS, Azure, or GCP) Implementing and optimising modern data architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL , and pipeline tools such as dbt or Airflow Proven background More ❯
track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience implementing More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Executive Facilities
Experience domains. Proficiency in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and More ❯
routines, and monitoring processes to maintain data integrity and reliability. * Optimise data workflows for performance, cost-efficiency, and maintainability using tools such as Azure Data Factory, AWS Data Pipeline, Databricks, or Apache Spark. * Integrate and prepare data for Tableau dashboards and reports, ensuring optimal performance and alignment with business needs. * Collaborate with visualisation teams to develop, maintain, and enhance More ❯
frameworks (e.g., Pandas, NumPy, Scikit-learn, PyTorch, TensorFlow). Experience developing and fine-tuning LLMs and working with generative AI tools. Hands-on experience with Microsoft Azure (Azure ML, Databricks, or other AI services). Ability to handle, process, and analyse large, complex datasets. Strong understanding of data ethics, governance, and responsible AI principles. Must have SC Clearance and More ❯
backend components of a next-gen trading platform. Work with Java and/or Kotlin to deliver robust solutions. Deploy containerised applications using Kubernetes and Docker. Leverage MongoDB and Databricks for data processing and analytics. Integrate with relational databases and support legacy data migration. Collaborate with stakeholders and contribute to technical decision-making. Ensure code quality through testing, debugging, and More ❯
business impact. What You'll Bring: 8+ years in Data Analytics/Science. Expertise in SQL and dashboarding tools (Tableau/Qlik). Familiarity with big data tools (Snowflake, Databricks) and ETL. Experience with A/B testing and Python/R is preferred. Contract Details: Location: London, UK Duration: 10 Months Rate: Up to £277 (Umbrella) Ready to More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Sanderson
on the lookout for a Senior Data Engineer who has experience in: AWS - S3 for storage, Lambda functions, Athena (thus strong SQL). Open to interchangeable Azure experience. Databricks. Python. DevOps experience a bonus, e.g. Terraform, Drone, Kubernetes cluster management for microservice-style API data consumption. Consultative behaviour - strong stakeholder engagement skills, providing true consultancy and leading the user More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Morgan McKinley
Experience in Data Analytics or Data Science, ideally in a Customer Success, Operations or Digital team. Strong SQL skills and experience with dashboarding tools such as Tableau. Familiarity with Snowflake, Databricks, Spark, and ETL processes. Python or automation experience is a plus. Knowledge of A/B testing and statistics. Strong communication and storytelling skills to influence diverse audiences. Experience with call More ❯
Worthing, Sussex, United Kingdom Hybrid/Remote Options
Adecco
of data governance, data management, and metadata practices. Hands-on experience with data quality tooling (e.g., Microsoft Purview, Informatica, Collibra, Talend). Familiarity with cloud-based data architectures (Azure, Databricks, Power BI). Strong analytical and problem-solving skills, with experience in designing and implementing data quality KPIs and dashboards. Excellent stakeholder engagement and communication skills. Experience in regulated industries More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
Akkodis is partnering with a reputable client in Finance and they are looking for a Data Engineer with strong Databricks experience to expand their IT Team. They are currently improving their CX transformation programme and making changes to their institutional reporting suite.
Job title: Data Engineer - Databricks
Start: ASAP
Location: Liverpool Street - 3 days a week (non-negotiable), the rest WFH
Duration: … months rolling contract
Type of contract: Freelance, Inside IR35
Level: Mid-Senior
Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues.
Requirements: Proven experience with Databricks and Snowflake. Strong proficiency in SQL and Python. Extract Load Transform procedures, Data Lineage & Analysis. Familiarity with cloud platforms (e.g., Azure, AWS, or GCP). Experience with CI/CD and version control tools (e.g., Git). Excellent problem-solving and communication skills.
Nice to have: Financial data knowledge on funds, securities, positions, trades and background in Asset Management.
This opportunity is for a More ❯
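To make the duties above concrete, here is a minimal sketch of a Databricks-to-Snowflake pipeline step of the kind described (read, transform with Spark, load into Snowflake), assuming a Databricks cluster with the Snowflake Spark connector available; the table names, columns, and connection values are hypothetical placeholders, not taken from the listing.

```python
# Hypothetical Databricks -> Snowflake ETL step: aggregate raw CX events per client per day.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cx-reporting-etl").getOrCreate()

# Read raw events from a Delta table (placeholder name).
raw = spark.read.table("raw.cx_events")

# Transform: keep completed events and aggregate per client per day.
daily = (
    raw.filter(F.col("status") == "COMPLETED")
       .groupBy("client_id", F.to_date("event_ts").alias("event_date"))
       .agg(F.count("*").alias("event_count"))
)

# Load into Snowflake via the Spark connector (connection values are placeholders;
# a real job would read credentials from a secret scope, not literals).
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "REPORTING",
    "sfSchema": "CX",
    "sfWarehouse": "ETL_WH",
}
(daily.write.format("snowflake")
      .options(**sf_options)
      .option("dbtable", "DAILY_CX_EVENTS")
      .mode("overwrite")
      .save())
```

On Databricks the connector is usually addressed by the short format name "snowflake"; outside Databricks the fully qualified name net.snowflake.spark.snowflake is the more common choice.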
and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will also have a number of years' hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
Adecco
within a consumer function. Key Skills & Experience: Previous experience working within Collections. Strong knowledge of Credit Reference Agencies. A minimum of 2 years' experience in Credit Risk. Advanced skills in Databricks, Python, SQL, Power BI. Expert-level skills in Microsoft Excel. Proven ability to translate complex data sets into clear, actionable insights for senior leadership. Excellent communication skills and senior stakeholder More ❯