London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
availability of data assets. Experience: 3-5 years of hands-on experience as a Data Engineer working in Azure cloud environments. Strong proficiency in Azure Data Factory, Synapse Analytics, Databricks, and Azure SQL. Solid understanding of data modeling, ETL/ELT design, and data warehousing principles. Proficiency in SQL and Python for data transformation and automation. Experience working with version …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
… Familiarity with Snowflake cost monitoring, governance, replication, and environment management. Strong understanding of data modelling (star/snowflake schemas, SCDs, lineage). Proven Azure experience (Data Factory, Synapse, Databricks) for orchestration and integration. Proficient in SQL for complex analytical transformations and optimisations. Comfortable working in agile teams and using Azure DevOps for CI/CD workflows. Nice to Have …
robust data platforms. Innovation Stay current with emerging Azure technologies and best practices in data architecture. Required Skills & Experience Technical Expertise Extensive experience with Azure Data Services: Data Factory, Databricks, Synapse, Data Lake, Azure SQL. Strong understanding of data modeling, data warehousing, and distributed computing. Proficiency in Python, SQL, and Spark for data engineering tasks. Financial Services Domain Proven track …
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
technologies in data engineering, and continuously improve your skills and knowledge. Profile The Data Engineer will have mastery of data management and processing tools, including Power BI, Data Factory, Databricks, SQL Server, and Oracle. Proficient in SQL and experienced in database administration. Familiarity with cloud platforms such as Azure and AWS. Excellent problem-solving and analytical skills, with strong attention …
Newark, Nottinghamshire, East Midlands, United Kingdom Hybrid/Remote Options
The Wildlife Trust
people-focused data engineering and analysis. You will have experience in a data engineering role, ideally with practical experience or the ability to upskill in cloud services like Azure, Databricks and ESRI, as well as excellent proven proficiency in SQL and Python. Ideally you would have a familiarity with developing pipelines which support Analysts who use RStudio, Power BI and/…
London, South East, England, United Kingdom Hybrid/Remote Options
Robert Half
The Data Engineer will have demonstrated experience as a Data Engineer, with a strong track record of success. Mastery of data management and processing tools, including Power BI, Data Factory, Databricks, SQL Server, and Oracle. Proficient in SQL and experienced in database administration. Familiarity with cloud platforms such as Azure and AWS. Excellent problem-solving and analytical skills, with strong attention …
City Of Westminster, London, United Kingdom Hybrid/Remote Options
Additional Resources
Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables). Exposure to Spark, Databricks, and data lake/lakehouse environments. Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing. Practical experience monitoring live services using tools such as Grafana, Prometheus …
Warwick, Warwickshire, West Midlands, United Kingdom
Stackstudio Digital Ltd
effectively with stakeholders, ensuring clear communication throughout project lifecycles. Skills, Experience, and Abilities Required: Essential: Expert-level experience with Azure Data Factory (ADF) Hands-on experience with Snowflake and Databricks Proficiency in Python programming Strong Azure and SQL knowledge Excellent debugging and troubleshooting abilities Good communication skills Desirable: 6 to 8 years of relevant experience in data engineering or related …
validation routines, including monitoring, alerting, and automated checks. Optimise data workflows for performance, cost-efficiency, and maintainability, using platforms such as Azure Data Factory, AWS Data Pipeline, Glue, Lambda, Databricks, and Apache Spark. Support the integration of transformed data into visualisation and analytical platforms, including Power BI, ServiceNow, and Amazon QuickSight. Ensure compliance with data governance, security, and privacy standards …
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid/Remote Options
VANLOQ LIMITED
solutions. Support the migration and modernisation of data infrastructure using cloud-based technologies. Skills & Experience: Strong proficiency in Python for data engineering and automation tasks. Hands-on experience with Databricks or Snowflake (experience in both is a plus). Proven track record in developing and maintaining high-quality data solutions within complex environments. Background in financial services and/or …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
data pipelines Automating and orchestrating workflows (AWS Glue, Azure Data Factory, GCP Dataflow) Working across leading cloud platforms (AWS, Azure, or GCP) Implementing and optimising modern data architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we're looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background …
track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, dbt, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience implementing …
London, South East, England, United Kingdom Hybrid/Remote Options
Executive Facilities
Experience domains. Proficiency in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and …
routines, and monitoring processes to maintain data integrity and reliability. * Optimise data workflows for performance, cost-efficiency, and maintainability using tools such as Azure Data Factory, AWS Data Pipeline, Databricks, or Apache Spark. * Integrate and prepare data for Tableau dashboards and reports, ensuring optimal performance and alignment with business needs. * Collaborate with visualisation teams to develop, maintain, and enhance …
frameworks (e.g., Pandas, NumPy, Scikit-learn, PyTorch, TensorFlow). Experience developing and fine-tuning LLMs and working with generative AI tools. Hands-on experience with Microsoft Azure (Azure ML, Databricks, or other AI services). Ability to handle, process, and analyse large, complex datasets. Strong understanding of data ethics, governance, and responsible AI principles. Must have SC Clearance and …
backend components of a next-gen trading platform. Work with Java and/or Kotlin to deliver robust solutions. Deploy containerised applications using Kubernetes and Docker. Leverage MongoDB and Databricks for data processing and analytics. Integrate with relational databases and support legacy data migration. Collaborate with stakeholders and contribute to technical decision-making. Ensure code quality through testing, debugging, and …
business impact. What You'll Bring: 8+ years in Data Analytics/Science. Expertise in SQL and dashboarding tools (Tableau/Qlik). Familiarity with big data tools (Snowflake, Databricks) and ETL. Experience with A/B testing and Python/R is preferred. Contract Details: Location: London, UK Duration: 10 Months Rate: Up to £277 (Umbrella) Ready to …