Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
Bristol (4 Days per Week in Office) I am currently seeking a Contract AWS Data Engineer for a scale-up company with several upcoming greenfield projects. Tech Stack: AWS Databricks Lakehouse PySpark SQL ClickHouse/MySQL/DynamoDB If you are interested, please click "Apply" with an updated copy of your CV, and I will contact you to discuss further.
and drive strategic decision-making What You'll Bring Strong grounding in data governance and MDM Experience with tools like Azure Purview Big data cloud familiarity (e.g., Azure, Synapse, Databricks) Excellent stakeholder collaboration and analytical problem-solving skills Agile, iterative project delivery expertise What You'll Get Competitive salary 26 days of annual leave (rising to 30 over time), plus …
London, South East, England, United Kingdom Hybrid / WFH Options
Arc IT Recruitment
looking to blend strategic leadership with hands-on technical skills. The role and skill sets... Ownership & Operation: Lead and operate core AWS services, CI/CD DevOps services, and Databricks data platform services. Scale-Up Experience: Been there, done that in a 200-2000+ sized org during their growth journey. Technically Proficient: Comfortable with large infrastructure changes using Pulumi …
South West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
architectural best practices and deliver high-quality data solutions. This is an opportunity to own technical delivery, influence client architecture, and work with cutting-edge technologies such as Kafka, Databricks, Unity Catalog, and Cloud platforms like AWS, Azure, and GCP. Key Skills of the Lead Data Solution Architect: Proven experience as a Lead Data Solutions Architect, leading end-to-end …
a commercial environment, a great understanding of Java and React, and wider experience in different tools and frameworks and across the stack. Knowledge of Data technologies such as Python and Databricks An enthusiasm for broadening your skillset and learning new technologies Commercial experience working with agile methodologies. Experience dealing with challenges from stakeholders on technical issues and influencing technical decisions in …
Query - experience for data transformation DAX - experience writing DAX measures Familiarity with tools such as Tabular Editor, DAX Studio SQL - ability to effectively query SQL databases Any exposure to Databricks would be beneficial Understanding of relational databases and the Kimball approach to data warehousing Experience of designing, building and maintaining best-practice semantic models Microsoft Certification such as PL-300 and …
Hands-on experience with C4, ArchiMate, and tools like Ardoq, LeanIX, or Orbus Technical expertise in Java, .NET, Azure, CI/CD, APIs, and data platforms (SQL, NoSQL, Snowflake, Databricks) Familiarity with pensions platforms (e.g. Bravura Sonata, TCS BaNCS, FNZ) and cloud migration frameworks Ability to lead complex migrations and design traceable, executable solutions Passion for inclusive collaboration, mentoring, and …
Who we are looking for State Street Associates (SSA) is looking for a highly skilled and motivated Data Scientist with hands-on experience working with Oracle databases and Databricks Spark on AWS. This role will be responsible for managing and analyzing terabytes of data hosted in an enterprise data warehouse and leveraging large-scale data platforms to advance new research and … to compliance with company policies and regulatory requirements. Work with terabytes of structured and semi-structured data from our enterprise data warehouse, primarily stored in Oracle and integrated with Databricks Spark on AWS. Write efficient, production-quality SQL and PL/SQL queries for data extraction and transformation in Oracle. Leverage Databricks Spark and PySpark to process large datasets and … control (roles, privileges, user management, and data security). Working knowledge of AWS core services, including S3, EC2/EMR, IAM, Athena, Glue or Redshift. Hands-on experience with Databricks Spark on large datasets, using PySpark, Scala, or SQL. Familiarity with Delta Lake, Unity Catalog or similar data lakehouse technologies. Proficient in Linux environments, including experience with shell scripting, basic …
RFP responses as well as thought leadership articles that we issue to the market. Experience in Data Platform Technologies, including some of the following capabilities: Understanding of Snowflake/Databricks architecture, including basic data warehousing concepts and data sharing capabilities The ability to write SQL and/or Python queries for the purposes of transforming, joining and aggregating data Ability … to analyse datasets and generate insights using Snowflake and/or Databricks analytical tools and features Practical knowledge of common data engineering and BI/data visualisation integrations such as dbt, Azure Data Factory, Power BI and Sigma Experience using native functionality to deploy and interact with Large Language Models Ability to develop, test and deploy machine learning models Work to … Some hands-on coding experience with SQL, Python or Scala would be advantageous but not compulsory Relevant experience in Data Platform Technologies (knowledge of any or all including Snowflake, Databricks, Microsoft Fabric would be beneficial) Previous consulting experience would be a plus, but so would the curiosity and ambition to develop your career with a consulting role What we look …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
working model (1-2 days in office) Pension contribution Great opportunities for career progression And many more Role & Responsibilities Design and deliver solutions using MS Fabric, ADF, ADL, Synapse, Databricks, SQL, and Python. Work closely with a variety of clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals etc. Use … in building out, developing, and training the data engineering function. What do I need to apply Strong MS data engineering expertise (Fabric, ADF, ADL, Synapse, SQL) Expert use of Databricks Strong Python experience Consultancy experience Leadership experience My client is looking to book first-stage interviews for next week and slots are already filling up fast. I have limited …
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
pipelines and infrastructure, who can implement processes on our modern tech stack with robust, pragmatic solutions. Responsibilities Develop and maintain ETL/ELT data pipelines using AWS data services, Databricks and dbt Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation. … tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of Cloud providers – AWS, Microsoft Azure, Google Cloud Familiarity with dbt, Delta Lake, Databricks Experience working in Agile environments with tools like Jira and Git. About Us We are Citation. We are far from your average service provider. Our colleagues bring their brilliant selves …
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start Role: Senior Data Engineer Location: This is a hybrid engagement represented by 2 days/week onsite, either in Central London or Glasgow. Start Date: Must be able to start mid-August. Salary: £80k-£90k (Senior) | £90k-£95k (Lead) About The Role Our partner is looking … thinking environment. You'll be involved in designing and building production-grade ETL pipelines, driving DevOps practices across data systems and contributing to high-availability architectures using tools like Databricks, Spark and Airflow - all within a modern AWS ecosystem. Responsibilities Architect and build scalable, secure data pipelines using AWS, Databricks and PySpark. Design and implement robust ETL/ELT solutions … reviews and solution design. Requirements Proven experience as a Data Engineer in cloud-first environments. Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift). Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs, etc.). Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting. Hands-on experience with workflow orchestration tools …
Goldthorpe, Rotherham, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Reed
We are on the lookout for a Data Engineer with a robust background in SQL Server and ideally Azure Data Factory. The ideal candidate will have a wealth of experience in business intelligence, data modelling, and visualisation, and will be …
Middlesbrough, Cleveland, England, United Kingdom Hybrid / WFH Options
Reed
We are on the lookout for a Data Engineer with a robust background in SQL Server and ideally Azure Data Factory. The ideal candidate will have a wealth of experience in business intelligence, data modelling, and visualisation, and will be …
collaborate with cross-functional teams to manage complex ETL processes, implement best practices in code management, and ensure seamless data flow across platforms. Projects may include connecting SharePoint to Databricks, optimising Spark jobs, and managing GitHub-based code promotion workflows. This is a hybrid role based in London, with 1-2 days per week in the office. What You'll … Succeed You'll bring 5+ years of data engineering experience, with expert-level skills in Python and/or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, dbt, and handling large structured and unstructured datasets is essential. You're detail …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Qurated
knowledge of cloud platforms (preferably AWS) Experience in Financial Services, ideally Payments Expertise in event-driven architecture, microservices, and streaming platforms (e.g. Kafka) Hands-on experience with tools like Databricks, Snowflake, S3, and broader data ecosystem knowledge Background in software engineering - understands what good implementation looks like Confident communicator who can teach, influence, and represent architecture at senior levels. This …
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
focused on designing modern, scalable, and secure data platforms for enterprise clients. You'll play a key role in shaping data architecture across the full Azure stack - including Azure Databricks and Azure Data Factory (ADF) - and will guide engineering teams in delivering robust, future-proof solutions using lakehouse and medallion architecture principles. Key Responsibilities Design end-to-end data … architectures using Azure services, including Azure Databricks, ADF, Synapse Analytics, and Data Lake Storage Define scalable data models and implement architectural patterns such as lakehouse and medallion Lead technical solution design during client engagements, from discovery to delivery Establish and enforce data governance, modelling, and lifecycle standards Support engineering and DevOps teams with guidance on best practices, CI/CD … and infrastructure-as-code Requirements 7+ years in data architecture or senior engineering roles Strong hands-on experience with Azure Databricks and Azure Data Factory Proficient in SQL, Python, and Spark Expertise in data modelling and architectural patterns for analytics (e.g., lakehouse, medallion, dimensional modelling) Solid understanding of cloud security, private networking, GDPR, and PII compliance Excellent communication skills with …
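The medallion pattern named in the listing above can be illustrated with a minimal, tool-agnostic sketch: raw records land in a "bronze" layer, are cleaned and typed into "silver", and are aggregated into a business-level "gold" view. In a real Azure Databricks platform each layer would be a Delta table; the plain Python structures and all names below are purely illustrative.

```python
# Conceptual sketch of the medallion architecture (bronze -> silver -> gold).
# In Databricks each layer would be a Delta table; lists/dicts stand in here.

raw_events = [                       # "bronze": raw records as ingested
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "bad"},  # malformed record, typical of raw feeds
    {"user": "a", "amount": "4.5"},
]

def to_silver(bronze):
    """Clean and type the raw records, dropping rows that fail validation."""
    silver = []
    for row in bronze:
        try:
            silver.append({"user": row["user"], "amount": float(row["amount"])})
        except ValueError:
            continue  # bad rows are dropped (or quarantined) at the silver stage
    return silver

def to_gold(silver):
    """Aggregate to a business-level view: total amount per user."""
    gold = {}
    for row in silver:
        gold[row["user"]] = gold.get(row["user"], 0.0) + row["amount"]
    return gold

gold = to_gold(to_silver(raw_events))
print(gold)  # {'a': 15.0}
```

The key design point is that each layer is derived from the one below it, so a cleaning or aggregation rule can be changed and the downstream layers rebuilt without re-ingesting the raw data.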
Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and Logic Apps. You'll work across the full data lifecycle - from ingestion to transformation and delivery - enabling smarter, faster insights. Key Responsibilities: * Develop and maintain data pipelines using … Collaborate with cross-functional teams in an agile environment. Collaboration With: * Data Engineers, Architects, Product Owners, Test Analysts, and BI Teams. Skills & Experience: * Proficiency in Azure tools (Data Factory, Databricks, Synapse, etc.). * Strong SQL and experience with data warehousing (Kimball methodology). * Programming skills in Python, Scala, or PySpark. * Familiarity with Power BI, SharePoint, and data integration technologies. * Understanding …
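Kimball-methodology data warehousing, referenced in the skills list above, centres on splitting a flat source feed into dimension tables (descriptive attributes with surrogate keys) and fact tables (measurements referencing those keys). A minimal sketch in plain Python, with all names hypothetical; in the platform described this would typically be done in SQL or PySpark:

```python
# Kimball-style star schema sketch: a flat sales feed is split into a
# product dimension (unique products, surrogate keys) and a fact table
# that references the dimension by key instead of repeating the name.

sales = [
    {"product": "widget", "qty": 2},
    {"product": "gadget", "qty": 1},
    {"product": "widget", "qty": 3},
]

def build_star_schema(rows):
    dim_product = {}  # product name -> surrogate key
    fact_sales = []
    for row in rows:
        # assign a new surrogate key the first time a product is seen
        key = dim_product.setdefault(row["product"], len(dim_product) + 1)
        fact_sales.append({"product_key": key, "qty": row["qty"]})
    return dim_product, fact_sales

dim, fact = build_star_schema(sales)
print(dim)   # {'widget': 1, 'gadget': 2}
print(fact)  # [{'product_key': 1, 'qty': 2}, {'product_key': 2, 'qty': 1},
             #  {'product_key': 1, 'qty': 3}]
```

Because facts carry only keys and measures, the dimension can be enriched or corrected in one place without rewriting the (much larger) fact table.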
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
London. Key Responsibilities - Azure Data Engineer: Design, build and maintain scalable and secure data pipelines on the Azure platform. Develop and deploy data ingestion processes using Azure Data Factory, Databricks (PySpark), and Azure Synapse Analytics. Optimise ETL/ELT processes to improve performance, reliability and efficiency. Integrate multiple data sources including Azure Data Lake (Gen2), SQL-based systems and APIs. … incl. GDPR and ISO standards). Required Skills & Experience - Azure Data Engineer: Proven commercial experience as a Data Engineer delivering enterprise-scale solutions in Azure Azure Data Factory Azure Databricks (PySpark) Azure Synapse Analytics Azure Data Lake Storage (Gen2) SQL & Python Understanding of CI/CD in a data environment, ideally with tools like Azure DevOps. Experience working within consultancy …
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
Databricks Data Engineer (roles available at Lead, Senior and Mid Level), £50,000 - £85,000 + benefits. SQL, ETL, Data Warehousing, Databricks etc. Home Based with one day a month at the office in Nottingham. Strong commercial knowledge of Databricks is required for this role. Expanding SaaS product company are looking for a number of Data Engineers as they continue to grow. … Data Engineers and BAs to understand the data needs of the business. Skills Required Include - Previous experience as a Data Engineer in a delivery-focused environment. Excellent knowledge of Databricks Experience analysing complex business problems and designing workable technical solutions. Excellent knowledge of the SDLC, including testing and delivery in an agile environment. Excellent knowledge of SQL and ETL Experience … expand its data team. In these roles you will use your technical skills and soft skills/people skills, allowing the data team to further develop. Strong, hands-on Databricks skills are mandatory for this role. This role is home based with one day a month at their office in Nottingham. Salary is in the range £50,000 - £85,000 + benefits.
decision making for Cox Automotive. You'll collaborate with a talented team, using open-source tools such as R, Python, and Spark, data visualisation tools like Power BI, and the Databricks data platform. Key Responsibilities: Develop and implement analytics strategies that provide actionable insights for our business and clients. Apply the scientific method to create robust, reproducible solutions Collaborate with stakeholders … seamlessly with team members and external clients. Proficiency in R or Python. Solid understanding of SQL; experience working with Spark (Java, Python, or Scala variants) and cloud platforms like Databricks is a plus. Strong statistical knowledge, including hypothesis testing, confidence intervals, and A/B testing. Ability to understand and communicate the commercial impact of data activities. Why Join Us …
Data Engineer (Databricks) - Leeds Our client is a global innovator and world leader with a highly recognizable name within technology. They are looking for Data Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data …