to prototype development and product improvement. End-to-end implementation of data science pipelines. Experience and Qualifications Knowledge/Skills: Expertise in tools such as SQL, Azure Data Factory, Azure Databricks, Azure Synapse, R, Python and Power BI. Exceptional ability to communicate strategic and technical concepts to a diverse audience and translate business needs into technical requirements. Experience of liaising effectively More ❯
Lincoln, Lincolnshire, East Midlands, United Kingdom Hybrid / WFH Options
Frontier Agriculture Limited
and presenting to senior stakeholders. Comprehensive knowledge of data management principles, including governance, quality, security, and lifecycle management. Familiarity with data tools and platforms: SQL, MS Azure cloud technologies, Databricks, Power BI, EDP, MDM (Master Data Management). Knowledge of data modelling, ETL pipelines, and data warehousing principles. Understanding of data privacy and compliance regulations (e.g. GDPR). Experience with agile More ❯
Employment Type: Permanent, Work From Home
Salary: Competitive + Benefits + 33 Days Holiday + Employee Assistance Program
data engineering or a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures. Proficiency in SQL and at least one programming language More ❯
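To illustrate the kind of orchestration this listing refers to, here is a minimal sketch of an Airflow DAG chaining extract, transform and load steps. It assumes Airflow 2.x is installed; the DAG id, task names and callables are hypothetical placeholders rather than anything taken from the posting.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for real ELT logic
def extract():
    print("pull raw data from source systems")

def transform():
    print("run dbt / Spark transformations")

def load():
    print("publish curated tables to the warehouse")

# A daily DAG wiring the three steps into a linear dependency chain
with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # 'schedule' replaces 'schedule_interval' from Airflow 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```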
Senior Databricks Data Engineer (flexible location) Bibby Financial Services have an exciting opportunity available for a reliable Senior Databricks Data Engineer to join our team. You will join us on a full-time, permanent basis and in return, you will receive a competitive salary of £60,000 - £70,000 per annum. About the role: As our Senior Databricks Data Engineer, you … coach, support and organise to ensure we sustain a predictable pipeline of delivery, whilst ensuring all appropriate governance and best practice is adhered to. Your responsibilities as our Senior Databricks Data Engineer will include: Understand the business/product strategy and supporting goals with the purpose of ensuring data interpretation aligns. Provide technical leadership on how to break down initiatives … databases and APIs. Deliver large-scale data processing workflows (ingestion, cleansing, transformation, validation, storage) using best practice tools and techniques. What we are looking for in our ideal Senior Databricks Data Engineer: A Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Alternatively, relevant experience in the data engineering field. Databricks, including Unity Catalog. Terraform More ❯
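The responsibilities above describe end-to-end data processing workflows (ingestion, cleansing, transformation, validation, storage). Purely as a rough illustration, the sketch below shows what one such batch workflow might look like in PySpark on Databricks with Unity Catalog; the paths, table names and columns are invented for the example and are not taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists; getOrCreate() reuses it elsewhere
spark = SparkSession.builder.getOrCreate()

# Ingest: read raw CSV files landed in cloud storage (the path is a placeholder)
raw = spark.read.option("header", True).csv("/mnt/landing/invoices/")

# Cleanse: cast types, trim identifiers, drop exact duplicates
clean = (
    raw.withColumn("invoice_amount", F.col("invoice_amount").cast("decimal(18,2)"))
       .withColumn("client_id", F.trim(F.col("client_id")))
       .dropDuplicates()
)

# Validate: keep rows passing basic rules and quarantine the rest
valid = clean.filter(F.col("invoice_amount").isNotNull() & (F.col("invoice_amount") > 0))
rejects = clean.subtract(valid)

# Store: persist curated and quarantined data as Delta tables (Unity Catalog three-part names)
valid.write.format("delta").mode("append").saveAsTable("finance.curated.invoices")
rejects.write.format("delta").mode("append").saveAsTable("finance.quarantine.invoices")
```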
practices. Collaborate with cross-functional teams to translate business needs into technical solutions. Core Skills: Cloud & Platforms: Azure, AWS, SAP. Data Engineering: ELT, Data Modeling, Integration, Processing. Tech Stack: Databricks (PySpark, Unity Catalog, DLT, Streaming), ADF, SQL, Python, Qlik. DevOps: GitHub Actions, Azure DevOps, CI/CD pipelines. More ❯
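Since the tech stack above names Delta Live Tables (DLT), a minimal sketch of a declarative DLT pipeline is shown below for orientation. It only runs inside a Databricks DLT pipeline (where `spark` and the `dlt` module are provided); the source path, table names and expectation rule are hypothetical.

```python
import dlt                      # available inside a Databricks Delta Live Tables pipeline
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested with Auto Loader (path is a placeholder)")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")
             .option("cloudFiles.format", "json")
             .load("/mnt/landing/orders/")
    )

@dlt.table(comment="Cleansed orders")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")   # rows failing the rule are dropped
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
           .withColumn("order_ts", F.to_timestamp("order_ts"))
    )
```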
supporting automated workflows in Alteryx Designer. Experience deploying workflows to the Production Gallery. Knowledge of database fundamentals, data design, SQL, and data warehouse concepts is beneficial. Exposure to Power BI, Databricks, Azure, and Profisee is advantageous. Knowledge of JSON, Python, XML, and R is a plus. Experience with non-relational and unstructured data is beneficial. Familiarity with Azure DevOps or GitHub More ❯
engineering or a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least More ❯
Manchester, North West, United Kingdom Hybrid / WFH Options
Morson Talent
Strong DAX skills and the ability to write/optimise complex T-SQL queries. • Experience in applying AI tools to enhance analytics or business operations (desirable). • Familiarity with Databricks and predictive analytics is beneficial. • Experience with Python for forecasting or scenario modelling. • Experience in insurance or financial services is an advantage. • Excellent analytical and problem-solving capabilities. • Strong communication More ❯
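One of the bullets above mentions Python for forecasting or scenario modelling; purely as an illustration, a small Holt-Winters forecast using statsmodels is sketched below. The series is synthetic and the parameters are arbitrary, so treat it as the shape of a solution rather than anything specific to this role.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic monthly series with a mild upward trend (values are illustrative only)
idx = pd.date_range("2021-01-01", periods=36, freq="MS")
rng = np.random.default_rng(42)
series = pd.Series(100 + 1.5 * np.arange(36) + rng.normal(0, 5, 36), index=idx)

# Fit a Holt-Winters model with an additive trend and forecast six months ahead
model = ExponentialSmoothing(series, trend="add", seasonal=None).fit()
forecast = model.forecast(6)
print(forecast.round(1))
```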
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
Broster Buchanan Ltd
build and demonstrate a solid understanding of the concepts of real-time data streaming, batch data processing and data transformation processes. Experience with core tools such as Data Factory, Databricks, Synapse, Kafka and Python. Any exposure to data migration/ETL would be highly beneficial, with SQL/T-SQL, SSIS, SSRS and SSAS, as there is a large data More ❯
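For context on the real-time streaming side mentioned above, below is a rough Spark Structured Streaming sketch that reads a Kafka topic and writes it to a Delta table. It assumes a Databricks or Spark 3.x environment with the Kafka connector available; the broker address, topic, schema and paths are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()

# Expected shape of each Kafka message payload (illustrative only)
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Subscribe to a Kafka topic and parse the JSON value column
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
         .option("subscribe", "payments")                    # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Continuously append the parsed stream to a Delta table, with a checkpoint for recovery
query = (
    events.writeStream.format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/payments")  # placeholder path
          .toTable("staging.payments_stream")                         # placeholder table
)
```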
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
Key Experience Required: Proven experience as a Data Solution Architect on complex, multi-disciplinary consulting engagements. Deep knowledge of Kafka, Confluent, and event-driven architecture. Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures. Strong architectural understanding across AWS, Azure, GCP, and Snowflake. Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java). Knowledge of data More ❯
Python capabilities - minimum 2-3 years' hands-on experience. Comprehensive Data Engineering background - proven track record in enterprise data solutions. Experience with ETL processes and data transformation, preferably using Databricks. Strong foundation in Data Warehousing architectures and dimensional modeling. Familiarity with batch processing from relational database sources. Communication & Collaboration Skills of the Data Engineer: Outstanding stakeholder engagement abilities across technical More ❯
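The listing above combines batch extraction from relational sources with dimensional modelling, so a minimal PySpark sketch of that pattern is included below. The JDBC connection details, source table and star-schema columns are all hypothetical; in a real Databricks job the credentials would come from a secret scope rather than literals.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Batch-extract an orders table from a relational source over JDBC (all details are placeholders)
orders = (
    spark.read.format("jdbc")
         .option("url", "jdbc:sqlserver://sql-host:1433;databaseName=sales")
         .option("dbtable", "dbo.orders")
         .option("user", "etl_user")
         .option("password", "REDACTED")   # use a secret scope / key vault in practice
         .load()
)

# Shape a simple star schema: one conformed dimension and one fact table
dim_customer = (
    orders.select("customer_id", "customer_name", "region")
          .dropDuplicates(["customer_id"])
)
fact_sales = orders.select(
    "order_id", "customer_id", "order_date",
    (F.col("quantity") * F.col("unit_price")).alias("net_amount"),
)

dim_customer.write.format("delta").mode("overwrite").saveAsTable("dw.dim_customer")
fact_sales.write.format("delta").mode("append").saveAsTable("dw.fact_sales")
```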
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Softwire
firm. Modeling data for a civil service department replacing a legacy HR system. Experience and qualifications: Technical: 3+ years' experience in data or software engineering. Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP). Ability to learn quickly and adapt to new technologies and sectors. Understanding of data engineering best practices and system design More ❯
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Person Centred Software Ltd
in data science with a strong track record in predictive modelling, correlations, and trend analysis. Hands-on expertise in the Azure data and ML stack (AML, Data Factory, Synapse, Databricks, Data Lake). Advanced Python and SQL skills, plus experience building and deploying supervised and unsupervised models. Strong statistical knowledge, especially in benchmarking techniques and data normalization. Experience translating complex More ❯
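As a rough companion to the supervised/unsupervised modelling and normalisation requirements above, the sketch below shows a baseline scikit-learn workflow on synthetic data: standard scaling, a hold-out benchmark for a classifier, and a simple clustering pass. Everything in it (data, features, parameters) is illustrative only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Synthetic feature matrix and binary target (purely illustrative)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Normalisation before modelling
X_scaled = StandardScaler().fit_transform(X)

# Supervised: baseline classifier benchmarked on a hold-out split
X_train, X_test, y_train, y_test = train_test_split(X_scaled, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("hold-out accuracy:", round(clf.score(X_test, y_test), 3))

# Unsupervised: segment the same records into three clusters
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
print("cluster sizes:", np.bincount(labels))
```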
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
experience (including QGIS). FME. Advanced Database and SQL skills. Certifications: AWS or FME certifications are a real plus. Experience with ETL tools such as AWS Glue, Azure Data Factory, Databricks or similar is a bonus. The role comes with excellent benefits to support your well-being and career growth. KEYWORDS: Principal Geospatial Data Engineer, Geospatial, GIS, QGIS, FME, AWS, On More ❯
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Pertemps
analytical background, including Excel, MS Access, SQL, Power BI, ArcGIS Online, and FME. Knowledge and capability in a wide variety of additional analytical tools and ETL (e.g. Python, SQL, Databricks, Microsoft Flow, Azure Data Factory). Extensive previous experience in data management. Degree in Environmental Science, Data Management, Hydrology, or a related field. Experience with remote sensing technologies and water More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
enhance analytics or operations. Comfortable coding in Python. Strong DAX and T-SQL skills, including query optimization. Industry experience in insurance or financial services is a plus. Familiarity with Databricks is advantageous More ❯
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
processes effectively. Desirable Skills: GCP Professional Data Engineer certification. Exposure to Agentic AI systems or intelligent/autonomous data workflows. Experience with BI tools such as Looker. Exposure to Databricks, Snowflake, AWS, Azure or dbt. Academic background in Computer Science, Mathematics or a related field. This is an opportunity to work in a forward-thinking environment with access to cutting More ❯
facing role. Senior Data Engineer: Must-Have Experience: Strong SQL skills and experience with SSIS, SSRS, SSAS. Data warehousing, ETL processes, best practice data management. Azure cloud technologies (Synapse, Databricks, Data Factory, Power BI). Python/PySpark. Proven ability to work in hybrid data environments. Ability to manage and lead on and offshore teams. Exceptional stakeholder management and communication skills More ❯
of these Senior Data Engineer roles you must be able to demonstrate the following experience: Experience in prominent languages such as Python, Scala, Spark, SQL. Good experience in using Databricks. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB etc. Experience with the design, build and maintenance of data pipelines and infrastructure. Understanding of More ❯
large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building datamarts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to evaluate More ❯
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Anson Mccade
large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building datamarts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to evaluate More ❯
Manchester, North West, United Kingdom Hybrid / WFH Options
Anson Mccade
large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building datamarts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to evaluate More ❯
West Midlands, United Kingdom Hybrid / WFH Options
Anson Mccade
large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building datamarts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to evaluate More ❯
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building datamarts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to evaluate More ❯
In-depth experience in AI application development. You have previously planned and developed end-to-end solutions. Proficiency in C#, .NET Core, and Python. Experience with Azure Data Factory, Databricks, and either Azure Cosmos DB or Azure SQL Database is beneficial. Experience in designing and maintaining robust CI/CD pipelines, applying Infrastructure as Code (IaC) principles with exposure to More ❯