London, England, United Kingdom Hybrid / WFH Options
AXIS Capital
MDW) technologies, big data, and Lakehouse architectures to ensure our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks workflows and Jobs. Develop, test … including role-based access control (RBAC), encryption, and compliance with industry standards. Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions. Monitor and optimize Databricks performance, including cost management guidance and cluster tuning. Stay up to date with Azure cloud innovations and recommend improvements to existing architectures. Assist data analysts with technical input. You may … solutions on Microsoft Azure. Unity Catalog Mastery: In-depth knowledge of setting up, configuring, and utilizing Unity Catalog for robust data governance, access control, and metadata management in a Databricks environment. Demonstrated ability to optimize and tune Databricks notebooks and workflows to maximize performance and efficiency. Experience with performance troubleshooting and best practices for scalable data processing is essential. Additional More ❯
Ibstock, England, United Kingdom Hybrid / WFH Options
Ibstock Plc
new, innovative and sustainable products and solutions, both for today and for a new era of building. To support our progress, we are currently recruiting for a Data Engineer (Databricks & BI) to come and join our team at Ibstock Head Office, LE67 6HS. (Applicants must have Right to work in the UK - We are unable to provide sponsorship for this … role) Job Purpose: The Data Engineer/BI Developer will play a critical role in developing and refining our modern data lakehouse platform, with a primary focus on Databricks for data engineering, transformation, and analytics. The role will involve designing, developing, and maintaining scalable data solutions to support business intelligence and reporting needs. This platform integrates Databricks, Power BI, and … JDE systems to provide near real-time insights for decision-making. Key Accountabilities: Ensure the existing design, development, and expansion of a near real-time data platform integrating AWS, Databricks and on-prem JDE systems. Develop ETL processes to integrate and transform data from multiple sources into a centralised data platform. Develop and optimise complex queries and data pipelines. Optimise More ❯
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group
and optimization of data pipelines and resources. Knowledge, skills and experience Essential: Cloud Platforms: Experience with Azure, AWS, or Google Cloud for data engineering. Cloud Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or similar. Data Modelling & Warehousing: Experience with cloud-based data architecture. CI/CD & DevOps: Knowledge of More ❯
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group plc
data pipelines and cloud resources. Knowledge, skills and experience Essential: Cloud Platforms: Demonstrable experience with Azure, AWS, or Google Cloud for data engineering. Cloud-Based Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics for scalable data solutions. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or equivalent cloud technologies. Data Modelling & Warehousing: Experience with modern cloud-based data More ❯
London, England, United Kingdom Hybrid / WFH Options
Datapao
biggest multinational companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. Additionally, at this seniority level, we expect you to … Technical Expertise You (ideally) have 5+ years of experience in Data Engineering, with a focus on cloud platforms (AWS, Azure, GCP); You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You … and optimize data like a pro; You know your way around CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation, or Bicep); You have hands-on experience integrating Databricks with BI tools (Power BI, Tableau, Looker). Consulting & Client-Facing Skills Ideally, you bring a proven history in consulting, from scoping to gathering requirements, designing solutions, and communicating effectively More ❯
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
new, innovative and sustainable products and solutions, both for today and for a new era of building. To support our progress, we are currently recruiting for a Data Engineer (Databricks & BI) to come and join our team at Ibstock Head Office, LE67 6HS. (Applicants must have Right to work in the UK - We are unable to provide sponsorship for this … role) Job Purpose: The Data Engineer/BI Developer will play a critical role in developing and refining our modern data lakehouse platform, with a primary focus on Databricks for data engineering, transformation, and analytics. The role will involve designing, developing, and maintaining scalable data solutions to support business intelligence and reporting needs. This platform integrates Databricks, Power BI, and … JDE systems to provide near real-time insights for decision-making. Key Accountabilities: Ensure the existing design, development, and expansion of a near real-time data platform integrating AWS, Databricks and on-prem JDE systems. Develop ETL processes to integrate and transform data from multiple sources into a centralised data platform. Develop and optimise complex queries and data pipelines. Optimise More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Jira ensuring timely and high-quality deliverables. Strong written and verbal communication skills for effective team collaboration One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
leadership skills, with the ability to convey complex technical concepts to both technical and non-technical stakeholders. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: DOD 8570 IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Familiarity with Apache … Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity More ❯
Salary: €50,000 - €60,000 per year Requirements: • 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark • Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL) • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables • Solid understanding of data warehousing principles, ETL … and German (min. B2 level) • Ability to work independently as well as part of a team in an agile environment Responsibilities: As a Data Engineer with a focus on Databricks, you will play a key role in building modern, scalable, and high-performance data solutions for our clients. You'll be part of our growing Data & AI team and work … hands-on with the Databricks platform, supporting clients in solving complex data challenges. • Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python • Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses • Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity More ❯
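The ingest-transform-load responsibilities described in this posting can be sketched in miniature. Below is an illustrative, stdlib-only Python sketch of a typical cleaning and deduplication step; on Databricks this logic would normally be written against Spark DataFrames rather than plain lists, and the field names (`order_id`, `amount`) are hypothetical stand-ins, not anything from the posting.

```python
import json

# Illustrative bronze -> silver cleaning step: parse raw JSON lines,
# drop malformed rows, deduplicate by key, and normalize types.
# Field names are hypothetical; a Databricks pipeline would express
# this with DataFrame operations (e.g., PySpark) instead.

def clean_records(raw_lines):
    """Parse raw JSON lines, skip malformed rows, dedupe by order_id."""
    seen = set()
    silver = []
    for line in raw_lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip (or quarantine) malformed input
        key = rec.get("order_id")
        if key is None or key in seen:
            continue  # require a key; keep first occurrence only
        seen.add(key)
        rec["amount"] = float(rec.get("amount", 0))  # normalize types
        silver.append(rec)
    return silver

bronze = [
    '{"order_id": 1, "amount": "9.50"}',
    'not-json',
    '{"order_id": 1, "amount": "9.50"}',  # duplicate key, dropped
    '{"order_id": 2, "amount": 3}',
]
print(clean_records(bronze))
```

The same shape (parse, validate, dedupe, normalize) recurs whether the target is a warehouse table or a lakehouse Delta table.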
in the power of agile and cross-functional collaboration, bringing together people from various backgrounds to create outstanding digital solutions. Tasks As a Data Engineer with a focus on Databricks, you will play a key role in building modern, scalable, and high-performance data solutions for our clients. You'll be part of our growing Data & AI team and work … hands-on with the Databricks platform, supporting clients in solving complex data challenges. Your Job's Key Responsibilities Are: Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses Leveraging the Databricks … validation processes and adherence to data governance policies Collaborating with data scientists and analysts to understand data needs and deliver actionable solutions Staying up to date with advancements in Databricks, data engineering, and cloud technologies to continuously improve our tools and approaches Profile Essential Skills: 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache More ❯
what your job looks like: You are part of the IT Data & AI Team and play a key role in shaping, implementing, and enhancing our Data Lake using Azure Databricks. Source Integration: Establish connections to diverse systems via APIs, ODBC, Event Hubs, and sFTP protocols. ETL/ELT Pipelines: Design and optimize data pipelines using Azure Data Factory and Databricks … role in driving our digital transformation. The experience and knowledge you bring Experience: At least 5 years as an Azure Data Engineer. Technical Expertise: Proficiency with Azure Data Factory, Databricks and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure … Databricks. Education: Bachelor's degree in IT, Computer Science, or a related field. Microsoft Certified: Azure Data Engineer Associate. Language Skills: Fluency in Dutch and English is mandatory. French is a plus. Knowledge of Power BI is a plus. The key competences we are looking for: Analytical mindset with attention to detail. Pragmatic and customer-oriented approach to problem-solving. More ❯
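Source-integration pipelines like the ones this posting describes usually rely on watermark-based incremental extraction: each run pulls only records modified since the last run. A minimal illustrative sketch follows, with an in-memory list standing in for the API/ODBC source and for the watermark store (both assumptions for illustration, not this employer's actual design; in ADF/Databricks the watermark would typically persist in a control table or pipeline variable).

```python
from datetime import datetime, timezone

# Illustrative watermark-based incremental ingestion: select only rows
# changed after the last watermark, then advance the watermark.
# The record list is a stand-in for an API/ODBC source.

def extract_increment(records, last_watermark):
    """Return records modified after the watermark, plus the new watermark."""
    fresh = [r for r in records if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

source = [
    {"id": 1, "modified_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "modified_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]
fresh, wm = extract_increment(source, datetime(2024, 1, 2, tzinfo=timezone.utc))
print(len(fresh), wm)
```

Keeping the watermark durable between runs is what makes the pipeline restartable without reprocessing the full source.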
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
implement effective solutions Excellent communication, teamwork, and organizational skills, with a focus on innovation and continuous improvement One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Somerset Bridge Group
sector, shaping data pipelines and platforms that power smarter decisions, better pricing, and sharper customer insights. The Data Engineer will design, build, and optimise scalable data pipelines within Azure Databricks, ensuring high-quality, reliable data is available to support pricing, underwriting, claims, and operational decision-making. This role is critical in modernising SBG’s cloud-based data infrastructure, ensuring compliance … regulations, and enabling AI-driven analytics and automation. By leveraging Azure-native services, such as Azure Data Factory (ADF) for orchestration, Delta Lake for ACID-compliant data storage, and Databricks Structured Streaming for real-time data processing, the Data Engineer will help unlock insights, enhance pricing accuracy, and drive innovation. The role also includes optimising Databricks query performance, implementing robust … security controls (RBAC, Unity Catalog), and ensuring enterprise-wide data reliability. Working closely with Data Architects, Pricing Teams, Data Analysts, and IT, this role will ensure our Azure Databricks data ecosystem is scalable, efficient, and aligned with business objectives. Additionally, the Data Engineer will contribute to cost optimisation, governance, and automation within Azure’s modern data platform. Key Responsibilities Data More ❯
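The Delta Lake storage this posting mentions is typically kept current with upsert ("MERGE") operations: update the row when the key matches, insert it when it does not. The following is a hedged, pure-Python sketch of that semantics only; in practice this would be `MERGE INTO` SQL or the Delta merge API on Databricks, and the `policy_id`/`premium` field names are hypothetical.

```python
# Illustrative upsert ("MERGE") semantics, modeled with a dict keyed by id.
# Real Delta Lake merges run as ACID transactions on the table; the insurance-
# flavoured field names below are invented for the example.

def merge_upsert(target, updates, key="policy_id"):
    """Apply update rows to target: update matching keys, insert new ones."""
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        merged.setdefault(row[key], {})
        merged[row[key]].update(row)  # update-if-matched, insert-if-not
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"policy_id": 1, "premium": 100}, {"policy_id": 2, "premium": 200}]
updates = [{"policy_id": 2, "premium": 250}, {"policy_id": 3, "premium": 300}]
print(merge_upsert(target, updates))
```

In a streaming context (e.g., Structured Streaming with foreachBatch), the same matched/not-matched logic is applied per micro-batch.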
London, England, United Kingdom Hybrid / WFH Options
Aimpoint Digital
individuals who want to drive value, work in a fast-paced environment, and solve real business problems. You are a coder who writes efficient and optimized code leveraging key Databricks features. You are a problem-solver who can deliver simple, elegant solutions as well as cutting-edge solutions that, regardless of complexity, your clients can understand, implement, and maintain. You … Strong written and verbal communication skills required Ability to manage an individual workstream independently 3+ years of experience developing and deploying ML models in any platform (Azure, AWS, GCP, Databricks etc.) Ability to apply data science methodologies and principles to real life projects Expertise in software engineering concepts and best practices Self-starter with excellent communication skills, able to work … independently, and lead projects, initiatives, and/or people Willingness to travel. Want to stand out? Consulting Experience Databricks Machine Learning Associate or Machine Learning Professional Certification. Familiarity with traditional machine learning tools such as Python, SKLearn, XGBoost, SparkML, etc. Experience with deep learning frameworks like TensorFlow or PyTorch. Knowledge of ML model deployment options (e.g., Azure Functions, FastAPI, Kubernetes More ❯
London, England, United Kingdom Hybrid / WFH Options
Mirai Talent
for building scalable, reliable data pipelines, managing data infrastructure, and supporting data products across various cloud environments, primarily Azure. Key Responsibilities: Develop end-to-end data pipelines using Python, Databricks, PySpark, and SQL. Integrate data from various sources including APIs, Excel, CSV, JSON, and databases. Manage data lakes, warehouses, and lakehouses within Azure cloud environments. Apply data modelling techniques such More ❯
London, England, United Kingdom Hybrid / WFH Options
EXL
viable, and aligned with client expectations. Enterprise Solution Design : Architect and lead the delivery of large-scale data platforms (including lakes, lakehouses, and warehouses) using GCP, Cloud Storage, BigQuery, Databricks, Snowflake. Cloud Data Strategy: Own cloud migration and modernisation strategy, leveraging GCP, and tools such as Terraform, Azure DevOps, GitHub, and CI/CD pipelines. Data Modelling: Apply deep hands More ❯
Bath, England, United Kingdom Hybrid / WFH Options
Scott Logic
We work with some of the UK’s biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly More ❯
London, England, United Kingdom Hybrid / WFH Options
Scott Logic Ltd
We work with some of the UK’s biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
Scott Logic
Join to apply for the Lead Data Engineer role at Scott Logic. We work with some of More ❯
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Scott Logic
We work with some of the UK’s biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
Scott Logic
We work with some of the UK’s biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Scott Logic
We work with some of the UK’s biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best More ❯