Senior Data Engineer - (Azure/Databricks), Axis Capital, London, United Kingdom. Posted 15 days ago. Permanent, Competitive. This is your opportunity to join AXIS Capital - a trusted global provider of specialty lines insurance and reinsurance. We stand apart for our outstanding client service, intelligent risk … family or parental status, or any other characteristic protected by law. Accommodation is available upon request for candidates taking part in the selection process. Senior Data Engineer (Azure/Databricks). Job Family Grouping: Chief Underwriting Officer. Job Family: Data & Analytics. Location: London. How does this role contribute to our collective success? The Data & Analytics department transforms raw data into actionable … MDW) technologies, big data, and Lakehouse architectures to ensure our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks Workflows and Jobs. Develop, test …
engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark, to name a few. We work across a variety of data architectures such as Data Mesh, lakehouse, data vault, and data warehouses. Our data … RESPONSIBILITIES Data Pipeline Development: Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources, using tools such as Databricks, Python, and PySpark. Data Architecture: Architect scalable and efficient data solutions using the appropriate architecture design, opting for modern architectures where possible. Data Modeling: Design and optimize data models and … or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience …
SAS Viya). An ability to write complex SQL queries. Project experience using one or more of the following technologies: Tableau, Python, Power BI, Cloud (Azure, AWS, GCP, Snowflake, Databricks). Project lifecycle experience, having played a leading role in the delivery of end-to-end projects, as well as a familiarity with different development methodologies …
london, south east england, united kingdom Hybrid / WFH Options
EXL
viable, and aligned with client expectations. Enterprise Solution Design: Architect and lead the delivery of large-scale data platforms (including lakes, lakehouses, and warehouses) using GCP, Cloud Storage, BigQuery, Databricks, and Snowflake. Cloud Data Strategy: Own cloud migration and modernisation strategy, leveraging GCP and tools such as Terraform, Azure DevOps, GitHub, and CI/CD pipelines. Data Modelling: Apply deep hands-on …
We work with some of the UK's biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best …
This is an exciting contract opportunity for an SC Cleared Azure Data Engineer with a strong focus on Databricks to join an experienced team in a new customer engagement working at the forefront of data analytics and AI. This role offers the chance to take a key role in the design and delivery of advanced Databricks solutions within the Azure … ecosystem. Responsibilities: Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables. Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions. Drive best practices for data engineering. Help clients realise the potential of data science, machine learning, and scaled data processing within the Azure/Databricks ecosystem. Mentor junior engineers and support … Take ownership of the delivery of core solution components. Support with planning, requirements refinement, and work estimation. Skills & Experiences: Proven experience designing and implementing data solutions in Azure using Databricks as a core platform. Hands-on expertise in Delta Lake, Delta Live Tables, and Databricks Workflows. Strong coding skills in Python and SQL, with experience in developing modular, reusable code …
Wandsworth, Greater London, UK Hybrid / WFH Options
Hitachi Solutions UK
scenarios that can benefit from AI solutions (Machine Learning and Gen AI). Explore and analyse data from various sources and formats using tools such as Microsoft Fabric, Azure Databricks, Azure Synapse Analytics, and Azure Machine Learning. Implement data pipelines and workflows to automate and operationalize machine learning solutions using tools such as Azure ML Pipelines and Azure DevOps. Run experiments … workflows and deploying them at scale using Azure services. Familiarity with data integration tools like Azure Data Factory and data platform solutions such as Microsoft Fabric and/or Databricks. Excellent communication, collaboration, stakeholder management, and problem-solving skills. Familiarity with the Microsoft Copilot stack. Microsoft Certified: Azure Data Scientist Associate certification or AI Engineer is a plus. Experience within …
/reporting tools such as Looker or Power BI. Excellent communication and stakeholder management skills. Desirable Experience: Google Cloud Professional certifications. Experience in alternative cloud data platforms such as Snowflake, Databricks, Azure, or AWS. Understanding of DevOps/DataOps practices, CI/CD pipelines, and monitoring tools. Academic background in Computer Science, Mathematics, or a related technical field. What’s on …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
technical voice in the team - without line management responsibilities. What We're Looking For: Essential: Strong experience in Azure-based data engineering roles. Strong skills in Azure Data Factory, Databricks, Azure Data Lake, SQL, and Python. Excellent communication and collaboration skills - you'll thrive in a dynamic, agile environment. Available to start within a 4-week notice period. Desirable: Experience …
and is seeking a skilled Data Engineer to join its growing data team. This role plays a key part in building and deploying modern data solutions based on Azure Databricks, enabling faster and more informed business decisions. You'll work hands-on with Azure Databricks, Azure Data Factory, Delta Lake, and Power BI to design scalable data pipelines, implement efficient … future of data within the organisation while working with advanced cloud technologies. Key Responsibilities and Deliverables: Design, develop, and optimise end-to-end data pipelines (batch & streaming) using Azure Databricks, Spark, and Delta Lake. Implement Medallion Architecture to structure raw, enriched, and curated data layers efficiently. Build scalable ETL/ELT processes with Azure Data Factory and PySpark. Support data … emerging data technologies (e.g., Kafka/Event Hubs for streaming, Knowledge Graphs). Promote best practices in data engineering across the team. Skills & Experience: Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse. Strong understanding of Lakehouse architecture and medallion design patterns. Proficient in Python, PySpark, and SQL, with advanced query optimisation skills. Proven experience building scalable …
detail, the ability to manage multiple projects, and a collaborative approach to working with regional and group teams. Key Responsibilities: Develop and maintain scalable data solutions on Azure (ADF, Databricks, Synapse, etc.). Maintain and enhance the integrity of the Data Lakehouse in partnership with regional and group data teams. Partner with analytical and reporting teams to ensure data is presented … evolution. Support delivery of data models, APIs, ML integrations, and reporting tools. Key Skills: Hands-on experience designing and delivering solutions using Azure services including Azure Data Factory, Azure Databricks, Azure Synapse, Azure Storage, and Azure DevOps. Proven track record in Data Engineering and supporting the business to gain true insight from data. Experience in data integration and modelling including ELT …
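Several of these listings ask for ELT rather than ETL experience: raw data is loaded first and transformed inside the engine. The sketch below illustrates the pattern using stdlib `sqlite3` as a stand-in for a cloud warehouse such as Synapse or Databricks SQL; the table and column names are made up for illustration.

```python
# Hedged sketch of the ELT pattern: load raw rows into a staging table first,
# then transform with SQL inside the engine (dbt-style derived table).
# sqlite3 stands in for a cloud warehouse; names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")

# Load: land source rows untransformed into a staging table.
conn.execute("CREATE TABLE stg_claims (claim_id INTEGER, status TEXT, paid REAL)")
conn.executemany(
    "INSERT INTO stg_claims VALUES (?, ?, ?)",
    [(1, "OPEN", 0.0), (2, "SETTLED", 250.0), (3, "SETTLED", 100.0)],
)

# Transform: shape the reporting model inside the engine, not in the pipeline.
conn.execute("""
    CREATE TABLE fct_settled_claims AS
    SELECT status, COUNT(*) AS n_claims, SUM(paid) AS total_paid
    FROM stg_claims
    WHERE status = 'SETTLED'
    GROUP BY status
""")

print(conn.execute("SELECT * FROM fct_settled_claims").fetchall())
```

Keeping the transformation as SQL inside the warehouse is what lets tools like dbt version, test, and rebuild the model layer independently of ingestion.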
and AI/ML use cases. Implement CI/CD workflows. Ensure GDPR compliance and secure data handling. Requirements: 5+ years in data engineering or … Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake. Proficiency in SQL, Python, and tools like dbt and Airflow. Familiarity with DevOps practices in a data context. Benefits: Work on impactful, enterprise-wide data projects …
and internal policies. What We're Looking For: Solid experience in data engineering or backend data development. Strong experience with Azure data services (e.g., Data Factory, Data Lake, Synapse, Databricks) and tools like dbt. Proficiency in SQL and Python, with a solid understanding of data modelling and transformation. Experience integrating data from enterprise systems (CRM, ERP, HRIS). Familiarity with …