Senior Data Engineer - (Azure/Databricks) Apply locations London - Scalpel time type Full time posted on Posted 15 Days Ago job requisition id REQ05851 This is your opportunity to join AXIS Capital - a trusted global provider of specialty lines insurance and reinsurance. We stand apart for our … civil union status, family or parental status, or any other characteristic protected by law. Accommodation is available upon request for candidates taking part in the selection process. Senior Data Engineer (Azure/Databricks) Job Family Grouping: Chief Underwriting Officer Job Family: Data & Analytics Location: London How does this role contribute to our collective success? The Data … services. Implement end-to-end data pipelines, ensuring data quality, data integrity and data security. Troubleshoot and resolve data pipeline issues while ensuring data integrity and quality. Implement and enforce data security best practices, including role-based access control (RBAC), encryption, and compliance with industry …
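The role-based access control (RBAC) pattern this listing asks for can be sketched in plain Python. This is a minimal illustration only; the role, permission, and user names below are invented for the example and are not taken from any AXIS Capital system.

```python
# Minimal RBAC sketch: users are assigned roles, roles are granted
# permissions, and an access check resolves a user's roles to the
# union of the permissions those roles grant.
ROLE_PERMISSIONS = {
    "data_engineer": {"pipeline.read", "pipeline.write"},
    "analyst": {"pipeline.read"},
}

USER_ROLES = {
    "alice": {"data_engineer"},
    "bob": {"analyst"},
}

def has_permission(user: str, permission: str) -> bool:
    """Return True if any of the user's roles grants the permission."""
    roles = USER_ROLES.get(user, set())
    return any(permission in ROLE_PERMISSIONS.get(r, set()) for r in roles)
```

In a real Azure deployment these mappings would live in Entra ID or Unity Catalog grants rather than in code, but the resolution logic is the same.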
that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that have a real impact on … re underpinned by over 300 engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data … data engineering projects. You will work closely with cross-functional teams and contribute to the strategic direction of our data initiatives. RESPONSIBILITIES Data Pipeline Development: Lead the design, implementation, and maintenance of scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such …
you share our passion for ideas and commitment to excellence, we want you to join us. To learn more visit alti-global.com. Job Description & Overview The Head of Data Engineering & Analytics will lead the development and execution of AlTi’s enterprise data engineering strategy, enabling the capture, transformation, storage and delivery of high-quality data across the firm’s global wealth, investment, corporate and asset management functions. This leader will architect and scale data engineering capabilities to support real-time and batch integration, reporting, and advanced analytics. This role reports to the CTO and will be a key member of the Global Technology Solutions leadership team. In this hands-on leadership … code, automated testing, release and version control and system observability for data pipelines. Establish metrics and KPIs, and identify and deploy tools to measure data pipeline health, data quality, timeliness and accuracy, team performance, cost-effectiveness, and business impact. Actively mentor and grow talent within the team while fostering a collaborative and outcome …
a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key Responsibilities: Data Pipeline Development & Optimisation: Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data … technical solutions. Communicate technical concepts effectively to both technical and non-technical audiences. Participate in code reviews and knowledge sharing sessions. Automation & DevOps: Implement automation for data pipeline deployments and other data engineering tasks. Work with DevOps teams to implement and build CI/CD pipelines for environment deployments. Promote and implement DevOps best practices. … Azure Blob Storage, and Azure SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimization. Proven expertise with Databricks, including hands-on implementation experience and certifications. Experience with SQL and NoSQL databases. Experience with data quality and data governance …
London, England, United Kingdom Hybrid / WFH Options
The Remote Job Journal
Data Platform, managing the platform, and driving its further development. This role requires proficiency in data engineering, data modelling, and data pipeline orchestration, with a proven track record of working with Azure-specific data platform tools and technologies. Strategic and forward-thinking capabilities, as well as experience in applying … and mentorship to the Data & Analytics team. Oversee the team’s Agile processes including sprints, task prioritisation and issue resolution. Define best practices for data pipeline design, development, and maintenance. Coordinate with the solution architect to ensure platform architecture meets current and future requirements. Act as a point of escalation for complex data … clean, usable datasets for reporting and analytics. Platform Administration and Optimization: Monitor and optimize Azure Synapse Analytics performance, ensuring cost efficiency and resource utilization. Implement monitoring and alerting for pipeline performance, failures, and resource consumption. Suggest and implement automation for repetitive tasks using tools like Azure Logic Apps, Power Automate, or custom scripting. Collaborate with the Data …
class opportunity to help shape our organisation for the next stage of its journey. To drive towards this ambition, we are seeking a motivated individual to join our Data Platform team and support Aztec's new technology strategy using Azure Databricks. You will lead our Data Engineering capability and collaborate with others passionate about solving business … problems. Key responsibilities: Data Platform Design and Architecture Design, develop, and maintain a high-performing, secure, and scalable data platform, leveraging Databricks Corporate Lakehouse and Medallion Architectures. Utilise our metadata-driven data platform framework combined with advanced cluster management techniques to create and optimise scalable, robust, and efficient data solutions. Implement comprehensive … Background in cloud platforms and data architectures, such as Corporate Data Lake, Medallion Architecture, Metadata Driven Platform, Event-driven architecture. Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, Spark SQL. Good working knowledge of data warehouse and data …
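The Medallion Architecture this listing references refines data through bronze (raw), silver (cleaned), and gold (aggregated) layers. A minimal pure-Python sketch of that flow follows; in a real Databricks Lakehouse each layer would be a Delta table, and the record fields here are invented for illustration.

```python
# Medallion-style refinement sketch: bronze holds raw records as landed,
# silver holds cleaned and deduplicated records, gold holds an aggregate
# ready for reporting.
def to_silver(bronze: list[dict]) -> list[dict]:
    """Clean bronze records: drop rows missing an id, deduplicate by id,
    and cast amounts to float."""
    seen, silver = set(), []
    for row in bronze:
        key = row.get("id")
        if key is None or key in seen:
            continue
        seen.add(key)
        silver.append({"id": key, "amount": float(row.get("amount", 0))})
    return silver

def to_gold(silver: list[dict]) -> dict:
    """Aggregate silver records into a reporting-ready summary."""
    return {
        "row_count": len(silver),
        "total_amount": sum(r["amount"] for r in silver),
    }

bronze = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "10.5"},   # duplicate landing
    {"id": None, "amount": "3.0"}, # malformed row
    {"id": 2, "amount": "4.5"},
]
```

The point of the layering is that each stage has one job, so quality rules (the silver step) stay separate from business aggregation (the gold step).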
that solves real-world problems for large blue-chip companies and governments worldwide. We're pioneers of meaningful AI: our solutions go far beyond chatbots. We are using data and AI to solve the world's biggest issues in telecommunications, sustainable water management, energy, healthcare, climate change, smart cities, and other areas that have a real impact on … re underpinned by over 300 engineers and scientists working across Africa, Europe, the UK and the US. ABOUT THE ROLE Sand Technologies focuses on cutting-edge cloud-based data projects, leveraging tools such as Databricks, DBT, Docker, Python, SQL, and PySpark to name a few. We work across a variety of data architectures such as Data … modelling, ETL (Extract, Transform, Load) processes, and big data technologies, it becomes possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data pipelines for ingesting, processing, and transforming large volumes of data from various sources using tools such as Databricks …
Job Description Job Title: Data Architect Location: London - 3 days travel to office SC Cleared: Required Job Type: Full-Time Experience: 10+ years Job Summary: We are seeking a highly experienced and visionary Data Architect to lead the design and implementation of the data architecture for our cutting-edge Azure Databricks platform focused on … economic data. This platform is crucial for our Monetary Analysis, Forecasting, and Modelling efforts. The Data Architect will be responsible for defining the overall data strategy, data models, data governance framework, and data integration patterns. This role requires a deep understanding of data warehousing principles, big data technologies, cloud computing (specifically Azure), and a strong grasp of data analysis concepts within the economic domain. Key Experience: Extensive Data Architecture Knowledge: They possess a deep understanding of data architecture principles, including data modeling, data warehousing, data integration, and data governance. Databricks Expertise …
Modular Data is a technology solutions provider that specialises in serving the public sector. Our mission is to help government organisations become data-driven, transform their digital offerings, streamline their processes, improve citizen services and enhance transparency. We … and Business Stakeholders to understand data requirements and deliver clean, validated datasets. Monitor, troubleshoot, and optimize ETL/ELT workflows to ensure data quality and pipeline efficiency. Implement best practices in data governance, security, and compliance within cloud environments. Lead and mentor junior data engineers, promoting a culture of technical excellence … using tools like AWS Glue, EMR, or custom frameworks. Familiarity with data modeling concepts. Excellent problem-solving and communication skills. Proficiency in Java and data pipeline development. Familiarity with version control systems (e.g., Git) and agile development methodologies. Experience working with Public Sector clients. Consulting experience. Preferred Knowledge/Experience Experience with CI/CD …
over 10,000 people. It’s big-scale stuff and we’re still growing. Job Purpose With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use data in a fast … paced organisation. You will join as a Senior Data Platform Engineer providing technical leadership to the Data Engineering team. You will work closely with our Data Scientists and business stakeholders to ensure value is delivered through our solutions. Job Accountabilities · Develop robust, scalable data pipelines to serve the easyJet analyst and data … time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam. · Understanding of the challenges faced in the design and development of a streaming data pipeline and the different options for processing unbounded data (pub/sub, message queues, event streaming etc.) · Understanding of the most commonly used Data Science and Machine Learning …
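The core challenge this listing alludes to, aggregating unbounded streaming data, is usually handled with windowing. A minimal tumbling-window sketch in plain Python follows; Spark Structured Streaming or Beam would express this declaratively, and the event shape here is invented for illustration.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds: int) -> dict:
    """Count events per fixed-size (tumbling) window.

    Each event is a (timestamp_seconds, key) pair; the window start is
    the timestamp floored to a multiple of the window size, the same
    bucketing Spark's window() expression applies to event time.
    """
    counts: dict = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (5, "click"), (12, "click"), (13, "view")]
```

What this toy version omits, and what makes real streaming pipelines hard, is late-arriving data: a production engine also needs watermarks to decide when a window's result can safely be emitted.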
London, England, United Kingdom Hybrid / WFH Options
DATAPAO
Join to apply for the Senior Data Engineer (Databricks) role at DATAPAO. At … DATAPAO, data ignites passion, community fuels collaboration, and growth knows no bounds. We are a leading Data Engineering and Data Science consulting company, recognized for our innovation and rapid growth. We have been named Databricks EMEA Emerging Business Partner of the Year and have achieved a second consecutive appearance on the Financial Times FT1000 … Data Engineering, focusing on cloud platforms (AWS, Azure, GCP); Proven experience with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); Extensive ETL/ELT and data pipeline orchestration experience (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions); Proficiency in SQL and Python for data transformation and optimization; Knowledge of CI/CD pipelines …
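The orchestrators named in this listing (Databricks Workflows, Airflow, ADF, Step Functions) all run tasks in dependency order, and that scheduling core reduces to a topological sort. A stdlib-only sketch follows; the task names are hypothetical, not from any DATAPAO project.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies: each task maps to the set of its
# upstream tasks, mirroring how an Airflow DAG wires
# ingest >> validate >> transform >> publish.
PIPELINE = {
    "ingest": set(),
    "validate": {"ingest"},
    "transform": {"validate"},
    "publish": {"transform"},
}

def run_order(dag: dict) -> list[str]:
    """Return one valid execution order for the task graph."""
    return list(TopologicalSorter(dag).static_order())
```

Real orchestrators add retries, scheduling, and parallel execution of independent tasks on top of this ordering, but the dependency model is the same.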
London, England, United Kingdom Hybrid / WFH Options
Locus Robotics
picking and replenishment to sorting and pack-out, Locus Robotics empowers businesses to meet peak demands and adapt to ever-changing operational needs. Are you a skilled Python Data Engineer with a passion for building scalable, production-level systems and an interest in machine learning? We want to hear from you! At Locus Robotics, we're evolving our … data infrastructure to support advanced machine learning capabilities and real-time analytics. As a key member of our team, you'll help migrate our existing codebase into machine learning territory, drive innovation across multiple deployment sites, and collaborate closely with our optimization and engineering teams. You'll develop robust Python code, analyze system performance, and contribute to deployment … to work in one of these countries without the need for work sponsorship. Responsibilities: Develop and maintain Python-based systems deployed across remote platforms. Contribute to and improve data pipelines, ensuring reliable and efficient system updates. Build and enhance features for real-time data analysis and system monitoring to ensure high uptime and efficiency. Collaborate with …
s leading Google Cloud specialist consultancy. We're a team that's energised by innovation and delivering exceptional results for our clients. We deliver cutting edge solutions across data and analytics, AI, cloud infrastructure or security that drive digital transformation and enable our customers to scale, modernise and thrive. In joining Qodea, you'll work alongside specialists in … and energetic people who are eager to learn and grow, so if you want to supercharge your career, join Qodea! What you'll do: Develop and maintain automated data processing pipelines using Google Cloud: Design, build, and maintain data pipelines to support data ingestion, ETL, and storage. Build and maintain automated data pipelines to monitor data quality and troubleshoot issues. Implement and maintain databases and data storage solutions: Stay up-to-date with emerging trends and technologies in big data and data engineering. Ensure data quality, accuracy, and completeness. Implement and enforce data governance policies and procedures to …
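The automated data-quality monitoring described above typically boils down to rule-based checks over each incoming batch. A minimal sketch follows; the rules and field names are illustrative assumptions, not Qodea's actual checks.

```python
def check_batch(rows: list[dict], required: list[str]) -> dict:
    """Run simple data-quality checks on a batch of records:
    count null/missing values per required field and report whether
    the batch passes (non-empty, with no nulls in required fields)."""
    nulls = {field: 0 for field in required}
    for row in rows:
        for field in required:
            if row.get(field) in (None, ""):
                nulls[field] += 1
    return {
        "row_count": len(rows),
        "null_counts": nulls,
        "passed": len(rows) > 0 and all(v == 0 for v in nulls.values()),
    }
```

In a pipeline, a failing result would typically fail the run or route the batch to quarantine rather than let bad data propagate downstream.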
to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best use of their data. In building data platforms and pipelines, our data engineers create the foundation for diverse data & analytics solutions, including data science and AI. They build data lakes and warehouses, create the processes to extract or access operational data, and transform siloed datasets into integrated data models that allow insight into business performance and problems or training of ML models. These are hands-on, client-facing roles, with openings at senior or lead level to suit your experience. You may be … or solving tough engineering challenges. You'd also be expected to spend some time on-site with clients in the London area on an ad-hoc basis. Our data engineers combine a strong software engineering approach with solid data fundamentals and experience with modern tools. We're technology agnostic, and we're open minded when it …
London, England, United Kingdom Hybrid / WFH Options
Circana
Join to apply for the Senior Data Engineer (Remote) role at Circana. At Circana, we are fueled by our passion for continuous learning and growth, we seek and share feedback freely, and we celebrate victories both big and small in an environment that is … entirely on what current employees say about their experience working at Circana. Learn more at www.circana.com. What will you be doing? We are seeking a skilled and motivated Data Engineer to join a growing global team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure … scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using …
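An ETL pipeline of the kind this listing describes decomposes into extract, transform, and load stages. In production these would be PySpark jobs scheduled by Airflow, but the stage boundaries look the same in a stdlib sketch; the source data and field names below are invented for the example.

```python
import json

def extract(raw_json: str) -> list[dict]:
    """Extract: parse raw source records (here, a JSON string standing
    in for an API response or file landing)."""
    return json.loads(raw_json)

def transform(records: list[dict]) -> list[dict]:
    """Transform: normalise names and drop inactive records."""
    return [
        {"id": r["id"], "name": r["name"].strip().title()}
        for r in records
        if r.get("active")
    ]

def load(records: list[dict], sink: list) -> int:
    """Load: append transformed records to a sink (standing in for a
    warehouse table write); return the number of rows written."""
    sink.extend(records)
    return len(records)

raw = ('[{"id": 1, "name": " ada lovelace ", "active": true},'
       ' {"id": 2, "name": "x", "active": false}]')
```

Keeping the three stages as separate functions is what makes a pipeline testable: each stage can be exercised on its own with fixture data before the whole flow is wired into an orchestrator.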
A capabilities and maintain them. Partnerships with other teams such as Supply, Manufacturing & R&D are also key to ensure process alignment and shared value creation. As a Data Engineer in the Commercial team, your key responsibilities are as follows: 1. Technical Proficiency: Collaborate in hands-on development using Python, PySpark, and other relevant technologies to create and … maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJs. Support the implementation … and Databricks. Proficiency in working with the cloud environment and various platforms, including Azure, SQL Server. NoSQL databases are good to have. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with DataOps practices and tools, including CI/CD for data pipelines. …
WORLD FOR PETS. Through comprehensive veterinary care, nutrition, breakthrough programmes in diagnostics, wearable health monitoring, DNA testing and pet welfare, we help pets in more than 130 countries. Data and Analytics is foundational to our Petcare OGSM goal to transform the experience of pet ownership through digitalisation at scale. To deliver on this ambition we require the very … highest level of technical/engineering expertise within Global Petcare Data & Analytics. What are we looking for? For this role, we hope you have the following skills we require to round out our team: Must have Experience in at least one programming or data manipulation language. Experience in processing and modelling data from diverse … Experience with Databricks, Delta Live Tables (DLT) or Data Build Tool (DBT). Experience working in a cloud environment (Azure, AWS, GCP). Experience in building data pipeline/ETL/ELT solutions. Experience working with consumer data in a real-time environment. Experience with data visualisation tools (e.g. PowerBI). Personal Traits An …
Senior Data Engineer – London - Hybrid Mars is a family-owned business with more than $35 billion in global sales. We produce some of the world’s best-loved brands: M&M’s®, SNICKERS®, TWIX®, MILKY WAY®, DOVE®, PEDIGREE®, ROYAL CANIN®, WHISKAS®, EXTRA®, ORBIT®, 5TM, SKITTLES®, BEN’S ORIGINAL®, and COCOAVIA®. Alongside our consumer brands, we proudly take … A capabilities and maintain them. Partnerships with other teams such as Supply, Manufacturing & R&D are also key to ensure process alignment and shared value creation. As a Data Engineer in the Commercial team, your key responsibilities are as follows: 1. Technical Proficiency: Collaborate in hands-on development using Python, PySpark, and other relevant technologies to create and … and Databricks. Proficiency in working with the cloud environment and various platforms, including Azure, SQL Server. NoSQL databases are good to have. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with DataOps practices and tools, including CI/CD for data pipelines. …
**Employment Type:** Permanent **Salary:** Up to £62,000 **Location:** in the West of London **Work Arrangement:** In-office only, five days a week. **Job Title:** Data Architect/Data Engineer **Experience Required:** 6-7 Years **Role Overview:** We are seeking a Senior Data Architect/Data Engineer to design and implement scalable … data solutions, including real-time and ETL/ELT data pipelines. Responsibilities include optimizing our data warehouse, creating Power BI reports, and developing AI-driven solutions using OpenAI APIs. **Key Responsibilities:** - Design and implement robust real-time data pipelines from diverse sources. - Maintain a high-performance data warehouse for reporting … and analytics. - Develop insightful Power BI dashboards and reports for stakeholders. - Use Azure Data Factory, Azure Functions, and Logic Apps for data integration. - Employ Python and SQL for data transformation and analysis. - Leverage OpenAI APIs for AI/ML solutions. - Collaborate with teams to align data strategy with business goals. - Ensure data …
London, England, United Kingdom Hybrid / WFH Options
Datapao
We are currently looking for a Senior Data Engineer to join us remotely in the UK. We plan to set up a hub in the UK in the following 12-18 months to enable our GTM strategy, and this would be our first hire in the region. You will work with our EMEA and US customers to help … them solve their data engineering, ML, ML Ops, and cloud migration puzzles. "As a Data Engineer at DATAPAO, you will have the unique opportunity to push the boundaries of technology and lead the way in how data transforms the world. We work with customers who want to innovate and change their industries. So it … GCP); You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You’re proficient in SQL and Python, using them to transform and optimize data like a …