pipelines and architectures, integrating data from web analytics, content management systems (CMS), subscription platforms, ad tech, and social media. Proven ability to automate and optimise data workflows, using modern ETL/ELT tools (e.g., Airflow, dbt, Apache Spark) to ensure timely and reliable delivery of data. Experience building robust data models and reporting layers to support performance dashboards, user … with a diligent approach to data engineering, including data privacy (e.g., GDPR compliance) and evolving best practices. Job Purpose As the Lead Data Architect & Engineer, you will oversee the development, maintenance, and optimisation of the data infrastructure that powers analytics and reporting across the business. You will design and implement new engineering solutions that enhance our scalable architecture, while … Cloud Platform, you’ll play a leading role in advancing our data capabilities and delivering high-impact projects. You will also manage and mentor a Data Engineer, supporting their development and ensuring alignment with team goals and technical standards. Key Responsibilities and Accountabilities - what is delivered Design and Maintain Data Pipelines: Develop and maintain robust, scalable, and efficient data …
pipelines and architectures, ideally integrating data from web analytics, content management systems (CMS), subscription platforms, ad tech, and social media. Ability to automate and optimise data workflows, using modern ETL/ELT tools (e.g., Airflow, dbt, Apache Spark) to ensure timely and reliable delivery of data. Experience building robust data models and reporting layers to support performance dashboards, user … meet functional requirements. Process Automation and Optimisation: Identify, design, and implement improvements to automate manual processes, enhance data delivery performance, and re-architect infrastructure for improved scalability and resilience. ETL Development and Infrastructure Building: Build and manage the infrastructure necessary for optimal ETL or ELT of data using Python, SQL, and Google Cloud Platform (GCP) big data … issues and uncover opportunities for operational or strategic improvements. Unstructured Data Handling: Capability to work with unstructured and semi-structured datasets, transforming raw information into actionable insights. Data Workflow Development: Skilled in developing and maintaining data transformation processes, managing data structures, metadata, workload dependencies, and orchestration frameworks. Large-scale Data Processing: A demonstrated history of manipulating, processing, and extracting …
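The listings above repeatedly ask for ETL built with Python and SQL. As a rough illustration of the kind of extract-transform-load step they describe, here is a minimal, hypothetical sketch using only the standard library; every table and column name is invented for the example and comes from no specific listing:

```python
import sqlite3

# Hypothetical ETL step: extract raw web-analytics events, aggregate them,
# and load the result into a reporting table. Names are illustrative only.
def run_etl(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Extract: pull raw page-view events
    rows = cur.execute("SELECT user_id, page, duration_ms FROM raw_events").fetchall()
    # Transform: total dwell time per user, dropping malformed records
    totals: dict[str, int] = {}
    for user_id, page, duration_ms in rows:
        if user_id is None or duration_ms is None or duration_ms < 0:
            continue  # skip malformed records
        totals[user_id] = totals.get(user_id, 0) + duration_ms
    # Load: upsert the aggregates into a reporting table
    cur.execute(
        "CREATE TABLE IF NOT EXISTS user_dwell (user_id TEXT PRIMARY KEY, total_ms INTEGER)"
    )
    cur.executemany("INSERT OR REPLACE INTO user_dwell VALUES (?, ?)", totals.items())
    conn.commit()
    return len(totals)  # number of users loaded
```

In a production pipeline of the sort these roles describe, a step like this would typically run as an orchestrated task (e.g., an Airflow DAG) rather than a standalone function.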
ensure data is reliable, accessible, and valuable across all areas of the business. What you’ll be doing: Designing, building, and owning high-performance data pipelines and APIs Developing ETL processes to support analytics, reporting, and business operations Assembling complex datasets from a wide variety of sources using tools like SQL, Python, dbt, and Azure Supporting and improving data … we’re looking for: Strong experience with Azure cloud technologies, particularly around data services Proficient in SQL and experienced with Python for data transformation Hands-on experience with dbt, ETL development, and data warehousing best practices Comfortable with deploying infrastructure as code and building CI/CD pipelines (e.g., using GitHub, Azure DevOps) Ability to manage large, unstructured …
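One of the "data warehousing best practices" listings like this one tend to probe for is incremental loading. A minimal, hypothetical sketch of the watermark pattern, again with invented table names and using only the standard library:

```python
import sqlite3

# Hypothetical incremental (watermark-based) load: copy only source rows
# newer than the latest timestamp already loaded, so re-runs are cheap
# and idempotent. Table and column names are illustrative only.
def incremental_load(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS target (id INTEGER PRIMARY KEY, loaded_at INTEGER)"
    )
    # Current watermark: newest timestamp already in the target (0 if empty)
    (watermark,) = cur.execute(
        "SELECT COALESCE(MAX(loaded_at), 0) FROM target"
    ).fetchone()
    # Copy only rows strictly newer than the watermark
    cur.execute(
        "INSERT INTO target SELECT id, updated_at FROM source WHERE updated_at > ?",
        (watermark,),
    )
    conn.commit()
    return cur.rowcount  # number of rows copied on this run
```

The same idea underlies dbt's incremental materialisations and Azure Data Factory's tumbling-window triggers, which is presumably why the pattern shows up across these listings.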
SQL, ETL, Azure A Senior Data Engineer is required to join a forward-thinking data team within a thriving city-based insurance group. This role will see you play a critical part in delivering reliable, scalable and business-focused data solutions. With a strong focus on Microsoft technologies and cloud-based tools, you’ll work directly with key business stakeholders … an MGA or insurance carrier is essential. Key Responsibilities Deliver data solutions and changes that support evolving business requirements. Build and maintain robust, scalable data pipelines using SQL and ETL best practices. Collaborate with stakeholders to analyse, define and implement solutions to complex data challenges. Proactively assess the impact of changes on the broader data model and ensure integrity … owners and architects to align technical delivery with strategic objectives. Build deep knowledge of internal systems and promote collaboration across teams. Key Skills & Experience: Significant experience with SQL and ETL development. Strong experience with MS SQL Server, T-SQL, Azure Data Factory, Azure Databricks, Python, Data Lake. Strong background in insurance MI or reporting; experience within an MGA or …