a few in-person meetings in shared co-working spaces on an ad hoc basis. Role Description We are looking for an SQL Developer (Snowflake), specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building semantic models and supporting analytics. Key Responsibilities … Develop and optimise complex SQL queries, views, and stored procedures in Snowflake. Design and maintain efficient ETL/ELT pipelines using modern data integration platforms. Create and manage Python-based stored procedures in Snowflake to support advanced transformations and automation. Build and maintain Power BI datasets, data models, and semantic models to support business intelligence needs. Work closely with … of data warehousing, dimensional modelling, and ELT best practices. Knowledge of version control and Agile development methodologies. Qualifications: Strong experience in Data Engineering, with a focus on data modelling, ETL, and Snowflake. Snowflake certification (e.g., SnowPro Core) is a strong plus. Proficiency in Snowflake for data warehousing, including semantic modelling and Python-based stored procedures. Experience with Azure Data Factory
London, England, United Kingdom Hybrid / WFH Options
Mirai Talent
databases. Manage data lakes, warehouses, and lakehouses within Azure cloud environments. Apply data modelling techniques such as Kimball methodologies, star schemas, and data warehouse design principles. Build and support ETL workflows using tools like Azure Data Factory, Synapse, Delta Live Tables, dbt, SSIS, etc. Automate infrastructure deployment with Terraform, ARM, or Bicep. Collaborate on report development and visualisation with Power … skills in Python, SQL, and PySpark. Experienced working with data lakes, warehouses, lakehouses, and cloud platforms, preferably Azure. Knowledgeable in data modelling, including Kimball and star schemas. Familiar with ETL tools such as Azure Data Factory, Synapse, Delta Live Tables, dbt, SSIS. Experienced with Infrastructure as Code (Terraform, ARM, Bicep). Skilled in Power BI report development. Proficient in version
West Midlands, England, United Kingdom Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
the data strategy, enhancing platform capabilities, and supporting business intelligence initiatives. Key Responsibilities Design and develop Azure data pipelines using Data Factory, Databricks, and related services. Implement and optimize ETL processes for performance, reliability, and cost-efficiency. Build scalable data models and support analytics and reporting needs. Design and maintain Azure-based data warehouses and data lakes. Ensure data governance
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
data and analytics needs. Design and deploy end-to-end data solutions using Microsoft Fabric, encompassing data ingestion, transformation, and visualisation workflows. Construct and refine data models, pipelines, and ETL frameworks within the Fabric ecosystem. Leverage Fabric's suite of tools to build dynamic reports, dashboards, and analytical applications. Maintain high standards of data integrity, consistency, and system performance across
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Capgemini
into technical solutions that ensure scalability, robust data governance and optimal performance. Developing innovative solutions: Create PoCs and MVPs for Data & AI solutions focusing on pipeline automation, ELT/ETL processes, and deployment through enterprise data platforms. Crafting compelling user experiences: Blend user-centred design with storytelling to deliver impactful Gen AI/BI, WebApp and data product interfaces, leveraging
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
structures. Demonstrated ability to write clean, efficient code in Python and SQL with knowledge of building and optimizing data pipelines or analytics solutions. Foundational understanding of data pipelines and ETL/ELT processes, including data ingestion and transformation concepts. Basic knowledge of CI/CD tools (e.g., Jenkins, GitLab CI) with a willingness to learn more advanced automation practices. General
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
using Azure Data Factory (ADF), ensuring efficient and reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams
Cheshire, England, United Kingdom Hybrid / WFH Options
Oliver James
market. Working specifically with underwriting teams to support their BI processes. Write efficient DAX measures and calculated columns to support analytical requirements. Develop robust SQL queries, stored procedures, and ETL processes to source and transform data. Translate complex data requirements into scalable BI solutions. Collaborate with business users to gather requirements and deliver meaningful insights. Lead data governance and quality … SQL. Experience working with Underwriting teams. Strong understanding of insurance-specific data models including premiums, claims, exposures, and bordereaux. Hands-on experience working with data warehouses, data marts, and ETL pipelines. Solid knowledge of relational databases (e.g., MS SQL Server, Azure SQL). Excellent problem-solving skills and ability to work independently or as part of a team. Strong stakeholder
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
insights for decision-making. Key Accountabilities: Ensure the existing design, development, and expansion of a near real-time data platform integrating AWS, Databricks and on-prem JDE systems. Develop ETL processes to integrate and transform data from multiple sources into a centralised data platform. Develop and optimise complex queries and data pipelines. Optimise and manage the data platform to ensure … teams to gather and understand data requirements. Design and build interactive and insightful dashboards and reports for internal and external stakeholders. Develop and maintain comprehensive documentation for data models, ETL processes, and BI solutions. Ensure data accuracy, integrity, and consistency across the data platform. Knowledge, Skills and Experience: Essential: Strong expertise in Databricks and Apache Spark for data engineering … and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data sources. Experience in gathering, documenting, and refining requirements from key business stakeholders to align BI solutions with business needs. Strong problem-solving skills and attention to detail.
City of Westminster, England, United Kingdom Hybrid / WFH Options
nudge Global Ltd
means we need to stay agile, meaning the responsibilities of a role are never set in stone. Responsibilities Design, build, and maintain robust, scalable, and secure data pipelines and ETL processes from various structured and unstructured data sources. Leverage data mining techniques to extract meaningful insights from large-scale datasets, including customer transactions, behavioural data, and third-party data providers
Kirkby on Bain, England, United Kingdom Hybrid / WFH Options
ANGLIAN WATER-2
Modeller, design robust, secure and supportable corporate data solutions to meet business requirements following dimensional modelling methodology, considering privacy by design and self-service capabilities by default. As an ETL Developer, develop, test and/or quality assure extracts of data from corporate systems into the Data Lake. As a Semantic Layer Developer, develop, test and/or quality assure
on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift). Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines. Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices. Familiarity with data visualisation tools (Tableau, PowerBI, Looker) and analytics frameworks. Leadership & Communication Proven experience leading technical
City of London, London, United Kingdom Hybrid / WFH Options
Primus
offered on an Outside IR35 basis. Key Responsibilities: Design, build, and maintain scalable data pipelines using Microsoft Fabric, Azure Synapse, and Azure Data Factory (ADF). Develop and optimise ETL/ELT processes to support business intelligence and analytics solutions. Collaborate with data architects, analysts, and stakeholders to understand business requirements and translate them into technical solutions. Contribute to the … needed. Key Skills & Experience: Proven experience as a Data Engineer working in cloud-native environments. Microsoft Fabric; Azure Synapse Analytics; Azure Data Factory (ADF). Strong understanding of data modelling, ETL/ELT development, and data warehouse best practices. Experience working across greenfield and brownfield projects. Comfortable working independently in a remote setup. Strong communication and stakeholder engagement skills. Nice to
Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/ELT pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring
London, England, United Kingdom Hybrid / WFH Options
Circana
ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
following product/solution development lifecycles using frameworks/methodologies such as Agile, SAFe, DevOps and use of associated tooling (e.g., version control, task tracking). Demonstrable experience writing ETL scripts and code to ensure the ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or PowerBI. Excellent communication and stakeholder management skills. Google Cloud Professional certifications. Experience in alternative cloud