ETL/ELT pipelines using SQL and Python
* Integrate internal/external data sources via APIs and platform connectors
* Model and structure data for scalable analytics (e.g., star/snowflake schemas)
* Administer Microsoft Fabric Lakehouse and Azure services
* Optimise performance across queries, datasets, and pipelines
* Apply data validation, cleansing, and standardisation rules
* Document pipeline logic and contribute to business …
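Illustrating the kind of work described in the listing above, here is a minimal sketch of an ETL step in Python that reshapes raw records into a simple star schema (one dimension, one fact) and loads them into SQLite. The table names, columns, and sample rows are invented for the example rather than taken from the advert; in a Fabric or Azure setting the load target would be a Lakehouse or warehouse rather than SQLite.

```python
# Minimal sketch: transform raw order records into a star schema
# (one dimension table, one fact table) and load them into a local
# SQLite database. All names and sample rows are illustrative only.
import sqlite3
import pandas as pd

# Raw extract as it might arrive from an API or platform connector.
raw = pd.DataFrame([
    {"order_id": 1, "customer": "Acme Ltd", "country": "UK", "amount": 120.0, "order_date": "2024-01-05"},
    {"order_id": 2, "customer": "Acme Ltd", "country": "UK", "amount": 80.0,  "order_date": "2024-01-09"},
    {"order_id": 3, "customer": "Beta plc", "country": "IE", "amount": 200.0, "order_date": "2024-01-10"},
])

# Dimension: one row per customer, with a surrogate key.
dim_customer = (
    raw[["customer", "country"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(customer_key=lambda d: d.index + 1)
)

# Fact: measures plus a foreign key into the dimension.
fact_orders = raw.merge(dim_customer, on=["customer", "country"])[
    ["order_id", "customer_key", "order_date", "amount"]
]

with sqlite3.connect("warehouse.db") as conn:
    dim_customer.to_sql("dim_customer", conn, if_exists="replace", index=False)
    fact_orders.to_sql("fact_orders", conn, if_exists="replace", index=False)
    # Sanity-check query joining fact to dimension.
    print(pd.read_sql(
        "SELECT c.customer, SUM(f.amount) AS total "
        "FROM fact_orders f JOIN dim_customer c USING (customer_key) "
        "GROUP BY c.customer", conn))
```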
impactful dashboards
* Advanced SQL (T-SQL preferred) for managing and querying relational databases
* Experience with ETL tools (SSIS, Azure Data Factory, or similar)
* Strong data modelling skills (star/snowflake schema)
* Familiarity with Azure SQL, Synapse Analytics, or other cloud platforms is a plus
Domain Knowledge
* Solid experience in the Lloyd’s/London Market insurance environment
* Strong …
London (City of London), South East England, United Kingdom
i3
REST APIs, Power BI Embedded, and programmatic data access patterns
Data Engineering & Modelling
* Strong T-SQL skills for data retrieval and performance tuning
* Knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best practices
Preferred Qualifications
* Microsoft certifications such as DA-100, DP-500, or MCSE: BI
* Familiarity with CI/CD for BI assets (e.g. Git …
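As a rough illustration of the "programmatic data access patterns" this listing mentions, the sketch below calls the Power BI REST API from Python to trigger and inspect a dataset refresh. The workspace/dataset IDs and the bearer token are placeholders, and the endpoint paths reflect the public API documentation at the time of writing rather than anything specified in the advert.

```python
# Hedged sketch of programmatic access to the Power BI REST API:
# trigger a dataset refresh, then list recent refresh history.
# GROUP_ID, DATASET_ID and the access token are placeholders you would
# obtain from your own tenant (e.g. via Azure AD / MSAL).
import requests

ACCESS_TOKEN = "<azure-ad-bearer-token>"   # placeholder
GROUP_ID = "<workspace-guid>"              # placeholder
DATASET_ID = "<dataset-guid>"              # placeholder

BASE = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Kick off an asynchronous refresh of the dataset.
resp = requests.post(
    f"{BASE}/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers=headers,
    json={"notifyOption": "NoNotification"},
    timeout=30,
)
resp.raise_for_status()

# Inspect the most recent refresh attempts.
history = requests.get(
    f"{BASE}/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes?$top=5",
    headers=headers,
    timeout=30,
)
for entry in history.json().get("value", []):
    print(entry.get("status"), entry.get("startTime"))
```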
and help deliver actionable business insights to stakeholders across the organisation.
Key Responsibilities
* Design and develop Power BI datasets, dashboards, and reports
* Build efficient, scalable data models (star/snowflake schemas)
* Implement advanced DAX calculations and KPIs
* Gather and interpret business reporting requirements
* Ensure data accuracy, compliance, and security in all reporting
* Liaise between business, IT, and analytics teams …
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role … confident communicating with data, product, and engineering teams, not a 'heads-down coder' type.
Top 4 Core Skills
* Python - workflow automation, data processing, and ETL/ELT development.
* Snowflake - scalable data architecture, performance optimisation, and governance.
* SQL - expert-level query writing and optimisation for analytics and transformations.
* dbt (Data Build Tool) - modular data modelling, testing, documentation, and version …
… build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production …
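To ground the Python + Snowflake side of this role, here is a hedged sketch using the snowflake-connector-python package: staged data is copied into a raw table and a cleaned model is materialised on top. Account details, stage, and table names are placeholders; in the stack described above the transformation layer would more likely live in dbt models than in inline SQL.

```python
# Hedged sketch of a small ingestion/transformation step against Snowflake.
# Connection parameters, database, schema, stage and table names are
# placeholders, not details from the advert.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",   # placeholder
    user="<user>",                    # placeholder
    password="<password>",            # placeholder
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Ingest: copy files already uploaded to an internal stage.
    cur.execute(
        "COPY INTO raw_orders FROM @orders_stage "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Transform: materialise a cleaned, analytics-ready model
    # (in practice this layer would usually be a dbt model).
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.MARTS.ORDERS_CLEAN AS
        SELECT order_id,
               customer_id,
               CAST(amount AS NUMBER(12, 2)) AS amount,
               order_date
        FROM raw_orders
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```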
a Business Data Analyst to join their expanding data team during an exciting cloud data transformation programme and a Workday (HR & Finance) implementation. The business is building a new Snowflake-based data warehouse, and this role will play a pivotal part in shaping how data is structured, accessed, and used across the organisation. You will act as the key … interface between technical teams and business stakeholders, ensuring requirements are translated into high-quality Snowflake data models, pipelines, and reports that deliver measurable business value.
Key Responsibilities
Snowflake Data Analysis & Modelling: Collaborate with stakeholders to capture, refine and document business and technical requirements for Snowflake-based data warehouses, reports, and dashboards. Design and maintain optimised Snowflake … star schemas) to ensure scalable, reliable reporting and analytics.
Support Strategic Data Initiatives & Reporting Optimisation: Evaluate existing reports and conduct technical analyses to identify and close data gaps within Snowflake's data model. Ensure alignment of data models with evolving business requirements and governance standards.
Development, Testing & Operational Support: Investigate and resolve data integrity or reporting issues within Snowflake. …
to modern finance workflows.
Key Responsibilities
Design, develop, and maintain robust ETL/ELT pipelines to ingest, transform, and securely store data from NetSuite and other finance systems into Snowflake, ensuring data integrity, compliance, and security best practices (e.g., encryption, access controls, and auditing). Collaborate with finance and data teams to define data models, schemas, and governance policies … 5+ years of experience as a data engineer or analytics engineer, with a proven track record in full-stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL …
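A minimal sketch of how such a pipeline might be orchestrated with Apache Airflow is shown below. The DAG id, schedule, and the three stub callables (NetSuite extract, Snowflake load, dbt transform) are assumptions for illustration, not details from the role.

```python
# Hedged sketch of an Airflow DAG mirroring the pipeline described above:
# extract from an ERP source, land the data in Snowflake, then transform it.
# The callables are stubs; task names and the schedule are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_netsuite(**context):
    """Pull the day's records from the ERP API (stub)."""
    ...


def load_to_snowflake(**context):
    """Stage and COPY the extracted files into a raw Snowflake table (stub)."""
    ...


def run_dbt_transformations(**context):
    """Run dbt models that reshape raw data into finance marts (stub)."""
    ...


with DAG(
    dag_id="netsuite_to_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    tags=["finance", "elt"],
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_netsuite)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    transform = PythonOperator(task_id="transform", python_callable=run_dbt_transformations)

    extract >> load >> transform
```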
London, South East, England, United Kingdom (Hybrid / WFH Options)
Reed.co.uk
Looker solutions that support sales performance and revenue growth. Develop, document, and distribute Looker reports, ensuring consistency, accuracy, and usability. Build and maintain robust data models in DBT and Snowflake, ensuring clean, trustworthy data flows into reporting. Support data integrity within Salesforce and other commercial data sources, enhancing quality and enabling consistent reporting. Automate ingestion and transformation of financial … tool such as Looker, Tableau, Power BI or similar. Good SQL skills; you know your way around a database. Some exposure to a cloud-based data warehouse such as Snowflake, Redshift, BigQuery or similar. Some knowledge of ETL/ELT and data automation. A desire to work with complex Salesforce detail and master that data. Excellent problem-solving …
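As an illustration of automating ingestion from Salesforce, the sketch below pulls recent opportunity records with the third-party simple_salesforce package and flattens them for downstream modelling. Credentials, the SOQL field list, and the output file are placeholders, not specifics from the advert.

```python
# Hedged sketch: ingest Salesforce opportunity data so it can be modelled
# downstream (e.g. in DBT/Snowflake). Credentials and fields are placeholders.
import pandas as pd
from simple_salesforce import Salesforce

sf = Salesforce(
    username="<user@example.com>",       # placeholder
    password="<password>",               # placeholder
    security_token="<security-token>",   # placeholder
)

soql = """
    SELECT Id, Name, Amount, StageName, CloseDate, OwnerId
    FROM Opportunity
    WHERE CloseDate = LAST_N_DAYS:30
"""
records = sf.query_all(soql)["records"]

# Flatten into a DataFrame, dropping the per-record 'attributes' metadata.
opps = pd.DataFrame(records).drop(columns=["attributes"], errors="ignore")
opps.to_csv("opportunities_last_30_days.csv", index=False)
print(f"ingested {len(opps)} opportunities")
```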
value of technology and build a more sustainable, more inclusive world.
Your Role
Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. This person will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and …/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. … DBT projects.
Required Qualifications
* 5+ years of experience in data engineering or a related field.
* Strong hands-on experience with DBT (modular SQL development, testing, documentation).
* Proficiency in Snowflake (data warehousing, performance tuning, security).
* Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages.
* Solid understanding of data modeling concepts (star/snowflake …
London (City of London), South East England, United Kingdom
Capgemini
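The Capgemini listing above centres on modular dbt development against Snowflake. As a hedged sketch of what driving dbt from Python can look like, the snippet below uses the programmatic entry point that dbt-core exposes from version 1.5 onward; project and profile discovery are assumed to come from the working directory and profiles.yml.

```python
# Hedged sketch: driving dbt programmatically from Python (dbt-core 1.5+).
# On older versions the same commands would be shelled out via the dbt CLI.
from dbt.cli.main import dbtRunner

runner = dbtRunner()

# Build (run + test) the staging models and everything downstream of them.
result = runner.invoke(["build", "--select", "staging+"])
if not result.success:
    raise RuntimeError(f"dbt build failed: {result.exception}")

# Regenerate the project documentation site after a successful build.
runner.invoke(["docs", "generate"])
```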
wider customer and insight function, harnessing data and predictive modelling to deliver personalised experiences across global brands and channels.
What you'll need
* Python, SQL, data modelling
* Expertise in Snowflake & Snowpark
* Immaculate communication to influence and collaborate
* Experience building ML environments (MLflow or similar)
* CI/CD practices
What you'll do
* Design and implement a scalable ML environment … with Data Science
* Define best practice for deployment, monitoring, and governance
* Build pipelines in Snowflake/Snowpark to power ML workflows
* Support migration of models, removing blockers
* Partner with IT to ensure smooth integration and adoption
* Mentor peers and champion modern engineering practices
* Ensure compliance with governance, privacy, and security
If you're a Data Engineer ready to make …
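For the "ML environments (MLflow or similar)" requirement, here is a minimal sketch of an MLflow tracking run: a small scikit-learn model is trained, its parameter and AUC are logged, and the artefact is registered. The experiment and model names are invented, and registration assumes a tracking server with a model registry is configured.

```python
# Hedged sketch of the MLflow side of an ML environment: train a small
# model, log parameters/metrics, and register the fitted artefact.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

mlflow.set_experiment("customer-propensity")   # assumed experiment name

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=1_000, C=0.5)
    model.fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_param("C", 0.5)
    mlflow.log_metric("test_auc", auc)

    # Log the model; registration assumes a registry-capable tracking URI.
    mlflow.sklearn.log_model(model, artifact_path="model",
                             registered_model_name="customer_propensity")
```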
Chelsea, London, England, United Kingdom (Hybrid / WFH Options)
Opus Recruitment Solutions Ltd
Data Scientist – Location: London (Hybrid), 4 days in the office. Salary: £50,000 to £60,000 + Benefits. CRM | Predictive Modeling | Snowflake | dbt
Are you a Data Analyst or mid-level Data Scientist who’s ready to move beyond dashboards and into real impact? This is your chance to join a global data team that’s driving strategy across … you’ll be building predictive models (LTV, lead scoring), developing analytics-ready data marts, and deploying solutions that directly influence decision-making. You’ll work with a cutting-edge stack: Snowflake, dbt, Tableau, Salesforce, Python, SQL and collaborate with stakeholders who genuinely value data.
What You’ll Be Doing as a Data Scientist
* Designing and refining predictive models to guide marketing … with drift monitoring and retraining triggers
* Partnering with marketing and enrolment teams to run A/B tests and measure impact
* Building analytics-ready data marts in dbt/Snowflake with proper documentation and SLAs
* Developing dashboards in Tableau to track ROI, pipeline health, and market segmentation
* Maintaining pipelines from Salesforce, Marketing Cloud, GA, Facebook, and more
* Using reverse …
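As a rough sketch of the lead-scoring work described above, the snippet below trains a simple classifier over synthetic CRM-style features and evaluates it with AUC. The feature names and data are stand-ins; a production version would read from the dbt/Snowflake marts mentioned in the listing.

```python
# Hedged sketch of a lead-scoring model: a classifier over CRM-style
# features, evaluated with AUC. Features and data are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 5_000
leads = pd.DataFrame({
    "emails_opened": rng.poisson(3, n),
    "web_sessions": rng.poisson(5, n),
    "days_since_enquiry": rng.integers(0, 90, n),
    "prior_purchases": rng.integers(0, 4, n),
})
# Synthetic target: engagement loosely drives conversion probability.
logit = (0.3 * leads["emails_opened"] + 0.2 * leads["web_sessions"]
         - 0.03 * leads["days_since_enquiry"] + 0.8 * leads["prior_purchases"] - 2.0)
converted = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    leads, converted, test_size=0.25, random_state=7, stratify=converted)

model = GradientBoostingClassifier(random_state=7).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]   # lead scores in [0, 1]
print(f"test AUC: {roc_auc_score(y_test, scores):.3f}")
```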