City of London, London, United Kingdom Hybrid / WFH Options
OTA Recruitment
in building, maintaining, and scaling modern data pipelines and transformation workflows (ELT), ideally within a cloud or lakehouse environment. Strong experience with data modeling techniques (e.g. dimensional, star/snowflake schemas) and analytics layer design to support business intelligence and self-serve reporting. Proficiency in analytics engineering tools such as Airflow, SQL, and version control systems like Git. Hands … data engineers, data scientists, and business stakeholders. Familiarity with cloud-based data ecosystems such as AWS, Azure, or GCP, and working with data warehouse/lakehouse technologies such as Snowflake, BigQuery, Redshift, or Athena/Glue. Essential: Proficient in writing clean, efficient, and maintainable SQL and Python code, particularly for data transformation and analytics use cases. Strong understanding of … data modeling concepts, including star/snowflake schemas and designing models optimized for reporting and dashboarding. Proficient in analytics tools such as Power BI, Plotly/Dash, or similar for building interactive and impactful visualizations. Deep experience with modern ELT workflows and transformation tools (e.g., dbt, custom SQL models, etc.). Strong ability to debug and optimize slow or …
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay … processors. Proven experience in SQL, DBT and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding …
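For illustration of the dbt-on-Snowflake modelling work described in the listing above, here is a minimal sketch of a dbt model; the staging model names (stg_customers, stg_orders) and columns are hypothetical, and a real project would also declare schema tests (e.g. unique and not_null on customer_id) in a schema.yml file.

```sql
-- models/marts/dim_customers.sql -- a hypothetical dbt model materialised as a table in Snowflake
{{ config(materialized='table') }}

with customers as (
    select * from {{ ref('stg_customers') }}
),

orders as (
    select
        customer_id,
        count(*)        as order_count,
        min(order_date) as first_order_date,
        max(order_date) as most_recent_order_date
    from {{ ref('stg_orders') }}
    group by customer_id
)

select
    c.customer_id,
    c.customer_name,
    coalesce(o.order_count, 0) as order_count,
    o.first_order_date,
    o.most_recent_order_date
from customers as c
left join orders as o
    on o.customer_id = c.customer_id
```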
hands-on experience with Power Query, DAX, tabular model design, visualization best practices. Proficiency in optimizing report performance and implementing advanced features. Data Modelling: Strong knowledge of star/snowflake schemas, dimension/fact design, and row-level security for enterprise-scale tabular models. Programming & Scripting: Proficiency in SQL; familiarity with Python or other scripting languages is a plus. …
is a Remote role with a few in-person meetings in shared co-working spaces on an ad hoc basis. Role Description We are looking for an SQL Developer (Snowflake), specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building … SQL queries, views, and stored procedures in Snowflake. Design and maintain efficient ETL/ELT pipelines using modern data integration platforms. Create and manage Python-based stored procedures in Snowflake to support advanced transformations and automation. Build and maintain Power BI datasets, data models, and semantic models to support business intelligence needs. Work closely with stakeholders to understand data … requirements and translate them into scalable technical solutions. Ensure data quality, consistency, and performance across environments. Monitor and tune Snowflake performance, storage, and compute usage. Implement best practices in data modelling, schema design, and cloud architecture. Collaborate on CI/CD and automation initiatives for data deployments. Maintain technical documentation for processes, pipelines, models, and reports. Skills Required …
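As a rough illustration of the Python-based stored procedures this role mentions, below is a minimal Snowpark procedure defined through Snowflake SQL; the procedure, schema, table, and column names are hypothetical, and the transformation shown (dropping null keys and duplicates) is only a placeholder for real business logic.

```sql
-- Hypothetical Python (Snowpark) stored procedure created via Snowflake SQL.
CREATE OR REPLACE PROCEDURE analytics.clean_orders(source_table STRING, target_table STRING)
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION = '3.10'
  PACKAGES = ('snowflake-snowpark-python')
  HANDLER = 'run'
AS
$$
def run(session, source_table: str, target_table: str) -> str:
    # Read the source table, drop rows with a null ORDER_ID, keep one row per ORDER_ID,
    # and overwrite the target table with the cleaned result.
    df = session.table(source_table)
    cleaned = df.filter(df["ORDER_ID"].is_not_null()).drop_duplicates("ORDER_ID")
    cleaned.write.mode("overwrite").save_as_table(target_table)
    return f"Wrote {cleaned.count()} rows to {target_table}"
$$;

-- Example invocation:
CALL analytics.clean_orders('RAW.ORDERS', 'ANALYTICS.ORDERS_CLEAN');
```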
non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: •Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). •Participate in the design, implementation, and maintenance of data warehouses ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS •Educational Background: Bachelor's or Master … Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. •Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. •Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts; experience gathering requirements … R; experience with machine learning algorithms and techniques is a plus. •Experience in building and maintaining APIs for data integration and delivery. •Experience with data warehouse platforms such as Snowflake is a plus. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in …
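For readers less familiar with the dimensional modelling mentioned above, here is a minimal, generic star-schema sketch; the table and column names are illustrative only and do not reflect any actual Goldman Sachs model.

```sql
-- A small star schema: one fact table joined to conformed dimensions via surrogate keys.
create table dim_date (
    date_key      integer      primary key,   -- e.g. 20240131
    calendar_date date         not null,
    month_name    varchar(9)   not null,
    year_number   smallint     not null
);

create table dim_customer (
    customer_key  integer      primary key,   -- surrogate key
    customer_id   varchar(32)  not null,      -- natural/business key
    customer_name varchar(100),
    region        varchar(50)
);

create table fact_orders (
    order_key     integer      primary key,
    date_key      integer      not null references dim_date (date_key),
    customer_key  integer      not null references dim_customer (customer_key),
    order_amount  numeric(12, 2)
);

-- Typical analytical query: revenue by region and month.
select d.year_number, d.month_name, c.region, sum(f.order_amount) as revenue
from fact_orders f
join dim_date d     on d.date_key = f.date_key
join dim_customer c on c.customer_key = f.customer_key
group by d.year_number, d.month_name, c.region;
```

A snowflake schema would further normalise the dimensions (for example, moving region into its own table keyed from dim_customer), trading some query simplicity for reduced redundancy.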
Lambda, IAM, Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment Exposure …
City of London, London, England, United Kingdom Hybrid / WFH Options
Avanti
and workflow automation Experience with AWS data tools (e.g. Redshift, Glue, Lambda, S3) and infrastructure tools such as Terraform Understanding of data modelling concepts (e.g. dimensional models, star/snowflake schemas) Knowledge of data quality, access controls, and compliance frameworks Nice to Have Experience with orchestration or pipeline frameworks like Airflow or dbt Familiarity with BI platforms (e.g. Power …
tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with Data warehousing and Data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. Cloud …
We are looking for a Data Engineer, specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building semantic models and supporting analytics. Please only apply if you are confident with the above. This is a Remote role with a few in … scalability. We collaborate closely with clients to achieve their desired results, covering Retail, Out of Home, E-Commerce, and Field Sales. Key Responsibilities: Design and optimize ETL pipelines in Snowflake and Azure Data Factory to streamline data integration and transformation. Build and manage semantic data models in Snowflake and Power BI to support scalable, user-friendly analytics and … reporting. Develop Snowflake stored procedures using Python to automate workflows and handle complex data transformations. Ensure data integrity and accessibility within Snowflake to support effective data warehousing operations. Collaborate with analytics and business teams to align data models with reporting needs and business goals. Qualifications: Strong experience in Data Engineering, with a focus on data modelling, ETL, and …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self-Service BI Tools and Data Science/ML platform using Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault, etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
City of London, London, United Kingdom Hybrid / WFH Options
Signify Technology
or Scala to support data transformation, orchestration, and integration tasks. Work with cloud platforms like AWS to deploy, monitor, and manage data services. Utilise tools such as DBT and Snowflake for data modeling, transformation, and warehouse management. Collaborate with analysts, data scientists, and business stakeholders to ensure data accuracy, consistency, and availability. Apply strong analytical thinking and problem-solving … on expertise in Spark, Kafka, and other distributed data processing frameworks. Solid programming skills in Python. Strong familiarity with cloud data ecosystems, especially AWS. Strong knowledge of DBT and Snowflake. Strong problem-solving mindset with the ability to diagnose and resolve technical challenges. Excellent communication skills and the ability to work effectively in a cross-functional team.
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
Principal Data Consultant (Snowflake/Matillion) Application Deadline: 18 June 2025 Department: Data Engineering Employment Type: Full Time Location: Bristol, UK Compensation: £85,000 - £100,000/year Description As a Principal Data Consultant at Snap, you'll be at the forefront of our most strategic initiatives. Your role will involve leading client engagements, managing large-scale projects, and … as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work … closely with our clients to design the correct data models to support their analytic requirements following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and …
Mandatory Skills Required: Data Architecture Data Migration Data Modeling Snowflake Designer/Developer DBT (Data Build Tool) ETL Design AWS Services – including S3, ETL/EMR, Security, Lambda, etc. StreamSets Python Programming Leadership and Team Handling Strong Communication and Collaboration Skills > Bachelor’s degree in computer science, Engineering, or a related field. > 5+ years of experience in data engineering … with a strong focus on Snowflake and AWS. > Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.) > Hands-on experience with Oracle RDBMS > Data Migration experience to Snowflake > Experience with AWS services such as S3, Lambda, Redshift, and Glue. > Strong understanding of data warehousing concepts and data modeling. > Excellent problem-solving and communication skills, with a focus …
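For context on the Oracle-to-Snowflake migration and AWS S3 skills listed above, here is a minimal sketch of bulk-loading exported files from S3 into Snowflake; the bucket, stage, file format, storage integration, and table names are all hypothetical, and a real migration would typically be orchestrated by a tool such as StreamSets or DBT rather than run by hand.

```sql
-- Hypothetical bulk load of Oracle table exports (CSV files landed in S3) into Snowflake.
create or replace file format csv_export_format
    type = 'CSV'
    field_optionally_enclosed_by = '"'
    skip_header = 1;

create or replace stage oracle_export_stage
    url = 's3://example-bucket/oracle-exports/'
    storage_integration = s3_int            -- assumes an existing storage integration
    file_format = csv_export_format;

copy into raw.orders
    from @oracle_export_stage/orders/
    file_format = (format_name = 'csv_export_format')
    on_error = 'ABORT_STATEMENT';
```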
for an experienced Data Engineer to join their team. They only recruit the "best" talent and have a diverse workforce. Key Responsibilities: Design, build, and manage data pipelines in Snowflake, ensuring seamless data flow from diverse sources into the data platform. Collaborate with business stakeholders and data analysts to gather data requirements and deliver effective solutions. Integrate structured and … unstructured data sources into Snowflake, optimising data models for analytics and reporting. Support data analysts by optimising semantic models and ensuring data readiness for Snowflake-based reporting and analytics. Implement and enforce data governance policies, maintaining data security and compliance within the Snowflake environment. Monitor system performance and ensure data platforms are scalable, reliable, and efficient. Requirements … Demonstrated experience in Snowflake, including developing and managing data pipelines and data models. Strong knowledge of cloud-based data integration, transformation, and storage. Hands-on experience working with both structured and unstructured data in Snowflake. Familiarity with data governance, security best practices, and Snowflake optimisation techniques. Proven ability to work collaboratively with analysts and business users to deliver …
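As a small illustration of the Snowflake data governance work described above, here is a sketch of a dynamic data masking policy; the policy, role, schema, and column names are hypothetical and would need to match the organisation's own access model.

```sql
-- Mask customer email addresses for every role except an authorised analyst role.
create or replace masking policy email_mask as (val string) returns string ->
    case
        when current_role() in ('PII_ANALYST') then val
        else '*** masked ***'
    end;

alter table analytics.dim_customer
    modify column email set masking policy email_mask;
```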
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives from designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … have a robust Infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have experience with streaming platforms like Apache Kafka and be able …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
comfortable working across both technical and business domains. Key technical skills: Strong SQL and ELT/data pipeline development experience Expertise in Data Warehouse & Data Lake design (including Star Schema, Snowflake Schema, Data Vault) Hands-on experience with enterprise databases: Oracle, Snowflake, Teradata, or SQL Server Solid understanding of AWS (S3, Lambda, IAM, etc.) Proficiency in …
City of London, London, United Kingdom Hybrid / WFH Options
Searchability®
ETL processes, and cloud-based data solutions, with a particular focus on advanced SQL for data transformation and optimisation. The ideal candidate will have hands-on experience working with Snowflake, Azure Data Factory, and Power BI, with proven ability to design and maintain semantic data models that support scalable analytics. Experience working with consumer goods or retail data models … this role you give express consent for us to process and submit (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: Snowflake, Azure Data Factory, Power BI, SQL, ETL …
My client, based in the London area, is currently looking to recruit an experienced Lead Snowflake Data Engineer to join their Data & AI Consulting team. They are a specialist insurance organisation that is at the forefront of engineering practices. They are currently going through a period of growth and are looking for an experienced Lead Data Engineer … office, Bonus of up to 15%, 29 Holidays, Flexible Working, Private Health Care, and more... For this role, you will need to aid the implementation of a brand new Snowflake Data Warehouse. They are looking for a candidate that has experience in... AWS Data Platform, Strong knowledge of Snowflake, S3, Lambda, Data Modelling, DevOps Practices, Airflow, DBT …
a unified, scalable architecture to enable improved analytics, data governance, and operational insights. Technical requirements: Substantial experience designing and implementing data solutions on Microsoft Azure Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in DBT (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity … both enterprise reporting and self-service analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure Power BI DBT Snowflake About us esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to …
Cambridgeshire, United Kingdom Hybrid / WFH Options
Exalto Consulting
working, this is paying £55-65K basic salary depending on experience with a 10% bonus. Essential skills and experience for the role: Python Azure Data Factory Power BI Snowflake, dbt Experience designing ETL/ELT pipelines and integrating data from cloud and on-premise sources. API integrations Familiarity with data modelling best practices (star/snowflake schema …