business each day. You will work with the Lead Data Engineer and other members of the Data Engineering team to deliver our new strategic enterprise data platform, based on Snowflake and dbt, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation, working towards the creation of a single source of truth. Development of data ingestion … hours including weekends, evenings and public holidays. Your profile Key Skills and Competency Requirements: At least 2 years' experience designing and implementing a full-scale data warehouse solution using Snowflake. Excellent proficiency with Snowflake internals and with integrating Snowflake with other technologies for data processing and reporting. Data modelling using the Kimball methodology. Experience in developing CI/…
the lives of teachers and outcomes of students everywhere. About the role Our Data team centralises data from all instances of Arbor MIS, and many other sources, to a Snowflake data warehouse. This delivers analytics and analytics-enablement to our customers, including schools, local authorities and multi-academy trusts. We are looking for a collaborative and experienced Senior Data Engineer to join our Data team. The remit and focus of the role is to take ownership of projects and initiatives and to maintain the availability, security and performance of our Snowflake data warehouse. It’s a broad and exciting role, so we’re looking for someone up for a … in practising and improving excellent engineering practices within the team. About you Extensive experience writing SQL Extensive experience writing Python Experience as a Senior Engineer on a data warehouse, ideally Snowflake Experience building and/or maintaining a CI/CD pipeline Experience using modern orchestration tooling, e.g. Prefect, Luigi, Airflow Experience developing infrastructure in Terraform or a similar IaC tool …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations Ltd
Data and BI systems for analytics and decision making, using primarily a Data Warehouse/Mart, Data Lake, self-service BI tools and a Data Science/ML platform built on an Oracle, Snowflake and AWS tech stack. About You We are looking for a skilled Lead Data Engineer who: Personal Attributes Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … delivery at scale (high level and detailed) and various architectural strategies. Solid information architecture skills/experience: data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.). Past hands-on development experience in at least one enterprise analytics database such as Oracle/Teradata/SQL Server/Snowflake. Understanding of AWS cloud and … Good understanding of MLOps. Good …
pipelines to ensure clean, well-structured data for analytics and reporting Implementing and maintaining ELT/ETL processes using tools like dbt and Fivetran Owning and optimising data warehousing (Snowflake) and ensuring efficient data management Collaborating with software engineers, product managers and business stakeholders to understand data needs and deliver high-quality, accessible datasets Establishing best practices for data governance … intermediate proficiency in Python and Git Experience designing, building and maintaining data pipelines Familiarity with data warehousing architecture and implementation Visualisation experience, ideally with Tableau Experience with Fivetran and Snowflake Familiarity with cloud infrastructure (ideally AWS) Experience with Terraform and Docker Please note: unfortunately, this role does not offer visa sponsorship. Seniority level: Mid-Senior level. Employment type: Full-time.
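Listings like this one lean heavily on "clean, well-structured data". As a rough illustration of the kind of row-level quality gate such an ELT process might enforce before data lands in the warehouse, here is a minimal Python sketch; all names (`OrderRow`, `validate`, the field names) are invented for illustration and are not taken from the ad:

```python
from dataclasses import dataclass

@dataclass
class OrderRow:
    order_id: str
    amount_pence: int
    currency: str

def validate(rows):
    """Split incoming rows into loadable rows and rejects.

    A toy stand-in for the data-quality checks a dbt/Fivetran
    pipeline would apply before loading.
    """
    good, bad = [], []
    for row in rows:
        # Reject rows missing a key, with negative amounts,
        # or with a malformed ISO currency code.
        if row.order_id and row.amount_pence >= 0 and len(row.currency) == 3:
            good.append(row)
        else:
            bad.append(row)
    return good, bad

rows = [
    OrderRow("A-1", 1299, "GBP"),
    OrderRow("", 500, "GBP"),     # missing key -> rejected
    OrderRow("A-2", -10, "GBP"),  # negative amount -> rejected
]
good, bad = validate(rows)
print(len(good), len(bad))  # 1 good row, 2 rejects
```

In a real pipeline the reject path would typically be written to a quarantine table and surfaced via monitoring rather than silently dropped.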
diversity and innovation, LSEG is a place where everyone can grow, develop and fulfil their potential through meaningful careers. ROLE PROFILE: We are looking for a versatile and passionate Snowflake/Python Developer to join the SwapAgent Reporting development team, part of the SwapAgent Technology function supporting LSEG’s Post Trade Solutions business. The team is responsible for designing, building, and supporting the critical reporting services and systems that facilitate SwapAgent’s business operations. This Snowflake/Python Developer role will focus on designing, building and maintaining data pipelines and related systems that are critical to the service. You will be assigned to a pod team and work multi-functionally within an Agile environment to ensure the smooth execution of … the development Team Lead, while actively contributing to the delivery of high-quality solutions. The successful candidate needs to be hands-on, with a strong and deep understanding of Snowflake and Python, plus knowledge of various databases and AWS cloud development. You need to be proactive, able to work independently, and willing to take ownership. You will also be encouraged …
with Security and Compliance teams to ensure adherence to data privacy regulations (e.g., GDPR) and internal governance standards. Lead evaluation and integration of data tools, platforms, and technologies (e.g., Snowflake, Databricks, Azure Synapse, Kafka, dbt, Power BI). Oversee data integration strategy across the enterprise, including ETL/ELT pipelines, APIs, and event-driven data streaming. Contribute to the development … dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery). Experience with BI/analytics tools (e.g., Power BI, Tableau) and data visualization best practices. Strong knowledge of data governance, data privacy, and compliance … complex data concepts into business-friendly language. Ability to lead technical discussions and influence data-driven decision making across teams. Certifications such as Microsoft Certified: Azure Data Engineer Associate, Snowflake Architect, or CDMP are highly desirable. Why join us? We're on a journey to become market leaders in our space, and with that comes some incredible opportunities. Collaborate and …
Data Systems Engineer, providing technical guidance, setting priorities, and fostering a high-performing team culture. Oversee the delivery of high-quality ETL solutions across cloud-based platforms such as Snowflake and Azure, ensuring performance and reliability. Champion modern software engineering best practices, including automated testing, peer code reviews, version control (Git), and CI/CD pipelines. Collaborate with the Principal … data engineering role. Proven leadership experience managing and mentoring technical teams, including Data Engineers and DBAs. Proficient in designing and optimising cloud-based data pipelines using platforms such as Snowflake and Azure. Strong hands-on experience with Python, Java and SQL for data transformation, orchestration, and automation. Experience collaborating with DBAs to optimise database performance and ensure infrastructure reliability. Demonstrable … data strategy. Play a key role in Pret’s digital transformation, delivering innovative and scalable data solutions. Work in a collaborative, learning-focused environment with cutting-edge tools like Snowflake, Azure, and CI/CD pipelines. Competitive salary, benefits, and the opportunity to make a tangible impact on Pret’s decision-making and technology roadmap. If you are a strategic …
ERP, and marketing data. Deliver end-to-end data modelling projects, connecting multiple sources and creating metrics/KPIs. Work primarily with SQL, dbt, and cloud data warehouses (Snowflake, BigQuery, Redshift). Technical Requirements Strong proficiency in SQL, with the ability to write complex queries, optimise performance, and manipulate large datasets efficiently. This includes expertise in database management, data … extraction, transformation, and analysis, ensuring seamless data workflows for all stakeholders. Working with modern cloud data warehouses such as Snowflake, BigQuery, or Redshift. You should be comfortable creating robust data models, building scalable pipelines, and ensuring data quality within these cloud environments. Experience with dbt (Data Build Tool), particularly in managing and automating data transformation. Solid understanding of Python …
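The "complex queries" these ads ask about usually means CTEs, window functions and aggregation rather than exotic syntax. A small self-contained sketch using Python's built-in SQLite (table and column names are invented for illustration; a warehouse like Snowflake would accept essentially the same SQL):

```python
import sqlite3

# In-memory database with a tiny sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day TEXT, revenue INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "2024-01-01", 100),
     ("north", "2024-01-02", 150),
     ("south", "2024-01-01", 80),
     ("south", "2024-01-02", 90)],
)

# CTE plus a window function: each region's share of total revenue.
query = """
WITH region_totals AS (
    SELECT region, SUM(revenue) AS total
    FROM sales
    GROUP BY region
)
SELECT region,
       total,
       ROUND(1.0 * total / SUM(total) OVER (), 2) AS share
FROM region_totals
ORDER BY region
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('north', 250, 0.6), ('south', 170, 0.4)]
```

The `SUM(total) OVER ()` window computes the grand total without a second scan or a self-join, which is exactly the kind of optimisation these requirements have in mind.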
London, England, United Kingdom Hybrid / WFH Options
Ignite Digital Talent
and enable insight-led growth from the ground up. What you will be doing: Build and optimise robust data pipelines using Python, Kafka, Airflow, and dbt Design and manage Snowflake data warehouse objects to support scalable analytics Write clean and efficient SQL to support reporting, dashboards and data products Collaborate across engineering, analytics and product teams to enable data-driven … analytics innovation projects What we are looking for: Strong hands-on experience with Python in a data context Proven skills in SQL Experience with data warehousing (DWH), ideally with Snowflake or similar cloud data platforms (Databricks or Redshift) Experience with dbt, Kafka, Airflow, and modern ELT/ETL frameworks Familiarity with data visualisation tools like Sisense, Looker, or Tableau Solid …
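Stacks like Python + Kafka + Airflow + dbt reduce, at their core, to ordered extract/transform/load steps with explicit dependencies. A dependency-ordered toy runner in plain Python (Airflow itself is not used here, and the task names and data are invented for illustration):

```python
# Toy stand-in for an orchestrator: run steps in topological order.
from graphlib import TopologicalSorter

results = {}

def extract():
    # Stand-in for pulling events from a source such as Kafka.
    results["raw"] = [{"user": "a", "clicks": 3}, {"user": "b", "clicks": 5}]

def transform():
    # Stand-in for a dbt-style transformation layer.
    results["clean"] = [r for r in results["raw"] if r["clicks"] > 0]

def load():
    # Stand-in for loading into a warehouse table.
    results["warehouse"] = {r["user"]: r["clicks"] for r in results["clean"]}

# Edges mirror an Airflow-style DAG: extract -> transform -> load.
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

for name in TopologicalSorter(dag).static_order():
    tasks[name]()

print(results["warehouse"])  # {'a': 3, 'b': 5}
```

A real orchestrator adds scheduling, retries, and observability on top of this ordering, but the dependency graph is the part a candidate is expected to reason about.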
London, England, United Kingdom Hybrid / WFH Options
Artefact
and a proven track record of leading data projects in a fast-paced environment. Key Responsibilities Design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow and PySpark. Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms. Implement continuous integration and … communication and interpersonal skills. Excellent understanding of data architecture, including data mesh, data lake and data warehouse. Preferred Qualifications: Certifications in Azure, AWS, or similar technologies. Certifications in Databricks, Snowflake or similar technologies. Experience in leading large-scale data engineering projects. Working Conditions This position may require occasional travel. Hybrid work arrangement: two days per week working from the …
performance of our data systems, all while mentoring your team and shaping best practices. Role: Lead Technical Execution & Delivery
- Design, build, and optimise data pipelines and data infrastructure using Snowflake, Hadoop, Apache NiFi, Spark, Python, and other technologies.
- Break down business requirements into technical solutions and delivery plans.
- Lead technical decisions, ensuring alignment with data architecture and performance best practices. … practices for data pipeline efficiency.
All About You: Technical & Engineering Skills
- Extensive demonstrable experience in data engineering, with expertise in building scalable data pipelines and infrastructure.
- Deep understanding of Snowflake, Hadoop, Apache NiFi, Spark, Python, and other data technologies.
- Strong experience with ETL/ELT processes and data transformation.
- Proficiency in SQL, NoSQL, and data modeling.
- Familiarity with cloud data …
and architectural direction across entire programmes and client estates. Key Responsibilities Senior Data Architects: Lead the design and delivery of cloud-native data solutions using modern platforms (e.g. Databricks, Snowflake, Kafka, Confluent) Architect data lakes, lakehouses, streaming pipelines, and event-driven architectures Oversee engineering teams and collaborate with analysts and QA functions Translate complex requirements into scalable, robust data products … Proven track record in data architecture, either from a delivery or enterprise strategy perspective Deep experience with cloud platforms (Azure, AWS, GCP) and modern data ecosystems (Spark, Databricks, Kafka, Snowflake) Strong understanding of Data Mesh, Data Fabric, and data product-led approaches Data modelling expertise (relational, dimensional) and familiarity with tools like Erwin, Sparx, Archi Experience with ETL/ELT …
technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing:
•Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema).
•Participate in the design, implementation, and maintenance of data warehouses, ensuring data integrity, performance, and scalability.
BASIC QUALIFICATIONS
•Educational Background: Bachelor's or Master's degree in Computer … Working knowledge of R or Python for analytics, data manipulation, and algorithm development.
•Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management.
•Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts; experience gathering requirements, creating technical documentation, and … R; experience with machine learning algorithms and techniques is a plus.
•Experience in building and maintaining APIs for data integration and delivery.
•Experience with data warehouse platforms such as Snowflake a plus.
ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in …
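Dimensional modelling appears in both the responsibilities and the qualifications above. The shape of a star schema, one fact table joined to denormalised dimension tables, can be shown in a few lines of SQLite (the schema and data here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables: one row per product / per date.
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT);
-- Fact table: one row per sale, with foreign keys into the dimensions.
CREATE TABLE fact_sales (product_id INTEGER, date_id INTEGER, units INTEGER, revenue INTEGER);

INSERT INTO dim_product VALUES (1, 'widget', 'hardware'), (2, 'ebook', 'digital');
INSERT INTO dim_date VALUES (10, '2024-03-01', '2024-03'), (11, '2024-03-02', '2024-03');
INSERT INTO fact_sales VALUES (1, 10, 2, 200), (1, 11, 1, 100), (2, 10, 5, 50);
""")

# Typical star-schema query: aggregate the fact, slice by dimension attributes.
rows = conn.execute("""
    SELECT p.category, d.month, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    JOIN dim_date d USING (date_id)
    GROUP BY p.category, d.month
    ORDER BY p.category
""").fetchall()
print(rows)  # [('digital', '2024-03', 50), ('hardware', '2024-03', 300)]
```

A snowflake schema would take the same design one step further and normalise `dim_product` so that `category` lives in its own table, trading an extra join for less redundancy.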
drive real-time decisions, personalisation, and pricing optimisation for each product line. Lead, Mentor, and Set Standards Champion Modern Best Practices: Act as the subject matter expert for dbt, Snowflake, and Looker, setting the gold standard for how multi-product data models are built, tested, and optimised. Coach Across the Business: Enable analysts, scientists, and engineers to self-serve effectively … observable, and optimised for customer impact across insurance, payments, loyalty, and future offerings. Solve Hard Problems Own the Full Stack: Gain deep expertise in the full data platform (AWS, Snowflake, dbt, Looker, orchestration frameworks) to deliver safe, reliable solutions at pace. Automation & Efficiency: Build with a bias towards automation and reusability, essential for scaling multiple product lines without a linear … complexity or cost. Who You Are Technical Expertise: Deep experience with production-grade ETLs, data warehousing, and robust data modelling. You are highly proficient in SQL and deeply understand Snowflake architecture. Tooling Mastery: Hands-on experience with dbt, Looker (or similar BI tools), and orchestration frameworks like Airflow or AWS Step Functions. Mindset: You're proactive, customer-obsessed, and see …
Coventry, England, United Kingdom Hybrid / WFH Options
Berkeley Square IT
ETL processes and Data Quality. This role is 100% remote and sits outside IR35. Must-have technologies: Experience with an ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.) Snowflake Experience with a database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc.) Experience in data modelling (Data Warehouse, Marts) Job scheduling toolset (Job Scheduler, TWS, etc.) Programming and scripting languages (PL …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
working across both technical and business domains. Key technical skills: Strong SQL and ELT/data pipeline development experience Expertise in Data Warehouse & Data Lake design (including Star Schema, Snowflake Schema, Data Vault) Hands-on experience with enterprise databases: Oracle, Snowflake, Teradata, or SQL Server Solid understanding of AWS (S3, Lambda, IAM, etc.) Proficiency in Python (especially working with Boto3 …
resources such as Lambda and EC2 Experience with containers such as AWS ECR, ECS, and Docker Experience with cloud data warehousing/data lake technologies and architectures such as Snowflake, Aurora, or other relational databases Proven data modelling skills - you must have demonstrable experience designing models for data warehousing and analytics use cases (e.g. from operational data store to semantic … scalable and maintainable code. Strong communication skills - you will be able to tailor your communication to technical and less technical audiences alike Experience with Airflow highly advantageous Experience with Snowflake highly advantageous Python expertise, with object-oriented programming knowledge. Experience using dashboarding tools such as Tableau, Superset, Domo or similar. Experience preparing large, complex datasets for Machine Learning pipelines. Prior …
including automated testing, peer code reviews, and version control (Git). Monitor and optimise data pipelines for performance, scalability, and cost-effectiveness. Work with cloud-based platforms such as Snowflake and Azure to manage and process large datasets. Promote a culture of collaboration, learning, and knowledge sharing across the team. Stay up to date with modern data technologies and actively … learn and grow. Make a tangible impact by designing and implementing solutions that drive business decisions. Competitive salary, benefits, and the chance to work with cutting-edge technologies like Snowflake, AnyPoint, Azure, and exciting global technology projects. Pret Offers Competitive salary and annual bonus 33 days' holiday a year, including Bank Holidays Private healthcare Life assurance Pret pension scheme Season …
automatically updates product fit recommendations and descriptions, powered directly by manufacturing data, helping brands reduce returns and improve their bottom line. Our Current Stack: Data Warehouse and Tools: dbt, Snowflake, Airbyte Languages: Python, TypeScript, Node.js, React Cloud & Infra: AWS (S3, ECS), Docker Orchestration: AWS Lambda and Step Functions CI/CD: GitHub Actions Analytics Layer: GoodData What You’ll Work … multi-tenant, data-heavy systems, ideally in a startup or fast-moving environment. Technical Stack: Languages/Tools: Python (REST API integrations), dbt, Airbyte, GitHub Actions Modern Data Warehousing: Snowflake, Redshift, Databricks, or BigQuery. Cloud & Infra: AWS (ECS, S3, Step Functions), Docker (Kubernetes or Fargate a bonus) Data Modelling: Strong grasp of transforming structured/unstructured data into usable models …
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
effective manner. Your Toolkit The Languages You Speak: Python, SQL - the dialect of data. Libraries & Tools: Terraform, Flask, Pandas, FastAPI, Dagster, GraphQL, SQLAlchemy, GitLab, Athena. Your Trusted Companions: Docker, Snowflake, MongoDB, relational databases (e.g. MySQL, PostgreSQL), Dagster, Airflow/Luigi, Spark, Kubernetes. Your AWS Kingdom: Lambda, Redshift, EC2, ELB, IAM, RDS, Route53, S3 - the building blocks of cloud mastery. Your … Driven: Proficiency in designing and developing ETL/ELT pipelines for data integration and transformation. Cloud Navigator: Confidently guiding projects through the AWS ecosystem, with hands-on experience with Snowflake or similar cloud-based data warehouse platforms. Dynamic Collaborator: Adept problem-solver with keen attention to detail. Excellent problem-solving skills, attention to detail, and the ability to work in …
City of London, London, United Kingdom Hybrid / WFH Options
Radley James
data warehouses) Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring Clear communicator, comfortable with cross-functional teams Desirable Experience: APIs from major financial data providers dbt, Snowflake Kafka, Airflow Java feedhandler support Migration of legacy systems (e.g. MATLAB) This position offers a competitive compensation package and a hybrid working model.