City of London, London, United Kingdom Hybrid / WFH Options
OTA Recruitment
in building, maintaining, and scaling modern data pipelines and transformation workflows (ELT), ideally within a cloud or lakehouse environment. Strong experience with data modeling techniques (e.g. dimensional, star/snowflake schemas) and analytics layer design to support business intelligence and self-serve reporting. Proficiency in analytics engineering tools such as Airflow, SQL, and version control systems like Git. Hands … data engineers, data scientists, and business stakeholders. Familiarity with cloud-based data ecosystems such as AWS, Azure, or GCP, and working with data warehouse/lakehouse technologies such as Snowflake, BigQuery, Redshift, or Athena/Glue. Essential: Proficient in writing clean, efficient, and maintainable SQL and Python code, particularly for data transformation and analytics use cases. Strong understanding of … data modeling concepts, including star/snowflake schemas and designing models optimized for reporting and dashboarding. Proficient in analytics tools such as Power BI, Plotly/Dash, or similar for building interactive and impactful visualizations. Deep experience with modern ELT workflows and transformation tools (e.g., dbt, custom SQL models, etc.). Strong ability to debug and optimize slow or …
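The star/snowflake modelling named in the listing above can be sketched in a minimal, self-contained example: a fact table joined to a dimension table and aggregated the way a BI query would. All table and column names are invented for illustration.

```python
import sqlite3

# In-memory database standing in for a warehouse; schema names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product, keyed by a surrogate key.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: one row per sale, referencing the dimension by surrogate key.
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER,
    amount REAL)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, 2, 20.0), (2, 1, 1, 10.0), (3, 2, 5, 50.0)])

# A typical BI query: aggregate the fact table, grouped by a dimension attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 80.0)]
```

A snowflake schema differs only in that dimension attributes (here, `category`) would be normalised out into their own table.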
Bracknell, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
developing, and maintaining our MS SQL Server Data Warehouses and associated data feeds into and out of the warehouses, and developing on our new modern cloud data platform, requiring Snowflake, dbt and Azure Data Factory experience. Our data platforms support regulatory requirements, business intelligence & reporting needs and numerous system integrations. This role requires strong technical proficiency and a … of data engineering and data warehousing principles and practices. You will be critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure BLOB Storage for data storage, and GitHub for version control and collaboration. The role will be … skilled team. An understanding of the wealth management industry, including products & services and the associated data, is a plus. Key Responsibilities • Design, develop, and implement data warehouse solutions using Snowflake, Azure, and MS SQL Server. • Develop data models and database schemas that support reporting and analytics needs. • Extensive use of and fully conversant in SQL. • Experience working with programming …
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Professional Experience: 3–5 years in Data Engineering, Data Warehousing, or programming within a dynamic (software) project environment. Data Infrastructure and Engineering Foundations: Data Warehousing: Knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: Proficiency in … Microsoft Azure or AWS. Programming and Scripting: Programming Languages: Strong skills in Python and SQL. Data Modeling and Query Optimization: Data Modeling: Designing star/snowflake schemas and understanding normalization and denormalization. SQL Expertise: Writing efficient queries and optimizing for performance. DevOps and CI/CD: Version Control: Using Git and platforms like GitHub, GitLab, or Bitbucket. Data Governance …
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay … processors. Proven experience in SQL, dbt, and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding …
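The data-quality monitoring and validation mentioned above can be sketched as simple checks run over a batch of records, of the kind a dbt test or a validation step in a pipeline might encode. The field names and rule set are assumptions for illustration only.

```python
# Minimal data-quality checks: not-null, uniqueness, and range/type rules.
# Field names and rules are illustrative, not from any specific warehouse.
def validate_rows(rows):
    """Return a list of (row_index, problem) findings; empty means the batch passed."""
    findings = []
    seen_ids = set()
    for i, row in enumerate(rows):
        if row.get("id") is None:
            findings.append((i, "missing id"))          # not-null check
        elif row["id"] in seen_ids:
            findings.append((i, "duplicate id"))        # uniqueness check
        else:
            seen_ids.add(row["id"])
        amount = row.get("amount")
        if not isinstance(amount, (int, float)) or amount < 0:
            findings.append((i, "invalid amount"))      # range/type check
    return findings

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 5.0},    # duplicate id
    {"id": None, "amount": -2},  # missing id, negative amount
]
print(validate_rows(batch))
# [(1, 'duplicate id'), (2, 'missing id'), (2, 'invalid amount')]
```

In practice these rules would run inside the warehouse (e.g. as dbt `unique`/`not_null` tests) rather than in application code; the sketch just shows the shape of the checks.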
hands-on experience with Power Query, DAX, tabular model design, visualization best practices. Proficiency in optimizing report performance and implementing advanced features. Data Modelling: Strong knowledge of star/snowflake schemas, dimension/fact design, and row-level security for enterprise-scale tabular models. Programming & Scripting: Proficiency in SQL; familiarity with Python or other scripting languages is a plus.
of continuous improvement and innovation Build data integrations from multiple sources, including CRM, digital, and social platforms Design, implement and optimize data models with a medallion architecture using star and snowflake schema techniques to enhance query performance and support analytical workloads Ensure data quality, consistency, and reliability across all marketing datasets Collaborate with analysts and data scientists to … with new tools, technologies, and approaches in data engineering and marketing analytics YOU'LL THRIVE IN THIS ROLE IF YOU HAVE THE FOLLOWING SKILLS AND QUALITIES: Significant experience with Snowflake, dbt, Python, and Dagster, Airflow or a similar orchestration tool Knowledge of additional technologies is a plus: Azure, Microsoft SQL Server, Power BI Strong proficiency in SQL & Python Familiarity with additional …
is a Remote role with a few in-person meetings in shared co-working spaces on an ad hoc basis. Role Description We are looking for an SQL Developer (Snowflake), specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building … semantic models and supporting analytics. Key Responsibilities: Develop and optimise complex SQL queries, views, and stored procedures in Snowflake. Design and maintain efficient ETL/ELT pipelines using modern data integration platforms. Create and manage Python-based stored procedures in Snowflake to support advanced transformations and automation. Build and maintain Power BI datasets, data models, and semantic … business intelligence needs. Work closely with stakeholders to understand data requirements and translate them into scalable technical solutions. Ensure data quality, consistency, and performance across environments. Monitor and tune Snowflake performance, storage, and compute usage. Implement best practices in data modelling, schema design, and cloud architecture. Collaborate on CI/CD and automation initiatives for data deployments. Maintain …
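Snowflake's Python stored procedures wrap a Python handler that a `CREATE PROCEDURE … LANGUAGE PYTHON` statement registers. A real handler receives a Snowpark session and reads/writes tables; the sketch below keeps only the pure transformation logic so it can run anywhere, and all record fields are invented for the example.

```python
# Core transformation logic of the kind a Snowflake Python stored procedure
# might wrap. The session-dependent parts (reading/writing tables via Snowpark)
# are deliberately omitted; the data here is plain Python dicts.
def normalise_customers(rows):
    """Drop rows without an email; lower-case emails, upper-case country codes."""
    out = []
    for row in rows:
        if not row.get("email"):
            continue  # simple filter step
        out.append({
            "email": row["email"].strip().lower(),
            "country": (row.get("country") or "").strip().upper(),
        })
    return out

raw = [
    {"email": "  Alice@Example.com ", "country": " gb"},
    {"email": "", "country": "US"},           # dropped: no email
    {"email": "bob@example.com", "country": None},
]
print(normalise_customers(raw))
```

Keeping the transformation as a pure function like this also makes the procedure body unit-testable outside the warehouse, which is a common motivation for the pattern.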
non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: •Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). •Participate in the design, implementation, and maintenance of data warehouses, ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS •Educational Background: Bachelor's or Master … Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. •Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. •Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts; experience gathering requirements … R; experience with machine learning algorithms and techniques is a plus. •Experience in building and maintaining APIs for data integration and delivery. •Experience with data warehouse platforms such as Snowflake is a plus. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in …
databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) for efficient data storage and retrieval. Data Warehousing: Experience with data warehousing solutions, such as Amazon Redshift, Google BigQuery, Snowflake, or Azure Synapse Analytics, including data modelling and ETL processes. ETL Processes: Proficient in designing and implementing ETL (Extract, Transform, Load) processes using tools like Apache NiFi, Talend, or … Pipeline Orchestration: Experience with workflow orchestration tools such as Apache Airflow or Prefect to manage and schedule data pipelines. Data Modelling: Strong understanding of data modelling concepts (e.g., star schema, snowflake schema) and best practices for designing efficient and scalable data architectures. Data Quality and Governance: Knowledge of data quality principles and experience implementing data governance practices …
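The workflow orchestration mentioned above (Airflow, Prefect) boils down to running tasks as a dependency graph (DAG) in a valid order. The core idea can be sketched without any orchestrator installed, using the standard library; the task names are invented for illustration.

```python
from graphlib import TopologicalSorter

# Orchestration in miniature: each task maps to the set of tasks it depends on,
# forming a DAG, and the "scheduler" runs them in dependency order.
tasks = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

run_log = []
def run(task):
    run_log.append(task)  # a real orchestrator would execute the task here

# static_order() yields each task only after all of its dependencies.
for task in TopologicalSorter(tasks).static_order():
    run(task)
print(run_log)  # ['extract', 'transform', 'load', 'report']
```

Real orchestrators add scheduling, retries, and parallelism on top of this ordering, but the DAG-of-tasks model is the same.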
Lambda, IAM, Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment Exposure …
City of London, London, England, United Kingdom Hybrid / WFH Options
Avanti
and workflow automation Experience with AWS data tools (e.g. Redshift, Glue, Lambda, S3) and infrastructure tools such as Terraform Understanding of data modelling concepts (e.g. dimensional models, star/snowflake schemas) Knowledge of data quality, access controls, and compliance frameworks Nice to Have Experience with orchestration or pipeline frameworks like Airflow or dbt Familiarity with BI platforms (e.g. Power …
querying, performance tuning, CTEs, joins, aggregations. Data visualization: UX design, storytelling, accessibility. Data profiling and quality techniques. Python (preferred): data exploration, automation, integration. Knowledge of data modeling, star/snowflake schemas, semantic models. Project leadership, prioritization, and cross-team coordination. Stakeholder engagement and communication skills, especially with senior leaders. Experience mentoring analysts and improving analytical maturity. Understanding of data …
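The SQL skills listed above (CTEs, joins, aggregations) can be shown in one small, self-contained query: a CTE pre-aggregates per group, then the outer query filters on the aggregate. The schema is invented for the example, and sqlite stands in for the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 100.0), (2, 'acme', 50.0), (3, 'globex', 75.0);
""")

# The CTE computes per-customer totals; the outer query keeps big spenders.
# (Filtering on an aggregate this way avoids a HAVING clause cluttering the
# main query and keeps the aggregation reusable.)
rows = conn.execute("""
    WITH totals AS (
        SELECT customer, SUM(amount) AS total, COUNT(*) AS n_orders
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total, n_orders
    FROM totals
    WHERE total > 80
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('acme', 150.0, 2)]
```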
tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. Cloud …
London, England, United Kingdom Hybrid / WFH Options
Boost Talent ltd
Dash for data visualisation Strong experience with AWS or equivalent Spark or Kafka (experience with one or the other expected) Proficiency in data modelling techniques such as Star or Snowflake schema Experience designing and deploying transformation pipelines in production Passion for data quality, including implementing API checks and robust validation logic Exposure to automation scripting in Python to …
We are looking for a Data Engineer, specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building semantic models and supporting analytics. Please only apply if you are confident with the above. This is a Remote role with a few in … scalability. We collaborate closely with clients to achieve their desired results, covering Retail, Out of Home, E-Commerce, and Field Sales. Key Responsibilities: Design and optimize ETL pipelines in Snowflake and Azure Data Factory to streamline data integration and transformation. Build and manage semantic data models in Snowflake and Power BI to support scalable, user-friendly analytics and … reporting. Develop Snowflake stored procedures using Python to automate workflows and handle complex data transformations. Ensure data integrity and accessibility within Snowflake to support effective data warehousing operations. Collaborate with analytics and business teams to align data models with reporting needs and business goals. Qualifications: Strong experience in Data Engineering, with a focus on data modelling, ETL, and …
comfortable with CLI tools. Strong analytical thinking, problem-solving and communication skills. Growth mindset, curiosity, and ability to quickly learn new services. Understanding of data warehousing concepts, star/snowflake schemas, and DAX. Basic knowledge of MLOps or prompt engineering for generative AI. Benefits Are you ready to join the Data & AI Apps team at Nasstar? • AI is in …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self Service BI Tools and Data Science/ML platform using Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
data migration projects and cloud transformation Deep understanding of enterprise architecture frameworks and methodologies Experience with financial risk systems and analytics platforms Strong technical background in database platforms (Oracle, Snowflake) Understanding of cloud architecture principles (AWS preferred) Excellent communication and stakeholder management skills Ability to translate complex technical concepts for non-technical stakeholders PREFERRED CERTIFICATIONS Cloud platform certifications (AWS … DAMA) Project/Program Management certifications (PMP, Prince2, Agile/Scrum) AI/ML certifications or specialized training TECHNICAL SKILLS Enterprise systems architecture AWS and cloud technologies Oracle and Snowflake platforms Data modeling and design Data lineage and governance tools ETL/ELT processes Big data and analytics platforms Financial risk systems and methodologies Programming knowledge (Python, SQL, etc. … DevOps and CI/CD principles Knowledge graph and semantic web technologies Machine learning operations (MLOps) AI-focused data catalog tools Schema registry and metadata management systems Feature store architecture for machine learning PIMCO follows a total compensation approach when rewarding employees which includes a base salary and a discretionary bonus. Base salary is the fixed component of compensation …
Liverpool, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
Partners Data Services Team? Investment in Data: We recognise the power of data and are fully committed to growing a best-in-class team. Cutting-Edge Tech: Work with Snowflake, Power BI, and AI-driven solutions to shape the future of data analytics. People-First Culture: A collaborative, flexible, and supportive work environment. Career Growth Opportunities: Continuous learning and … developing, and maintaining our MS SQL Server Data Warehouses and associated data feeds into and out of the warehouses, and developing on our new modern cloud data platform, requiring Snowflake, dbt and Azure Data Factory experience. Our data platforms support regulatory requirements, business intelligence & reporting needs and numerous system integrations. This role requires strong technical proficiency and a … of data engineering and data warehousing principles and practices. You will be critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure BLOB Storage for data storage, and GitHub for version control and collaboration. As a Data Engineer …
City of London, London, United Kingdom Hybrid / WFH Options
Signify Technology
or Scala to support data transformation, orchestration, and integration tasks. Work with cloud platforms like AWS to deploy, monitor, and manage data services. Utilise tools such as DBT and Snowflake for data modeling, transformation, and warehouse management. Collaborate with analysts, data scientists, and business stakeholders to ensure data accuracy, consistency, and availability. Apply strong analytical thinking and problem-solving … on expertise in Spark, Kafka, and other distributed data processing frameworks. Solid programming skills in Python Strong familiarity with cloud data ecosystems, especially AWS. Strong knowledge of DBT and Snowflake Strong problem-solving mindset with the ability to diagnose and resolve technical challenges. Excellent communication skills and the ability to work effectively in a cross-functional team.
London, England, United Kingdom Hybrid / WFH Options
Native Instruments
If you're looking to contribute, grow, and work on diverse data challenges, join us! Your Contribution Design and develop efficient and scalable data models and data marts (e.g. star schema, snowflake schema) using best practices for data warehousing and business intelligence that are optimized for self-service analytics tools Collaborate with business stakeholders (e.g. finance, marketing, operations …
Cambridge, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Hybrid (2-3 days onsite per week) Duration: Long Term B2B Contract Job Description: The ideal candidate will have a minimum of 5+ years of experience working with Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines using various resources. Proficiency in Snowflake data warehouse architecture. Design, build, and optimize ETL/ELT pipelines using … processing. Leverage Python to create automation scripts and optimize data processing tasks. Proficiency in SQL performance tuning and query optimization techniques using Snowflake. Troubleshoot and optimize DBT models and Snowflake performance. Knowledge of CI/CD and version control (Git) tools. Experience with orchestration tools such as Airflow. Strong analytical and problem-solving skills with the ability to work … reliability, and consistency across different environments. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions. Certification in AWS, Snowflake, or DBT is a plus.
site, working closely with engineers, analysts, and business stakeholders. About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing dbt for transformation and modelling Azure for cloud infrastructure and orchestration Fivetran for automated data ingestion Power BI and other modern BI tools for reporting … and visualisation What You'll Do: Design and implement scalable, well-documented data models in Snowflake using dbt Build curated, reusable data layers that support consistent KPIs and enable self-service analytics Collaborate with Power BI developers to deliver insightful, high-performance dashboards Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and … dimensional modelling, layered architecture, and data quality What We're Looking For: Strong experience in data modelling and SQL Hands-on experience with dbt and cloud data platforms like Snowflake or Azure Synapse Analytics Solid understanding of modern data stack principles, including layered modelling and data warehouse design Excellent communication skills and the ability to work with stakeholders across …
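The layered architecture described above (raw data cleaned in a staging layer, then exposed as curated, reusable KPIs) can be shown in miniature. In dbt each layer would be its own model file compiled to a view or table; here sqlite views stand in, and all names are illustrative.

```python
import sqlite3

# Layered modelling in miniature: a raw table, a staging view that cleans and
# types it, and a curated ("mart") view exposing a reusable KPI.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (id INTEGER, user_name TEXT, revenue TEXT);
    INSERT INTO raw_events VALUES
        (1, ' Alice ', '10.5'), (2, 'bob', '4.5'), (3, ' Alice ', '2.0');

    -- Staging layer: trimming and type casts, one row per source row.
    CREATE VIEW stg_events AS
    SELECT id, TRIM(user_name) AS user_name, CAST(revenue AS REAL) AS revenue
    FROM raw_events;

    -- Curated layer: a consistent KPI that every dashboard can reuse,
    -- so "revenue per user" is defined once rather than in each report.
    CREATE VIEW mart_revenue_by_user AS
    SELECT user_name, SUM(revenue) AS total_revenue
    FROM stg_events
    GROUP BY user_name;
""")
rows = conn.execute(
    "SELECT user_name, total_revenue FROM mart_revenue_by_user ORDER BY user_name"
).fetchall()
print(rows)  # [('Alice', 12.5), ('bob', 4.5)]
```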