and relational databases (e.g., MySQL, PostgreSQL, MS SQL Server). Experience with NoSQL databases (e.g., MongoDB, Cassandra, HBase). Familiarity with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Hands-on experience with ETL frameworks and tools (e.g., Apache NiFi, Talend, Informatica, Airflow). Knowledge of big data technologies (e.g., Hadoop, Apache Spark, Kafka). Experience with … tools (e.g., Docker, Kubernetes) for building scalable data systems. Knowledge of version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence). Understanding of data modeling concepts (e.g., star schema, snowflake schema) and how they relate to data warehousing and analytics. Knowledge of data lakes, data warehousing architecture, and how to design efficient and scalable storage solutions. …
Management: Design, build, and maintain a scalable and secure data lake on Azure. Design and build scalable ETL pipelines, establish data ingestion, transformation, and storage patterns. Architect data warehouse schemas (Star, Snowflake, Galaxy). Optimize data lake performance and ensure data quality and integrity. Establish balancing and reconciliation queries to ensure the data warehouse is stable after failures, numbers and … communication, and interpersonal skills. Strong analytical and problem-solving abilities. Experience in managing globally distributed teams of data engineers as staff and contractors. Desired Skills: Experience with Databricks, Snowflake, or Azure Synapse/Fabric. Experience in Extract, Transform and Load (ETL) tools like Fivetran or others like Azure Data Factory, AWS Glue, Apache Airflow. Hands-on experience in … architecting data warehouses using Star, Snowflake or Galaxy schemas. Proficiency with at least one cloud platform (Azure, AWS, or GCP). Experience in data security. Strong understanding of relational and NoSQL databases (MySQL, PostgreSQL, SQL Server, MongoDB). Advanced SQL skills. Experience in data dictionary, data quality and lineage tools. Experience in machine learning frameworks: PyTorch, MLflow, and/or TensorFlow. Experience with …
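The balancing-and-reconciliation responsibility above can be illustrated with a minimal sketch: compare row counts and a control total between a staging table and the warehouse fact table after each load. The table names, measure, and connection details are hypothetical placeholders, not anything specified by the posting.

```python
# Minimal reconciliation sketch: compare row counts and a control total
# between a hypothetical staging table and the warehouse fact table.
import snowflake.connector  # pip install snowflake-connector-python

RECON_SQL = """
    SELECT
        (SELECT COUNT(*)          FROM staging.orders_raw) AS src_rows,
        (SELECT COUNT(*)          FROM dw.fact_orders)     AS dw_rows,
        (SELECT SUM(order_amount) FROM staging.orders_raw) AS src_amount,
        (SELECT SUM(order_amount) FROM dw.fact_orders)     AS dw_amount
"""

def reconcile(conn) -> bool:
    """Return True when counts and control totals balance after a load."""
    with conn.cursor() as cur:
        src_rows, dw_rows, src_amount, dw_amount = cur.execute(RECON_SQL).fetchone()
    balanced = src_rows == dw_rows and src_amount == dw_amount
    if not balanced:
        print(f"Out of balance: rows {src_rows} vs {dw_rows}, "
              f"amount {src_amount} vs {dw_amount}")
    return balanced

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="ANALYTICS",
    )
    try:
        print("Load balanced:", reconcile(conn))
    finally:
        conn.close()
```

In practice the comparison would tolerate rounding on monetary totals and would run automatically after every load so that a failed or partial load is caught before reports are refreshed.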
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay … processors. Proven experience in SQL, dbt and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding …
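As an illustration of the dbt-on-Snowflake modelling described above, here is a minimal dbt Python model (dbt SQL models are the more common form; dbt also supports Python models on Snowflake via Snowpark). The upstream `stg_orders` model and the column names are hypothetical placeholders.

```python
# models/marts/fct_daily_orders.py -- a minimal dbt Python model for Snowflake.
# dbt passes its context object and an active Snowpark session into model().
from snowflake.snowpark.functions import col, sum as sum_

def model(dbt, session):
    dbt.config(materialized="table")

    # Reference an upstream staging model; on Snowflake this is a Snowpark DataFrame.
    orders = dbt.ref("stg_orders")

    # Aggregate to one row per order date for completed orders only.
    daily = (
        orders
        .filter(col("status") == "complete")
        .group_by(col("order_date"))
        .agg(sum_(col("amount")).alias("total_amount"))
    )
    return daily
```

Running `dbt build` would materialise the returned DataFrame as a table in the target Snowflake schema, with lineage back to `stg_orders` tracked by dbt.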
Build and optimise data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimise workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay … DBT) Extensive experience in building and managing data transformations with dbt, with experience in optimising complex transformations and documentation. Hands-on experience with popular cloud data warehouses such as Snowflake or Redshift, including knowledge of performance optimisation, data modelling, and query tuning. Highly proficient in data analysis tools and languages (e.g., SQL, Python). Strong understanding of data modelling …
Bracknell, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
developing, and maintaining our MS SQL Server Data Warehouses and associated data feeds into and out of the warehouses, and developing on our new modern cloud data platform, requiring Snowflake, dbt and Azure Data Factory experience. Our data platforms support regulatory requirements, business intelligence & reporting needs and numerous system integrations. This role requires strong technical proficiency and a … of data engineering and data warehousing principles and practices. You will be critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure BLOB Storage for data storage, and GitHub for version control and collaboration. The role will be … skilled team. An understanding of the wealth management industry, including products & services and the associated data, is a plus. Key Responsibilities • Design, develop, and implement data warehouse solutions using Snowflake, Azure, and MS SQL Server. • Develop data models and database schemas that support reporting and analytics needs. • Extensive use of and fully conversant in SQL. • Experience working with programming …
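One ingestion pattern implied by the stack above - files landed in Azure Blob Storage and bulk-loaded into Snowflake - can be sketched as below. This assumes an external stage has already been created over the Blob container; the stage, table, and connection names are hypothetical.

```python
# Load CSV files landed in Azure Blob Storage into a Snowflake raw table,
# assuming an external stage (@raw.azure_landing) already points at the container.
import snowflake.connector

COPY_SQL = """
    COPY INTO raw.client_positions
    FROM @raw.azure_landing/positions/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
    ON_ERROR = 'ABORT_STATEMENT'
"""

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
try:
    with conn.cursor() as cur:
        cur.execute(COPY_SQL)
        # Each result row describes one loaded file (name, status, rows parsed, ...).
        for row in cur.fetchall():
            print(row)
finally:
    conn.close()
```

In an Azure Data Factory orchestration, the same COPY statement would typically sit behind a pipeline activity so the load runs alongside the rest of the dbt and warehouse steps.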
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Professional Experience: 3–5 years in Data Engineering, Data Warehousing, or programming within a dynamic (software) project environment. Data Infrastructure and Engineering Foundations: Data Warehousing: Knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: Proficiency in … Microsoft Azure or AWS. Programming and Scripting: Programming Languages: Strong skills in Python and SQL. Data Modeling and Query Optimization: Data Modeling: Designing star/snowflake schemas and understanding normalization and denormalization. SQL Expertise: Writing efficient queries and optimizing for performance. DevOps and CI/CD: Version Control: Using Git and platforms like GitHub, GitLab, or Bitbucket. Data Governance …
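To make the ETL/ELT pipeline expectation above concrete, here is a minimal Apache Airflow DAG using the TaskFlow API (Airflow 2.4+ for the `schedule` argument). The source data, transformation rule, and load step are placeholders rather than anything specified in the posting.

```python
# Minimal daily extract -> transform -> load DAG using Airflow's TaskFlow API.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def daily_sales_pipeline():

    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw records from a source system or API.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder: apply cleaning / business rules.
        return [r for r in records if r["amount"] > 0]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write to the warehouse (e.g. via a Snowflake or Postgres hook).
        print(f"Loading {len(records)} records")

    load(transform(extract()))

daily_sales_pipeline()
```

The same three-step shape maps directly onto dbt (transform) or Dagster (assets); the orchestrator mainly changes how dependencies and schedules are declared.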
and implementing data-led approaches. - Ensure high-quality, accurate, and professional outputs that drive real business decisions. Key Experience: - Proficiency in SQL for data transformation, analysis and problem-solving. - Snowflake, dbt, Data Modelling, Data Vault, Data Warehousing and GitHub. - Understanding of version control systems, continuous integration pipelines, and service-oriented architecture. - Understanding the need for different and appropriate design …
Role: Snowflake Developer Location: New York (Hybrid) Position Type: Contract on W2 Only Experience Required: 8+ Years of Experience Eligible Visas: USC/GC/H4-EAD We are seeking an experienced Senior Snowflake Developer who will be responsible for designing, developing, and maintaining our data warehouse solutions on the Snowflake platform. The ideal candidate should also … be proficient in SQL and have a strong background in SQL Server Integration Services (SSIS) for data integration and ETL processes. Key Responsibilities: Design, develop, and optimize Snowflake-based data warehouse solutions to meet business requirements. Collaborate with stakeholders to gather and analyze data requirements, translating them into technical specifications. Develop and maintain ETL processes using SQL and SSIS … to ensure efficient data integration and migration. Implement data pipelines and workflows in Snowflake to support data analytics and reporting needs. Monitor and tune Snowflake performance to ensure optimal data processing and query execution. Ensure data quality, consistency, and integrity across the data warehouse. Provide technical support and troubleshooting for Snowflake and SSIS-related issues. Stay current …
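As a sketch of the performance-monitoring duty above, the query below pulls the slowest recent statements from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view as tuning candidates. Connection details are placeholders, and reading ACCOUNT_USAGE requires the appropriate privileges.

```python
# Report the ten slowest successful queries from the last 24 hours.
import snowflake.connector

SLOW_QUERIES_SQL = """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_seconds,
           LEFT(query_text, 120)     AS query_snippet
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
      AND execution_status = 'SUCCESS'
    ORDER BY total_elapsed_time DESC
    LIMIT 10
"""

conn = snowflake.connector.connect(
    account="my_account", user="dba_user", password="***", warehouse="ADMIN_WH",
)
try:
    with conn.cursor() as cur:
        for query_id, wh, seconds, snippet in cur.execute(SLOW_QUERIES_SQL):
            print(f"{seconds:8.1f}s  {wh:<12}  {query_id}  {snippet}")
finally:
    conn.close()
```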
non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: • Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). • Participate in the design, implementation, and maintenance of data warehouses ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS • Educational Background: Bachelor's or Master … Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. • Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. • Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts; experience gathering requirements … R; experience with machine learning algorithms and techniques is a plus. • Experience in building and maintaining APIs for data integration and delivery. • Experience with data warehouse platforms such as Snowflake is a plus. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in …
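A small illustration of the dimensional modelling mentioned above: splitting a flat extract into two dimensions and a fact table keyed on surrogate keys (a star schema). The input columns are hypothetical, and pandas stands in for whichever warehouse the model would actually live in.

```python
# Split a flat sales extract into a star schema: two dimensions plus a fact table.
import pandas as pd

flat = pd.DataFrame({
    "customer_id":   ["C1", "C2", "C1"],
    "customer_name": ["Acme", "Globex", "Acme"],
    "product_id":    ["P1", "P1", "P2"],
    "product_name":  ["Widget", "Widget", "Gadget"],
    "sale_date":     pd.to_datetime(["2024-01-02", "2024-01-02", "2024-01-03"]),
    "amount":        [100.0, 250.0, 80.0],
})

def build_dimension(df: pd.DataFrame, natural_key: str, attrs: list[str], key_name: str) -> pd.DataFrame:
    """Deduplicate the dimension attributes and assign a surrogate key."""
    dim = df[[natural_key, *attrs]].drop_duplicates().reset_index(drop=True)
    dim[key_name] = dim.index + 1
    return dim

dim_customer = build_dimension(flat, "customer_id", ["customer_name"], "customer_key")
dim_product  = build_dimension(flat, "product_id", ["product_name"], "product_key")

# The fact table keeps only surrogate keys, the grain columns, and the measures.
fact_sales = (
    flat.merge(dim_customer[["customer_id", "customer_key"]], on="customer_id")
        .merge(dim_product[["product_id", "product_key"]], on="product_id")
        [["customer_key", "product_key", "sale_date", "amount"]]
)
print(fact_sales)
```

A snowflake schema would go one step further and normalise the dimensions themselves, for example splitting the product dimension into product and category tables.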
databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) for efficient data storage and retrieval. Data Warehousing: Experience with data warehousing solutions, such as Amazon Redshift, Google BigQuery, Snowflake, or Azure Synapse Analytics, including data modelling and ETL processes. ETL Processes: Proficient in designing and implementing ETL (Extract, Transform, Load) processes using tools like Apache NiFi, Talend, or … Pipeline Orchestration: Experience with workflow orchestration tools such as Apache Airflow or Prefect to manage and schedule data pipelines. Data Modelling: Strong understanding of data modelling concepts (e.g., star schema, snowflake schema) and best practices for designing efficient and scalable data architectures. Data Quality and Governance: Knowledge of data quality principles and experience implementing data governance practices …
of continuous improvement and innovation Build data integrations from multiple sources, including CRM, digital, and social platforms Design, implement and optimize data models with a medallion architecture using Star Schema and Snowflake techniques to enhance query performance and support analytical workloads Ensure data quality, consistency, and reliability across all marketing datasets Collaborate with analysts and data scientists to … with new tools, technologies, and approaches in data engineering and marketing analytics YOU’LL THRIVE IN THIS ROLE IF YOU HAVE THE FOLLOWING SKILLS AND QUALITIES: Significant experience with Snowflake, dbt and Python; Dagster, Airflow or a similar orchestration tool Knowledge of additional technologies is a plus: Azure, Microsoft SQL Server, Power BI Strong proficiency in SQL & Python Familiarity with additional …
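A minimal sketch of the medallion-style modelling and orchestration named above, expressed as Dagster software-defined assets. The bronze/silver/gold asset names and the in-memory data are illustrative placeholders, not the employer's actual pipeline.

```python
# Bronze -> silver -> gold marketing data modelled as Dagster software-defined assets.
import pandas as pd
from dagster import Definitions, asset

@asset
def bronze_campaign_events() -> pd.DataFrame:
    # Placeholder for raw ingestion from CRM / digital / social sources.
    return pd.DataFrame({
        "campaign": ["spring", "spring", None],
        "clicks":   [120, 45, 10],
    })

@asset
def silver_campaign_events(bronze_campaign_events: pd.DataFrame) -> pd.DataFrame:
    # Cleanse: drop rows with missing campaign identifiers.
    return bronze_campaign_events.dropna(subset=["campaign"])

@asset
def gold_campaign_summary(silver_campaign_events: pd.DataFrame) -> pd.DataFrame:
    # Aggregate into an analytics-ready summary table.
    return silver_campaign_events.groupby("campaign", as_index=False)["clicks"].sum()

defs = Definitions(assets=[bronze_campaign_events, silver_campaign_events, gold_campaign_summary])
```

Dagster infers the bronze → silver → gold dependency graph from the function parameters, which is what makes the medallion layers easy to schedule and monitor as one lineage.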
Insights. Our goal is to deliver tangible business benefits by helping our clients manage, monetise, and leverage their data effectively. As a trusted partner of Microsoft, Informatica, AWS, and Snowflake, we're poised for growth. We’re dedicated to fostering a diverse, equitable, and inclusive workplace. We believe that diversity drives innovation and fosters creativity. We actively promote diversity … Power BI. Experience with Azure-based data services (Azure Data Lake, Synapse, Data Factory, Fabric) and their integration with Power BI. Knowledge of data modelling techniques including star/snowflake schema design for BI solutions. Understanding of DevOps/DataOps principles as applied to Power BI CI/CD and workspace automation. What You Will Get in Return …
growing data team and lead the design and implementation of scalable, secure, and high-performance data solutions. You’ll play a key role in architecting modern data platforms using Snowflake, SQL, Python, and leading cloud technologies to support advanced analytics, reporting, and machine learning initiatives across the business. Key Responsibilities: Design and maintain end-to-end data architectures, data … models, and pipelines in Snowflake and cloud platforms (AWS, Azure, or GCP). Develop and optimize scalable ELT/ETL processes using SQL and Python. Define data governance, metadata management, and security best practices. Collaborate with data engineers, analysts, product managers, and stakeholders to understand data needs and translate them into robust architectural solutions. Oversee data quality, lineage, and … cost-efficiency, and system reliability. Required Skills & Experience: Proven experience as a Data Architect or Senior Data Engineer working on cloud-native data platforms. Strong hands-on experience with Snowflake – data modeling, performance tuning, security configuration, and data sharing. Proficiency in SQL for complex querying, optimization, and stored procedures. Strong coding skills in Python for data transformation, scripting, and …
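As a sketch of the Snowflake security-configuration and governance work described above, the snippet below provisions a read-only analyst role with warehouse and schema access. Role, warehouse, database, and schema names are hypothetical placeholders.

```python
# Provision a read-only analyst role in Snowflake (RBAC), using placeholder names.
import snowflake.connector

GRANT_STATEMENTS = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO ROLE SYSADMIN",
]

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***", role="SECURITYADMIN",
)
try:
    with conn.cursor() as cur:
        for stmt in GRANT_STATEMENTS:
            cur.execute(stmt)
finally:
    conn.close()
```

In a production platform these statements would normally live in version-controlled scripts or Terraform rather than being run ad hoc, so that access changes are reviewed and auditable.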
London (City of London), South East England, United Kingdom
Levy Search
growing data team and lead the design and implementation of scalable, secure, and high-performance data solutions. You’ll play a key role in architecting modern data platforms using Snowflake, SQL, Python, and leading cloud technologies to support advanced analytics, reporting, and machine learning initiatives across the business. Key Responsibilities: Design and maintain end-to-end data architectures, data … models, and pipelines in Snowflake and cloud platforms (AWS, Azure, or GCP). Develop and optimize scalable ELT/ETL processes using SQL and Python. Define data governance, metadata management, and security best practices. Collaborate with data engineers, analysts, product managers, and stakeholders to understand data needs and translate them into robust architectural solutions. Oversee data quality, lineage, and … cost-efficiency, and system reliability. Required Skills & Experience: Proven experience as a Data Architect or Senior Data Engineer working on cloud-native data platforms. Strong hands-on experience with Snowflake – data modeling, performance tuning, security configuration, and data sharing. Proficiency in SQL for complex querying, optimization, and stored procedures. Strong coding skills in Python for data transformation, scripting, and …
in choosing a platform, defining their data needs and migrating them to a modern cloud data environment using cloud providers such as Azure, Google Cloud Platform, Amazon Web Services, Snowflake, Databricks or Teradata. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to … Bachelor's Degree Minimum Years of Experience: 4 year(s) Preferred Qualifications: Certification(s) Preferred: Certification in one of the following cloud platforms - AWS/Azure/GCP Certification in Snowflake Certification in any ETL/ELT tool Preferred Knowledge/Skills: Demonstrates thorough knowledge of and success in both team leader and member roles within a professional services firm or … cloud providers - AWS, Azure, GCP; Implementing cloud data architecture and data integration patterns for one or more of the cloud providers (AWS Glue, Azure Data Factory, Event Hub, Databricks, Snowflake, etc.), storage and processing (Redshift, Azure Synapse, BigQuery, Snowflake); Infrastructure as code (CloudFormation, Terraform); Understanding and thorough knowledge of Data Warehousing concepts (normalization, OLAP, OLTP, Vault data model …
into technical solutions that align with organizational goals and drive value. Data Modelling, Architecture, and Governance Establish and uphold data modelling best practice and develop robust semantic models using Snowflake and Power BI. Implement monitoring and continuous optimisation of data models for performance and scalability. Support the design, governance and development of the Power BI platform infrastructure, ensuring scalability … ensure compliance with data security best practices and regulatory requirements (e.g., GDPR, HIPAA). Data Platform Integration Experience integrating Power BI with other data sources and platforms (e.g., Azure, Snowflake, SharePoint, SAP, Salesforce). Experience with REST APIs for data extraction and integration with Power BI is desirable. Innovation Stay informed about the latest Power BI features and industry trends … for diverse use cases. Hands-on, proven experience in managing, administering, and troubleshooting Power BI platforms, ensuring performance and security. Experience working with cloud data warehouse solutions such as Snowflake, BigQuery, Databricks, Redshift. Experience with version control systems like Git for managing Power BI products. Experience with the DevOps lifecycle. Experience mentoring visualisation developers. Knowledge of (re)insurance industry …
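One concrete example of the REST-API integration mentioned above is triggering a Power BI dataset refresh once the upstream Snowflake models have been rebuilt. The workspace and dataset IDs are placeholders, and acquiring the Azure AD access token is assumed to happen elsewhere (for example via MSAL).

```python
# Trigger a Power BI dataset refresh via the REST API after upstream models finish.
import requests

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
DATASET_ID   = "11111111-1111-1111-1111-111111111111"  # placeholder

def refresh_dataset(access_token: str) -> None:
    url = (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes"
    )
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {access_token}"},
        json={"notifyOption": "MailOnFailure"},
        timeout=30,
    )
    response.raise_for_status()  # HTTP 202 means the refresh was queued

# Example usage (token acquisition not shown):
# refresh_dataset(token_acquired_via_msal)
```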
London, England, United Kingdom Hybrid / WFH Options
LHV Bank
Lambda, IAM, Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment Desirable …
data pipelines using tools like Airflow, dbt, or similar Proven success in managing structured and unstructured data at scale Familiarity with cloud data platforms (e.g. AWS Redshift, Google BigQuery, Snowflake) Understanding of data modeling and schema design Experience integrating APIs and managing streaming/batch data flows Demonstrated ability to support ML/AI workflows with high-quality …
Lambda, IAM, Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment Exposure …
innovative, and supportive - ideal for someone who thrives on building solutions from scratch. 🌐 About the Project: You’ll play a key role in designing and implementing a brand-new Snowflake Data Warehouse hosted on Microsoft Azure, with a modern tech stack that includes: Snowflake for scalable cloud data warehousing dbt for data transformation and modelling Azure Data Factory … for orchestration Power BI for data visualisation and reporting 🛠️ What You’ll Do: Design, build, and maintain robust data pipelines and data models in Snowflake Integrate data from multiple structured and unstructured sources into the new platform Collaborate with Data Analysts to optimise semantic models and support self-service analytics Implement and enforce data governance, security, and access control … We’re Looking For: Proven experience in a Data Engineering or Data Development role Strong knowledge of Azure services, especially Azure Data Factory Hands-on experience (or strong understanding) of Snowflake, including pipeline development and data modelling Familiarity with dbt or similar transformation tools Experience working with both structured and unstructured data Understanding of data governance, security best practices, and …
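To illustrate the Azure Data Factory orchestration piece of the stack above, here is a minimal sketch that starts a pipeline run with the Azure SDK for Python. The subscription, resource group, factory, pipeline name, and pipeline parameter are all placeholders.

```python
# Trigger an Azure Data Factory pipeline run and report its run ID.
# Requires: pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP  = "rg-data-platform"    # placeholder
FACTORY_NAME    = "adf-dwh-loader"      # placeholder
PIPELINE_NAME   = "pl_load_snowflake"   # placeholder

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# The pipeline is assumed to expose a 'load_date' parameter.
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"load_date": "2024-01-02"},
)
print("Started pipeline run:", run.run_id)
```

In a platform like the one described, a trigger such as this would typically kick off ingestion into Snowflake, followed by dbt transformations and a Power BI dataset refresh.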