and relational databases (e.g., MySQL, PostgreSQL, MS SQL Server). Experience with NoSQL databases (e.g., MongoDB, Cassandra, HBase). Familiarity with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake). Hands-on experience with ETL frameworks and tools (e.g., Apache NiFi, Talend, Informatica, Airflow). Knowledge of big data technologies (e.g., Hadoop, Apache Spark, Kafka). Experience with … tools (e.g., Docker, Kubernetes) for building scalable data systems. Knowledge of version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence). Understanding of data modeling concepts (e.g., star schema, snowflake schema) and how they relate to data warehousing and analytics. Knowledge of data lakes, data warehousing architecture, and how to design efficient and scalable storage solutions.
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay … processors. Proven experience in SQL, dbt, and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding …
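To give a concrete flavour of the kind of validation process described above, here is a minimal, self-contained Python sketch of pre-publication data-quality checks; the column names and accepted values are hypothetical and not taken from the listing:

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Run basic data-quality checks and return a list of failures (empty = clean)."""
    failures = []
    # Primary-key checks: no nulls, no duplicates
    if df["order_id"].isnull().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")
    # Accepted-values check, similar in spirit to a dbt accepted_values test
    valid_statuses = {"placed", "shipped", "completed", "returned"}
    unexpected = set(df["status"].dropna().unique()) - valid_statuses
    if unexpected:
        failures.append(f"unexpected status values: {sorted(unexpected)}")
    # Simple business-rule check: order amounts must be non-negative
    if (df["amount"] < 0).any():
        failures.append("negative order amounts found")
    return failures

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 2, 4],
        "status": ["placed", "shipped", "unknown", "completed"],
        "amount": [10.0, 25.5, -3.0, 12.0],
    })
    for problem in validate_orders(sample):
        print("FAIL:", problem)
```

In practice, checks like these would usually live as dbt tests (unique, not_null, accepted_values) rather than a standalone script, but the logic is the same.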
non-technical audiences, tailoring communication style based on the audience. Data Modeling and Warehousing: •Design and implement data models optimized for analytical workloads, using dimensional modeling techniques (e.g., star schema, snowflake schema). •Participate in the design, implementation, and maintenance of data warehouses, ensuring data integrity, performance, and scalability. BASIC QUALIFICATIONS •Educational Background: Bachelor's or Master … Skills: Working knowledge of R or Python for analytics, data manipulation, and algorithm development. •Data Warehousing Knowledge: In-depth knowledge of data warehousing principles, dimensional modeling techniques (e.g., star schema, snowflake schema), and data quality management. •Communication and Collaboration Abilities: Excellent verbal and written communication skills, with the ability to effectively communicate technical concepts; experience gathering requirements … R; experience with machine learning algorithms and techniques is a plus. •Experience in building and maintaining APIs for data integration and delivery. •Experience with data warehouse platforms such as Snowflake is a plus. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in …
growing data team and lead the design and implementation of scalable, secure, and high-performance data solutions. You’ll play a key role in architecting modern data platforms using Snowflake , SQL , Python , and leading cloud technologies to support advanced analytics, reporting, and machine learning initiatives across the business. Key Responsibilities: Design and maintain end-to-end data architectures, data … models, and pipelines in Snowflake and cloud platforms (AWS, Azure, or GCP). Develop and optimize scalable ELT/ETL processes using SQL and Python. Define data governance, metadata management, and security best practices. Collaborate with data engineers, analysts, product managers, and stakeholders to understand data needs and translate them into robust architectural solutions. Oversee data quality, lineage, and … cost-efficiency, and system reliability. Required Skills & Experience: Proven experience as a Data Architect or Senior Data Engineer working on cloud-native data platforms. Strong hands-on experience with Snowflake – data modeling, performance tuning, security configuration, and data sharing. Proficiency in SQL for complex querying, optimization, and stored procedures. Strong coding skills in Python for data transformation, scripting, and …
into technical solutions that align with organizational goals and drive value. Data Modelling, Architecture, and Governance Establish and uphold data modelling best practices and develop robust semantic models using Snowflake and Power BI. Implement monitoring and continuous optimisation of data models for performance and scalability. Support the design, governance and development of the Power BI platform infrastructure, ensuring scalability … ensure compliance with data security best practices and regulatory requirements (e.g., GDPR, HIPAA). Data Platform Integration Experience integrating Power BI with other data sources and platforms (e.g., Azure, Snowflake, SharePoint, SAP, Salesforce). Experience with REST APIs for data extraction and integration with Power BI is desirable. Innovation Stay informed about the latest Power BI features and industry trends … for diverse use cases. Hands-on, proven experience in managing, administering, and troubleshooting Power BI platforms, ensuring performance and security. Experience working with cloud data warehouse solutions such as Snowflake, BigQuery, Databricks, Redshift. Experience with version control systems like Git for managing Power BI products. Experience with the DevOps lifecycle. Experience mentoring visualisation developers. Knowledge of the (re)insurance industry …
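Where the listing mentions REST APIs for data extraction and integration with Power BI, the work typically amounts to pulling JSON from a service and landing it as a tabular dataset that Power BI or Snowflake can consume. A hedged Python sketch, in which the endpoint, token handling, payload fields, and output file are placeholders rather than details from the role:

```python
import requests
import pandas as pd

API_URL = "https://api.example.com/v1/policies"  # placeholder endpoint
API_TOKEN = "..."  # in practice, supplied via a secret store, never hard-coded

def extract_policies() -> pd.DataFrame:
    """Page through a REST endpoint and flatten the JSON payload into a DataFrame."""
    rows, page = [], 1
    while True:
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"page": page},
            timeout=30,
        )
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("results", []))
        if not payload.get("next"):  # assumed pagination convention
            break
        page += 1
    return pd.json_normalize(rows)

if __name__ == "__main__":
    df = extract_policies()
    # Land as Parquet for Power BI to pick up, or stage into Snowflake from here
    df.to_parquet("policies.parquet", index=False)
```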
Lambda, IAM, Terraform, GitHub, CI/CD) Proficiency in SQL and Python for data processing and automation Experience working with data modeling tools and practices (e.g., dimensional, star/snowflake schema, dbt) Solid understanding of data governance, metadata, and quality frameworks Strong collaboration and communication skills, with the ability to work cross-functionally in an Agile environment Exposure …
innovative, and supportive - ideal for someone who thrives on building solutions from scratch. 🌐 About the Project: You’ll play a key role in designing and implementing a brand-new Snowflake Data Warehouse hosted on Microsoft Azure, with a modern tech stack that includes: Snowflake for scalable cloud data warehousing dbt for data transformation and modelling Azure Data Factory … for orchestration Power BI for data visualisation and reporting 🛠️ What You’ll Do: Design, build, and maintain robust data pipelines and data models in Snowflake Integrate data from multiple structured and unstructured sources into the new platform Collaborate with Data Analysts to optimise semantic models and support self-service analytics Implement and enforce data governance, security, and access control … What We’re Looking For: Proven experience in a Data Engineering or Data Development role Strong knowledge of Azure services, especially Azure Data Factory Hands-on experience (or strong understanding) of Snowflake, including pipeline development and data modelling Familiarity with dbt or similar transformation tools Experience working with both structured and unstructured data Understanding of data governance, security best practices, and …
We are looking for a Data Engineer, specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python and Power BI, with a strong focus on building semantic models and supporting analytics. Please only apply if you are confident with the above. This is a Remote role with a few in … scalability. We collaborate closely with clients to achieve their desired results, covering Retail, Out of Home, E-Commerce, and Field Sales. Key Responsibilities: Design and optimize ETL pipelines in Snowflake and Azure Data Factory to streamline data integration and transformation. Build and manage semantic data models in Snowflake and Power BI to support scalable, user-friendly analytics and … reporting. Develop Snowflake stored procedures using Python to automate workflows and handle complex data transformations. Ensure data integrity and accessibility within Snowflake to support effective data warehousing operations. Collaborate with analytics and business teams to align data models with reporting needs and business goals. Qualifications: Strong experience in Data Engineering, with a focus on data modelling, ETL, and …
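As a hedged illustration of the "Snowflake stored procedures using Python" responsibility, the sketch below uses the Snowpark API; the database, table, and procedure names are invented for the example, and the exact registration details depend on the account setup:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

def refresh_order_summary(session: Session) -> str:
    """Handler for a Python stored procedure: aggregate raw orders into a summary table."""
    orders = session.table("RAW.ORDERS")  # hypothetical source table
    summary = (
        orders.filter(col("STATUS") == "COMPLETED")
        .group_by("CUSTOMER_ID")
        .agg(sum_(col("AMOUNT")).alias("TOTAL_SPEND"))
    )
    # Overwrite the downstream reporting table on each run
    summary.write.mode("overwrite").save_as_table("ANALYTICS.ORDER_SUMMARY")
    return "order_summary refreshed"

# Registration (one option) -- run once from a client session:
# session.sproc.register(
#     func=refresh_order_summary,
#     name="REFRESH_ORDER_SUMMARY",
#     packages=["snowflake-snowpark-python"],
#     replace=True,
# )
```

Once registered, the procedure can be scheduled from a Snowflake task or called from an orchestrator, which is typically how such workflow automation is wired up.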
data-oriented, systematic approaches to solving problems. You have 2-3 years of experience in data management and analysis. You are familiar with SQL and Cloud Data Warehouses like Snowflake. Exposure to data models and ETL frameworks, and familiarity with DBT is a plus. You have the ability to translate business needs into meaningful dashboards and reports. You …
tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with Data warehousing and Data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. Cloud …
in relational databases (e.g., Postgres, SQL Server) Data Integration Tools: Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g., Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience: 2+ years of experience in …
setup, requirements gathering, design, development, testing, deployment. Perform hands-on development with SQL (and occasionally Python), create data visualizations in Tableau, Power BI, or similar, and design data models in Snowflake or Databricks. Lead client meetings such as stakeholder interviews and workshops. Translate complex business challenges into actionable plans with clear communication. Identify risks and issues proactively and suggest mitigation strategies. … Agile environment. Experience with client-facing activities like requirements gathering, facilitation, and presentation. 5+ years in a consulting or data-focused role with strong SQL experience on platforms like Snowflake, Databricks, Teradata, Redshift, BigQuery, MS SQL Server. Experience designing relational databases or data warehouses. Hands-on experience with business-driven/self-service BI tools like Tableau, Power BI …
and implementing innovative approaches to address business problems and solutions. Experience developing business deliverables that leverage business intelligence platforms, data management platforms, or SQL-based languages (Tableau, Business Objects, Snowflake, Hadoop, Netezza, NoSQL, ANSI SQL, or related). Proven ability to build business knowledge through meaningful partnerships at the individual contributor, leadership, and EMG levels. Demonstrated advanced communication skills … monitoring, and case management systems. Strong analytical and problem-solving skills with attention to detail. Excellent communication and collaboration skills to work with cross-functional teams. Strong expertise in Snowflake, including data modeling, writing advanced SQL, and performance tuning. Solid understanding of cloud data architecture and modern data stack concepts. Compensation range: The salary range for this position is …
with other engineers on the team to elevate technology and consistently apply best practices. DBT experience is a must, including: DBT macros, overriding DBT custom schemas, DBT materialization options, Snowflake cluster keys, and Snowflake SQL functions. Required Skills and Experience: A minimum of 7+ years of IT experience in the data engineering or data management field 4-5 years of experience in application development using SQL and PL/SQL Strong working knowledge and a minimum of 2 years' experience in DBT and Snowflake Hands-on experience with Airflow is highly recommended Strong working experience with Python, including pandas. Recent experience as a Senior Data Engineer in any public cloud, preferably on Azure, as well as on a cloud warehouse like Snowflake is … or in a similar role. Strong proficiency in Data Modelling, process metadata, observability, and monitoring data platforms. Familiarity with data modelling tools/techniques is a plus Azure or Snowflake training/certification is a plus Strong analytical, problem-solving, communication, collaborative and organizational skills Self-driven, self-directed, passionate, analytical, and focused on delivering the right results
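Since this role combines Airflow with DBT and Snowflake, a minimal orchestration sketch may help set expectations. The project path, schedule, and target are assumptions, and many teams use a dedicated dbt operator rather than BashOperator:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical dbt project location inside the Airflow workers
DBT_DIR = "/opt/airflow/dbt/analytics"

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00; older Airflow versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --target prod",
    )
    # Only test models after they have been built
    dbt_run >> dbt_test
```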
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self-Service BI Tools and a Data Science/ML platform on an Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Data and BI systems for analytics and decision making, using primarily Data Warehouse/Mart, Data Lake, Self-Service BI Tools and a Data Science/ML platform on an Oracle, Snowflake and AWS tech stack. Qualifications We are looking for a skilled Lead Data Engineer who: Personal Attributes: Self-starter with initiative and enthusiasm Thrives at solving problems with minimal … and delivery at scale (high level and detailed) and various architectural strategies Solid information architecture skills/experience: Data ingestion, and Data Warehouse and Data Lake modelling experience (Star Schema, Snowflake Schema, Data Vault etc.) Past hands-on development experience in at least one enterprise analytics database like Oracle/Teradata/SQL Server/Snowflake …
Employment Type: Permanent, Part Time, Work From Home
as Power BI, Tableau, QlikView/Qlik Sense, or similar. Strong SQL skills; experience with databases such as SQL Server, Oracle, or PostgreSQL. Knowledge of data modelling, star/snowflake schema, and DAX/MDX (if applicable). Familiarity with ETL processes and tools (e.g., SSIS, Talend, Azure Data Factory). Strong problem-solving skills and attention to …
Warrington, Cheshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
client is a start-up data consultancy looking to expand its data engineering practice to serve its growing client base. This role will require proficiency in Snowflake, Python, DBT, AWS, and SQL. Consultancy experience would be a huge plus. Salary and Benefits Competitive salary of £55k - £65k (DOE) Fully remote working 25 days annual leave And … many more! Role and Responsibilities Design and deliver data pipelines using SQL, dbt, and Python within Snowflake to transform and model data for a variety of use cases. Build robust, testable, and maintainable code that integrates data from diverse sources and formats. Work closely with analytics, product, and client teams to turn data requirements into scalable solutions. Support the … standards, development processes, and client delivery playbooks. Participate in sprint planning, reviews, and retrospectives with a lean agile delivery mindset. What do I need to apply? Strong experience with Snowflake Experience with Python, DBT, AWS, SQL. Strong stakeholder engagement. Experience integrating APIs. Data science background or some exposure to it. My client has limited interview slots and they are …
site, working closely with engineers, analysts, and business stakeholders. 🚀 About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing dbt for transformation and modelling Azure for cloud infrastructure and orchestration Fivetran for automated data ingestion Power BI and other modern BI tools for reporting … and visualisation 🧠 What You’ll Do: Design and implement scalable, well-documented data models in Snowflake using dbt Build curated, reusable data layers that support consistent KPIs and enable self-service analytics Collaborate with Power BI developers to deliver insightful, high-performance dashboards Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and … dimensional modelling, layered architecture, and data quality ✅ What We’re Looking For: Strong experience in data modelling and SQL Hands-on experience with dbt and cloud data platforms like Snowflake or Azure Synapse Analytics Solid understanding of modern data stack principles , including layered modelling and data warehouse design Excellent communication skills and the ability to work with stakeholders across …
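dbt models in Snowflake are most commonly written in SQL, but dbt (1.3 and later) also supports Python models that execute via Snowpark, which can be convenient for a curated layer. A hedged sketch of what such a model might look like; the staging model names and columns are invented for illustration:

```python
# models/curated/orders_enriched.py -- hypothetical dbt Python model (dbt >= 1.3 on Snowflake)
import snowflake.snowpark.functions as F

def model(dbt, session):
    # Materialise this curated-layer model as a table
    dbt.config(materialized="table")
    orders = dbt.ref("stg_orders")        # assumed staging models
    customers = dbt.ref("stg_customers")
    # One row per order, with customer attributes attached for reuse in BI
    enriched = (
        orders.join(customers, "CUSTOMER_ID")
              .with_column("ORDER_YEAR", F.year(F.col("ORDER_DATE")))
    )
    return enriched
```

Downstream Power BI datasets would then read from this curated table rather than from the raw sources, which is how the "curated, reusable data layers" responsibility is usually realised.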
Writing efficient and complex DAX formulas for calculated columns, measures, and KPIs. Power Query (M Language): Data transformation and shaping using Power Query Editor. Data Modeling: Designing star/snowflake schemas, relationships, and optimizing data models for performance. Data Visualisation Best Practices: Ability to design intuitive and impactful dashboards. KPI Development: Creating and tracking key performance indicators aligned with …
Principal Data Consultant (Snowflake/Matillion) Application Deadline: 18 June 2025 Department: Data Engineering Employment Type: Full Time Location: Bristol, UK Compensation: £85,000 - £100,000/year Description As a Principal Data Consultant at Snap, you'll be at the forefront of our most strategic initiatives. Your role will involve leading client engagements, managing large-scale projects, and … as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work … closely with our clients to design the correct data models to support their analytic requirements, following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and …
I'm looking for an experienced Data Engineer to join a global business who are embarking on a greenfield project to build a brand-new Snowflake Data Warehouse. They are building a modern, AI-ready data platform, which allows for better decision-making for both themselves and their customers, with a tech stack including Snowflake, dbt, Microsoft Azure … thinking and entrepreneurial environment. The purpose of this role is to build and maintain the data architecture that supports various business intelligence initiatives. This will include designing and managing Snowflake data pipelines, ensuring data flows seamlessly from multiple sources into the new data platform, and working with Data Analysts to support their data needs and optimise semantic models for … Snowflake-based reporting and analytics. More generally, you'll implement governance policies and ensure data security across the Snowflake platform, and will also maintain the performance of systems to ensure scalability and reliability. This role would be well-suited to an experienced Data Engineer or Developer who is excited by the prospect of building something from the ground up.
and security controls. Define data ownership and stewardship roles across data sets. Conduct regular data audits and drive continuous improvement initiatives. Establish data model management in line with star schema principles, organizing data into fact and dimension tables. Work with key business stakeholders to define new data sets for incorporation into the data platform. Work with the development teams … into the fact and dimension tables. Ensure that the data model is optimized for performance, working with the Development team to resolve any performance issues. Document the schema design, data sources, ETL processes, and any changes made to the schema. Provide training to team members on how to use and maintain the star schema. Work closely with … stakeholders, including data analysts, data scientists, developers and IT teams, to understand their data needs and ensure that the star schema supports their analytical requirements. Data Quality Management: Develop and implement a comprehensive data quality management strategy, setting policies and procedures to ensure data accuracy, completeness, and consistency.
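To make the fact/dimension organisation described above concrete, here is a small, self-contained Python sketch of splitting a flat extract into a dimension and a fact table; the source columns are invented, and in a real warehouse this would normally be done in SQL/ELT rather than pandas:

```python
import pandas as pd

# A flat extract, as it might arrive from a source system (columns are hypothetical)
sales = pd.DataFrame({
    "order_id":      [101, 102, 103],
    "order_date":    ["2024-01-05", "2024-01-06", "2024-01-06"],
    "customer_name": ["Acme Ltd", "Beta plc", "Acme Ltd"],
    "customer_city": ["Leeds", "Bristol", "Leeds"],
    "amount":        [250.0, 99.0, 310.0],
})

# Dimension table: one row per customer, with a surrogate key
dim_customer = (
    sales[["customer_name", "customer_city"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact table: measures plus foreign keys pointing at the dimension
fact_sales = sales.merge(dim_customer, on=["customer_name", "customer_city"])[
    ["order_id", "order_date", "customer_key", "amount"]
]

print(dim_customer)
print(fact_sales)
```

The same pattern extends to date, product, and other dimensions, and it is this separation of descriptive attributes from measures that makes the star schema easy for analysts to query.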