be great if you have: Experience of relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you'll get in return is: 25 days' annual leave, rising to 30 days with each year of service. More ❯
or equivalent Expert experience building data warehouses and ETL pipelines Expert experience of SQL, Python, Git, dbt (incl. query efficiency and optimization) Expert experience of Cloud Data Platforms (AWS, Snowflake and/or Databricks) Qualification preferred, not mandatory Significant experience of Automation and Integration tools (Fivetran, Airflow, Astronomer or similar) Significant experience with IaC tools (Terraform, Docker, Kubernetes or similar More ❯
solid understanding of key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines. Experience working with one or more of Kafka, Snowflake, Azure Data Factory, Azure Synapse or Microsoft Fabric is highly desirable. Knowledge of data modelling and data architectures: Inmon, Kimball, Data Vault. About You A high level of drive with the More ❯
NumPy/Pandas) and SQL. Proven experience designing and building robust ETL/ELT pipelines (dbt, Airflow). Strong knowledge of data pipelining, schema design, and cloud platforms (e.g., Snowflake, AWS). Excellent communication skills and the ability to translate technical concepts for diverse audiences. Familiarity with software architecture, containerisation, and modern DevOps practices is a plus. A forward-thinking More ❯
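Several of these listings pair dbt with Airflow for ETL/ELT orchestration. As a hedged illustration only (the DAG id, schedule, and project directory below are invented for the sketch, not taken from any posting), a minimal Airflow DAG that builds dbt models and then runs dbt's data tests might look like this:

```python
# Minimal Airflow DAG sketch: orchestrating dbt ELT steps on a daily schedule.
# The dag_id, schedule, and project path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the warehouse models first.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    # Then run dbt's data tests against the freshly built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```

Keeping the run and test steps as separate tasks means a failed data test is visible in the DAG rather than buried inside a single build task.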
/AI solutions using Python, Java, or Scala. Data Engineering Specialization: Hands-on expertise with Data Lakes, SQL-based databases, and Cloud-based Data Warehousing/ETL tools like Snowflake, Redshift, BigQuery, etc. Advanced Tech Skills: Deep knowledge of Spark core internals, Delta/Iceberg, JVM optimization, and memory management, with additional proficiency in AI ecosystems like Machine Learning, Deep More ❯
with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex technical concepts for business stakeholders Strategic thinking with More ❯
model, DAX, and visualizations. • Relational and Dimensional (Kimball) data modelling • Proficiency in SQL (T-SQL, PL/SQL, Databricks SQL) Desirable: • Databricks (or Alternative Modern Data Platform such as Snowflake) • Experience working in a regulated environment and knowledge of the risk and compliance requirements associated with this. • Oracle Database • MongoDB • Cloud Data Technologies (Mainly Azure - SQL Database, Data Lake, HD More ❯
Snowflake Architect - ETL, Airflow, AWS, SQL, Python, and ETL tools (StreamSets, DBT), RDBMS A Snowflake Architect is required for a long-term project with a fast-growing company. Responsibilities: -Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS. -Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets. -Collaborate with data analysts … scientists, and other stakeholders to define and fulfill data requirements. -Optimize performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability. -Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake. -Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity. -Stay up-to-date with the latest trends and best … Cloud Services such as AWS Qualifications: -Bachelor's degree in Computer Science, Engineering, or a related field. -5+ years of experience in data engineering, with a strong focus on Snowflake and AWS. -Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.) -Hands-on experience with Oracle RDBMS -Data Migration experience to Snowflake -Experience with AWS services such as S3 More ❯
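For context on the kind of pipeline work this architect role describes, here is a minimal sketch of an S3-to-Snowflake load using the snowflake-connector-python package. Every identifier (account, credentials, stage, table) is a placeholder invented for the sketch, and the external stage is assumed to already point at the S3 bucket:

```python
# Sketch: loading raw files from S3 into Snowflake with the Python connector.
# Account, credentials, stage, and table names are placeholders, not real values.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO assumes an external stage (@raw_stage) already maps to S3.
    cur.execute("""
        COPY INTO raw.orders
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # per-file load results returned by COPY INTO
finally:
    conn.close()
```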
At DIMÁTICA, a company recognized as a Microsoft PARTNER, we specialize in the design, development, and maintenance of software solutions across multiplatform environments. We're looking for a Snowflake Specialist to join our remote team, to lead the architecture and development of scalable data pipelines on GCP and Snowflake. What You'll Do Design, develop, and optimize robust … to support maintenance and future scaling of the data ecosystem. What We're Looking For 5+ years of experience leading data engineering projects in cloud environments, ideally GCP and Snowflake. Proficiency in Python and advanced SQL, with experience designing ELT architectures using dbt. Strong knowledge of data modeling, Snowpipe Streaming, warehouse optimization, and orchestration tools like Apache Airflow. … Experience working in distributed, cross-functional teams with excellent communication and documentation skills. Understanding of data privacy regulations (GDPR, CCPA, etc.). Nice to Have Snowflake or GCP certification. Experience with Kafka, Docker, Kubernetes, and real-time data streaming. Requirements Degree in Computer Science, Data Engineering, Data Science, or a related field. High level of English (fully English More ❯
Principal Data Consultant (Snowflake/Matillion) Application Deadline: 18 June 2025 Department: Data Engineering Employment Type: Full Time Location: Bristol, UK Compensation: £85,000 - £100,000/year Description As a Principal Data Consultant, at Snap, you'll be at the forefront of our most strategic initiatives. Your role will involve leading client engagements, managing large-scale projects, and shaping … as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely … with our clients to design the correct data models to support their analytic requirements following best practices such as Kimball star schemas and snowflake schemas, ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and incorporating things More ❯
As a Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (i.e. Azure, Airflow, etc.) and automation. The Senior Data Engineer will work closely with Data Architecture, Business Analysts, and Data Stewards to integrate and align requirements, specifications and constraints of each … system reliability. Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create & maintain data pipelines using Airflow & Snowflake as primary tools Create SQL stored procedures to perform complex transformations Understand data requirements and design optimal pipelines to fulfil the use cases Creating logical & physical data models to ensure … features. · Excellent communication skills. · Strong knowledge of Python. · Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc. · In-depth knowledge of Snowflake architecture, features, and best practices. · Experience with CI/CD pipelines using Git and GitHub Actions. · Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault. More ❯
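A hedged sketch of how the Airflow-plus-Snowflake responsibilities above often fit together: an Airflow task calling a Snowflake stored procedure through the Snowflake provider package. The connection id and procedure name are hypothetical, and the procedure itself would be created separately in Snowflake:

```python
# Sketch: an Airflow task invoking a Snowflake stored procedure for a
# complex transformation step. Requires apache-airflow-providers-snowflake.
# The conn id and procedure name are invented for illustration.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    # Delegate the heavy transformation to a procedure that lives in
    # Snowflake, keeping the orchestration layer thin.
    call_transform = SnowflakeOperator(
        task_id="call_merge_orders",
        snowflake_conn_id="snowflake_default",
        sql="CALL analytics.staging.merge_orders();",
    )
```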
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay updated … processors. Proven experience in SQL, DBT and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding of More ❯
business each day. You will work with the Lead Data Engineer and other members of the Data Engineering team to deliver our new strategic enterprise data platform based on Snowflake and DBT, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation working towards the creation of a single source of truth. Development of data ingestion … hours including weekends, evenings and public holidays. Your profile Key Skills and Competency Requirements: At least 2 years' experience designing and implementing a full-scale data warehouse solution using Snowflake Excellent proficiency with Snowflake internals and integration of Snowflake with other technologies for data processing and reporting. Data Modelling using the Kimball Methodology. Experience in developing CI/ More ❯
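Since this role centres on Kimball-style warehouse design in Snowflake, here is a minimal sketch of what single-source-of-truth star-schema tables might look like. All table and column names are invented for illustration, and dim_date is assumed to be created elsewhere:

```python
# Illustrative Kimball star-schema DDL for Snowflake, held as Python constants
# so any Snowflake client can execute them. All names are hypothetical.
# Note: Snowflake records but does not enforce PK/FK constraints.
DIM_CUSTOMER_DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_sk   INTEGER AUTOINCREMENT PRIMARY KEY,  -- surrogate key
    customer_id   VARCHAR NOT NULL,                   -- natural key from source
    customer_name VARCHAR,
    region        VARCHAR
)
"""

FACT_ORDERS_DDL = """
CREATE TABLE IF NOT EXISTS fact_orders (
    order_date_sk INTEGER NOT NULL REFERENCES dim_date (date_sk),
    customer_sk   INTEGER NOT NULL REFERENCES dim_customer (customer_sk),
    quantity      INTEGER,
    net_amount    NUMBER(18, 2)  -- grain: one row per order line
)
"""
```

Surrogate keys in the dimensions keep the fact table stable when source-system identifiers change, which is the core of the Kimball approach.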
Experience with analytical and real-time/streaming data solutions. Hands-on experience with data modeling tools (e.g., Erwin, Lucidchart, SAP PowerDesigner). Knowledge of Python and SQL (e.g., Snowflake or similar warehousing technology, real-time systems). Experience with AWS services such as Lambda, SNS, S3, EKS, API Gateway. Knowledge of data warehouse design, ETL/ELT processes, and … big data technologies (e.g., Snowflake, Spark). Understanding of data governance and compliance frameworks (e.g., GDPR, HIPAA). Strong communication and stakeholder management skills. Analytical mindset with attention to detail. Leadership and mentoring abilities in data modeling best practices. Preferred Skills and Qualifications Certifications in data modeling, cloud platforms, or database technologies. Experience developing and implementing enterprise data models. Experience … with interface/API data modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and DBT. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business empowering local teams with exciting work More ❯
Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks or Snowflake Collaborating closely with engineers, analysts, and client teams to deliver value-focused data solutions We'd love to talk to you More ❯
automatically updates product fit recommendations and descriptions, powered directly by manufacturing data, helping brands reduce returns and improve their bottom line. Our Current Stack: Data Warehouse and Tools: DBT, Snowflake, Airbyte Languages: Python, TypeScript, Node.js, React Cloud & Infra: AWS (S3, ECS), Docker Orchestration: AWS Lambda and Step Functions CI/CD: GitHub Actions Analytics Layer: GoodData What You’ll Work … multi-tenant, data-heavy systems, ideally in a startup or fast-moving environment. Technical Stack: Languages/Tools: Python (REST API integrations), DBT, Airbyte, GitHub Actions Modern Data Warehousing: Snowflake, Redshift, Databricks, or BigQuery. Cloud & Infra: AWS (ECS, S3, Step Functions), Docker (Kubernetes or Fargate a bonus) Data Modelling: Strong grasp of transforming structured/unstructured data into usable models More ❯
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
effective manner. Your Toolkit The Languages You Speak: Python, SQL, the dialect of data. Libraries & Tools: Terraform, Flask, Pandas, FastAPI, Dagster, GraphQL, SQLAlchemy, GitLab, Athena. Your Trusted Companions: Docker, Snowflake, MongoDB, Relational Databases (e.g. MySQL, PostgreSQL), Dagster, Airflow/Luigi, Spark, Kubernetes. Your AWS Kingdom: Lambda, Redshift, EC2, ELB, IAM, RDS, Route53, S3 - the building blocks of cloud mastery. Your … Driven: Proficiency in designing and developing ETL/ELT pipelines for data integration and transformation. Cloud Navigator: Confidently guiding projects through the AWS ecosystem and hands-on experience with Snowflake or similar cloud-based data warehouse platforms. Dynamic Collaborator: An adept problem-solver with keen attention to detail and the ability to work in More ❯
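To make the toolkit concrete, here is a minimal, hypothetical Dagster asset graph in the spirit of this stack (Dagster plus Pandas); the source file and cleaning rules are invented for the sketch:

```python
# Sketch: a small Dagster software-defined asset pipeline using Pandas.
# File name, column names, and cleaning rules are all hypothetical.
import pandas as pd
from dagster import Definitions, asset


@asset
def raw_listings() -> pd.DataFrame:
    # In practice this might read from S3 or an API; a local CSV keeps
    # the sketch self-contained.
    return pd.read_csv("listings.csv")


@asset
def cleaned_listings(raw_listings: pd.DataFrame) -> pd.DataFrame:
    # Drop rows missing a price and normalise column names so that
    # downstream models can rely on a stable schema.
    df = raw_listings.dropna(subset=["price"])
    df.columns = [c.strip().lower() for c in df.columns]
    return df


defs = Definitions(assets=[raw_listings, cleaned_listings])
```

Declaring the dependency through the parameter name (cleaned_listings takes raw_listings) is how Dagster wires the asset graph, so lineage comes for free.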
City of London, London, United Kingdom Hybrid / WFH Options
Radley James
data warehouses) Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring Clear communicator, comfortable with cross-functional teams Desirable Experience APIs from major financial data providers dbt, Snowflake Kafka, Airflow Java feedhandler support Migration of legacy systems (e.g. MATLAB) This position offers a competitive compensation package and hybrid working model. More ❯
Dutch (written and spoken) The following is preferable but not essential for the Senior Data Engineer role: Experience in the energy sector (especially grid operators) Hands-on with GitLab, Snowflake, Tableau, DBT, Python, and/or web analytics More ❯