Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS). Large-scale data environment. Up to £70,000 plus benefits. FULLY REMOTE UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures, and data pipelines within a truly enterprise-scale data processing environment? Do … have a robust infrastructure background and a good understanding of the complexities that arise when moving from one system to another? Let's talk tech. The platform integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar). You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT pipelines, bringing a solid understanding of data warehousing concepts and best practices. Essentially, we want a strong Data Engineer and Snowflake enthusiast who can write solid SQL queries within Snowflake. You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow; from a cloud perspective, good AWS
ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect, or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star/snowflake, CDC), data contracts, and data cataloguing. API & Integration Fluency: Building data ingestion
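The orchestration tools named in this listing (Airflow, Prefect, Dagster) all share one core mechanic: executing tasks in dependency order. A minimal standard-library sketch of that idea, assuming a hypothetical four-task pipeline; task names are illustrative only, not any product's API:

```python
# Minimal sketch of dependency-ordered execution, the core idea behind
# orchestrators such as Airflow, Prefect, or Dagster. The pipeline and
# task names below are hypothetical illustration values.
from graphlib import TopologicalSorter
from typing import Callable


def run_pipeline(tasks: dict[str, set[str]], actions: dict[str, Callable[[], None]]) -> list[str]:
    """Execute `actions` respecting the dependency graph in `tasks`.

    `tasks` maps each task name to the set of tasks it depends on.
    Returns the order in which tasks actually ran.
    """
    order = list(TopologicalSorter(tasks).static_order())
    for name in order:
        actions[name]()  # a real orchestrator adds retries, logging, scheduling
    return order


pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load", "transform"},
}
ran: list[str] = []
actions = {name: (lambda n=name: ran.append(n)) for name in pipeline}
print(run_pipeline(pipeline, actions))  # → ['extract', 'transform', 'load', 'report']
```

For this graph the topological order is unique, so `report` always runs last, after both of its upstream tasks.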
function and work closely with senior stakeholders, including programme sponsors and business leads. Key responsibilities - Platform Engineer: Support the build and enhancement of a cloud-based data platform using Snowflake on Azure, working within an established technical framework. Develop infrastructure components and manage deployment through infrastructure-as-code practices. Collaborate with internal stakeholders to ensure the data platform design … assurance processes. Contribute to platform documentation and technical standards. There is potential for future responsibility in mentoring or overseeing junior team members. Required experience: Hands-on experience working with Snowflake, including schema development and data pipeline integration. Familiarity with Azure services, including resource provisioning and identity setup. Experience using Terraform for infrastructure deployment (ideally with cloud data services)
a unified, scalable architecture to enable improved analytics, data governance, and operational insights. Technical requirements: Substantial experience designing and implementing data solutions on Microsoft Azure. Hands-on expertise with Snowflake, including data modelling, performance optimisation, and secure data sharing practices. Proficiency in DBT (Data Build Tool), with a strong understanding of modular pipeline development, testing, and version control. Familiarity … both enterprise reporting and self-service analytics. Candidates must demonstrate experience working in Agile environments, delivering in iterative cycles aligned to business value. Tech Stack: Azure, Power BI, DBT, Snowflake. About us: esynergy is a technology consultancy and we build products, platforms and services to accelerate value for our clients. We drive measurable impact that is tightly aligned to
Senior Engineer to join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines … deliver user-centric solutions. About the Candidate The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test automation. Familiarity with AI-powered development tools
Role: Snowflake Data Architect. Location: Hove, UK (Permanent Role). Work Mode: Hybrid. Role & Responsibilities: Define and implement the end-to-end architecture of a data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake-native tools. Work with Data Governance teams to establish data lineage, data quality
We are seeking a Senior Data Engineer to lead the design, implementation, and optimization of our modern data stack. You'll work with tools like Snowflake, dbt, Airflow, and Terraform to build scalable, reliable, and modular data systems. This role will have a strong focus on enabling analytics through clean data modelling, automation, and observability - empowering domain teams with … trusted, self-serve data products. What You'll Do Modern Data Platform Development Design and expand a Snowflake-based data platform, incorporating modular design principles. Orchestrate complex workflows using Apache Airflow and containerized jobs on Docker and Kubernetes. Use Terraform to define infrastructure as code for consistent, version-controlled deployments. Pipeline Engineering & Automation Design robust ELT pipelines using Python … Spark, and dbt, processing structured and semi-structured data (e.g., JSON, Parquet). Automate ingestion and transformation layers while enforcing data contracts, quality rules, and schema validation. Build reusable, testable modules to reduce development effort and improve standardization. Data Modelling & Analytics Enablement Own the semantic layer using dbt for transformation and modelling in Snowflake, including SCD and dimensional
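The SCD modelling this listing mentions is usually a Type 2 slowly changing dimension: when an attribute changes, the current row is expired and a new current row is inserted, preserving history. In practice this is done with dbt snapshots or a Snowflake MERGE; the plain-Python sketch below only illustrates the semantics, with hypothetical table and column names:

```python
# Hypothetical sketch of Type 2 slowly changing dimension (SCD) logic:
# expire the current row on change, insert a new current row. In real
# stacks this lives in dbt snapshots or a Snowflake MERGE statement.
from datetime import date


def scd2_upsert(dim: list[dict], key: str, record: dict, as_of: date) -> None:
    """Apply `record` to dimension table `dim` with Type 2 semantics."""
    current = next(
        (row for row in dim if row[key] == record[key] and row["valid_to"] is None),
        None,
    )
    if current is not None:
        changed = any(current[k] != v for k, v in record.items() if k != key)
        if not changed:
            return  # no attribute change: keep the current row as-is
        current["valid_to"] = as_of  # expire the old version
    dim.append({**record, "valid_from": as_of, "valid_to": None})


dim: list[dict] = []
scd2_upsert(dim, "customer_id", {"customer_id": 1, "tier": "silver"}, date(2024, 1, 1))
scd2_upsert(dim, "customer_id", {"customer_id": 1, "tier": "gold"}, date(2024, 6, 1))
print(dim[0]["valid_to"], dim[1]["valid_to"])  # 2024-06-01 None
```

After the second upsert the dimension holds two rows for the same customer: the expired "silver" version and the current "gold" one, which is exactly the history Type 2 preserves.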
the role is fully remote. For location-specific details, please connect with our recruiting team. What You Will Do: Model usage-based economics - Build and maintain dbt models in Snowflake that translate raw product telemetry into ARR, margin, and unit economics views. Forecast revenue & costs - Design Python-driven Monte Carlo and statistical models to project usage, revenue, and gross … testing, reconciliation, and data quality SLAs so Finance can audit every revenue and cost metric back to source. Mentor analysts & engineers - Provide best-practice guidance on dbt design patterns, Snowflake optimization, and Python analytical tooling. About You: 7+ years in analytics or data science roles focused on Finance (SaaS or usage-based preferred). Deep hands-on expertise with dbt
while also helping define platform standards and best practices. Key responsibilities include: Build and maintain ELT pipelines Take full ownership of data ingestion Support data modelling and architecture within Snowflake Own and evolve the dbt layer, including governance and access controls Collaborate across analytics, product, and engineering teams Contribute to platform improvements, automation, and optimisation YOUR SKILLS AND EXPERIENCE … A successful Senior Data Engineer will bring: Strong SQL skills Experience with dbt in a production environment Snowflake experience is desirable Exposure to AWS Confident mentoring peers and contributing to a collaborative, high-impact team Experience working in fast-paced, agile environments with modern data workflows THE BENEFITS: You will receive a salary of up to £55,000 depending
modelled, documented, and served across the business, working with a modern stack and high-impact teams. Key responsibilities in the role Build and maintain robust dbt models within the Snowflake data warehouse Design scalable solutions that translate business questions into reliable datasets Collaborate with teams across product, ops, marketing, and finance Improve data quality, documentation, and platform reliability Integrate … third-party sources Support self-serve analytics through Looker What They’re Looking For Previous experience in an analytics engineering role (or similar) Strong experience with SQL, dbt, and Snowflake Someone just as confident building models as they are shaping best practices Comfortable in ambiguity and proactive about making things better Bonus if you’ve worked in insurance or