Data Services: Glue, EMR, Redshift, Athena, S3, Lambda, Step Functions, CloudWatch. Proficiency in Python and PySpark for ETL, transformation, and automation. Strong experience in SQL and data modeling (star schema, snowflake schema, dimensional models). Experience building scalable, high-performance data pipelines in cloud environments. Knowledge of data governance, data quality frameworks, and security best practices. Excellent communication …
/Oracle/SQL Experience with microservices and API design Superior problem-solving skills and debugging capabilities Knowledge of relational and multidimensional data design and development techniques including star schema, snowflake schema, cube design, ETL and others Background in data warehousing concepts, ETL development, data modeling, metadata management and data quality Bachelor's degree in Computer Science … similar degree, or equivalent experience What would be great to have: Experience in the finance/banking sector Ability to design and implement effective analytics solutions and models with Snowflake Experience in building distributed, service-oriented, microservices-style and cloud-based application architectures Experience in automation testing, mock frameworks, virtual services, performance testing and pipeline tools like Jenkins or …
loading (ETL) processes, leveraging tools such as Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi, or scripting languages (Python, PySpark, SQL). Understanding of data warehousing and data modelling techniques (Star Schema, Snowflake Schema). Familiarity with security frameworks (GDPR, HIPAA, ISO 27001, NIST, SOX, PII) and AWS security features (IAM, KMS, RBAC). Strong analytical skills to assess …
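Since star/snowflake schema modelling recurs throughout these requirements, a minimal star-schema sketch may help make the term concrete. It uses Python's built-in sqlite3 module; the table and column names are illustrative assumptions, not taken from any listing, and a warehouse such as Redshift or Snowflake would be the real target in practice:

```python
import sqlite3

# In-memory database for illustration; a cloud warehouse would be used in practice.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes; the fact table holds measures
# plus foreign keys into each dimension -- the defining shape of a star schema.
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    amount       REAL
);
""")
cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "EMEA"), (2, "Globex", "APAC")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(10, "Widget", "Hardware"), (11, "Support", "Services")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(100, 1, 10, 250.0), (101, 1, 11, 99.0), (102, 2, 10, 250.0)])

# A typical star-schema query: join the fact table to a dimension and aggregate.
rows = cur.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer c USING (customer_key)
    GROUP BY c.region ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 250.0), ('EMEA', 349.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalised (e.g. `dim_product` referencing a separate `dim_category` table).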
PySpark, and SQL, with the ability to build modular, efficient, and scalable data pipelines. Deep expertise in data modeling for both relational databases and data warehouses, including Star and Snowflake schema designs. Extensive experience working with AWS Redshift and Aurora for data warehousing and transactional workloads. Experience using dbt (Data Build Tool) for building modular, version-controlled, and …
in Azure data services like Azure Data Lake Storage, Azure Cosmos DB, Azure Synapse Analytics, and Azure Key Vault. • Data Modeling - Strong skills in data modeling, including star/snowflake schemas, normalization, denormalization, and working with NoSQL and relational databases. • Programming Languages - Proficiency in Python, SQL, Scala, or other relevant languages for data manipulation and analysis. OTHER QUALIFICATIONS • Written …
data engineering role. Proficient in SQL and Python. Strong experience with AWS services (e.g., Lambda, Glue, Redshift, S3). Solid understanding of data warehousing and modelling: star/snowflake schema. Familiarity with Git, CI/CD pipelines, and containerisation (e.g., Docker). Ability to troubleshoot BI tool connections (e.g., Power BI). Desirable Skills Experience with Infrastructure …
M22, Northenden, Manchester, United Kingdom Hybrid / WFH Options
Express Solicitors
Experience: Experience integrating data from external systems via APIs. Knowledge of Python, R, or similar languages for data manipulation and automation. Familiarity with data warehousing concepts, including star/snowflake schema design. Experience working in a professional services or legal sector environment. Understanding of data governance, compliance, and security best practices. Exposure to other Microsoft data tools such …
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business …
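The validation, cleansing, and standardisation rules mentioned above can be sketched as a small stdlib-only transform step. The field names, email regex, and country mapping are illustrative assumptions, not requirements from the listing:

```python
import re

# Hypothetical raw records as they might arrive from an API or platform connector.
raw_records = [
    {"email": " Alice@Example.COM ", "amount": "1,250.50", "country": "uk"},
    {"email": "not-an-email",        "amount": "99.00",    "country": "GB"},
    {"email": "bob@corp.io",         "amount": "",         "country": "United Kingdom"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
# Standardisation: map country variants to one canonical code (illustrative).
COUNTRY_MAP = {"uk": "GB", "gb": "GB", "united kingdom": "GB"}

def transform(record):
    """Cleanse and standardise one record; return None if validation fails."""
    email = record["email"].strip().lower()      # cleanse: trim whitespace, lowercase
    if not EMAIL_RE.match(email):                # validate: reject malformed emails
        return None
    amount = record["amount"].replace(",", "")   # cleanse: drop thousands separator
    if not amount:                               # validate: amount is required
        return None
    country = COUNTRY_MAP.get(record["country"].strip().lower(), record["country"])
    return {"email": email, "amount": float(amount), "country": country}

clean = [r for r in (transform(rec) for rec in raw_records) if r is not None]
print(clean)  # only the first record passes every rule
```

In a real pipeline the rejected records would typically be routed to a quarantine table for review rather than silently dropped.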
in designing and managing systems and reporting frameworks. Experience with a reporting tool such as Tableau or Microsoft Power BI. Extensive experience in informatics roles and functions. Experience with data modelling (star/snowflake schemas), indexing, and partitioning. Desirable Experience/Involvement in development and implementation of patient-based information systems Experience of line management and providing support to other colleagues Knowledge Essential …
London, South East, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
. Advanced programming skills in SQL and Python for data transformation and analysis. Experience with web analytics tools (e.g. Adobe Analytics). Familiarity with data warehousing and database modeling (Snowflake experience a plus). Strong communication and presentation skills, with the ability to explain complex data simply. Self-starter with a commercial mindset and a drive for continuous improvement.
London, South East, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment Ltd
in governance forums and strategic planning What You’ll Bring Proven experience designing and architecting complex, high-volume data systems Strong hands-on knowledge of cloud platforms (AWS preferred), Snowflake, DBT, and data modelling tools Experience with data integration, ETL, and API-based data exchange Familiarity with Data Vault, dimensional modelling, and master data management Excellent communication skills and …
dashboards, DAX, Power Query, and complex data modeling Strong SQL skills for data extraction, transformation, and performance optimisation (essential) Solid understanding of data warehousing principles such as star and snowflake schemas, as well as ETL processes Experience in designing and implementing semantic and tabular models for reporting solutions Excellent communication abilities with proven experience collaborating with clients (essential) What …
pace. Proven success delivering governance initiatives in complex organisations. Strong knowledge of governance frameworks, catalogues, lineage, and metadata tools. Excellent stakeholder communication skills. (Preferred) Experience with cloud data platforms (Snowflake + dbt), data modelling, ETL/ELT, and hospitality/retail sectors. If you are experienced in retail/fast-paced businesses, and can deliver - please apply.
improving a system that handles a high volume of traffic, so experience with cloud and data warehouse infrastructure will help you in this role (we use AWS, Cloudflare and Snowflake). Familiarity with infrastructure as code will also help when updating our cloud architecture (we use Terraform). We place a large focus on data quality so you'll … not just the code, but the architecture of our platforms and everything that enables the business to thrive. Gain expertise over our tools and services: Python, Docker, GitHub Actions, Snowflake, AWS. Participate in all team ceremonies and have direct input in the team's ways of working. This is a high-trust, supportive, and collaborative environment where you will … experience, we want to hire people to grow into the role and beyond. About the team: Python is our bread and butter. The wider data platform team uses dbt, Snowflake, and Looker to model, transform, and expose data for analytics and reporting across the business. We use Docker and Kubernetes to manage our production services. We use GitHub Actions …
a range of global brands across industries such as (but not limited to) banking, pharmaceuticals, retail, and insurance. Plus, we have strong partnerships with tech providers including Microsoft, AWS, Snowflake, Starburst, Neo4J, Databricks, Google Cloud, and others, giving us access to cutting-edge technologies and enabling us to spearhead innovation. Who We're Looking For Degree in any major …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … have a robust Infrastructure background and a good understanding of the different complexities that come when moving one system to another. Let's talk tech. The platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able … to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer and Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow. From a cloud perspective, good AWS …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
Senior Engineer to join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines … deliver user-centric solutions. About the Candidate The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test automation. Familiarity with AI-powered development tools …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
Qualifications What we’d love you to bring: Proven, hands-on expertise in data modelling, with a strong track record of designing and implementing complex dimensional models, star and snowflake schemas, and enterprise-wide canonical data models Proficiency in converting intricate insurance business processes into scalable and user-friendly data structures that drive analytics, reporting, and scenarios powered by … Delta Live Tables Strong background in building high-performance, scalable data models that support self-service BI and regulatory reporting requirements Direct exposure to cloud-native data infrastructures (Databricks, Snowflake) especially in AWS environments is a plus Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark Familiarity with governance frameworks, access controls (RBAC) …