skills in SQL and Python, with hands-on experience in relational databases across cloud and on-prem environments. Familiarity with modern data technologies such as Apache Spark, Kafka, or Snowflake. A comprehensive understanding of the data engineering lifecycle, including Agile delivery, DevOps, Git, APIs, containers, microservices, and pipeline orchestration. Nice to have: DP-203 Azure Data Engineering/
client requirements into solution blueprints and supporting proposal development. Key Responsibilities Architect and oversee the delivery of enterprise-scale data platforms (data lakes, lakehouses, warehouses) using tools like Databricks, Snowflake, Synapse, and Azure Fabric Define and execute cloud migration strategies, leveraging CI/CD pipelines, Terraform, Azure DevOps, and GitHub Support RFP/RFI responses, conduct client workshops, and shape
documentation, data warehousing, and data modeling. Experience with Python for interaction with web services (e.g., REST and Postman). Experience with using and developing data APIs. Experience using AWS, Snowflake, or other comparable large-scale analytics platforms. Experience monitoring and managing databases (we use Elasticsearch/MongoDB/PostgreSQL). Experience with SQL. Experience with data versioning tools. Experience developing
and query optimization 2+ years using Python or similar scripting languages for data science Experience with data processes and building ETL pipelines Experience with cloud data warehouses such as Snowflake, Azure Data Warehouse, Amazon Redshift, or Google BigQuery Proficiency in creating visualizations using Power BI or Tableau Experience designing ETL/ELT solutions with tools like SSIS, Alteryx, AWS Glue
Prior experience delivering and owning a reporting cycle (monthly or quarterly), with evidence of improving processes while maintaining high quality • Strong SQL skills and experience with relational databases (e.g. Postgres, Snowflake, MS SQL) • Familiarity with modern data stack tools including dbt, Prefect and other cloud platforms • Comfortable working with large, complex datasets in a fast-paced, regulated environment • Excellent communication and
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
/remote working and a salary between £50,000 and £70,000. As part of the data engineering team, you'll design and deliver scalable data products using technologies like Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Engineer or in a similar data engineering/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating
ideally within fintech or cloud-native organisations (AWS preferred). Strong technical background in data engineering, analytics, or data science. Experience with modern data stacks (e.g., SQL, dbt, Airflow, Snowflake, Looker/Power BI) and AI/ML tooling (e.g., Python, MLflow, MLOps). A track record of building and managing high-performing data teams. Strategic thinking and ability to
Modeling data for a civil service department replacing a legacy HR system Experience and qualifications Technical 3+ years' experience in data or software engineering Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP) Ability to learn quickly and adapt to new technologies and sectors Understanding of data engineering best practices and system design Strong
practices and procedures within the department. Required Qualifications Bachelor's degree with at least 5 years of experience, or equivalent. In-depth knowledge and expertise in data engineering, including: Snowflake (data warehousing and performance tuning); Informatica (ETL/ELT development and orchestration) - nice to have; Python (data processing and scripting) - required; AWS (data services such as S3, Glue, Redshift, Lambda
Spark Streaming, Kinesis) Familiarity with schema design and semi-structured data formats Exposure to containerisation, graph databases, or machine learning concepts Proficiency with cloud-native data tools (BigQuery, Redshift, Snowflake) Enthusiasm for learning and experimenting with new technologies Why Join Capco Deliver high-impact technology solutions for Tier 1 financial institutions Work in a collaborative, flat, and entrepreneurial consulting culture
meet compliance standards. Mentor: Upskill other platform engineers, data engineers and AI engineers to deliver and build adoption of your team's initiatives Our Tech Stack Cloud Data Warehouse - Snowflake AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, dbt, Fivetran, Airflow CI/CD - GitHub
tools. Understanding of Agile methodologies. Additional Skills Experience mentoring or supporting team development. Knowledge of Azure SQL DB, Data Factory, Data Lake, Logic Apps, Databricks (Spark SQL), and Snowflake is advantageous.
Need to Succeed Strong skills in Python and SQL Demonstrable hands-on experience in the AWS cloud Data ingestion, both batch and streaming, and data transformation (Airflow, Glue, Lambda, Snowflake Data Loader, Fivetran, Spark, Hive, etc.). Apply agile thinking to your work, delivering in iterations that incrementally build on what went before. Excellent problem-solving and analytical skills. Good
Technical Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing others through
South East London, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For Proven data
Bromsgrove, Worcestershire, United Kingdom Hybrid / WFH Options
Reed Technology
deliverables Producing and maintaining high-quality technical documentation Championing data engineering best practices and standards across the business Technical skills Cloud data platforms - Azure, AWS, or GCP (Azure preferred) Snowflake - Deep knowledge and hands-on experience Matillion - Expertise in ETL orchestration Data warehousing and advanced analytics Dimensional modelling and data vault methodologies Stakeholder engagement and cross-functional collaboration Flexible hybrid
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting
Manchester, North West, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting
SQL. Vast experience in data modelling using tools such as Erwin, Power Designer, SQLDBM or Sparx EA. Minimum 10 years' experience using databases such as Oracle, SQL Server, Snowflake or any other OLTP and OLAP databases. Minimum 5 years' experience with reporting tools: Power BI, Business Objects, Tableau or OBI. Understanding of the Master Data Management technology landscape, processes and
Edgbaston, Birmingham, West Midlands (County), United Kingdom
Network IT
relationship building to understand and meet business needs. Experience: We are looking for a Data Engineer who has experience designing and implementing data warehouses, with strong technical competency using Snowflake (preferably certified), Azure Data Factory for building cloud ETL pipelines, Power BI and dbt (Data Build Tool). Other elements of your experience which are desirable to our client include