Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Octad Recruitment Ltd
IaaS/PaaS), including Infrastructure as Code. Strong SQL skills and proficiency in Python or PySpark. Built or maintained data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift. Experience hardening cloud environments (NSGs, identity, Defender). Demonstrated automation of backups, CI/CD deployments, or DR workflows. Nice-to-Haves: Experience with Azure OpenAI, vector databases More ❯
/CD solutions Linux system administration, including shell scripting and system optimisation Desirable Skills Experience with AWS services such as SQS, SNS, API Gateway, or data analytics platforms (e.g., Redshift, Glue) CI/CD pipelines using GitLab CI Monitoring solutions using NewRelic Deployment management with Helm Admiral: Where You Can We take pride in being a diverse and inclusive More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
You're an experienced data engineering professional with a proven ability to lead and inspire teams. You bring deep technical skills in Python, SQL, and AWS services such as EC2, Redshift, Lambda and Kinesis, alongside strong stakeholder management and commercial awareness. You'll also bring: Proven experience designing and implementing data pipelines, ETL processes, and warehousing in cloud environments. The More ❯
new technologies essential for automating models and advancing our engineering practices. You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery) Experience with a modern data modeling technology (DBT) You document and communicate clearly. Some experience with technical content writing would be a plus You More ❯
/Data Engineering/BI Engineering experience Understanding of data warehousing, data modelling concepts and structuring new data tables Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift) Nice to have Experience developing in a BI tool (Looker or similar) Good practical understanding of version control SQL ETL/ELT knowledge, experience with DAGs to manage script More ❯
if you have: Expertise in Cloud-Native Data Engineering: 3+ years building and running data pipelines in AWS or Azure, including managed data services (e.g., Kinesis, EMR/Databricks, Redshift, Glue, Azure Data Lake). Programming Mastery: Advanced skills in Python or another major language; writing clean, testable, production-grade ETL code at scale. Modern Data Pipelines: Experience with More ❯
e.g. scikit-learn, pandas, NumPy, SciPy, etc.) Experience with ML frameworks such as TensorFlow, PyTorch, XGBoost, LightGBM, or similar Strong SQL skills and experience with data warehousing solutions (Snowflake, BigQuery, Redshift) Experience with cloud platforms (AWS, Azure, GCP) and their ML and AI services (SageMaker, Azure ML, Vertex AI) Knowledge of MLOps tools including Docker, MLflow, Kubeflow, or similar platforms More ❯
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting More ❯
South West London, London, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting More ❯
Manchester, North West, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting More ❯
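The Data Engineer listings above repeatedly call for implementing and managing ETL processes with tools such as Python, SQL, Airflow, and dbt. As a minimal, purely illustrative sketch of that kind of orchestration (assuming Airflow 2.4+; the DAG name, task logic, and warehouse target are hypothetical placeholders, not any employer's actual pipeline), the following DAG wires an extract step to a load step on a daily schedule:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder extract step: in practice this might pull from an API or a
    # source database and stage the raw records (e.g. to S3 or a landing table).
    return [{"order_id": 1, "amount": 42.0}]


def load_to_warehouse(**context):
    # Placeholder load step: in practice this might COPY staged files into
    # Snowflake/Redshift/BigQuery, or trigger a dbt run for the transformations.
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"Would load {len(rows)} rows into the warehouse")


with DAG(
    dag_id="daily_orders_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load
```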
/BI Engineering experience Excellent SQL skills Understanding of data warehousing, data modelling concepts and structuring new data tables Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift) Nice to have Experience developing in a BI tool (Looker or similar) Good practical understanding of version control SQL ETL/ELT knowledge, experience with DAGs to manage script More ❯
such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery) Expertise in building data architectures that support batch and streaming paradigms Experience with standards such as JSON, XML, YAML, Avro, Parquet Strong communication skills Open to More ❯
improve our technology stack. Develop oneself into a Subject Matter Expert (SME) on Technical and Functional domain areas. What we value Demonstrated experience in Python, PySpark and SQL (AWS Redshift, Postgres, Oracle). Demonstrated experience building data pipelines with PySpark and AWS. Application development experience in financial services with hands-on designing, developing, and deploying complex applications. Demonstrated ability to More ❯
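The role above emphasises building data pipelines with PySpark and AWS against stores such as Redshift and Postgres. A minimal sketch of that pattern, assuming a Spark batch job over S3 data (the bucket paths and column names are hypothetical, and the load into Redshift or Postgres is only noted in a comment):

```python
from pyspark.sql import SparkSession, functions as F

# Minimal PySpark batch job: read raw trade records, aggregate, and write a
# curated table. Paths and columns are illustrative placeholders only.
spark = SparkSession.builder.appName("daily_positions").getOrCreate()

raw = spark.read.parquet("s3://example-raw-bucket/trades/")  # hypothetical path

daily_positions = (
    raw.filter(F.col("status") == "SETTLED")
       .groupBy("account_id", "trade_date")
       .agg(F.sum("quantity").alias("net_quantity"),
            F.sum("notional").alias("net_notional"))
)

# In the stack described above this could instead be loaded into Redshift or
# Postgres (e.g. via a JDBC writer or an UNLOAD/COPY pattern).
daily_positions.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-curated-bucket/daily_positions/"
)
```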
play a pivotal role in designing and implementing data warehousing solutions using Snowflake and AWS. You will help drive the evolution of our data architecture as we move from Redshift to Snowflake. Looking for someone with extensive experience with cloud providers? Hands-on experience with AWS services such as Glue (Spark), Lambda, Step Functions, ECS, Redshift, and SageMaker. … like GitHub Actions, Jenkins, AWS CDK, CloudFormation, Terraform. Key Responsibilities: Design and implement scalable, secure, and cost-efficient data solutions on AWS, leveraging services such as Glue, Lambda, S3, Redshift, and Step Functions. Lead the development of robust data pipelines and analytics platforms, ensuring high availability, performance, and maintainability. Demonstrate proficiency in software engineering principles, contributing to the development … strong hands-on programming skills and software engineering fundamentals, with experience building scalable solutions in cloud environments (AWS preferred) Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Solid foundation in software engineering principles, including version control (Git), testing, CI/CD, modular design, and clean code practices. Experience developing reusable components and APIs More ❯
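One common way the Glue, Lambda, S3 and Step Functions services named in the responsibilities above are stitched together is an event-driven trigger: an S3 upload invokes a Lambda function that starts a Glue (Spark) job. The sketch below assumes that pattern; the job name, environment variable, and argument keys are hypothetical placeholders rather than the team's actual configuration:

```python
import os
import urllib.parse

import boto3

glue = boto3.client("glue")


def handler(event, context):
    # Triggered by an S3 event notification: extract the bucket and object key
    # of the newly landed file.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

    # Kick off a Glue job that transforms the raw file and loads the warehouse
    # layer (e.g. Redshift or Snowflake). Job name and argument are placeholders.
    response = glue.start_job_run(
        JobName=os.environ.get("GLUE_JOB_NAME", "example-transform-job"),
        Arguments={"--source_path": f"s3://{bucket}/{key}"},
    )
    return {"glue_job_run_id": response["JobRunId"]}
```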
Mentor junior consultants Build strong customer relationships Support managed services as required Essential Requirements 3+ years consulting/managed services experience Solid Data Lake experience with AWS data solutions (Redshift, Glue, Athena, Lake Formation) Advanced AWS skills (EC2, S3, VPC, IAM, Lambda) Infrastructure as Code experience (CloudFormation/Terraform) AWS Solutions Architect Associate certification Strong communication and presentation abilities More ❯
Portsmouth, Hampshire, England, United Kingdom Hybrid / WFH Options
Computappoint
Mentor junior consultants Build strong customer relationships Support managed services as required Essential Requirements 3+ years consulting/managed services experience Solid Data Lake experience with AWS data solutions (Redshift, Glue, Athena, Lake Formation) Advanced AWS skills (EC2, S3, VPC, IAM, Lambda) Infrastructure as Code experience (CloudFormation/Terraform) AWS Solutions Architect Associate certification Strong communication and presentation abilities More ❯
data-focused SRE, Data Platform, or DevOps role Strong knowledge of Apache Flink, Kafka, and Python in production environments Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.) Comfortable with monitoring tools, distributed systems debugging, and incident response Reference Number: BBBH259303 To apply for this role or to be considered for further roles, please click More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Rise Technical Recruitment Limited
data-focused SRE, Data Platform, or DevOps role Strong knowledge of Apache Flink, Kafka, and Python in production environments Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.) Comfortable with monitoring tools, distributed systems debugging, and incident response Reference Number: BBBH259303 To apply for this role or to be considered for further roles, please click More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Rise Technical Recruitment Limited
data-focused SRE, Data Platform, or DevOps role Strong knowledge of Apache Flink, Kafka, and Python in production environments Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.) Comfortable with monitoring tools, distributed systems debugging, and incident response Reference Number: BBBH259303 To apply for this role or to be considered for further roles, please click More ❯
production data systems. Advanced SQL skills and deep experience implementing complex business logic and performance-optimized transformations at scale. Expertise with modern cloud-based data platforms (e.g., Snowflake, BigQuery, Redshift) and experience managing large, distributed data environments. Strong understanding of data architecture and dimensional modeling principles, with a track record of designing enterprise-grade schemas. Proven ability to lead More ❯
drills for stream and batch environments. Architecture & Automation Collaborate with data engineering and product teams to architect scalable, fault-tolerant pipelines using AWS services (e.g., Step Functions, EMR, Lambda, Redshift) integrated with Apache Flink and Kafka. Troubleshoot & Maintain Python-based applications. Harden CI/CD for data jobs: implement automated testing of data schemas, versioned Flink jobs, and More ❯
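For the "automated testing of data schemas" mentioned above, one illustrative approach is a small pytest check that runs in CI and fails the build when an extract's columns or dtypes drift, before a versioned Flink or batch job is deployed. The expected schema and sample fixture below are invented for illustration only:

```python
import pandas as pd
import pytest

# Expected schema for a hypothetical curated "positions" extract.
EXPECTED_SCHEMA = {
    "account_id": "int64",
    "trade_date": "datetime64[ns]",
    "net_quantity": "float64",
}


@pytest.fixture
def sample_extract() -> pd.DataFrame:
    # In a real pipeline this fixture would load a small, versioned sample file
    # produced by the job under test.
    return pd.DataFrame(
        {
            "account_id": pd.Series([1, 2], dtype="int64"),
            "trade_date": pd.to_datetime(["2024-01-02", "2024-01-03"]),
            "net_quantity": pd.Series([10.0, -4.5], dtype="float64"),
        }
    )


def test_extract_matches_expected_schema(sample_extract):
    # Compare actual column names and dtypes against the contract above.
    actual = {col: str(dtype) for col, dtype in sample_extract.dtypes.items()}
    assert actual == EXPECTED_SCHEMA
```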
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
communicator - able to engage with both technical and non-technical colleagues. Nice to Have Skills Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to More ❯
lead on career development opportunities Ability to evangelize next-generation infrastructure in the analytics space (batch, near real-time, real-time technologies), using both SAS components and AWS tooling (e.g., Redshift, SQS, Kinesis) Strong familiarity with spatial-temporal data sets, with exposure to maritime data preferred Experienced in evaluating, training, and communicating the performance of ML models (e.g., supervised models More ❯
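As a rough illustration of "evaluating, training, and communicating the performance" of a supervised model, the sketch below uses scikit-learn with synthetic stand-in data (the real spatial-temporal maritime features referenced above are not available here) and prints a per-class classification report:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in features and labels, used only so the example runs end to end.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# A plain-text report (precision/recall/F1 per class) is one simple way to
# communicate supervised-model performance to non-technical stakeholders.
print(classification_report(y_test, model.predict(X_test)))
```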