Data Engineer (Python, PySpark, SQL, AWS)
Client: Athsai
Locations: Worcester; Bedford; Peterborough; Gloucester; Nottingham; Shrewsbury; Basildon; Belfast; Swindon, Wiltshire; Stoke-on-Trent (United Kingdom)
Job Category: Other
EU work permit required: Yes
Posted: 06.06.2025 Expiry Date: 21.07.2025

Job Description:
About the Role
The Data Engineer will play a crucial role in designing and implementing robust data pipelines, ensuring the integrity and accessibility of data across various platforms.

Required Skills
Proficient in PySpark and AWS
Strong experience in designing, implementing, and debugging ETL pipelines
Expertise in Python, PySpark, and SQL
In-depth knowledge of Spark and Airflow
Experience in designing data pipelines using cloud-native services on AWS
Extensive knowledge of AWS services
Experience in deploying AWS resources using Terraform
Hands-on experience in setting up CI/CD workflows using GitHub
Graduate Data Engineer (Python, Spark, SQL) - Newcastle, onsite - to £33k
Do you have a first-class education combined with data engineering skills? You could be progressing your career at a start-up Investment Management firm that has secure backing, an established Hedge Fund client as a partner, and massive growth potential. As a Graduate Data Engineer you'll join a graduate trainee scheme, initially yo...
Data Engineer
Tenth Revolution Group
Locations: Cambridge, Cambridgeshire; Nottingham, Nottinghamshire; Leeds, West Yorkshire; Birmingham, West Midlands (United Kingdom) - Hybrid / WFH Options

A small and growing consultancy delivering cutting-edge data solutions for high-profile clients is looking for a hands-on Data Engineer to join its talented delivery team. This is a fantastic opportunity to work with a modern data stack, including Snowflake, dbt, AWS, Python, and SQL, on impactful projects that power reporting, automation, predictive analytics, and Artificial Intelligence. This role is fully remote and is therefore open to candidates across the UK. It would be well suited to someone with a self-starter mentality, who wants to be part of a company that is growing and maturing whilst continually learning.

Key responsibilities:
Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake
Developing scalable, testable, and maintainable code
Collaborating with analytics, product, and client teams to deliver high-quality data solutions
Supporting the development of ...
Slough, England, United Kingdom - Hybrid / WFH Options
JR United Kingdom

Security Cleared Data Engineer - Inside IR35 - Fabric/Azure/Python/C#, Slough
Client: Methods
Location: Slough, United Kingdom
Job Category: Other
EU work permit required: Yes
Posted: 31.05.2025 Expiry Date: 15.07.2025

Job Description:
Methods is looking for Data Engineers for a ... on a hybrid working policy. The client will be based in London and will require on-site presence 1-2 days per week. The Data Engineer must hold a live Security Clearance used on a Security Cleared site within the last 12 months, or be actively using the clearance in their current role.

You Will:
Design, build, and maintain data pipelines.
Ensure data accuracy, consistency, and reliability by implementing validation methods.
Implement and enforce data governance policies and ensure compliance with regulations.
Build infrastructure for optimal extraction, transformation, and loading (ETL) of data from ...
We are looking for a Python GCP Data Applications Developer to join our team. This hybrid role offers the flexibility to be based in either Bradford or Leeds; you will be expected to work in the office at least two days a week. Join our Technology and Product Development team to build innovative cloud-based data and technology-driven back-office products using Google's AI offerings. You will leverage your Python and data skills to develop data processing pipelines and Generative AI applications in the cloud.

What will you be doing?
Meet business requirements and collaborate with the team to build solutions.
Participate in product technical architecture design, focusing on data and AI.
Collaborate with the broader Technology and Product team to ensure exceptional service standards.
Maintain a close relationship with the group security team to ensure compliance of all products and services.