Java modules. Collaborate with development teams to design and implement automated tests for microservices, emphasizing Spring Boot and Java-based architectures. Implement testing strategies for AWS data lakes (e.g., S3, Glue, Athena) with a focus on schema evolution, data quality rules, and performance benchmarks, prioritizing data lake testing over traditional SQL approaches. Automate data tests within CI/CD … tools like Pytest and Postman. Expertise in Pandas, SQL, and AWS analytics services (Glue, Athena, Redshift) for data profiling, transformation, and validation within data lakes. Solid experience with AWS (S3, Lambda, EMR, ECS/EKS, CloudFormation/Terraform) and understanding of cloud-native architectures and best practices. Advanced skills in automating API and backend testing workflows, ensuring robust and …
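As an illustration of the kind of data-lake quality check described in the listing above, here is a minimal Pytest sketch. It assumes the AWS SDK for pandas (awswrangler) is installed and credentials are configured; the Glue database, table, column names, and thresholds are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch of Pytest-based data quality checks against an Athena table.
# Assumes awswrangler (AWS SDK for pandas) and configured AWS credentials;
# database/table/column names below are hypothetical.
import awswrangler as wr

DATABASE = "analytics_lake"   # hypothetical Glue database
TABLE = "customer_events"     # hypothetical table in the Glue catalog

EXPECTED_COLUMNS = {"event_id", "customer_id", "event_type", "event_ts"}


def test_schema_contains_expected_columns():
    """Guard against breaking schema evolution: required columns must still exist."""
    df = wr.athena.read_sql_query(
        sql=f"SELECT * FROM {TABLE} LIMIT 1",
        database=DATABASE,
    )
    assert EXPECTED_COLUMNS.issubset(df.columns)


def test_event_id_has_no_nulls():
    """Simple data quality rule: the primary identifier must never be null."""
    df = wr.athena.read_sql_query(
        sql=f"SELECT COUNT(*) AS null_rows FROM {TABLE} WHERE event_id IS NULL",
        database=DATABASE,
    )
    assert df.loc[0, "null_rows"] == 0
```

Tests like these can be wired into a CI/CD stage so that schema or data quality regressions fail the pipeline before promotion.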
Portsmouth, Hampshire, United Kingdom Hybrid / WFH Options
Bluecrest Health Screening Limited
Updating and altering application features to enhance performance. Proficient with code versioning tools. Prior working experience in Agile methodologies. Our systems sit on AWS, so experience with EC2, RDS, S3 and Lambda is beneficial but not essential. Other information. The next steps: So, if you think you've got some exceptional skills to offer us and Bluecrest feels like …
Portsmouth, Hampshire, England, United Kingdom Hybrid / WFH Options
Computappoint
and AWS Redshift. AWS Certified Solutions Architect, Developer, or SysOps Administrator certifications are highly desirable. Deployment experience with AWS CloudFormation or Terraform. Knowledge of AWS core services, i.e. EC2, S3, RDS, VPC, IAM, Lambda. A strong team player and a creative problem solver. A strong communicator with the ability to make complex technical solutions simple to both technical and …
Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer; Message Brokers and streaming data processors like Apache Kafka; Object Storage solutions such as S3, MinIO, LakeFS; CI/CD Pipeline and Integration, ideally with Azure DevOps; Python scripting; API Management Solutions; Automation. Key Skills: Design, Configuration, and Usage of Low-code Platforms such …
Focused Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte and Singer; Message Brokers and streaming data processors such as Apache Kafka; Object Storage such as S3, MinIO, LakeFS; CI/CD Pipeline and Integration, ideally Azure DevOps; Python Scripting; API Management Solutions; Automation. Key Skills: Experience in the Design/Configuration/Usage of a number …
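To make the orchestration stack named in the two listings above more concrete, the following is a hedged sketch of a small Airflow DAG that lands records in S3-compatible object storage (S3 or MinIO). The bucket, endpoint, schedule, and payload are invented for illustration and assume Airflow 2.4+ with boto3 available.

```python
# Illustrative Airflow DAG: one ELT-style task that extracts records and writes
# them to S3-compatible object storage (S3 or MinIO). Names are hypothetical.
import json
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_land(**_context):
    # Stand-in extract step; a real task would pull from a source system or broker.
    records = [{"id": 1, "source": "crm"}, {"id": 2, "source": "billing"}]
    s3 = boto3.client(
        "s3",
        endpoint_url="http://minio.internal:9000",  # omit endpoint_url for real AWS S3
    )
    s3.put_object(
        Bucket="raw-zone",
        Key=f"crm/{datetime.utcnow():%Y-%m-%d}/records.json",
        Body=json.dumps(records).encode("utf-8"),
    )


with DAG(
    dag_id="crm_raw_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # assumes Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_land", python_callable=extract_and_land)
```

A DAG of this shape is typically validated and deployed through a CI/CD pipeline such as Azure DevOps, matching the integration skills listed above.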
transformation journey of our data platform (AWS). Cloud Proficiency: Hands-on experience with at least one major cloud platform (AWS, Azure, or GCP) and its core data services (e.g., S3, Redshift, Lambda/Functions, Glue). Data Modelling: Deep understanding of ELT/ETL patterns and data modelling techniques. CRM/Customer Data Focus: Experience working directly with data …
knowledge sharing, including handover notes, rollout summaries, and demo materials. WHAT YOU'LL BRING (ESSENTIAL): AWS Certified Cloud Practitioner; foundational experience with AWS services such as VPCs, EC2, ECS, S3, Site-to-Site VPN, Transit Gateway, ACM, CloudWatch, CloudFront, and MediaConnect; strong written and verbal communication skills, with an ability to document workflows, processes, and support materials clearly; a …
Experience working in a proactive analytics environment. Experience in the Utilities sector. Experience leading technical projects. Skills & Technologies required: Proficiency in cloud-based data engineering tools (ADF, Synapse Analytics, S3, Lambda). Proficiency in using PySpark notebooks for ELT processes. Ability to foster and cultivate a culture of best practices. Strong analytical and problem-solving skills. Ability to work …
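As a rough illustration of the PySpark-notebook ELT work mentioned in the listing above, here is a minimal sketch. The storage paths, column names, and partitioning scheme are assumptions for illustration rather than details from the listing.

```python
# Minimal PySpark ELT sketch: load raw events, apply light cleansing, and write
# a curated, partitioned copy back to the lake. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("elt_curate_events").getOrCreate()

# Extract/Load: read raw data already landed in object storage (S3/ADLS path assumed).
raw = spark.read.parquet("s3a://raw-zone/events/")

# Transform: deduplicate, standardise the timestamp, and derive a partition column.
curated = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write a partitioned curated dataset for downstream analytics.
curated.write.mode("overwrite").partitionBy("event_date").parquet("s3a://curated-zone/events/")
```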