and working experience with data & analytics services such as Amazon EC2, AWS Lambda, AWS Fargate, Amazon ECS, Amazon EKS, Amazon S3, AWS Glue, Amazon RDS, Amazon DynamoDB, Amazon Aurora, Amazon SageMaker, Amazon Bedrock (including LLM hosting and management). Expertise … ReactJS) Demonstrated experience designing and implementing Generative AI solutions (chatbots, digital assistants, content generation, etc.) Hands-on implementation and operation of AI/ML models with services like Amazon SageMaker Advanced proficiency in Python and related AI/ML productivity libraries Expertise in SQL and NoSQL database technologies Skills: Python, AWS, Gen AI Job Title: Technical Lead Location: …
pipelines for actionable business insights. Key Requirements: Proven experience as an AWS DevOps Engineer or Data Engineer in complex cloud environments. Strong hands-on expertise with AWS services (EC2, S3, Lambda, RDS, IAM, CloudWatch, etc.). Demonstrated experience with Airflow (Astronomer) setup, orchestration, and optimization. Proficiency in infrastructure as code (IaC) tools such as Terraform or CloudFormation. Experience with …
Hereford, Herefordshire, West Midlands, United Kingdom Hybrid / WFH Options
Hays
messaging experience (e.g., RabbitMQ) Solid understanding of SQL and relational databases Linux administration and shell scripting Familiarity with network security protocols Experience with cloud platforms, ideally AWS (EC2, RDS, S3, Lambda) Desirable: Coding experience in Java, Go, Python or similar Knowledge of cross-domain technologies Experience in service management environments Practical application of observability patterns Experience with Azure Additional …
Hereford, Herefordshire, West Midlands, United Kingdom Hybrid / WFH Options
Twinstream Limited
experience building and maintaining CI/CD pipelines (e.g. Jenkins) Deep understanding of monitoring & observability tools (Grafana, Prometheus, InfluxDB) Solid grounding in Linux, network security, SQL, and AWS (EC2, S3, RDS, Lambda) Comfortable with MQ messaging (RabbitMQ or similar) Bonus points for: Experience with Azure environments Strong coding ability in Python, Java, or Go Knowledge of cross-domain systems …
environments What you bring: Deep Linux expertise and fluency in at least one high-level programming language (Python preferred) Strong experience with AWS (VPCs, EC2, ECS/EKS, RDS, S3, etc.) Solid understanding of database systems (Postgres, SQL Server) IaC mastery (Terraform, CloudFormation, Ansible) Passion for monitoring and observability (Grafana, Elastic, PagerDuty, etc.) Familiarity with configuration management tools (Puppet …
using Apigee Edge or similar platforms. Implement API design patterns, performance optimization, and custom analytics reporting. Key skills for this role Cloud & DevOps: AWS (Lambda, API Gateway, ECS, DynamoDB, S3), Azure DevOps, GitHub workflows. Security Expertise: Secure coding, vulnerability scanning, dependency management, compliance alignment. Infrastructure Automation: Terraform, Ansible, Docker. Programming: Proficiency in Python and Node.js. API Management: Apigee or …
consultant who is willing to engage with clients, learn domain knowledge, be proactive and curious in addition to standard tech would be ideal for the role. Tech Stack: AWS - S3 for storage, Lambda functions, Athena (thus strong SQL), Glue, Glue Data Catalog Python Current frontend is Tableau but experience with any of them will be interchangeable DevOps experience a …
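The S3 + Athena stack in this listing is typically driven from Python. A minimal sketch of starting an Athena query over S3-backed data — the client is injected here (in real use it would be `boto3.client("athena")`), and the database and output-location values are hypothetical:

```python
# Minimal sketch of submitting an Athena query over data stored in S3.
# The database name and S3 output location used below are illustrative,
# not taken from the listing; in practice `client` is boto3.client("athena").

def run_athena_query(client, sql, database, output_s3):
    """Start an Athena query and return its execution id."""
    response = client.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return response["QueryExecutionId"]
```

Results land in the given S3 location; a caller would poll `get_query_execution` until the query completes before fetching them.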
specialising in Databricks (AWS preferred; Azure considered). Demonstrable experience in end-to-end implementation or migration to Databricks. Deep understanding of cloud-native data architectures , particularly in AWS (S3, Glue, EMR, Lambda) or Azure (ADF, Synapse, Data Lake). Strong experience with data modelling, ETL/ELT design , and orchestration tools (Airflow, DBT, or similar). Proficiency in …
modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent services. Experience implementing Infrastructure as Code (Terraform) and CI/CD pipelines (e.g., Jenkins, GitHub Actions). A mindset focused on continuous improvement, learning, and …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS, IAM. Experience with DevOps tooling, particularly Terraform and CI/CD pipelines (e.g. Jenkins). A proactive, growth-oriented mindset with a passion for modern data …
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
Skills for the Senior Data Engineer: Experience with event sourcing, dbt, or related data transformation tools Familiarity with PostgreSQL and cloud-native data services (Azure Event Hub, Redshift, Kinesis, S3, Blob Storage, OneLake, or Microsoft Fabric) Understanding of machine learning model enablement and operationalisation within data architectures Experience working within Agile delivery environments If you are an experienced Senior …
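The event sourcing mentioned in this listing is a pattern, not a product: state is never mutated in place, but rebuilt by replaying an append-only log of events. A minimal sketch, with invented event names:

```python
# Minimal sketch of event sourcing: current state is derived by folding
# an append-only event log, never by in-place updates. The event kinds
# ("deposited"/"withdrawn") and account ids are hypothetical.

def apply(state, event):
    """Fold one event into the state (a dict of account balances)."""
    kind, account, amount = event
    balances = dict(state)  # never mutate the previous state
    if kind == "deposited":
        balances[account] = balances.get(account, 0) + amount
    elif kind == "withdrawn":
        balances[account] = balances.get(account, 0) - amount
    return balances

def replay(events):
    """Rebuild the current state from the full event log."""
    state = {}
    for event in events:
        state = apply(state, event)
    return state
```

Because the log is the source of truth, the same replay can rebuild state at any past point in time, which is what makes the pattern attractive for audit-heavy data platforms.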
for a highly skilled Cloud DevOps Engineer to join the team on an initial 6-month contract Kubernetes, specifically with EKS Terraform (Enterprise) Microservices AWS experience including CloudFront, S3, EKS, Load Balancing, Security, API Gateway, Transit Gateway ArgoCD OpenTelemetry Istio Database experience ideally but not mandatory: RDS (Oracle/Postgres), MongoDB GitLab CI/CD If this role sounds like a …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
compliance. Desirable Experience Solid understanding of cloud infrastructure fundamentals: Compute Engine, Cloud Networking Experience with container orchestration, Kubernetes, Istio and GKE in particular. Knowledge of cloud-native storage: GCS, S3 and filer solutions. Exposure to data services: Cloud SQL, managed databases, MongoDB. Messaging systems: Kafka, RabbitMQ and EMS; API gateways like Apigee. Observability tools: Prometheus, Grafana, Cloud Monitoring. IAM …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
hybrid 2 days a week Clearance: SC Clearance required You will join a global IT Consultancy delivering digital transformation to the NHS. AWS Skills EC2 - CIS1 hosted on Amazon Linux. Connectivity, diagnosis, general troubleshooting EC2 - Auto Scaling groups - Manages the EC2 infrastructure provisioning EC2 - Load balancing. Mix of application, classic and network load balancers. Target groups and rule … control lists) and Direct Connect configuration VPC - Peering connections. Connectivity between VPCs and other AWS accounts Lambda - Majority of functions written in Python Amazon EventBridge - Used to trigger Lambda functions S3 - Simple bucket storage Postgres RDS - Relational database DynamoDB - Some minor items stored here (Terraform state and locks) CloudWatch - Logging and metrics alerting AWS Systems Manager - Parameters, run commands EFS …
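In a stack like the one described above, EventBridge rules invoke Python Lambda functions. A minimal, hypothetical handler sketch — the detail-type and field names are illustrative, since a real event's shape depends on the rule:

```python
# Minimal sketch of a Python Lambda handler invoked by an EventBridge
# rule. The envelope fields ("source", "detail-type", "detail") follow
# the standard EventBridge event format; the payload contents here are
# hypothetical.

import json

def handler(event, context):
    """Log the EventBridge envelope and summarise the event detail."""
    source = event.get("source", "unknown")
    detail_type = event.get("detail-type", "unknown")
    detail = event.get("detail", {})
    print(json.dumps({"source": source, "detail-type": detail_type}))
    # Real work (e.g. acting on EC2 state changes) would go here.
    return {"status": "processed", "source": source, "items": len(detail)}
```

The `context` argument is supplied by the Lambda runtime and is unused in this sketch.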
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
preferably Scaled Agile Framework or similar. Good experience and skills with streaming technologies such as Confluent Kafka, StreamSets, MongoDB, IBM CDC. Knowledge of public/enterprise cloud technologies (AWS EC2, S3 buckets, GCP, Azure) is advantageous but not required. Some skills/experience with automated testing frameworks (Java, Python, PySpark, Bitbucket, GitLab, Jenkins) is advantageous but not required. Strong Environment …
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
tools and techniques Knowledge of accessibility testing for mobile applications Experience working with APIs and backend validations (Postman, REST Assured) Experience with cloud tools such as AWS S3, CloudWatch and BrowserStack. Knowledge of Agile/Scrum methodologies Excellent problem-solving skills and attention to detail Strong communication and collaboration skills If you find this opportunity intriguing …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
role is ideal for someone with strong experience in data migration, ETL (both batch and real-time), and data warehouse development. What you'll need: DataStage Redshift QuickSight AWS S3 Java SQL Relational databases GitHub/GitLab experience Nice to have: Data quality expertise XML knowledge AWS Data Speciality certification Reasonable Adjustments: Respect and equality are core values to …
with query optimization and performance tuning on relational databases like PostgreSQL, MySQL, or similar Cloud data ecosystem (AWS): hands-on experience with core AWS data services. Key services include: S3 for data lake storage AWS Glue for ETL and data cataloging Amazon Redshift or Athena for data warehousing and analytics Lambda for event-driven data processing. ETL …
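The "Lambda for event-driven data processing" pattern above usually starts with extracting the bucket and object key from an S3 ObjectCreated notification so the new object can be fed into an ETL step. A minimal sketch — the record shape follows the standard S3 event notification format, while the bucket and key values are invented:

```python
# Minimal sketch of event-driven S3 processing: pull (bucket, key)
# pairs out of an S3 event notification delivered to Lambda. Record
# structure (Records[].s3.bucket.name / .s3.object.key) follows the
# S3 notification format; keys arrive URL-encoded.

from urllib.parse import unquote_plus

def extract_objects(event):
    """Return (bucket, key) pairs for every S3 record in the event."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        bucket = s3["bucket"]["name"]
        key = unquote_plus(s3["object"]["key"])  # decode URL-encoded key
        objects.append((bucket, key))
    return objects
```

A handler would then download or stream each object (e.g. via the S3 client) and run the transform on it.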
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
background) Location: Canary Wharf, UK (Hybrid) (3 days onsite & 2 days remote) Role Type: 6-month contract with possibility of extension Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, energy trading experience Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python … such as NumPy, pandas, Beautiful Soup, Selenium, pdfplumber, Requests, etc. Proficient in SQL programming and PostgreSQL. Knowledge of DevOps practices such as CI/CD, Jenkins, Git. Experience working with AWS (S3) and Azure Databricks. Experience delivering projects with Agile and Scrum methodologies. Able to coordinate with teams across multiple locations and time zones. Strong interpersonal and communication …
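The SQL proficiency this role calls for boils down to writing aggregations like the one below. A minimal sketch, using SQLite from the standard library as a stand-in for PostgreSQL, with an invented trades table:

```python
# Minimal SQL aggregation sketch. SQLite (stdlib) stands in for
# PostgreSQL here purely so the example is self-contained; the
# trades table and its rows are invented for illustration.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trader TEXT, volume REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("alice", 100.0), ("bob", 50.0), ("alice", 25.0)],
)
rows = conn.execute(
    "SELECT trader, SUM(volume) AS total FROM trades "
    "GROUP BY trader ORDER BY total DESC"
).fetchall()
# rows == [("alice", 125.0), ("bob", 50.0)]
```

Against PostgreSQL the same query runs unchanged; only the connection (e.g. via psycopg) and the placeholder style differ.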