Senior AWS Migration Engineer, contract, outside IR35, London & remote. We are seeking a Senior AWS Migration Engineer to join our team on a contract basis. You will play a key role in planning, executing, and optimizing cloud migration projects, specifically leveraging the full AWS service stack with a strong emphasis on … of applications, databases and services from on-premises or other cloud environments to AWS. Architect and design serverless and traditional cloud-native solutions utilizing AWS Lambda, API Gateway, DynamoDB, Step Functions, and related services. Implement and optimize AWS infrastructure services (EC2, S3, RDS, VPC, CloudFormation, IAM … runbooks. Automate infrastructure deployment and management through Infrastructure as Code (IaC) using CloudFormation, Terraform, or CDK. Ensure high availability, scalability, performance, and security of AWS-hosted applications. Troubleshoot migration issues and optimize the transition process. Provide documentation, technical guidance, and mentorship to internal teams. Contribute to cloud best practices …
Lead AWS Data Engineer 📍 City of London (Hybrid: Around 2-3 office days per week) 💰 Salary: Up to £110,000 🏦 Industry: FinTech/RegTech. A 200-employee data and technology company specialising in Regulatory Technology (RegTech) are looking for a Lead AWS Data Engineer. They develop products that … team of onshore and offshore Data Engineers, driving technical growth and ensuring effective collaboration. Design, develop, and optimise scalable data pipelines and infrastructure using AWS (Glue, Athena, Redshift, Kinesis, Step Functions, Lake Formation). Utilise PySpark for distributed data processing, ETL, SQL querying, and real-time data … streaming. Establish and enforce best practices in data engineering, coding standards, and architecture guidelines. Build and manage data lake architectures on AWS to support scalable, secure, and cost-efficient storage and processing of structured and unstructured data. Manage team tasks and delivery using Agile methodologies, collaborating closely with …
platform. Key Skills: Experience in Data Engineering, including data integration, modelling, optimisation and data quality; exceptional understanding of Python; experience developing in the cloud (AWS preferred); solid understanding of libraries like Pandas and NumPy; experience in data warehousing tools like Snowflake, Databricks, BigQuery; familiarity with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools; commercial experience with performant database programming in SQL; the ability to solve complex technical issues and assess risks before they arise; comfort working in an agile environment, where features may change and evolve quickly; prior experience in financial services, specifically capital …
foundation that powers the entire technology stack, from AI agents to APIs to workflows to integrations. Set up and manage the cloud infrastructure using AWS, Docker containerisation, and Infrastructure-as-Code with Terraform. Build robust data integration infrastructure to connect the platform with various customer systems (TMS, RMS, etc. … that enables our AI-driven workflows. Skills and Qualifications: Experience building end-to-end platform solutions that integrate workflow orchestration systems (like Airflow, Temporal, AWS Step Functions) with real-world business processes and data pipelines; a strong background in integration engineering and data modelling; exceptional Python skills for … building APIs, services, and data processing pipelines; experience with cloud infrastructure (AWS) and infrastructure-as-code (Terraform), including implementing monitoring and observability; knowledge of AI engineering workflows, particularly understanding how to build reliable systems around LLMs; a problem-solving mindset and the ability to navigate the challenges of creating a reliable …
for excellence, innovation, and client service. With a strong presence in London, they are looking to expand their data engineering capabilities to support critical business functions, regulatory reporting, and data-driven decision-making. Role Overview: They are seeking a highly skilled AWS Data Engineer to join the London-based … to work in a fast-paced, high-impact environment where technology drives the business. Key Responsibilities: Design, build, and maintain robust data pipelines using AWS native services (Glue, Lambda, Step Functions, S3, etc.); develop and optimize data lake and data warehouse solutions using Redshift, Athena, and related … troubleshoot, and improve the performance and reliability of data systems. Required Skills & Experience: Proven experience as a Data Engineer working in cloud-native environments (AWS preferred); strong proficiency with Python and SQL; extensive hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake …
growing team. This role is ideal for a technically strong, hands-on professional with proven experience designing and delivering modern data solutions, particularly within AWS environments, and leading technical teams. What You'll Do: Design and build scalable data solutions using AWS services (Redshift, Glue, S3, Step Functions). Lead the development of robust ETL/ELT workflows with AWS Glue and Python. Manage and mentor Data Engineers and Analysts across multiple projects. Collaborate with DevOps to maintain CI/CD pipelines (Jenkins, GitHub, AWS tools). Align data strategy with business goals … classification). Eligibility for SC Clearance. Proven experience managing technical teams in a data engineering or consultancy environment. Strong hands-on experience with AWS data services and relational databases. Skilled in SQL, Python, and data pipeline development. Experience with CI/CD practices, DevOps tools, and serverless orchestration. Confident …
Position: AWS DevOps Lead Employment Type: Permanent, Full‑time Start: ASAP Location: UK (Hybrid) Language: English. The Role: As Lead DevOps Engineer, you will own the end‑to‑end design, implementation and continuous improvement of our cloud infrastructure and delivery pipelines. You’ll guide multidisciplinary Agile squads, embed DevOps … and deploy microservice applications in line with 12‑factor app principles. Infrastructure Automation: Define and implement Infrastructure as Code using Terraform and Ansible for AWS environments. Container & Orchestration: Design, deploy and manage Docker containers on Kubernetes (and Rancher), ensuring high availability and efficient resource usage. CI/CD Pipelines: Create and maintain GitLab CI/CD or Jenkins pipelines for automated build, test, security scans (Blackduck, Checkmarx, SonarQube) and deployment workflows. AWS Architecture: Architect and operate AWS services, including VPC, EC2, EBS, Route 53, WAF, ALB/ELB, Network ACLs, Security Groups, KMS and S3, to meet …
reliable, and secure data systems that align with wider business goals. You’ll be responsible for developing and maintaining enterprise-scale data infrastructure within AWS, integrating data from numerous internal and external sources. The Data Platform functions as a core hub, making it easy and safe for teams to use and contribute to data systems. You’ll work with services like Lambda, S3, Lake Formation, Glue, Step Functions, Athena, EventBridge, SNS, SQS, and DynamoDB, and will be expected to navigate and manage data systems with a high degree of rigour and compliance. Familiarity with additional tools such … realise the full potential of the organisation’s data assets. This role demands a strong foundation in data architecture, excellent development skills, and deep AWS proficiency. You should be adept at designing serverless systems, building resilient data pipelines, and leading technical conversations around system architecture. The ideal candidate will …
solutions, and driving greenfield projects in dynamic environments. You will work directly with clients to define requirements, architect and implement data solutions using modern AWS technologies, and support knowledge transfer to in-house teams. The ideal candidate is highly autonomous, curious, and comfortable working in both technical and consultative capacities. Key Responsibilities: • Design, develop, and deploy data pipelines using AWS tools such as S3, Redshift, Glue, Lambda, Step Functions, and DynamoDB • Build robust ETL/ELT solutions with Matillion for cloud data warehousing • Collaborate with client stakeholders to refine data needs and deliver production-ready solutions … and maintain metadata, data dictionaries, and schema documentation Required Experience: • Strong experience with data engineering in a cloud-first environment • Hands-on expertise in the AWS data stack, particularly Glue, Redshift, S3, and Matillion • Deep understanding of relational databases and data modelling • Comfortable with Unix/Linux environments and scripting …
Go templating. You will join a team of engineers ranging from junior to lead. You will report to the founder of the company, who functions as Architect and Principal Engineer. To some of our clients, you will be the face of our company; professionalism and communication are key in this role. Required skills and experience: Minimum five years of experience as a Senior DevOps Engineer; experience in a consultancy setting. Highly sought: a current AWS certification, e.g. Solutions Architect (Associate or Professional) or DevOps Professional, or GCP equivalents; Terraform/OpenTofu; Docker and Kubernetes, preferably with Certified Kubernetes Administrator … real-world experience will be considered in lieu of a degree. Cloud skills: We are multi-cloud; however, most of our work is with AWS and GCP, and you should have experience of working with the following cloud services: ECS, EKS, GKE, ECR, GAR, Lambda, Step Functions …
engineering expertise with strategic leadership, guiding complex engagements from RFI and RFP through to successful delivery. This is an opportunity for a highly capable AWS Data Engineer who has operated at Senior Manager or upper-level Manager grade within a leading consultancy. You will bring robust stakeholder engagement skills … delivery teams, and a genuine passion for cloud-native solutions. What You'll Be Doing: Leading the design and delivery of end-to-end AWS-based data solutions (S3, Redshift, Glue, Lambda, Step Functions, Matillion, DynamoDB, etc.). Acting as a senior technical SME in RFI and … constraints, and guide architectural decisions. Supporting practice growth initiatives and contributing to reusable assets and accelerators. What We're Looking For: Extensive hands-on AWS data engineering expertise including experience with S3, Glue, Redshift, Lambda, and Matillion. Demonstrated leadership within a consultancy or delivery-focused organisation at SM/…