AWS Data Engineer - Glasgow and remote - 10 months+

RATE: £306.25 per day inside IR35

One of our blue-chip clients is urgently looking for an AWS Data Engineer.

For this role you will need to be onsite in Glasgow 2-3 days per week.

Please find some details below:

CONTRACTOR MUST BE ELIGIBLE FOR BPSS

MUST BE PAYE THROUGH UMBRELLA

Role Description:

We are seeking a highly skilled Senior AWS Data Engineer with strong hands-on experience building scalable, secure, and automated data platforms on AWS. The ideal candidate will have deep expertise in AWS CloudFormation, data ingestion and transformation services, Python-based ETL development, and orchestration workflows. This role will focus on designing, implementing, and optimizing end-to-end data pipelines, ensuring data quality, reliability, and governance across cloud-native environments.

Key Responsibilities
Data Engineering & Pipeline Development
Design, develop, and maintain large-scale data pipelines using AWS services such as Glue, Lambda, Step Functions, EMR, DynamoDB, S3, Athena, and other ETL/ELT components.
Build automated ingestion, transformation, and enrichment workflows for structured and unstructured datasets.
Implement reusable data engineering frameworks and modular components using Python, PySpark, and AWS-native tooling.

Cloud Infrastructure for Data Platforms
Develop and manage AWS CloudFormation templates for provisioning secure, scalable data engineering infrastructure.
Optimize data storage strategies (S3 layouts, partitioning, compression, lifecycle rules).
Configure and maintain compute services for data workloads (Lambda, ECS, EC2, EMR).

Automation & Orchestration
Build and enhance orchestration flows using AWS Step Functions, EventBridge, and Glue Workflows.
Implement CI/CD practices for data pipelines and infrastructure automation.

Security, Governance & Best Practices
Apply strong authentication/authorization mechanisms using IAM, KMS, access policies, and data access controls.
Ensure compliance with enterprise security standards, encryption requirements, and governance frameworks.
Implement data quality checks, schema validation, lineage tracking, and metadata management.

Collaboration & Troubleshooting
Work with data architects, platform engineers, analysts, and cross-functional stakeholders to deliver high-quality datasets.
Troubleshoot pipeline issues, optimize performance, and improve reliability and observability across the data platform.
Drive continuous improvement in automation, monitoring, and operational efficiency.

Required Skills & Experience
8+ years of hands-on experience as a Data Engineer with strong AWS expertise.
Expert-level proficiency in AWS CloudFormation (mandatory).
Strong experience with AWS data and compute services:
Glue, Lambda, Step Functions, EMR
S3, DynamoDB, Athena
ECS/EC2 for data workloads where relevant
Solid experience building ETL/ELT pipelines using Python (and ideally PySpark).
Strong knowledge of IAM, KMS, encryption, and AWS security fundamentals.
Ability to design and implement authentication/authorization patterns (OAuth2, API security, IAM roles & policies).
Strong understanding of distributed systems, data modelling, modern data architectures, and cloud-native design.
Experience deploying pipelines using CI/CD practices and automated workflows.

Good to Have
Experience with monitoring and observability tools (CloudWatch, Prometheus, Grafana).
Exposure to serverless data architectures.
Hands-on experience in cloud migration, legacy-to-cloud data movement, or enterprise-scale transformations.
Familiarity with data catalogues, lineage tools, and governance frameworks.

Please send CV for full details and immediate interviews. We are a preferred supplier to the client.

Job Details

Company
Octopus Computer Associates
Location
Glasgow, Lanarkshire, United Kingdom G32 0
Hybrid / Remote Options
Employment Type
Contract
Salary
GBP Daily
Posted