Senior Technical Lead
Location: Norwich, Norfolk (3 Days a week)
Job Type: Contract (Inside IR35)
Duration: 6 Months

The Role
We are looking for a Senior Technical Lead who combines hands-on engineering excellence with strong leadership and stakeholder management. You will own the end-to-end technical delivery of data platforms and pipelines built on AWS, with a focus on AWS Glue, Managed Workflows for Apache Airflow (MWAA), and Python, and collaborate closely with Directors, Senior Architects, and Program Leadership to deliver business outcomes at scale.
This is a player-coach role: you will design, build, review, and optimize complex data workflows while mentoring engineers and driving engineering best practices.
Ideal for: Someone who has delivered multiple production programs in a modern AWS data engineering landscape, can communicate trade-offs clearly to senior stakeholders, and can lead teams through ambiguity to predictable, high-quality outcomes.
Your Responsibilities:
- Lead the design and implementation of scalable, secure, and cost-efficient ETL/ELT pipelines using AWS Glue, Python (PySpark), and MWAA (Airflow).
- Define solution architectures, data models, orchestration patterns, and CI/CD for data workflows.
- Own the technical roadmap, decomposition, and delivery plan, including sizing, sprint planning, and risk mitigation.
- Drive performance optimization (e.g., partitioning strategies, Glue job tuning, job bookmarks, DynamicFrames vs. DataFrames, retry/backoff strategies in Airflow).
- Ensure robust observability (logging, metrics, tracing) and data quality (unit tests, Great Expectations/Deequ-style checks, validations).
- Act as the technical point of contact for Senior Architects and Program Managers; translate business needs into technical designs and delivery milestones.
- Present architecture decisions, trade-offs, and TCO to senior stakeholders with clarity, data, and rationale.
- Manage vendor/partner coordination where relevant.
- Establish coding standards, code review practices, branching strategies, and secure-by-design principles.
- Implement DevSecOps for data: infrastructure-as-code (IaC), secrets management, environment promotion, and automated testing.
- Ensure compliance with data governance, security, and regulatory requirements (e.g., PII/PCI, encryption, auditability, lineage).
- Mentor and upskill engineers; foster a culture of learning, ownership, and continuous improvement.
Essential skills/knowledge/experience:
- 10+ years of total experience in software/data engineering, with 5+ years leading delivery of production solutions in an AWS data engineering environment.
Advanced hands-on expertise with:
- Python (including PySpark & data engineering patterns)
- AWS Glue (Jobs, Crawlers, Glue Studio, Glue Catalog, PySpark, Job bookmarks)
- MWAA (Apache Airflow) (DAG design, scheduling, sensors, retries, XComs, task isolation, best practices)
Strong across broader AWS services:
- S3, Lambda, Step Functions, IAM, CloudWatch, KMS, Secrets Manager, Athena, EMR (nice to have), Redshift (nice to have)
- Proven experience delivering multiple end-to-end programs (architecture → build → test → deploy → operate) with measurable outcomes (SLAs, cost targets, performance).
- Excellent stakeholder communication and executive presence; able to engage Directors, Senior Architects, and Program Leadership.
- Solid grounding in data modeling, data governance, security/compliance, and cost optimization on AWS.
- Experience with CI/CD (e.g., CodePipeline/GitHub Actions/Bitbucket Pipelines), IaC (CloudFormation/Terraform), and containerization (Docker).
- Architectural thinking: designs for scale, reliability, cost, and evolvability.
- Delivery excellence: breaks down complex work, sets milestones, manages risks, and delivers on time.
- Communication & influence: distills complexity for senior stakeholders; backs decisions with data.
- Hands-on leadership: sets the technical bar through reviews, pairing, and exemplars.
- Ownership & clarity: aligns teams on problem statements, success criteria, and measurable outcomes.
Languages:
- Python (PySpark), SQL
AWS:
- Glue, MWAA (Airflow), S3, IAM, KMS, CloudWatch, Lambda, Step Functions, Athena, Redshift (nice to have), EMR (nice to have)
DevOps:
- Git, CI/CD (CodePipeline/GitHub Actions), Terraform/CloudFormation, Docker
Data Quality/Observability:
- Great Expectations/Deequ (nice to have), OpenLineage (nice to have)
Desirable skills/experience:
- Domain experience in BFSI (risk, pricing, regulatory reporting, underwriting, fraud, payments, or actuarial data).
- Experience with event-driven and near-real-time pipelines (Kafka/Kinesis, streaming ETL).
- Knowledge of data quality frameworks (Great Expectations, Deequ) and data lineage/catalog (Atlas, Alation, Collibra).
- Exposure to Databricks or EMR for advanced Spark workloads.
- Certifications: AWS Solutions Architect / Data Analytics / DevOps Engineer.
- Prior experience leading multi-team programs with offshore/nearshore models.