Telford, Shropshire, England, United Kingdom Hybrid / WFH Options
eTeam Inc
… with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker). Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines. Knowledge of scheduling tools (Airflow). Strong documentation and communication skills. Ability to work collaboratively across multidisciplinary teams. The successful candidate will also need to be SC Vetted. …
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
… execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools (Airflow). Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. Due …
Newport, Midlands, United Kingdom Hybrid / WFH Options
Experis
… execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools (Airflow). Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. Due …
… following skills are desired by the client: Experience in public sector environments. Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools (Airflow). Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. If you are interested in this opportunity, please apply now with your …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Creditsafe
… Business Analyst to translate business needs into documented technical specifications for bespoke solutions. Design, develop, test and release bespoke solutions to meet client expectations using technologies such as Python, Airflow and S3. Provide pre-sale and post-sale support to clients. Review existing implementations and recommend improvements to their efficiency and effectiveness. Contribute to the continuous improvement …
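To give a concrete flavour of the Python-and-S3 work this role describes, here is a minimal sketch of pulling a client extract from S3 with boto3. The bucket and key names are invented placeholders, and the snippet assumes AWS credentials are already configured in the environment - it is an illustration, not the advertiser's actual implementation.

```python
# Minimal sketch: pull a client CSV extract from S3 and load it into memory.
# Bucket/key names are illustrative placeholders, not real resources.
import csv
import io

import boto3  # assumes AWS credentials are configured in the environment


def read_client_extract(bucket: str, key: str) -> list[dict]:
    """Download a CSV object from S3 and return its rows as dictionaries."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(body)))


if __name__ == "__main__":
    rows = read_client_extract("example-client-bucket", "extracts/latest.csv")
    print(f"Loaded {len(rows)} rows")
```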
Telford, Shropshire, West Midlands, United Kingdom
LA International Computer Consultants Ltd
… with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker). Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines. Knowledge of scheduling tools (Airflow). Strong documentation and communication skills. Ability to work collaboratively across multidisciplinary teams. Due to the nature and urgency of this post, candidates holding or who have held high level …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… a London base (flexibility offered). High-impact role with a growing, values-driven data team. Platform-focused, mission-led engineering. Work with a modern cloud-native stack (Snowflake, DBT, Airflow, Terraform, AWS). What You'll Be Doing: Serve as the technical lead for cross-functional data initiatives. Define and champion best practices for building scalable, governed, high-quality data … teams - product managers, analysts, ML engineers, and more. What You'll Bring: Extensive experience designing and building modern data platforms. Strong skills in Python, SQL, and tools like DBT, Airflow, Fivetran. Expertise in cloud services (ideally AWS) and IaC tools like Terraform. Deep understanding of data architecture, ELT pipelines, and governance. A background in software engineering principles (CI/… technical and non-technical stakeholders. A collaborative mindset and passion for coaching others. Tech Environment - Cloud: AWS (Kinesis, Lambda, S3, ECS, etc.); Data Warehouse: Snowflake; Transformation & Orchestration: Python, DBT, Airflow; IaC & DevOps: Terraform, GitHub Actions, Jenkins; Monitoring & Governance: Monte Carlo, Collate. Interested? If you're excited about platform-level ownership, technical influence, and building systems that help people tell …
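For readers unfamiliar with the orchestration side of this stack, the sketch below shows a minimal Airflow DAG that triggers a dbt build, which is broadly the Snowflake/DBT/Airflow pattern the role references. The DAG id, schedule and project paths are assumptions for illustration only, not details from the ad.

```python
# Minimal sketch: an Airflow DAG that runs a dbt build once a day.
# DAG id, schedule and paths are hypothetical; they are not from the job ad.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_build = BashOperator(
        task_id="dbt_build",
        # Assumes dbt is installed on the worker and the project lives at this path.
        bash_command="dbt build --project-dir /opt/airflow/dbt --profiles-dir /opt/airflow/dbt",
    )
```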
… with MLOps practices and model deployment pipelines. Proficient in cloud AI services (AWS SageMaker/Bedrock). Deep understanding of distributed systems and microservices architecture. Expert in data pipeline platforms (Apache Kafka, Airflow, Spark). Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases. Strong containerization and orchestration skills (Docker, Kubernetes). Experience with infrastructure as code (Terraform, CloudFormation …
… using cloud-based architectures and tools. Experience delivering data engineering solutions on cloud platforms, preferably Oracle OCI, AWS, or Azure. Proficient in Python and workflow orchestration tools such as Airflow or Prefect. Expert in data modeling, ETL, and SQL. Experience with real-time analytics from telemetry and event-based streaming (e.g., Kafka). Experience managing operational data stores with high … availability, performance, and scalability. Expertise in data lakes, lakehouses, Apache Iceberg, and data mesh architectures. Proven ability to build, deliver, and support modern data platforms at scale. Strong knowledge of data governance, data quality, and data cataloguing. Experience with modern database technologies, including Iceberg, NoSQL, and vector databases. Embraces innovation and works closely with scientists and partners to explore …
… office). Due to the nature of some of the company's clients, you must have a minimum of 5 years' continuous UK residency. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Are you passionate about building scalable data solutions that drive real … Excellent experience of DBT, SQL and Python. Good customer-facing/pitching experience and being a self-sufficient person. A proactive mindset with excellent problem-solving skills. Experience of Airflow and medallion is desirable. A degree in computer science or a related degree is beneficial. Benefits: Company bonus scheme (based on annual profit made by the company). Pension … with data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins@circlerecruitment.com. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Circle Recruitment is acting as an Employment Agency in relation to …
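As a loose illustration of the medallion (bronze/silver) layering mentioned in this ad, the following pandas sketch standardises raw records into a deduplicated silver table. Column names and cleaning rules are invented for the example and do not come from the role.

```python
# Illustrative sketch of a bronze-to-silver cleaning step in the medallion style,
# using pandas. Column names and rules are invented examples.
import pandas as pd


def bronze_to_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    """Standardise raw 'bronze' records into a deduplicated 'silver' table."""
    silver = bronze.copy()
    silver["customer_id"] = silver["customer_id"].astype("string").str.strip()
    silver["event_ts"] = pd.to_datetime(silver["event_ts"], errors="coerce", utc=True)
    silver = silver.dropna(subset=["customer_id", "event_ts"])
    return silver.drop_duplicates(subset=["customer_id", "event_ts"])


if __name__ == "__main__":
    raw = pd.DataFrame(
        {"customer_id": [" 42", "42", None], "event_ts": ["2024-01-01", "2024-01-01", "bad"]}
    )
    print(bronze_to_silver(raw))
```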
Brighton, Sussex, United Kingdom Hybrid / WFH Options
Burns Sheehan
Lead Data Engineer | £75,000-£85,000 | AWS, Python, SQL, Airflow | Brighton, hybrid working | Analyse customer behaviour using AI & ML. We are partnered with a private equity-backed company who provide an AI-powered, guided selling platform that helps businesses improve online sales and customer experience. They are looking for a Lead Data Engineer to lead a small team … experience in a Senior Data Engineering role. Comfortable owning and delivering technical projects end-to-end. Strong in Python, SQL, and cloud platforms (AWS or comparable). Experience with Airflow, Snowflake, Docker (or similar). Familiarity with coaching and mentoring more junior engineers, leading 1-1s and check-ins. Wider tech stack: AWS, Python, Airflow, Fivetran, Snowflake … Enhanced parental leave and pay. If you are interested in finding out more, please apply or contact me directly! Lead Data Engineer | £75,000-£85,000 | AWS, Python, SQL, Airflow | Brighton, hybrid working | Analyse customer behaviour using AI & ML. Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds.
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
…ll be involved in designing and building production-grade ETL pipelines, driving DevOps practices across data systems and contributing to high-availability architectures using tools like Databricks, Spark and Airflow - all within a modern AWS ecosystem. Responsibilities: Architect and build scalable, secure data pipelines using AWS, Databricks and PySpark. Design and implement robust ETL/ELT solutions for both … structured and unstructured data. Automate workflows and orchestrate jobs using Airflow and GitHub Actions. Integrate data with third-party APIs to support real-time marketing insights. Collaborate closely with cross-functional teams including Data Science, Software Engineering and Product. Champion best practices in data governance, observability and compliance. Contribute to CI/CD pipeline development and infrastructure automation (Terraform …
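The sketch below illustrates the shape of a PySpark ETL job of the kind described here - read raw events, transform, write curated Parquet. Paths, column names and the aggregation are assumptions for illustration, not the client's actual pipeline.

```python
# A minimal PySpark ETL sketch: extract raw JSON, transform, load curated Parquet.
# Paths and column names are illustrative assumptions, not the advertiser's schema.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Extract: read raw JSON events from object storage (path is a placeholder).
orders = spark.read.json("s3a://example-raw-bucket/orders/")

# Transform: keep completed orders and aggregate daily revenue per market.
daily_revenue = (
    orders.filter(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date", "market")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write partitioned Parquet for downstream consumers (path is a placeholder).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-curated-bucket/daily_revenue/"
)
```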
… oversight across the data platform, including data pipelines, orchestration and modelling. Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker, ensuring the team follows agile methodologies to improve delivery cadence and responsiveness. Contribute to hands-on coding, particularly in areas requiring architectural input, prototyping, or … mentoring skills and ability to foster team growth and development. Strong understanding of the data engineering lifecycle, from ingestion to consumption. Hands-on experience with our data stack (Redshift, Airflow, Python, DVT, MongoDB, AWS, Looker, Docker). Understanding of data modelling, transformation, and orchestration best practices. Experience delivering both internal analytics platforms and external data-facing products. Knowledge of modern …
… troubleshoot, and manage daily Data Ops, including ETL workflows and Power BI releases. Be the escalation point for Level 2/3 support, resolving issues hands-on across AWS (Airflow, S3, Redshift, Glue). Lead and coordinate with a team of 5–6 offshore data engineers and suppliers. Support and automate Power BI deployments and data pipeline releases. Own release … communications and issue resolution across stakeholders. Work closely with the BI Operations Manager and wider tech/data teams. Tech Stack You'll Use - AWS: Glue, S3, Redshift, Airflow. Power BI: deployments, troubleshooting, performance tuning. ETL & Scripting: SQL, Python (desirable). Monitoring & incident response in a live production data environment. What You'll Need: ✅ Extensive experience in analytics engineering. Strong …
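A hedged example of the kind of hands-on AWS check such a support role might run: polling the most recent AWS Glue job run via boto3 and flagging anything that is not healthy. The job name is a hypothetical placeholder, not one of the advertiser's jobs.

```python
# Sketch of a Level 2/3 style check: inspect the latest AWS Glue job run state.
# The job name is a hypothetical placeholder.
import boto3


def latest_glue_run_state(job_name: str) -> str:
    """Return the state of the most recent run for a Glue job (e.g. SUCCEEDED, FAILED)."""
    glue = boto3.client("glue")
    runs = glue.get_job_runs(JobName=job_name, MaxResults=1)["JobRuns"]
    return runs[0]["JobRunState"] if runs else "NO_RUNS"


if __name__ == "__main__":
    state = latest_glue_run_state("example-nightly-etl")  # placeholder job name
    if state not in ("SUCCEEDED", "RUNNING"):
        print(f"Escalate: latest run state is {state}")
```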
Lumi Space is empowering the future prosperity of Earth - making space scalable and sustainable using ground-based laser systems. We work with global companies and institutions to build products and services to precisely track satellites and remove the dangers of …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
PEXA Group Limited
… end data quality, from raw ingested data to business-ready datasets. Optimise PySpark-based data transformation logic for performance and reliability. Build scalable and maintainable pipelines in Databricks and Airflow. Implement and uphold GDPR-compliant processes around PII data. Collaborate with stakeholders to define what "business-ready" means, and confidently sign off datasets as fit for consumption. Put testing … internal and external customers. Skills & Experience Required: Extensive hands-on experience with PySpark, including performance optimisation. Deep working knowledge of Databricks (development, architecture, and operations). Proven experience working with Airflow for orchestration. Proven track record in managing and securing PII data, with GDPR compliance in mind. Experience in data governance processes; Alation experience preferred, but similar tools welcome. Strong SQL …
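To illustrate the GDPR/PII aspect of this role, here is a small PySpark sketch that replaces a direct identifier with a salted SHA-256 digest before a dataset is exposed as business-ready. The column names and salt handling are simplified assumptions; a real implementation would source the salt from a secrets manager and cover all identifying fields.

```python
# Illustrative PySpark snippet: hash a direct identifier before publishing a dataset.
# Column names are assumptions; the salt is hard-coded only for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pii_masking_sketch").getOrCreate()

customers = spark.createDataFrame(
    [("alice@example.com", "Alice", 120.0)],
    ["email", "first_name", "spend"],
)

# Replace the direct identifier with a salted SHA-256 digest; keep non-identifying fields.
salt = F.lit("rotate-me")  # in practice the salt would come from a secrets manager
business_ready = customers.select(
    F.sha2(F.concat(F.col("email"), salt), 256).alias("customer_key"),
    F.col("spend"),
)
business_ready.show(truncate=False)
```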