Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools - Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. Due …
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Creditsafe
Business Analyst to translate business needs into documented technical specifications for bespoke solutions. Design, develop, test and release bespoke solutions to meet client expectations using technologies such as Python, Airflow and S3. Provide pre-sale and post-sale support to clients. Review existing implementations and make recommendations for improvements to their efficiency and effectiveness. Contribute to the continuous improvement …
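To give a concrete flavour of the Python/Airflow/S3 stack this role mentions, here is a minimal sketch of a daily DAG that lands a small export in S3. The DAG id, bucket, and key are hypothetical, and it assumes Airflow 2.4+ with boto3 credentials configured in the environment.

```python
# Hypothetical sketch only: a daily Airflow DAG that builds a toy payload
# and uploads it to S3. A real task would query a source system instead.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def export_to_s3(**context):
    payload = ('{"run_date": "%s"}' % context["ds"]).encode("utf-8")
    s3 = boto3.client("s3")
    # Bucket and key are placeholders, not taken from the posting.
    s3.put_object(
        Bucket="example-client-bucket",
        Key=f"exports/{context['ds']}.json",
        Body=payload,
    )


with DAG(
    dag_id="client_export",           # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="export_to_s3", python_callable=export_to_s3)
```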
Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing …
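As an illustration of what hands-on Spark-on-S3 work typically looks like, a small PySpark batch aggregation; the paths, columns, and app name below are invented, not taken from the advert.

```python
# Illustrative PySpark batch job: filter completed orders and roll them up
# by day, reading from and writing back to an S3 data lake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

orders = spark.read.parquet("s3a://example-lake/raw/orders/")  # placeholder path

daily = (
    orders
    .filter(F.col("status") == "complete")
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.count("*").alias("order_count"),
    )
)

daily.write.mode("overwrite").parquet("s3a://example-lake/marts/orders_daily/")
```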
fully documented and meet appropriate standards for security, resilience and operational support. Skills & Experience Required Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and VPC networking. Experience integrating data … Terraform (CDKtf) and AWS CDK with TypeScript. Ability to clearly document technical solutions and communicate with both technical and non-technical stakeholders. Desirable: Experience with job orchestration tools (e.g., Airflow, AWS Step Functions). Exposure to finance data structures or ERP systems (e.g., Oracle Fusion). Familiarity with CI/CD pipelines and deployment strategies in a cloud environment. Monitoring and …
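For flavour on the Databricks/Delta Lake side of that skill set, a hedged sketch of one common pattern: merging (upserting) incoming records into a Delta table. The table path, key column, and source data are assumptions, and it presumes a runtime with the delta package available.

```python
# Sketch of a Delta Lake upsert: new customer records are merged into an
# existing Delta table keyed on customer_id. All names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("s3a://example-bucket/incoming/customers/")
target = DeltaTable.forPath(spark, "s3a://example-bucket/delta/customers")

(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()      # refresh existing rows
    .whenNotMatchedInsertAll()   # insert brand-new rows
    .execute()
)
```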
class-leading data and ML platform infrastructure, balancing maintenance with exciting greenfield projects. Develop and maintain our real-time model serving infrastructure, utilising technologies such as Kafka, Python, Docker, Apache Flink, Airflow, and Databricks. Actively assist in model development and debugging using tools like PyTorch, Scikit-learn, MLFlow, and Pandas, working with models from gradient boosting classifiers to …
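A minimal sketch of the sort of real-time scoring loop such a role implies: consume events from Kafka and score them with a pre-trained scikit-learn model. The topic, feature fields, and model file are hypothetical, and the confluent-kafka client is one reasonable choice rather than anything the advert specifies.

```python
# Hypothetical serving loop: poll Kafka for JSON events and emit a score
# from a pickled scikit-learn classifier.
import json
import pickle

from confluent_kafka import Consumer

with open("model.pkl", "rb") as f:   # placeholder model artifact
    model = pickle.load(f)

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "scoring-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])       # placeholder topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        features = json.loads(msg.value())
        # Assumed feature names; a real pipeline would validate the schema.
        score = model.predict_proba([[features["f1"], features["f2"]]])[0][1]
        print(f"scored event: {score:.3f}")
finally:
    consumer.close()
```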
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
a London base (flexibility offered). High-impact role with a growing, values-driven data team. Platform-focused, mission-led engineering. Work with a modern cloud-native stack (Snowflake, DBT, Airflow, Terraform, AWS). What You'll Be Doing: Serve as the technical lead for cross-functional data initiatives. Define and champion best practices for building scalable, governed, high-quality data … teams - product managers, analysts, ML engineers, and more. What You'll Bring: Extensive experience designing and building modern data platforms. Strong skills in Python, SQL, and tools like DBT, Airflow, Fivetran. Expertise in cloud services (ideally AWS) and IaC tools like Terraform. Deep understanding of data architecture, ELT pipelines, and governance. A background in software engineering principles (CI/… technical and non-technical stakeholders. A collaborative mindset and passion for coaching others. Tech Environment: Cloud: AWS (Kinesis, Lambda, S3, ECS, etc.). Data Warehouse: Snowflake. Transformation & Orchestration: Python, DBT, Airflow. IaC & DevOps: Terraform, GitHub Actions, Jenkins. Monitoring & Governance: Monte Carlo, Collate. Interested? If you're excited about platform-level ownership, technical influence, and building systems that help people tell …
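For a concrete taste of the Snowflake corner of that stack, a minimal query through the official Python connector; account, credentials, warehouse, and table are placeholders only.

```python
# Minimal Snowflake query via the official connector. Every identifier here
# is a placeholder; real credentials would come from a secrets manager.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="PLATFORM_SVC",
    password="********",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT order_date, revenue FROM orders_daily "
        "ORDER BY order_date DESC LIMIT 10"
    )
    for order_date, revenue in cur:
        print(order_date, revenue)
finally:
    conn.close()
```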
and data best practices that will be used across the organisation, including taking ownership of our data transformation and orchestration tooling; batch and streaming infrastructure and exploration tools (Databricks, Airflow, dbt); and looking after our data lake (ingestion, storage, governance, privacy). You'll work with a modern, cutting-edge data stack and play a key role in shaping data … data users: ML Engineers, analysts, analytics engineers - and have a strong grasp of their needs and how they operate. Big data technologies, with expertise in tools & platforms such as Airflow, dbt, Kafka, Databricks and data observability & catalogue solutions (e.g. Monte Carlo, Atlan, Datahub). Cloud Platform Proficiency: Familiarity with AWS, GCP, or Microsoft Azure, with hands-on experience building scalable …
with MLOps practices and model deployment pipelines. Proficient in cloud AI services (AWS SageMaker/Bedrock). Deep understanding of distributed systems and microservices architecture. Expert in data pipeline platforms (Apache Kafka, Airflow, Spark). Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases. Strong containerization and orchestration skills (Docker, Kubernetes). Experience with infrastructure as code (Terraform, CloudFormation …
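To ground the Kafka item in that list, a companion producer sketch using the confluent-kafka client; the broker address, topic, and payload are invented for illustration.

```python
# Hypothetical producer: publish a JSON event to a Kafka topic, keyed by
# user id so related events land on the same partition.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

event = {"user_id": 42, "f1": 0.7, "f2": 1.3}   # invented payload
producer.produce("events", key=str(event["user_id"]), value=json.dumps(event))
producer.flush()   # block until delivery completes
```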
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
support experimentation and deployment. 🛠️ Key Responsibilities: Build and maintain high-performance data pipelines to power AI/ML use cases. Architect cloud-native data platforms using tools like Databricks, Airflow, Snowflake, and Spark. Collaborate with AI/ML teams to align data processing with model requirements. Develop ETL/ELT workflows to support feature engineering, model training, and inference … Java. Experience supporting AI/ML workflows and working with Data Scientists. Exposure to cloud platforms: AWS, Azure, or GCP. Hands-on with modern data tooling: Spark, Databricks, Snowflake, Airflow. Solid grasp of data modelling, orchestration, and infrastructure-as-code (Terraform, Docker, CI/CD). Excellent communication and client-facing skills; comfortable leading on technical delivery. 🎁 What's on …
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
colleagues. Nice to Have Skills: Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience …
Brighton, Sussex, United Kingdom Hybrid / WFH Options
Burns Sheehan
Lead Data Engineer £75,000-£85,000 | AWS, Python, SQL, Airflow | Brighton, hybrid working | Analyse customer behaviour using AI & ML. We are partnered with a private equity backed company who provide an AI-powered, guided selling platform that helps businesses improve online sales and customer experience. They are looking for a Lead Data Engineer to lead a small team … experience in a Senior Data Engineering role. Comfortable owning and delivering technical projects end-to-end. Strong in Python, SQL, and cloud platforms (AWS or comparable). Experience with Airflow, Snowflake, Docker (or similar). Familiarity with coaching and mentoring more junior engineers, leading 1-1s and check-ins. Wider tech stack: AWS, Python, Airflow, Fivetran, Snowflake … Enhanced parental leave and pay. If you are interested in finding out more, please apply or contact me directly! Lead Data Engineer £75,000-£85,000 | AWS, Python, SQL, Airflow | Brighton, hybrid working | Analyse customer behaviour using AI & ML. Burns Sheehan Ltd will consider applications based only on skills and ability and will not discriminate on any grounds.
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
You'll be involved in designing and building production-grade ETL pipelines, driving DevOps practices across data systems and contributing to high-availability architectures using tools like Databricks, Spark and Airflow, all within a modern AWS ecosystem. Responsibilities: Architect and build scalable, secure data pipelines using AWS, Databricks and PySpark. Design and implement robust ETL/ELT solutions for both structured and unstructured data. Automate workflows and orchestrate jobs using Airflow and GitHub Actions. Integrate data with third-party APIs to support real-time marketing insights. Collaborate closely with cross-functional teams including Data Science, Software Engineering and Product. Champion best practices in data governance, observability and compliance. Contribute to CI/CD pipeline development and infrastructure automation (Terraform …
trusted, reliable and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platforms that enable a connected collection of … and driving a culture of iterative improvement. Modern data stack - hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, DBT, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Thought leadership and influencing - deep interest in the data platforms landscape to build well-articulated proposals that are supported by strong …
Join our rapidly expanding team as a hands-on Cloud Data Analytics Platform Engineer and play a pivotal role in shaping the future of data at Citi. We're building a cutting-edge, multi-cloud data analytics platform that empowers …
Engineering Manager, Martech - Depop. Partially remote; locations: London, UK / Remote, UK. Full time. Job requisition id: JR4087. Company Description: Depop is the community-powered …
Senior Data Engineer 100% Remote B2B Contract Full-time position with flexible working hours (overlap with US required) We're looking for a Senior Data Engineer for a company that facilitates freelancing and remote work. Their platform provides a marketplace …
position? We are seeking a passionate DataOps Engineer who loves optimizing pipelines, automating workflows, and scaling cloud-based data infrastructure. Key Responsibilities: Design, build, and optimize data pipelines using Airflow, DBT, and Databricks. Monitor and improve pipeline performance to support real-time and batch processing. Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as … experience supporting high-velocity data/development teams and designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows using tools like Airflow, DBT, Databricks, and Snowflake while maintaining data integrity, security, and performance. Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience. Minimum of … years of experience in DataOps or similar. Proficiency in key technologies, including Airflow, Snowflake, and SageMaker. Certifications in AWS/Snowflake/other technologies are a plus. Excellent communication and interpersonal skills. Ability to work in a fast-paced environment and manage multiple priorities effectively. What's in it for you? We offer our employees more than just competitive compensation.
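As a sketch of the S3-plus-Lambda piece of that infrastructure, a minimal Python handler reacting to object-created events; trigger wiring, IAM, and deployment are out of scope, and every name is a placeholder.

```python
# Hypothetical Lambda handler: log basic metadata for each newly created
# S3 object in the triggering event.
import urllib.parse

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        head = s3.head_object(Bucket=bucket, Key=key)
        print(f"new object s3://{bucket}/{key}, {head['ContentLength']} bytes")
```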
Greater London, England, United Kingdom Hybrid / WFH Options
Harnham
and resolve ETL failures or data issues. Collaborate with cross-functional and offshore teams, as well as suppliers. Hands-on support for tools like Power BI, AWS, SQL and Airflow. Staying ahead of emerging AI tech and research to propose exciting solutions. Proactively manage and escalate data issues. SKILLS AND EXPERIENCE Required: 5+ years industry experience (flexible depending on … quality of experience). Airflow (must-have), AWS (Redshift, S3, Glue), Power BI. Strong SQL, as well as Python. AWS ecosystem familiarity is essential. Both hands-on and management/leadership experience is required. Able to work in a fast-paced, dynamic environment. This includes a two-stage interview process! This role cannot offer sponsorship. Apply below.
Karlsruhe, Baden-Württemberg, Germany Hybrid / WFH Options
Cinemo GmbH
.000 € per year. Requirements: Minimum 1 to 2 years of proven experience in MLOps, including end-to-end machine learning lifecycle management. Familiarity with MLOps tools like MLFlow, Airflow, Kubeflow or custom-implemented solutions. Experience designing and managing CI/CD pipelines for machine learning projects, with experience in CI/CD tools (e.g., GitHub Actions, Bitbucket Pipelines … workflows. Automate repetitive and manual processes involved in machine learning operations to improve efficiency. Implement and manage in-cloud MLOps solutions, leveraging Terraform for infrastructure as code. Technologies: Airflow, AWS, BitBucket, CI/CD, Cloud, Embedded, GitHub, Support, Kubeflow, Machine Learning, Mobile, Python, Terraform, C++, DevOps. More: Cinemo is a global provider of highly innovative infotainment products that …
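As a flavour of the MLflow portion of that toolchain, a minimal tracking sketch: log parameters, a metric, and the fitted model for one training run. The dataset and model choice are illustrative, not from the posting.

```python
# Minimal MLflow tracking example on a synthetic dataset.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")   # stores the fitted model artifact
```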