Telford, Shropshire, West Midlands, United Kingdom Hybrid / WFH Options
Experis
…execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools such as Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. …
…pressure and manage competing priorities. Desirable Qualifications: Experience in public sector environments. Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools such as Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. If you are interested in this position and would like to learn more …
Newport, Midlands, United Kingdom Hybrid / WFH Options
Experis
…execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools such as Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. All profiles will be reviewed against the required skills and experience. …
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates the building of reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer …
…following skills are desired by the client: Experience in public sector environments. Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools such as Airflow. Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC Vetted. If you are interested in this opportunity, please apply now with your …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Creditsafe
…Business Analyst to translate business needs into documented technical specifications for bespoke solutions. Design, develop, test and release bespoke solutions to meet client expectations using technologies such as Python, Airflow and S3. Provide pre-sale and post-sale support to clients. Review existing implementations and make recommendations for improving their efficiency and effectiveness. Contribute to the continuous improvement …
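As a concrete illustration of that stack (not Creditsafe's actual code), here is a minimal sketch of an Airflow DAG that transforms a client extract in Python and lands it in S3. The DAG id, bucket, and inline rows are all hypothetical, and Airflow 2.x is assumed.

```python
# Minimal sketch of a bespoke Python/Airflow/S3 pipeline (Airflow 2.x assumed).
# All names (DAG id, bucket, key) are illustrative, not a real implementation.
import csv
import io
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_and_upload():
    # Stand-in for a real transformation step: build a small CSV in memory
    rows = [{"client_id": 1, "score": 82}, {"client_id": 2, "score": 67}]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["client_id", "score"])
    writer.writeheader()
    writer.writerows(rows)
    # Land the output in S3 (bucket and key are assumptions)
    boto3.client("s3").put_object(
        Bucket="client-deliverables",
        Key=f"exports/{datetime.utcnow():%Y-%m-%d}/scores.csv",
        Body=buf.getvalue(),
    )


with DAG(
    dag_id="bespoke_client_export",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform_and_upload", python_callable=transform_and_upload)
```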
Telford, Shropshire, West Midlands, United Kingdom
LA International Computer Consultants Ltd
…with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker). Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines. Knowledge of scheduling tools such as Airflow. Strong documentation and communication skills. Ability to work collaboratively across multidisciplinary teams. Due to the nature and urgency of this post, candidates holding or who have held high level …
…with clients - Collaborating with cross-functional teams to deploy and operate solutions in production - Supporting real-time and near-real-time data analytics initiatives - Leveraging orchestration tools such as Airflow, Dagster, Azure Data Factory or Fivetran. Required qualifications to be successful in this role: - Solid experience designing and delivering Snowflake-based data warehouse solutions - Strong background performing architectural assessments … Python, Java or Scala - Hands-on experience using DBT for pipeline development and transformation - Familiarity with cloud platforms such as AWS, Azure or GCP - Knowledge of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, Fivetran). Desirable: - Experience deploying AI/ML models in production environments - Familiarity with AWS data services (e.g., S3, Glue, Kinesis, Athena) - Exposure to real-time …
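For orientation, a hedged sketch of what "orchestration tooling driving DBT" can look like in practice, here using Airflow's BashOperator to run and test a dbt project. The project path and Snowflake target are assumptions, not this client's setup.

```python
# Sketch only: scheduling dbt runs against Snowflake from Airflow.
# Assumes a dbt project at /opt/dbt/analytics and a "snowflake"
# target defined in profiles.yml; neither comes from the listing.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target snowflake",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --target snowflake",
    )
    dbt_run >> dbt_test  # only test models after they have been built
```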
Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing …
…fully documented and meet appropriate standards for security, resilience and operational support. Skills & Experience Required. Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and VPC networking. Experience integrating data … Terraform (CDKtf) and AWS CDK with TypeScript. Ability to clearly document technical solutions and communicate with both technical and non-technical stakeholders. Desirable: Experience with job orchestration tools (e.g., Airflow, AWS Step Functions). Exposure to finance data structures or ERP systems (e.g., Oracle Fusion). Familiarity with CI/CD pipelines and deployment strategies in a cloud environment. Monitoring and …
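By way of illustration (paths, columns, and table names are invented), a minimal PySpark sketch of the Databricks pattern the role describes: read raw files from S3, clean them, and append to a Delta Lake table.

```python
# Hypothetical sketch of a Databricks ingestion step using Spark and
# Delta Lake. Bucket, columns, and table names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance_ingest").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://finance-landing/exports/")  # illustrative S3 landing path
)

cleaned = (
    raw.dropDuplicates(["txn_id"])                   # keeps re-runs idempotent
    .withColumn("loaded_at", F.current_timestamp())  # audit column
)

# Delta is available by default on Databricks clusters
cleaned.write.format("delta").mode("append").saveAsTable("finance.transactions_raw")
```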
…class-leading data and ML platform infrastructure, balancing maintenance with exciting greenfield projects. Develop and maintain our real-time model serving infrastructure, utilising technologies such as Kafka, Python, Docker, Apache Flink, Airflow, and Databricks. Actively assist in model development and debugging using tools like PyTorch, Scikit-learn, MLFlow, and Pandas, working with models from gradient boosting classifiers to …
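To make the serving side concrete, a deliberately simplified sketch (the kafka-python client, topic, feature names, and model artifact are all assumptions): consume events from Kafka and score them with a gradient boosting classifier. A production system would more likely sit behind Flink or Databricks jobs, as the listing suggests.

```python
# Illustrative real-time scoring loop; nothing here is the team's real code.
import json

import joblib
from kafka import KafkaConsumer  # kafka-python client

model = joblib.load("gradient_boosting.joblib")  # hypothetical trained model

consumer = KafkaConsumer(
    "user-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for msg in consumer:
    # Assemble the feature vector the model was trained on (assumed shape)
    features = [[msg.value["age"], msg.value["session_count"]]]
    risk = model.predict_proba(features)[0][1]
    print(f"event={msg.value['id']} risk={risk:.3f}")
```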
…respond to suspicious activity. We own the end-to-end platform that powers our real-time and batch monitoring capabilities, including: a custom alerting and orchestration platform, built on Airflow, that enables scalable, auditable detection pipelines; data pipelines in DBT and Snowflake that serve both ML models and rule-based logic; and backend services and APIs that handle case management … case management, and customer termination. Build and maintain robust, well-tested code with a focus on performance, reliability, and operational efficiency. Maintain and evolve the engineering infrastructure behind our Airflow-based alerting platform, enabling analysts to deploy and manage DAGs safely and effectively. Contribute to the development and maintenance of DBT models and data pipelines integrated with Snowflake to … software design principles. Proficiency in at least one of the following languages: Python, Golang, Java. Experience with multiple languages is a plus. Familiarity with data pipeline tooling such as Airflow and DBT, and cloud data warehouses like Snowflake. Understanding of testing strategies, including unit, integration, and system testing (TDD/BDD is a plus). Experience with CI/…
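As a hedged sketch of the rule-based side of such a platform (the schema and threshold are invented; the real system's DAG structure is not public): a detection task that queries Snowflake for accounts breaching a simple outbound-velocity rule, of the kind an Airflow DAG might run hourly.

```python
# Sketch of a rule-based detection task against Snowflake. Table,
# columns, and threshold are assumptions for illustration only.
import snowflake.connector
from snowflake.connector import DictCursor

RULE_SQL = """
    SELECT account_id, SUM(amount) AS total_out
    FROM payments
    WHERE direction = 'outbound'
      AND created_at >= DATEADD(hour, -1, CURRENT_TIMESTAMP())
    GROUP BY account_id
    HAVING SUM(amount) > 10000
"""


def run_velocity_rule(conn_params: dict) -> list:
    """Return accounts breaching the outbound-velocity threshold."""
    with snowflake.connector.connect(**conn_params) as conn:
        with conn.cursor(DictCursor) as cur:
            cur.execute(RULE_SQL)
            return cur.fetchall()  # one dict per breaching account
```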
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
…a London base (flexibility offered). High-impact role with a growing, values-driven data team. Platform-focused, mission-led engineering. Work with a modern cloud-native stack (Snowflake, DBT, Airflow, Terraform, AWS). What You'll Be Doing: Serve as the technical lead for cross-functional data initiatives. Define and champion best practices for building scalable, governed, high-quality data … teams: product managers, analysts, ML engineers, and more. What You'll Bring: Extensive experience designing and building modern data platforms. Strong skills in Python, SQL, and tools like DBT, Airflow, Fivetran. Expertise in cloud services (ideally AWS) and IaC tools like Terraform. Deep understanding of data architecture, ELT pipelines, and governance. A background in software engineering principles (CI/… technical and non-technical stakeholders. A collaborative mindset and passion for coaching others. Tech Environment: Cloud: AWS (Kinesis, Lambda, S3, ECS, etc.). Data Warehouse: Snowflake. Transformation & Orchestration: Python, DBT, Airflow. IaC & DevOps: Terraform, GitHub Actions, Jenkins. Monitoring & Governance: Monte Carlo, Collate. Interested? If you're excited about platform-level ownership, technical influence, and building systems that help people tell …
…and data best practices that will be used across the organisation, including taking ownership of our data transformation and orchestration tooling, batch and streaming infrastructure, and exploration tools (Databricks, Airflow, dbt), and looking after our Datalake (ingestion, storage, governance, privacy). You'll work with a modern, cutting-edge data stack and play a key role in shaping data … data users: ML Engineers, analysts, analytics engineers, and have a strong grasp of their needs and how they operate. Big data technologies, with expertise in tools & platforms such as Airflow, dbt, Kafka, Databricks, and data observability & catalogue solutions (e.g. Monte Carlo, Atlan, Datahub). Cloud Platform Proficiency: Familiarity with AWS, GCP, or Microsoft Azure, with hands-on experience building scalable …
…and DataOps as well as systems engineers to support both data and application integrations using bespoke tools written in Python/Java, as well as tools such as Meltano, Airflow, MuleSoft/SnapLogic, Apache NiFi, and Kafka, ensuring a robust, well-modelled, and scalable data analytics infrastructure running primarily on MySQL- and Postgres-style databases. Requirements: Advanced SQL … compliance). Proficiency in ELT/ETL processes. Strong experience in data ingestion, transformation & orchestration technology (ETL tools such as Informatica, DataStage, SSIS, etc.) or open-source Meltano, Airbyte, and Airflow. Proven experience with DBT (data build tool). Proficiency with business intelligence tools (Power BI, Tableau, SAP BI, or similar). Integration & Programming: Hands-on experience with API development and … integration (REST/SOAP). Proficiency in at least one object-oriented, procedural, or functional language (e.g., Java, PHP, Python). Familiarity with EAI tools such as MuleSoft/SnapLogic or Apache NiFi. Experience with infrastructure-as-code tools such as Terraform and Ansible. Experience with version control (e.g. Git, SVN) and CI/CD workflows for deployment. Experience scraping external …
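A small hedged example of the bespoke-integration work described (endpoint, credentials, and table are hypothetical): pull records from a REST API and upsert them into Postgres so reruns stay idempotent.

```python
# Illustrative REST-to-Postgres sync; none of the names come from the ad.
import psycopg2
import requests


def sync_customers(api_base: str, dsn: str) -> None:
    resp = requests.get(f"{api_base}/customers", timeout=30)
    resp.raise_for_status()
    with psycopg2.connect(dsn) as conn:  # commits the transaction on clean exit
        with conn.cursor() as cur:
            for c in resp.json():
                # ON CONFLICT makes the sync idempotent across reruns
                cur.execute(
                    """
                    INSERT INTO customers (id, name, email)
                    VALUES (%s, %s, %s)
                    ON CONFLICT (id) DO UPDATE
                    SET name = EXCLUDED.name, email = EXCLUDED.email
                    """,
                    (c["id"], c["name"], c["email"]),
                )
```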
…with MLOps practices and model deployment pipelines. Proficient in cloud AI services (AWS SageMaker/Bedrock). Deep understanding of distributed systems and microservices architecture. Expert in data pipeline platforms (Apache Kafka, Airflow, Spark). Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases. Strong containerization and orchestration skills (Docker, Kubernetes). Experience with infrastructure as code (Terraform, CloudFormation …
…using cloud-based architectures and tools. Experience delivering data engineering solutions on cloud platforms, preferably Oracle OCI, AWS, or Azure. Proficient in Python and workflow orchestration tools such as Airflow or Prefect. Expert in data modeling, ETL, and SQL. Experience with real-time analytics from telemetry and event-based streaming (e.g., Kafka). Experience managing operational data stores with high … availability, performance, and scalability. Expertise in data lakes, lakehouses, Apache Iceberg, and data mesh architectures. Proven ability to build, deliver, and support modern data platforms at scale. Strong knowledge of data governance, data quality, and data cataloguing. Experience with modern database technologies, including Iceberg, NoSQL, and vector databases. Embraces innovation and works closely with scientists and partners to explore …
…ideal candidate will have good software development experience with Python coupled with strong SQL skills. In addition, you will also have a strong desire to work with Docker, Kubernetes, Airflow and AWS data technologies such as Athena, Redshift, EMR and various other tools in the AWS ecosystem. You would be joining a team of 25+ engineers across mobile … skills. Familiarity with continuous integration, unit testing tools and related practices. Understanding of the Agile Scrum software development lifecycle. What you'll be doing: Implementing and maintaining ETL pipelines using Airflow & AWS technologies. Contributing to data-driven tools owned by the data engineering team, including content personalisation. Taking responsibility for the ingestion framework and processes. Helping monitor and look after our data …
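For a flavour of the AWS side (region, output location, and SQL are assumptions), a sketch of the kind of Athena call an Airflow task might wrap:

```python
# Hedged sketch: run an Athena query from a pipeline task and wait for it.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")  # region assumed


def run_athena_query(sql: str, output_location: str) -> str:
    """Submit a query, poll until it finishes, and return its execution id."""
    qid = athena.start_query_execution(
        QueryString=sql,
        ResultConfiguration={"OutputLocation": output_location},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")
    return qid
```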
…platforms, with demonstrated ability to solve complex distributed systems problems independently. Experience building infrastructure for large-scale data processing pipelines (both batch and streaming) using tools like Spark, Kafka, Apache Flink, and Apache Beam, and with proprietary solutions like Nebius. Experience designing and implementing large-scale data storage systems (feature stores, time-series DBs) for ML use cases, with strong … versioning, point-in-time correctness, and evolving schemas. Strong distributed systems and infrastructure skills: comfortable scaling and debugging Kubernetes services, writing Terraform, and working with orchestration tools like Flyte, Airflow, or Temporal. Experience with cloud platforms (AWS, GCP, Azure) and container technologies. Strong software engineering skills with the ability to write easy-to-extend and well-tested code. Excellent communication …
…a petabyte-scale Data Lake and create secure, efficient, and scalable environments for our data platforms. Leveraging cloud-native technologies and AWS tools such as AWS S3, EKS, Glue, Airflow, Trino, and Parquet, you will prepare to adopt Apache Iceberg for greater performance and flexibility. You'll address high-performance data workloads, ensuring seamless execution of massive queries …
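To illustrate the storage layout in question (bucket, schema, and partitioning are invented), a short pyarrow sketch that writes a date-partitioned Parquet dataset to S3, the kind of layout Trino queries today and an Iceberg table format would later formalise.

```python
# Sketch only: partitioned Parquet on S3. pyarrow resolves s3:// URIs
# via its built-in S3 filesystem; credentials come from the environment.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "event_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "user_id": [1, 2, 3],
    "bytes_sent": [1024, 2048, 512],
})

pq.write_to_dataset(
    table,
    root_path="s3://datalake/events/",  # hypothetical bucket
    partition_cols=["event_date"],      # one directory per day for pruning
)
```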
…office). Due to the nature of some of the company's clients, you must have a minimum of 5 years' continuous UK residency. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Are you passionate about building scalable data solutions that drive real … Excellent experience of DBT, SQL and Python. Good customer-facing/pitching experience and being a self-sufficient person. A proactive mindset with excellent problem-solving skills. Experience of Airflow and medallion architecture is desirable. A degree in computer science or a related field is beneficial. Benefits: Company bonus scheme (based on annual profit made by the company). Pension … with data? Apply now and become part of a business where data drives every decision. Please send your CV to peter.hutchins@circlerecruitment.com. Data Engineer, Data Platform, Azure, Google Cloud, Airflow, DBT, Medallion, SQL, Python, Data Engineering, Dashboards, Data implementation, CI/CD, Data Pipelines, GCP, Cloud, Data Analytics. Circle Recruitment is acting as an Employment Agency in relation to …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
…support experimentation and deployment. 🛠️ Key Responsibilities: Build and maintain high-performance data pipelines to power AI/ML use cases. Architect cloud-native data platforms using tools like Databricks, Airflow, Snowflake, and Spark. Collaborate with AI/ML teams to align data processing with model requirements. Develop ETL/ELT workflows to support feature engineering, model training, and inference … Java. Experience supporting AI/ML workflows and working with Data Scientists. Exposure to cloud platforms: AWS, Azure, or GCP. Hands-on with modern data tooling: Spark, Databricks, Snowflake, Airflow. Solid grasp of data modelling, orchestration, and infrastructure-as-code (Terraform, Docker, CI/CD). Excellent communication and client-facing skills; comfortable leading on technical delivery. 🎁 What's on …
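As a hedged sketch of the feature-engineering workflows mentioned (all column names invented), a small pandas step that turns raw events into per-user, model-ready features:

```python
# Illustrative ELT step: aggregate raw events into model features.
import pandas as pd


def build_features(events: pd.DataFrame) -> pd.DataFrame:
    """events is assumed to have user_id, session_id, amount, ts columns."""
    df = events.assign(ts=pd.to_datetime(events["ts"], utc=True))
    features = (
        df.groupby("user_id")
        .agg(
            sessions=("session_id", "nunique"),  # distinct sessions per user
            total_spend=("amount", "sum"),
            last_seen=("ts", "max"),
        )
        .reset_index()
    )
    # Recency feature, usable at both training and inference time
    features["days_since_seen"] = (
        pd.Timestamp.now(tz="UTC") - features["last_seen"]
    ).dt.days
    return features
```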