on experience with CI/CD pipelines, using tools such as GitLab CI, Jenkins, or ArgoCD. * Knowledge of containerization and orchestration, including Docker and Kubernetes. * Strong scripting skills in Python, Bash, or PowerShell for automation. * Understanding of AWS networking concepts, including VPCs, subnets, and security groups. * Experience with monitoring and logging solutions, such as Prometheus, Grafana, ELK Stack, or AWS CloudWatch.
releases. Support live systems and troubleshoot issues. Skills Needed: AWS (Azure/GCP a plus); Git, GitLab CI/CD, Terraform, Ansible; Docker, Kubernetes; security tools (SonarQube, vulnerability management); Python and Linux scripting; monitoring tools (Prometheus, Grafana). Defence experience is a bonus. Must be proactive and collaborative. Reasonable Adjustments: Respect and equality are core values to us. We are proud
on experience with Databricks, PySpark, Delta Lake, MLflow. Experience with LLMs (Hugging Face, LangChain, Azure OpenAI). Strong MLOps, CI/CD, and model monitoring experience. Proficiency in Python, PyTorch/TensorFlow, FastAPI/Flask. Cloud architecture experience: Azure preferred, AWS/GCP acceptable. Skilled in Docker, Kubernetes, Helm, Terraform, and IaC for deploying ML and web apps.
Manchester, England, United Kingdom Hybrid/Remote Options
Naimuri
dives into data and presents the results of analysis and modelling using tools like Jupyter Notebooks. Has experience designing and developing data ingestion and transformation pipelines in languages like Python, potentially using cloud solutions in AWS, Azure, or GCP. Is familiar with the full lifecycle of ML/AI models, including collating training data, design, training, evaluation, and deploying automated … Nice to haves: Experience with any of the following specialisms: Data Synthesis, Test and Evaluation, AI Assurance, Knowledge Graphs and Ontologies, Data Governance and Compliance, or Deepfake Detection. Creating Python-based applications and/or APIs. A degree in a field like data science, physics, computational science, mathematics, or statistics (though we value demonstrable experience just as much!). Location
the following: ESRI technologies (ESRI ArcGIS API for JavaScript/REST, Web AppBuilder, ArcGIS Runtime SDK for .NET). Open-source technology (OpenLayers, GeoServer, PostgreSQL). Experience in using Python to develop geoprocessing solutions and using FME/ERDAS IMAGINE. Excellent skills in Terraform and Ansible. Experience working on CI/CD pipeline setup using Jenkins with continuous testing and
technologies and best practices. Required Skills & Experience: Strong experience in automation engineering for infrastructure and container platforms. Hands-on experience with OpenShift and Kubernetes environments. Proficiency in scripting languages (Python, Bash) and configuration management tools (Ansible, Puppet). Familiarity with Infrastructure-as-Code tools (Terraform, CloudFormation). Knowledge of CI/CD pipelines and DevOps practices. Understanding of Linux systems
Corsham, Wiltshire, United Kingdom Hybrid/Remote Options
CBSbutler Holdings Limited trading as CBSbutler
Skills & Experience Solid experience with Infrastructure as Code frameworks. Strong Linux, networking, and cloud architecture expertise. Hands-on experience supporting applications in PHP and JavaScript environments. Strong scripting skills (Python, Bash, etc.). Comfort working in high-security, high-accountability settings. Ability to automate repetitive tasks out of pure principle. Desirable Skills & Qualifications The following experience is beneficial: Previous work
Sheffield, South Yorkshire, Orchard Square, United Kingdom
CBSbutler Holdings Limited trading as CBSbutler
technologies and best practices. Required Skills & Experience: Strong experience in automation engineering for infrastructure and container platforms. Hands-on experience with OpenShift and Kubernetes environments. Proficiency in scripting languages (Python, Bash) and configuration management tools (Ansible, Puppet). Familiarity with Infrastructure-as-Code tools (Terraform, CloudFormation). Knowledge of CI/CD pipelines and DevOps practices. Understanding of Linux systems
Burton-on-Trent, Staffordshire, England, United Kingdom
Crimson
Skills & Experience Expert in Azure Databricks (Unity Catalog, DLT, cluster management). Strong experience with Azure Data Factory, Synapse Analytics, Data Lake Storage, Stream Analytics, Event Hubs. Proficient in Python, Scala, C#, .NET, and SQL (T-SQL). Skilled in data modelling, quality, and metadata management. Experience with CI/CD and Infrastructure as Code using Azure DevOps and Terraform.
City of London, London, United Kingdom Hybrid/Remote Options
Alexander Edward James Consulting Limited
SaaS environment, balancing stability and innovation. Desirable: Previous experience in SaaS or high-growth technology companies. Knowledge of containerisation (e.g., Kubernetes, Docker). Scripting/programming skills (e.g., PowerShell, Python, Bash). Seniority Level: Mid-Senior level. Industry: Software Development; IT Services and IT Consulting; Business Consulting and Services. Employment Type: Full-time. Job Functions: Information Technology. Skills: Networking, Management
administration in cloud or virtualised environments. Basic understanding of networking concepts such as VPCs, subnets, routing, firewalls, and VPNs. Familiarity with CI/CD pipelines. Exposure to scripting languages (Python, Bash, PowerShell) for automation tasks. Understanding of monitoring and logging tools (CloudWatch, Azure Monitor, Stackdriver, or similar). Experience working collaboratively in Agile or DevOps environments (advantageous). Relevant cloud
Banbury, Oxfordshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Ensure adherence to governance standards. Line-manage and mentor a small team of Data Engineers. What We're Looking For Extensive Databricks experience, including Unity Catalog. Strong skills in Python, Spark, SQL and experience with SQL databases. Terraform experience for cloud infrastructure as code. Experience with Azure and workflow tools (Airflow, ADF). Excellent problem-solving ability, communication skills, and
and ensure minimal downtime for critical AI services. Required Skills: Strong hands-on experience with GCP services: Compute Engine, Kubernetes, Cloud Storage, BigQuery, Cloud Run. Proficient in scripting with Python or Bash. Deep understanding of Docker and Kubernetes for containerization and orchestration. Expertise in CI/CD tools: Google Cloud Build, Jenkins, GitHub Actions. Proven experience with Terraform and
Hook Norton, Oxfordshire, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Ensure adherence to governance standards. Line-manage and mentor a small team of Data Engineers. What We're Looking For Extensive Databricks experience, including Unity Catalog. Strong skills in Python, Spark, SQL and experience with SQL databases. Terraform experience for cloud infrastructure as code. Experience with Azure and workflow tools (Airflow, ADF). Excellent problem-solving ability, communication skills, and
as AWS CodePipeline, Jenkins, GitHub Actions, or GitLab CI. - Knowledge of monitoring and logging tools, including CloudWatch, ELK Stack, and Prometheus/Grafana. - Strong coding skills in languages like Python, Java, Node.js, or Go. - Ability to design scalable, secure, and resilient cloud-native applications. Additional Experience Required: - Ability to act on own authority to manage and provide direction to projects
ability to orchestrate and automate end-to-end CI/CD pipelines for software deployment, testing, and monitoring. 5. Scripting and Programming: Proficient in multiple scripting languages and automation tools (e.g., Python, Bash, Terraform, Ansible). Able to design, code, test, correct, and document programs and scripts. Experience reviewing specifications and defining test conditions and procedures. Has directed code reviews and promotes refactoring
on experience with CI/CD pipelines using tools such as GitLab CI, Jenkins, or ArgoCD. Knowledge of containerisation and orchestration, including Docker and Kubernetes. Strong scripting skills in Python, Bash, or PowerShell for automation. Understanding of AWS networking concepts, including VPCs, subnets, and security groups. Experience with monitoring and logging tools such as Prometheus, Grafana, ELK Stack, or CloudWatch.
London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
contributing to sprint planning, peer reviews and continuous improvement initiatives Essential Skills for the AWS Data Engineer: Extensive hands-on experience with AWS data services Strong programming skills in Python (including libraries such as PySpark or Pandas) Solid understanding of data modelling, warehousing and architecture design within cloud environments Experience building and managing ETL/ELT workflows and data pipelines
Azure Storage. Strong SQL skills and expertise in dimensional modelling (e.g., star/snowflake schemas). Familiarity with Power BI dataflows, DAX, and RLS setup. Hands-on experience with Python, PySpark, or T-SQL for data processing and automation. Understanding of CI/CD practices in data projects using Git and YAML pipelines. Solid knowledge of data security, governance, and
data generation, and prompt engineering. Mandatory Skills: Cloud Platforms: Deep experience with AWS (S3, Lambda, Glue, Redshift) and/or Azure (Data Lake, Synapse). Programming & Scripting: Proficiency in Python, SQL, PySpark, etc. ETL/ELT & Streaming: Expertise in technologies like Apache Airflow, Glue, Kafka, Informatica, EventBridge, etc. Industrial Data Integration: Familiarity with OT data schemas originating from OSIsoft PI
Experience Proven experience designing and implementing cloud-based solutions (Azure, AWS, or GCP). Strong grasp of architecture principles, automation tools, containerisation, and DevOps. Hands-on experience with Python, Java, or scripting languages. Deep understanding of security and compliance frameworks in complex environments. Ability to translate complex technical concepts into clear, actionable insights for stakeholders. Excellent problem-solving
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid/Remote Options
Accenture
You’ll work with client teams to deliver intelligent data products, leveraging modern cloud and AI technologies. Key Responsibilities Design and implement robust data pipelines and ML workflows using Python, SQL, Spark, and Databricks. Develop and deploy machine learning models (including NLP, deep learning, and agentic AI) in production environments. Integrate data from diverse sources, including streaming and batch ingestion
City Of Westminster, London, United Kingdom Hybrid/Remote Options
Additional Resources
Infrastructure Engineer, Cloud Data Engineer, DataOps Engineer, Data Pipeline Engineer, DevOps Engineer, or in a similar role. Proven experience with Azure cloud platforms and related architecture. Highly skilled in Python for data engineering, scripting, and automation. Strong working knowledge of Kubernetes, Docker, and cloud-native data ecosystems. Demonstrable experience with Infrastructure as Code tools (Terraform, Ansible). Hands-on experience