Containerizing applications with Docker and deploying them as microservices on serverless platforms like Google Cloud Run and orchestrators like GKE. Writing automation scripts and infrastructure-related code, primarily in Python. Ensuring the high availability, performance, and security of our production systems that support everything from data processing to model deployment. Collaborating closely with AI and backend engineers to streamline … our team to innovate at an unprecedented pace. 🛠️ Skills and Experience 5+ years of experience in a Cloud Infrastructure, DevOps, or Site Reliability Engineering (SRE) role. Strong proficiency with Python for scripting and automation. Extensive hands-on experience with Google Cloud Platform (GCP) and its core services (Cloud Run, GKE, IAM, Cloud Storage). Expertise in writing production-grade Infrastructure More ❯
Collaborating with other Engineers in the team, developing and implementing AI-driven software solutions built on a modern, cloud-native architecture. Developing high-quality, production-ready code primarily in Python. Making some contributions to our simulation engine, written in Rust. Helping define and develop the architecture for the team's deliverables. Engaging in code reviews & pair programming with other engineers … with cross-functional teams, including data scientists, project managers, and business stakeholders, to understand customer needs and translate them into technical requirements. What we are looking for: Proficiency in Python, and its use in building modern web applications using frameworks such as FastAPI. Familiarity with frontend technologies such as TypeScript or React. Knowledge of at least one IaC tool (Terraform More ❯
Stratford-upon-Avon, Warwickshire, United Kingdom Hybrid / WFH Options
CCL Solutions Group
fundamentals (Linux, Windows, Mac, TCP/IP stack). Knowledge of common attack techniques and mitigations (MITRE ATT&CK, OWASP Top 10). Familiarity with scripting and automation using Python, Bash, or PowerShell. Strong understanding of Active Directory attack chains and common privilege escalation paths. Experience interpreting logs and event outputs from OS and security appliances. Certifications: OSCP, OSEP, CRTO … or other advanced offensive security qualifications. Programming/scripting in Python, Ruby, Go, C#, or Java. Experience in red teaming, threat emulation, or purple teaming. Agile experience and knowledge of the common production frameworks is highly desired. Other Role Requirements: Must have been resident in the UK for a minimum of 5 years. Full UK driving licence. Ability to obtain More ❯
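The log-interpretation and scripting skills listed above can be illustrated with a short standard-library sketch. The sshd-style log lines, regex, and helper below are assumptions for illustration only, not something taken from the listing.

```python
import re
from collections import Counter

# Hypothetical sshd auth-log lines; real formats vary by distribution.
LOG_LINES = [
    "Jan 10 03:14:01 host sshd[1001]: Failed password for root from 203.0.113.7 port 4242 ssh2",
    "Jan 10 03:14:05 host sshd[1002]: Failed password for admin from 203.0.113.7 port 4243 ssh2",
    "Jan 10 03:15:12 host sshd[1003]: Accepted password for alice from 198.51.100.4 port 5000 ssh2",
]

FAILED = re.compile(r"Failed password for (\S+) from (\d+\.\d+\.\d+\.\d+)")

def failed_logins_by_ip(lines):
    """Count failed SSH logins per source IP -- a common first triage step."""
    counts = Counter()
    for line in lines:
        m = FAILED.search(line)
        if m:
            counts[m.group(2)] += 1
    return counts

print(failed_logins_by_ip(LOG_LINES))  # Counter({'203.0.113.7': 2})
```

The same pattern scales to real log files by iterating over `open(path)` instead of an in-memory list.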
communities Experience of mentoring less-experienced developers Significant hands-on experience with the Azure Data Stack, critically ADF and Synapse (experience with Microsoft Fabric is a plus) Highly developed Python and data pipeline development knowledge, must include substantial PySpark experience Demonstrable DevOps and DataOps experience with an understanding of best practices for engineering, test and ongoing service delivery An understanding … Medallion Architecture Experience building Semantic, Metric or Analytic models Experience of building Machine Learning models Any experience in MLOps or operationalising Machine Learning Knowledge of Data Quality Frameworks in Python Qualifications: Industry focused degree or equivalent working experience Azure certifications are desirable Developing Others; Working Proactively; Creativity and Innovation; Problem Solving and Judgement; Communication and Confidence About Us Life, Work More ❯
Continuous improvement: Stay up-to-date with the latest developments in AI and software engineering, and continuously improve our systems and processes. Is this you? Language fundamentals: Proficient in Python, with a strong understanding of data types, string manipulation, type casting and conversions. Functions: Experienced in calling functions, using default and variable arguments, and writing lambda functions. Virtual environments: Familiar … implementing logging best practices to maintain code clarity and traceability. Inheritance and method overriding: Experienced with object-oriented programming concepts, including inheritance and method overriding. Magic methods: Familiar with Python's magic methods to enhance the functionality of custom classes. Code formatting: Adheres to code formatting standards using tools like black, isort, flake8, and pylint. Testing: Strong background in testing More ❯
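The fundamentals listed above (default arguments, lambdas, inheritance with method overriding, and magic methods) fit in a few lines. The toy `Vector` class below is invented purely for illustration.

```python
class Vector:
    """Toy 2-D vector demonstrating magic methods and inheritance."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):            # magic method: enables v1 + v2
        return type(self)(self.x + other.x, self.y + other.y)

    def __repr__(self):                  # magic method: readable printing
        return f"{type(self).__name__}({self.x}, {self.y})"

class Point(Vector):
    def __repr__(self):                  # method overriding via inheritance
        return f"({self.x}, {self.y})"

def scale(v, factor=2):                  # default argument
    return type(v)(v.x * factor, v.y * factor)

print(Vector(1, 2) + Vector(3, 4))                    # Vector(4, 6)
print(list(map(lambda v: scale(v), [Point(1, 1)])))   # [(2, 2)]
```

Because `__add__` and `scale` construct results via `type(self)` / `type(v)`, subclasses like `Point` get correctly typed results without overriding either.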
and implementation, with experience in hands-on implementation with Data lake and Data Warehouse. - Experience in developing software code using one or more programming languages/frameworks such as Python, Spark, SQL, etc. - Experience leading large-scale full-cycle MPP enterprise Data Warehousing (EDW), Data Lake and Analytics projects - AWS experience preferred, with proficiency in a wide range of AWS … services (e.g., EC2, S3, RDS, Lambda, IAM, VPC, CloudFormation) - AWS Professional level certifications (e.g., Solutions Architect Associate, Specialty or Professional) preferred - Experience with automation and scripting (e.g., Terraform, Python) - Knowledge of security and compliance standards (e.g., HIPAA, GDPR) - Strong communication skills with the ability to explain technical concepts to both technical and non-technical audiences - Experience with migrating mission critical More ❯
rota. Requirements: Bachelor’s or Master’s in Computer Science, Engineering, or a related field. 2+ years of experience in data engineering, preferably within financial services. Proficiency in SQL, Python, and big data tools like Spark. Strong understanding of cloud platforms (AWS, GCP, or Azure). Excellent problem-solving skills and a collaborative mindset. More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
life, support and mentor others, and continuously develop your skills in a collaborative, hybrid working environment. About you Role Responsibilities: Design, build, and maintain scalable machine learning pipelines using Python and PySpark. Work within Databricks to develop, schedule, and monitor data workflows, utilising Databricks Asset Bundles. Collaborate with data analysts, engineers, and other scientists to deliver clean, reliable, and well … ensuring successful integration of outputs into business processes. Essential Skills and Experience 2-5 years of experience in data science or a closely related field. Strong programming skills in Python and PySpark. Solid data science modelling skills with a problem-solving mindset. Strong analytical and communication skills, with the ability to tailor complex insights for both technical and non-technical More ❯
solutions (ideally Azure) as part of a team. Experience in using an agile methodology to deliver infrastructure integration projects within a distributed team. Demonstrable scripting or programming skills (e.g., Python, Bash, Terraform, GitHub, etc.). Main duties of the job We are using modern cloud technology to build the Barts Health Data Platform and its 'Secure Data Environment' (SDE) containing … communicating these within a team. Experience in using an agile methodology to deliver projects within an organization that uses a structured design methodology. Demonstrable scripting or programming skills (e.g., Python, Bash, Terraform, GitHub, etc.) Ability to use continuous integration and distribution pipelines (DevOps) to deploy applications using Infrastructure as Code model and the use of containers in support of such a … and ISO/IEC 27001:2005 Knowledge Essential Highly developed and demonstrable enterprise architecture and cloud architecture framework knowledge across multiple deployments. Technical understanding of key scripting languages (Python, Bash, etc.), cloud networking, data storage, security. Aptitude to design technical/security solutions that will fit into long-term business objectives and strategy. Desirable Understanding of digital health innovations. Awareness More ❯
models, matrix factorization, deep learning, etc. Experience using statistical and machine learning models to contribute to company growth efforts, impacting revenue and other key business outcomes. Advanced understanding of Python and the machine learning ecosystem in Python (NumPy, pandas, scikit-learn, LightGBM, PyTorch). Knowledge of SQL and experience with relational databases. Agile, action-oriented. Nice to have: Experience working in More ❯
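The matrix-factorization technique named above can be sketched in plain NumPy with a few gradient-descent steps. The toy ratings matrix, latent rank, learning rate, and step count below are arbitrary assumptions for illustration, not anything from the listing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user-item ratings matrix (0 = unobserved) -- invented data.
R = np.array([[5.0, 3.0, 0.0],
              [4.0, 0.0, 1.0],
              [0.0, 1.0, 5.0]])
mask = R > 0

k, lr = 2, 0.02                         # latent rank and learning rate (assumptions)
U = rng.normal(scale=0.1, size=(3, k))  # user factors
V = rng.normal(scale=0.1, size=(3, k))  # item factors

def loss():
    """Squared reconstruction error over observed cells only."""
    return float(((R - U @ V.T)[mask] ** 2).sum())

start = loss()
for _ in range(500):                    # vanilla gradient descent
    E = (U @ V.T - R) * mask            # error restricted to observed cells
    U, V = U - lr * (E @ V), V - lr * (E.T @ U)
final = loss()                          # far below `start` after training
```

Production recommenders add regularization, biases, and mini-batching, but the masked-error update above is the core of the method.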
and Playwright or similar testing frameworks. REST APIs: Strong understanding of integrating and working with RESTful services. Data Skills: Experience in data wrangling/analysis (e.g., using SQL or Python, Jupyter Notebook). Collaboration: Experience working in an Agile environment (Scrum/Kanban). Problem-Solving: Strong analytical and troubleshooting skills. Desirable Skills Familiarity with state management libraries (MobX, Redux … . Exposure to financial data or market analytics projects. Experience with data engineering tools (DuckDB, PySpark, etc.). Knowledge of automated testing frameworks (Playwright, Cypress). Experience of WebAssembly. Python programming experience for data manipulation or API development. Use of AI for creating visualisations. Soft Skills: Exceptional leadership skills with the ability to lead a team of software developers effectively. More ❯
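A minimal example of the SQL-plus-Python data wrangling mentioned above, using only the standard library's sqlite3 module; the trades table and figures are invented for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
con.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("ABC", 100, 10.0), ("ABC", 50, 11.0), ("XYZ", 200, 5.0)],  # toy data
)

# Aggregate notional value per symbol in SQL, then post-process in Python.
rows = con.execute(
    "SELECT symbol, SUM(qty * price) AS notional "
    "FROM trades GROUP BY symbol ORDER BY notional DESC"
).fetchall()

notional = dict(rows)
print(notional)  # {'ABC': 1550.0, 'XYZ': 1000.0}
```

The same split (aggregation in SQL, reshaping in Python) carries over directly to DuckDB or a warehouse connection.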
CloudFormation, Terraform or AWS CDK. Configure and manage core AWS services including EC2, S3, IAM, RDS, Lambda, CloudFront, ECS/EKS, and CloudTrail. Develop automation scripts using AWS CLI, Python, Bash, or PowerShell to streamline infrastructure provisioning, deployment, and maintenance tasks. Implement automated CI/CD pipelines for cloud infrastructure deployments using tools like AWS CodePipeline, CodeBuild, or integration with … EC2, S3, IAM, RDS, Lambda, CloudFormation, CloudWatch, CloudTrail, ECS/EKS, Glue, etc. Experience with Infrastructure as Code tools (Terraform, CloudFormation, AWS CDK etc). Proficiency in scripting languages: Python, Bash, PowerShell, and using AWS CLI. Familiarity with CI/CD tools (e.g. AWS CodePipeline, Jenkins, GitLab CI). Solid understanding of AWS security best practices and compliance principles. Strong More ❯
and scaling) of new and existing systems. Experience in automating, deploying, and supporting large-scale infrastructure. Experience programming with at least one modern language such as C++, C#, Java, Python, Golang, PowerShell, Ruby. Experience with distributed systems at scale. Experience working in an Agile environment. Experience building services using AWS products. Bachelor's or Master's degree in Engineering or related … pace DevOps. Experience utilizing AWS cloud solutions in a DevOps environment. Experience working in an Agile environment using the Scrum methodology. Knowledge of and proficiency in the Python scripting language. Experience on highly concurrent, high-throughput systems and knowledge of complex distributed systems. Knowledge of AWS services and concepts. Experience of working and collaborating with people in different More ❯
process alignment and shared value creation. As a Data Engineer in the Commercial team, your key responsibilities are as follows: 1. Technical Proficiency: Collaborate in hands-on development using Python, PySpark, and other relevant technologies to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing … technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJs. Support the implementation of DevOps and CI/CD methodologies to foster agile collaboration and contribute to building robust data solutions. Develop code that adheres to high-quality standards More ❯
our bespoke Conversion Framework. Build new and maintain existing bespoke systems. Implement .NET-based microservices with strong observability and integration with data platforms. Develop custom ETL pipelines using AWS, Python, and MySQL. Implement governance, lineage, and monitoring to ensure high availability and traceability. AI & Advanced Analytics Integration: Collaborate with AI/ML teams to enable model training pipelines with robust … in software or data engineering, with strong recent experience in cloud data migrations. Proficient with MySQL, OOP, AWS (EC2, S3, Lambda). Experience with PHP, C#, .NET Core, React, Python, AWS EKS. Experience working on AI projects ideal! Strong knowledge of Git, CI/CD pipelines, and containerization (Docker/Kubernetes). Experience working in environments with AI/ML More ❯
and build on continued success on a global scale. Your key responsibilities Designing, building, deploying and/or managing AI solutions using industry leading approaches across technologies such as Python, Git, SQL and other relevant technologies Engaging with clients at all levels across the organisation both business & technology functions Nurturing long-term trusted advisor relationships Training and managing junior staff … Technology or related fields Relevant experience in delivery of AI design, build, deployment or management Proficiency or certification in Microsoft Office tools, as well as relevant technologies such as Python, TensorFlow, Jupyter Notebook, Spark, Azure Cloud, Git, Docker and/or any other relevant technologies Strong analytical and problem-solving skills, with the ability to work on complex projects and More ❯
Work closely with product and leadership to align tech with business goals; support sales and customer success. Requirements: 7+ years in software engineering and architecture. Strong experience with Azure, Python and infrastructure-as-code (e.g., Terraform). Proven leadership in scaling modern systems and teams. Familiarity with AI/ML pipelines. Excellent communication and stakeholder engagement skills. More ❯
Azure) 💡 Experienced in IaC tools (e.g., Terraform or similar) 💡 Experience deploying and scaling PHP/Laravel applications 💡 Familiar with Docker, Kubernetes, ECS or EKS 💡 Proficient in scripting languages like Python and Bash 💡 Understanding of SOC2 and DevSecOps Interested? Reach out to Billy @ Loop for more info More ❯
achieve in the cloud. BASIC QUALIFICATIONS - 3+ years of experience in software development or related field with proficiency in at least one modern programming language such as Java, TypeScript, Python, or Ruby - 3+ years of experience with Linux, using the command line and basic administration, and computer networking fundamentals - Able to troubleshoot at all levels, from network to operating systems … DHCP, TCP/IP, routing, load balancing, load shedding) and experience with monitoring frameworks (such as CloudWatch, Datadog, Grafana, Elastic or similar) - Experience scripting operating system tasks in Bash, Python, etc. and with Infrastructure as Code (such as CDK, CloudFormation, Puppet, Chef, Ansible, or similar) - Experience operating services in AWS. Amazon is an equal opportunities employer. We believe passionately that More ❯