London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
…tuning and optimisation. Solid understanding of data warehousing principles and data modelling practice. Excellent knowledge of the creation and maintenance of data pipelines - ETL tools (e.g. Apache Airflow) and stream-processing tools (e.g. Kinesis). Strong problem-solving and analytical skills, with the ability to troubleshoot and resolve complex data-related issues…
…and ML scientists to plan the architecture for end-to-end machine learning workflows. Implement scalable training and deployment pipelines using tools such as Apache Airflow and Kubernetes. Perform comprehensive testing to ensure the reliability and accuracy of deployed models. Develop instrumentation and automated alerts to manage system health and…
Stroud, England, United Kingdom Hybrid / WFH Options
Data Engineer
…excellence and be a person who actively looks for continual improvement opportunities. Knowledge and skills: experience as a Data Engineer or Analyst; Databricks/Apache Spark; SQL/Python; Bitbucket/GitHub. Advantageous: dbt; AWS; Azure DevOps; Terraform; Atlassian (Jira, Confluence). About Us: What's in it for you…
…experience deploying ML models into production environments, including both batch and real-time/streaming contexts. Proficiency working with distributed computing frameworks such as Apache Spark, Dask, or similar. Experience with cloud-native ML deployment, particularly on AWS, using services like ECS, EKS, Fargate, Lambda, S3, and more. Familiarity…
…AWS, or Azure. Experience with CI/CD pipelines for machine learning (e.g., Vertex AI). Experience with data processing frameworks and tools, particularly Apache Beam/Dataflow, is highly desirable. Knowledge of monitoring and maintaining models in production. Proficiency in employing containerization tools, including Docker, to streamline the…
…Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). • Proficiency in data integration tools and technologies. • Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) is a…
…knowledge of protocols such as HTTP, SSL and IPv4/v6. Working knowledge of Java, web containers (e.g. Tomcat, JBoss) and web servers (e.g. Apache, IIS). Experience of virtualisation in an enterprise environment. Cloud-based platform services - AWS, Azure, Google Cloud Platform. Preferred: Networking infrastructure - Proxies, Load balancers…
…PySpark, Delta Lake, Bash (both CLI usage and scripting), Git, Markdown, Scala (DESIRABLE), Azure SQL Server as a HIVE Metastore (DESIRABLE). TECHNOLOGIES: Azure Databricks, Apache Spark, Delta Tables, data processing with Python, PowerBI (Integration/Data Ingestion), JIRA. If you meet the above requirements, please apply for the vacancy…
Romsey, Hampshire, South East, United Kingdom Hybrid / WFH Options
Robert Half
…C, or PHP. Cloud: Hands-on experience with AWS or similar cloud platforms. Containers: Strong understanding of Docker & Kubernetes. LAMP Stack: Experience managing Linux, Apache, MySQL, PHP environments. Security & Monitoring: Deep knowledge of server security best practices. APPLY NOW! Don't miss this chance to join an innovative SaaS…
…Lake - Bash (both CLI usage and scripting) - Git - Markdown - Scala (bonus, not compulsory) - Azure SQL Server as a HIVE Metastore (bonus). Technologies: - Azure Databricks - Apache Spark - Delta Tables - Data processing with Python - PowerBI (Integration/Data Ingestion) - JIRA. Due to the nature and urgency of this post, candidates holding…
…Messaging and Streaming services such as Kinesis Data Streams, Simple Queue Service (SQS), Simple Notification Service (SNS), Amazon MQ, and Amazon Managed Service for Apache Flink (MSF). We are looking for a technical expert who brings a mix of operations and networking expertise and shares our passion to…
…On, Password Management, and Passwordless Authentication (FIDO2) solutions. Exposure to supporting Web Access Management solutions, such as Ping Access or CA SiteMinder. Experience with Apache and IIS solutions. Understanding of the OSI model. Knowledge of the Software Development Life Cycle. Familiarity and understanding of high-availability environments. Skills: Analytical…
…Data Engineering Lead. This role is fully remote and pays up to £85,000 base, plus bonus and benefits. Experience required: Apache Spark, Kafka, and Airflow for data ingestion and orchestration. Designing and deploying solutions on Azure Kubernetes Service (AKS). Knowledge of the Medallion Architecture…
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Peaple Talent
…Azure or AWS. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using big data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with senior stakeholders. Nice to haves: Azure/AWS data engineering certifications; Databricks certifications. What's…
London, South East England, United Kingdom Hybrid / WFH Options
Peaple Talent
…Azure or AWS. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using big data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with senior stakeholders. Nice to haves: Azure/AWS data engineering certifications; Databricks certifications. What's…
…backend & frontend. Familiarity with computer vision libraries and frameworks such as OpenCV, TensorFlow, and PyTorch. Knowledge of databases (e.g., MySQL, MongoDB), web servers (e.g., Apache), and UI/UX design principles. Relevant BSc/MSc degree, e.g. Computer Science, ML, Computer Vision or a related tech subject. This is…
…Languages: Python, Bash, Go. Network Modelling: YANG. API Protocols: RESTCONF, NETCONF. Platforms: ServiceNow, GitHub, Azure, AWS. Data: XML, JSON. Other: Azure DevOps, Git, Linux, Apache, MySQL. Ideal Candidate: Strong experience in automation and systems integration. Proficient in Python and automation using Ansible. Familiarity with ServiceNow, GitHub workflows, and network…
…or similar role. Proficiency with Databricks and its ecosystem. Strong programming skills in Python, R, or Scala. Experience with big data technologies such as Apache Spark and Databricks. Knowledge of SQL and experience with relational databases. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud). Strong analytical and problem…
…applications. Experience with NoSQL and in-memory databases (MongoDB, CouchDB, Redis, or others). Experience with analytical databases and processing data at large scale (ClickHouse, Apache Druid, or others). Experience with analyzing and tuning database queries. Experience with Event-Driven Architecture. Can't find the position you're looking for…
PHP Developer – Insurtech | LAMP Stack | OOPHP | Remote UK 🚀 Salary: £45,000 – £55,000. Location: Fully Remote (UK only). Tech Stack: PHP (OOPHP), MySQL, Apache, Linux, JavaScript, HTML/CSS. The Company: We're working with a fast-growing Insurtech company that's reshaping the insurance experience through smarter tech…
…experience in data engineering, with a strong understanding of modern data technologies (e.g., cloud platforms like AWS, Azure, GCP, and data tools such as Apache Spark, Kafka, dbt, etc.). Proven track record of leading and managing data engineering teams in a consultancy or similar environment. Strong expertise in…
London, South East England, United Kingdom Hybrid / WFH Options
Solytics Partners
…and Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features, including multithreading and Executor Services, to ensure optimal performance. Apply functional programming paradigms in Java to write clean…