Cambourne, England, United Kingdom Hybrid / WFH Options
Remotestar
skills: Strong knowledge of Scala. Familiarity with distributed computing frameworks such as Spark, KStreams, and Kafka. Experience with Kafka and streaming frameworks. Understanding of monolithic vs. microservice architectures. Familiarity with the Apache ecosystem, including Hadoop modules (HDFS, YARN, HBase, Hive, Spark) and Apache NiFi. Experience with containerization and orchestration tools like Docker and Kubernetes. Knowledge of time-series or analytics …
Manchester, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
You're really awesome at: Object-oriented programming (Java) Data modeling using various database technologies ETL processes (transferring data in-memory, moving away from traditional ETLs) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerized applications Using distributed version control systems Being an excellent team player Meticulous and passionate More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo SpA
exciting and critical challenges to the UK's digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance …
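As a rough illustration of the ingestion work described in the listing above, the sketch below bulk-indexes a couple of documents into Elasticsearch using the official Python client (8.x API). The host, credentials, index name, and sample records are placeholder assumptions, not details taken from the advert.

```python
# Minimal sketch: bulk-index documents into Elasticsearch with the official
# Python client (8.x). Host, credentials, and index name are placeholders.
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))

def as_actions(records, index_name):
    # Wrap each record in the action format expected by helpers.bulk.
    for record in records:
        yield {"_index": index_name, "_source": record}

records = [
    {"event": "login", "user": "alice"},
    {"event": "logout", "user": "bob"},
]

indexed, _ = helpers.bulk(es, as_actions(records, "pipeline-demo"))
print(f"indexed {indexed} documents")
```

A production pipeline of the kind the advert describes would normally sit behind Logstash or NiFi rather than a hand-rolled script, but the bulk API shown here is broadly the same ingestion primitive those tools drive.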
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance …
Cambridge, England, United Kingdom Hybrid / WFH Options
RemoteStar
Computing like Spark/KStreams/Kafka. Connect and Streaming Frameworks such as Kafka. Knowledge of monolithic versus microservice architecture concepts for building large-scale applications. Familiar with the Apache suite, including Hadoop modules such as HDFS, YARN, HBase, Hive, and Spark, as well as Apache NiFi. Familiar with containerization and orchestration technologies such as Docker and Kubernetes. Familiar with …
using data manipulation and machine learning libraries in one or more programming languages. Keen interest in some of the following areas: Big Data Analytics (e.g. Google BigQuery/BigTable, Apache Spark), Parallel Computing (e.g. Apache Spark, Kubernetes, Databricks), Cloud Engineering (AWS, GCP, Azure), Spatial Query Optimisation, Data Storytelling with (Jupyter) Notebooks, Graph Computing, Microservices Architectures, Modelling & Statistical Analysis …
Gloucester, England, United Kingdom Hybrid / WFH Options
Searchability NS&D
hybrid working when possible. Must hold active Enhanced DV Clearance (West). Competitive salary DOE: 6% bonus, 25 days holiday, clearance bonus. Experience in data pipelines, ETL processing, data integration, Apache, SQL/NoSQL. Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe … hearing from you. KEY SKILLS: DATA ENGINEER/DATA ENGINEERING/DEFENCE/NATIONAL SECURITY/DATA STRATEGY/DATA PIPELINES/DATA GOVERNANCE/SQL/NOSQL/APACHE/NIFI/KAFKA/ETL/GLOUCESTER/DV/SECURITY CLEARED/DV CLEARANCE. Seniority level: Not Applicable. Employment type: Full-time. …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Newcastle Building Society
candidates with experience from other cloud providers like AWS or GCP. Strong technical experience with ETL techniques, programming languages and data engineering tools such as SQL, Python, R and Apache is fundamental for the role, as is good knowledge of data integration and data quality management best practices. You’ll be an experienced agile practitioner, with an understanding of methodologies …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Newcastle Building Society
candidates with experience from other cloud providers like AWS or GCP. Strong technical experience with ETL techniques, programming languages and data engineering tools such as SQL, Python, R and Apache is fundamental for the role, as is good knowledge of data integration and data quality management best practices. You’ll have strong communication skills and the ability to effectively …
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
modern data lake/lakehouse architectures. Strong grasp of cloud data platforms (AWS, Azure, GCP, Snowflake). Understanding of Data Mesh, Data Fabric, and data product-centric approaches. Familiarity with Apache Spark, Python, and ETL/ELT pipelines. Strong knowledge of data governance, lifecycle management, and compliance (e.g. GDPR). Consulting experience delivering custom data solutions across sectors. Excellent leadership, communication …
City of London, London, United Kingdom Hybrid / WFH Options
Signify Technology
bring: Proven experience managing large-scale data infrastructure or platform teams, ideally in a consumer tech or marketplace environment. Deep understanding of distributed systems and modern data tools (e.g., Apache Spark, Kafka, DBT, Databricks). Experience with both batch and real-time data processing architectures. Strong programming background in Python, Scala, or Java. Familiarity with cloud platforms such as …
London, South East, England, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience; commercial experience in Generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working (2 days …
Central London, London, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience; commercial experience in Generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working (2 days …
City of London, London, Tottenham Court Road, United Kingdom Hybrid / WFH Options
Cathcart Technology
with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience; commercial experience in Generative AI and recommender systems; strong Python and SQL experience; Spark/Apache Airflow; LLM experience; MLOps experience; AWS. Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working (2 days …
London, England, United Kingdom Hybrid / WFH Options
TripAdvisor LLC
are constantly looking for components to adopt in order to enhance our platform. What you’ll do: Develop across our evolving technology stack - we’re using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflows, Seldon, MLflow and more. We are migrating to the AWS cloud and adopting many of the services available in that environment. You will have the …
London, England, United Kingdom Hybrid / WFH Options
FDM Group
large-scale, high-performance services using Kubernetes, Kafka, Spring Boot, .NET, Node.js, React JS, serverless functions, and event-driven architecture. Create real-time streaming and batch data pipelines with Apache Spark, Kafka, Lambda, Step Functions, and Snowflake. Develop infrastructure with Kubernetes, Lambda, Terraform, Cloud Custodian, and AWS Transit Gateway. Implement connectivity solutions using Cisco, F5, and Direct Connect. About …
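As a rough illustration of the real-time pipelines mentioned in the listing above, the sketch below reads a Kafka topic with Spark Structured Streaming and writes the stream to the console. The broker address, topic name, and sink are placeholder assumptions; a production job would need the spark-sql-kafka connector on the classpath and would target a real sink such as Snowflake or S3.

```python
# Minimal sketch: consume a Kafka topic with Spark Structured Streaming.
# Requires the spark-sql-kafka-0-10 package; broker and topic are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    # Kafka delivers keys and values as bytes; cast to strings for downstream use.
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream
    .format("console")        # stand-in sink for the sketch
    .outputMode("append")
    .start()
)
query.awaitTermination()
```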
Jenkins, TeamCity. Scripting languages such as PowerShell, bash. Observability/monitoring: Prometheus, Grafana, Splunk. Containerisation tools such as Docker, K8s, OpenShift, EC, containers. Hosting technologies such as IIS, nginx, Apache, App Service, LightSail. Analytical and creative approach to problem solving. We encourage you to apply, even if you don't meet all of the requirements. We value your growth …
Manchester, England, United Kingdom Hybrid / WFH Options
Mirai Talent
Microsoft Azure and application workload management. Proficiency in object-oriented development techniques. Understanding of modern data processing architectures such as the Data Lakehouse. Experience in data engineering technologies such as Apache Spark, Databricks, Python, and Scala. Experience working collaboratively as part of an Agile development squad. Degree in Computer Science or related subject. Bonus Points: Experience in the analytics industry. …
London, England, United Kingdom Hybrid / WFH Options
FIND | Creating Futures
Engineering (open to professionals from various data engineering backgrounds: data pipelines, ML engineering, data warehousing, analytics engineering, big data, cloud, etc.). Technical Exposure: experience with tools like SQL, Python, Apache Spark, Kafka, cloud platforms (AWS/GCP/Azure), and modern data stack technologies. Formal or Informal Coaching Experience: any previous coaching, mentoring, or training experience, formal or informal …
London, England, United Kingdom Hybrid / WFH Options
Derisk360
in Neo4j such as fraud detection, knowledge graphs, and network analysis. Optimize graph database performance, ensure query scalability, and maintain system efficiency. Manage ingestion of large-scale datasets into GCP environments using Apache Beam, Spark, or Kafka. Implement metadata management, security, and data governance using Data Catalog and IAM. Collaborate with cross-functional teams and clients across diverse EMEA …
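As a rough illustration of the graph ingestion described in the listing above, the sketch below loads a small batch of transaction rows into Neo4j with the official Python driver. The connection details and the Account/SENT data model are placeholder assumptions, not details from the advert; at scale this write would typically be fed from a Beam, Spark, or Kafka pipeline rather than an in-process list.

```python
# Minimal sketch: idempotent batch load into Neo4j via the official Python
# driver (5.x). URI, credentials, and the data model are placeholders.
from neo4j import GraphDatabase

URI = "neo4j://localhost:7687"
AUTH = ("neo4j", "password")

def load_transactions(tx, rows):
    # MERGE keeps the load idempotent if the batch is replayed.
    tx.run(
        "UNWIND $rows AS row "
        "MERGE (a:Account {id: row.src}) "
        "MERGE (b:Account {id: row.dst}) "
        "MERGE (a)-[:SENT {amount: row.amount}]->(b)",
        rows=rows,
    )

rows = [{"src": "acc-1", "dst": "acc-2", "amount": 120.0}]

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        session.execute_write(load_transactions, rows)
```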
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Data Intellect Limited
Python, SQL, and/or Scala. Knowledge of two or more common cloud ecosystems (Azure, AWS, GCP), with expertise in at least one. Deep experience with distributed computing using Apache Spark. Working knowledge of CI/CD for production deployments. Working knowledge of MLOps. Familiarity with designing and deploying performant end-to-end data architectures. Experience with technical project …
London, England, United Kingdom Hybrid / WFH Options
Arreoblue
or more of the following technologies: Databricks, Dedicated SQL Pools, Synapse Analytics, Data Factory. To set yourself up for success, you should have in-depth knowledge of Apache Spark, SQL, and Python, along with solid development practices. Additionally, you will be required to have in-depth knowledge of supporting Azure platforms such as Data Lake, Key Vault …
future-proofing of the data pipelines. ETL and Automation Excellence: Lead the development of specialized ETL workflows, ensuring they are fully automated and optimized for performance using tools like Apache Airflow, Snowflake, and other cloud-based technologies. Drive improvements across all stages of the ETL cycle, including data extraction, transformation, and loading. Infrastructure & Pipeline Enhancement: Spearhead the upgrading of …
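As a rough illustration of the fully automated ETL workflows described in the listing above, the sketch below wires a minimal extract-transform-load DAG in Apache Airflow (2.4+ API). The task bodies, schedule, and DAG id are placeholder assumptions; a real pipeline would replace the print stubs with source extracts and Snowflake (or other warehouse) loads.

```python
# Minimal sketch: a daily three-step ETL DAG in Apache Airflow 2.4+.
# Task bodies are stubs; the DAG id and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from the source system")

def transform():
    print("clean and reshape the extracted records")

def load():
    print("write the transformed records to the warehouse, e.g. Snowflake")

with DAG(
    dag_id="etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```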