data pipelines to serve the easyJet analyst and data science community. Highly competent, hands-on experience with relevant data engineering technologies, such as Databricks, Spark, the Spark API, Python, SQL Server, Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning … development experience with Terraform or CloudFormation. Understanding of the ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the … data privacy, handling of sensitive data (e.g. GDPR). Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam. Understanding of the challenges faced in the design and development of a streaming data pipeline and the different options …
Databricks Must Have: Hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in big data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub/Sub/Kinesis … years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 hyperscalers]: GCP, AWS, Azure, big data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF …
communication skills and a demonstrated ability to engage with business stakeholders and product teams. Experience in data modeling, data warehousing (e.g. Snowflake, AWS Glue, EMR, Apache Spark), and working with data pipelines. Leadership experience, whether technical mentorship, team leadership, or managing critical projects. Familiarity with Infrastructure as Code …
Skills: 5+ years' experience with Python programming for data engineering tasks. Strong proficiency in SQL and database management. Hands-on experience with Databricks and Apache Spark. Familiarity with the Azure cloud platform and related services. Knowledge of data security best practices and compliance standards. Excellent problem-solving and communication …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Allianz Popular SL
like pypdf, Camelot and OCR-based services like Azure Document Intelligence would be advantageous. Familiarity with the Azure cloud platform and distributed computing frameworks (e.g. Apache Spark) is a plus. Experience in building and working with knowledge graphs, such as Neo4j or GraphDB, would be desirable. What We Will …
data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as they relate to data platforms. Experience in total cost …
Guildford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
software practices (SCRUM/Agile, microservices, containerisation like Docker/Kubernetes). We'd also encourage you to apply if you possess: Experience with Spark/Databricks. Experience deploying ML models via APIs (e.g. Flask, Keras). Startup experience or familiarity with geospatial and financial data. The Interview Process …
CD and containerised environments. Skilled at working with both structured and unstructured data to unlock insights and power models. Hands-on experience with Databricks, Apache Spark, or similar tools used in large-scale data processing. Exposure to machine learning model deployment using APIs or lightweight serving frameworks like …
Guildford, England, United Kingdom Hybrid / WFH Options
Allianz
with a focus on delivering strategic value. Proficiency in cloud technologies (AWS, Azure) and a comprehensive understanding of data technologies, including big data platforms (Spark, Kafka), relational/non-relational databases (Postgres, MongoDB, Cassandra), open-source languages (Java, Python, Scala), and adoption of LLMs in the solution. Experience in mentoring …
Guildford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
exchange connectivity. Scripting abilities in Python, Bash, or similar languages. Knowledge of monitoring tools and alerting frameworks. Exposure to data technologies such as Kafka, Spark or Delta Lake is useful but not mandatory. Bachelor's degree in Computer Science, Engineering, or a related technical field. This role offers competitive compensation …
track record in networks, telecom, or customer experience domains (preferred). Proficiency in cloud platforms like GCP, AWS, or Azure, plus tools like Kafka, Spark, Snowflake, Databricks. Skilled collaborator with excellent stakeholder and vendor management capabilities. Confident communicator with the ability to bridge technical and business audiences. …
Guildford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
secure or regulated environments. Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana). Build and maintain robust data flows with Apache NiFi. Implement best practices for handling sensitive data, including encryption, anonymisation, and access control. Monitor and troubleshoot real-time data pipelines to ensure high … experience as a Data Engineer in secure, regulated, or mission-critical environments. Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana). Solid experience with Apache NiFi. Strong understanding of data security, governance, and compliance requirements. Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in secure deployments. Experience … with a strong focus on data accuracy, quality, and reliability. Desirable (Nice to Have): Background in defence, government, or highly regulated sectors. Familiarity with Apache Kafka, Spark, or Hadoop. Experience with Docker and Kubernetes. Use of monitoring/alerting tools such as Prometheus, Grafana, or the ELK stack. Understanding of …
Guildford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
career progression opportunities across the Group, which includes several high-profile household names. What you'll bring: Experience with cloud and big data technologies (e.g. Spark/Databricks/Delta Lake/BigQuery). Familiarity with eventing technologies (e.g. Event Hubs/Kafka) and file formats such as Parquet/…
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks … analytics platforms, e.g. relevant AWS and Azure platform services; data tools: hands-on experience with Palantir ESSENTIAL; data science approaches and tooling, e.g. Hadoop, Spark; data engineering approaches; database management, e.g. MySQL, Postgres; software development methods and techniques, e.g. Agile methods such as SCRUM; software change management, notably familiarity …