Apache Jobs in England

301 to 325 of 866 Apache Jobs in England

Senior Data Engineer

West Bromwich, England, United Kingdom
Hybrid / WFH Options
Leonardo UK Ltd
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance …
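The anonymization requirement named in the listing above can be illustrated with a minimal, self-contained Python sketch (not taken from the posting): sensitive fields are replaced with salted HMAC tokens before a document is indexed, so records remain joinable without exposing identifiers. The `SALT` value and field names are hypothetical.

```python
import hashlib
import hmac

# Hypothetical salt; a real pipeline would load this from a managed secret store.
SALT = b"example-salt"

def pseudonymize(value: str) -> str:
    # Produce a stable, non-reversible token for an identifier.
    return hmac.new(SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def anonymize_record(record: dict, sensitive_fields=("email", "name")) -> dict:
    # Copy the record and replace sensitive fields before indexing.
    out = dict(record)
    for field in sensitive_fields:
        if field in out:
            out[field] = pseudonymize(out[field])
    return out

doc = {"email": "jane@example.com", "name": "Jane", "event": "login"}
safe = anonymize_record(doc)
```

Because the token is keyed and deterministic, the same input always maps to the same token, which keeps downstream joins working.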
Posted:

Senior Python Developer

City Of London, England, United Kingdom
developrec
engineers + external partners) across complex data and cloud engineering projects Designing and delivering distributed solutions on an AWS-centric stack, with open-source flexibility Working with Databricks, Apache Iceberg, and Kubernetes in a cloud-agnostic environment Guiding architecture and implementation of large-scale data pipelines for structured and unstructured data Steering direction on software stack, best practices, and … especially AWS), and orchestration technologies Proven delivery of big data solutions—not necessarily at FAANG scale, but managing high-volume, complex data (structured/unstructured) Experience working with Databricks, Apache Iceberg, or similar modern data platforms Experience of building software environments from the ground up, setting best practice and standards Experience leading and mentoring teams Worked in a startup …
Posted:

Solutions Architect (Data Analytics)- Presales, RFP creation

London, UK
Vallum Associates
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Desirable Skills: Designing Databricks-based solutions …
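Spark and Beam, both named in the listing above, share the same core processing model: a flat-map over input records followed by a keyed aggregation (shuffle and reduce). A plain-Python sketch of that model, purely illustrative and using no Spark or Beam API:

```python
from collections import defaultdict
from itertools import chain

def flat_map(fn, records):
    # Apply fn to each record and flatten the resulting iterables.
    return chain.from_iterable(fn(r) for r in records)

def reduce_by_key(pairs):
    # Sum values per key, the way a shuffle + reduce stage would.
    acc = defaultdict(int)
    for key, value in pairs:
        acc[key] += value
    return dict(acc)

lines = ["kafka to bigquery", "kafka to redshift"]
pairs = flat_map(lambda line: ((word, 1) for word in line.split()), lines)
counts = reduce_by_key(pairs)
```

In Spark this pair of steps is `flatMap` followed by `reduceByKey`; in Beam, a `ParDo` followed by `GroupByKey`/`Combine`.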
Posted:

Solutions Architect (Data Analytics)- Presales, RFP creation

City of London, London, United Kingdom
Vallum Associates
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Desirable Skills: Designing Databricks-based solutions …
Posted:

Data Engineer

London, England, United Kingdom
GSR
a focus on data quality and reliability. Infrastructure & Architecture Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and … ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical and non-technical teams. … Additional Strengths Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent company culture founded on Integrity, Innovation and …
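Stream processors such as Apache Flink, named in the listing above, typically aggregate events into fixed (tumbling) time windows. As a rough illustration of the idea, not Flink's actual API, here is a plain-Python sketch that buckets timestamped events into non-overlapping windows:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    # Assign each (timestamp_ms, key) event to a fixed-size,
    # non-overlapping window and count events per (window, key).
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_ms)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(1000, "click"), (1500, "click"), (2200, "view")]
result = tumbling_window_counts(events, 1000)
```

A real Flink job would also handle out-of-order events with watermarks; this sketch assumes events arrive in processing order.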
Posted:

Solutions Architect (Data Analytics)

Slough, England, United Kingdom
JR United Kingdom
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Designing Databricks-based solutions for Azure …
Posted:

Data Architect (Trading)

London, England, United Kingdom
Hybrid / WFH Options
Experteer Italy
Expertise in data warehousing, data modelling, and data integration. * Experience in MLOps and machine learning pipelines. * Proficiency in SQL and data manipulation languages. * Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS. Education & Qualifications * Bachelor's or Master's degree in Computer Science, Engineering, or a …
Posted:

Machine Learning Engineer, II

London, United Kingdom
Spotify
Python, or similar languages. Experience with TensorFlow, PyTorch, Scikit-learn, etc. is a strong plus. You have some experience with large-scale, distributed data processing frameworks/tools like Apache Beam, Apache Spark, or Scio, our open-source Scala API for Beam, and cloud platforms like GCP or AWS. You care about agile software processes, data-driven development …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

Bristol, England, United Kingdom
Hybrid / WFH Options
Leonardo
exciting, and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization Strong experience with Apache NiFi for building and managing complex data flows and integration processes Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control Familiarity with data governance …
Posted:

Solutions Architect (Data Analytics)- Presales, RFP creation

City of London, England, United Kingdom
JR United Kingdom
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Designing Databricks-based solutions for Azure …
Posted:

Snowflake Architect

Basildon, England, United Kingdom
TestYantra Software Solutions
data warehousing concepts and data modeling. Excellent problem-solving and communication skills focused on delivering high-quality solutions. Understanding or hands-on experience with orchestration tools such as Apache Airflow. Deep knowledge of non-functional requirements such as availability, scalability, operability, and maintainability.
Posted:

Data Engineer

London, England, United Kingdom
Sporty
relational and NoSQL databases. Experience with data modelling. General understanding of data architectures and event-driven architectures. Proficient in SQL. Familiarity with one scripting language, preferably Python. Experience with Apache Airflow & Apache Spark. Solid understanding of cloud data services: AWS services such as S3, Athena, EC2, Redshift, EMR (Elastic MapReduce), EKS, RDS (Relational Database Service) and Lambda. Nice to have: …
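S3 and Athena, listed together above, are commonly paired by writing objects under a Hive-style partitioned key layout that Athena can prune at query time. A small sketch of that layout; the `raw` prefix, table name, and filename are hypothetical:

```python
from datetime import datetime, timezone

def s3_partition_key(prefix: str, table: str, ts: datetime, filename: str) -> str:
    # Build a Hive-style partitioned key: year=/month=/day= path segments
    # let Athena skip objects outside the queried date range.
    return (f"{prefix}/{table}/"
            f"year={ts.year:04d}/month={ts.month:02d}/day={ts.day:02d}/"
            f"{filename}")

key = s3_partition_key(
    "raw", "events",
    datetime(2024, 5, 3, tzinfo=timezone.utc),
    "part-0000.parquet",
)
```

With this layout, a query filtered on `year`, `month`, and `day` only scans the matching prefixes instead of the whole table.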
Posted:

Data Engineer - Manager

London, United Kingdom
Cloud Decisions
Azure, AWS, GCP) Hands-on experience with SQL, Data Pipelines, Data Orchestration and Integration Tools Experience in data platforms on-premises/cloud using technologies such as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery) …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer (Remote) - UK Software Engineering London

London, United Kingdom
Hybrid / WFH Options
Alphasights
and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing demands of data … pipelines , data warehouses , and leveraging AWS data services . Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications , and workflow orchestration using Apache Airflow . Familiar with ETL frameworks, and bonus experience with Big Data processing (Spark, Hive, Trino), and data streaming. Proven track record - You've made a demonstrable impact in …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Big Data Engineer - AI Forecasting

London, England, United Kingdom
ASOS.com
in Spark/PySpark, Azure data technologies, Python or Scala, SQL. Experience with testing frameworks like pytest or ScalaTest. Knowledge of open table formats such as Delta, Iceberg, or Apache Hudi. Experience with CI/CD workflows using Azure DevOps, GitHub Actions, and version control systems like GIT. Understanding of cloud infrastructure and Infrastructure as Code (Terraform or Bicep … Scrum or Kanban. Nice to have skills: Experience in retail or e-commerce. Knowledge of Big Data and Distributed Computing. Familiarity with streaming technologies like Spark Structured Streaming or Apache Flink. Additional programming skills in PowerShell or Bash. Understanding of Databricks Ecosystem components. Experience with Data Observability or Data Quality frameworks. Additional Information What's in it for you …
Posted:

Senior Big Data Engineer/Architect – Scala – Distributed Systems

London, England, United Kingdom
Deutsche Bank AG, Frankfurt am Main
Build, modernise, and re-architect enterprise data systems. Migrate on-prem systems to Google Cloud Platform (GCP), leveraging Dataproc, BigQuery, and other GCP-native tooling. Use technologies such as Apache Spark, Hadoop, and Scala to process large-scale distributed datasets. Contribute to infrastructure automation (CI/CD) and hybrid cloud deployment pipelines using tools such as GitHub Actions, Renovate … and, if required, step up to contribute to architecture and system design. Essential Technical Criteria: Strong Scala development skills with experience in large-scale Big Data environments. Proficiency in Apache Spark; working knowledge of Hadoop. Familiarity with GCP (Dataproc, BigQuery) or other public cloud platforms (AWS, Azure). Experience with Kubernetes or OpenShift (on-prem or hybrid environments).
Posted:

Sr. Data Engineer

London, England, United Kingdom
EOG Resources, Inc
Metrics instances, on-prem S3, and codebases in Python, TypeScript, and Kotlin. Maintain and manage pipeline between real-time data from streaming/non-streaming data sources such as Apache Pulsar/Oracle/MemSQL to application data stores and services. JOB REQUIREMENTS MINIMUM EDUCATION: Bachelor's Degree in Computer Science, CIS, MIS or Information Technology. MINIMUM EXPERIENCE: Five …
Posted:

Software Engineer

City of London, London, United Kingdom
Anson McCade
teams • Mentor junior developers Requirements: • British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) Active DV Clearance If you do not meet all requirements, still feel free to apply. …
Posted:

Software Engineer

London Area, United Kingdom
Anson McCade
teams • Mentor junior developers Requirements: • British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) Active DV Clearance If you do not meet all requirements, still feel free to apply. …
Posted:

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Tenzo Limited
professional to make a significant impact at Tenzo. This role is pivotal in shaping how our product integrates and interacts with external systems, partners, and platforms. Our Tech Stack: Apache Airflow Python Django AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda, etc.) Snowflake Terraform CircleCI Your mission Design and develop data pipelines, orchestrating key activities such as … in SQL and experience with relational databases such as PostgreSQL, including database administration, tuning, and optimisation (Highly desirable). Experience with data pipeline and workflow management tools such as Apache Airflow (Nice to have). Proficiency in Git (Important). Ability and eagerness to write high-quality code, technical documentation, architecture diagrams, and production plans (Important). Strong understanding …
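Orchestrators such as Apache Airflow, part of the stack listed above, re-run failed tasks with exponential backoff between attempts. As a rough illustration of that behaviour, and not Airflow's actual API, here is a minimal plain-Python sketch:

```python
import time

def run_with_retries(task, max_retries=3, base_delay=0.01):
    # Re-run a flaky callable, doubling the wait after each failure,
    # the way an orchestrator's retry policy would.
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

def flaky():
    # Fails twice, then succeeds: simulates a transient upstream outage.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky)
```

In Airflow itself the equivalent knobs are per-task settings such as `retries` and `retry_exponential_backoff` rather than hand-rolled loops.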
Employment Type: Permanent
Salary: GBP Annual
Posted:

Software Engineer

London, England, United Kingdom
Anson McCade
in Agile (SCRUM) teams Requirements: British-born sole UK National with active SC or DV Clearance Strong Java skills, familiarity with Python Experience in Linux, Git, CI/CD, Apache NiFi Knowledge of Oracle, MongoDB, React, Elasticsearch Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) If you do not meet all requirements, still feel free to apply. Benefits: Bonus …
Posted:

Data & Analytics Senior Data Engineer Professional Multiple Cities

Leicester, Leicestershire, United Kingdom
Avature
technical and professional experience Preferred Skills: Experience working within the public sector. Knowledge of cloud platforms (e.g., IBM Cloud, AWS, Azure). Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop). Understanding of data warehousing concepts and experience with tools like IBM Cognos or Tableau. Certifications: While not required, the following certifications would be highly beneficial: … ABOUT BUSINESS UNIT IBM Consulting is IBM's consulting and global professional services business, with market-leading capabilities in …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Software Engineer - Defence - 4 day week

Gloucester, Gloucestershire, South West, United Kingdom
Hybrid / WFH Options
Anson Mccade
tools like JUnit, Git, Jira, MongoDB, and React Familiarity with cloud platforms (especially AWS), microservices, and containerisation DV clearance (or eligibility to obtain it) Nice to Have: Experience with Apache NiFi, JSF, Hibernate, Elasticsearch, Kibana, or AWS services like EC2, Lambda, EKS CI/CD pipeline expertise using GitLab Knowledge of secure, scalable architectures for cloud deployments …
Employment Type: Permanent, Work From Home
Salary: £75,000
Posted:

Data Engineer

London, England, United Kingdom
Rise Technical
ideally with experience using data processing frameworks such as Kafka, NoSQL, Airflow, TensorFlow, or Spark. Finally, experience with cloud platforms like AWS or Azure, including data services such as Apache Airflow, Athena, or SageMaker, is essential for the role. This is a fantastic opportunity for a Data Engineer to join a rapidly expanding start-up at an important time where you …
Posted:

Snowflake Architect

Basildon, England, United Kingdom
Test Yantra
of data warehousing concepts and data modeling. Excellent problem-solving and communication skills focused on delivering high-quality solutions. Understanding or hands-on experience with orchestration tools such as Apache Airflow. Deep knowledge of non-functional requirements such as availability, scalability, operability, and maintainability. Seniority level: Mid-Senior level. Employment type: Full-time.
Posted:
Apache salaries in England:
10th Percentile: £37,574
25th Percentile: £65,000
Median: £110,000
75th Percentile: £125,000
90th Percentile: £138,750