Apache Job Vacancies

101 to 125 of 927 Apache Jobs

Data Consultant(s) - Data Engineer

Liverpool, Lancashire, United Kingdom
Hybrid / WFH Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:
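The listing above names the Medallion Architecture alongside JSON/CSV/Parquet handling. A minimal stdlib sketch of the bronze/silver/gold layering idea follows; the records, field names, and validation rule are illustrative, not Intuita's actual pipeline:

```python
import json

# Bronze layer: raw records exactly as ingested (may contain nulls or bad types).
bronze = [
    '{"user": "a", "amount": "12.5"}',
    '{"user": "b", "amount": null}',
    '{"user": "a", "amount": "7.5"}',
]

def to_silver(raw_rows):
    """Silver layer: parsed, typed, validated records; invalid rows dropped."""
    out = []
    for line in raw_rows:
        rec = json.loads(line)
        if rec.get("amount") is None:  # illustrative validation rule
            continue
        out.append({"user": rec["user"], "amount": float(rec["amount"])})
    return out

def to_gold(silver_rows):
    """Gold layer: business-level aggregate (total spend per user)."""
    totals = {}
    for rec in silver_rows:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

gold = to_gold(to_silver(bronze))
```

In a real lakehouse each layer would be persisted (e.g. as Parquet) rather than held in memory; the point here is only the raw → cleaned → aggregated progression.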

Sr. Data Engineer

Edinburgh, Scotland, United Kingdom
Addepar
Are A degree in computer science, engineering, mathematics or a related technical field Experience with object-oriented programming preferred General familiarity with some of the technologies we use: Python, Apache Spark/PySpark, Java/Spring Amazon Web Services SQL, relational databases Understanding of data structures and algorithms Interest in data modeling, visualisation, and ETL pipelines Knowledge of financial More ❯
Posted:

Senior DevOps Engineer

National, United Kingdom
NHS England
OCI Images), GitHub Actions, Gradle, Jenkins (legacy, moving towards GitHub Actions), Maven, SonarCloud Data : Elasticsearch, MongoDB, MySQL, Neo4J IaC : Ansible, Terraform Languages : Java, Python, TypeScript Monitoring : Grafana, Prometheus Misc : Apache (legacy, moving towards AWS CloudFront/API Gateway), Git (GitHub), Linux (Ubuntu), RabbitMQ We are looking to start or make more use of the following AWS services : CloudTrail Secrets More ❯
Employment Type: Permanent
Salary: £62215.00 - £72293.00 a year
Posted:

Lead Data Engineer Remote/Home Based, UK

London, England, United Kingdom
Hybrid / WFH Options
Aker Systems Limited
exploring new technologies and methodologies to solve complex data challenges. Proven experience leading data engineering projects or teams. Expertise in designing and building data pipelines using frameworks such as Apache Spark, Kafka, Glue, or similar. Solid understanding of data modelling concepts and experience working with both structured and semi-structured data. Strong knowledge of public cloud services, especially AWS More ❯
Posted:

Lead Data Engineer

Manchester, England, United Kingdom
Hybrid / WFH Options
Made Tech
strategies. Strong experience in IaC and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes) Ability to create data pipelines on a cloud More ❯
Posted:

Senior Software Engineer, Infrastructure

London, England, United Kingdom
Hybrid / WFH Options
Intercom
might be more valuable than your direct technical contributions on a project. You care about your craft In addition it would be a bonus if you have Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating Airflow at scale would be helpful. Experience More ❯
Posted:
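The Intercom listing centres on orchestrating workflows with Apache Airflow. The core idea Airflow builds on, tasks as a directed acyclic graph executed in dependency order, can be sketched with the standard library alone (this is not the Airflow API; task names are invented for illustration):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "load": {"extract"},
    "transform": {"extract"},
    "report": {"load", "transform"},
}

# static_order() yields tasks in an order that respects every dependency,
# which is exactly what a scheduler needs before dispatching work.
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and distributed execution on top of this ordering, which is where the "quirks of operating Airflow at scale" the listing mentions come from.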

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Deel
based languages is a plus. Data Warehousing: Hands-on experience with cloud-based data warehouses. Data Modeling: Proficiency in designing efficient database schemas. Workflow Orchestration: Familiarity with tools like Apache Airflow. Data Streaming: Experience with data streaming and Change Data Capture (CDC). Infrastructure: Proficiency in Terraform and GitHub Actions. Compliance: Experience in setting up PII anonymization and RBAC. More ❯
Posted:
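The Deel listing asks for experience setting up PII anonymization. One common approach is deterministic pseudonymization: hash the sensitive field with a secret salt so joins still work but the raw value is unrecoverable. A minimal sketch, with an illustrative salt and field names (real systems keep the salt in a secrets manager and often use tokenization instead):

```python
import hashlib

SALT = b"example-salt"  # illustrative only; never hard-code secrets in practice

def anonymize(value: str) -> str:
    """Deterministic pseudonym: same input -> same token, not reversible."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

row = {"email": "jane@example.com", "amount": 42}
safe_row = {**row, "email": anonymize(row["email"])}
```

Because the mapping is deterministic, two tables anonymized with the same salt can still be joined on the pseudonymized column.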

Java UI Developer

England, United Kingdom
Bank of America
as required; comfortable building multi page Web Applications from scratch. Expertise with Application Server integration; JBoss 7, SpringBoot or later preferred. Proficient in developing microservices with SpringBoot Knowledge of Apache Web Server preferred. Database Skills with working knowledge of Structured Query Language (e.g. SQL/NoSQL commands and queries). 2+ years Working with Oracle, MySQL, MS SQL and More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Principal Data Engineer

London, England, United Kingdom
Epam
code to ensure high availability and accessibility Requirements Minimum of 8 years of experience in data engineering At least 5 years of hands-on experience with Azure data services (Apache Spark, Azure Data Factory, Synapse Analytics, RDBMS such as SQL Server) Proven leadership and management experience in data engineering teams Proficiency in PySpark, Python (with Pandas), T-SQL, SparkSQL … Ability to manage multiple projects and meet deadlines Azure certifications such as Microsoft Certified: Azure Data Engineer Associate or Azure Solutions Architect Nice to Have Experience with Scala for Apache Spark Knowledge of other cloud platforms like AWS or GCP Our Benefits Include Group pension plan, life assurance, income protection, and critical illness cover Private medical insurance and dental More ❯
Posted:

Backend Python Developer

London Area, United Kingdom
Hybrid / WFH Options
Roc Search
Manage deployments with Helm and configuration in YAML. Develop shell scripts and automation for deployment and operational workflows. Work with Data Engineering to integrate and manage data workflows using Apache Airflow and DAG-based models. Perform comprehensive testing, debugging, and optimization of backend components. Required Skills Bachelor's degree in Computer Science, Software Engineering, or a related field (or … and YAML for defining deployment configurations and managing releases. Proficiency in shell scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Familiarity with database systems (SQL and NoSQL) and proficiency in writing efficient queries. Solid understanding of software development best practices, including version More ❯
Posted:

Backend Python Developer

City of London, London, United Kingdom
Hybrid / WFH Options
Roc Search
Manage deployments with Helm and configuration in YAML. Develop shell scripts and automation for deployment and operational workflows. Work with Data Engineering to integrate and manage data workflows using Apache Airflow and DAG-based models. Perform comprehensive testing, debugging, and optimization of backend components. Required Skills Bachelor's degree in Computer Science, Software Engineering, or a related field (or … and YAML for defining deployment configurations and managing releases. Proficiency in shell scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Familiarity with database systems (SQL and NoSQL) and proficiency in writing efficient queries. Solid understanding of software development best practices, including version More ❯
Posted:

Data Architect

London, England, United Kingdom
Hybrid / WFH Options
Nadara
lake, data warehouse, lakehouse, and cloud-native designs. Experience with Inmon, Data Vault 2.0, Kimball, and dimensional modelling. Knowledge of integration patterns, ETL/ELT processes, and tools (e.g., Apache Airflow, Azure Data Factory, Informatica, Talend) to orchestrate data workflows. Familiarity with DevOps/MLOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation). Basic More ❯
Posted:

Senior Technical Lead - Compute Services, SVP

London, United Kingdom
Hybrid / WFH Options
Citigroup Inc
technical direction to a growing team of developers globally. The platform is a Greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, Apache Ignite. The platform runs in a hybrid mode both on-premise and in AWS utilising technologies such as EKS, S3, FSX. The main purpose of this role is to More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer with Security Clearance

Boston, Massachusetts, United States
Eliassen Group
Responsibilities: Develop, optimize, and maintain data ingest flows using Apache Kafka, Apache Nifi and MySQL/PostgreSQL Develop within the components in the AWS cloud platform using services such as RedShift, SageMaker, API Gateway, QuickSight, and Athena Communicate with data owners to set up and ensure configuration parameters Document SOPs related to streaming configuration, batch configuration or API … machine learning techniques Strong understanding of programming languages like Python, R, and Java Expertise in building modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache Nifi Proficient in programming languages like Java, Scala, or Python Experience or expertise using, managing, and/or testing API Gateway tools and Rest APIs More ❯
Employment Type: Permanent
Salary: USD Annual
Posted:
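The Eliassen listing describes ingest flows that consume a stream (Kafka), validate each event, and load it into a relational table. The shape of such a flow can be sketched without a broker; the generator below stands in for a Kafka topic, and the list stands in for the database table, with event fields and the validation rule invented for illustration:

```python
import json

def message_stream():
    """Stand-in for a Kafka topic: yields serialized events in arrival order."""
    events = [{"id": 1, "temp": 21.5}, {"id": 2, "temp": -999}, {"id": 3, "temp": 19.0}]
    for evt in events:
        yield json.dumps(evt)

def ingest(stream, table):
    """Consume, validate, and load each event (the Kafka -> database hop)."""
    for raw in stream:
        rec = json.loads(raw)
        if rec["temp"] < -100:  # drop sensor-error sentinel values
            continue
        table.append(rec)
    return table

rows = ingest(message_stream(), [])
```

A production flow would replace the generator with a Kafka consumer and the list append with a parameterized INSERT, but the consume-validate-load loop is the same.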

Data Engineer (Python/Snowflake/Kafka) REMOTE UK, £70k

Nottingham, England, United Kingdom
Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS Platform) Technologies: Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS Environment: Large-scale data environment, Fully remote UK, Microservices architecture About the Role Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures, and data pipelines within an enterprise-scale data processing … infrastructure background, understanding of system migrations, and experience with data warehousing concepts. Technical Skills Deep understanding of SQL and NoSQL databases (MongoDB or similar) Experience with streaming platforms like Apache Kafka Development and maintenance of ELT pipelines Knowledge of data warehousing best practices High proficiency in Apache Kafka and Apache Airflow Strong AWS experience Additional Attributes Agile More ❯
Posted:

GCP Cloud Architect

London, England, United Kingdom
ZipRecruiter
data programs. 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or … Composer (Apache Airflow). Proficiency in at least one scripting/programming (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL and experience with schema design and query optimization for large datasets. Expertise in More ❯
Posted:

Senior Data Engineer

Bedford, England, United Kingdom
ZipRecruiter
the contract. Benefits include Medical, Dental, Vision, 401k with company matching, and life insurance. Rate: $80 - $86/hr W2 Responsibilities: Develop, optimize, and maintain data ingestion flows using Apache Kafka, Apache Nifi, and MySQL/PostgreSQL. Develop within AWS cloud services such as RedShift, SageMaker, API Gateway, QuickSight, and Athena. Coordinate with data owners to ensure proper … analysis, data visualization, and machine learning techniques. Proficiency in programming languages such as Python, R, and Java. Experience in building modern data pipelines and ETL processes with tools like Apache Kafka and Apache Nifi. Proficiency in Java, Scala, or Python programming. Experience managing or testing API Gateway tools and Rest APIs. Knowledge of traditional databases like Oracle, MySQL More ❯
Posted:

Data Engineer (Python/Snowflake/Kafka) REMOTE UK, £70k

Nottingham, Nottinghamshire, United Kingdom
Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing … platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have experience with streaming platforms like Apache Kafka and be able to develop and maintain ELT pipelines and essentially bring a solid understanding of data warehousing concepts and best practice. You will understand Apache Kafka to … a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, you will be an AWS enthusiast! Naturally you will have a good understanding of AWS. I'd love you to be an advocate of Agile too - these guys are massive on Agile Delivery and Scrum - so it's important you share a similar mind-set and More ❯
Employment Type: Permanent
Salary: £65000 - £70000/annum
Posted:

Senior Engineer - Data

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
Eden Scott
cutting-edge technologies. About the Role You’ll be part of an agile, cross-functional team building a powerful data platform and intelligent search engine. Working with technologies like Apache Lucene, Solr, and Elasticsearch, you'll contribute to the design and development of scalable systems, with opportunities to explore machine learning, AI-driven categorisation models, and vector search. What … You’ll Be Doing Design and build high-performance data pipelines and search capabilities. Develop solutions using Apache Lucene, Solr, or Elasticsearch. Implement scalable, test-driven code in Java and Python. Work collaboratively with Business Analysts, Data Engineers, and UI Developers. Contribute across the stack – from React/TypeScript front end to Java-based backend services. Leverage cloud infrastructure … leading data sets. Continuous improvements to how data is processed, stored, and presented. Your Profile Strong experience in Java development, with some exposure to Python. Hands-on knowledge of Apache Lucene, Solr, or Elasticsearch (or willingness to learn). Experience in large-scale data processing and building search functionality. Skilled with SQL and NoSQL databases. Comfortable working in Agile More ❯
Posted:
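The Eden Scott role is built around Lucene-family search (Lucene, Solr, Elasticsearch). The data structure at the heart of all three is the inverted index: a map from each term to the documents containing it, against which queries are set operations. A minimal sketch with invented documents:

```python
from collections import defaultdict

docs = {
    1: "scalable search engine built on lucene",
    2: "data pipelines feed the search index",
    3: "machine learning categorisation models",
}

def build_index(corpus):
    """Inverted index: map each term to the set of documents containing it."""
    index = defaultdict(set)
    for doc_id, text in corpus.items():
        for term in text.split():
            index[term].add(doc_id)
    return index

def search(index, *terms):
    """AND query: documents containing every query term."""
    postings = [index.get(t, set()) for t in terms]
    return set.intersection(*postings) if postings else set()

idx = build_index(docs)
```

Real engines add tokenization, ranking (e.g. BM25), and segment storage on top, but intersecting postings lists is still the core query path.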

Senior Backend Developer (TS/SCI) with Security Clearance

Springfield, Virginia, United States
IntelliBridge
utilizing the Django web framework for the backends and React for developing the client facing portion of the application Create, extract, transform, and load (ETL) pipelines using Hadoop and Apache Airflow for various production big data sources to fulfill intelligence data availability requirements Automate retrieval of data from various sources via API and direct database queries for intelligence analysts … for military personnel Required Qualifications: Active TS/SCI Required 7-10 years experience Preferred Qualifications: Bachelor's degree in related field preferred Windows 7/10, MS Project Apache Airflow Python, Java, JavaScript, React, Flask, HTML, CSS, SQL, R, Docker, Kubernetes, HDFS, Postgres, Linux AutoCAD JIRA, Gitlab, Confluence About Us: IntelliBridge delivers IT strategy, cloud, cybersecurity, application, data More ❯
Employment Type: Permanent
Salary: USD Annual
Posted:

Data Engineer

Lisburn, Northern Ireland, United Kingdom
JR United Kingdom
Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB) and data modeling principles. Proven ability to design, build, and maintain scalable data pipelines and workflows using tools like Apache Airflow or similar. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Nice to have: Hands-on experience with data warehouse and lakehouse architectures (e.g., Databricks, Snowflake … or similar). Experience with big data frameworks (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, or GCP). More ❯
Posted:

Machine Learning Engineer

London, England, United Kingdom
Hybrid / WFH Options
Trudenty
real-time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale ML inference More ❯
Posted:

Head of Data & Analytics Architecture and AI

London, United Kingdom
pladis Foods Limited
Data Storage & Databases: SQL & NoSQL Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning: Frameworks: TensorFlow, PyTorch, Scikit-learn, Keras More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer (Remote) - UK Software Engineering London

London, United Kingdom
Hybrid / WFH Options
Alphasights
and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing demands of data … pipelines , data warehouses , and leveraging AWS data services . Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications , and workflow orchestration using Apache Airflow . Familiar with ETL frameworks, and bonus experience with Big Data processing (Spark, Hive, Trino), and data streaming. Proven track record - You've made a demonstrable impact in More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineering Consultant

London, England, United Kingdom
Hybrid / WFH Options
Endava Limited
delivering high-quality solutions aligned with business objectives. Key Responsibilities Architect, implement, and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake, or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL … security measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Data Modelling: Designing dimensional, relational, and hierarchical data models. Scalability & Performance: Building fault-tolerant, highly available data architectures. Security & Compliance: Enforcing role-based access More ❯
Posted:
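The Endava listing calls for enforcing role-based access control (RBAC) alongside encryption and GDPR compliance. The mechanism reduces to a mapping from roles to permitted actions plus a single check before any operation; a minimal sketch with illustrative role and permission names:

```python
# Illustrative role -> permission mapping; real systems load this from policy config.
ROLE_PERMS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
    "admin": {"read", "write", "grant"},
}

def allowed(role: str, action: str) -> bool:
    """True if the role's permission set includes the requested action."""
    return action in ROLE_PERMS.get(role, set())
```

Centralizing the check in one function makes the policy auditable, which matters for the data-governance documentation the listing also mentions.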
Apache Salary Percentiles
10th Percentile: £37,574
25th Percentile: £60,375
Median: £110,000
75th Percentile: £122,500
90th Percentile: £138,750