Apache Job Vacancies

26 to 50 of 927 Apache Jobs

Sr. Software Engineer, Infrastructure

London, England, United Kingdom
Hybrid / WFH Options
Circadia Technologies Ltd
frameworks such as Boost.Test, Google Test, etc. Nice-to-Haves: Experience with Azure services for managing GPT pipelines and multi-cloud infrastructure. Familiarity with big data technologies such as Apache Spark, Kafka, and MSK for large-scale data processing. Experience with Boost libraries (Asio, Beast). Advanced experience in cost optimization strategies for cloud infrastructure and database performance tuning. More ❯
Posted:

Senior Software Engineer in Test (SDET)

London, England, United Kingdom
Hybrid / WFH Options
Hargreaves Lansdown
or a related field, or equivalent experience. Experience: Advanced experience in test automation development using tools like Selenium, JUnit, TestNG, Cypress, etc. Familiarity with performance testing tools such as Apache Bench, JMeter, or LoadRunner, or modern alternatives like K6, Gatling, Locust. Familiarity with BDD tools like Cucumber or SpecFlow. Skills: Proficiency in programming languages such as Java, Python, or More ❯
Posted:
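The SDET listing above names Apache Bench, JMeter, and newer tools such as Locust for performance testing. Purely as an illustration (the endpoints and load shape are invented, not taken from the ad), a minimal Locust script looks roughly like this:

```python
# Minimal Locust load-test sketch; endpoints and weights are hypothetical.
from locust import HttpUser, task, between


class BrowsingUser(HttpUser):
    # Each simulated user waits 1-3 seconds between requests.
    wait_time = between(1, 3)

    @task(3)
    def view_homepage(self):
        self.client.get("/")

    @task(1)
    def view_prices(self):
        self.client.get("/api/prices")
```

Run with `locust -f locustfile.py --host https://staging.example.com` (host is a placeholder) to ramp users up from a web UI; Apache Bench covers the simpler fixed-burst case, e.g. `ab -n 1000 -c 50 https://staging.example.com/` fires 1,000 requests at a concurrency of 50.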

Senior Database Engineer

Cardiff, Wales, United Kingdom
Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field Proficiency in English spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks More ❯
Posted:

Senior Database Engineer

Gloucester, England, United Kingdom
Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field Proficiency in English spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks More ❯
Posted:

Senior Software Engineer (Viator)

London, England, United Kingdom
Hybrid / WFH Options
Tripadvisor
RDS, S3, CloudWatch, Puppet, Docker Experience building and running monitoring infrastructure at a large scale. For example, Elasticsearch clusters, Prometheus, Kibana, Grafana, etc. Web applications and HTTP servers – Java, Apache, nginx Load balancers – ELB, HAProxy, nginx Experience in running SQL/NoSQL data stores – RDS, DynamoDB, ElastiCache, Solr Perks of Working at Viator Competitive compensation packages, including base salary More ❯
Posted:

Senior Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and More ❯
Employment Type: Permanent
Posted:
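The Circana role above centres on orchestrating PySpark workloads with Apache Airflow. As a rough sketch only (DAG id, schedule, and application path are invented; it assumes Airflow 2.4+, the apache-airflow-providers-apache-spark package, and a configured `spark_default` connection), such a pipeline might be wired up like this:

```python
# Hypothetical Airflow DAG submitting a PySpark job; names and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # one run per day (Airflow 2.4+ syntax)
    catchup=False,
) as dag:
    transform = SparkSubmitOperator(
        task_id="transform_sales",
        application="/opt/jobs/transform_sales.py",  # PySpark script handed to spark-submit
        conn_id="spark_default",
        conf={"spark.sql.shuffle.partitions": "200"},
    )
```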

Senior Software Engineer

City of London, England, United Kingdom
Hybrid / WFH Options
Paul Murphy Associates
support market surveillance and compliance efforts. The platform leverages advanced analytics and machine learning to identify trading behaviors that could trigger regulatory attention. The tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native tools. You’ll work alongside a high-impact engineering team to build fault-tolerant … data pipelines and services that process massive time-series datasets in both real-time and batch modes. Key Responsibilities: Design and build scalable, distributed systems using Java, Python, and Apache Spark Develop and optimize Spark jobs on AWS Serverless EMR for large-scale time-series processing Build event-driven and batch workflows using AWS Lambda, SNS/SQS, and … and non-technical stakeholders Qualifications: Strong backend software development experience, especially in distributed systems and large-scale data processing Advanced Java programming skills (multithreading, concurrency, performance tuning) Expertise in Apache Spark and Spark Streaming Proficiency with AWS services such as Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on Python experience, particularly More ❯
Posted:
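The surveillance-platform role above combines event-driven AWS pieces (Lambda, SQS, DynamoDB, S3). As an illustrative sketch only, with the table name, queue wiring, and payload shape assumed rather than taken from the ad, an SQS-triggered Lambda consumer might look like:

```python
# Hypothetical SQS-triggered Lambda writing trade events to DynamoDB.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("trade_events")  # assumed table name


def handler(event, context):
    # An SQS trigger delivers a batch of records in event["Records"].
    for record in event["Records"]:
        trade = json.loads(record["body"])
        table.put_item(
            Item={
                "trade_id": trade["trade_id"],   # partition key (assumed schema)
                "ts": trade["timestamp"],
                "symbol": trade["symbol"],
                "quantity": trade["quantity"],
            }
        )
    return {"processed": len(event["Records"])}
```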

Senior Data Engineer (Remote)

London, England, United Kingdom
Hybrid / WFH Options
Circana
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and More ❯
Posted:

Senior Software Engineer

City of London, England, United Kingdom
Hybrid / WFH Options
Paul Murphy Associates
support market surveillance and compliance efforts. The platform leverages advanced analytics and machine learning to identify trading behaviors that could trigger regulatory attention. The tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native tools. You’ll work alongside a high-impact engineering team to build fault-tolerant … data pipelines and services that process massive time-series datasets in both real-time and batch modes. Key Responsibilities: Design and build scalable, distributed systems using Java, Python, and Apache Spark Develop and optimize Spark jobs on AWS Serverless EMR for large-scale time-series processing Build event-driven and batch workflows using AWS Lambda, SNS/SQS, and … and non-technical stakeholders Qualifications: Strong backend software development experience, especially in distributed systems and large-scale data processing Advanced Java programming skills (multithreading, concurrency, performance tuning) Expertise in Apache Spark and Spark Streaming Proficiency with AWS services such as Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on Python experience, particularly More ❯
Posted:

Senior Java Software Engineer

Glasgow, UK
Amici Procurement Solutions
modern technology stacks to build and optimize a powerful data platform and search engine. With an opportunity to explore vector search, machine learning, and large-scale data processing using Apache Lucene, Solr, or Elasticsearch. What you’ll be doing: Design, build, and optimize a high-performance data platform and search solution. Develop robust search capabilities using Apache Lucene … and search technologies. Role Profile You have strong experience in Java development and exposure to Python. Have experience with large-scale data processing and search technologies. An expert in Apache Lucene, Solr, or Elasticsearch; if not, you have the appetite to learn more. Hands-on experience with SQL and NoSQL databases under your belt. Hold a degree in Computer Science More ❯
Employment Type: Full-time
Posted:
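The Amici role above is Java-centred, but the core Lucene/Solr/Elasticsearch idea (index a document, then run a relevance query) is easy to sketch. Here is a rough illustration using the Elasticsearch Python client (8.x assumed); the cluster URL, index name, and fields are all made up:

```python
# Hypothetical indexing and full-text search with the elasticsearch-py client.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

# Index a document (the "products" index and its fields are placeholders).
es.index(index="products", id="1", document={
    "name": "stainless steel beaker",
    "description": "500 ml laboratory beaker, autoclave safe",
})

# Relevance-scored full-text query over the description field.
hits = es.search(index="products", query={"match": {"description": "laboratory beaker"}})
for hit in hits["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["name"])
```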

Senior Data Engineer (Databricks) - UK

London, England, United Kingdom
Hybrid / WFH Options
Datapao
work for the biggest multinational companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. Additionally, at this seniority level … no shortage of learning opportunities at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, you'll get dedicated time for learning during work hours, and access … year , depending on your assessed seniority level during the selection process. About DATAPAO At DATAPAO, we are delivery partners and the preferred training provider for Databricks, the creators of Apache Spark. Additionally, we are Microsoft Gold Partners in delivering cloud migration and data architecture on Azure. Our delivery partnerships enable us to work in a wide range of industries More ❯
Posted:

Data Architect (Bristol - Hybrid, 2 Office Days)

London, England, United Kingdom
Hybrid / WFH Options
SBS
and Lakehouse Design: Strong data modelling, design, and integration expertise. Data Mesh Architectures: In-depth understanding of data mesh architectures. Technical Proficiency: Proficient in dbt, SQL, Python/Java, Apache Spark, Trino, Apache Airflow, and Astro. Cloud Technologies: Awareness and experience with cloud technologies, particularly AWS. Analytical Skills: Excellent problem-solving and analytical skills with attention to detail. More ❯
Posted:

Senior Software Engineer

Manchester, England, United Kingdom
Datalex
experience working as a Software Engineer on large software applications Proficient in many of the following technologies – Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems – DynamoDB, DocumentDB, MongoDB Demonstrated expertise in unit testing and tools – JUnit, Mockito, PyTest, Selenium. Strong working knowledge More ❯
Posted:
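Among the technologies in the Datalex ad above, Kafka is the easiest to show in a few lines. A minimal produce/consume round trip, sketched with the kafka-python client (broker address, topic, and message shape are placeholders):

```python
# Hypothetical Kafka produce/consume round trip using kafka-python.
import json

from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                       # assumed broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("booking-events", {"booking_id": 42, "status": "CONFIRMED"})
producer.flush()

consumer = KafkaConsumer(
    "booking-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)   # {'booking_id': 42, 'status': 'CONFIRMED'}
    break
```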

Senior Software Engineer, Python

London, England, United Kingdom
Hybrid / WFH Options
Cboe Global Markets, Inc
or other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order management More ❯
Posted:

Software Engineer, Python

London, England, United Kingdom
Hybrid / WFH Options
Cboe Global Markets
or other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order management More ❯
Posted:

Senior Data Engineer - Snowflake - £100,000 - London - Hybrid

London, South East, England, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DevOps experience building and deploying using Terraform Nice to Have: DBT Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing; don't miss your chance to More ❯
Employment Type: Full-Time
Salary: £85,000 - £100,000 per annum
Posted:

Senior Data Engineer - Snowflake - £100,000 - London - Hybrid

City of London, London, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DevOps experience building and deploying using Terraform Nice to Have: DBT Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing; don't miss your chance to More ❯
Employment Type: Permanent
Salary: £85000 - £100000/annum + Top Benefits
Posted:

Senior Data Engineer - Snowflake - £110,000 - London - Hybrid

City of London, England, United Kingdom
Hybrid / WFH Options
Jefferson Frank
the business's data arm. Requirements: * 3+ Years data engineering experience * Snowflake experience * Proficiency across an AWS tech stack * DBT Expertise * Terraform Experience Nice to Have: * Data Modelling * Data Vault * Apache Airflow Benefits: * Up to 10% Bonus * Up to 14% Pensions Contribution * 29 Days Annual Leave + Bank Holidays * Free Company Shares Interviews ongoing; don't miss your chance to More ❯
Posted:

Data Engineer

City of Bristol, England, United Kingdom
Peaple Talent
patterns in pipeline architecture and design. Confident using Git-based version control systems, including Azure DevOps or similar. Skilled in managing and scheduling data workflows using orchestration platforms like Apache Airflow. Involved in building and optimizing data warehouses on modern analytics platforms like Snowflake, Redshift, or Databricks. Familiar with visual or low-code data integration tools, including platforms such More ❯
Posted:

Senior QA Engineer - ETL Data Analysis - Contract

London, England, United Kingdom
ECOM
products or platforms Strong knowledge of SQL and experience with large-scale relational and/or NoSQL databases Experience testing data pipelines (ETL/ELT), preferably with tools like Apache Airflow, dbt, Spark, or similar Proficiency in Python or similar scripting language for test automation Experience with cloud platforms (AWS, GCP, or Azure), especially in data-related services Familiarity More ❯
Posted:
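The QA contract above is about validating ETL/ELT output rather than application code. A hedged sketch of what such checks can look like with pytest (the table, columns, and fixture data are invented; a real suite would query the warehouse or lake the pipeline wrote to):

```python
# Hypothetical pytest data-quality checks against the output of an ETL step.
# A small in-memory DataFrame stands in for the loaded target table.
import pandas as pd
import pytest


@pytest.fixture
def orders():
    return pd.DataFrame(
        {
            "order_id": [1, 2, 3],
            "amount": [10.0, 25.5, 7.99],
            "currency": ["GBP", "GBP", "EUR"],
        }
    )


def test_primary_key_is_unique(orders):
    assert orders["order_id"].is_unique


def test_no_negative_amounts(orders):
    assert (orders["amount"] >= 0).all()


def test_currency_codes_are_known(orders):
    assert set(orders["currency"]) <= {"GBP", "EUR", "USD"}
```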

Data Architect

London, United Kingdom
Applicable Limited
with interface/API data modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and DBT. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business empowering local teams with exciting More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, England, United Kingdom
Typeform
and SQL for data pipelines Experience with modern cloud data warehouses (like AWS Redshift, GCP BigQuery, Azure Synapse or Snowflake) Strong communication skills and fluency in English Experience with Apache Spark (in both batch and streaming) Experience with a job orchestrator (Airflow, Google Cloud Composer, Flyte, Prefect, Dagster) Hands-on experience with AWS Experience with dbt *Typeform drives hundreds More ❯
Posted:
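The Typeform role above expects Spark in both batch and streaming modes. A minimal batch-side sketch with PySpark (the input/output paths, event types, and column names are placeholders, not details from the ad):

```python
# Hypothetical PySpark batch aggregation; paths and columns are made up.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_submission_counts").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # placeholder path

daily_counts = (
    events
    .filter(F.col("event_type") == "form_submitted")
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("submissions"))
)

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_submissions/")
```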

Data Architect

London, England, United Kingdom
NTT DATA
developing and implementing enterprise data models. Experience with Interface/API data modelling. Experience with CI/CD GitHub Actions (or similar) Knowledge of Snowflake/SQL Knowledge of Apache Airflow Knowledge of DBT Familiarity with Atlan for data catalog and metadata management Understanding of Iceberg tables Who we are: We’re a business with a global reach that More ❯
Posted:

Senior Automation Engineer, Python

Manchester, Lancashire, United Kingdom
Roku, Inc
or MS degree in Computer Science or equivalent Experience in developing Finance or HR related applications Working experience with Tableau Working experience with Terraform Experience in creating workflows for Apache Airflow and Jenkins Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive More ❯
Employment Type: Permanent
Salary: GBP Annual
Posted:
Apache salary percentiles:
10th Percentile: £37,574
25th Percentile: £60,375
Median: £110,000
75th Percentile: £122,500
90th Percentile: £138,750