Remote Apache Job Vacancies

26 to 50 of 343 Remote Apache Jobs

Senior Software Engineer

City of London, England, United Kingdom
Hybrid / WFH Options
Paul Murphy Associates
support market surveillance and compliance efforts. The platform leverages advanced analytics and machine learning to identify trading behaviors that could trigger regulatory attention. The tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native tools. You’ll work alongside a high-impact engineering team to build fault-tolerant … data pipelines and services that process massive time-series datasets in both real-time and batch modes.
Key Responsibilities: Design and build scalable, distributed systems using Java, Python, and Apache Spark. Develop and optimize Spark jobs on AWS Serverless EMR for large-scale time-series processing. Build event-driven and batch workflows using AWS Lambda, SNS/SQS, and … and non-technical stakeholders.
Qualifications: Strong backend software development experience, especially in distributed systems and large-scale data processing. Advanced Java programming skills (multithreading, concurrency, performance tuning). Expertise in Apache Spark and Spark Streaming. Proficiency with AWS services such as Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR. Experience with SQL and NoSQL databases. Hands-on Python experience, particularly …
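The core task this role describes, processing time-series trade data in windows, can be illustrated with a minimal pure-Python sketch. This is not the employer's actual code or stack; the function name, record shape, and window size are all hypothetical, and a real Spark job would express the same grouping with DataFrame operations over a cluster.

```python
from collections import defaultdict

def bucket_trades(trades, window_seconds=60):
    """Group (epoch_seconds, symbol, price) trades into fixed time windows
    and compute a per-window average price for each symbol.
    Hypothetical illustration of windowed time-series aggregation."""
    buckets = defaultdict(list)
    for ts, symbol, price in trades:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        buckets[(window_start, symbol)].append(price)
    return {key: sum(prices) / len(prices) for key, prices in buckets.items()}

trades = [(0, "ABC", 100.0), (30, "ABC", 102.0), (65, "ABC", 110.0)]
print(bucket_trades(trades))  # {(0, 'ABC'): 101.0, (60, 'ABC'): 110.0}
```

The same shape of computation, keyed grouping followed by an aggregate, is what a Spark `groupBy` over a time-bucket column would distribute across executors.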

Senior Data Engineer (Databricks) - UK

London, England, United Kingdom
Hybrid / WFH Options
Datapao
work for the biggest multinational companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. Additionally, at this seniority level … no shortage of learning opportunities at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, you'll get dedicated time for learning during work hours, and access … year, depending on your assessed seniority level during the selection process. About DATAPAO At DATAPAO, we are delivery partners and the preferred training provider for Databricks, the creators of Apache Spark. Additionally, we are Microsoft Gold Partners in delivering cloud migration and data architecture on Azure. Our delivery partnerships enable us to work in a wide range of industries …

Data Architect (Bristol - Hybrid, 2 Office Days)

London, England, United Kingdom
Hybrid / WFH Options
SBS
and Lakehouse Design: Strong data modelling, design, and integration expertise. Data Mesh Architectures: In-depth understanding of data mesh architectures. Technical Proficiency: Proficient in dbt, SQL, Python/Java, Apache Spark, Trino, Apache Airflow, and Astro. Cloud Technologies: Awareness and experience with cloud technologies, particularly AWS. Analytical Skills: Excellent problem-solving and analytical skills with attention to detail.

Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Prima
Governance techniques. Knowledge of MLOps and Feature engineering. Exposure to common data analysis and ML technologies such as scikit-learn, pandas, NumPy, XGBoost, LightGBM. Exposure to tools like Apache Oozie and Apache Airflow. Why You’ll Love It Here We want to make Prima a happy and empowering place to work. So if you decide to join us …

Senior Software Engineer, Python

London, England, United Kingdom
Hybrid / WFH Options
Cboe
testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order management …

Software Engineer, Python

London, England, United Kingdom
Hybrid / WFH Options
Cboe Global Markets
or other testing methodologies Preferred: Familiarity with PostgreSQL and Snowflake Preferred: Familiarity with Web Frameworks such as Django, Flask or FastAPI Preferred: Familiarity with event streaming platforms such as Apache Kafka Preferred: Familiarity with data pipeline platforms such as Apache Airflow Preferred: Familiarity with Java Preferred: Experience in one or more relevant financial areas (market data, order management …

Senior Data Engineer - Snowflake - £100,000 - London - Hybrid

London, South East, England, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ years data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: dbt, Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus. Up to 14% Pension Contribution. 29 Days Annual Leave + Bank Holidays. Free Company Shares. Interviews ongoing, don't miss your chance to …
Employment Type: Full-Time
Salary: £85,000 - £100,000 per annum

Senior Data Engineer - Snowflake - £100,000 - London - Hybrid

City of London, London, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ years data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: dbt, Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus. Up to 14% Pension Contribution. 29 Days Annual Leave + Bank Holidays. Free Company Shares. Interviews ongoing, don't miss your chance to …
Employment Type: Permanent
Salary: £85000 - £100000/annum + Top Benefits

Senior Software Engineer

Manchester, England, United Kingdom
Hybrid / WFH Options
QinetiQ
Are Some Things We’ve Worked On Recently That Might Give You a Better Sense Of What You’ll Be Doing Day To Day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking. Implementing AWS Service Control Policies to manage global access privileges. Validating and converting data into a common data format …
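The chunking idea mentioned above, splitting a large payload into bounded units so a flow scales by processing many small pieces, can be sketched in a few lines of plain Python. This is a generic illustration, not NiFi code; the function name and chunk size are hypothetical.

```python
def chunked(items, chunk_size):
    """Yield successive fixed-size chunks so downstream consumers
    receive bounded units of work instead of one huge payload."""
    for start in range(0, len(items), chunk_size):
        yield items[start:start + chunk_size]

records = list(range(10))
print(list(chunked(records, 4)))  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

In a NiFi flow the equivalent is splitting a large FlowFile into many smaller ones, so that parallel processors can balance the load.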

Senior Software Engineer / Dev Team Lead - Remote (UK based)

London, England, United Kingdom
Hybrid / WFH Options
The Perl Programming Language
of learning the rest as needed: • Distributed or large-scale systems • MySQL/SQL database design, query optimisation and admin • Web development in HTML, CSS, JavaScript, Vue/React • Apache web server software and related modules • Cloud platforms and concepts (AWS, Google Cloud, Azure) • Setup, testing and administration of CI/CD pipelines • Networking and firewalling • Natural language processing …

Lead Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … data processing workloads. Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning …
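The data-partitioning responsibility above can be made concrete with a small stdlib sketch of hash partitioning, the scheme Spark uses by default to distribute rows with the same key to the same executor. This is an illustration only; the function name and record shape are hypothetical, and real Spark code would use `repartition`/`partitionBy` rather than hand-rolled hashing.

```python
from collections import defaultdict

def partition_by_key(records, num_partitions):
    """Assign each (key, value) record to a partition by hashing its key,
    so all records sharing a key land in the same partition.
    Hypothetical sketch of Spark-style hash partitioning."""
    partitions = defaultdict(list)
    for key, value in records:
        partitions[hash(key) % num_partitions].append((key, value))
    return dict(partitions)

# Nine records with keys 0, 1, 2 split across two partitions.
records = [(i % 3, i) for i in range(9)]
parts = partition_by_key(records, num_partitions=2)
print({p: len(rows) for p, rows in parts.items()})  # {0: 6, 1: 3}
```

Co-locating keys this way is what makes per-key aggregations shuffle-free once the data is partitioned.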
Employment Type: Permanent

Senior Engineer - Data

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
Eden Scott
cutting-edge technologies. About the Role You’ll be part of an agile, cross-functional team building a powerful data platform and intelligent search engine. Working with technologies like Apache Lucene, Solr, and Elasticsearch, you'll contribute to the design and development of scalable systems, with opportunities to explore machine learning, AI-driven categorisation models, and vector search. What … You’ll Be Doing Design and build high-performance data pipelines and search capabilities. Develop solutions using Apache Lucene, Solr, or Elasticsearch. Implement scalable, test-driven code in Java and Python. Work collaboratively with Business Analysts, Data Engineers, and UI Developers. Contribute across the stack – from React/TypeScript front end to Java-based backend services. Leverage cloud infrastructure … powered product categorisation. Continuous improvements to how data is processed, stored, and presented. Your Profile Strong experience in Java development, with some exposure to Python. Hands-on knowledge of Apache Lucene, Solr, or Elasticsearch (or willingness to learn). Experience in large-scale data processing and building search functionality. Skilled with SQL and NoSQL databases. Comfortable working in Agile …
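The data structure at the heart of Lucene, Solr, and Elasticsearch is the inverted index: a map from each term to the documents that contain it. A minimal pure-Python sketch (whitespace tokenisation only, no stemming, scoring, or analysis chain, all of which real Lucene adds):

```python
from collections import defaultdict

def build_index(docs):
    """Map each lowercased term to the set of document ids containing it --
    a toy version of the inverted index behind Lucene-based search."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

docs = {1: "red widget", 2: "blue widget", 3: "red gadget"}
index = build_index(docs)
print(sorted(index["red"]))     # [1, 3]
print(sorted(index["widget"]))  # [1, 2]
```

Term lookup is then a set operation, which is why boolean queries (AND/OR over terms) are cheap once the index exists.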

Lead Data Engineer (Remote)

Bracknell, England, United Kingdom
Hybrid / WFH Options
Circana, LLC
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … data processing workloads. Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning …

Senior Software Engineer, 3d/Data Remote UK

London, England, United Kingdom
Hybrid / WFH Options
Autodesk
Linux systems and bash terminals Preferred Qualifications Hands-on experience with: Distributed computing frameworks, such as Ray Data and Spark. Databases and/or data warehousing technologies, such as Apache Hive. Data transformation via SQL and DBT. Orchestration platforms, such as Apache Airflow. Data catalogs and metadata management tools. Vector data stores. Familiarity with data lake architectures …

Data engineer | NL/UK/CZ

London, England, United Kingdom
Hybrid / WFH Options
ScanmarQED
Foundations: Data Warehousing: Knowledge of tools like Snowflake, DataBricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: Proficiency in Microsoft Azure or AWS. Programming and Scripting: Programming Languages: Strong skills in Python and SQL. Data Modeling and Query Optimization: Data Modeling …
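The scheduling model shared by the orchestrators named above (Airflow, Dagster) is running tasks in dependency order over a DAG. A minimal stdlib sketch using `graphlib` (Python 3.9+); the task names and callables are hypothetical, and real orchestrators add retries, scheduling, and distributed execution on top of this core idea:

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in topological (dependency) order, passing earlier
    results forward -- a toy model of DAG-based orchestration."""
    results = {}
    for name in TopologicalSorter(deps).static_order():
        results[name] = tasks[name](results)
    return results

tasks = {
    "extract": lambda r: [1, 2, 3],
    "transform": lambda r: [x * 10 for x in r["extract"]],
    "load": lambda r: sum(r["transform"]),
}
deps = {"transform": {"extract"}, "load": {"transform"}}
print(run_pipeline(tasks, deps)["load"])  # 60
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same invariant Airflow enforces when it validates a DAG.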

Data Engineer (f/m/x) (EN) - Hybrid

Dortmund, Nordrhein-Westfalen, Germany
Hybrid / WFH Options
NETCONOMY
Salary: 50.000 - 60.000 € per year Requirements: • 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark • Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL) • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables • Solid understanding of data warehousing principles …
Employment Type: Permanent
Salary: EUR 50,000 - 60,000 Annual

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Veeva Systems, Inc
models in close cooperation with our data science team. Experiment in your domain to improve precision, recall, or cost savings. Requirements: Expert skills in Java or Python. Experience with Apache Spark or PySpark. Experience writing software for the cloud (AWS or GCP). Speaking and writing in English enables you to take part in day-to-day conversations in the …
Employment Type: Permanent
Salary: GBP Annual

Data Engineer (f/m/x)

Austria
Hybrid / WFH Options
NETCONOMY GmbH
data engineering, and cloud technologies to continuously improve our tools and approaches. Profile Essential Skills: 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark. Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL). Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and …
Employment Type: Permanent
Salary: EUR Annual

Senior Software Engineer (Data Engineering)

London, England, United Kingdom
Hybrid / WFH Options
VIOOH
integrating Monitoring Tools (Datadog/Kibana/Grafana/Prometheus). Write software using either Java/Scala/Python. The following are nice to have, but not required: Apache Spark jobs and pipelines. Experience with any functional programming language. Writing and analysing SQL queries. Application: Our recruitment team will work hard to give you a positive experience …

Data Engineer (f/m/x)

Wien, Austria
Hybrid / WFH Options
NETCONOMY GmbH
data engineering, and cloud technologies to continuously improve our tools and approaches. Profile Essential Skills: 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark. Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL). Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and …
Employment Type: Permanent
Salary: EUR Annual

Data Engineer (f/m/x)

Graz, Steiermark, Austria
Hybrid / WFH Options
NETCONOMY GmbH
data engineering, and cloud technologies to continuously improve our tools and approaches. Profile Essential Skills: 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark. Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL). Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and …
Employment Type: Permanent
Salary: EUR Annual

Data Solutions Architect

London, England, United Kingdom
Hybrid / WFH Options
NTT DATA
on platforms such as AWS, Azure, GCP, and Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, NoSQL databases is a …

Data Solutions Architect - Engineering

London, England, United Kingdom
Hybrid / WFH Options
NTT DATA
on platforms such as AWS, Azure, GCP, and Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, NoSQL databases is a …

Data Consultant(s) - Data Engineer

Liverpool, Lancashire, United Kingdom
Hybrid / WFH Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with …
Employment Type: Permanent
Salary: GBP Annual

Lead Data Engineer Remote/Home Based, UK

London, England, United Kingdom
Hybrid / WFH Options
Aker Systems Limited
exploring new technologies and methodologies to solve complex data challenges. Proven experience leading data engineering projects or teams. Expertise in designing and building data pipelines using frameworks such as Apache Spark, Kafka, Glue, or similar. Solid understanding of data modelling concepts and experience working with both structured and semi-structured data. Strong knowledge of public cloud services, especially AWS …
Apache salary percentiles:
10th Percentile: £37,574
25th Percentile: £60,375
Median: £110,000
75th Percentile: £122,500
90th Percentile: £138,750