Apache Jobs in London

1 to 25 of 503 Apache Jobs in London

Senior Python Software Engineer AWS Java Data Finance London

London, United Kingdom
Hybrid / WFH Options
Joseph Harry Ltd
Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, United Kingdom
Sandtech
data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for deploying … and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP …
Employment Type: Permanent
Salary: GBP Annual
Posted:

AWS Data Engineer (Must hold current SC)

London, England, United Kingdom
Amber Labs Limited
/Scrum environment. Preferred Qualifications AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits Competitive salary and performance-based bonus structure. Join a …
Posted:

Lead Engineer - Full Stack Developer -Python/SQL/React

London, England, United Kingdom
JPMorgan Chase & Co
ability to effectively collaborate with stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills Experience with big-data technologies, such as Splunk, Trino, and Apache Iceberg. Data Science experience. AI/ML experience with building models. AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer). About Us J.P. Morgan is a global …
Posted:

Principal Data Engineer (AWS & Airflow)

London, United Kingdom
Hybrid / WFH Options
83zero Ltd
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
Employment Type: Permanent
Salary: £115000 - £125000/annum 10% Bonus
Posted:
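The orchestration work described in the listing above (scheduled ETL/ELT with monitoring and alerting) comes down to running tasks in dependency order with retries, which is the core idea behind Apache Airflow. A stdlib-only sketch of that idea, with hypothetical task names; this illustrates the concept, not Airflow's actual API:

```python
# Toy illustration of what an orchestrator like Apache Airflow provides:
# tasks declared as a dependency graph, executed in topological order,
# with per-task retries. Task names and payloads are hypothetical.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps, retries=2):
    """Run callables in dependency order; retry each up to `retries` times."""
    order = list(TopologicalSorter(deps).static_order())
    log = []
    for name in order:
        for attempt in range(retries + 1):
            try:
                tasks[name]()
                log.append((name, "ok", attempt))
                break
            except Exception:
                if attempt == retries:
                    log.append((name, "failed", attempt))
                    raise  # a real orchestrator would alert here
    return log

# Hypothetical ELT steps: extract -> transform -> load.
results = []
tasks = {
    "extract":   lambda: results.append("raw rows"),
    "transform": lambda: results.append("modelled rows"),
    "load":      lambda: results.append("warehouse table"),
}
deps = {"transform": {"extract"}, "load": {"transform"}}
log = run_pipeline(tasks, deps)
print([name for name, status, attempt in log])  # → ['extract', 'transform', 'load']
```

In a real Airflow DAG the same dependencies would be declared with operators and `>>` chaining, and scheduling/alerting would come from the DAG's `schedule` and `email_on_failure` settings rather than hand-rolled loops.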

Data Engineer

London, United Kingdom
Sandtech
Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying and … managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Cloud Data Engineer (AWS), Flutter Functions

London, England, United Kingdom
Hybrid / WFH Options
Flutter
fully documented and meet appropriate standards for security, resilience and operational support. Skills & Experience Required Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and VPC networking. Experience integrating data …
Posted:

Data Engineering Consultant

London, England, United Kingdom
Hybrid / WFH Options
Endava
solutions aligned with business objectives. Key Responsibilities Data Pipeline Development Architect, implement and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL … security measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage: Relational (PostgreSQL, SQL Server), NoSQL (MongoDB, Cassandra), Dimensional …
Posted:

Data Engineer

London, England, United Kingdom
twentyAI
days a week on site) Contract: 6-month sign-on Interview process: 2 stages twentyAI’s customer is building the next generation of their data platform, with Databricks and Apache Spark at the core. A PoC is already in place — now they’re looking for someone who’s done this before to lead the full-scale build and help … IaC, CI/CD, automation) Exposure to AWS-based environments Familiarity with financial/trading systems and working in regulated industries (e.g., banking, commodities) Tech Environment: Primary Platform: Databricks, Apache Spark Other Tech: DBT, Airflow, Python, PySpark Cloud: AWS (preferred), private cloud storage Data Sources: Financial/trading systems
Posted:

Senior Software Engineer (Java, Python, Spark) - SaaS Software (Trade Surveillance & Compliance)

City Of London, England, United Kingdom
Sterlings
critical. The platform also leverages machine learning to help them detect trading behaviour that may trigger regulatory inquiries. In terms of the technical stack, this includes Java, Python, Apache Spark (on Serverless EMR), AWS, DynamoDB, S3, SNS/SQS. Experience Required: Strong backend software engineering experience, ideally with distributed systems and large-scale data processing Experience in financial … markets, specifically across trade surveillance or compliance software Strong programming skills in Java (multithreading, concurrency, performance tuning) Deep experience with Apache Spark and Spark Streaming Proficiency with AWS services, ideally including tools such as Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on with Python, especially in data handling (pandas, scikit-learn …
Posted:

Principal Data Engineer

London, United Kingdom
Hybrid / WFH Options
83zero Limited
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
Employment Type: Permanent, Work From Home
Posted:

Principal Data Engineer

City of London, London, United Kingdom
Hybrid / WFH Options
83data
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
Posted:

Senior Software Engineer - Compliance

London, United Kingdom
Hybrid / WFH Options
Trading Technologies International
award-winning trading and surveillance platform, including TT Trade Surveillance, which leverages machine learning to detect trading behavior that may trigger regulatory inquiries. Our tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native services. As part of a high-impact engineering team, you'll help design and … problems at scale in a domain where precision, performance, and reliability are critical. What Will You be Involved With? Design and build scalable, distributed systems using Java, Python, and Apache Spark Develop and optimize Spark jobs on AWS Serverless EMR for processing large-scale time-series datasets Build event-driven and batch processing workflows using Lambda, SNS/SQS … to the Table? Strong backend software engineering experience, ideally with distributed systems and large-scale data processing Strong programming skills in Java (multithreading, concurrency, performance tuning) Deep experience with Apache Spark and Spark Streaming Proficiency with AWS services, including Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on with Python, especially in …
Employment Type: Permanent
Salary: GBP Annual
Posted:
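The Spark and Spark Streaming work these surveillance roles describe is, at its core, windowed aggregation over large time-series event streams. Below is a stdlib-only sketch of a tumbling-window count, roughly what a Spark job expresses as a groupBy over a time window; the trade events are invented:

```python
# Tumbling-window aggregation over time-series events, the shape of
# computation a surveillance Spark job performs at scale. The event
# data is made up; this mirrors the concept, not Spark's API.
from collections import defaultdict

def tumbling_window_counts(events, window_sec):
    """Count events per (symbol, window-start) bucket.

    events: iterable of (timestamp_seconds, symbol) pairs.
    """
    buckets = defaultdict(int)
    for ts, symbol in events:
        window_start = ts - (ts % window_sec)  # align to window boundary
        buckets[(symbol, window_start)] += 1
    return dict(buckets)

# Invented trade events: (seconds since epoch, symbol).
trades = [(0, "AAPL"), (12, "AAPL"), (61, "AAPL"), (62, "MSFT")]
counts = tumbling_window_counts(trades, window_sec=60)
print(counts)  # → {('AAPL', 0): 2, ('AAPL', 60): 1, ('MSFT', 60): 1}
```

A surveillance rule would then flag buckets whose counts breach a threshold; Spark distributes exactly this grouping across executors so it scales to billions of events.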

Senior Data Engineer

London, England, United Kingdom
Hybrid / WFH Options
Artefact
leading data projects in a fast-paced environment. Key Responsibilities Design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow and PySpark. Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms. Implement continuous integration and continuous deployment (CI/CD …
Posted:

Senior Data Engineer

London Area, United Kingdom
Hybrid / WFH Options
OTA Recruitment
modern data modelling practices, analytics tooling, and interactive dashboard development in Power BI and Plotly/Dash. Key responsibilities: Designing and maintaining robust data transformation pipelines (ELT) using SQL, Apache Airflow, or similar tools. Building and optimizing data models that power dashboards and analytical tools Developing clear, insightful, and interactive dashboards and reports using Power BI and Plotly/ …
Posted:

Data Engineer

London, England, United Kingdom
Aviva
writing clean, maintainable code in SQL or Python. Experience with Scala, Java, or similar languages is a plus. Hands-on experience with data pipeline orchestration tools such as Apache Airflow and Azure DevOps. Strong knowledge of cloud-based data engineering, particularly in AWS environments. Experience in data quality assessment (profiling, anomaly detection) and data documentation (schemas, dictionaries …
Posted:

Sr. Software Engineer, Infrastructure

London, England, United Kingdom
Hybrid / WFH Options
Circadia Technologies Ltd
frameworks such as Boost.Test, Google Test, etc. Nice to Haves: Experience with Azure services for managing GPT pipelines and multi-cloud infrastructure. Familiarity with big data technologies such as Apache Spark, Kafka, and MSK for large-scale data processing. Experience with boost libraries (asio, beast). Advanced experience in cost optimization strategies for cloud infrastructure and database performance tuning.
Posted:

Senior Software Engineer in Test (SDET)

London, England, United Kingdom
Hybrid / WFH Options
Hargreaves Lansdown
or a related field, or equivalent experience. Experience: Advanced experience in test automation development using tools like Selenium, JUnit, TestNG, Cypress, etc. Familiarity with performance testing tools such as Apache Bench, JMeter, or LoadRunner, or modern alternatives like K6, Gatling, Locust. Familiarity with BDD tools like Cucumber or SpecFlow. Skills: Proficiency in programming languages such as Java, Python, or …
Posted:

Senior Software Engineer

City Of London, England, United Kingdom
Hybrid / WFH Options
Paul Murphy Associates
support market surveillance and compliance efforts. The platform leverages advanced analytics and machine learning to identify trading behaviors that could trigger regulatory attention. The tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native tools. You’ll work alongside a high-impact engineering team to build fault-tolerant … data pipelines and services that process massive time-series datasets in both real-time and batch modes. Key Responsibilities: Design and build scalable, distributed systems using Java, Python, and Apache Spark Develop and optimize Spark jobs on AWS Serverless EMR for large-scale time-series processing Build event-driven and batch workflows using AWS Lambda, SNS/SQS, and … and non-technical stakeholders Qualifications: Strong backend software development experience, especially in distributed systems and large-scale data processing Advanced Java programming skills (multithreading, concurrency, performance tuning) Expertise in Apache Spark and Spark Streaming Proficiency with AWS services such as Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on Python experience, particularly …
Posted:

Senior Data Engineer (Remote)

London, England, United Kingdom
Hybrid / WFH Options
Circana
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and …
Posted:
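"Data partitioning" in the listing above means routing rows by a key hash so that related rows land in the same partition, which is the idea behind Spark's partitionBy. A stdlib-only toy version of that routing; the rows and key name are invented, and real Spark does this at the file and executor level:

```python
# Hash-partitioning rows by key, the concept behind Spark's partitionBy.
# crc32 is used because it is deterministic across runs, unlike Python's
# built-in hash() for strings. Row contents are hypothetical.
import zlib

def partition_rows(rows, key, num_partitions):
    """Route each row (a dict) to a partition by a stable hash of row[key]."""
    partitions = [[] for _ in range(num_partitions)]
    for row in rows:
        idx = zlib.crc32(str(row[key]).encode()) % num_partitions
        partitions[idx].append(row)
    return partitions

# All rows sharing a key value land in the same partition, so a later
# per-key aggregation never needs to shuffle data between partitions.
rows = [{"country": "GB", "v": 1}, {"country": "FR", "v": 2}, {"country": "GB", "v": 3}]
parts = partition_rows(rows, "country", num_partitions=4)
```

Caching and performance tuning in Spark then build on this layout: co-locating a key's rows lets repeated queries over that key reuse a cached partition instead of rescanning the full dataset.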

Apache salary percentiles in London

10th Percentile: £80,000
25th Percentile: £105,000
Median: £110,000
75th Percentile: £135,000
90th Percentile: £138,750