Permanent Apache Job Vacancies

1 to 25 of 468 Permanent Apache Jobs

Senior Python Software Engineer AWS Java Data Finance London

London, United Kingdom
Hybrid / WFH Options
Joseph Harry Ltd
Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Data Engineer

London, United Kingdom
Sandtech
data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for deploying … and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP …
Employment Type: Permanent
Salary: GBP Annual
Posted:
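The Sandtech listing above pairs Apache Spark with Kafka (or similar) for real-time processing. As a rough, hedged illustration only - the broker address, topic name, and event schema below are invented for the example, not taken from the vacancy - a PySpark Structured Streaming job of that shape might look like this:

```python
# Illustrative sketch: consume JSON events from Kafka and aggregate them with Spark.
# Broker, topic, and schema are assumptions; running this also requires the
# spark-sql-kafka connector package on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-orders-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream; the value column arrives as bytes.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
       .option("subscribe", "orders")                      # assumed topic
       .load())

orders = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("o"))
          .select("o.*"))

# Windowed aggregation: total order value per 5-minute window, with a watermark
# so late-arriving events are bounded.
totals = (orders
          .withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "5 minutes"))
          .agg(F.sum("amount").alias("total_amount")))

query = (totals.writeStream
         .outputMode("update")
         .format("console")  # in a lakehouse setup this sink would be a table
         .start())
query.awaitTermination()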

Lead Engineer - Full Stack Developer -Python/SQL/React

Westminster Abbey, England, United Kingdom
J.P. MORGAN-1
ability to effectively collaborate with stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills Experience with big-data technologies, such as Splunk, Trino, and Apache Iceberg. Data Science experience. AI/ML experience with building models. AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer). About Us J.P. Morgan is a global …
Posted:

Senior Data Engineer

Glasgow, UK
Hybrid / WFH Options
Hypercube Consulting
and other DevOps practices such as IaC Testing Nice to have - Additional experience with the following would be beneficial but not essential: Data modelling approaches (Kimball, Inmon) Orchestration tools - Apache Airflow, Prefect or cloud-native tools Backend software development (Java, APIs, Scalability, Logging and Monitoring etc.) MLFlow and other MLOps/Machine Learning Engineering processes to support advanced analytical …
Employment Type: Full-time
Posted:

Principal Data Engineer (AWS & Airflow)

London, United Kingdom
Hybrid / WFH Options
83zero Ltd
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
Employment Type: Permanent
Salary: £115000 - £125000/annum 10% Bonus
Posted:
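For context on the Airflow orchestration, scheduling, monitoring, and alerting responsibilities described in the 83zero listing above, here is a minimal sketch of an Airflow DAG. The DAG id, task names, and alert callback are assumptions for illustration, not part of the vacancy.

```python
# Minimal Airflow 2.x DAG sketch: daily schedule, retries, and a failure callback
# standing in for real alerting (Slack, PagerDuty, email, etc.).
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_failure(context):
    # Placeholder alerting hook; swap for a real notification integration.
    print(f"Task {context['task_instance'].task_id} failed")


def extract():
    print("pull data from source systems")


def load_warehouse():
    print("load into the warehouse (Snowflake/BigQuery/Redshift in this stack)")


default_args = {
    "owner": "data-eng",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,
}

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+ style; older versions use schedule_interval
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)

    extract_task >> load_task
```

The scheduling, retry, and callback settings are what the listing's "scheduling, monitoring, and alerting" typically boil down to in DAG code.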

Data Engineer

London, United Kingdom
Sandtech
Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS, Azure, or GCP for deploying and … managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
Hypercube Consulting
etc.) Relational, NoSQL, graph and vector databases Streaming technologies (Kafka, Kinesis, Flink, etc.) Containers and related services (Docker, Kubernetes, container registries, etc.) Data modelling approaches (Kimball, Inmon) Orchestration tools - Apache Airflow, Prefect or cloud-native tools Backend software development (Java, APIs, Scalability, Logging and Monitoring etc.) MLFlow and other MLOps/Machine Learning Engineering processes to support advanced analytical …
Posted:

Principal Data Engineer

London, United Kingdom
Hybrid / WFH Options
83zero Limited
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
Employment Type: Permanent, Work From Home
Posted:

Principal Data Engineer

City of London, London, United Kingdom
Hybrid / WFH Options
83data
warehouse solutions for BI and analytics. Define and drive the long-term architecture and data strategy in alignment with business goals. Own orchestration of ETL/ELT workflows using Apache Airflow, including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in … Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms …
Posted:

Senior Software Engineer - Compliance

London, United Kingdom
Hybrid / WFH Options
Trading Technologies International
award-winning trading and surveillance platform, including TT Trade Surveillance, which leverages machine learning to detect trading behavior that may trigger regulatory inquiries. Our tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native services. As part of a high-impact engineering team, you'll help design and … problems at scale in a domain where precision, performance, and reliability are critical. What Will You be Involved With? Design and build scalable, distributed systems using Java, Python, and Apache Spark Develop and optimize Spark jobs on AWS Serverless EMR for processing large-scale time-series datasets Build event-driven and batch processing workflows using Lambda, SNS/SQS … to the Table? Strong backend software engineering experience, ideally with distributed systems and large-scale data processing Strong programming skills in Java (multithreading, concurrency, performance tuning) Deep experience with Apache Spark and Spark Streaming Proficiency with AWS services, including Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on with Python, especially in …
Employment Type: Permanent
Salary: GBP Annual
Posted:
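The Trading Technologies listing above mentions event-driven workflows built on Lambda and SNS/SQS. Below is a minimal Python sketch of an SQS-triggered Lambda handler; the payload fields and processing step are assumptions, not taken from the role.

```python
# Sketch of an event-driven entry point: an AWS Lambda handler invoked with a
# batch of SQS records, each carrying a JSON trade event. The payload shape
# (trade_id, timestamp) is an illustrative assumption.
import json


def process_trade_event(event: dict) -> None:
    # Placeholder for real surveillance logic (e.g. enrich, score, persist to DynamoDB/S3).
    print(f"processing trade {event.get('trade_id')} at {event.get('timestamp')}")


def handler(event, context):
    # With an SQS trigger, AWS invokes the function with event["Records"],
    # where each record's body is the raw message string.
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])
        process_trade_event(payload)
    return {"processed": len(records)}
```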

Senior Software Engineer - Java Microservices

Manchester, Lancashire, United Kingdom
Roku, Inc
Deep understanding of software architecture, object-oriented design principles, and data structures Extensive experience in developing microservices using Java, Python Experience in distributed computing frameworks like Hive/Hadoop and Apache Spark. Good experience in Test driven development and automating test cases using Java/Python Experience in SQL/NoSQL (Oracle, Cassandra) database design Demonstrated ability to be proactive … HR related applications Experience with following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda Working experience with Terraform Experience in creating workflows for Apache Airflow About Roku Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and monetize large audiences, and provide …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Staff Software Engineer (Data Platform) - 3-6 months Contract

London, United Kingdom
Gorilla
Ability to manage complex systems and troubleshoot production issues effectively. Experience working in an agile, cross-functional team environment. Nice to Have: Experience with big data tools such as Apache Spark, Kafka, or other data processing frameworks or platforms like Databricks, Snowflake. Knowledge of data governance, data security practices, and best practices for managing large data sets that use …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer

London, United Kingdom
Visa Inc
qualifications: Bachelor's Degree in an analytical field such as computer science, statistics, finance, economics or relevant area. Entry level experience of the Hadoop ecosystem and associated technologies (e.g. Apache Spark, MLlib, GraphX, iPython, sci-kit, Pandas etc.) Working knowledge in writing and optimizing efficient SQL queries with Python, Hive, Scala handling Large Data Sets in Big-Data Environments.
Employment Type: Permanent
Salary: GBP Annual
Posted:

Senior Java Software Engineer

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
Eden Scott
of production environments Your Background You have strong experience in Java development and exposure to Python. Have experience with large-scale data processing and search technologies. An expert in Apache Lucene, Solr, Elasticsearch - if not, you have the appetite to learn more. Hands on experience with SQL and NoSQL databases under your belt. Hold a degree in Computer Science …
Posted:

Senior Data Engineer

London Area, United Kingdom
Hybrid / WFH Options
OTA Recruitment
modern data modelling practices, analytics tooling, and interactive dashboard development in Power BI and Plotly/Dash. Key responsibilities: Designing and maintaining robust data transformation pipelines (ELT) using SQL, Apache Airflow, or similar tools. Building and optimizing data models that power dashboards and analytical tools Developing clear, insightful, and interactive dashboards and reports using Power BI and Plotly/…
Posted:
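The OTA Recruitment listing above centres on interactive dashboards built with Power BI and Plotly/Dash. As a small illustrative sketch only - the dataset and chart are invented, not taken from the role - a minimal Dash app looks roughly like this:

```python
# Minimal Plotly Dash app: one bar chart served as an interactive web dashboard.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Illustrative data standing in for warehouse query results.
df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"],
    "orders": [120, 135, 128, 150],
})

app = Dash(__name__)
app.layout = html.Div([
    html.H2("Orders by month"),
    dcc.Graph(figure=px.bar(df, x="month", y="orders")),
])

if __name__ == "__main__":
    app.run(debug=True)  # Dash 2.x; older releases use app.run_server
```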

Senior Data Engineer

City of London, London, United Kingdom
Hybrid / WFH Options
OTA Recruitment
modern data modelling practices, analytics tooling, and interactive dashboard development in Power BI and Plotly/Dash. Key responsibilities: Designing and maintaining robust data transformation pipelines (ELT) using SQL, Apache Airflow, or similar tools. Building and optimizing data models that power dashboards and analytical tools Developing clear, insightful, and interactive dashboards and reports using Power BI and Plotly/…
Posted:

Python & UI Lead Software Engineer

Glasgow, Scotland, United Kingdom
J.P. MORGAN-1
Complex SQL Queries and ensuring optimal data storage and retrieval. Expertise in working with agile projects to automated testing/dev ops environments. Knowledge of big data technologies such as Apache Spark or PySpark. Hands-on experience with containerization technologies like Docker and Kubernetes (EKS). Ability to guide and coach teams on approach to achieve goals aligned against a …
Posted:

Sr Cloud Data Architect / Data Engineer

United Kingdom
iXceed Solutions
for large datasets. Expertise in BigQuery, including advanced SQL, partitioning, clustering, and performance tuning. Hands-on experience with at least one of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation … Git). 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or … Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL and experience with schema design and query optimization for large datasets. Expertise …
Posted:
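The iXceed listing above names Dataflow (Apache Beam) among the GCP processing services. A minimal, hedged Apache Beam pipeline sketch in Python is shown below; the bucket paths and parsing logic are placeholders, not details from the role.

```python
# Illustrative Beam pipeline: read CSV lines, sum a numeric column, write the total.
# Run locally with the DirectRunner, or on GCP by passing
# --runner=DataflowRunner plus project/region/staging options.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadLines" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")  # assumed path
        | "ParseAmount" >> beam.Map(lambda line: float(line.split(",")[1]))       # assumed column
        | "SumAmounts" >> beam.CombineGlobally(sum)
        | "Format" >> beam.Map(str)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/total")      # assumed path
    )
```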

Senior Data Engineer (Remote)

South East, United Kingdom
Hybrid / WFH Options
Circana
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and …
Employment Type: Permanent
Posted:
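The Circana listing above highlights data partitioning, caching, and performance tuning for Spark-based workloads. The PySpark snippet below is an illustrative sketch only; the storage paths and column names are assumptions.

```python
# Illustrative PySpark tuning patterns: cache a frame that feeds multiple
# aggregates, and write output partitioned so later reads can prune partitions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-example").getOrCreate()

# Assumed ADLS path and schema (customer_id, event_time, ...).
events = spark.read.parquet("abfss://data@account.dfs.core.windows.net/events/")

# Cache because the frame is reused by more than one downstream aggregate.
events = events.cache()

daily = (events
         .groupBy("customer_id", F.to_date(F.col("event_time")).alias("day"))
         .agg(F.count("*").alias("events_per_day")))

totals = events.groupBy("customer_id").agg(F.count("*").alias("events_total"))

# Repartition on the partition column before writing, then write partitioned
# by day so downstream queries only scan the days they need.
(daily.repartition("day")
      .write.mode("overwrite")
      .partitionBy("day")
      .parquet("abfss://data@account.dfs.core.windows.net/curated/daily_events/"))

totals.write.mode("overwrite").parquet(
    "abfss://data@account.dfs.core.windows.net/curated/customer_totals/"
)
```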

Senior Software Engineer

City Of London, England, United Kingdom
Hybrid / WFH Options
Paul Murphy Associates
support market surveillance and compliance efforts. The platform leverages advanced analytics and machine learning to identify trading behaviors that could trigger regulatory attention. The tech stack includes Java, Python, Apache Spark (on Serverless EMR), AWS Lambda, DynamoDB, S3, SNS/SQS, and other cloud-native tools. You’ll work alongside a high-impact engineering team to build fault-tolerant … data pipelines and services that process massive time-series datasets in both real-time and batch modes. Key Responsibilities: Design and build scalable, distributed systems using Java, Python, and Apache Spark Develop and optimize Spark jobs on AWS Serverless EMR for large-scale time-series processing Build event-driven and batch workflows using AWS Lambda, SNS/SQS, and … and non-technical stakeholders Qualifications: Strong backend software development experience, especially in distributed systems and large-scale data processing Advanced Java programming skills (multithreading, concurrency, performance tuning) Expertise in Apache Spark and Spark Streaming Proficiency with AWS services such as Lambda, DynamoDB, S3, SNS, SQS, and Serverless EMR Experience with SQL and NoSQL databases Hands-on Python experience, particularly …
Posted:

Senior Software Engineer (Java, Spark) - SaaS Software (Trade Surveillance & Compliance)

City Of London, England, United Kingdom
Sterlings
are critical. The platform also leverages machine learning to help them detect trading behaviour that may trigger regulatory inquiries. In terms of the technical stack, this includes Java, Apache Spark (on Serverless EMR), AWS, DynamoDB, S3, SNS/SQS. Experience Required: Strong backend software engineering experience, ideally with distributed systems and large-scale data processing Experience in financial … markets, specifically across trade surveillance or compliance software Strong programming skills in Java (multithreading, concurrency, performance tuning) Deep experience with Apache Spark and Spark Streaming Proficiency with cloud, ideally AWS services Experience with SQL and NoSQL databases Any Python experience beneficial, especially in data handling (pandas, scikit-learn, etc.) Familiarity with RESTful web services and event-driven architectures …
Posted:

Senior Java Software Engineer

Glasgow, UK
Amici Procurement Solutions
modern technology stacks to build and optimize a powerful data platform and search engine. With an opportunity to explore vector search, machine learning, and large-scale data processing using Apache Lucene, Solr, or Elasticsearch. What you’ll be doing: Design, build, and optimize a high-performance data platform and search solution. Develop robust search capabilities using Apache Lucene … and search technologies. Role Profile You have strong experience in Java development and exposure to Python. Have experience with large-scale data processing and search technologies. An expert in Apache Lucene, Solr, Elasticsearch - if not, you have the appetite to learn more. Hands on experience with SQL and NoSQL databases under your belt. Hold a degree in Computer Science …
Employment Type: Full-time
Posted:

Senior Data Engineer - Snowflake - £100,000 - London - Hybrid

London, South East, England, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DevOps experience building and deploying using Terraform Nice to Have: DBT Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing - don't miss your chance to …
Employment Type: Full-Time
Salary: £85,000 - £100,000 per annum
Posted:

Senior Data Engineer - Snowflake - £100,000 - London - Hybrid

City of London, London, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
Requirements: 3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DevOps experience building and deploying using Terraform Nice to Have: DBT Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing - don't miss your chance to …
Employment Type: Permanent
Salary: £85000 - £100000/annum + Top Benefits
Posted:

Data Engineer

City Of Bristol, England, United Kingdom
Peaple Talent
patterns in pipeline architecture and design. Confident using Git-based version control systems, including Azure DevOps or similar. Skilled in managing and scheduling data workflows using orchestration platforms like Apache Airflow. Involved in building and optimizing data warehouses on modern analytics platforms like Snowflake, Redshift, or Databricks. Familiar with visual or low-code data integration tools, including platforms such …
Posted:
Apache salary percentiles (permanent vacancies)
10th Percentile: £37,574
25th Percentile: £60,375
Median: £110,000
75th Percentile: £122,500
90th Percentile: £138,750