Apache Job Vacancies

76 to 100 of 927 Apache Jobs

Senior QA Engineer - ETL Data Migration

London, South East, England, United Kingdom
Interquest
… products or platforms
• Strong knowledge of SQL and experience with large-scale relational and/or NoSQL databases
• Experience testing data pipelines (ETL/ELT), preferably with tools like Apache Airflow, dbt, Spark, or similar
• Proficiency in Python or similar scripting language for test automation (a brief illustrative test sketch follows this listing)
• Experience with cloud platforms (AWS, GCP, or Azure), especially in data-related services
• Familiarity …
Employment Type: Contractor
Rate: £450 - £475 per day
Posted:
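As the listing above asks for Python-based test automation against ETL/ELT pipelines, here is a minimal sketch of that kind of check. It is not taken from the employer's stack: the pytest style, SQLAlchemy connection string, schema and table names are all illustrative assumptions.

```python
# Hypothetical ETL reconciliation checks of the kind a QA data engineer might automate.
# The connection URL, schemas, and table names are illustrative assumptions only.
import sqlalchemy as sa

ENGINE = sa.create_engine("postgresql://qa_user:qa_pass@localhost:5432/warehouse")

def scalar(query: str):
    """Run a query and return its single scalar result."""
    with ENGINE.connect() as conn:
        return conn.execute(sa.text(query)).scalar()

def test_row_counts_match_between_source_and_target():
    # The migrated table should contain every source row.
    source = scalar("SELECT COUNT(*) FROM staging.orders")
    target = scalar("SELECT COUNT(*) FROM analytics.orders")
    assert source == target

def test_no_null_business_keys_in_target():
    # Business keys must survive the transformation intact.
    nulls = scalar("SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL")
    assert nulls == 0
```

Run with pytest against a warehouse you control; a real suite would add schema, freshness, and referential-integrity checks on top of these two assertions.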

Lead Full Stack Software Engineer

Gloucester, Gloucestershire, South West, United Kingdom
Anson Mccade
… Python
• Strong experience developing on Linux
• Version control using Git
• Agile development (SCRUM)
• Working with both relational databases (Oracle) and NoSQL (MongoDB)
• Experience with GitLab CI/CD Pipelines, Apache NiFi, and Atlassian tools (Jira, Bitbucket, Confluence)
• Front-end skills: JavaScript/TypeScript, React
• Search and analytics tools: Elasticsearch, Kibana
Nice to Have:
• Experience developing for AWS Cloud (EC2 …
Employment Type: Permanent
Posted:

Lead Full Stack Software Engineer

Cheltenham, South West England, United Kingdom
Anson Mccade
… Python
• Strong experience developing on Linux
• Version control using Git
• Agile development (SCRUM)
• Working with both relational databases (Oracle) and NoSQL (MongoDB)
• Experience with GitLab CI/CD Pipelines, Apache NiFi, and Atlassian tools (Jira, Bitbucket, Confluence)
• Front-end skills: JavaScript/TypeScript, React
• Search and analytics tools: Elasticsearch, Kibana
Nice to Have:
• Experience developing for AWS Cloud (EC2 …
Posted:

Data engineer | NL/UK/CZ

London, England, United Kingdom
Hybrid / WFH Options
ScanmarQED
… Foundations:
• Data Warehousing: Knowledge of tools like Snowflake, DataBricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server.
• ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster.
• Cloud providers: Proficiency in Microsoft Azure or AWS.
Programming and Scripting:
• Programming Languages: Strong skills in Python and SQL.
• Data Modeling and Query Optimization: Data Modeling …
Posted:

Data Engineer (f/m/x) (EN) - Hybrid

Dortmund, Nordrhein-Westfalen, Germany
Hybrid / WFH Options
NETCONOMY
Salary: €50,000 - €60,000 per year
Requirements:
• 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark
• Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL) - see the sketch after this listing
• Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables
• Solid understanding of data warehousing principles …
Employment Type: Permanent
Salary: EUR 50,000 - 60,000 Annual
Posted:
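Several listings on this page, including the one above, ask for data manipulation with PySpark and Spark SQL on Databricks. The sketch below shows a generic aggregation of that kind; the Delta paths, column names, and job shape are hypothetical and not drawn from any posting.

```python
# Minimal PySpark aggregation sketch; paths and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_revenue").getOrCreate()

# Read raw events from a (hypothetical) Delta table.
orders = spark.read.format("delta").load("/mnt/raw/orders")

# Aggregate revenue per day and customer - the kind of transformation
# these Databricks roles describe running inside scheduled workflows.
daily_revenue = (
    orders
    .filter(F.col("status") == "completed")
    .groupBy(F.to_date("created_at").alias("order_date"), "customer_id")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.format("delta").mode("overwrite").save("/mnt/curated/daily_revenue")
```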

Scientific Data Engineer

Boston, England, United Kingdom
Arrayo
• Familiarity with scientific data standards, ontologies, and best practices for metadata capture.
• Understanding of data science workflows in computational chemistry, bioinformatics, or AI/ML-driven research.
• Orchestration & ETL: Apache Airflow, Prefect
• Scientific Libraries (Preferred): RDKit, Open Babel, CDK
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Engineering, Research, and …
Posted:

Data Engineer

London, United Kingdom
Hybrid / WFH Options
Veeva Systems, Inc
… models in close cooperation with our data science team
• Experiment in your domain to improve precision, recall, or cost savings
Requirements:
• Expert skills in Java or Python
• Experience with Apache Spark or PySpark
• Experience writing software for the cloud (AWS or GCP)
• Speaking and writing in English enables you to take part in day-to-day conversations in the …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Data Engineer (f/m/x)

Austria
Hybrid / WFH Options
NETCONOMY GmbH
… data engineering, and cloud technologies to continuously improve our tools and approaches
Profile - Essential Skills:
• 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark
• Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL)
• Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and …
Employment Type: Permanent
Salary: EUR Annual
Posted:

Senior Software Engineer (Data Engineering)

London, England, United Kingdom
Hybrid / WFH Options
VIOOH
… integrating Monitoring Tools (Datadog/Kibana/Grafana/Prometheus).
• Write software using either Java/Scala/Python.
The following are nice to have, but not required:
• Apache Spark jobs and pipelines.
• Experience with any functional programming language.
• Writing and analysing SQL queries.
Application (VIOOH): Our recruitment team will work hard to give you a positive experience …
Posted:

Data Engineer (f/m/x)

Wien, Austria
Hybrid / WFH Options
NETCONOMY GmbH
… data engineering, and cloud technologies to continuously improve our tools and approaches
Profile - Essential Skills:
• 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark
• Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL)
• Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and …
Employment Type: Permanent
Salary: EUR Annual
Posted:

Data Engineer (f/m/x)

Graz, Steiermark, Austria
Hybrid / WFH Options
NETCONOMY GmbH
… data engineering, and cloud technologies to continuously improve our tools and approaches
Profile - Essential Skills:
• 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark
• Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL)
• Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and …
Employment Type: Permanent
Salary: EUR Annual
Posted:

Data Engineer

London, United Kingdom
LGT Vestra LLP
… data into a data platform using Fivetran.
• Experience of developing BI dashboards using Power BI.
• Knowledge of security concepts relevant to Azure.
• Experience of workflow management tools such as Apache Airflow (see the DAG sketch after this listing).
Interested in the role? Complete the online application. We look forward to getting to know you. Discover more about LGT Wealth Management. A message from our CEO Ben …
Employment Type: Permanent
Salary: GBP Annual
Posted:
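Workflow management with Apache Airflow appears in the listing above and in several others on this page. As a rough illustration only, here is a minimal two-task DAG in current Airflow 2.x style; the dag_id, schedule, and task bodies are assumptions, not anything specific to these employers.

```python
# Minimal Apache Airflow DAG sketch; dag_id, schedule, and task logic are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder for pulling data from a source system (e.g., via an API or Fivetran sync).
    print("extracting")

def load():
    # Placeholder for loading the extracted data into the warehouse.
    print("loading")

with DAG(
    dag_id="example_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load each day
```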

Risk - Software Engineer - Vice President - Birmingham

Birmingham, Staffordshire, United Kingdom
WeAreTechWomen
… with multiple languages
• Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant)
• Experience in working with process scheduling platforms like Apache Airflow.
• Should be ready to work in GS proprietary technology like Slang/SECDB
• An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Sr. Data Engineer

Edinburgh, Scotland, United Kingdom
Addepar
… Are:
• A degree in computer science, engineering, mathematics or a related technical field
• Experience with object-oriented programming preferred
• General familiarity with some of the technologies we use: Python, Apache Spark/PySpark, Java/Spring, Amazon Web Services, SQL, relational databases
• Understanding of data structures and algorithms
• Interest in data modeling, visualisation, and ETL pipelines
• Knowledge of financial …
Posted:

Risk Division - Software Engineer - Vice President - London

Birmingham, England, United Kingdom
Goldman Sachs Bank AG
… multiple programming languages
• Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant)
• Experience in working with process scheduling platforms like Apache Airflow.
• Open to working in GS proprietary technology like Slang/SECDB
• An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file …
Posted:

Data Engineer

Sheffield, England, United Kingdom
Tes
… translate concepts into easily understood diagrams and visuals for both technical and non-technical people alike.
• AWS cloud products (Lambda functions, Redshift, S3, AmazonMQ, Kinesis, EMR, RDS (Postgres)).
• Apache Airflow for orchestration.
• DBT for data transformations.
• Machine Learning for product insights and recommendations.
• Experience with microservices using technologies like Docker for local development.
• Apply engineering best practices to …
Posted:

Equity Research Data Strategist

London, England, United Kingdom
Jefferies
… level of proficiency in Python and SQL
• A deep understanding of generative AI and experience building applications with large foundation models
• Proficiency in Databricks, BitBucket/Jira/Confluence, Apache Spark (PySpark) and AWS is advantageous
• Degree/Master's educated in Data Science or a related subject, e.g. Statistics, Computer Science, etc.
• A working knowledge of fundamental equity …
Posted:

Data Engineer

Greater London, England, United Kingdom
Kharon
… data ecosystem (e.g., Pandas, NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines (a brief pandas sketch follows this listing).
• Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization
• Familiarity with graph databases (e.g., Neo4j, Memgraph) or search platforms (e.g., Elasticsearch, OpenSearch) to support complex data …
Posted:
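The listing above emphasises the Python data ecosystem (Pandas, NumPy) for extraction, transformation, and analysis pipelines. Below is a small, hypothetical transform step in that spirit; the input file, column names, and threshold are invented for illustration and do not describe the employer's data.

```python
# Minimal pandas transformation sketch; the file paths and column names are hypothetical.
import numpy as np
import pandas as pd

# Extract: read a raw export (a stand-in for a SQL extraction step).
raw = pd.read_csv("entities_raw.csv")

# Transform: drop rows without an identifier, normalise names,
# and flag unusually large transaction amounts.
clean = (
    raw
    .dropna(subset=["entity_id"])
    .assign(
        entity_name=lambda df: df["entity_name"].str.strip().str.upper(),
        high_value=lambda df: np.where(df["amount"] > 1_000_000, True, False),
    )
)

# Load: write the curated frame for downstream analysis.
clean.to_parquet("entities_clean.parquet", index=False)
```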

Risk - Software Engineer - Vice President - Birmingham

Birmingham, England, United Kingdom
Goldman Sachs, Inc
… with multiple languages
• Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant)
• Experience in working with process scheduling platforms like Apache Airflow.
• Should be ready to work in GS proprietary technology like Slang/SECDB
• An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory …
Posted:

Senior Software Engineer - Clean Room

London, United Kingdom
LiveRamp
… Proficiency in one or more programming languages including Java, Python, Scala or Golang.
• Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential.
• Experience with cloud platforms like AWS, Azure, or Google Cloud.
• Strong proficiency in designing, developing, and deploying microservices architecture, with a deep understanding of inter-service …
Employment Type: Permanent
Salary: GBP Annual
Posted:

Cloud Data Platform Engineer

England, United Kingdom
BMC Software, Inc
… serverless services and enables powerful querying and analytics through Amazon Athena (see the query sketch after this listing). In this role, you'll work on a system that combines streaming ingestion (Firehose), data lake technologies (Parquet, Apache Iceberg), scalable storage (S3), event-driven processing (Lambda, EventBridge), fast-access databases (DynamoDB), and robust APIs (Spring Boot microservices on EC2). Your role will involve designing, implementing, and …
Employment Type: Permanent
Salary: GBP Annual
Posted:
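The listing above outlines an AWS data-lake architecture with Athena on top. As a narrow illustration of the querying piece only, here is a hedged boto3 sketch; the region, database, table, and S3 results bucket are assumptions rather than details from the posting.

```python
# Hypothetical Athena query against a Parquet/Iceberg data lake; all names and buckets are invented.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")

# Kick off the query; results land in the (hypothetical) S3 staging location.
execution = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) AS events FROM events_iceberg GROUP BY event_type",
    QueryExecutionContext={"Database": "analytics_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until Athena finishes (simplified; production code would back off and handle errors).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```

In the architecture described, a sketch like this would sit behind the Spring Boot APIs or a Lambda rather than be run ad hoc.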

Data Product Engineer

Luton, England, United Kingdom
easyJet
… field.
Technical Skills Required:
• Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD).
• Experience with Apache Spark or any other distributed data programming frameworks.
• Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake.
• Experience with cloud infrastructure like AWS or …
Posted:

Data Solutions Architect

London, England, United Kingdom
Hybrid / WFH Options
NTT DATA
… on platforms such as AWS, Azure, GCP, and Snowflake.
• Understanding of cloud platform infrastructure and its impact on data architecture.
• A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems.
• Knowledge of programming languages such as Python, R, or Java is beneficial.
• Exposure to ETL/ELT processes, SQL, NoSQL databases is a …
Posted:

Data Solutions Architect - Engineering

London, England, United Kingdom
Hybrid / WFH Options
NTT DATA
… on platforms such as AWS, Azure, GCP, and Snowflake.
• Understanding of cloud platform infrastructure and its impact on data architecture.
• A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems.
• Knowledge of programming languages such as Python, R, or Java is beneficial.
• Exposure to ETL/ELT processes, SQL, NoSQL databases is a …
Posted:

Data Platform Engineer

London, England, United Kingdom
easyJet
… the data field.
Technical Skills Required:
• Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD).
• Experience with Apache Spark or any other distributed data programming frameworks.
• Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake.
• Experience with cloud infrastructure like AWS or …
Posted:
Apache salary percentiles:
10th Percentile: £37,574
25th Percentile: £60,375
Median: £110,000
75th Percentile: £122,500
90th Percentile: £138,750