development experience, with extensive expertise in: back-end data processing and data lakehouse architecture. Hands-on experience with Big Data open-source technologies such as Apache Airflow, Apache Kafka, Apache Pekko, Apache Spark & Spark Structured Streaming, Delta Lake, AWS Athena, Trino, MongoDB, AWS S3, and MinIO S3. Proven …
London, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and enhance Elasticsearch … Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka …
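None of these listings include code, but the event-streaming skill they ask for has a small core idea worth seeing concretely. Below is a minimal, framework-free sketch of the tumbling-window aggregation that engines like Kafka Streams, Flink, or Spark Structured Streaming perform at scale; the event data, key names, and window size are invented for the example.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed-size, non-overlapping
    windows and count occurrences of each key per window.

    This is the tumbling-window aggregation pattern; real stream
    processors do the same thing incrementally over unbounded input.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one window, aligned to the window size.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in windows.items()}

# Hypothetical clickstream: (timestamp_seconds, event_type).
events = [(0, "click"), (3, "view"), (7, "click"), (12, "click")]
result = tumbling_window_counts(events, window_seconds=10)
# Events at t=0, 3, 7 fall in the [0, 10) window; t=12 falls in [10, 20).
```

A production engine adds watermarks and late-data handling on top of exactly this grouping step.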
performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired …
practices include OWASP guidelines/top 10, SOC 2, and NCSC cloud security principles. Experience in data and orchestration tools, including some of dbt, Apache Airflow, and Azure Data Factory. Experience in programming languages, including some of Python, TypeScript, JavaScript, R, Java, and C#, producing services, APIs, Function Apps, or Lambdas. …
London, England, United Kingdom Hybrid / WFH Options
VIOOH
Kibana/Grafana/Prometheus). Write software using either Java, Scala, or Python. The following are nice to have, but not required: Apache Spark jobs and pipelines. Experience with any functional programming language. Writing and analysing SQL queries. Application overview: Our recruitment team will work hard to …
stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills: Experience with big-data technologies such as Splunk, Trino, and Apache Iceberg. Data Science experience. AI/ML experience with building models. AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer). About Us …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Eden Scott
You have strong experience in Java development and exposure to Python. Experience with large-scale data processing and search technologies. Expertise in Apache Lucene, Solr, and Elasticsearch, or the appetite to learn them. Hands-on experience with SQL and NoSQL databases under your belt. Hold …
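Lucene, Solr, and Elasticsearch all sit on the same core data structure, which is simple enough to sketch in a few lines. This is a toy illustration only, not the real engines' implementation; the documents and queries are invented for the example.

```python
def build_inverted_index(docs):
    """Map each term to the sorted list of document ids containing it.

    An inverted index like this is the data structure at the heart of
    Apache Lucene (and therefore Solr and Elasticsearch); the real thing
    adds tokenization, scoring, and compressed on-disk postings lists.
    """
    index = {}
    for doc_id, text in docs.items():
        for term in set(text.lower().split()):
            index.setdefault(term, set()).add(doc_id)
    return {term: sorted(ids) for term, ids in index.items()}

def search(index, term):
    """Return the ids of documents containing the term (empty if none)."""
    return index.get(term.lower(), [])

# Hypothetical document collection.
docs = {1: "java search engine", 2: "python data search", 3: "java streams"}
index = build_inverted_index(docs)
```

Lookup is then a dictionary access rather than a scan over every document, which is why these engines stay fast as the corpus grows.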
London, England, United Kingdom Hybrid / WFH Options
VIOOH
Kibana, Grafana, or Prometheus. Proficiency with Terraform, Docker, and Kubernetes. Software development experience in Java, Scala, or Python. Desirable (but not required): Experience with Apache Spark jobs and pipelines. Knowledge of functional programming languages. Understanding of database design concepts. Ability to write and analyze SQL queries. Application Process: Our …
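Since writing and analysing SQL queries comes up in several of these listings, here is a minimal self-contained example using Python's built-in sqlite3 module. The table, column names, and rows are invented for the sketch; the GROUP BY / ORDER BY pattern is the kind of aggregate query being asked about.

```python
import sqlite3

# In-memory database so the example needs no external setup.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE impressions (screen TEXT, plays INTEGER)")
conn.executemany(
    "INSERT INTO impressions VALUES (?, ?)",
    [("waterloo", 120), ("kings_cross", 340), ("waterloo", 80)],
)

# Aggregate total plays per screen, highest first.
rows = conn.execute(
    "SELECT screen, SUM(plays) AS total "
    "FROM impressions GROUP BY screen ORDER BY total DESC"
).fetchall()
```

Being able to predict the shape of this result set (one row per group, ordered by the aggregate) is exactly the "analysing queries" skill the ads refer to.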
SDLC automation tools like JIRA, Bamboo, and Ansible is a plus. Qualifications/Experience: Strong programming skills in Java; experience with Spring, Hibernate, and Apache Ignite is a plus. Ability to write complex SQL queries. Experience with the Fidessa Equities platform (ETP/CTAC) is desirable. Unix/Linux command …
AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance …
communication skills and demonstrated ability to engage with business stakeholders and product teams. Experience in data modeling, data warehousing (e.g., Snowflake, AWS Glue, EMR, Apache Spark), and working with data pipelines. Leadership experience, whether technical mentorship, team leadership, or managing critical projects. Familiarity with Infrastructure as Code (IaC) …
diagram of proposed tables to enable discussion. Good communicator, comfortable presenting ideas and outputs to technical and non-technical users. Worked with Apache Airflow to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks like …
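Creating DAGs in Airflow, as the listing above asks for, ultimately means declaring tasks and their dependencies so the scheduler can run them in a valid order. The dependency-resolution idea can be shown without Airflow itself using the standard library; the task names here are a hypothetical extract/transform/validate/load pipeline, not Airflow API calls.

```python
from graphlib import TopologicalSorter

# Each key is a task; the set is the tasks it depends on (its upstream tasks).
# In Airflow this same shape is expressed as e.g. extract >> [transform, validate] >> load.
dag = {
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# A topological order is any task sequence that respects every dependency --
# this is what a DAG scheduler computes before dispatching work.
order = list(TopologicalSorter(dag).static_order())
```

Airflow layers scheduling intervals, retries, and operators on top, but an invalid (cyclic) dependency graph fails for the same reason it would here: no topological order exists.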
London, England, United Kingdom Hybrid / WFH Options
Applicable Limited
Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes …
technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow, as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
programming languages including Java, Python, Scala, or Golang. Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential. Experience with cloud platforms like AWS, Azure, or Google Cloud. Strong proficiency in designing, developing, and deploying microservices architecture, with a …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Hargreaves Lansdown
equivalent experience. Experience: Advanced experience in test automation development using tools like Selenium, JUnit, TestNG, Cypress, etc. Familiarity with performance testing tools such as Apache Bench, JMeter, or LoadRunner, or modern alternatives like k6, Gatling, or Locust. Familiarity with BDD tools like Cucumber or SpecFlow. Skills: Proficiency in programming languages …
Employment Type: Permanent, Part Time, Work From Home
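The performance-testing tools named in this listing (Apache Bench, JMeter, k6, Gatling, Locust) all reduce their raw timings to the same summary statistics. Here is a small sketch of the nearest-rank percentile calculation behind a "p95 latency" figure; the sample latencies are invented for the example, and real tools may use interpolated variants of this formula.

```python
import math

def percentile(latencies_ms, p):
    """Nearest-rank percentile: the smallest sample such that at least
    p percent of all samples are less than or equal to it.

    Load-test reports lean on percentiles rather than averages because a
    mean hides the slow tail that users actually feel.
    """
    ranked = sorted(latencies_ms)
    k = max(0, math.ceil(p / 100 * len(ranked)) - 1)
    return ranked[k]

# Hypothetical response times in milliseconds from a short test run.
samples = [12, 15, 11, 90, 14, 13, 16, 250, 12, 14]
p50 = percentile(samples, 50)
p95 = percentile(samples, 95)
```

Note how the median (p50) stays small while p95 exposes the 250 ms outlier; that gap is what performance test reports are designed to surface.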
factor app development standards. Experience building modern enterprise applications and deploying to public or private clouds, including AWS. Experience with distributed cache systems like Apache Ignite or Redis. Experience with big data platforms and technologies such as Hadoop, Hive, HDFS, Presto/Starburst, Spark, and Kafka. Experience in Spring …
London, England, United Kingdom Hybrid / WFH Options
Flutter
standards for security, resilience, and operational support. Skills & Experience Required. Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and …
source platform that we will teach you. Read more on Bloomberg. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, Big Data, HTML5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You'll be involved in building the next generation of finance systems …
require the highest data throughput in Java. Within Data Engineering we use Dataiku, Snowflake, Prometheus, and ArcticDB heavily. We use Kafka for data pipelines, Apache Beam for ETL, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, and Docker …
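Apache Beam, mentioned above for ETL, is known for composing transforms with the `|` operator into a readable pipeline. The toy class below imitates only that composition style in plain Python; it is not the Beam API, and the stage names and trade records are invented for the sketch.

```python
class Transform:
    """Minimal stand-in for a Beam-style composable transform.

    Overriding __ror__ lets a plain value appear on the left of `|`,
    which is what makes `data | stage1 | stage2` chains read like a
    Beam pipeline. Real Beam transforms are deferred and distributed.
    """
    def __init__(self, fn):
        self.fn = fn

    def __ror__(self, data):
        # Called for `data | transform` when data doesn't define __or__.
        return self.fn(data)

# Hypothetical ETL stages over raw "symbol,price" strings.
parse = Transform(lambda rows: [r.split(",") for r in rows])
to_float = Transform(lambda rows: [(sym, float(px)) for sym, px in rows])
keep_large = Transform(lambda rows: [r for r in rows if r[1] > 100.0])

raw = ["AAPL,191.5", "VOD,72.3", "MSFT,402.1"]
result = raw | parse | to_float | keep_large
```

The payoff of this style, in Beam as here, is that each stage stays a small testable function while the pipeline reads top to bottom as data flow.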
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Navtech, Inc
engineering or a related field. Proficiency in English, spoken and written. Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner, …)
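Change Data Capture, listed above as a nice-to-have, boils down to turning the difference between two states of a table into a stream of insert/update/delete events. Real CDC tools read these events from the database's write-ahead log rather than diffing snapshots, but the event model they emit can be sketched as a snapshot diff; the table rows here are invented for the example.

```python
def capture_changes(before, after):
    """Diff two snapshots of a keyed table into CDC-style change events.

    Each snapshot maps primary key -> row. The output is the list of
    (operation, key, row) events that transforms `before` into `after`.
    """
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append(("insert", key, row))
        elif before[key] != row:
            changes.append(("update", key, row))
    for key in before:
        if key not in after:
            changes.append(("delete", key, before[key]))
    return changes

# Hypothetical users table before and after some writes.
before = {1: {"email": "a@x.com"}, 2: {"email": "b@x.com"}}
after = {1: {"email": "a@new.com"}, 3: {"email": "c@x.com"}}
changes = capture_changes(before, after)
```

Downstream consumers (a warehouse, a search index, a cache) replay exactly this kind of event list to stay in sync without re-copying the whole table.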