multitasking Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience with process scheduling platforms like Apache Airflow Willingness to work with proprietary technologies like Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing frameworks like …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience in working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB An understanding of compute resources and the ability to interpret performance metrics …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) • Experience in working with process scheduling platforms like Apache Airflow. • Should be ready to work in GS proprietary technology like Slang/SECDB • An understanding of compute resources and the ability to interpret …
up and learn new technologies and frameworks Nice to have: Knowledge of databases, SQL Familiarity with Boost ASIO Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers Experience with gRPC, HTTP/REST and WebSocket protocols Experience with Google Cloud/AWS and …
includes a firm-wide data catalogue detailing key data elements. Skills Required Java and/or Scala Backends Distributed compute concepts: Spark, Databricks, Apache Beam, etc. Full Stack experience Azure Cloud Technologies Angular/Similar front end Cloud DB/Relational DB: Snowflake/Sybase Unix experience Job …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Omega Resource Group
development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects …
Hucclecote, Gloucestershire, United Kingdom Hybrid / WFH Options
Omega Resource Group
development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects …
Agile (Scrum) methodologies Database experience with Oracle and/or MongoDB Experience using the Atlassian suite: Bitbucket, Jira, and Confluence Desirable Skills Knowledge of Apache NiFi Front-end development with React (JavaScript/TypeScript) Working knowledge of Elasticsearch and Kibana Experience developing for cloud environments, particularly AWS (EC2, EKS …
quickly and apply new skills Desirable Solid understanding of microservices development Working knowledge of SQL and NoSQL databases Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge of Python …
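Several of these listings ask for comfort moving between structured data formats (JSON, XML, YAML). A minimal sketch, using a hypothetical service record, of reading the same fields from JSON and XML with Python's standard library (YAML would need the third-party PyYAML package):

```python
import json
import xml.etree.ElementTree as ET

# The same hypothetical record in two formats (names are invented for illustration).
json_doc = '{"service": {"name": "ingest", "port": 8080}}'
xml_doc = "<service><name>ingest</name><port>8080</port></service>"

# JSON parses straight into dicts and lists, with native types.
svc = json.loads(json_doc)["service"]

# XML is navigated as an element tree; text content is always a string,
# so numeric fields must be cast explicitly.
root = ET.fromstring(xml_doc)
xml_svc = {"name": root.findtext("name"), "port": int(root.findtext("port"))}

assert svc == xml_svc
print(svc)  # {'name': 'ingest', 'port': 8080}
```

The practical difference the snippet shows: JSON carries types, while XML requires the reader to know and apply them.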
develop proficiency in backend technologies (e.g., Python with Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and …
London (City of London), South East England, United Kingdom Hybrid / WFH Options
McGregor Boyall
focus on public cloud onboarding. The platform is a Greenfield build using modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, Apache Ignite. The platform runs in a hybrid mode, both on-premises and in AWS, utilising technologies such as EKS, S3, FSx. Objectives Steering platform …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Revolent Group
/data warehouse/data archive/dashboards Experience with databases (e.g. Oracle, MongoDB, RDBMS, SQL) Experience with BI tools (e.g. Tableau) Knowledge of Apache Kafka Plus, attitude is key. We’re looking for someone adaptable and resilient, who’s invested in their growth and passionate about taking their …
frameworks. Proficient in the use of Linux, Git, MongoDB, and OpenSearch. Exposure to cloud platforms (AWS), containerisation technologies (Docker, Kubernetes) and data processing solutions (Apache NiFi). Experience with AI/ML systems. Familiarity with React Contexts would be advantageous. * Please only apply if you have an active eDV …
British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) If you do not meet all requirements but still feel …
East Sussex, South East England, United Kingdom Hybrid / WFH Options
McCabe & Barton
adopt emerging technologies, and enhance analytics capabilities. Requirements: Technical Proficiency: Hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: Familiarity with cloud platforms like Snowflake, Databricks, or AWS/…
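The ETL/ELT pipeline skills this listing names can be sketched with nothing but the Python standard library. This is a minimal, hypothetical example (the table, columns, and source rows are invented) that extracts rows, transforms them in Python, and loads them into SQLite for downstream SQL queries:

```python
import sqlite3

# Extract: hypothetical (name, revenue) rows pulled from some upstream system.
extracted = [("alpha", "1200"), ("beta", "450"), ("gamma", "980")]

# Transform: normalise names and cast revenue strings to integers.
transformed = [(name.upper(), int(revenue)) for name, revenue in extracted]

# Load: insert into an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (name TEXT, amount INTEGER)")
conn.executemany("INSERT INTO revenue VALUES (?, ?)", transformed)

# A downstream aggregation, as a BI tool such as Power BI or Tableau might issue it.
total = conn.execute("SELECT SUM(amount) FROM revenue").fetchone()[0]
print(total)  # 2630
```

In production the extract and load steps would target real warehouses (Snowflake, Databricks) and the schedule would be owned by an orchestrator like Airflow, but the extract-transform-load shape is the same.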
London, South East England, United Kingdom Hybrid / WFH Options
Owen Thomas | Pending B Corp™
. Experience with CI/CD pipelines and version control. Proficiency in data visualisation tools (e.g. Tableau, PowerBI). Exposure to tools like DBT, Apache Airflow, Docker. Experience working with large-scale datasets (terabyte-level or higher). Excellent problem-solving capabilities. Strong communication and collaboration skills. Proficiency in …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience …
Java development with strong Spring Boot expertise. Solid background in building cloud-native applications, particularly using AWS services such as S3, SQS, Kinesis, or Apache Flink. Strong knowledge of stream processing, low-latency systems, and microservice architectures. Demonstrated leadership experience, including mentoring and guiding junior developers. Proficient in designing …
Qualifications 5+ years of experience in Data Engineering, with a strong background in building data pipelines at scale. Proficiency with modern data technologies (e.g. Apache Airflow, Spark, Kafka, Snowflake, or similar). Strong SQL skills and experience with cloud databases and data warehouses (AWS, GCP, or Azure ecosystems). …
MiFID II or EMIR Experience building real-time applications based on a messaging paradigm Experience building large-scale data processing pipelines (e.g. using Apache Spark) Experience building Highly Available and High Performance applications Experience with the FIX messaging protocol FX Options/Derivatives experience Senior Python Developer …
in Next.js Experience with testing frameworks like Jest, Cypress, or React Testing Library. Experience with authentication strategies using OAuth, JWT, or Cognito Familiarity with Apache Spark/Flink for real-time data processing is an advantage. Hands-on experience with CI/CD tools Commercial awareness and knowledge of …
usage, and data governance considerations, promoting transparency and responsible AI use. 7. Automate ETL pipeline orchestration and data processing workflows: Leverage orchestration tools like Apache Airflow or Prefect to schedule, automate, and manage ETL jobs, reducing manual intervention and improving operational reliability. 8. Implement monitoring, alerting, and troubleshooting for data …
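Orchestrators such as Airflow and Prefect model a pipeline as a DAG of tasks and run each task only after its upstream dependencies have succeeded. A minimal sketch of that dependency-ordering idea in plain Python, with no Airflow or Prefect dependency (the task names are invented for illustration):

```python
from graphlib import TopologicalSorter

# Hypothetical ETL DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "report": {"load"},
}

def run(name: str) -> str:
    # Stand-in for real work (an Airflow operator, a Prefect task, a shell job, ...).
    return f"{name}: ok"

# TopologicalSorter yields tasks in an order that respects every dependency edge;
# for this linear chain the order is unique.
order = list(TopologicalSorter(dag).static_order())
results = [run(task) for task in order]
print(order)  # ['extract', 'transform', 'validate', 'load', 'report']
```

Real orchestrators add scheduling, retries, parallel execution of independent tasks, and the monitoring and alerting that point 8 above calls for, but dependency-respecting execution order is the core of the model.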