development experience, with extensive expertise in: back-end data processing; data lakehouse architecture; hands-on experience with Big Data open-source technologies such as Apache Airflow, Apache Kafka, Apache Pekko, Apache Spark & Spark Structured Streaming, Delta Lake, AWS Athena, Trino, MongoDB, AWS S3, MinIO S3. Proven …
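To make the stack above concrete, here is a minimal illustrative sketch, assuming a Spark session with the Kafka and Delta Lake connectors on the classpath; the broker, topic, and storage paths are hypothetical:

```python
# Minimal PySpark Structured Streaming sketch: read a Kafka topic and
# append it to a Delta Lake table. Broker, topic, and path names are
# hypothetical; spark-sql-kafka and delta-spark must be available.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")  # hypothetical topic name
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3a://bucket/checkpoints/events")  # hypothetical
    .start("s3a://bucket/delta/events")                               # hypothetical
)
query.awaitTermination()
```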
London, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such … as Kafka, Apache NiFi, and Apache Flink to handle real-time data streams effectively. Security and Compliance: Manage security setups and access controls, incorporating tools like Keycloak to protect data integrity and comply with legal standards across all data platforms. Data Search and Analytics: Oversee and enhance Elasticsearch … Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache NiFi and Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka …
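As a purely illustrative sketch of the workflow automation this role describes, a minimal Airflow 2.x DAG with two dependent tasks; the DAG id, schedule, and callables are hypothetical:

```python
# Minimal Apache Airflow DAG sketch: one daily pipeline with two
# dependent tasks. DAG id, schedule, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source system")


def load():
    print("write data to the target store")


with DAG(
    dag_id="hybrid_data_flow",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```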
performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar … big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired …
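For the real-time messaging requirement, a minimal sketch using the pulsar-client package; the service URL, topic, and subscription names are hypothetical:

```python
# Minimal Apache Pulsar consumer sketch using the pulsar-client
# package. Service URL, topic, and subscription names are hypothetical.
import pulsar

client = pulsar.Client("pulsar://localhost:6650")
consumer = client.subscribe("orders", subscription_name="orders-sub")

try:
    while True:
        msg = consumer.receive(timeout_millis=10000)
        print("received:", msg.data().decode("utf-8"))
        consumer.acknowledge(msg)  # acknowledge only after successful processing
except pulsar.Timeout:
    pass  # no messages arrived within the timeout window
finally:
    client.close()
```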
practices include OWASP guidelines/top 10, SOC 2, and NCSC cloud security principles. Experience in data and orchestration tools, including some of dbt, Apache Airflow, and Azure Data Factory. Experience in programming languages, including some of Python, TypeScript, JavaScript, R, Java, and C#, producing services, APIs, Function Apps, or Lambdas. …
understanding of tradable financial instruments (securities, derivatives) and capital markets. Computer Science, Math, or Financial Engineering degree. Strong knowledge of data orchestration technologies, e.g., Apache Airflow, Dagster, AWS Step Functions. Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing. Role …
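As one hedged illustration of such orchestration, a minimal Dagster asset graph; the asset names and toy data are hypothetical, and Dagster infers the dependency from the parameter name:

```python
# Minimal Dagster sketch: two software-defined assets where the second
# depends on the first. Asset names and the toy data are hypothetical.
from dagster import asset, materialize


@asset
def raw_trades():
    # stand-in for an extract step against a real source
    return [{"symbol": "ABC", "qty": 100}, {"symbol": "XYZ", "qty": -50}]


@asset
def net_positions(raw_trades):
    # stand-in for a transform step; aggregates quantity per symbol
    totals = {}
    for trade in raw_trades:
        totals[trade["symbol"]] = totals.get(trade["symbol"], 0) + trade["qty"]
    return totals


if __name__ == "__main__":
    result = materialize([raw_trades, net_positions])
    print(result.success)
```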
SDLC automation tools like JIRA, Bamboo, and Ansible is a plus. Qualifications/Experience: Strong programming skills in Java; experience with Spring, Hibernate, and Apache Ignite is a plus. Ability to write complex SQL queries. Experience with the Fidessa Equities platform ETP/CTAC is desirable. Unix/Linux command …
Spark/Scala/Kafka. Unix: scripting and config. Other highly valued skills include: Automation: Python/Bash scripting; Database: Teradata, Oracle; Workflow Management: Apache Airflow. You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business …
technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow, as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
factor app development standards. Experience building modern enterprise applications and deploying to public or private clouds, including AWS. Experience in distributed cache systems like Apache Ignite or Redis. Experience in big data platforms and technologies such as Hadoop, Hive, HDFS, Presto/Starburst, Spark, and Kafka. Experience in Spring …
Splunk, etc.). - Strong and demonstrable experience writing regular expressions and/or JSON parsing, etc. - Strong experience in log processing (Cribl, Splunk, Elastic, Apache NiFi, etc.). - Expertise in the production of dashboards/insight delivery. - Be able to demonstrate a reasonable level of security awareness (an understanding of …
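To illustrate the parsing skills named above, a small sketch using Python's standard re and json modules; the sample log lines and field names are hypothetical:

```python
# Minimal log-parsing sketch: extract fields from a syslog-style line
# with a regular expression, then parse a JSON log line. The sample
# lines and field names are hypothetical.
import json
import re

SYSLOG_PATTERN = re.compile(
    r"^(?P<ts>\w{3} +\d+ [\d:]{8}) (?P<host>\S+) (?P<proc>[\w\-/]+)(\[\d+\])?: (?P<msg>.*)$"
)

raw = "Jan  5 12:34:56 web01 sshd[2211]: Failed password for root from 10.0.0.5"
m = SYSLOG_PATTERN.match(raw)
if m:
    print(m.groupdict())  # {'ts': 'Jan  5 12:34:56', 'host': 'web01', ...}

json_line = '{"level": "error", "service": "auth", "message": "token expired"}'
event = json.loads(json_line)
print(event["level"], event["message"])
```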
source platform that we will teach you. Read more on Bloomberg. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, Big Data, HTML5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You'll be involved in building the next generation of finance systems …
stakeholders at all levels, provide training, and solicit feedback. Preferred qualifications, capabilities, and skills: Experience with big-data technologies such as Splunk, Trino, and Apache Iceberg. Data science experience. AI/ML experience with building models. AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer).
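A hedged sketch of querying an Iceberg table through Trino with the trino Python client; the host, catalog, schema, and table names are hypothetical:

```python
# Minimal Trino query sketch using the `trino` Python package.
# Host, catalog, schema, and table names are hypothetical.
from trino.dbapi import connect

conn = connect(host="trino.example.com", port=8080, user="analyst")
cur = conn.cursor()
cur.execute(
    "SELECT event_date, count(*) "
    "FROM iceberg.analytics.events "  # hypothetical Iceberg table
    "GROUP BY event_date "
    "ORDER BY event_date DESC "
    "LIMIT 7"
)
for row in cur.fetchall():
    print(row)
```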
AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance …
London, England, United Kingdom Hybrid / WFH Options
VIOOH
Kibana/Grafana/Prometheus). Write software using Java, Scala, or Python. The following are nice to have, but not required: Apache Spark jobs and pipelines. Experience with any functional programming language. Writing and analysing SQL queries. Application: Our recruitment team will work hard to …
diagram of proposed tables to enable discussion. Good communicator, comfortable presenting ideas and outputs to technical and non-technical users. Has worked with Apache Airflow to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks like …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience working with process scheduling platforms like Apache Airflow. Open to working with GS proprietary technology like Slang/SECDB. An understanding of compute resources and the ability to interpret performance metrics …
London, England, United Kingdom Hybrid / WFH Options
Flutter
standards for security, resilience, and operational support. Skills & Experience Required. Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and …
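A hedged sketch of the kind of Databricks pipeline step this posting describes: an idempotent Delta Lake upsert with PySpark and delta-spark; the table path and column names are hypothetical:

```python
# Minimal Delta Lake upsert sketch (PySpark + delta-spark), of the kind
# a Databricks pipeline might run. Path and column names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

updates = spark.createDataFrame(
    [(1, "GBP", 105.0), (2, "USD", 98.5)],
    ["account_id", "currency", "balance"],
)

target = DeltaTable.forPath(spark, "s3://bucket/delta/balances")  # hypothetical path

(
    target.alias("t")
    .merge(updates.alias("s"), "t.account_id = s.account_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```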
and Kafka. Technical Skills: Proficiency in data modelling (ERD, normalization) and data warehousing concepts. Strong understanding of ETL frameworks and tools (e.g., Talend, Informatica, Apache NiFi). Knowledge of programming languages such as SQL, Python, or Java. Experience with BI tools (e.g., Power BI, Tableau) and data visualisation best …
languages. Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience with process scheduling platforms like Apache Airflow. Willingness to work with proprietary technologies like Slang/SECDB. Understanding of compute resources and performance metrics. Knowledge of distributed computing frameworks like …
London, England, United Kingdom Hybrid / WFH Options
VIOOH
Kibana, Grafana, or Prometheus. Proficiency with Terraform, Docker, and Kubernetes. Software development experience in Java, Scala, or Python. Desirable (but not required): Experience with Apache Spark jobs and pipelines. Knowledge of functional programming languages. Understanding of database design concepts. Ability to write and analyze SQL queries. Application Process: Our …
Strong communication skills and the ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS: Experience with Kubernetes deployment architectures; Apache NiFi experience; experience building trading controls within an investment bank. ABOUT GOLDMAN SACHS: At Goldman Sachs, we commit our people, capital, and ideas to …
modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and dbt. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business …