performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar … big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired …
multitasking Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience with process scheduling platforms like Apache Airflow Willingness to work with proprietary technologies like Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing frameworks like …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience in working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB An understanding of compute resources and the ability to interpret performance metrics …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant)
• Experience in working with process scheduling platforms like Apache Airflow.
• Should be ready to work in GS proprietary technology like Slang/SECDB
• An understanding of compute resources and the ability to interpret …
technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
up and learn new technologies and frameworks Nice to have: Knowledge of databases, SQL Familiarity with Boost ASIO Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers Contra Experience with gRPC, HTTP/REST and WebSocket protocols Experience with Google Cloud/AWS and …
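The serialization formats named above (Arrow, Parquet, Protocol Buffers, FlatBuffers) all trade human readability for compact, typed binary layouts. A stdlib-only sketch of that trade-off, using Python's `json` and `struct` modules rather than those libraries' actual APIs, with an illustrative record made up for the example:

```python
# Sketch of why binary serialization beats text formats on size.
# Uses only the stdlib; this is NOT the Arrow/Protobuf API.
import json
import struct

record = {"id": 123456, "price": 99.5, "qty": 1000}

# Text encoding: JSON, human-readable but verbose.
as_json = json.dumps(record).encode("utf-8")

# Fixed binary encoding: int32 + float64 + int32 = 16 bytes, little-endian.
as_binary = struct.pack("<idi", record["id"], record["price"], record["qty"])

print(len(as_json), len(as_binary))  # binary is several times smaller

# Round-trip the binary form to show the encoding is lossless here.
rid, price, qty = struct.unpack("<idi", as_binary)
assert (rid, price, qty) == (123456, 99.5, 1000)
```

The real formats add what this sketch omits: schemas, optional fields, and (for Arrow/Parquet) columnar layout for analytics workloads.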
diagram of proposed tables to enable discussion Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users Worked with Apache Airflow to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing and sprints More information: Enjoy fantastic perks like …
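An Airflow DAG is, at bottom, a set of tasks plus dependency edges that fix a valid execution order. A stdlib-only sketch of that underlying idea using `graphlib` (Python 3.9+), with made-up task names; this illustrates dependency-ordered execution, not Airflow's own API:

```python
# A pipeline as a DAG: each task maps to the set of tasks it depends on.
# Stdlib-only illustration of the concept behind Airflow DAGs.
from graphlib import TopologicalSorter

pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so that dependencies always precede dependents,
# which is exactly the guarantee a scheduler needs.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # → ['extract', 'transform', 'load', 'report']
```

In Airflow itself the same shape is expressed with operators and `>>` dependency arrows, and the scheduler handles retries, backfills and timing on top.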
includes a firm-wide data catalogue detailing key data elements. Skills Required Java and/or Scala Backends Distributed compute concepts: Spark, Databricks, Apache Beam, etc. Full Stack experience Azure Cloud Technologies Angular/Similar front end Cloud DB/Relational DB: Snowflake/Sybase Unix experience Job …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Omega Resource Group
development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects …
Agile (Scrum) methodologies Database experience with Oracle and/or MongoDB Experience using the Atlassian suite: Bitbucket, Jira, and Confluence Desirable Skills Knowledge of Apache NiFi Front-end development with React (JavaScript/TypeScript) Working knowledge of Elasticsearch and Kibana Experience developing for cloud environments, particularly AWS (EC2, EKS …
Strong communication skills and the ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS Experience with Kubernetes deployment architectures Apache NiFi experience Experience building trading controls within an investment bank ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to …
modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and dbt. Familiarity with Atlan for data cataloging and metadata management. Understanding of Apache Iceberg tables. Who we are: We're a global business …
AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance …
experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure …
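The big data frameworks named in these listings (Spark, Hadoop and the like) all scale out one core model: map records in parallel, then aggregate by key. A stdlib-only miniature of that model, with made-up sample lines; this shows the idea, not the Spark API:

```python
# Spark-style word count in miniature: map, then reduce by key.
# Stdlib-only illustration of the model Spark/Hadoop distribute across nodes.
from collections import Counter
from itertools import chain

lines = [
    "kafka streams events",
    "spark processes events",
    "spark and kafka together",
]

# map: split each line into words (roughly Spark's flatMap over a partition)
words = chain.from_iterable(line.split() for line in lines)

# reduce: count occurrences per key (roughly reduceByKey after a shuffle)
counts = Counter(words)

print(counts["spark"], counts["kafka"], counts["events"])  # → 2 2 2
```

The frameworks' contribution is running these two stages across many machines, with partitioning, shuffles and fault tolerance handled for you.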
quickly and apply new skills Desirable Solid understanding of microservices development SQL and NoSQL databases working set Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge of Python …
team of developers globally. The platform is a Greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, Apache Ignite. The platform runs in a hybrid mode both on-premise and in AWS utilising technologies such as EKS, S3, FSx. The main purpose …
focus on public cloud onboarding. The platform is a Greenfield build using modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, Apache Ignite. The platform runs in a hybrid mode both on-premises and in AWS utilising technologies such as EKS, S3, FSx. Objectives Steering platform …
in mentoring junior team members Experience in Oracle/Relational Databases and/or Mongo Experience in GitLab CI/CD Pipelines Knowledge of Apache NiFi Experience in JavaScript/TypeScript & React Experience of Elasticsearch and Kibana Knowledge of Hibernate Proficiency in the use of Atlassian Suite - Bitbucket, Jira …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack dbt Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing don…