Science, Math, or Financial Engineering degree Strong knowledge in other programming language(s) - e.g., JavaScript, TypeScript, Kotlin Strong knowledge of data orchestration technologies - e.g., Apache Airflow, Dagster, AWS Step Functions Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing. Commitment …
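The listing above asks for an understanding of both batch and real-time (streaming) processing. As a stdlib-only illustration of the distinction (all names and data hypothetical), the same aggregation can be expressed as a batch job over a complete dataset or as incremental updates over a stream of records:

```python
def batch_total(events):
    """Batch style: the full dataset is available up front."""
    return sum(e["amount"] for e in events)

class StreamingTotal:
    """Streaming style: maintain running state, update per record."""
    def __init__(self):
        self.total = 0

    def update(self, event):
        self.total += event["amount"]
        return self.total

events = [{"amount": 10}, {"amount": 25}, {"amount": 5}]

# Both styles arrive at the same answer; the difference is when
# the data is available and how state is held between records.
assert batch_total(events) == 40

st = StreamingTotal()
for e in events:
    latest = st.update(e)
assert latest == 40
```

Frameworks such as Spark (batch) and Flink or Pulsar (streaming) generalise these two shapes with distribution, fault tolerance, and windowing.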
performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired …
technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX markets and financial instruments would be beneficial. What we'll …
factor app development standards Experience building modern enterprise applications and deploying to public or private clouds including AWS Experience in distributed cache systems like Apache Ignite or Redis Experience in big data platforms and technologies such as Hadoop, Hive, HDFS, Presto/Starburst, Spark, and Kafka Experience in Spring …
up and learn new technologies and frameworks Nice to have: Knowledge of databases, SQL Familiarity with Boost ASIO Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers Contra Experience with gRPC, HTTP/REST and WebSocket protocols Experience with Google Cloud/AWS and …
Experience in fintech or trading industries. Strong object-oriented development skills and software engineering fundamentals. Hands-on experience with cloud data-engineering technologies like Apache Airflow, Kubernetes (K8s), ClickHouse, Snowflake, Redis, caching technologies, and Kafka. Proficiency in relational and non-relational databases, SQL, and query optimization. Experience designing infrastructure to …
Strong communication skills and the ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS Experience with Kubernetes deployment architectures Apache NiFi experience Experience building trading controls within an investment bank ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to …
AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance …
experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure …
team of developers globally. The platform is a Greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, Apache Ignite. The platform runs in a hybrid mode both on-premise and in AWS utilising technologies such as EKS, S3, FSx. The main purpose …
hands-on experience with exposure to Snowflake and the Big Data tech stack Technical Skills: Python PySpark DBT Airflow AWS, especially S3 and Glue Apache Iceberg SQL Git Kubernetes and Docker Shell Snowflake and Big Data tech stack …
in mentoring junior team members Experience in Oracle/Relational Databases and/or Mongo Experience in GitLab CI/CD Pipelines Knowledge of Apache NiFi Experience in JavaScript/TypeScript & React Experience of Elasticsearch and Kibana Knowledge of Hibernate Proficiency in the use of Atlassian Suite - Bitbucket, Jira …
City of London, London, United Kingdom (Hybrid / WFH options) | Tenth Revolution Group
3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing don…
Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes …
working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace We are proud to be an equal opportunity workplace. We do not discriminate based …
Proficiency in version control tools like Git ensures effective collaboration and management of code and data models. Experience with workflow automation tools, such as Apache Airflow, is crucial for streamlining and orchestrating complex data processes. Skilled at integrating data from diverse sources, including APIs, databases, and third-party systems …
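The core idea behind workflow orchestrators like Airflow is a DAG of tasks executed in dependency order. As a minimal stdlib-only sketch (task names are hypothetical, not Airflow API calls), Python's `graphlib` can compute such an ordering:

```python
from graphlib import TopologicalSorter

# Each task lists its upstream dependencies, as an Airflow DAG would.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields tasks so every task follows its dependencies;
# an orchestrator adds scheduling, retries, and monitoring on top.
order = list(TopologicalSorter(dag).static_order())

for task, deps in dag.items():
    assert all(order.index(d) < order.index(task) for d in deps)
```

In a real Airflow DAG the same dependencies would be declared with operators and `>>` chaining; the ordering guarantee is the same.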
Github integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start-up …
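"Working knowledge of message queuing" in the listing above boils down to the producer/consumer pattern. A stdlib-only sketch (toy data, no broker; real systems would use Kafka, RabbitMQ, or similar):

```python
import queue
import threading

# A thread-safe FIFO queue stands in for a message broker.
q = queue.Queue()
results = []

def producer():
    # Publish five messages, then a sentinel marking end-of-stream.
    for i in range(5):
        q.put(i)
    q.put(None)

def consumer():
    # Drain messages and process each one (here: double it).
    while True:
        msg = q.get()
        if msg is None:
            break
        results.append(msg * 2)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

assert results == [0, 2, 4, 6, 8]
```

The decoupling shown here (producer and consumer share only the queue) is what lets real message systems scale the two sides independently.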
. Proficiency in data visualization libraries (Plotly, Seaborn). Solid understanding of database design principles and normalization. Experience with ETL tools and processes and Apache Oozie or similar workflow management tools. Understanding of Machine Learning and AI concepts is a plus. Leadership & Interpersonal Skills: Proven track record of managing …
in multiple operating systems, including OS performance monitoring, setup, configuration, tuning, and troubleshooting. Proficiency in web or web server technologies: Java, Node.js, Tomcat, IIS, Apache/nginx, MySQL, PostgreSQL, etc., including being able to perform basic setup, configuration, and troubleshooting. Understanding of internet technologies and network protocols, including HTTP …
British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) If you do not meet all requirements still feel …