South East London, England, United Kingdom Hybrid / WFH Options
Hartree Partners
/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record of turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB. An understanding of compute resources and the ability to interpret performance metrics.
the big 3 cloud ML stacks (AWS, Azure, GCP). Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large-scale/Big Data technologies such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with workflow orchestration … tools like Apache Airflow. Experience with containerisation using Docker and deployment on Kubernetes. Experience with NoSQL and graph databases. Unix server administration and shell scripting experience. Experience in building scalable data pipelines for highly unstructured data. Experience in building DWH and data lake architectures. Experience in working in cross …
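Several of these roles centre on Spark-based pipelines over semi-structured data. For orientation, a minimal PySpark batch job might look like the sketch below; the bucket paths, column names, and transform are illustrative assumptions, not taken from any listing.

```python
# Minimal PySpark batch job: read raw JSON, clean it, write partitioned Parquet.
# Paths and columns are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")       # semi-structured input
cleaned = (
    raw.filter(F.col("event_type").isNotNull())                # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))        # derive a partition column
)
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
spark.stop()
```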
Databrew. They are experienced in developing batch and real-time data pipelines for Data Warehouse and Datalake, utilizing AWS Kinesis and Managed Streaming for Apache Kafka. They are also proficient in using open-source technologies like Apache Airflow and dbt, Spark/Python or Spark/Scala …
of SQL and experience in testing databases and data warehouses with dbt (e.g., Snowflake – preferred, Redshift, BigQuery). Strong knowledge of workload automation platforms like Apache Airflow and dbt (Data Build Tool). Familiarity with CI/CD tools (e.g., Azure DevOps – preferred, Jenkins) and experience integrating automated tests into … and frameworks. Proficiency with automation testing frameworks (Cucumber, Gherkin, TestNG, or similar) for data testing workloads. Knowledge of performance and load testing tools (Apache JMeter or Gatling). Experience: Proven track record in supporting and improving test processes in data-related projects. Experience in leadership, mentoring, or training roles.
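As a point of reference for the warehouse-testing skills above, a data-quality check can be expressed as an ordinary pytest case run from a CI pipeline. This is a minimal sketch assuming the Snowflake Python connector; the environment variables and table name are placeholders.

```python
# Hypothetical pytest check that a warehouse table has no NULL keys.
# Credentials are read from the environment, as a CI pipeline would supply them.
import os
import pytest
import snowflake.connector

@pytest.fixture(scope="module")
def conn():
    con = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    yield con
    con.close()

def test_no_null_order_ids(conn):
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL")
    (null_count,) = cur.fetchone()
    assert null_count == 0, f"{null_count} NULL order_id rows found"
```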
object-oriented design principles, and data structures. Extensive experience in developing microservices using Java and Python. Experience in distributed computing frameworks like Hive/Hadoop and Apache Spark. Good experience in test-driven development and automating test cases using Java/Python. Experience in SQL/NoSQL (Oracle, Cassandra) database design … following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda. Working experience with Terraform. Experience in creating workflows for Apache Airflow. About Roku: Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build …
insights. Build and deploy infrastructure with Terraform. Implement DDL and DML with Iceberg. Do code reviews for your peers. Orchestrate your pipelines with DAGs on Airflow. Participate in SCRUM ceremonies (stand-ups, backlogs, demos, retros, planning). Secure data with IAM and AWS Lake Formation. Deploy your changes with Jenkins and GitHub … diagram of proposed tables to enable discussion. Good communicator, comfortable presenting ideas and outputs to technical and non-technical users. Worked with Apache Airflow before to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing and sprints. More information: Enjoy fantastic perks …
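Airflow DAG authoring recurs throughout these listings. A minimal sketch using the Airflow 2.x TaskFlow API is shown below; the DAG name, schedule, and task bodies are placeholders, and in practice the tasks would trigger Spark, dbt, or Iceberg jobs.

```python
# Minimal Airflow DAG sketch (TaskFlow API, Airflow 2.4+ "schedule" argument).
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract() -> list[dict]:
        return [{"id": 1}]                    # stand-in for pulling source data

    @task
    def load(rows: list[dict]) -> None:
        print(f"loading {len(rows)} rows")    # stand-in for a warehouse write

    load(extract())                           # wires extract -> load

example_pipeline()
```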
Infrastructure as Code and secrets management. MongoDB – document-oriented NoSQL database knowledge. GCP (Google Cloud Platform) – experience managing and supporting workloads in GCP. Apache Airflow – workflow orchestration using Python/SQL. Apache Kafka – event streaming with Java, Python, or Scala. Cisco NSO – YANG modelling, service development …
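For the Kafka event-streaming requirement above, a producer in Python can be as small as the following sketch using the confluent-kafka client; the broker address, topic, and payload are assumed for illustration.

```python
# Minimal Kafka producer sketch with confluent-kafka.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def on_delivery(err, msg):
    # Invoked per message during poll()/flush(); err is None on success.
    if err is not None:
        print(f"delivery failed: {err}")

event = {"device_id": "abc-123", "status": "up"}              # illustrative payload
producer.produce(
    "network-events",
    value=json.dumps(event).encode("utf-8"),
    callback=on_delivery,
)
producer.flush()  # block until all queued messages are delivered
```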
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. …
languages. Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience with process scheduling platforms like Apache Airflow. Willingness to work with proprietary technologies like Slang/SECDB. Understanding of compute resources and performance metrics. Knowledge of distributed computing frameworks …
working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in …
Bristol, England, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and …
Solid experience with ETL/ELT pipelines using dbt. Hands-on experience with cloud platforms (AWS, Azure, or GCP). Familiarity with orchestration tools like Airflow, Dagster, or Azure Data Factory. Desirable Skills: Background in software engineering or DevOps practices (CI/CD, version control, testing). Experience with streaming technologies …
London, England, United Kingdom Hybrid / WFH Options
Winston Fox
Firm, and/or working with Quant Trading Technology. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks, or other cloud data warehouses, preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg …)
Kinesis, Lambda, etc. Strong knowledge of data lake concepts, architectures, and design patterns. Experience in building and managing data pipelines using tools such as Airflow, Spark, Kinesis, etc. Experience in working with structured, semi-structured, and unstructured data sources such as relational databases, NoSQL databases, APIs, web logs, etc.
and optimising data warehouses and ELT pipelines. Solid experience across cloud platforms – ideally AWS, Snowflake, or Databricks. Comfortable working with automation/integration tools (Airflow, Fivetran, Astronomer). Hands-on with Terraform, Docker, Kubernetes and modern CI/CD tools (GitHub Actions, Jenkins, CircleCI, etc.). Experience with real-time pipelines …
solutions using API and microservice-based architecture. Deep understanding of ETL/ELT architecture, streaming, and event-driven processing; familiarity with tools like dbt, Airflow, Kafka, or equivalents. Familiarity with mid-sized firm tech stacks, especially in financial services, including systems such as NetSuite, Salesforce, and Addepar. Experience with Atlassian …