performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar … big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. …
Science, Math, or Financial Engineering degree Strong knowledge in other programming language(s) - e.g., JavaScript, TypeScript, Kotlin Strong knowledge of data orchestration technologies - e.g., Apache Airflow, Dagster, AWS Step Functions Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing. …
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/fintech …
the big 3 cloud ML stacks (AWS, Azure, GCP). Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large scale/Big Data technologies, such as Hadoop, Spark, Hive, Impala, PrestoDB, Kafka. Experience with workflow orchestration … tools like Apache Airflow. Experience with containerisation using Docker and deployment on Kubernetes. Experience with NoSQL and graph databases. Unix server administration and shell scripting experience. Experience in building scalable data pipelines for highly unstructured data. Experience in building DWH and data lake architectures. Experience in working in cross …
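As a purely illustrative sketch of the kind of batch work these big-data requirements describe (the app name, file paths, and column names below are invented, not taken from any listing), a small PySpark aggregation might look like:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Local session for illustration; on a real cluster this would be
# configured for YARN or Kubernetes, with paths on HDFS or S3.
spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

# Hypothetical input: raw events with "event_date" and numeric "amount" columns.
events = spark.read.csv("events.csv", header=True, inferSchema=True)

# A representative batch aggregation: total amount per day.
daily_totals = (
    events
    .groupBy("event_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write results as Parquet, overwriting any previous run.
daily_totals.write.mode("overwrite").parquet("daily_totals.parquet")

spark.stop()
```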
AWS preferred) Solid understanding of libraries like Pandas and NumPy Experience in data warehousing tools like Snowflake, Databricks, BigQuery Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools Commercial experience with performant database programming in SQL Capability to solve complex technical issues, comprehending risks prior to …
of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log …
diagram of proposed tables to enable discussion. Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users. Worked with Apache Airflow to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks …
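Several of these listings ask for hands-on experience creating DAGs in Airflow. A minimal sketch, assuming Airflow 2.x and using a hypothetical dag_id and placeholder task bodies, looks like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real task would pull from a source system.
    print("extracting")


def load():
    # Placeholder load step; a real task would write to a warehouse.
    print("loading")


with DAG(
    dag_id="example_etl",              # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare ordering: load runs only after extract succeeds.
    extract_task >> load_task
```

The `>>` operator declares task ordering; Airflow's scheduler resolves the resulting dependency graph and schedules runs according to the DAG's configuration.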
Proficiency in version control tools like Git ensures effective collaboration and management of code and data models. Experience with workflow automation tools, such as Apache Airflow, is crucial for streamlining and orchestrating complex data processes. Skilled at integrating data from diverse sources, including APIs, databases, and third-party …
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a …
across the company. Role requirements 4+ years of experience You have an understanding of developing ETL pipelines using Python frameworks such as Luigi or Airflow; You have experience with the development of Python-based REST APIs/services and their integration with databases (e.g. Postgres); You are familiar with …
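To make the Luigi mention concrete: a minimal sketch of a two-stage ETL pipeline, with invented file names and a trivial doubling transform standing in for real business logic, could look like:

```python
import luigi


class Extract(luigi.Task):
    """Hypothetical extract step: writes raw records to a local file."""

    def output(self):
        return luigi.LocalTarget("raw.csv")

    def run(self):
        with self.output().open("w") as f:
            f.write("id,value\n1,10\n2,20\n")


class Transform(luigi.Task):
    """Depends on Extract; doubles each value as a stand-in transformation."""

    def requires(self):
        return Extract()

    def output(self):
        return luigi.LocalTarget("transformed.csv")

    def run(self):
        with self.input().open() as src, self.output().open("w") as dst:
            dst.write(next(src))  # copy the header line
            for line in src:
                row_id, value = line.strip().split(",")
                dst.write(f"{row_id},{int(value) * 2}\n")


if __name__ == "__main__":
    # Luigi resolves requires() into a dependency graph and skips
    # tasks whose output targets already exist.
    luigi.build([Transform()], local_scheduler=True)
```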
MPP databases such as Redshift, BigQuery, or Snowflake, and modern transformation/query engines like Spark, Flink, Trino. Familiarity with workflow management tools (e.g., Airflow) and/or dbt for transformations. Comprehensive understanding of modern data platforms, including data governance and observability. Experience with cloud platforms (AWS, GCP, Azure …
as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning.
navigate client relationships and translate technical insights into business value. Experience with cloud platforms (e.g., Snowflake, AWS) and ETL/ELT pipeline tools like Airflow/dbt. Benefits: £6,000 per annum training & conference budget to help you up-skill and elevate your career Pension contribution scheme (up to …
architectures and CAP theorem. A good understanding of functional paradigms and type theory. Confident JVM knowledge. Modern Java, Ruby, or Clojure knowledge. Experience with Airflow or other Python-based workflow orchestration tools. Exposure to Kubernetes, Docker, Linux, Kafka, RabbitMQ, or Git. Knowledge of financial concepts, exchange trading, or physical …
maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ years' data engineering experience Snowflake experience Proficiency across an AWS tech stack dbt expertise Terraform experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pension Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
Have touched cloud services (AWS, GCP, etc.) Enjoy solving problems, learning fast, and working with good people Bonus points if you’ve played with Airflow, Docker, or big data tools — but they’re more interested in mindset than buzzwords. The team’s based onsite in London — they’re collaborative …
Python, R, and Java. Experience scaling machine learning on data and compute grids. Proficiency with Kubernetes, Docker, Linux, and cloud computing. Experience with Dask, Airflow, and MLflow. MLOps, CI, Git, and Agile processes. Why you won't want to miss this career opportunity: We are a mission-driven firm …
transformation, cleaning, and loading. · Strong coding experience with Python and Pandas. · Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. · Build processes supporting data transformation, data structures, metadata, dependency and workload management. · Experience supporting and working with cross-functional teams in a dynamic …
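To ground the Dagster mention, here is a minimal sketch of two software-defined assets with a Pandas cleaning step; the asset names and sample data are invented for illustration:

```python
import pandas as pd
from dagster import asset, materialize


@asset
def raw_orders() -> pd.DataFrame:
    # Hypothetical source; a real asset would read from an API or database.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, None, 30.0]})


@asset
def clean_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Dagster wires the dependency from the parameter name matching
    # the upstream asset; this step drops rows with missing amounts.
    return raw_orders.dropna(subset=["amount"])


if __name__ == "__main__":
    # Materialize both assets in dependency order.
    result = materialize([raw_orders, clean_orders])
    assert result.success
```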