or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/fintech …
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Java Developer (Software Engineer Programmer Developer Java Fixed Income … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/fintech …
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow, etc.). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a …
maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance …
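The dimensional modelling asked for above boils down to splitting a flat extract into fact and dimension tables. A minimal star-schema sketch in pandas, with illustrative data and column names (a real pipeline would read from S3/Athena):

```python
import pandas as pd

# Hypothetical flat extract; in practice this would come from a data lake.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_name": ["Ada", "Ada", "Grace"],
    "customer_country": ["UK", "UK", "US"],
    "amount": [100.0, 250.0, 75.0],
})

# Dimension table: one row per distinct customer, with a surrogate key.
dim_customer = (
    orders[["customer_name", "customer_country"]]
    .drop_duplicates()
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact table: measures plus a foreign key into the dimension.
fact_orders = orders.merge(dim_customer, on=["customer_name", "customer_country"])
fact_orders = fact_orders[["order_id", "customer_key", "amount"]]
```

The same split is what a warehouse-side modelling tool would express as SQL models; the surrogate-key step is the part dimensional modelling adds over a plain normalisation.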
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ years data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
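Orchestrators such as Apache Airflow model a pipeline as tasks with explicit dependencies. A dependency-free sketch of that idea, with plain Python callables standing in for operators (all names and data are illustrative):

```python
# Minimal task-graph sketch: each "task" is a plain function; an orchestrator
# like Apache Airflow would wrap these in operators and run them on a
# schedule with retries and backfills.

def extract():
    # Stand-in for reading from a source system.
    return [{"symbol": "GBPUSD", "price": 1.27}, {"symbol": "EURUSD", "price": 1.08}]

def transform(rows):
    # Stand-in for a cleaning/enrichment step.
    return [{**r, "price_bp": round(r["price"] * 10_000)} for r in rows]

def load(rows):
    # Stand-in for writing to a warehouse; here we just return a row count.
    return len(rows)

# Explicit dependency order, as a DAG would encode: extract >> transform >> load
loaded = load(transform(extract()))
```

In a real DAG each function would become a task instance, and the `>>` dependency operators (or task decorators) would replace the direct function composition shown here.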
London, South East England, United Kingdom Hybrid / WFH Options
Block MB
integrate data best practices across teams Champion data quality, governance, and documentation Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt) Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake) Familiarity with streaming technologies (Kafka, Kinesis, etc. …
East London, London, United Kingdom Hybrid / WFH Options
Richard Wheeler Associates
lead developer of the company data pipelines and warehouses, based around Snowflake Completing the process to move legacy ETL pipelines from Python, SQL, PostgreSQL, Airflow to Snowflake ELT Owning data quality, availability and security Proactively analysing and improving the quality of our data products - including performance, scalability, maintainability, test …
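The ETL-to-ELT migration described above means landing raw data first and doing the transformation inside the warehouse with SQL. A minimal sketch using sqlite3 as a stand-in for Snowflake (table names and data are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a warehouse connection
cur = conn.cursor()

# Load: land raw data untransformed (the "EL" of ELT).
cur.execute("CREATE TABLE raw_trades (symbol TEXT, qty INTEGER, price REAL)")
cur.executemany(
    "INSERT INTO raw_trades VALUES (?, ?, ?)",
    [("GILT", 10, 98.5), ("GILT", -4, 99.0), ("BUND", 7, 101.2)],
)

# Transform: derive a modelled table inside the warehouse with SQL
# (the "T", which a tool like dbt would manage as versioned models).
cur.execute(
    """
    CREATE TABLE positions AS
    SELECT symbol, SUM(qty) AS net_qty
    FROM raw_trades
    GROUP BY symbol
    """
)
positions = dict(cur.execute("SELECT symbol, net_qty FROM positions ORDER BY symbol"))
```

The design point is that the transformation logic lives as SQL in the warehouse, so it can be versioned, tested, and re-run without re-extracting from source systems.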
London, South East England, United Kingdom Hybrid / WFH Options
Winston Fox
in the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt and Snowflake, as well as further develop their skills in Python, SQL and Market Data platforms. The firm works on a hybrid working schedule (three days …
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
skills for data transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience with data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Experience supporting and working with cross-functional teams in a dynamic …
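The transformation-and-cleaning skills requested above amount to steps like the following pandas sketch (messy input data and column names are illustrative):

```python
import pandas as pd

# Hypothetical messy extract: mixed-case categories, missing values, string numbers.
raw = pd.DataFrame({
    "region": ["north", "North", None, "south"],
    "revenue": ["100", "250", "80", None],
})

# Clean: normalise category casing, fill missing values, coerce numeric types.
clean = raw.assign(
    region=lambda d: d["region"].str.title().fillna("Unknown"),
    revenue=lambda d: pd.to_numeric(d["revenue"]).fillna(0.0),
)

# Transform: aggregate the cleaned data for loading downstream.
by_region = clean.groupby("region", as_index=False)["revenue"].sum()
```

In a managed workflow, a tool like Dagster or Airflow would run each of these stages as a separate, retryable step rather than one script.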
for data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark and Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes …
London, England, United Kingdom Hybrid / WFH Options
WA Consultants
brokers such as AWS SQS. *Own and evolve containerised deployment pipelines using Docker and CI/CD principles. *Develop and manage data pipelines with Apache Airflow, with data transformation using Python and Pandas. *Guide and mentor a team of engineers, setting high standards for clean code, testing, and …
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to …
London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow) BigQuery Cloud Storage Dataflow Pub/Sub Cloud Functions IAM Design and implement data models and ETL processes. Apply infrastructure-as-code …
London, South East England, United Kingdom Hybrid / WFH Options
Saragossa
SQL and experienced with various database technologies. Knowledge of Python or Java, with the ability to leverage either in building scalable solutions. Experience with Airflow & other big data technologies is useful. Familiarity with DevOps practices and tools, including CI/CD pipelines. Previous experience with reporting tools is helpful …
processes using infrastructure-as-code (Terraform) Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the …
work cross-functionally in an Agile environment Exposure to data product management principles (SLAs, contracts, ownership models) Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.) Exposure to real-time/streaming pipelines Understanding of information security best practices Familiarity with BI tools (QuickSight, Power BI …
practices and a strong engineering culture Take ownership of critical platform components, ensuring performance, reliability, and security Manage and optimise complex data pipelines using Airflow within a Google Cloud Platform environment Drive continuous improvements to architecture, infrastructure, and workflow automation Core Tech Stack: Must-have: Google Cloud Platform (GCP … Apache Airflow Nice-to-have: dbt, Terraform, Kubernetes Bonus: Familiarity or curiosity about generative AI tools (e.g. ChatGPT) Ideal Candidate: 4+ years experience in a Data/Platform engineering role Proven experience leading data engineering initiatives and mentoring engineering teams Strong understanding of cloud-based architecture and workflow …
City of London, London, United Kingdom Hybrid / WFH Options
Cathcart Technology
Data Scientist with Machine Learning experience ** Strong understanding and experience with ML models and ML observability tools ** Strong Python and SQL experience ** Spark/Apache Airflow ** ML framework experience (PyTorch/TensorFlow/Scikit-Learn) ** Experience with cloud platforms (preferably AWS) ** Experience with containerisation technologies Useful information …