Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused: Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Apache NiFi, Airbyte, and Singer Message Brokers and streaming data processors like Apache Kafka Object Storage solutions such as S3 …
AWS preferred) Solid understanding of libraries like Pandas and NumPy Experience in data warehousing tools like Snowflake, Databricks, BigQuery Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools Commercial experience with performant database programming in SQL Capability to solve complex technical issues, comprehending risks prior to …
platforms (AWS, Azure, GCP) Experience building ETL/ELT pipelines specifically using DBT for structured and semi-structured datasets Any orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran, etc. Nice to have: Software engineering background Exposure to building or deploying AI/ML models …
skills in languages commonly used for data work (e.g., Python, Java, Scala) Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt) Proficiency with SQL and solid grounding in data modeling concepts Familiarity with cloud services and architectures (AWS, GCP, or Azure) Proven experience …
as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning. …
London, South East England, United Kingdom Hybrid / WFH Options
Block MB
integrate data best practices across teams Champion data quality, governance, and documentation Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt) Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake) Familiarity with streaming technologies (Kafka, Kinesis, etc.) …
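Several of the listings here pair Python with SQL for ETL work. A minimal, self-contained sketch of that pattern, using the stdlib sqlite3 module in place of a real warehouse; the table and column names are hypothetical placeholders, not from any listing:

```python
import sqlite3

# Hypothetical cleaned rows ready for the "load" step of an ETL job.
rows = [("2024-01-01", 10.5), ("2024-01-02", 3.0)]

# An in-memory database stands in for a warehouse connection.
conn = sqlite3.connect(":memory:")
conn.execute("create table daily_revenue (day text, amount real)")
conn.executemany("insert into daily_revenue values (?, ?)", rows)

# A simple SQL aggregation over the loaded data.
total = conn.execute("select sum(amount) from daily_revenue").fetchone()[0]
print(total)  # 13.5
```

The same insert/aggregate shape carries over to warehouse clients like Snowflake or BigQuery, with the connection object swapped out.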
Have touched cloud services (AWS, GCP, etc.) Enjoy solving problems, learning fast, and working with good people Bonus points if you’ve played with Airflow, Docker, or big data tools — but they’re more interested in mindset than buzzwords. The team’s based onsite in London — they’re collaborative …
platforms (AWS preferred) and Infrastructure-as-Code tools. Solid understanding of relational databases and SQL. Proven track record building robust ETL pipelines, ideally using Airflow or a similar tool. Familiarity with best practices in software engineering: version control, testing, packaging, and code reviews. Quantitative problem-solving skills with an …
London, South East England, United Kingdom Hybrid / WFH Options
Winston Fox
in the Data space. This role will also allow the successful individual to cross-train into modern Data Engineering tools and technologies such as Airflow, dbt, and Snowflake, as well as further develop their skills in Python, SQL, and Market Data platforms. The firm works on a hybrid working schedule (three days …
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
skills for data transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Experience supporting and working with cross-functional teams in a dynamic …
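The Pandas transformation-and-cleaning skills this listing asks for can be sketched roughly as follows; the column names and cleaning rules are illustrative placeholders, not taken from any real pipeline:

```python
import pandas as pd

# Hypothetical raw extract with a duplicate row and a missing value.
raw = pd.DataFrame({
    "user_id": [1, 2, 2, 3],
    "amount": ["10.5", "3.0", "3.0", None],
})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate, coerce the amount column to numeric, drop unusable rows."""
    out = df.drop_duplicates()
    out = out.assign(amount=pd.to_numeric(out["amount"], errors="coerce"))
    return out.dropna(subset=["amount"]).reset_index(drop=True)

clean = transform(raw)
print(len(clean))  # rows surviving deduplication and null-dropping
```

In an orchestrated setup, a function like `transform` would typically be wrapped as a Dagster op or an Airflow task rather than called inline.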
Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused: Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte, and Singer. Message Brokers and streaming data processors such as Apache Kafka. Object Storage such as S3, MinIO, LakeFS …
and storage. Strong programming skills in Python, Java, or Scala. Proficiency in SQL, NoSQL, and time-series databases. Knowledge of orchestration tools (Apache Airflow, Kubernetes). If you are a passionate and experienced Senior Data Engineer seeking a Lead role, or a Lead Data Engineer aiming …
London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code …
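A minimal Cloud Composer-style DAG sketching the orchestration pattern this listing describes; the DAG id, task names, and schedule are hypothetical placeholders, and it assumes Airflow 2.x. This is a config-style fragment rather than a standalone script:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from a source system (e.g., Cloud Storage).
    ...

def load():
    # Placeholder: write transformed data to a warehouse (e.g., BigQuery).
    ...

with DAG(
    dag_id="example_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must finish before load runs
```

Composer schedules and retries the tasks; the `>>` operator declares the dependency edge between them.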
TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and …
and flexible systems. Influence opinion and decision-making across AI and ML Skills: Python SQL/Pandas/Snowflake/Elasticsearch Docker/Kubernetes Airflow/Spark Familiarity with GenAI models/libraries Requirements: 6+ years of relevant software engineering experience post-graduation A degree (ideally a Master’s …
London, South East England, United Kingdom Hybrid / WFH Options
Saragossa
SQL and experienced with various database technologies. Knowledge of Python or Java, with the ability to leverage either in building scalable solutions. Experience with Airflow & other big data technologies is useful. Familiarity with DevOps practices and tools, including CI/CD pipelines. Previous experience with reporting tools is helpful …
multi-step pipeline) About you: Strong Python Experience building complex data transformation pipelines Experience with Databricks at scale, preferably experience with Iceberg Experience with Airflow or Dagster Experience with AWS & open source technologies on top of dataweave Desirable - medical data exposure is useful, but video/image data, not …
Pandas, Matplotlib). Comfortable deploying applications in cloud ecosystems, particularly AWS, and using infrastructure automation tools. Experienced with building data workflows and managing databases; Airflow and SQL are especially valued. A pragmatic engineer who follows clean development practices—version control, testing, packaging, etc. Analytical and curious, with a background …
experience. Experience building ETL pipelines using Python. Experience of SQL and relational databases. Experience with AWS or similar Cloud technology. Experience with S3, Kafka, Airflow, and Iceberg will be beneficial. Experience in the financial markets with a focus on securities & derivatives trading. Exceptional communication skills, attention to detail, and …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
functional teams Tech You'll Work With ML & Data Science Python (primary language) TensorFlow, PyTorch, or Keras NumPy, pandas Data pipelines (Azure Data Factory, Airflow, etc.) Applied ML: NLP, CV, transformers, GANs, time series, etc. Engineering & Cloud Azure (or similar cloud platforms like AWS, GCP) Microservices and event-driven …
Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't be daunting …
analysis, platform selection, technical architecture design, application design and development, testing, and deployment. Leveraging your proficiency in tools such as Snowflake, DBT, Glue, and Airflow, you will help define the technical strategy and ensure scalable, high-performing data architecture. Your Profile Essential skills/knowledge/experience: Extensive experience …
field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're …
London, South East England, United Kingdom Hybrid / WFH Options
Realtime Recruitment
for smooth deployments, ensuring platform stability and scalability through upgrades, and managing the data analytics platform. Responsibilities: Utilize modern data platform tools (dbt Core, Airflow, Airbyte, Snowflake, etc.). Collaborate with DevOps, Data Engineering, Infrastructure, and InfoSec for seamless application integration. Design, implement, and maintain scalable cloud data platform …