navigate client relationships and translate technical insights into business value. Experience with cloud platforms (e.g., Snowflake, AWS) and ETL/ELT pipeline tools such as Airflow and dbt. Benefits: £6,000 per annum training & conference budget to help you up-skill and elevate your career; pension contribution scheme (up to …
architectures and the CAP theorem. A good understanding of functional paradigms and type theory. Confident JVM knowledge. Modern Java, Ruby, or Clojure knowledge. Experience with Airflow or other Python-based workflow orchestration tools. Exposure to Kubernetes, Docker, Linux, Kafka, RabbitMQ, or Git. Knowledge of financial concepts, exchange trading, or physical …
maintaining modular, scalable data models that follow best practices. Strong understanding of dimensional modelling. Familiarity with AWS data services (S3, Athena, Glue). Experience with Airflow for scheduling and orchestrating workflows. Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery). A pragmatic problem solver who can balance …
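As a rough illustration of the Airflow scheduling and orchestration the listing above asks for, here is a minimal DAG sketch; the DAG id, schedule, and task callables are all invented for illustration:

```python
# Minimal Airflow DAG sketch: a daily extract -> transform dependency chain.
# All names (dag_id, task ids, callables) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting")


def transform():
    # Placeholder: build dimensional models from the raw extract.
    print("transforming")


with DAG(
    dag_id="example_dimensional_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # "schedule" needs Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # transform runs only after extract succeeds
```

The `>>` operator is how Airflow expresses task dependencies, which is the heart of the "scheduling and orchestrating workflows" requirement.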
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; dbt expertise; Terraform experience. Nice to have: data modelling, Data Vault, Apache Airflow. Benefits: up to 10% bonus, up to 14% pension contribution, 29 days annual leave + bank holidays, free company shares. Interviews ongoing …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
Have touched cloud services (AWS, GCP, etc.). Enjoy solving problems, learning fast, and working with good people. Bonus points if you’ve played with Airflow, Docker, or big data tools — but they’re more interested in mindset than buzzwords. The team’s based onsite in London — they’re collaborative …
Python, R, and Java. Experience scaling machine learning on data and compute grids. Proficiency with Kubernetes, Docker, Linux, and cloud computing. Experience with Dask, Airflow, and MLflow. MLOps, CI, Git, and Agile processes. Why you should not miss this career opportunity: we are a mission-driven firm …
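For the MLflow requirement in the listing above, a minimal experiment-tracking sketch; the experiment name, parameter, and metric values are all hypothetical:

```python
# Minimal MLflow tracking sketch: log a parameter and a metric for one run.
import mlflow

mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("learning_rate", 0.01)  # a hyperparameter for this run
    mlflow.log_metric("rmse", 0.42)          # an evaluation result for this run
```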
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Experience supporting and working with cross-functional teams in a dynamic …
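As a small sketch of the Pandas-based transformation, cleaning, and loading this listing describes; the file paths and column names are invented:

```python
# Minimal Pandas clean-transform-load sketch.
# "raw_orders.csv" and all column names are hypothetical.
import pandas as pd

df = pd.read_csv("raw_orders.csv")

df = (
    df.drop_duplicates(subset=["order_id"])  # remove repeated records
      .assign(order_date=lambda d: pd.to_datetime(d["order_date"]))  # normalise types
      .dropna(subset=["customer_id"])        # drop rows missing a key field
)

# Load step: write the cleaned output (requires pyarrow or fastparquet).
df.to_parquet("clean_orders.parquet", index=False)
```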
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
in Python. Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift, or BigQuery. Hands-on experience with data orchestrators such as Airflow. Knowledge of Agile development methodologies. Awareness of cloud technology, particularly AWS. Knowledge of automated delivery processes. Hands-on experience of best engineering practices (handling …
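For the Snowflake warehousing mentioned above, a minimal connect-and-query sketch using the snowflake-connector-python package; the account, credentials, and object names are placeholders:

```python
# Minimal Snowflake connector sketch: open a session and run one query.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",     # placeholder account identifier
    user="example_user",
    password="example_password",   # in practice, use a secrets manager
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_DATE()")
    print(cur.fetchone())
finally:
    conn.close()
```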
for data processing, analysis, and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, and Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform, and process large volumes …
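As a sketch of the kind of batch Spark-on-Parquet pipeline the listing above refers to; the bucket paths and column names are invented:

```python
# Minimal PySpark batch ETL sketch: read Parquet, aggregate, write back out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

events = spark.read.parquet("s3://example-bucket/raw/events/")  # ingest

daily = (
    events
    .filter(F.col("event_type") == "purchase")  # transform
    .groupBy("event_date")
    .agg(F.count("*").alias("purchases"))
)

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_purchases/")
```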
environments (e.g., Snowflake). Proficiency in SQL and at least one scripting language (e.g., Python, Bash). Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data quality principles and best practices. Excellent communication and collaboration skills. Working with AWS, Twilio Segment, and Google Analytics. …
or Java; comfortable reviewing code, guiding design decisions, and supporting technical excellence. Big Data Expertise: in-depth experience with tools like Kafka, Flink, dbt, Airflow, and Airbyte, and a solid understanding of building and operating modern data ecosystems. Cloud Experience: hands-on experience with AWS, GCP, or both; comfortable …
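For the Kafka side of the stack described above, a minimal consumer sketch using the confluent-kafka client; the broker address, group id, and topic are placeholders:

```python
# Minimal Kafka consumer sketch: read messages from one topic.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "analytics-sketch",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])  # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1s for the next message
        if msg is None:
            continue
        if msg.error():
            print(msg.error())
            continue
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```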
develop and deploy feature engineering and modelling applications to data platforms built on Databricks or similar platforms and platform components (e.g., Snowflake, MLflow, Airflow, etc.). Demonstrated experience in using Azure-based cloud applications, services, and infrastructure, or significant, transferable experience with other cloud providers (e.g., AWS or …
Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda. Data Governance & Quality - Collate & Monte Carlo. Infrastructure as Code - Terraform. Data Integration & Transformation - Python, dbt, Fivetran, Airflow. CI/CD - GitHub Actions/Jenkins. Nice to Have: understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh …
technical leadership role on projects and experience with transitioning projects into a support program. Experience with Google Cloud Platform (GCP) services, including Cloud Composer (Apache Airflow) for workflow orchestration. Strong experience in Python, with demonstrable experience in developing and maintaining data pipelines and automating data workflows. Proficiency in … e.g., Git). Strong expertise in Python, with a particular focus on libraries and tools commonly used in data engineering, such as Pandas, NumPy, and Apache Airflow. Experience with data pipelines, ELT/ETL processes, and data wrangling. Dashboard analytics (Power BI, Looker Studio, or Tableau) experience. Excellent English, written and …
is required for the role. Equally, proficiency in Python and SQL is essential, ideally with experience using data processing frameworks such as Kafka, NoSQL, Airflow, TensorFlow, or Spark. Finally, experience with cloud platforms like AWS or Azure, including data services such as Apache Airflow, Athena, or SageMaker …
and storage. Strong programming skills in Python, Java, or Scala. Proficiency in SQL, NoSQL, and time-series databases. Knowledge of orchestration tools (Apache Airflow, Kubernetes). If you are a passionate and experienced Senior Data Engineer seeking a Lead role, or a Lead Data Engineer aiming …
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to …
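As a minimal sketch of querying Amazon Athena from Python with boto3, in the spirit of the Athena work this listing mentions; the database, query, and bucket names are hypothetical:

```python
# Minimal Athena sketch: submit a query and direct results to S3.
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

response = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) FROM events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics"},  # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
print(response["QueryExecutionId"])  # poll this id to track query completion
```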
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code …
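For the BigQuery piece of the GCP stack above, a minimal query sketch using the google-cloud-bigquery client; the project, dataset, and table names are placeholders:

```python
# Minimal BigQuery sketch: run a query and iterate over the results.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():  # result() blocks until the job finishes
    print(row.event_date, row.events)
```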