experience with AWS data services (Redshift, S3, Glue, Lambda); Proficiency with Python for data processing and pipeline development; Experience with Airflow or similar workflow orchestration tools; Knowledge of streaming data technologies such as Kafka; Familiarity with infrastructure-as-code tools (Terraform, CloudFormation); Experience with version control systems (Git More ❯
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow, etc.). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a More ❯
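For context on what "building automated pipelines" with a tool like Airflow typically involves, here is a minimal sketch of a scheduled extract-transform-load DAG. It assumes Airflow 2.4+ (for the TaskFlow API and the `schedule` argument); the DAG name, schedule, and toy data are illustrative only, not details from the listing.

```python
# Minimal Airflow pipeline sketch (assumes Airflow 2.4+; names and data are illustrative).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling rows from a source system.
        return [{"id": 1, "value": 42}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder for cleaning / enriching the extracted rows.
        return [{**row, "value": row["value"] * 2} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing the transformed rows to a warehouse.
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))


example_etl()
```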
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
love to talk to you if: You’ve got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You’re comfortable working across cloud platforms – especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have More ❯
a data warehouse, ideally Snowflake Experience building and/or maintaining a CI/CD pipeline Experience using modern orchestration tooling e.g. Prefect, Luigi, Airflow Experience developing infrastructure in Terraform or a similar IAC tool Experience using Docker A positive and proactive attitude to problem solving A team player More ❯
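As an illustration of the orchestration tooling named in the listing above (Prefect, Luigi, Airflow), here is a minimal Prefect sketch. It assumes Prefect 2.x; the flow name, retry count, and placeholder tasks are illustrative only.

```python
# Minimal Prefect flow sketch (assumes Prefect 2.x; names and data are illustrative).
from prefect import flow, task


@task(retries=2)
def extract() -> list[int]:
    # Placeholder for reading from a source system.
    return [1, 2, 3]


@task
def load(values: list[int]) -> None:
    # Placeholder for writing into the warehouse (e.g. Snowflake).
    print(f"Would load {len(values)} values into the warehouse")


@flow
def nightly_refresh():
    load(extract())


if __name__ == "__main__":
    # Runs the flow locally; in production it would be deployed and scheduled.
    nightly_refresh()
```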
a related field, with a focus on building scalable data systems and platforms. Expertise in modern data tools and frameworks such as Spark, dbt, Airflow, Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency More ❯
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
You’ve led technical delivery of data engineering projects in a consultancy or client-facing environment You’re experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns – including Lakehouse and modern warehouse design (e.g. More ❯
scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process if someone's circumstances or timescales More ❯
Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda Data Governance & Quality - Collate & Monte Carlo Infrastructure as Code - Terraform Data Integration & Transformation - Python, DBT, Fivetran, Airflow CI/CD - Github Actions/Jenkins Nice to Have Experience Understanding of various data architecture paradigms (e.g., Data Lakehouse, Data Warehouse, Data Mesh More ❯
London, England, United Kingdom Hybrid / WFH Options
Our Future Health UK
will already have some of the following skills: Hands-on experience of working with open source data orchestration systems such as Dagster, Prefect, or Airflow Solid understanding of distributed compute engines such as Spark/Databricks Confidence using Docker, Kubernetes, and Helm in cloud environments Experience building software for More ❯
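For readers less familiar with the orchestration systems named above, here is a minimal Dagster sketch using software-defined assets. It assumes a recent Dagster 1.x release; the asset names and toy data are illustrative only.

```python
# Minimal Dagster asset sketch (assumes Dagster 1.x; names and data are illustrative).
from dagster import Definitions, asset


@asset
def raw_events() -> list[dict]:
    # Placeholder for ingesting raw event records.
    return [{"user": "a", "count": 3}]


@asset
def daily_event_counts(raw_events: list[dict]) -> dict:
    # Downstream asset: Dagster wires the dependency from the parameter name.
    return {row["user"]: row["count"] for row in raw_events}


# Code location definition that Dagster loads to discover the assets.
defs = Definitions(assets=[raw_events, daily_event_counts])
```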
Manchester, Cardiff, London, United Kingdom Hybrid / WFH Options
Starling Bank Limited
with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as More ❯
SQL, Azure Functions with Python, Azure Purview, and Cosmos DB. They are also proficient in Azure Event Hubs and Stream Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open source technologies like Apache Airflow and dbt, Spark/Python, or Spark/Scala. More ❯
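To illustrate the Kafka-plus-Spark combination referenced above, here is a minimal Spark Structured Streaming sketch that reads from a Kafka topic and writes Parquet files. It assumes PySpark with the spark-sql-kafka connector available; the broker address, topic name, and output paths are illustrative assumptions, not details from the listing.

```python
# Minimal Spark Structured Streaming sketch: Kafka in, Parquet out.
# Assumes the spark-sql-kafka connector is on the classpath; broker, topic,
# and paths below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka messages arrive as binary key/value; cast the payload to a string
# before any downstream parsing or transformation.
parsed = events.selectExpr("CAST(value AS STRING) AS payload")

query = (
    parsed.writeStream.format("parquet")
    .option("path", "/tmp/events")                        # hypothetical sink
    .option("checkpointLocation", "/tmp/events_checkpoint")
    .start()
)
query.awaitTermination()
```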
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
migration tasks Due to your seniority you will also be tasked with mentoring junior engineers Work extensively and proficiently with Snowflake, AWS, DBT, Terraform, Airflow, SQL, and Python. Requirements: 3+ years' data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT expertise Terraform experience Expert SQL and … Python Data Modelling Data Vault Apache Airflow My client has very limited interview slots and is looking to fill this vacancy ASAP. I have limited slots for 1st stage interviews next week so if you’re interested, get in touch ASAP with a copy of your most More ❯
London, England, United Kingdom Hybrid / WFH Options
Connexity
or a related field. Solid programming skills in both Python and SQL. Proven work experience in Google Cloud Platform or other clouds, developing batch (Apache Airflow) and streaming (Dataflow) scalable data pipelines. Experience processing large datasets at scale (BigQuery, Apache Druid, Elasticsearch). Familiarity with Terraform, DBT & Looker More ❯
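As a concrete example of the "streaming (Dataflow)" side of that stack, here is a minimal Apache Beam word-count sketch; pipelines like this can be submitted to Dataflow, and without extra pipeline options the code runs locally on the DirectRunner. The bucket paths are illustrative assumptions.

```python
# Minimal Apache Beam pipeline sketch (runs on the DirectRunner by default;
# submit with Dataflow pipeline options to run on GCP). Paths are hypothetical.
import apache_beam as beam


def run():
    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input.txt")
            | "SplitWords" >> beam.FlatMap(lambda line: line.split())
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "SumCounts" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda word, count: f"{word},{count}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output")
        )


if __name__ == "__main__":
    run()
```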
promptly, and maintain comprehensive data documentation. What You’ll Bring Technical Expertise: Proficiency in Python and SQL; experience with data processing frameworks such as Airflow, Spark, or TensorFlow. Data Engineering Fundamentals: Strong understanding of data architecture, data modelling, and scalable data solutions. Backend Development: Willingness to develop proficiency in … backend technologies (e.g., Python with Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation More ❯
Phase 2: Legacy Pipeline Migration (Months 2-3) Analyze and understand existing R-based data pipelines created by data scientists. Migrate these pipelines into Airflow, dbt, and Terraform workflows. Modernize and scale legacy infrastructure running on AWS. Collaborate with engineering teams to ensure a smooth transition and system stability. … beneficial for interpreting existing scripts) Cloud & Infrastructure: AWS services including Lambda, API Gateway, S3, CloudWatch, Kinesis Firehose Terraform for infrastructure as code Orchestration & Transformation: Apache Airflow dbt CRM & Marketing Tools: Braze (preferred) Familiarity with other CRM/marketing automation tools such as Iterable or Salesforce Marketing Cloud is More ❯
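To make the "migrate these pipelines into Airflow, dbt, and Terraform workflows" phase above more concrete, here is a minimal sketch of the common Airflow-plus-dbt pattern: a DAG in which an extract step is followed by a dbt build. It assumes Airflow 2.4+ with the dbt CLI installed on the worker; the script path, project directory, and schedule are illustrative assumptions, not details from the listing.

```python
# Minimal Airflow + dbt sketch (assumes Airflow 2.4+ and the dbt CLI on the worker).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="legacy_pipeline_migrated",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_source_data",
        bash_command="python /opt/pipelines/extract.py",  # hypothetical extract script
    )

    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/analytics",  # hypothetical dbt project
    )

    # Run the extract first, then build and test the dbt models.
    extract >> dbt_build
```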
is required for the role. Equally, proficiency in Python and SQL is essential, ideally with experience using data processing frameworks such as Kafka, NoSQL, Airflow, TensorFlow, or Spark. Finally, experience with cloud platforms like AWS or Azure, including data services such as Apache Airflow, Athena, or SageMaker More ❯
role. Equally, strong ML experience and proficiency in Python and SQL are essential, ideally with experience using data processing frameworks such as Kafka, NoSQL, Airflow, TensorFlow, or Spark. Finally, experience with cloud platforms like AWS or Azure, including data services such as Apache Airflow, Athena, or SageMaker More ❯