analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding …
London (City of London), South East England, United Kingdom
HCLTech
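To illustrate the "custom Python operators and DAGs" work this posting describes, here is a minimal Apache Airflow sketch. The operator, table name, and schedule are hypothetical, and parameter names assume the Airflow 2.x API:

```python
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator


class SnowflakeRowCountOperator(BaseOperator):
    """Hypothetical custom operator: logs the row count of a Snowflake table."""

    def __init__(self, table: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        # A real operator would use SnowflakeHook to run the query;
        # this sketch only logs the intent.
        self.log.info("Would count rows in %s", self.table)


with DAG(
    dag_id="example_snowflake_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    count_orders = SnowflakeRowCountOperator(
        task_id="count_orders",
        table="analytics.orders",  # made-up table
    )
```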
you'll have the opportunity to grow the data function and step into a leadership role. The Role: Designing and building robust ETL pipelines using tools like dbt or Apache Airflow Integrating data from APIs, databases, and SaaS platforms into BigQuery Structuring clean, queryable data models to support analytics and reporting Collaborating with analysts to deliver insightful dashboards … via Looker Establishing data governance and quality processes Requirements: GCP (BigQuery), but open to other cloud backgrounds ETL: dbt, Apache Airflow, or similar BI: Looker (preferred), or other BI tools Languages: SQL, Python, Java Experienced data engineer, with strong ETL and cloud data warehouse experience Proficiency in SQL and data modelling best practices Experience with BI tools and …
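As a sketch of the "integrating data from APIs into BigQuery" task above, assuming the google-cloud-bigquery client library with credentials taken from the environment; the project, dataset, and table names are made up:

```python
from google.cloud import bigquery

client = bigquery.Client()

# A small batch of records, as if pulled from an upstream API.
rows = [
    {"order_id": 1, "amount": 12.50},
    {"order_id": 2, "amount": 7.25},
]

# load_table_from_json streams the records into a (hypothetical) staging table.
job = client.load_table_from_json(rows, "my_project.staging.orders")
job.result()  # wait for the load job to finish

# Downstream models can then query the staged data.
query = client.query("SELECT COUNT(*) AS n FROM `my_project.staging.orders`")
print(list(query.result())[0].n)
```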
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and …
London (City of London), South East England, United Kingdom
HCLTech
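A minimal sketch of the Glue Data Catalog work this role mentions, reading table metadata with boto3; the database name is hypothetical and credentials/region are assumed to come from the environment:

```python
import boto3

glue = boto3.client("glue")

# List tables registered in the Glue Data Catalog for one (made-up) database.
paginator = glue.get_paginator("get_tables")
for page in paginator.paginate(DatabaseName="analytics"):
    for table in page["TableList"]:
        # StorageDescriptor can be absent for some table types, so read defensively.
        cols = [c["Name"] for c in table.get("StorageDescriptor", {}).get("Columns", [])]
        print(table["Name"], cols)
```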
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity …
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
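A minimal web-scraping sketch of the kind this posting describes, using requests and BeautifulSoup; the URL and CSS selector are hypothetical, and a production pipeline would respect robots.txt and add retries and rate limiting:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # made-up target

resp = requests.get(URL, timeout=10)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
# Extract one field per item matching a hypothetical selector.
records = [
    {"name": item.get_text(strip=True)}
    for item in soup.select("h2.product-name")
]
print(records)
```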
data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Prefect, or similar). Benefits: Unlimited holiday Annual Wellbeing Allowance Flexible work culture Monthly socials and events Complimentary snack bar Employer pension contribution If you're a data …
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
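As an illustration of pipeline orchestration with Prefect, one of the tools named in the posting above; task and flow names are hypothetical and assume the Prefect 2.x API:

```python
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    # Hypothetical source; a real task would call an API or database.
    return [{"sku": "A1", "qty": 3}, {"sku": "B2", "qty": 0}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Drop zero-quantity rows as a stand-in for real cleaning logic.
    return [r for r in rows if r["qty"] > 0]


@flow
def retail_pipeline():
    rows = extract()
    clean = transform(rows)
    print(f"Loaded {len(clean)} rows")


if __name__ == "__main__":
    retail_pipeline()
```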
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, Webhooks, GraphQL). Familiarity with authentication protocols …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
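A minimal sketch of receiving a webhook event, one of the integration techniques this role lists; FastAPI is an assumed framework choice, and the endpoint path and payload shape are hypothetical:

```python
from fastapi import FastAPI, Request

app = FastAPI()


@app.post("/webhooks/orders")  # hypothetical endpoint
async def receive_order_event(request: Request):
    payload = await request.json()
    # A real integration would verify a signature header and enqueue
    # the event for asynchronous processing; here we just acknowledge it.
    return {"received": True, "event_id": payload.get("id")}
```

Run with, e.g., `uvicorn app:app`; the acknowledgement-then-enqueue pattern keeps webhook responses fast under load.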
improvement and operational excellence. Deep expertise in data compliance frameworks, cost management, and platform optimisation. Strong hands-on experience with modern cloud data warehouses (Databricks, Snowflake, AWS), SQL, Spark, Airflow, Terraform. Advanced Python skills with orchestration tooling; solid experience in CI/CD (Git, Jenkins). Proven track record in data modelling, batch/real-time integration, and large …
of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark Successful track history in a Data Engineering role including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), Data Pipelines (big data technologies and architectures) Experience in Financial Services (ideally Commodity) Trading. Bachelor's degree in Information Systems …
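A minimal PySpark sketch of the kind of trade-data ingestion this posting describes; the S3 paths, column names, and aggregation are made up for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade-ingestion").getOrCreate()

# Hypothetical raw trade events landed as JSON.
trades = spark.read.json("s3://example-bucket/raw/trades/")

# Aggregate traded quantity per day and commodity.
daily_volume = (
    trades
    .withColumn("trade_date", F.to_date("executed_at"))
    .groupBy("trade_date", "commodity")
    .agg(F.sum("quantity").alias("total_quantity"))
)

daily_volume.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_volume/")
```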
and scalability Contribute to the overall data strategy and architecture 🔹 Tech Stack You’ll be working with: Programming: Python, SQL, Scala/Java Big Data: Spark, Hadoop, Databricks Pipelines: Airflow, Kafka, ETL tools Cloud: AWS, GCP, or Azure (Glue, Redshift, BigQuery, Snowflake) Data Modelling & Warehousing 🔹 What’s on Offer 💷 £80,000pa (Permanent role) 📍 Hybrid – 2 days per week in …
London (City of London), South East England, United Kingdom
Roc Search
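As a sketch of consuming events from Kafka, part of the pipeline stack listed above, assuming the kafka-python client; the topic, broker address, and consumer group are hypothetical:

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                              # made-up topic
    bootstrap_servers="localhost:9092",    # made-up broker
    group_id="analytics-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # A real pipeline would batch these records into the warehouse.
    print(message.topic, message.offset, message.value)
```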
role 60% hands-on technically/40% hands-off leadership and strategy Proven experience designing scalable data architectures and pipelines Strong Python, SQL, and experience with tools such as Airflow, dbt, and Spark Cloud expertise (AWS preferred), with Docker/Terraform A track record of delivering in fast-paced, scale-up environments Nice to have: Experience with streaming pipelines …
Employment Type: Full-Time
Salary: £110,000 - £120,000 per annum, Inc benefits
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
teams Strong technical background (5+ years) in building scalable data platforms Excellent communication and stakeholder management skills Hands-on experience with modern data tools and technologies — Python, SQL, Snowflake, Airflow, dbt, Spark, AWS, Terraform A collaborative mindset and a passion for mentoring and developing others Comfortable balancing technical decisions with business needs Nice to have: experience with Data Mesh …
uphold data quality, security, and governance standards. Collaborate with teams to establish KPIs and core business metrics. Innovation & Future Tools (10%) Explore and implement new tools (e.g. dbt, Fivetran, Airflow) to enhance data capabilities. Stay current with evolving trends in data engineering and BI. What They’re Looking For Technical Experience 7+ years’ experience across data engineering, analytics, or …
London (City of London), South East England, United Kingdom
TGS International Group
knowledge of Python and SQL (experience with Snowflake highly desirable) Knowledge of BI tools such as Superset, Tableau, PowerBI or similar is desirable Knowledge of orchestration tools such as Airflow, dbt or Google Cloud Dataflow is a bonus Analytical and problem-solving skills, with a deep curiosity for fraud detection Excellent attention to detail to ensure quality of project …
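A minimal sketch of the Python-plus-Snowflake combination this fraud role asks for, using the snowflake-connector-python library; the account details, table, columns, and threshold are made up for illustration:

```python
import os

import snowflake.connector

# Credentials via environment variables; warehouse/database/schema are hypothetical.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="FRAUD",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Flag accounts with an unusually high transaction count (made-up rule).
    cur.execute(
        """
        SELECT account_id, COUNT(*) AS txn_count
        FROM transactions
        GROUP BY account_id
        HAVING COUNT(*) > 100
        """
    )
    for account_id, txn_count in cur.fetchall():
        print(account_id, txn_count)
finally:
    conn.close()
```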