communication skills - able to collaborate across technical and business functions. A passion for data-driven problem solving, innovation, and continuous improvement. Nice-to-haves: Experience with data orchestration tools (Airflow, Prefect). Knowledge of data observability or cataloguing tools (Monte Carlo, OpenMetadata). Familiarity with large-scale consumer data or survey environments.
perform optimally Tech Stack: Cloud: AWS Infrastructure/Access Management: Terraform Data Platform: Snowflake Data Integration Tool: Fivetran Data Transformation Tool: dbt Core Data Orchestration Tool: MWAA (Airflow managed by AWS) CI/CD Pipelines: GitHub Actions Programming Languages: SQL, Python and Terraform For more information on how we process your personal data, please refer to HCLTech
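The Fivetran → Snowflake → dbt pattern in the stack above boils down to landing raw data in the warehouse and materialising modelled tables with SQL. A minimal stdlib-only sketch of that transformation step, with sqlite3 standing in for Snowflake and hypothetical table and column names:

```python
import sqlite3

# In-memory SQLite stands in for the Snowflake warehouse; raw_orders
# plays the role of a table landed by Fivetran.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "complete"), (2, 35.5, "complete"), (3, 99.9, "cancelled")],
)

# A dbt model is essentially a SELECT materialised as a table or view;
# this mirrors a simple staging model filtering to completed orders.
conn.execute("""
    CREATE TABLE stg_completed_orders AS
    SELECT id, amount FROM raw_orders WHERE status = 'complete'
""")

total = conn.execute("SELECT SUM(amount) FROM stg_completed_orders").fetchone()[0]
print(total)  # 155.5
```

In the real stack, MWAA would schedule this as a `dbt run` task rather than ad-hoc SQL.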
CD. Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent
London, England, United Kingdom Hybrid / WFH Options
Harnham
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS
business • Expert knowledge of modern cloud data platforms (Databricks, Snowflake, ideally AWS) • Advanced Python programming skills and fluency with the wider Python data toolkit • Strong capability with SQL, Spark, Airflow, Terraform, and workflow orchestration tools • Solid understanding of CI/CD practices using Git, Jenkins, or equivalent tooling • Hands-on experience building both batch and streaming data integration pipelines • Depth in
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery . Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration — translating complex technical ideas into
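Streaming stacks like the Kafka/Spark Streaming/Flink tools above centre on aggregating unbounded event streams in fixed time windows. A toy pure-Python tumbling-window count, with a hypothetical event schema, illustrating what those engines do at scale:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Count (timestamp, key) events in fixed, non-overlapping windows.

    A stdlib-only illustration of tumbling windows; real streaming
    engines additionally handle late data, state, and checkpointing.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # bucket start time
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 5))
# {(0, 'click'): 2, (5, 'view'): 1, (10, 'click'): 1}
```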
software development lifecycle, from conception to deployment. Capable of conceptualizing and implementing software architectures spanning multiple technologies and platforms. Technology stack Python Flask Java Spring JavaScript BigQuery Redis ElasticSearch Airflow Google Cloud Platform Kubernetes Docker Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates
proficiency in SQL and experience with at least one programming language such as Python, Java, etc. Comprehensive understanding of data engineering tooling, including but not limited to BigQuery, Kafka, Airflow, Airbyte, and DataHub. Great stakeholder management, communication, and teamwork skills. What's in it for you? A range of flexible working options to dedicate time to what matters to
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
improvement and operational excellence. Deep expertise in data compliance frameworks, cost management, and platform optimisation. Strong hands-on experience with modern cloud data warehouses (Databricks, Snowflake, AWS), SQL, Spark, Airflow, Terraform. Advanced Python skills with orchestration tooling; solid experience in CI/CD (Git, Jenkins). Proven track record in data modelling, batch/real-time integration, and large
variety of tool sets and data sources. Data Architecture experience with and understanding of data lakes, warehouses, and/or streaming platforms. Data Engineering experience with tooling, such as Apache Spark and Kafka, and orchestration tools like Apache Airflow or equivalent. Continuous Integration/Continuous Deployment experience with CI/CD tools like Jenkins or GitLab tailored
streaming architectures, to support advanced analytics, AI, and business intelligence use cases. Proven experience in designing architectures for structured, semi-structured, and unstructured data, leveraging technologies like Databricks, Snowflake, Apache Kafka, and Delta Lake to enable seamless data processing and analytics. Hands-on experience in data integration, including designing and optimising data pipelines (batch and streaming) and integrating cloud-based platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery) with legacy systems, ensuring performance and scalability. Deep knowledge of ETL/ELT processes, leveraging tools like Apache Airflow, dbt, or Informatica, with a focus on ensuring data quality, lineage, and integrity across the data lifecycle. Practical expertise in data and AI governance, including implementing frameworks for data privacy
Greater London, England, United Kingdom Hybrid / WFH Options
FENESTRA
Lead code reviews and technical design discussions, providing expert guidance Optimize systems for speed, scalability, and reliability through sophisticated engineering approaches Oversee the evolution of our data infrastructure including Airflow/Google Cloud Composer environment Apply rigorous system design thinking to create elegant, efficient technical solutions Commercial & Client Engagement Represent Fenestra's technical capabilities to clients, partners, and at … production-quality code Very deep proficiency in Python and modern software engineering practices Advanced SQL knowledge with experience in data modelling and optimization Extensive experience with ETL frameworks, particularly Apache Airflow (3+ years) Strong understanding of CI/CD, Infrastructure as Code (Terraform), and containerization Experience designing and optimizing big data processing systems particularly using BigQuery/GCP
Candidates must hold active SC Clearance. £75,000 - £85,000 Remote - with occasional client visits Skills: Python and SQL. Apache Airflow for DAG orchestration and monitoring. Docker for containerisation. AWS data services: Redshift, OpenSearch, Lambda, Glue, Step Functions, Batch. CI/CD pipelines and YAML-based configuration
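"DAG orchestration" in roles like the one above means resolving task dependencies so each step runs only after its upstreams succeed. A stdlib-only toy of that ordering logic, with hypothetical task names; Airflow resolves its DAGs the same way, plus scheduling, retries, and monitoring:

```python
from graphlib import TopologicalSorter

# Toy stand-in for an Airflow DAG: each task maps to the set of
# upstream tasks that must complete before it can run.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

# static_order yields tasks so every upstream precedes its downstreams.
run_order = list(TopologicalSorter(dag).static_order())
print(run_order)  # ['extract', 'transform', 'load', 'notify']
```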
databases, data warehousing, and big data (Hadoop, Spark). Proficient in Python, Java, or Scala with solid OOP and design pattern understanding. Expertise in ETL tools and orchestration frameworks (Airflow, Apache NiFi). Hands-on experience with cloud platforms (AWS, Azure, or GCP) and their data services.
SQL scripting. Proficiency in AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch. Strong programming skills in Python or Scala for data processing. Experience with orchestration tools like Apache Airflow or AWS Step Functions. Familiarity with version control systems (e.g., Git) and CI/CD pipelines. We are an equal opportunity employer. All aspects of employment including
in Java/Scala Cloud (AWS preferred) + Infrastructure-as-Code familiarity Solid understanding of ETL/ELT and data modeling Nice to have: Iceberg/Delta/Hudi, Airflow/Dagster/Prefect, Python/TypeScript, data governance.
/frameworks (pandas, numpy, sklearn, TensorFlow, PyTorch) Strong understanding and experience in implementing end-to-end ML pipelines (data, training, validation, serving) Experience with ML workflow orchestration tools (e.g., Airflow, Prefect, Kubeflow) and ML feature or data platforms (e.g., Tecton, Databricks, etc.) Experience with cloud platforms (AWS, GCP/Vertex, Azure), Docker, and Kubernetes Solid coding practices (Git, automated
e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL , and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused