solutions. Mentor junior engineers and contribute to technical reviews. Requirements: 7+ years in software engineering, including 4+ years in data engineering. Strong experience with data frameworks (e.g., Spark, Kafka, Airflow) and ETL workflows. Proficiency with SQL and NoSQL databases, including query optimization. Experience with cloud data services (e.g., AWS Redshift, S3, Glue, EMR) and CI/CD for data More ❯
Stockport, England, United Kingdom Hybrid / WFH Options
Gravitas Recruitment Group (Global) Ltd
and other squads to ensure smooth releases and integration. Key Skills: Data Modelling; Python & SQL; AWS/Redshift; 3–5+ years of experience in data engineering. Nice to Have: Airflow, Tableau, Power BI, Snowflake, Databricks; data governance/data quality tooling; degree preferred; Atlassian/Jira, CI/CD, Terraform. Why Join? Career Growth: Clear progression to Tech Lead. More ❯
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
with product managers, data & AI engineers to deliver solutions to major business problems, including Generative AI applications What we’re looking for: Hands-on expertise in Python, SQL, Spark, Airflow, Terraform Proven experience with cloud-native data platforms (AWS, Databricks, Snowflake) Strong track record of mentoring or coaching other engineers Knowledge of CI/CD tooling (Git, Jenkins or More ❯
City of London, London, United Kingdom Hybrid / WFH Options
twentyAI
AWS/GCP/Azure/Snowflake) Passion for data quality, scalability, and collaboration Nice to have: Experience with SaaS products, analytics tooling, or modern data stack tools (dbt, Airflow). Perks: EMI share options • Training budget • Private healthcare • Pension • 25 days holiday + bank holidays • Central London office & socials • Work abroad up to 1 month/year Why More ❯
models and ETL processes. Minimum Requirements: 5+ years' experience working within AWS technologies (S3, Redshift, RDS, etc.). 3+ years' experience working with dbt. 3+ years of experience with orchestration tooling (Airflow, Prefect, Dagster). Strong programming skills in languages such as Python or Java. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka, Delta Lake, Iceberg, Arrow, Data Fusion). Familiarity with data More ❯
Lead Data Engineer in a fast-scaling tech or data-driven environment Strong proficiency in Python (or Scala/Java) and SQL Deep experience with data pipeline orchestration tools (Airflow, dbt, Dagster, Prefect) Strong knowledge of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift) Hands-on experience with streaming technologies (Kafka, Kinesis, or similar More ❯
platform. What You’ll Get A key engineering role within a world-class technology organisation that values innovation and impact. Exposure to a modern data ecosystem including Iceberg, Kafka, Airflow, and other open-source technologies. A collaborative, intellectually curious culture where engineers are trusted to take ownership and drive outcomes. Excellent compensation package with strong performance incentives. What You More ❯
will lead analytics engineering teams who design, build, and enhance essential data platforms and products. You will: Architect and optimize data pipelines and workflows using modern tools (e.g., dbt, Airflow, SQL, Python). Take ownership of large-scale analytics platforms and data warehouse solutions (such as Snowflake, BigQuery, Redshift). Mentor and guide engineers across analytics and data engineering More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication More ❯
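As a rough illustration of the transformation logic this role describes (cleaning, validating and enriching data, handling missing values, standardising formats and removing duplicates), a minimal pandas sketch is shown below. The column names, rules and function are invented for the example and are not taken from the employer's pipeline.

```python
# Hypothetical illustration only: a minimal pandas transformation step of the kind
# described above. Column names and rules are invented for the example.
import pandas as pd


def clean_client_records(df: pd.DataFrame) -> pd.DataFrame:
    """Clean, validate and enrich a raw client extract before loading."""
    out = df.copy()

    # Standardise formats: trim whitespace and lower-case email addresses.
    out["email"] = out["email"].str.strip().str.lower()

    # Handle missing values: fill absent country codes with a sentinel, and drop
    # rows that lack the mandatory client identifier.
    out["country_code"] = out["country_code"].fillna("UNKNOWN")
    out = out.dropna(subset=["client_id"])

    # De-duplicate on the business key, keeping the most recent record.
    out = out.sort_values("updated_at").drop_duplicates(subset="client_id", keep="last")

    # Enrich: derive a simple signup-year column for downstream reporting.
    out["signup_year"] = pd.to_datetime(out["signup_date"]).dt.year
    return out
```

In an Airflow, Prefect or Dagster deployment, a function like this would typically run as one task within the wider ETL workflow.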
communication skills - able to collaborate across technical and business functions. A passion for data-driven problem solving, innovation, and continuous improvement. Nice-to-haves: Experience with data orchestration tools (Airflow, Prefect). Knowledge of data observability or cataloguing tools (Monte Carlo, OpenMetadata). Familiarity with large-scale consumer data or survey environments. More ❯
perform optimally. Tech Stack: Cloud: on AWS; Infrastructure/Access Management: using Terraform; Data Platform: Snowflake; Data Integration Tool: Fivetran; Data Transformation Tool: dbt Core; Data Orchestration Tool: MWAA (Airflow managed by AWS); CI/CD Pipelines: GitHub Actions; Programming Languages: SQL, Python and Terraform. For more information on how we process your personal data, please refer to HCLTech More ❯
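For context on how the pieces of a stack like this typically fit together, below is a minimal, hypothetical Airflow DAG of the kind MWAA would schedule: it runs a dbt Core build against the Snowflake warehouse once Fivetran has landed the raw data. The project path, schedule and task names are assumptions for illustration, not details of this client's environment.

```python
# Hypothetical sketch only: a minimal Airflow 2.x DAG that orchestrates a dbt Core
# run and test after ingestion. Paths and schedule are assumptions for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/usr/local/airflow/dbt_project"  # assumed project location on the worker

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the dbt Core models over the raw tables Fivetran has loaded into Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir .",
    )

    # Run the project's dbt tests against the freshly built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir .",
    )

    dbt_run >> dbt_test
```

The Terraform-managed infrastructure and GitHub Actions CI/CD pipeline mentioned above would sit around a DAG like this, provisioning and deploying it, rather than inside it.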
CD. Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent More ❯
London, England, United Kingdom Hybrid / WFH Options
Harnham
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS More ❯
business • Expert knowledge of modern cloud data platforms (Databricks, Snowflake, ideally AWS) • Advanced Python programming skills and fluency with the wider Python data toolkit • Strong capability with SQL, Spark, Airflow, Terraform, and workflow orchestration tools • Solid understanding of CI/CD practices using Git, Jenkins, or equivalent tooling • Hands-on experience building both batch and streaming data integration pipelines • Depth in More ❯
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery . Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration — translating complex technical ideas into More ❯
software development lifecycle, from conception to deployment. Capable of conceptualizing and implementing software architectures spanning multiple technologies and platforms. Technology stack: Python, Flask, Java, Spring, JavaScript, BigQuery, Redis, ElasticSearch, Airflow, Google Cloud Platform, Kubernetes, Docker. Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in Los Angeles, California, the company operates More ❯
proficiency in SQL and experience with at least one programming language such as Python, Java, etc. Comprehensive understanding of data engineering tooling, including but not limited to BigQuery, Kafka, Airflow, Airbyte, and DataHub. Great stakeholder management, communication, and teamwork skills. What's in it for you? A range of flexible working options to dedicate time to what matters to More ❯
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
improvement and operational excellence. Deep expertise in data compliance frameworks, cost management, and platform optimisation. Strong hands-on experience with modern cloud data warehouses (Databricks, Snowflake, AWS), SQL, Spark, Airflow, Terraform. Advanced Python skills with orchestration tooling; solid experience in CI/CD (Git, Jenkins). Proven track record in data modelling, batch/real-time integration, and large More ❯