Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling · Proficiency in advanced programming languages (Python/PySpark, SQL) · Experience in data pipeline orchestration (e.g. Airflow, Data Factory) · Familiarity with DevOps and CI/CD practices (Git, Azure DevOps, etc.) · Ability to communicate technical concepts to both technical and non-technical audiences · Proven experience in …
data applications using the latest open-source technologies. Experience working in an offshore, managed-outcome delivery model is desirable. Develop logical and physical data models for big data platforms. Automate workflows using Apache Airflow. Create data pipelines using Apache Hive, Apache Spark, Apache Kafka. Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support. … of hands-on experience with developing data warehouse solutions and data products. 6+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive or Spark; Airflow or another workflow orchestration solution is required. 5+ years of hands-on experience in modelling and designing schemas for data lakes or RDBMS platforms. Experience with programming languages …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
NLB Services
REST APIs and integration techniques · Familiarity with data visualization tools and libraries (e.g., Power BI) · Background in database administration or performance tuning · Familiarity with data orchestration tools, such as Apache Airflow · Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing · Strong analytical skills, including a thorough understanding of how to interpret customer business requirements …
frameworks, and clear documentation within your pipelines Experience in the following areas is not essential but would be beneficial: Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse BI Tool Familiarity: An understanding of …
solutions. Mentor junior engineers and contribute to technical reviews. Requirements: 7+ years in software engineering, including 4+ years in data engineering. Strong experience with data frameworks (e.g., Spark, Kafka, Airflow) and ETL workflows. Proficiency with SQL and NoSQL databases, including query optimization. Experience with cloud data services (e.g., AWS Redshift, S3, Glue, EMR) and CI/CD for data …
Stockport, England, United Kingdom Hybrid / WFH Options
Gravitas Recruitment Group (Global) Ltd
and other squads to ensure smooth releases and integration. Key Skills Data Modelling Python & SQL AWS/Redshift 3–5+ years of experience in data engineering Nice to Have Airflow, Tableau, Power BI, Snowflake, Databricks Data governance/data quality tooling Degree preferred Atlassian/Jira, CI/CD, Terraform Why Join? Career Growth: Clear progression to Tech Lead.
Greater London, England, United Kingdom Hybrid / WFH Options
Xcede
with product managers and data & AI engineers to deliver solutions to major business problems, including Generative AI applications What we’re looking for: Hands-on expertise in Python, SQL, Spark, Airflow, Terraform Proven experience with cloud-native data platforms (AWS, Databricks, Snowflake) Strong track record of mentoring or coaching other engineers Knowledge of CI/CD tooling (Git, Jenkins, or …
AWS/GCP/Azure/Snowflake) Passion for data quality, scalability, and collaboration Nice to have: Experience with SaaS products, analytics tooling, or modern data stack tools (dbt, Airflow). Perks: EMI share options • Training budget • Private healthcare • Pension • 25 days holiday + bank holidays • Central London office & socials • Work abroad up to 1 month/year Why …
models and ETL processes. Minimum Requirements: 5+ years' experience working within AWS technologies (S3, Redshift, RDS, etc.). 3+ years' experience working with dbt. 3+ years of experience with orchestration tooling (Airflow, Prefect, Dagster). Strong programming skills in languages such as Python or Java. Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka, Delta Lake, Iceberg, Arrow, Data Fusion). Familiarity with data …
Lead Data Engineer in a fast-scaling tech or data-driven environment Strong proficiency in Python (or Scala/Java) and SQL Deep experience with data pipeline orchestration tools (Airflow, dbt, Dagster, Prefect) Strong knowledge of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift) Hands-on experience with streaming technologies (Kafka, Kinesis, or similar …
platform. What You’ll Get A key engineering role within a world-class technology organisation that values innovation and impact. Exposure to a modern data ecosystem including Iceberg, Kafka, Airflow, and other open-source technologies. A collaborative, intellectually curious culture where engineers are trusted to take ownership and drive outcomes. Excellent compensation package with strong performance incentives. What You …
will lead analytics engineering teams who design, build, and enhance essential data platforms and products. You will: Architect and optimize data pipelines and workflows using modern tools (e.g., dbt, Airflow, SQL, Python). Take ownership of large-scale analytics platforms and data warehouse solutions (such as Snowflake, BigQuery, Redshift). Mentor and guide engineers across analytics and data engineering …
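The orchestration pattern these roles keep naming (Airflow, Prefect, Dagster) comes down to declaring task dependencies and running tasks in topological order. A minimal sketch of that idea, using only Python's standard-library `graphlib` rather than any real orchestrator API; the task names and data are hypothetical:

```python
from graphlib import TopologicalSorter  # stdlib topological sort (Python 3.9+)

# Hypothetical extract/transform/load callables standing in for
# Airflow operators or Dagster ops.
def extract(ctx):
    return [" Alice ", "bob", None, "Bob"]  # deliberately messy raw rows

def transform(ctx):
    # Drop missing values, standardise case/whitespace, deduplicate.
    rows = {r.strip().lower() for r in ctx["extract"] if r is not None}
    return sorted(rows)

def load(ctx):
    return len(ctx["transform"])  # stand-in for a warehouse write

TASKS = {"extract": extract, "transform": transform, "load": load}
# Each task mapped to the set of tasks it depends on.
DEPS = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

def run_pipeline():
    ctx = {}
    for name in TopologicalSorter(DEPS).static_order():
        ctx[name] = TASKS[name](ctx)  # upstream results are ready by now
    return ctx
```

Real orchestrators layer scheduling, retries, and backfills on top of exactly this dependency model.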
research and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets. Company Rapidly growing hedge fund, offices globally including London Salary & Benefits The salary range/rates of pay is …
Manchester, England, United Kingdom Hybrid / WFH Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication …
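The transformation logic this role describes (handling missing values, standardising formats, removing duplicates) can be sketched in plain Python; the record fields below are hypothetical, not from any particular client schema:

```python
def clean_records(records, default_country="unknown"):
    """Clean raw rows: fill missing values, standardise formats,
    and drop duplicates. Field names are illustrative."""
    seen, cleaned = set(), []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()
        country = (rec.get("country") or default_country).strip().lower()
        if not email:
            continue  # drop rows missing the key field
        if email in seen:
            continue  # deduplicate on the standardised key
        seen.add(email)
        cleaned.append({"email": email, "country": country})
    return cleaned

raw = [
    {"email": "A@X.com ", "country": "UK"},
    {"email": "a@x.com", "country": None},   # duplicate once standardised
    {"email": None, "country": "FR"},        # missing the key field
]
```

In practice the same steps would run inside an orchestrated pipeline task, with the validation rules driven by the client's actual schema.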