… fine-tuning, or deployment. Technical Skills: Proficiency in Python and frameworks such as PyTorch, TensorFlow, or JAX. Strong familiarity with distributed computing and data engineering tools (e.g., SLURM, Apache Spark, Airflow). Hands-on experience with LLM training, fine-tuning, and deployment (e.g., Hugging Face, LLaMA-Factory, NVIDIA NeMo). Preferred Qualifications: Advanced degree (MS/PhD) in …
self-organising, fast-paced environment. Nice-to-Have Skills: Experience with GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or others). Exposure to CI/CD pipelines, ideally using GitLab CI. Background working with …
City Of London, England, United Kingdom Hybrid / WFH Options
Datatech Analytics
how data flows through everything they build.

What You'll Do
· Build and own data pipelines that connect product, analytics, and operations
· Design scalable architectures using tools like dbt, Airflow, and Snowflake/BigQuery
· Work with engineers and product teams to make data easily accessible and actionable
· Help evolve their data warehouse and ensure high data quality and reliability … ML-ready datasets
· Influence technical decisions across the stack — they love new ideas

What You Bring
· Hands-on experience in Python and SQL
· Experience with modern data tools (dbt, Airflow, Prefect, Dagster, etc.)
· Knowledge of cloud platforms like AWS, GCP, or Azure
· An understanding of data modelling and ETL best practices
· Curiosity, creativity, and a mindset that thrives in …
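The pipeline work described above centres on the extract-transform-load pattern. As a minimal stdlib-only sketch of that pattern (the source records, table name, and columns are invented for illustration and do not come from any stack named in the listing):

```python
import sqlite3

# Toy records standing in for an upstream source system (hypothetical data).
raw_events = [
    {"user_id": 1, "amount": "12.50", "country": "gb"},
    {"user_id": 2, "amount": "3.00", "country": "GB"},
    {"user_id": 1, "amount": "bad", "country": "us"},  # malformed row
]

def extract():
    # In a real pipeline this would read from an API, file, or warehouse.
    return raw_events

def transform(rows):
    # Normalise values and drop rows that fail basic validation.
    clean = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # skip rows with unparseable amounts
        clean.append((r["user_id"], amount, r["country"].upper()))
    return clean

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (user_id INT, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 15.5 — the malformed row was dropped in transform
```

In tools like dbt the transform step lives in SQL models rather than Python, but the validate-then-load shape is the same.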
Central London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Senior Data Engineer - FinTech Unicorn - Python, SQL, DBT, Airflow, AWS/GCP
OB have partnered with a UK FinTech Unicorn, whose Data function is undergoing rapid growth, and who in turn are looking to grow their Data team with 2 highly skilled Data Engineers. You'll be working on shaping the company's Data function, driving Data best practices, and … collaborate with a variety of stakeholders to ensure the company are making Data-driven decisions.

Tech Stack:
· Python & SQL
· DBT, Airflow & BigQuery
· AWS/GCP
· ETL Pipelines
· Base salary of £90k-£115k depending on skills and experience
· Excellent overall package including a sizeable bonus and stock … working in Central London with 1-2 days a week required
· You must be UK based, and sadly sponsorship is unavailable
of data modelling and data warehousing concepts. Familiarity with version control systems, particularly Git.

Desirable Skills:
· Experience with infrastructure as code tools such as Terraform or CloudFormation
· Exposure to Apache Spark for distributed data processing
· Familiarity with workflow orchestration tools such as Airflow or AWS Step Functions
· Understanding of containerisation using Docker
· Experience with CI/CD pipelines …
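Orchestration tools like Airflow and Step Functions model a pipeline as a DAG of dependent tasks and run them in dependency order. The core idea can be sketched with the standard library's `graphlib` (the task names here are invented for illustration):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract feeds transform; a data-quality check
# must pass (alongside transform) before load may run.
dag = {
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"transform", "quality_check"},
}

# static_order() yields every task exactly once, respecting all edges;
# an orchestrator would execute each task at this point.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks on top of this ordering, but the dependency-resolution core is the same.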
North London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
OB have partnered with a leading scale-up business in the AdTech space, where Data is at the forefront of everything they do, and they are currently hiring for a Lead Data Engineer to join their team and work on the development of their Data platform. This will … on experience working in Data heavy environments, with Real-Time Data Pipelines, Distributed Streaming Pipelines and strong knowledge of Cloud Environments.

Key Skills and Experience:
· Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery
· Prior experience working as a Lead Engineer or Tech Lead
· Pays £100k-£130k + … in Central London
· To be considered, you must be UK based and unfortunately visa sponsorship is unavailable
· 2-stage interview process!
and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include:
· Designing and developing DBT models and Airflow pipelines within a modern data stack
· Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs
· Implementing automated testing and CI/CD pipelines … on forecasting and predictive analytics initiatives
· Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function

Tech Stack & Skills
Core skills:
· Strong experience with DBT, Airflow, Snowflake, and Python
· Proven background in automated testing, CI/CD, and test-driven development
· Experience building and maintaining data pipelines and APIs in production environments
Nice to have …
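Automated testing for data pipelines typically starts with data-quality assertions run in CI against each batch, in the spirit of dbt tests. A minimal stdlib sketch (the rules, field names, and sample rows are illustrative, not from the actual stack):

```python
def check_not_null(rows, column):
    """Return indices of rows where the column is null."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    return ("not_null", column, bad)

def check_unique(rows, column):
    """Return indices of rows whose value duplicates an earlier row."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            dupes.append(i)
        seen.add(v)
    return ("unique", column, dupes)

# Hypothetical batch with two quality problems: a null email and a duplicate id.
rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@example.com"},
]

checks = (check_not_null(rows, "email"), check_unique(rows, "id"))
failures = [res for res in checks if res[2]]
for name, col, idxs in failures:
    print(f"{name} failed on {col}: rows {idxs}")
```

In CI, a non-empty `failures` list would fail the build before the bad batch reaches downstream models.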
· Support A/B testing, funnel analysis, and data modelling to enhance performance
· Contribute to the evolution of the company's data warehouse and pipelines (experience with dbt or Airflow a plus)
· Collaborate with product, marketing, and commercial teams to translate data into actionable recommendations
· Communicate insights clearly across teams to influence business outcomes

Role Requirements
· Strong technical skills … in SQL and dashboarding tools (Looker Studio/BigQuery)
· Experience with A/B testing, funnel analysis, and data modelling
· Familiarity with data warehouse concepts and pipeline development (dbt, Airflow experience advantageous)
· Ability to work collaboratively across multiple teams and communicate insights effectively
· Proactive, detail-oriented, and able to drive impact in a high-growth environment

The Company You …
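The A/B testing work mentioned above commonly reduces to a two-proportion z-test on conversion rates. A stdlib-only sketch (the visitor and conversion counts are invented for illustration):

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic and two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, built from erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 5% vs 6.5% conversion over 4,000 visitors each.
z, p = two_proportion_z(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up counts the p-value falls below 0.05, so the uplift would be called significant at the usual threshold; in practice a library routine (e.g. from statsmodels) would be used rather than hand-rolled maths.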